Hi, we have a project in which we want to monitor the logins and logouts of our Dataiku instance's users.
We need a Python recipe that reads the Dataiku log files directly from the instance, without downloading and re-uploading them, but we don't know how to establish that connection between the log files and the Python recipe. We are not interested in modifying the files, only in reading them.
Any information would be appreciated!
You can set up a connection to the filesystem path that hosts the logs (Administration > Connections). You can then point a managed folder or dataset at that location and read in the data.
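As a starting point, here is a minimal sketch of how a Python recipe could parse audit log records (one JSON object per line) and keep only session events. The msgType strings `"login"`/`"logout"` and the field names `authUser` and `timestamp` are assumptions here; check the records your own DSS version emits. The managed folder name in the comment is also hypothetical.

```python
import json

# Assumed msgType values for session events; verify against your own
# audit records, since the exact strings may differ by DSS version.
SESSION_EVENTS = {"login", "logout"}

def filter_session_events(lines):
    """Parse JSON-lines audit records and keep login/logout events."""
    events = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        msg = record.get("message", {})
        if msg.get("msgType") in SESSION_EVENTS:
            events.append({
                "user": msg.get("authUser"),       # field name is an assumption
                "type": msg.get("msgType"),
                "timestamp": record.get("timestamp"),
            })
    return events

# In a real recipe, the lines would come from the managed folder that
# points at the log directory, e.g. (folder name is hypothetical):
#   import dataiku
#   folder = dataiku.Folder("audit_logs")
#   for path in folder.list_paths_in_partition():
#       with folder.get_download_stream(path) as stream:
#           events = filter_session_events(stream.read().decode().splitlines())
sample_lines = [
    '{"timestamp": "2023-01-01T09:00:00Z", "message": {"msgType": "login", "authUser": "alice"}}',
    '{"timestamp": "2023-01-01T09:05:00Z", "message": {"msgType": "dashboard-create", "authUser": "alice"}}',
]
print(filter_session_events(sample_lines))
```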
If you have not done so already, you will need to install and configure an event server to host the audit logs. Please see the following docs for full details:
Please let me know if you have questions.
Thank you, it was very useful.
Now we have another problem. We want to find all actions carried out by users, such as creation, visualization, modification, execution, deletion, and any other possible action, on charts made in the Charts tab of datasets, as well as on notebooks and dashboards.
Also, to identify dashboard modification events in the log files, we have been filtering on the values ["message"]["msgType"]: "dashboard-create", ["message"]["msgType"]: "taggable-object-delete", and ["message"]["msgType"]: "dashboards-read-meta", and we would like to confirm what each of these means.
How can we locate these log entries, the user who generated them, and the project in which they were generated?
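One way to locate those entries and attribute them is to scan the same JSON-lines records and index the matching msgType values by user and project. This is a sketch under assumptions: the field names `authUser` and `projectKey` inside `message` are not confirmed by the thread, so check them against a real record from your logs before relying on them.

```python
import json
from collections import defaultdict

# The three msgType values mentioned above.
DASHBOARD_EVENTS = {"dashboard-create", "taggable-object-delete", "dashboards-read-meta"}

def dashboard_activity(lines):
    """Group dashboard-related audit events by (user, project).

    The authUser and projectKey field names are assumptions; verify them
    against an actual record from your audit logs.
    """
    activity = defaultdict(list)
    for line in lines:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        msg = record.get("message", {})
        if msg.get("msgType") in DASHBOARD_EVENTS:
            key = (msg.get("authUser"), msg.get("projectKey"))
            activity[key].append(msg["msgType"])
    return dict(activity)

# Illustrative records only, not real log output.
dashboard_lines = [
    '{"message": {"msgType": "dashboard-create", "authUser": "bob", "projectKey": "SALES"}}',
    '{"message": {"msgType": "dashboards-read-meta", "authUser": "carol", "projectKey": "SALES"}}',
    '{"message": {"msgType": "login", "authUser": "bob"}}',
]
print(dashboard_activity(dashboard_lines))
```

Grouping on a (user, project) tuple keeps the attribution question answerable in one pass; note that "taggable-object-delete" matches deletions of any taggable object, so you may also need to inspect other fields of the record to confirm the deleted object was a dashboard.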