You can send logs to Kafka, or you can use the DSS Event Server to send audit logs to other technologies/locations. However, that mechanism is intended for audit trail logs, not for scenario logs. Scenario logs are stored in a directory on disk (see https://doc.dataiku.com/dss/latest/operations/disk-usage.html#scenario-logs), so nothing stops you from developing a process that picks them up and loads them somewhere else.
But what would be the purpose of doing so? What are you really trying to achieve?
We want to keep a tight watch on numerous processes. Currently we use K8s pods' stdout/stderr > Fluentd > Splunk, and I'm wondering how to do something similar in Dataiku.