Hi, is there a way to limit the log size when running a job? Sometimes a job we run fails, and when that happens all of its temp data and logging is kept, which leads to disk-space issues. In no time, over 100 GB of logs can accumulate.
You can perform such scheduled maintenance operations with macros: https://doc.dataiku.com/dss/latest/operations/macros.html. In fact, DSS ships with a built-in macro to clear old job logs. This macro can be run manually, or scheduled through a scenario (if production/automation capabilities are activated in your license).
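If you prefer to handle this outside DSS (e.g. from cron on the server), here is a minimal shell sketch of the same idea. It assumes the usual DSS layout where job logs live under `DATA_DIR/jobs/<PROJECT_KEY>/<job_id>/`; the function name and the `/data/dataiku/dss_data` path are placeholders for illustration, not anything DSS ships.

```shell
#!/bin/sh
# purge_old_job_logs <dss_data_dir> <days>
# Removes per-job log directories under <dss_data_dir>/jobs/<PROJECT>/<job>
# whose directory mtime is older than <days> days.
purge_old_job_logs() {
  dir="$1"; days="$2"
  # Nothing to do if this data dir has no jobs folder.
  [ -d "$dir/jobs" ] || return 0
  # Depth 2 = jobs/<PROJECT_KEY>/<job_id>; delete old job directories.
  find "$dir/jobs" -mindepth 2 -maxdepth 2 -type d -mtime +"$days" \
    -exec rm -rf {} +
}

# Example (hypothetical path -- point this at your actual DSS data dir):
purge_old_job_logs "${DSS_DATA_DIR:-/data/dataiku/dss_data}" 15
```

Note this bypasses DSS entirely, so the built-in macro (or a scenario running it) remains the safer, supported route; a raw `find` like this is best reserved for emergencies when the disk is already full.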
Note that this deletes the existing logs, so you may want to archive them elsewhere first for audit or compliance reasons.