
Improve logging events so errors can be traced back to actual objects


We regularly review our backend logs, as there are always many user errors that need to be cleaned up (see some samples in this other idea). However, some log events do not provide enough information to trace the errors back to actual objects. Here are the ones I have found so far:

Error:
[2024/05/08-13:02:42.815] [FT-WebAppBackendInstance-FFSP3va5-572217] [ERROR] [dku.webapps.abstract] - Exception while running the kernel
java.io.IOException: Kernel process return code is 137
Improvement: Add the kernel/notebook name.

Error:
[2024/05/08-12:59:04.711] [ActiveTriggerLifecycleThread_8955] [ERROR] [dku.scenario.activetrigger.thread] - Failed to evaluate trigger
java.lang.IllegalArgumentException: Connection name not specified
Improvement: Add the trigger and scenario names.

Error:
[2024/05/08-08:34:38.320] [qtp1900366749-352743] [ERROR] [dku.recipes.join.status] - Failed to generate SQL query
java.lang.IllegalArgumentException: Column "c" doesn't exist.
Improvement: Add the source of the SQL query: the dataset, recipe, or trigger name.

Error:
[2024/05/08-15:16:47.055] [qtp970781872-711067] [ERROR] [dip.controllers] - API call '/dip/api/image/get-image' failed
com.dataiku.dip.exceptions.UnauthorizedException: Failed to read project permissions
Improvement: Add the user ID and project ID.

Error:
[2024/05/08-15:11:53.590] [qtp1900366749-2119135] [ERROR] [com.dataiku.dip.dataflow.exec.joinlike] - Invalid selected columns
java.lang.IllegalArgumentException: Output schema contains duplicate column name "imo"
Improvement: Add the dataset name.

Error:
[2024/05/08-12:37:26.207] [qtp970781872-682151] [ERROR] [dip.controllers] - API call '/dip/api/tintercom/datasets/read-data/' failed
java.lang.IllegalArgumentException: Invalid partition spec: wti_M1_M2 for a non-partitioned dataset
Improvement: Add the dataset name.

Error:
[2024/05/08-13:45:50.839] [qtp970781872-706452] [ERROR] [dip.controllers] - API call '/dip/api/projects/git/set-dirty' failed
java.lang.IllegalArgumentException: null not found in libraries
Improvement: Add relevant info to trace this back.

Error:
[2024/05/08-12:00:00.785] [pool-4278-thread-1] [ERROR] [dip.scenarios.smtp] - Email not sent: null
Improvement: Add the scenario and trigger names.

Error:
[2024/05/08-09:01:42.934] [FRT-76-FlowRunnable] [ERROR] [dku.input.push] - Push failed, cleanup resources
com.dataiku.dip.exceptions.DataStoreIOException: Failed to read data from table
Improvement: Add the dataset and job names.

Error:
[2024/05/08-07:23:51.674] [qtp970781872-633636] [ERROR] [dip.controllers] - API call '/dip/api/projects/get-project-access-info' failed
com.dataiku.dip.exceptions.UnauthorizedException: Failed to read project permissions. Project does not exist or you do not have the permission to access it.
Improvement: Add the user ID.

Error:
[2024/05/08-02:04:42.404] [qtp970781872-609883] [ERROR] [dku.datasets.sql] - Connection error
java.sql.SQLException: Unable to connect
Improvement: Add the connection name.

Error:
[2024/05/05-08:04:37.379] [qtp970781872-171721] [ERROR] [dip.controllers] - Response already committed, cannot send error
org.eclipse.jetty.io.EofException
Improvement: Add relevant info to trace this back.

Error:
[2024/05/05-08:04:37.378] [qtp970781872-171721] [ERROR] [dip.controllers] - API call '/dip/api/admin/connections/test-sql' failed
org.eclipse.jetty.io.EofException
Improvement: Add the connection name.

Error:
[2024/05/04-09:26:53.791] [FT-WebAppBackendInstance-3Y4QnYwQ-5461] [ERROR] [dip.webapp.backend.instance] - Backend died, restarting it
com.dataiku.dip.autorestart.AutoRestartingProcessRunner$BackendDiedException: Backend died while running
Improvement: Add the WebApp ID.

Error:
[2024/05/07-16:07:43.359] [SQLNBK-6CG7UtBQjvnT-556832] [ERROR] [dku.sql.runtime] - Failed query
com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'XXX'.
Improvement: Add the dataset name or the source of the query.

Error:
[2024/05/06-16:18:51.642] [FT-SamplesService$1-NN2btU6r-382572] [ERROR] [dku.eda.compute.sample] - Started creating new sample from datasetPROJECT_KEY.DATASET_ID
Improvement: Why is this logged at the ERROR level for a new sample? Also, there is no space between "dataset" and the project key.
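To illustrate the kind of fix being requested, here is a minimal sketch of a helper that appends object identifiers (scenario, trigger, dataset, and so on) to a message before it reaches the logger. This is purely hypothetical: `LogContext`, its method names, and the example identifiers are my own assumptions, not Dataiku's actual API. A real implementation on an SLF4J/Logback stack would more likely use MDC plus a pattern layout, but the idea is the same: every error line should carry enough context to identify the object.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper (not Dataiku's actual API): collects object
// identifiers and appends them to a log message so the event can be
// traced back to a concrete object.
class LogContext {
    // LinkedHashMap keeps the fields in insertion order for stable output
    private final Map<String, String> fields = new LinkedHashMap<>();

    // Register one piece of context, e.g. with("scenario", "DAILY_LOAD")
    LogContext with(String key, String value) {
        fields.put(key, value);
        return this;
    }

    // Produce "message [k1=v1 k2=v2]", ready to pass to the logger
    String decorate(String message) {
        if (fields.isEmpty()) {
            return message;
        }
        StringBuilder sb = new StringBuilder(message).append(" [");
        boolean first = true;
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (!first) {
                sb.append(' ');
            }
            sb.append(e.getKey()).append('=').append(e.getValue());
            first = false;
        }
        return sb.append(']').toString();
    }
}
```

With this, the trigger example above could be emitted as "Failed to evaluate trigger [scenario=DAILY_LOAD trigger=time-based]" instead of leaving the reader to guess which scenario was involved (the scenario and trigger names here are invented for illustration).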


Please feel free to post your own "untraceable" error log events below. Thanks!