Kubernetes error when using OpenAI GPT
Oops: an unexpected error occurred
Kubernetes job failed, exitCode=1, reason=Error
Operating system used: Windows
Answers
-
Alexandru (Dataiker), Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Posts: 1,212
Hi @saeed396,
The error itself doesn't say much; the job logs should have more information. In most cases you won't want to share those on the community, so I would encourage you to open a support ticket with the job diagnostics:
https://doc.dataiku.com/dss/latest/troubleshooting/problems/job-fails.html#getting-a-job-diagnosis
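If it helps, you can also pull the full job log programmatically with the public Python API client. A minimal sketch, assuming placeholder values for the host, API key, and job id:

import dataikuapi

# Placeholders: point this at your own DSS instance and a personal API key
client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
project = client.get_project("TEST")

# List recent jobs to find the one that failed
for j in project.list_jobs()[:10]:
    print(j)

# Retrieve the complete log of a given job (id is a placeholder)
job = project.get_job("SOME_JOB_ID")
print(job.get_log())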
Thanks
-
Hi Alex,
Thank you for your response. I tried to run a text summarization task, and this is the exact log:
[2023/04/07-14:43:49.769] [ActivityExecutor-57] [INFO] [dku] running compute_summary_NP - ----------------------------------------
[2023/04/07-14:43:49.769] [ActivityExecutor-57] [INFO] [dku] running compute_summary_NP - DSS startup: jek version:11.3.2
[2023/04/07-14:43:49.769] [ActivityExecutor-57] [INFO] [dku] running compute_summary_NP - DSS home: /data/dataiku/dss_data
[2023/04/07-14:43:49.769] [ActivityExecutor-57] [INFO] [dku] running compute_summary_NP - OS: Linux 3.10.0-1160.83.1.el7.x86_64 amd64 - Java: Red Hat, Inc. 1.8.0_362
[2023/04/07-14:43:49.769] [ActivityExecutor-57] [INFO] [dku.flow.jobrunner] running compute_summary_NP - Allocated a slot for this activity!
[2023/04/07-14:43:49.770] [ActivityExecutor-57] [INFO] [dku.flow.jobrunner] running compute_summary_NP - Run activity
[2023/04/07-14:43:49.774] [ActivityExecutor-57] [INFO] [dku.flow.activity] running compute_summary_NP - Executing default pre-activity lifecycle hook
[2023/04/07-14:43:49.780] [ActivityExecutor-57] [INFO] [dku.flow.activity] running compute_summary_NP - Checking if sources are ready
[2023/04/07-14:43:49.781] [ActivityExecutor-57] [INFO] [dku.flow.activity] running compute_summary_NP - Will check readiness of TEST.2022_DM_Field_Insights_report_prepared_prepared p=NP
[2023/04/07-14:43:49.810] [ActivityExecutor-57] [DEBUG] [dku.datasets.fsbased] running compute_summary_NP - getReadiness: will enumerate partition <partition:NP>
[2023/04/07-14:43:49.812] [ActivityExecutor-57] [INFO] [dku.datasets.bloblike] running compute_summary_NP - Enumerating blob-like dataset TEST.2022_DM_Field_Insights_report_prepared_prepared prefix=
[2023/04/07-14:43:49.812] [ActivityExecutor-57] [DEBUG] [dku.datasets.fsbased] running compute_summary_NP - Building FS provider for dataset handler: TEST.2022_DM_Field_Insights_report_prepared_prepared
[2023/04/07-14:43:49.819] [ActivityExecutor-57] [DEBUG] [dku.fs.s3] running compute_summary_NP - Created S3 provider bucket=nnedl-core-prd-us-east-1-raw pathInBucket=/nniscoe/dataiku/TEST/2022_DM_Field_Insights_report_prepared_prepared (connectionChroot=null)
[2023/04/07-14:43:49.819] [ActivityExecutor-57] [DEBUG] [dku.datasets.fsbased] running compute_summary_NP - FS Provider built
[2023/04/07-14:43:50.061] [ActivityExecutor-57] [INFO] [dku.fs.s3] running compute_summary_NP - Retrieving location from bucket
[2023/04/07-14:43:50.329] [ActivityExecutor-57] [INFO] [dku.fs.s3] running compute_summary_NP - Bucket location is US, using us-east-1
[2023/04/07-14:43:50.332] [ActivityExecutor-57] [DEBUG] [dku.fs.s3] running compute_summary_NP - Done create S3 client
[2023/04/07-14:43:50.369] [ActivityExecutor-57] [INFO] [dku.fs.s3] running compute_summary_NP - Start S3 Enumeration ON bucketPath=nniscoe/dataiku/TEST/2022_DM_Field_Insights_report_prepared_prepared/ prefix= fullPath=nniscoe/dataiku/TEST/2022_DM_Field_Insights_report_prepared_prepared/
[2023/04/07-14:43:50.400] [ActivityExecutor-57] [INFO] [dku.fs.s3] running compute_summary_NP - S3 enumeration done, found 1 items, 367729 bytes
[2023/04/07-14:43:50.402] [ActivityExecutor-57] [DEBUG] [dku.datasets.fsbased] running compute_summary_NP - getReadiness: enumerated partition, found 1 paths, computing hash
[2023/04/07-14:43:50.403] [ActivityExecutor-57] [INFO] [dku.flow.activity] running compute_summary_NP - Checked source readiness TEST.2022_DM_Field_Insights_report_prepared_prepared -> true
[2023/04/07-14:43:50.403] [ActivityExecutor-57] [DEBUG] [dku.flow.activity] running compute_summary_NP - Computing hashes to propagate BEFORE activity
[2023/04/07-14:43:50.404] [ActivityExecutor-57] [DEBUG] [dku.flow.activity] running compute_summary_NP - Recorded 1 hashes before activity run
[2023/04/07-14:43:50.404] [ActivityExecutor-57] [DEBUG] [dku.flow.activity] running compute_summary_NP - Building recipe runner of type
[2023/04/07-14:43:50.414] [ActivityExecutor-57] [DEBUG] [dku.flow.activity] running compute_summary_NP - Recipe runner built, will use 1 thread(s)
[2023/04/07-14:43:50.415] [ActivityExecutor-57] [DEBUG] [dku.flow.activity] running compute_summary_NP - Starting execution thread: com.dataiku.dip.recipes.customcode.CustomPythonRecipeRunner@67c09ed6
[2023/04/07-14:43:50.416] [ActivityExecutor-57] [DEBUG] [dku.flow.activity] running compute_summary_NP - Execution threads started, waiting for activity end
[2023/04/07-14:43:50.419] [FRT-63-FlowRunnable] [INFO] [dku.flow.activity] - Run thread for activity compute_summary_NP starting
[2023/04/07-14:43:50.421] [FRT-63-FlowRunnable] [INFO] [dku.flow.custompython] - Dumping Python script to /data/dataiku/dss_data/jobs/TEST/Build_summary__NP__2023-04-07T14-43-46.656/compute_summary_NP/custom-python-recipe/pyoutCBh6mRKaOxQg/script.py
[... log truncated ...] running compute_summary_NP - Done post-activity tasks
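The recipe is a simple summarization loop, roughly along these lines (a simplified sketch: the dataset names, the "text" column, the variable holding the API key, and the pre-1.0 openai package API are placeholders, not my exact code):

import dataiku
import openai  # pre-1.0 openai package, matching the 2023 timeframe

# Placeholder: in practice the key comes from a project variable or secret
openai.api_key = dataiku.get_custom_variables()["OPENAI_API_KEY"]

df = dataiku.Dataset("2022_DM_Field_Insights_report_prepared_prepared").get_dataframe()

def summarize(text):
    try:
        resp = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "Summarize this text:\n" + text}],
        )
        return resp["choices"][0]["message"]["content"]
    except Exception as e:
        # Print the failure explicitly so it shows up in the job log,
        # rather than only a bare "exitCode=1" from the container
        print("OpenAI call failed: %s" % e)
        raise

df["summary"] = df["text"].apply(summarize)  # "text" column name is a placeholder

dataiku.Dataset("summary").write_with_schema(df)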
-
Alexandru (Dataiker), Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Posts: 1,212
Hi @saeed396,
The logs you shared don't contain any information regarding the previously reported error "Kubernetes job failed, exitCode=1, reason=Error". The log was likely truncated.
Can you please share the job diagnostics via Live Chat (https://doc.dataiku.com/dss/latest/troubleshooting/obtaining-support.html#live-chat) or a support ticket so we can assist?
Thanks
-
Hi Alex!
I couldn't find the chat page or a link to it. Could you send me the link for the chat? Also, I sent the error's zip file through the email I received from the community.
Thank you,
Saeed