SocketTimeoutException in dataiku job

dong · Registered Posts: 1

Currently I use Redshift as the source dataset and an "Export to folder" recipe to export an Excel file to SharePoint; the destination folder points at SharePoint. When I run the job, it succeeds if the Excel file is less than 50 MB, but if it is more than 50 MB it reports the issue below:

 Oops: an unexpected error occurred

Error in Python process: At line 18: <class 'Exception'>: None: b'Error while uploading chunk: Request failed with error, retry if necessary., caused by: ClientException: Error executing the request, caused by: SocketTimeoutException: timeout, caused by: ClientException: Request failed with error, retry if necessary., caused by: ClientException: Error executing the request, caused by: SocketTimeoutException: timeout'


HTTP code: , type: <class 'Exception'>

Operating system used: dataiku

Answers

  • Turribeach · Dataiku DSS Core Designer, Neuron, Dataiku DSS Adv Designer, Registered, Neuron 2023, Circle Member Posts: 2,611 Neuron

    Sounds like your SharePoint instance won't accept files over 50 MB. Try exporting to a local managed folder first; if that works, it confirms the SharePoint side is the problem. You could then copy the file to the SharePoint folder with a Sync recipe.
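    The error text itself says "Request failed with error, retry if necessary", so another angle worth exploring is retrying transient timeouts. Below is a minimal sketch in plain Python of retry-with-exponential-backoff around a chunk upload; note that `upload_chunk`, `max_attempts`, and `base_delay` are hypothetical names for illustration, not part of the Dataiku SharePoint plugin API (the plugin's internal retry behaviour is not configurable this way):

    ```python
    import time

    def upload_with_retries(upload_chunk, chunk, max_attempts=4, base_delay=1.0):
        """Call upload_chunk(chunk), retrying on timeout errors with
        exponential backoff (base_delay, 2x, 4x, ...).
        Re-raises the last error after max_attempts failures."""
        for attempt in range(max_attempts):
            try:
                return upload_chunk(chunk)
            except TimeoutError:
                if attempt == max_attempts - 1:
                    raise  # exhausted all attempts, surface the timeout
                time.sleep(base_delay * (2 ** attempt))

    # Demo with a fake uploader that times out twice, then succeeds:
    calls = {"n": 0}
    def flaky_upload(chunk):
        calls["n"] += 1
        if calls["n"] < 3:
            raise TimeoutError("socket timeout")
        return "ok"

    result = upload_with_retries(flaky_upload, b"data", base_delay=0.01)
    # result == "ok" after 3 attempts
    ```

    If retries do not help, that points to a hard limit or timeout on the SharePoint side rather than a transient network issue, which is consistent with the 50 MB threshold you observed.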
