Dataiku shows me an error when I make a query on Redshift

rafael_rosado97 — Partner, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 58

Hello, everyone.

I want to unload data from Redshift to S3, but when I run the query in Dataiku, it fails with the following error: java.lang.OutOfMemoryError: GC overhead limit exceeded.
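For reference, a Redshift-to-S3 unload of this kind is typically issued with the UNLOAD command; the bucket, IAM role, table, and date range below are placeholders, not the actual query from this post:

```sql
-- Sketch of a Redshift UNLOAD to S3; all identifiers are placeholders.
UNLOAD ('SELECT * FROM events WHERE event_date BETWEEN ''2023-01-01'' AND ''2023-06-30''')
TO 's3://my-bucket/exports/events_'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
FORMAT AS PARQUET;
```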

When I run a query over a shorter time span, it completes correctly.

I read here about modifying jek.xmx. Can I apply that in this case? There is enough memory available to run the process.
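For context, the jek.xmx setting referred to above controls the Java heap size of Dataiku's job execution kernels. It is configured in the DSS data directory's install.ini; the value below is illustrative, not a recommendation for this case:

```ini
; DATA_DIR/install.ini -- illustrative value, adjust to the machine's RAM
[javaopts]
jek.xmx = 4g
```

Changes to install.ini only take effect after DSS is restarted.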

Thank you very much!


Operating system used: Linux


Best Answer

  • JordanB — Dataiker, Dataiku DSS Core Designer, Dataiku DSS Adv Designer, Registered Posts: 293
    Answer ✓

    Hi @rafael_rosado97,

    It should not be necessary to alter jek.xmx. Please try adding the following properties in the "Advanced JDBC properties" of your Redshift connection:

    enableFetchRingBuffer -> false
    enableFetchReadAndProcessBuffers -> false
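    These are options of the Amazon Redshift JDBC v2 driver; disabling the fetch ring buffer limits how much result data the driver buffers in the JVM heap while fetching. If the connection is defined with a raw JDBC URL instead of the connection form, the same options can be appended as URL parameters; the cluster host, port, and database below are placeholders:

    ```
    jdbc:redshift://examplecluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev?enableFetchRingBuffer=false&enableFetchReadAndProcessBuffers=false
    ```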

    Thanks!

    Jordan

