Issue with input as External (Spectrum) Redshift tables and output to S3 and Snowflake

kbutler27
Level 1

I have external (Spectrum) Redshift tables that I can preview and query using Dataiku. However, when using any of the built-in recipes (filter, data prep, etc.), I cannot output the dataset to S3 or Snowflake. I think this issue occurs because the data in the external table sits in S3, not Redshift. I don't have direct access to the S3 bucket, so I'm looking for another workaround to be able to pull data from external tables and land data in S3 and Snowflake.

Job failed: Input dataset is not ready (table does not exist)

Additional technical details

  • Error code: ERR_JOB_INPUT_DATASET_NOT_READY_NO_TABLE
  • Error type: com.dataiku.dip.exceptions.SourceDatasetNotReadyException
2 Replies
chprem
Level 1

I'm facing the same issue. I'm not sure how to proceed further, or whether there's a workaround to make the external tables behave like normal tables.

AlexT
Dataiker

Hi,

Support for reading Redshift Spectrum external tables was added in DSS 10.0.6:

https://doc.dataiku.com/dss/latest/connecting/sql/redshift.html#reading-external-tables

If you have upgraded to 10.0.6 and are still having issues, please share the details or raise a support ticket.

Thanks,
