
Copy my data to S3 from Snowflake

sj0071992
Neuron

Hi Team,

 

I want to build a process that copies my Snowflake table data to S3.

I have a control table that captures which tables need to be copied to S3, based on a copy_flag column.

Based on that flag, I have to move the data to an S3 location.

 

Could you please help me with this?

 

Thanks in Advance 

AlexT
Dataiker

@sj0071992 ,

The recommended way to sync data from Snowflake to S3 in DSS is to use a visual Sync recipe. This ensures DSS uses the fast path to unload data from Snowflake directly to S3.

 https://doc.dataiku.com/dss/latest/connecting/sql/snowflake.html#syncing-to-from-s3

If you need to dynamically select data based on the value of a column, you could create a view in Snowflake, expose it as a SQL table dataset in DSS, and then sync that dataset to S3.
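As an illustration (not part of the original reply), such a view could be created with the Snowflake Python connector. The view, table, and column names below are placeholders, and `copy_flag = 'Y'` is an assumed flag convention:

```python
def build_view_sql(view_name: str, source_table: str, flag_column: str = "copy_flag") -> str:
    """Build a CREATE VIEW statement keeping only rows flagged for export.
    All identifiers here are illustrative placeholders."""
    return (
        f"CREATE OR REPLACE VIEW {view_name} AS "
        f"SELECT * FROM {source_table} WHERE {flag_column} = 'Y'"
    )

def create_export_view(conn_params: dict, view_name: str, source_table: str) -> None:
    """Execute the view creation in Snowflake.
    conn_params: account / user / password / warehouse / database / schema."""
    import snowflake.connector  # deferred: requires snowflake-connector-python
    with snowflake.connector.connect(**conn_params) as conn:
        conn.cursor().execute(build_view_sql(view_name, source_table))
```

Once the view exists, point a SQL table dataset in DSS at it and sync that dataset to S3 as usual.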

Hope this helps.

 

 

sj0071992
Neuron
Author

Hi Alex, 

 

Thanks for your response.

 

How can I sync multiple tables? A Sync recipe can only unload one dataset at a time, right?

 

Thanks in Advance

AlexT
Dataiker

Yes, a Sync recipe can only have one dataset as input, so you would need to create one dataset per table. Please note you can create datasets through the API:

 https://doc.dataiku.com/dss/latest/python-api/datasets-other.html#sql-dataset-programmatic-creation

Then create a Sync recipe through the API as well:

https://doc.dataiku.com/dss/latest/python-api/recipes.html#dataikuapi.dss.recipe.SyncRecipeCreator
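Putting the two linked docs together, a loop along these lines could create one Snowflake dataset and one Sync recipe per table. The connection names, schema, and naming convention are assumptions for illustration:

```python
def export_names(table: str) -> tuple:
    """Derive illustrative dataset and recipe names from a Snowflake table name."""
    ds = table.lower()
    return ds, ds + "_s3", "sync_" + ds

def create_sync_chain(project, table, snowflake_conn, s3_conn, schema="PUBLIC"):
    """Create a Snowflake table dataset plus a Sync recipe writing to a new S3 dataset.
    `project` is a dataikuapi DSSProject; connection names are placeholders."""
    from dataikuapi.dss.recipe import SyncRecipeCreator  # requires dataikuapi
    ds, out_ds, recipe_name = export_names(table)
    # Dataset pointing at the Snowflake table (see the SQL dataset doc above)
    project.create_sql_table_dataset(ds, "Snowflake", snowflake_conn, table, schema)
    # Sync recipe with a new managed output on the S3 connection
    builder = SyncRecipeCreator(recipe_name, project)
    builder.with_input(ds)
    builder.with_new_output(out_ds, s3_conn)
    builder.build()

# Usage sketch (host, key, project key, and table list are placeholders):
#   client = dataikuapi.DSSClient("https://dss.example.com", "YOUR_API_KEY")
#   project = client.get_project("MY_PROJECT")
#   for table in ["ORDERS", "CUSTOMERS"]:
#       create_sync_chain(project, table, "snowflake_connection", "s3_connection")
```

You can then run the generated recipes from a scenario or build the output datasets.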

 

sj0071992
Neuron
Author

Hi Alex,

 

Is there any way to do this through Python code? I.e., just creating a cursor and then executing the unload command.

That would help in defining the S3 path during execution, and also in tracking failed executions.
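One possible sketch of that approach (not confirmed in this thread): use the Snowflake Python connector to run a COPY INTO <location> unload per flagged table, building the S3 path at run time and recording per-table failures. The control-table name, column names, bucket path, and storage integration below are all assumptions:

```python
def build_unload_sql(table: str, s3_path: str, integration: str) -> str:
    """Build a Snowflake COPY INTO statement that unloads a table to S3.
    `integration` is an assumed pre-configured STORAGE INTEGRATION name."""
    return (
        f"COPY INTO '{s3_path}' FROM {table} "
        f"STORAGE_INTEGRATION = {integration} "
        "FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP) "
        "HEADER = TRUE OVERWRITE = TRUE"
    )

def unload_flagged_tables(conn_params: dict, integration: str) -> dict:
    """Unload every table flagged in the (assumed) control table; return per-table status."""
    import snowflake.connector  # deferred: requires snowflake-connector-python
    results = {}
    with snowflake.connector.connect(**conn_params) as conn:
        cur = conn.cursor()
        # control_table / table_name / copy_flag are illustrative names
        cur.execute("SELECT table_name FROM control_table WHERE copy_flag = 'Y'")
        for (table,) in cur.fetchall():
            sql = build_unload_sql(table, f"s3://my-bucket/exports/{table}/", integration)
            try:
                cur.execute(sql)
                results[table] = "ok"
            except snowflake.connector.errors.Error as exc:
                results[table] = f"failed: {exc}"  # keep going, track the failure
    return results
```

The returned status dict gives you the failure tracking you mentioned; you could also write it back to a log table.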
