
Snowflake Load Multiple Files

NN
Neuron

Hi Everyone,

I was trying to see if I can leverage Dataiku's existing connections to copy multiple files to Snowflake.
I have a managed S3 folder containing some 50-100 files, and I need to create a separate Snowflake table for each file.
For now I can loop through the files one by one and use the sync recipe to create a separate table each time, but this doesn't look like the best solution.
I was wondering if there is another approach that might work better, perhaps something where I can use the COPY command inside a SQL script recipe.
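For reference, here is a rough sketch of the kind of thing I have in mind: generating one COPY INTO statement per file and running them in a single script instead of one sync per file. The stage name, file format, and table-name derivation below are all placeholders, not our actual setup:

```python
# Sketch: build one Snowflake COPY INTO statement per file in the folder.
# Stage name, file format name, and table naming are assumptions here.
def build_copy_statements(file_paths, stage="@MY_S3_STAGE", file_format="MY_CSV_FORMAT"):
    statements = []
    for path in file_paths:
        # Derive a table name from the file name, e.g. /data/orders.csv -> ORDERS
        table = path.rsplit("/", 1)[-1].rsplit(".", 1)[0].upper()
        statements.append(
            f'COPY INTO "{table}" FROM {stage}{path} '
            f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')"
        )
    return statements
```

Each generated statement could then be executed against Snowflake in one script recipe, avoiding a separate sync recipe spin-up per file.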

Any suggestions are welcome 🙂 

3 Replies
NN
Neuron
Author

Bumping this question one last time to see if there are any suggestions.
Essentially, what I wish to avoid is the time it takes for the sync recipe to spin up and copy only one file at a time to Snowflake.

Marlan
Neuron

Hi @NN,

It's been a while, but I believe I was able to get a COPY command inside a SQL Script recipe to work. This was on Azure. Check the sync recipe logs and you should see the command that DSS is using; you can pull that out and run it in the SQL Script recipe.
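If it helps, a rough sketch of fishing the statement back out of the log text with a regex; the log line format here is just an assumption, so check what your sync recipe logs actually look like:

```python
import re

# Sketch: extract a COPY INTO statement from sync-recipe log text.
# Assumes the statement appears inline and ends at a semicolon or end of text.
def extract_copy_statement(log_text):
    match = re.search(r"COPY INTO .*?(?:;|$)", log_text, re.IGNORECASE | re.DOTALL)
    return match.group(0).rstrip(";") if match else None
```

Once extracted, the statement can be adapted (table name, source path) and reused in the SQL Script recipe.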

Hope this is helpful as something to explore.

Marlan

NN
Neuron
Author

Hi @Marlan 
Thank you for your reply.
Yes, tracing the code did help; I think I got a fair idea of what Dataiku does in the sync recipe.
Now I think the only challenge is leveraging the preset S3 connection in Dataiku.

I'll see if it works; otherwise I'll have to use the secrets directly (something I wish to avoid).
In any case, if it works I will update the post.
