I was trying to see if I can leverage Dataiku's existing connections to copy multiple files to Snowflake.
So, I have a managed S3 folder with some 50-100 files in it, and I need to create a separate Snowflake table for each file.
For now, I can loop through the files one by one and use the Sync recipe to create a separate table each time, but this doesn't look like the best solution.
I was wondering whether another approach might work better, maybe something where I can use the COPY command inside a SQL script recipe.
Any suggestions are welcome 🙂
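For reference, this is roughly what a COPY command inside a SQL script recipe could look like. The table name, stage name, and file format here are placeholders, assuming a Snowflake stage has already been set up for the S3 bucket; they are not the names DSS would generate.

```sql
-- Hypothetical sketch: MY_TABLE and MY_S3_STAGE are placeholder names,
-- assuming an external stage already points at the S3 bucket.
COPY INTO MY_TABLE
FROM @MY_S3_STAGE/path/file1.csv
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```

One statement like this per file would be needed, which is why generating them in a loop is attractive.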
Bumping this question one last time to see if there are any suggestions.
Essentially, I want to avoid the time it takes for the Sync recipe to spin up and copy only one file at a time to Snowflake.
It's been a while, but I believe I was able to get a COPY command inside a SQL script recipe to work. This was on Azure. Check the Sync recipe logs and you should see the command that DSS is using; you can pull that out and run it in the SQL script recipe.
Hope this is helpful as something to explore.
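Once the COPY command has been pulled out of the Sync recipe logs, building one statement per file can be scripted. Below is a minimal sketch in Python; the stage name (`@FILES_STAGE`), the table-naming rule, and the file format are all assumptions for illustration, not what DSS actually emits.

```python
# Hypothetical sketch: build one Snowflake COPY INTO statement per file,
# so a single SQL script can load every file into its own table.
# FILES_STAGE and the naming rule below are assumptions, not DSS defaults.

def table_name_for(file_name: str) -> str:
    """Derive a table name from a file name, e.g. 'sales-2023.csv' -> 'SALES_2023'."""
    base = file_name.rsplit(".", 1)[0]
    return "".join(c if c.isalnum() else "_" for c in base).upper()

def copy_statement(file_name: str, stage: str = "@FILES_STAGE") -> str:
    """Emit a COPY INTO statement loading one staged file into its own table."""
    return (
        f"COPY INTO {table_name_for(file_name)} "
        f"FROM {stage}/{file_name} "
        f"FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);"
    )

files = ["sales-2023.csv", "customers.csv"]
statements = [copy_statement(f) for f in files]
```

The resulting list of statements could then be pasted into (or fed to) the SQL script recipe, instead of running one Sync recipe per file.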
Thank you for your reply.
Yes, tracing the code did help; I think I have a fair idea of what Dataiku does in the Sync recipe.
Now I think the only challenge is leveraging the preset S3 connection in Dataiku.
I'll see if it works; otherwise I'll have to use the secrets directly (something I wish to avoid).
Either way, if it works I will update the post.