Snowflake Load Multiple Files

NN

Hi Everyone,

I was trying to see if I can leverage Dataiku's existing connections to copy multiple files to Snowflake.
I have a managed S3 folder with some 50-100 files in it, and I need to create a separate Snowflake table for each file.
For now I can loop through the files one by one and use the sync recipe to create a separate table each time, but this doesn't look like the best solution.
I was wondering if there is another approach that might work better, maybe something where I can use the COPY command inside a SQL script recipe.

Any suggestions are welcome.

Answers

  • NN

    Bumping this question one last time to see if there are any suggestions.
    Essentially, I want to avoid the time it takes for the sync recipe to spin up and copy only one file at a time to Snowflake.

  • Marlan

    Hi @NN,

    It's been a while, but I believe I was able to get a COPY command inside a SQL script recipe to work. This was on Azure. Check the sync recipe logs and you should see the command that DSS is using there. You can pull that out and run it in the SQL script recipe.

    Hope this is helpful as something to explore.

    Marlan
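
    The command pulled from the sync recipe logs will be a Snowflake COPY INTO statement. As a rough illustration of the per-file idea, here is a minimal Python sketch that derives a table name from each file and builds one COPY INTO statement per file; the stage name, file paths, and CSV file format are assumptions, not taken from any real log:

    ```python
    # Minimal sketch: build one Snowflake COPY INTO statement per file.
    # The stage name, file paths, and CSV file format below are assumptions --
    # replace them with the values you see in your own sync recipe logs.

    def copy_statement(file_path, stage="@MY_S3_STAGE"):
        """Derive a table name from the file name and emit a COPY INTO command."""
        file_name = file_path.rsplit("/", 1)[-1]
        table = file_name.rsplit(".", 1)[0].upper()  # sales_2023.csv -> SALES_2023
        return (
            f"COPY INTO {table} FROM {stage}/{file_path} "
            f"FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )

    # One statement per file; these could then be run in a SQL script recipe.
    statements = [copy_statement(p)
                  for p in ["exports/sales_2023.csv", "exports/costs_2023.csv"]]
    ```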

  • NN

    Hi @Marlan

    Thank you for your reply.
    Yes, tracing the code did help; I think I have a fair idea of what Dataiku does in the sync recipe.
    Now I think the only challenge is leveraging the preset S3 connection in Dataiku.

    I'll see if it works; otherwise I'll have to use the secrets directly (something I wish to avoid).
    If it works, I will update the post.
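
    In case it helps anyone following along, here is a hypothetical sketch of what the looping approach could look like in a Python recipe, using the Dataiku managed-folder and SQLExecutor2 APIs; the folder name, connection name, stage name, and file format are all assumptions:

    ```python
    # Hypothetical sketch of a Python recipe that runs one COPY INTO per file
    # in a managed folder. Folder name, connection name, stage name, and the
    # CSV file format are assumptions -- take the real COPY command from the
    # sync recipe logs and adapt it.

    def table_name_for(path):
        """Derive a Snowflake table name from a file path (one table per file)."""
        file_name = path.rsplit("/", 1)[-1]
        return file_name.rsplit(".", 1)[0].upper()

    def load_all_files(folder_name="s3_input", connection="snowflake_conn",
                       stage="@MY_S3_STAGE"):
        """Only runs inside DSS; imports are deferred so the helpers stay testable."""
        import dataiku
        from dataiku import SQLExecutor2

        folder = dataiku.Folder(folder_name)
        executor = SQLExecutor2(connection=connection)
        for path in folder.list_paths_in_partition():
            sql = (
                f"COPY INTO {table_name_for(path)} FROM {stage}{path} "
                f"FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
            )
            executor.query_to_df(sql)
    ```

    This still issues one COPY per file, but without the per-file sync recipe spin-up time, and the Snowflake credentials stay in the preset connection rather than in the recipe code.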
