Documentation for S3 and Snowflake Connections Setup

NN

Hi Dataiku Team,

We have a requirement to set up a Snowflake connection and an S3 connection so that we can leverage the S3-to-Snowflake sync engine functionality.

Can you point me to documentation showing how the S3 bucket permissions, Snowflake permissions, etc. should be set up, specifically to make the bulk-copy functionality work in the Sync recipe?

Thanks.

Best Answer

  • JordanB (Dataiker)

    Hi @NN,

    Once you have configured your Amazon S3 and Snowflake connections in DSS, enable fast-write by making the following adjustments in the settings of the Snowflake connection (a programmatic sketch follows this list):

    • Enable “Automatic fast-write”

    • In “Auto fast write connection”, enter the name of the cloud storage connection to use

    • In “Path in connection”, enter a path relative to the root of the cloud storage connection, such as “snowflake-tmp”. This path will hold temporary upload files and should not contain datasets.
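    For reference, the same three settings can also be applied programmatically through the Dataiku Python API client. The sketch below is a minimal illustration, not an official procedure: the host, API key, and connection names are placeholders, and the exact parameter key names inside the definition dict are assumptions — inspect what get_definition() returns on your instance to find the real ones.

    ```python
    import dataikuapi

    # Connect to the DSS instance (host and API key are placeholders).
    client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")

    # Fetch the Snowflake connection and its current definition.
    connection = client.get_connection("my_snowflake_connection")
    definition = connection.get_definition()

    # Enable automatic fast-write. NOTE: these three key names are
    # assumptions for illustration; check the dict returned by
    # get_definition() for the actual keys on your DSS version.
    definition["params"]["autoFastWrite"] = True
    definition["params"]["autoFastWriteConnectionName"] = "my_s3_connection"
    definition["params"]["customBasePathForAutoFastWrite"] = "snowflake-tmp"

    # Push the updated definition back to DSS.
    connection.set_definition(definition)
    ```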

    DSS will now automatically use the optimal cloud-to-Snowflake copy mechanism when executing a recipe that needs to load data “from the outside” into Snowflake. If you set up a sync recipe from an S3 dataset to Snowflake, the engine will be set to "Direct S3 to Snowflake" automatically.
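    If you create the sync recipe programmatically, you can also check which engine DSS selected. This is a minimal sketch assuming the recipe-builder and status helpers available in recent versions of dataikuapi (new_recipe, get_status, get_selected_engine_details — verify against your version's API docs); the project, dataset, and connection names are placeholders.

    ```python
    import dataikuapi

    client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
    project = client.get_project("MY_PROJECT")

    # Create a sync recipe from an existing S3 dataset to a new
    # Snowflake dataset.
    recipe = (
        project.new_recipe("sync")
        .with_input("my_s3_dataset")
        .with_new_output("my_snowflake_dataset", "my_snowflake_connection")
        .create()
    )

    # With the connections configured as above, the selected engine
    # should be the direct S3-to-Snowflake copy.
    status = recipe.get_status()
    print(status.get_selected_engine_details())
    ```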

    (Screenshot: Snowflake connection settings showing the automatic fast-write options.)

    Note that the S3 and Snowflake connections must be in the same cloud region.

    Please see our documentation for additional details: https://doc.dataiku.com/dss/latest/connecting/sql/snowflake.html#writing-data-into-snowflake

    If you have any questions, please let us know.

    Thanks!

    Jordan
