
Dataiku Scenario used for build of table every day

Solved!
PK36313
Level 2

Need help urgently.

I am using Dataiku with Snowflake. I have flows in which the source tables refresh daily, so I run a scenario with build mode "Force rebuild dataset and dependencies". I have found that every day the table gets dropped and a new table gets created, which was not supposed to happen.
We want the table's rows to be overwritten automatically, without dropping and recreating the table.

Because the table is recreated each time, I am not able to use Snowflake's Time Travel feature.

Please let me know if there is any way to do this.

Thanks in advance.

 

6 Replies
MiguelangelC
Dataiker

Hi,

It depends on what "overwrite rows" means in this context; I am guessing you want to do updates.

You can do updates using a SQL code recipe: https://doc.dataiku.com/dss/latest/code_recipes/sql.html
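For example, one way to update rows in place is Snowflake's MERGE statement in a SQL recipe — a minimal sketch, where the table and column names are placeholders, not anything from this flow:

```sql
-- Update existing rows and insert new ones without rebuilding
-- the output table (illustrative names).
MERGE INTO my_output_table t
USING my_source_table s
  ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET t.amount = s.amount
WHEN NOT MATCHED THEN
  INSERT (id, amount) VALUES (s.id, s.amount);
```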

 

PK36313
Level 2
Author

By overwrite I mean: every day when the scenario runs, the table's previous data should be truncated instead of the whole table being dropped and a new one created.

I have checked in Snowflake with the query SHOW TABLES HISTORY IN <schema_name> that every day when the scenario runs, my table gets dropped and a new table gets created.
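For reference, this is what such a check can look like — a dropped-and-recreated table shows up in the history with a non-null `dropped_on` timestamp (the table and schema names here are placeholders):

```sql
-- List current and recently dropped copies of the table;
-- dropped copies have a non-null "dropped_on" column.
SHOW TABLES HISTORY LIKE 'MY_TABLE' IN SCHEMA my_db.my_schema;
```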

But my need is that every day when I run the scenario, the table's data gets refreshed/overwritten — not that the whole table gets dropped and a new one gets created.

I am using "Force rebuild datasets and dependencies", and only SQL recipes in the flow.

Is there any way I can avoid the drop of the table? Please let me know.

Thanks in advance.

AlexT
Dataiker

Hi,

You can use the option "truncate to clear data" instead, at the connection level. This avoids dropping the table, except when the schema of the table changes; in that case it would still be dropped and re-created.
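In Snowflake terms, the difference between the two rebuild behaviors is roughly the following (table name illustrative); because TRUNCATE keeps the table object, Time Travel queries against it should keep working:

```sql
-- With the "truncate" option, the rebuild is roughly:
TRUNCATE TABLE my_output_table;
INSERT INTO my_output_table SELECT /* recipe query */ ...;

-- Default behavior, which resets the table's Time Travel history:
-- DROP TABLE my_output_table;
-- CREATE TABLE my_output_table AS SELECT /* recipe query */ ...;

-- After a truncate, historical rows remain reachable, e.g.:
-- SELECT * FROM my_output_table AT(OFFSET => -3600);
```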

[Screenshot: dataset build settings showing the truncate option]

PK36313
Level 2
Author

Could you please tell me where I can find this option? It would be a great help. In my target table's advanced options I am not seeing it.

AlexT
Dataiker

Hi,

This option is available at the connection level, under Administration > Connections > your SQL connection name.

If you don't want this behavior for all other datasets on this connection, simply ask your admin to create a duplicate connection with this option enabled, and use that one here.

[Screenshot: connection settings showing the truncate option]

Marlan
Neuron

Hi @PK36313,

You could also use a SQL script recipe (rather than a SQL query recipe) to update the output table. In SQL script recipes, the output table is dropped only if you drop it in your script. So you'd simply not drop it, and instead insert (or delete and then insert) the desired rows into the output table. We do this all the time.
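A minimal sketch of such a SQL script recipe body — the table and column names are placeholders, and the SELECT stands in for whatever query produces the refreshed rows:

```sql
-- Refresh the output table without dropping it:
DELETE FROM my_output_table;   -- or: TRUNCATE TABLE my_output_table;
INSERT INTO my_output_table
SELECT id, amount, CURRENT_TIMESTAMP()
FROM my_input_table;
```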

Marlan