Appending to table keeps overwriting

This is my first time writing into existing tables rather than just creating them. Thank you for your time in helping me!
I have a scenario that looks for the oldest record in table B on Server B and then pulls the new records from table A on Server A. My intent is to append those new records to table B.
Table A has 4 columns. Table B has the same 4 plus 5 others. I want to append the new records mapped to the 4 shared columns and leave the other 5 columns blank.
I successfully built a scenario using "Execute SQL", "Set Project Variables", and "Build" steps to pull the new records. My issue is appending those records.
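For context, the query my scenario builds looks roughly like this (the table and column names are made up, and `${last_loaded_ts}` is a hypothetical project variable set from Table B by the "Execute SQL" and "Set Project Variables" steps):

```sql
-- Pull only the rows from Table A that are newer than the
-- watermark stored in the ${last_loaded_ts} project variable
SELECT col1, col2, col3, col4
FROM table_a
WHERE load_ts > '${last_loaded_ts}'
```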
To append the records, I'm using the Stack recipe. In the "Output" settings I have the "Append instead of overwrite" box checked. For the recipe's "Settings" > "Column Selection" (and here is where I think my problem could be), I'm using "Manually Remapping".
The result is that all the original records in Table B are overwritten and the schema is redefined to just the 4 columns. I want to append only the new records into the 4 shared columns and leave the other 5 columns blank.
What am I doing wrong?
Operating system used: Windows
Best Answer
Turribeach
By default, Dataiku will drop the target table if its schema differs from the schema the recipe is trying to insert. This is problematic for "historical" tables loaded in append mode: a lot of users are caught by this behavior and lose their data when the schema changes and Dataiku drops the target table. Dataiku has finally changed this behavior in v13.1.
However, this doesn't solve your problem, since it merely makes the recipe fail when the schemas differ rather than dropping the target table. To do what you want, you have two options:
- Use SQLExecutor2 in a Python recipe:
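Here is a minimal sketch of that option. It assumes the new rows from Server A have already been staged in a table reachable from the Server B connection (the connection, table, and column names below are placeholders), and it runs the INSERT as a post-query because `query_to_df` itself expects a SELECT:

```python
from dataiku import SQLExecutor2

# "server_b" is a placeholder for your Server B connection name
executor = SQLExecutor2(connection="server_b")

# Insert only the 4 shared columns; the 5 extra columns in table_b
# are left to their defaults (typically NULL)
insert_sql = """
    INSERT INTO table_b (col1, col2, col3, col4)
    SELECT col1, col2, col3, col4
    FROM table_a_staged
"""

# query_to_df runs a SELECT; the INSERT executes as a post-query
# (some databases may need "SELECT 1 FROM DUAL" instead of "SELECT 1")
executor.query_to_df("SELECT 1", post_queries=[insert_sql])
```

This bypasses the Stack recipe entirely, so Dataiku never touches Table B's schema.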
- Use a SQL Script recipe:
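The SQL Script option is the same idea expressed directly in SQL (again with placeholder names); since only the 4 shared columns are listed, the other 5 fall back to NULL or their column defaults:

```sql
-- Append the staged rows from Server A into Table B,
-- touching only the 4 shared columns
INSERT INTO table_b (col1, col2, col3, col4)
SELECT col1, col2, col3, col4
FROM table_a_staged;
```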
Answers
Thank you! This saved me a lot of time. I ended up going down the SQL Script route and then pivoted to an alternative method.