According to the documentation (https://doc.dataiku.com/dss/latest/connecting/sql/bigquery.html#writing-data-into-bigquery), the insert method is not optimal for adding data to a BigQuery (BQ) table. How can I make an INSERT with a SQL Query recipe?
The idea is the following:
A table is obtained from a datacenter through a query. This table has rows with specific dates. Then, this table is sent to BQ (let's call it table A).
I want to INSERT the rows of table "A" into another BQ table (table "B") using a SQL Query recipe. We want to use the INSERT method because we don't want to process the data in table "B"; we want to keep it exactly as it is.
I already tried to do an INSERT in a recipe, but it returned an error:
Job failed: com.dataiku.dss.shadelib.com.google.cloud.bigquery.BigQueryException: Syntax error: Unexpected INSERT target name at [3:1], caused by: BigQueryException: Syntax error: Unexpected INSERT target name at [3:1], caused by: GoogleJsonResponseException: 400 Bad Request POST https://www.googleapis.com/...
I tested the INSERT query from my recipe in the BigQuery environment and it worked, so this isn't a syntax error.
I would suggest you just use BigQuery directly via the standard BigQuery Python API:
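Here's a minimal sketch of what that could look like, assuming the google-cloud-bigquery package is installed and credentials are available (e.g. via GOOGLE_APPLICATION_CREDENTIALS); the project, dataset, and table names are placeholders:

```python
from google.cloud import bigquery

# Hypothetical project id -- replace with your own.
client = bigquery.Client(project="my-project")

# INSERT ... SELECT copies the rows of table A into table B as-is,
# without any transformation.
sql = """
INSERT INTO `my-project.my_dataset.table_b`
SELECT * FROM `my-project.my_dataset.table_a`
"""

job = client.query(sql)  # submit the DML statement as a query job
job.result()             # block until the job finishes; raises on error
print(f"Job {job.job_id} completed.")
```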
Dataiku doesn't support this directly, but at the end of the day it's Python code, so it's as good as any other Python code you write. I would agree with Dataiku that this won't be the fastest way to insert rows, but for small datasets it should be fine.
Have you tried using a SQL Script recipe? I don't think a SQL Query recipe would work for INSERTs, but a SQL Script recipe should.
We do INSERTs all the time in SQL Script recipes in Snowflake.
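For illustration, the body of such a SQL Script recipe for this use case might look like the sketch below; the project, dataset, and table names are hypothetical:

```sql
-- Hypothetical identifiers: replace project, dataset, and table
-- names with your own before running.
INSERT INTO `my-project.my_dataset.table_b`
SELECT *
FROM `my-project.my_dataset.table_a`;
```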