
Python SQL INSERT statement query with variables

jordan_sklansky
Level 2

I pulled this example from an old post, and also tried following the example from

https://knowledge.dataiku.com/latest/code/python/tutorial-sql-in-python-recipe.html#create-the-final...

This example without variables works fine!

from dataiku import SQLExecutor2
sqlcon = SQLExecutor2(dataset="output_table")

sql_str = """INSERT INTO "public"."output_table" (col_one, col_two) VALUES('value_one', 'value_two')"""

sqlcon.query_to_df(sql_str, post_queries=['COMMIT'])


I then tried to insert the values as variables instead:


from dataiku import SQLExecutor2
sqlcon = SQLExecutor2(dataset="output_table")

sqlcon.query_to_df("""
INSERT INTO "public"."output_table" (col_one, col_two) VALUES(%s, %s);
"""
% (variable_one, variable_two),
post_queries=['COMMIT'])


However, I get an error pointing at the post_queries=['COMMIT'] line. Any ideas?

4 Replies
Turribeach

Can you please post the full error trace?

jordan_sklansky
Level 2
Author

Sure! I used the generic example above; this trace is from my specific use case, but it's the same result:

---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
<ipython-input-12-ecc7922dcc07> in <module>
      7     """
      8     % (test_sso, 'Jane Doe', '123', 'Test Pym', 'Control Title Holder', 'CTH'),
----> 9   post_queries=['COMMIT'])
     10 print("Data updated Successfully")

/u01/dataiku/dataiku-dss-12.5.1/python/dataiku/core/sql.py in query_to_df(self, query, pre_queries, post_queries, extra_conf, infer_from_schema, parse_dates, bool_as_str, dtypes, script_steps, script_input_schema, script_output_schema)
    242         if not self._iconn:
    243             raise ValueError("SQLExecutor2 not configured: either 'connection' or 'dataset' are required (in SQLExecutor2 constructor)")
--> 244         return _streamed_query_to_df(self._iconn, query, pre_queries, post_queries, self._find_connection_from_dataset, "sql", extra_conf, infer_from_schema=infer_from_schema, parse_dates=parse_dates, bool_as_str=bool_as_str, dtypes=dtypes, script_steps=script_steps, script_input_schema=script_input_schema, script_output_schema=script_output_schema)
    245 
    246     def query_to_iter(self, query, pre_queries=None, post_queries=None, extra_conf={}, script_steps=None, script_input_schema=None, script_output_schema=None):

/u01/dataiku/dataiku-dss-12.5.1/python/dataiku/core/sql.py in _streamed_query_to_df(connection, query, pre_queries, post_queries, find_connection_from_dataset, db_type, extra_conf, infer_from_schema, parse_dates, bool_as_str, dtypes, script_steps, script_input_schema, script_output_schema)
     32     # initiate the streaming (blocks until the database says it's ready to return values)
     33     streamingSession = jek_or_backend_json_call("sql-queries/start-streaming",
---> 34         data = data)
     35 
     36     logger.info("Got initial SQL query response")

/u01/dataiku/dataiku-dss-12.5.1/python/dataiku/core/intercom.py in jek_or_backend_json_call(path, data, err_msg, **kwargs)
    470         return jek_json_call(path, data, err_msg, **kwargs)
    471     else:
--> 472         return backend_json_call(path, data, err_msg, **kwargs)
    473 
    474 def lambda_or_backend_json_call(path, data=None, err_msg=None, **kwargs):

/u01/dataiku/dataiku-dss-12.5.1/python/dataiku/core/intercom.py in backend_json_call(path, data, err_msg, **kwargs)
    458 
    459 def backend_json_call(path, data=None, err_msg=None, **kwargs):
--> 460     return _handle_json_resp(backend_api_post_call(path, data, **kwargs), err_msg = err_msg)
    461 
    462 def jek_json_call(path, data=None, err_msg=None, **kwargs):

/u01/dataiku/dataiku-dss-12.5.1/python/dataiku/core/intercom.py in _handle_json_resp(resp, err_msg)
    557         err_data = resp.text
    558         if err_data:
--> 559             raise Exception("%s: %s" % (err_msg, _get_error_message(err_data).encode("utf8")))
    560         else:
    561             raise Exception("%s: %s" % (err_msg, "No details"))

Exception: None: b'During query \'\n    INSERT INTO eng_analytics_eng.etcr_pyrm_updated (candidate_sso, full_name_per, time_on_job_mo, primary_pyramid_name, cth_level_name, cth_level_short)\n    VALUES (98745, Jane Doe, 123, Test Pym, Control Title Holder, CTH);\n    \': ERROR: syntax error at or near "Doe"\n  Position: 180, caused by: RedshiftException: ERROR: syntax error at or near "Doe"\n  Position: 180'

You are missing quotes or double quotes around your string columns, whichever Redshift requires. You can test your insert in a Dataiku SQL Notebook first to be sure of the syntax.
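To illustrate with the generic table from the first post (values are placeholders): Python's % operator does plain text substitution, so string values only arrive quoted if the quotes are in the template itself:

```python
# Sketch: with %-formatting, string values need their own SQL quotes
# in the template ('%s', not bare %s), or Redshift sees Jane Doe unquoted.
sso = 98745          # placeholder value
name = "value_two"   # placeholder value

sql_str = """INSERT INTO "public"."output_table" (col_one, col_two)
VALUES (%s, '%s');""" % (sso, name)

print(sql_str)
```

The rendered statement now contains VALUES (98745, 'value_two'), which matches the working no-variables example above.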

jordan_sklansky
Level 2
Author

Hm, I'm not sure it's a quote issue. I can use this with psycopg and it works no problem... I was converting this over to use SQLExecutor2.

Here is the code I'm trying to execute; if you think there needs to be a change, can you help me identify it?

test_sso = 98745

sqlcon.query_to_df("""
    INSERT INTO eng_analytics_eng.etcr_pyrm_updated (candidate_sso, full_name_per, time_on_job_mo, primary_pyramid_name, cth_level_name, cth_level_short)
    VALUES (%s, %s, %s, %s, %s, %s);
    """
    % (test_sso, 'Jane Doe', '123', 'Test Pym', 'Control Title Holder', 'CTH'),
  post_queries=['COMMIT'])
print("Data updated Successfully")
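For comparison, here is a rough sketch (table and column names are placeholders) of why the same %s template behaves differently in the two paths: psycopg passes the values separately to execute(), and the driver quotes and escapes them, whereas the % operator just splices the raw text into the string:

```python
# Rough sketch: identical %s template, two very different behaviors.
query = "INSERT INTO t (sso, name) VALUES (%s, %s);"

# psycopg2 style: values go to execute() as a separate tuple, and the
# driver adds quoting/escaping, so 'Jane Doe' reaches the server quoted.
#   cur.execute(query, (98745, "Jane Doe"))

# Python %-formatting (what the query_to_df snippet above does): plain
# text substitution, so the string lands in the SQL unquoted.
rendered = query % (98745, "Jane Doe")
print(rendered)  # INSERT INTO t (sso, name) VALUES (98745, Jane Doe);
```

That unquoted Jane Doe is exactly what the Redshift error above is complaining about.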

 

