I'm following the tutorial found here https://www.dataiku.com/learn/guide/code/python/use-python-sql.html but have a couple of additional questions.
1. Is it possible to submit multiple statements with SQLExecutor2.exec_recipe_fragment()? My script creates several volatile tables before constructing the final table, and I want to use exec_recipe_fragment to run the entire SQL script in-database.
2. Is it possible to run exec_recipe_fragment without specifying an input source? I'm able to start a SQL - SQL script recipe without specifying an input dataset, so it seems like I should be able to accomplish the same here.
I'm essentially trying to recreate the functionality of the SQL - SQL script recipe, but with additional error handling coded in Python.
Pseudocode of what I'm trying to accomplish:
import dataiku
import pandas as pd, numpy as np
from dataiku.core.sql import SQLExecutor2
executor = SQLExecutor2(connection="Tera_conn")
sql = """
CALL TERA_DB.conditional_drop('TERA_DB', 'ProjKey_TABLE_TO_CREATE', outmsg);
CREATE MULTISET TABLE TERA_DB.ProjKey_TABLE_TO_CREATE
, NO FALLBACK, NO BEFORE JOURNAL, NO AFTER JOURNAL AS
(
SELECT top 1 * from TERA_DB.OTHER_TABLE
)
WITH DATA
PRIMARY INDEX(dw_acct_key);
"""
# Goal: run the whole script in-database in one call, without referencing an input dataset
SQLExecutor2.exec_recipe_fragment(query=sql)
Hi,
1. You can define the statements that create your temp tables as a list and pass them in the `pre_queries` argument of `exec_recipe_fragment` (see the sketch after this list). More information here: https://doc.dataiku.com/dss/latest/python-api/partial.html
2. Unlike with a SQL script recipe, I think you will need an input dataset here; otherwise you will get a `Connection not specified` error.
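For reference, here is a minimal sketch of the pattern from point 1. It assumes the Python recipe has an input dataset on the Teradata connection and an output dataset that DSS fills with the result of the final query; the dataset names and SQL below are placeholders based on your example, not tested code.

```python
import dataiku
from dataiku.core.sql import SQLExecutor2

# Placeholder dataset names: the recipe's input must live on the
# Teradata connection so DSS knows where to run the SQL (point 2 above).
output_dataset = dataiku.Dataset("TABLE_TO_CREATE")

# Statements that set up the intermediate volatile tables go in
# pre_queries; each element of the list is run before the main query.
pre_queries = [
    """CREATE VOLATILE TABLE tmp_step1 AS (
           SELECT * FROM TERA_DB.OTHER_TABLE
       ) WITH DATA ON COMMIT PRESERVE ROWS""",
]

# The result of the main query is written by DSS into the output
# dataset, so there is no explicit CREATE TABLE for the final table.
SQLExecutor2.exec_recipe_fragment(
    output_dataset,
    "SELECT * FROM tmp_step1",
    pre_queries=pre_queries,
)
```

With this approach DSS manages the output table itself, so the `conditional_drop` / `CREATE MULTISET TABLE` steps should not be needed, and your Python error handling can simply wrap the `exec_recipe_fragment` call.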
Cheers,
Du Phan