I'd like to programmatically access the SQL output generated by a recipe using the Python API client. Is that possible? UPDATE: I see this information is returned by /dip/api/flow/recipes/generic/get-status
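A minimal stdlib sketch of calling the internal endpoint mentioned above. Heavy caveats: the host, the authentication header, and the request field names (`projectKey`, `recipeName`) are assumptions for illustration, not a documented API, and internal endpoints may change between DSS versions.

```python
# Sketch only: building a request against the internal get-status endpoint.
# The base URL, auth scheme, and field names below are ASSUMPTIONS, not
# documented DSS API details.
import urllib.parse
import urllib.request

DSS_HOST = "http://localhost:11200"  # placeholder DSS base URL
ENDPOINT = "/dip/api/flow/recipes/generic/get-status"

def build_status_request(project_key, recipe_name, api_key):
    """Build (but do not send) a POST request for a recipe's status."""
    body = urllib.parse.urlencode(
        {"projectKey": project_key, "recipeName": recipe_name}  # assumed fields
    ).encode()
    req = urllib.request.Request(DSS_HOST + ENDPOINT, data=body)
    req.add_header("Authorization", "Bearer " + api_key)  # assumed auth scheme
    return req

req = build_status_request("MYPROJECT", "compute_my_table", "my-api-key")
# urllib.request.urlopen(req) would then return the JSON status payload.
```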
Hi all, The partition variables such as CURRENT_DAY are apparently available, but it is impossible to find out how to get 1_DAYS_BEFORE to work in SQL recipes for manual partition specification. The log sh…
Hi there, I have one question related to global variables in DSS. Namely, is it possible to define a list as a global variable and then use it in an SQL statement? What I am thinking of is something l…
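One common pattern for the question above is to keep the list as a Python variable and expand it into a SQL `IN (...)` clause before running the query. A minimal sketch; the helper name and the quoting convention (doubling single quotes) are illustrative, not a DSS API:

```python
# Hypothetical helper: expand a Python list (e.g. a list stored as a DSS
# global variable) into a SQL IN (...) predicate.
def to_sql_in_clause(values):
    """Quote each value (doubling embedded single quotes) and join them."""
    quoted = ", ".join("'" + str(v).replace("'", "''") + "'" for v in values)
    return "(" + quoted + ")"

categories = ["books", "toys", "o'reilly"]  # example list variable
query = "SELECT * FROM sales WHERE category IN " + to_sql_in_clause(categories)
```

Note that string interpolation like this is only safe for trusted, internally defined variables; user-supplied values should go through parameterized queries instead.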
Hello, I am having trouble with one specific project in DSS: I cannot start SQL jobs or notebooks in it, and we also cannot start any Python jobs, but Jupyter notebooks are fine. Everything works fine in …
Can someone provide a high-level overview of how DSS uploads data to Teradata? Is it just submitting a batch of inserts, or does it use Teradata utilities such as FastLoad or MultiLoad? I need to move ~7…
If a dataset is imported into Dataiku using "SQL query" as the mode, it is not possible to use SQL as the execution engine for further operations, even if all the tables involved are in the same SQL database: I…
I'm trying to create a SQL script that has two statements: 1) calling a procedure that drops a table if it exists, and 2) creating a new table. I'm getting "[Error 3932] [SQLState 25000] Only an ET or null s…
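On Teradata, error 3932 typically means a DDL statement was not immediately committed (an ET, "end transaction") before the next statement ran. The usual workaround is to execute each DDL statement in its own transaction with a commit in between. A generic DB-API sketch of that pattern, using sqlite3 as a stand-in for a Teradata connection (an assumption made purely so the example is runnable):

```python
# Generic DB-API sketch: commit after each DDL statement so the next one
# starts in a fresh transaction (the equivalent of Teradata's ET).
# sqlite3 stands in for a real Teradata connection here.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

statements = [
    "DROP TABLE IF EXISTS my_table",       # statement 1: conditional drop
    "CREATE TABLE my_table (id INTEGER)",  # statement 2: recreate the table
]
for stmt in statements:
    cur.execute(stmt)
    conn.commit()  # commit each DDL separately to avoid error 3932
```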
Hi, As per the latest DSS documentation (https://doc.dataiku.com/dss/latest/apinode/endpoint-sql-query.html), the SQL API endpoint on Hive is not supported. This also means the Lookup API endpoint on a Hive Dataset…