Hi, I'm using Dataiku 12.5.2 at my workplace, and I've found that keeping the design node and the automation node synchronized is quite hard. Minor debugging fixes that come up during ingestion are all applied on the automation node, while architectural modifications are built and tested on the design node. As time goes on, differences…
Thanks in advance for your time. I have a project and I want to know which datasets are shared in from other projects (the ones with black icons) using the Python API. I looked through previous Q&As and only found a way to list the datasets that are shared out to other projects: def find_exposed_datasets(project): result = [] raw =…
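One way to invert the "shared-out" lookup is to scan every other project's exposed objects and keep those whose rules target your project. This is a minimal sketch, assuming the `exposedObjects` section of a project's raw settings has the shape shown in the docstring (verify against your instance); `find_shared_in_datasets` shows hypothetical usage with a `dataikuapi` client.

```python
def shared_into(exposed_raw, target_project_key):
    """Return dataset names exposed to target_project_key.

    `exposed_raw` is assumed to look like the 'exposedObjects' section of a
    project's settings: {"objects": [{"type": "DATASET", "localName": ...,
    "rules": [{"targetProject": ...}, ...]}, ...]}
    """
    found = []
    for obj in exposed_raw.get("objects", []):
        if obj.get("type") != "DATASET":
            continue  # skip exposed recipes, folders, etc.
        for rule in obj.get("rules", []):
            if rule.get("targetProject") == target_project_key:
                found.append(obj["localName"])
    return found


def find_shared_in_datasets(client, target_project_key):
    """Scan all projects on the instance (hedged sketch using dataikuapi)."""
    result = []
    for key in client.list_project_keys():
        if key == target_project_key:
            continue
        settings = client.get_project(key).get_settings().get_raw()
        exposed = settings.get("exposedObjects", {})
        for name in shared_into(exposed, target_project_key):
            result.append((key, name))  # (source project, dataset name)
    return result
```

Scanning all projects can be slow on a large instance, so you may want to cache the result rather than call it per request.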
I am working in Dataiku and need to create a Flask API inside a WebApp. However, I am encountering a 404 error: specifically, when I add an additional input parameter to a GET request, the API returns a 404, although other APIs work correctly. Could you please help me understand the cause and how to…
I am working on a Dataiku WebApp and facing an issue with the Flask API. When I create a new API inside the WebApp, I get a 404 error when calling it. Also, if I add a new input parameter to an existing GET API, it starts returning a 404 as well. Could you please help me understand why this is happening and how to…
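A frequent cause of this 404 is the difference between path parameters and query parameters. Adding a path parameter (`/items/<item_id>`) changes the URL the client must call, so the old URL stops matching, while a query parameter (`/items?item_id=7`) never affects routing; a Dataiku WebApp backend also generally needs a restart before newly added routes are registered (assumption based on how the embedded backend loads code). A minimal sketch of the two styles:

```python
from flask import Flask, request, jsonify

# In a Dataiku WebApp backend the `app` object is provided for you;
# it is created here only so the sketch runs standalone.
app = Flask(__name__)

@app.route("/items")
def items_by_query():
    # Query parameters never alter routing; a missing one is simply None.
    item_id = request.args.get("item_id")
    return jsonify({"item_id": item_id, "via": "query"})

@app.route("/items/<item_id>")
def items_by_path(item_id):
    # Path parameters must appear in the route rule, or Flask answers 404.
    return jsonify({"item_id": item_id, "via": "path"})
```

With this layout, `/items?item_id=7` and `/items/7` both resolve, but a client still calling a URL that no longer matches any rule gets the 404 described in the post.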
Hi Dataiku users, I'm wondering what the best practice is when a knowledge bank is used as one of the data sources for an API endpoint service. Currently the knowledge bank is still on the design node and the API is on the API node, which sits on a different server. I notice that sometimes the API response…
Currently I use Redshift as the source dataset and an Export to Folder recipe to export an Excel file to SharePoint; the destination dataset points at SharePoint. When I run the job, it succeeds if the Excel file is under 50 MB, but above 50 MB it reports the issue below: Oops: an unexpected error occurred Error in Python…
Hey everyone, I'm new here. I'm facing a weird issue with the Window recipe while trying to compute rolling transaction metrics, specifically sum and count over 1-day, 7-day, and 30-day periods. Here's my setup: Partitioned by cc_num; Ordered by trans_date_trans_time; Using a window frame (either limited by a range in days/hours…
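To sanity-check what a range-limited frame should produce, here is a pure-Python sketch of the "last N days including the current row" semantics, using a two-pointer sweep over rows sorted by timestamp. Column names (`trans_date_trans_time`, amounts) follow the post; the sample data is made up.

```python
from datetime import datetime, timedelta

def rolling_range(rows, window):
    """For each (ts, amount) row, return (sum, count) over [ts - window, ts].

    `rows` must be sorted by timestamp, as the recipe's ORDER BY would do.
    """
    out = []
    start = 0            # left edge of the window (two-pointer sweep)
    running_sum = 0.0
    for i, (ts, amount) in enumerate(rows):
        running_sum += amount
        # Evict rows that fell out of the time range.
        while rows[start][0] < ts - window:
            running_sum -= rows[start][1]
            start += 1
        out.append((running_sum, i - start + 1))
    return out

txns = [
    (datetime(2024, 1, 1), 10.0),
    (datetime(2024, 1, 3), 20.0),
    (datetime(2024, 1, 9), 5.0),
]
print(rolling_range(txns, timedelta(days=7)))
# → [(10.0, 1), (30.0, 2), (25.0, 2)]  (Jan 1 drops out of the 7-day window by Jan 9)
```

In the real setup you would run this per `cc_num` partition; comparing its output against the Window recipe's helps pin down whether the frame is actually range-based or row-based.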
I work on data fed daily and am trying to find a way in Dataiku to do an upsert: update existing records and insert new records at the same time. What are the different ways to do this, please?
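One common Flow-level pattern is "stack then deduplicate": append the daily feed to the existing table, then keep only the newest row per business key (on SQL targets, a SQL recipe or post-write `MERGE`/`DELETE`+`INSERT` statement achieves the same thing). A minimal sketch of the keep-latest-per-key logic, with made-up record and column names:

```python
def upsert(existing, incoming, key):
    """Merge two lists of dict records; incoming rows win on key collisions."""
    merged = {row[key]: row for row in existing}   # index current state by key
    for row in incoming:
        merged[row[key]] = row                     # update or insert
    return list(merged.values())

current = [{"id": 1, "status": "old"}, {"id": 2, "status": "ok"}]
daily   = [{"id": 1, "status": "new"}, {"id": 3, "status": "ok"}]
print(sorted(upsert(current, daily, "id"), key=lambda r: r["id"]))
# → id 1 is updated, id 2 kept, id 3 inserted
```

The same idea in the Flow is a Stack recipe followed by a Distinct/Group recipe that keeps the row with the latest load date per key.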
Please advise how to set up a formula that outputs a null value instead of an empty string in the database (we use Redshift). Simple example, ColumnB: if(isNull(ColumnA), "", ColumnA). What do I use instead of the double quotes?
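If the transformation can happen on the SQL side instead of in a formula, the standard trick is `NULLIF(col, '')`, which yields a real NULL whenever the column holds an empty string; `NULLIF` is standard SQL and Redshift supports it. Demonstrated here through sqlite3 so the snippet runs anywhere:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (ColumnA TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("",), ("value",), (None,)])

# NULLIF(x, y) returns NULL when x = y, else x: empty strings become NULL.
rows = conn.execute("SELECT NULLIF(ColumnA, '') FROM t").fetchall()
print(rows)   # empty string and NULL both come back as None in Python
```

In a Dataiku SQL recipe the equivalent would be `SELECT NULLIF(ColumnA, '') AS ColumnB FROM …` against the Redshift table.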