Dear Dataiku Community,
I am relatively new to Dataiku. I have been searching the internet for an answer to what seems like a very straightforward question, but I haven't found a satisfying one.
Imagine the following scenario: you have several data sources, you build a model, and you end up with a scoring of your customers. These results now need to be published to a relational database.
A batch script runs every month at the same time, reading the newest data and scoring the customers.
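To make the "read the newest data" part of that monthly step concrete, here is a minimal sketch in pandas; all table and column names (`customer_id`, `loaded_at`, `spend`) are hypothetical, and in a real flow this filtering would more likely live in a Dataiku recipe:

```python
import pandas as pd

# Hypothetical raw customer data with a load timestamp.
raw = pd.DataFrame({
    "customer_id": [1, 2, 1, 3],
    "loaded_at": pd.to_datetime(
        ["2024-01-05", "2024-01-07", "2024-02-03", "2024-02-04"]
    ),
    "spend": [120.0, 80.0, 150.0, 60.0],
})

# Keep only the records from the most recent month before scoring.
latest_month = raw["loaded_at"].dt.to_period("M").max()
newest = raw[raw["loaded_at"].dt.to_period("M") == latest_month]

print(sorted(newest["customer_id"]))  # [1, 3]
```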
However, I still find it difficult to understand how the publishing process works in Dataiku when it is a batch script rather than an API call.
I would be very happy if someone could point me to a tutorial on this topic or explain it directly here.
I wish you all a nice day,
Welcome to the Dataiku community. We are glad to have you here.
Take a look at Scenarios. This content from the Dataiku Academy might help.
Note that this feature is not available in the free edition of DSS.
Thank you for your response to my question. I had a look at the mentioned "Automation course". However, I could not find an explanation of how to write a model's predictions to an external data source such as DB2, PostgreSQL, or even Hadoop.
If I had to guess, I would need to create a custom Python script that, based on a triggering event, takes the output of a model, and then write my own prepared statements against the JDBC driver from Python to write the data back into DB2. Is this correct?
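For what it's worth, a self-contained sketch of that publishing idea is below. All table and column names are hypothetical; an in-memory SQLite database (via SQLAlchemy) stands in for DB2 or PostgreSQL so the example runs anywhere, and pandas' `to_sql` issues the inserts, so no hand-written prepared statements are needed:

```python
import pandas as pd
from sqlalchemy import create_engine

# Stand-in for the monthly scoring output. In Dataiku this DataFrame
# would typically come from the scored dataset instead.
scored = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "score": [0.92, 0.35, 0.71],
})

# In a real deployment the connection string would point at DB2 or
# PostgreSQL (e.g. "postgresql://user:pw@host/db"); SQLite keeps the
# sketch runnable without a database server.
engine = create_engine("sqlite://")

# to_sql generates and executes the INSERT statements for us.
scored.to_sql("customer_scores", engine, if_exists="replace", index=False)

# Verify the round trip.
back = pd.read_sql("SELECT customer_id, score FROM customer_scores", engine)
print(len(back))  # 3
```

That said, in Dataiku itself the more common route is to define the target database as a connection and let a Sync or SQL recipe in the Flow (run by a Scenario) do the writing, rather than scripting JDBC by hand.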