Reading Datasets in a Python Recipe without declaring them as inputs
I need to combine data from several datasets (from different workflows) in a Python Recipe:
WF1.mydata_1
WF2.mydata_2
and so on (new WFs will appear over time).
I don't know in advance how many WFs there will be, so I cannot add each of them in the recipe's Input/Output menu.
I need Python code that can import the datasets no matter how many there are.
Answers
Turribeach (Neuron)
What you are trying to do goes against a core concept of Dataiku: every transformation step (i.e. recipe) must have its inputs and outputs defined so it can be displayed in the flow. You can write free-style Python code that defines its inputs and outputs dynamically, but you would have to run that code elsewhere. There are, however, several ways to work around Dataiku's guard rails. Before suggesting any of them, I think you should clarify your actual requirement. What exactly are you trying to achieve? Why do you think you need dynamic inputs? When you say different "workflows", what exactly do you mean? Is that another Dataiku project's flow? Where are these datasets stored? Do they all have the same structure?
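For reference, here is a minimal sketch of the "free-style" approach inside a Python recipe, using the Dataiku Python API to discover and read datasets dynamically. It assumes the source projects use a "WF" key prefix and a "mydata_" dataset name prefix (taken from the question), that those datasets have been shared/exposed to the project where the recipe runs, and that the recipe still has one declared output dataset (here called combined_output, an illustrative name). Whether this is a good idea depends on the clarifications above.

```python
# Sketch only: dynamically read datasets from several projects and combine them.
# Assumptions: project keys start with "WF", dataset names start with "mydata_",
# the datasets are exposed to the current project, and "combined_output" is the
# recipe's single declared output dataset.
import pandas as pd
import dataiku

client = dataiku.api_client()

# Discover all projects whose key starts with "WF" (assumed naming convention).
wf_project_keys = [k for k in client.list_project_keys() if k.startswith("WF")]

frames = []
for project_key in wf_project_keys:
    project = client.get_project(project_key)
    for ds in project.list_datasets():
        # list_datasets() returns dicts or list items depending on DSS version.
        name = ds["name"] if isinstance(ds, dict) else ds.name
        if not name.startswith("mydata_"):
            continue
        # Read the dataset from the other project (it must be shared with this one).
        df = dataiku.Dataset("%s.%s" % (project_key, name)).get_dataframe()
        df["source_project"] = project_key  # keep track of where each row came from
        frames.append(df)

combined = pd.concat(frames, ignore_index=True)

# Write everything to the single declared output of the recipe.
output = dataiku.Dataset("combined_output")
output.write_with_schema(combined)
```

Note that datasets read this way do not appear as inputs in the flow, so lineage, scenario triggers and flow-based rebuilds will not know about them, which is exactly the trade-off being discussed here.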