-
Using Project Variable in Partitioned Recipe
I would like to use a Project Variable to explicitly state which partition of a recipe I would like to run. See the attached screenshot - I used the variable expansion syntax but DSS did not interpret it correctly. Is there a way to do this?
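A possible pattern, sketched under assumptions: resolve the variable in a Python scenario step and pass it as the partition spec, rather than relying on ${...} expansion in the recipe screen itself. The variable name run_partition and the dataset name are hypothetical.

    import dataiku
    from dataiku.scenario import Scenario

    # Resolve project variables; "run_partition" is a hypothetical variable
    # holding the partition identifier to build (e.g. "2021-06-01").
    partition_id = dataiku.get_custom_variables()["run_partition"]

    # Build exactly that partition of the recipe's output dataset.
    Scenario().build_dataset("my_partitioned_dataset", partitions=partition_id)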
-
get a handle for the scenario in a custom trigger
Hi, how can I access the scenario object in a custom Python trigger?

    import dataiku
    from dataiku.scenario import Trigger, Scenario

    t = Trigger()
    s = Scenario()
    # do something with the scenario, such as access its last runs
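If the goal is to inspect the scenario's past runs from inside the trigger, one hedged sketch is to go through the public API client rather than the Scenario step class (the scenario id is a placeholder):

    import dataiku
    from dataiku.scenario import Trigger

    t = Trigger()

    # Reach the scenario through the REST API client available inside DSS.
    client = dataiku.api_client()
    project = client.get_default_project()
    scenario = project.get_scenario("MY_SCENARIO_ID")  # hypothetical id

    # Inspect recent runs and decide whether to fire the trigger.
    last_runs = scenario.get_last_runs(limit=5)
    if last_runs:
        t.fire()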
-
Load different dataset based on parameter from same flow
Hi, I am working on a project where I have to create Test and Control datasets based on different date parameters, but the logic to create them is the same for both flows. I want to build only one flow so that I do not need to code the logic twice, yet load the results into two different output datasets. Could you please help me how to create…
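One common pattern, sketched with hypothetical names: drive the flow with a project variable and run it once per parameter set from a scenario, so the same recipes produce both results.

    import dataiku

    client = dataiku.api_client()
    project = client.get_default_project()

    # Run the same flow twice, once per date; the recipes read
    # "date_param" from the project variables to filter the data.
    for date_param in ["2021-01-01", "2021-02-01"]:
        variables = project.get_variables()
        variables["standard"]["date_param"] = date_param
        project.set_variables(variables)

        # "BUILD_FLOW" is a hypothetical scenario rebuilding the outputs.
        project.get_scenario("BUILD_FLOW").run_and_wait()

To land the two runs in two distinct datasets, one option is to partition the output dataset by date_param, or to sync each run's result into a per-run dataset at the end of the scenario.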
-
Writing into Managed Folder
Hi, I have a Spark dataframe which I'm converting into a pandas dataframe and then writing into a Managed Folder. This worked successfully before, but now we are getting the error below. Error: Exception: None: b'Early EOF' Any help would be appreciated. ERROR:root:Pipe to generator thread failed Traceback (most…
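"Early EOF" often indicates that the stream feeding DSS was closed before all bytes were flushed. A minimal sketch, assuming a managed folder named my_folder and the Spark dataframe from the question: write through the folder writer as a context manager so the stream is closed cleanly.

    import dataiku

    folder = dataiku.Folder("my_folder")  # hypothetical folder name

    # Convert to pandas, then stream the CSV bytes into the managed folder.
    pdf = spark_df.toPandas()  # spark_df: the Spark dataframe in question
    with folder.get_writer("output.csv") as writer:
        writer.write(pdf.to_csv(index=False).encode("utf-8"))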
-
[Processing tests] Delete a bundle, clean exported_bundles list.
Hi, I am developing a Python lib for bundle exporting. I am implementing tests on the bundle exporting process and I face a problem concerning the bundle ids. For each test session, I start by deleting the test project: project_test_1_automation = automation.client.get_project('PROJECT_TEST_1')…
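There may be no public call to delete a single exported bundle, so one hedged workaround for tests is to derive a fresh, non-colliding bundle id from the ones already listed on the design node. Connection details and the id scheme are hypothetical, and the exact shape of the dict returned by list_exported_bundles is an assumption.

    import dataikuapi

    # Hypothetical connection to the design node that owns the bundles.
    design = dataikuapi.DSSClient("https://design-node:11200", "DESIGN_API_KEY")
    project = design.get_project("PROJECT_TEST_1")

    # List bundle ids already exported, then pick one that does not collide.
    existing = [b["bundleId"] for b in project.list_exported_bundles()["bundles"]]
    bundle_id = "test_bundle_%d" % (len(existing) + 1)
    project.export_bundle(bundle_id)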
-
Scenario with parameters
Hello, I've found in the documentation that I can execute a scenario with parameters (https://doc.dataiku.com/dss/latest/python-api/rest-api-client/scenarios.html) by doing something like scenario.run({'x': y}), but is there a way to run a scenario from another scenario, passing execution parameters (in my case, a date)? Thank you
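A hedged sketch: from a Python step in the calling scenario, launch the other scenario through the API client with run parameters; in the called scenario these should surface as trigger params. The scenario id and parameter name are hypothetical, and get_trigger_params exposing the run parameters is an assumption.

    import dataiku
    from dataiku.scenario import Scenario

    # --- In a Python step of the calling scenario ---
    client = dataiku.api_client()
    project = client.get_default_project()
    project.get_scenario("CHILD_SCENARIO").run(params={"run_date": "2021-06-01"})

    # --- In a Python step of the called scenario ---
    run_date = Scenario().get_trigger_params().get("run_date")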
-
Create code environment via cmdline
Hi, is it possible to create a new code environment in an automated way? I tried to tar-gzip an existing code environment and move it to the new DSS installation, but it did not work. Thanks
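A sketch using the Python REST API client instead of copying files between installations; the host, API key, environment name, and package list are placeholders:

    import dataikuapi

    client = dataikuapi.DSSClient("https://dss-host:11200", "API_KEY")

    # Create a managed Python code environment on the target instance.
    client.create_code_env("PYTHON", "my_env", "DESIGN_MANAGED")

    # Declare the packages and trigger their installation.
    env = client.get_code_env("PYTHON", "my_env")
    definition = env.get_definition()
    definition["specPackageList"] = "pandas==1.0.5\nrequests"
    env.set_definition(definition)
    env.update_packages()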
-
How to automate multiple runs of a flow with different source file each time
Hi, I have a flow that I want to run multiple times (manually triggered), and each run requires a new source file. I'd like to load the files into a folder (file_A, file_X, file_c etc.) and then kick off the flow to automatically run through each file in turn, so run_1 uses file_A and once complete it starts again with run_2…
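One hedged way to do this is a Python scenario step that loops over the folder contents, pointing a project variable at each file before rebuilding the flow. The folder, variable, and dataset names are hypothetical; the source dataset would pick up ${source_file} in its path or filter settings.

    import dataiku
    from dataiku.scenario import Scenario

    scenario = Scenario()
    folder = dataiku.Folder("incoming_files")  # hypothetical input folder

    client = dataiku.api_client()
    project = client.get_default_project()

    # One full build per source file, in turn.
    for path in folder.list_paths_in_partition():
        variables = project.get_variables()
        variables["standard"]["source_file"] = path
        project.set_variables(variables)
        scenario.build_dataset("final_output")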
-
How to generate a new Dataset from a custom recipe
Hello, I'm trying to generate a new Dataset from a custom Python recipe (inside a plugin) but I keep getting a disk error (DSS can't find the Dataset, although I can see it in the flow). I think that I'm not "linking" the Dataset with the real file correctly. How can I do that? Thanks! My python code is: output_path =…
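A minimal sketch of the usual pattern in a plugin recipe: resolve the output dataset from its declared role and let DSS manage the underlying storage, instead of writing to a file path directly. Here "main_output" must match the output role declared in the recipe's recipe.json, and my_dataframe stands for the recipe's result.

    import dataiku
    from dataiku.customrecipe import get_output_names_for_role

    # Resolve the dataset the user mapped to the recipe's output role.
    output_name = get_output_names_for_role("main_output")[0]
    output_dataset = dataiku.Dataset(output_name)

    # Write schema + data; DSS takes care of the underlying files.
    output_dataset.write_with_schema(my_dataframe)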
-
How to automate uploading local files to Dataiku dataset?
Use case: you want to sync a local file with a Dataiku dataset.
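A hedged sketch using the public API client from the local machine, assuming the target is an "uploaded files" dataset; the host, API key, project, dataset, and file path are all placeholders:

    import dataikuapi

    client = dataikuapi.DSSClient("https://dss-host:11200", "API_KEY")
    project = client.get_project("MY_PROJECT")
    dataset = project.get_dataset("my_uploaded_dataset")

    # Push the local file's current content into the uploaded dataset;
    # a scenario can then rebuild everything downstream of it.
    with open("/path/to/local/file.csv", "rb") as f:
        dataset.uploaded_add_file(f, "file.csv")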