For example, I can find the function train_model(model_id, project_key=None, build_mode='RECURSIVE_BUILD', step_name=None, async=False, fail_fatal=True). But how can I know what kind of object this function returns, and how can I use that returned value? From the sample code, I found function…
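One generic way to answer "what does this return?" from a notebook, without relying on the docs, is plain Python introspection. The sketch below uses a placeholder object standing in for whatever train_model(...) actually returned:

```python
# `result` stands in for the return value of train_model(...);
# the same introspection works on any Python object.
result = {"example": 1}  # placeholder object

print(type(result).__name__)  # concrete class of the returned value

# Public attributes and methods, i.e. the object's usable API surface
public_api = [n for n in dir(result) if not n.startswith("_")]
print(public_api[:5])

# In a notebook, help(result) then prints the class docstring as well.
```

Running `type(...)` and `dir(...)` on the real return value in a notebook is usually enough to find the methods you can call on it.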
Can we use the evaluation store output for further processing in a Python recipe?
I'm using a connection to my ElasticSearch cluster to write some of my datasets into an index. If I use the default mapping, the schema types are mapped well (date into date, string into text, integer into long, boolean into boolean), but text fields have no keyword subfield. I know I can generate the keyword subfield for text fields by…
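For reference, the keyword subfield that Elasticsearch's own dynamic mapping adds to string fields comes from a mapping like the sketch below; supplying something similar as a custom mapping should restore it (where exactly to plug this in on the DSS connection or dataset settings is an assumption here, so check your DSS version's documentation):

```json
{
  "dynamic_templates": [
    {
      "strings_as_text_with_keyword": {
        "match_mapping_type": "string",
        "mapping": {
          "type": "text",
          "fields": {
            "keyword": { "type": "keyword", "ignore_above": 256 }
          }
        }
      }
    }
  ]
}
```

With this template, every dynamically mapped string field becomes `text` with a `field.keyword` subfield usable for exact-match filtering and aggregations.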
How can I call a dataset from a Python function in the API Designer? I'm using a Python function, and in it I have:

def testing_example(parameter1):
    dataset = dataiku.Dataset("datasetname", project_key="project_id")
    df = dataset.get_dataframe()

But it fails. Any idea how to solve this?
Hi there, I'm currently using Dataiku to train a model for a prediction task. I need to impose directionality constraints on certain columns, but the current version of xgboost in Dataiku doesn't support these kinds of constraints. I've learned that we can write custom Python models with monotonic constraints as detailed…
My recipe failed to resolve a pip package I installed in the code env. The same code runs fine in a notebook but fails in a recipe using the same code env. I suspect the recipe uses an old version of the code env, but I have no way of forcing it to use the latest code env build. How can I do that? Operating system used: Windows
Hello, while following this link, https://knowledge.dataiku.com/latest/plugins/development/tutorial-first-plugin-recipe-processor.html#test-the-preparation-processor, I would like to completely delete the processors library entry called "Hide colors" that was created during the "Test the preparation processor" step. Even if I delete the JSON…
Hi team, I've encountered an urgent issue with our project automation pipeline. In a Python code recipe, I used a builder to create a new Python recipe and added a dataset as input and 3 new datasets as outputs. In order to save some model objects to a managed folder, I need to add an existing folder "Pickle_Files"…
I have a folder of data that can be converted to a desired format using a .exe application. The input data and the .exe are given by a 3rd party. How can I do this in DSS? Operating system used: Cloud
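Assuming the files live in DSS managed folders backed by a local filesystem (so `dataiku.Folder("some_id").get_path()` would give you directory paths; that folder id is hypothetical), one common pattern is to shell out to the vendor .exe once per file from a Python recipe. A sketch with a placeholder command:

```python
import subprocess
from pathlib import Path

def convert_folder(input_dir, output_dir, converter_cmd):
    """Run a third-party converter over every file in input_dir.

    converter_cmd is a command list, e.g. ["C:/tools/converter.exe"]
    (hypothetical path); each file is passed as <input> <output>.
    In DSS, input_dir/output_dir would come from managed folders via
    dataiku.Folder("...").get_path().
    """
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    converted = []
    for src in sorted(Path(input_dir).iterdir()):
        if not src.is_file():
            continue
        dst = out / src.name
        # check=True raises if the converter exits with a non-zero status
        subprocess.run(converter_cmd + [str(src), str(dst)], check=True)
        converted.append(dst.name)
    return converted
```

Note that this requires the .exe to be runnable on the machine executing the recipe, which may not hold on a cloud/Linux DSS instance; in that case the conversion may need to run elsewhere (e.g. via a containerized execution config that can run Windows binaries, or a wine-style wrapper).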
Hello Dataiku Community. I am connecting to an FTP folder to build a dataset. The data I get is the union of all the data in all the files in that directory. What I want is to process only the new files arriving in that directory. I was looking at the advanced options but I couldn't find any documentation about the…
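DSS has built-in mechanisms for this (e.g. partitioning the dataset by file), but the underlying idea can also be implemented by hand: persist the set of already-processed file names between runs and only process the difference. A generic sketch, where the state file location is an assumption (in DSS it could live in a managed folder or a project variable):

```python
import json
from pathlib import Path

def new_files_only(folder, state_path):
    """Return files in `folder` not seen on previous runs, then update state.

    `state_path` is a JSON file holding the names already processed;
    on the first run it does not exist and every file counts as new.
    """
    state_file = Path(state_path)
    seen = set(json.loads(state_file.read_text())) if state_file.exists() else set()
    current = {p.name for p in Path(folder).iterdir() if p.is_file()}
    fresh = sorted(current - seen)
    # Remember everything currently present for the next run
    state_file.write_text(json.dumps(sorted(current)))
    return fresh
```

Each run then processes only the returned file names; if a file is deleted and re-uploaded with the same name, this simple version will not re-process it (tracking modification times as well would handle that).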