-
The 'exec_recipe_fragment' method of the 'SQLExecutor2' class fails
Hello, I was exploring the following code example: https://developer.dataiku.com/latest/tutorials/data-engineering/sql-in-code/index.html#code-recipe-examples, but I got an error from the exec_recipe_fragment method (attached below). I tried both options of the overwrite_output_schema parameter but did not succeed. Any help…
-
Best practices for real-time APIs?
Does anyone have best practices they can recommend for developing, testing, and deploying real-time API services? So far we have considered documentation and testing under load. Does the community know of anything else we can add?
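One concrete practice worth adding to the list is an automated smoke/load check that hits the endpoint and tracks latency percentiles. Here is a minimal stdlib-only sketch; the `/health` route, the stand-in server, and the request count of 50 are all illustrative assumptions, not part of any particular API platform.

```python
# Minimal smoke/load-test sketch for a real-time API endpoint (stdlib only).
# The HealthHandler server is a stand-in; point `url` at your real service.
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Stand-in for a real API endpoint returning 200 OK.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request logging

server = ThreadingHTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/health"

latencies = []
for _ in range(50):  # small burst; a real load test would use many concurrent clients
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        assert resp.status == 200
    latencies.append(time.perf_counter() - start)

p95 = sorted(latencies)[int(0.95 * len(latencies))]
print(f"p95 latency: {p95:.4f}s")
server.shutdown()
```

A check like this can run in CI before each deployment; failing the build when the 95th-percentile latency regresses is a cheap early-warning signal.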
-
Append each refresh into a separate dataset
Hi team, I currently have a dataset that connects to an online SharePoint site, and with the scenario feature it is now auto-refreshed weekly. I would like to build a new dataset that appends all the versions so I can compare the changes/evolution over time. Could you help with some guidance? Many thanks
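One common pattern for this is to stamp every refresh with its date and append it to a growing history table, so each weekly snapshot stays distinguishable. A small stdlib sketch of the idea (the field names and `append_snapshot` helper are illustrative; in Dataiku the same logic would typically live in a Python recipe writing to an output dataset configured to append rather than overwrite):

```python
# Sketch: keep every weekly refresh by tagging rows with a refresh date
# and appending them to a history table. Names here are hypothetical.
from datetime import date

def append_snapshot(history_rows, snapshot_rows, refresh_date):
    """Append one refresh's rows to the history, stamped with its date."""
    for row in snapshot_rows:
        stamped = dict(row)  # copy so the source snapshot is untouched
        stamped["refresh_date"] = refresh_date.isoformat()
        history_rows.append(stamped)
    return history_rows

history = []
week1 = [{"item": "A", "qty": "3"}, {"item": "B", "qty": "5"}]
week2 = [{"item": "A", "qty": "4"}]
append_snapshot(history, week1, date(2024, 1, 1))
append_snapshot(history, week2, date(2024, 1, 8))
```

With the `refresh_date` column in place, comparing versions is a simple group-by or filter on that column downstream.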
-
Default variable values in an Application
Inside an Application, I define an "Edit project Variables" section, followed by my variable definitions: an "INT" and a "STRING". =========== [ { "name": "welcome", "label": "on", "defaultValue": "5.2 is my game", "mandatory": true, "canSelectForeign": false, "markCreatedAsBuilt": false, "allowDuplicates": true,…
-
Where can I find the detailed API functions for scenario interaction?
For example, I can find the function train_model(model_id, project_key=None, build_mode='RECURSIVE_BUILD', step_name=None, async=False, fail_fatal=True), but how can I find out what kind of object this function returns, and how can I work with that returned value? From the sample code, I found the function…
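When the reference docs don't spell out a method's return type, a quick way forward is to introspect the returned object at runtime with `type()` and `dir()`. A sketch with stand-in classes (`DummyScenario` and `DummyTrainResult` are hypothetical placeholders; the same two lines of introspection work on whatever object the real client library hands back):

```python
# Runtime introspection of an unknown return value. The classes below are
# stand-ins for illustration only, not a real client library.
class DummyTrainResult:
    """Pretend result object returned by a training call."""
    def get_trained_model_id(self):
        return "model-123"

class DummyScenario:
    def train_model(self, model_id):
        return DummyTrainResult()

result = DummyScenario().train_model("saved-model")
print(type(result).__name__)  # the returned object's class name
methods = [name for name in dir(result) if not name.startswith("_")]
print(methods)                # its public attributes and methods
```

From there, `help(result)` (or `help(type(result))`) in a notebook prints the class docstring and method signatures, which is often faster than hunting through the docs.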
-
evaluation store
Can we use the evaluation store output for further processing in a Python recipe?
-
Automatic mapping not supported for Elasticsearch
I'm using a connection to my Elasticsearch cluster to write some of my datasets into an index. If I use the default mapping, the types in the schema are mapped correctly (date into date, string into text, integer into long, boolean into boolean), but text fields do not get a keyword subfield. I know I can generate the keyword subfield for a text field by…
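For reference, giving a text field a keyword subfield in an explicit Elasticsearch index mapping looks like the fragment below (this is standard Elasticsearch mapping syntax; the field name `my_text_field` is illustrative, and whether it can be injected through a given connector is a separate question):

```json
{
  "mappings": {
    "properties": {
      "my_text_field": {
        "type": "text",
        "fields": {
          "keyword": { "type": "keyword", "ignore_above": 256 }
        }
      }
    }
  }
}
```

The `text` part supports full-text search while the `.keyword` subfield supports exact matching, sorting, and aggregations.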
-
Calling a dataset from the API Designer failed
How do I call a dataset from a Python function in the API Designer? I'm using a Python function, and inside it I call the following: def testing_example(parameter1): dataset = dataiku.Dataset("datasetname", project_key="project_id") df = dataset.get_dataframe() But it fails. Any idea how to solve this?
-
Monotonic constraints in XGBoost with hyperparameter search
Hi there, I'm currently using Dataiku to train a model for a prediction task. I need to impose directionality (monotonicity) constraints on certain columns, but the current version of XGBoost in Dataiku doesn't support these kinds of constraints. I've learned that we can write custom Python models with monotonic constraints as detailed…
-
Recipe failed but notebook runs (same code environment)
My recipe fails to resolve a pip package I installed in the code env. The same code runs fine in the Notebook area but fails in a recipe using the same code env. I suspect the recipe is using an old version of the code env, but I have no way of forcing it to use the latest code env build. How can I do that? Operating system used: Windows Operating…