-
I have pre-trained models that I would like to use in a Dataiku recipe.
Hi, I have pre-trained models on my local machine that I would like to use in a recipe. One model is trained with the alibi-detect library and the other is the popular SAM model. I'd appreciate any tips on how to use these models in a Dataiku recipe.
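One possible approach, as a minimal sketch: upload the model artifacts into a managed folder and load them from a Python recipe. The folder name "pretrained_models", the detector directory and the checkpoint file name are assumptions, and the folder is assumed to live on the local DSS filesystem.

    import dataiku
    from alibi_detect.saving import load_detector            # older alibi-detect versions expose this under alibi_detect.utils.saving
    from segment_anything import sam_model_registry, SamPredictor

    # Managed folder holding the artifacts copied over from the local machine
    models_folder = dataiku.Folder("pretrained_models")
    local_root = models_folder.get_path()                     # get_path() only works for filesystem-hosted folders

    # alibi-detect detectors saved with save_detector() reload from their directory
    detector = load_detector(local_root + "/my_drift_detector")

    # SAM loads from its checkpoint file and is wrapped in a predictor
    sam = sam_model_registry["vit_h"](checkpoint=local_root + "/sam_vit_h_4b8939.pth")
    predictor = SamPredictor(sam)

    # From here, read the recipe input, score with the models and write the output as usual
    df = dataiku.Dataset("input_dataset").get_dataframe()

For folders hosted on remote storage, get_download_stream() plus a local temporary copy would be needed instead of get_path().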
-
custom properties of a dataset
Hi, concerning the settings of a dataset, under Advanced properties, what are the possible advanced properties of a dataset (in Custom properties), and what are the conditions of use of this functionality? I can't find any documentation on this subject. Best regards. Operating system used: Red Hat Linux 7.9
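For illustration, a minimal sketch of inspecting a dataset's settings through the public API to see which custom keys are currently stored; the project key, dataset name and the "customFields" key are all assumptions.

    import dataiku

    client = dataiku.api_client()
    project = client.get_project("MY_PROJECT")        # project key is an assumption
    dataset = project.get_dataset("my_dataset")       # dataset name is an assumption

    # Dump the raw settings JSON to see which advanced / custom keys are set;
    # the exact key holding custom properties ("customFields") is an assumption.
    raw = dataset.get_settings().get_raw()
    print(raw.get("customFields", {}))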
-
join recipe
There is no ADD INPUT button in my join recipe. My version of Dataiku is 12. Is there any way to overcome this? Thanks. Operating system used: Windows
-
Join recipe with Multiple datasets
Hello, I am studying Dataiku DSS. I have 3 datasets (Customer, Computer, Printer) that I am joining by CustomerID. Each dataset has a column called CustomerID. The Computer and Printer datasets have the following columns: CustomerID and the date when the item was purchased. I would like to know if the Customer bought a…
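One possible approach to this kind of join, as a minimal pandas sketch in a Python recipe; the dataset names, the "Date" column name and the output name are assumptions.

    import dataiku

    customers = dataiku.Dataset("Customer").get_dataframe()
    computers = dataiku.Dataset("Computer").get_dataframe()
    printers  = dataiku.Dataset("Printer").get_dataframe()

    # Rename the purchase date columns so they stay distinct after the joins
    computers = computers.rename(columns={"Date": "computer_purchase_date"})
    printers  = printers.rename(columns={"Date": "printer_purchase_date"})

    # Left-join so every customer is kept, even without any purchase
    merged = (customers
              .merge(computers, on="CustomerID", how="left")
              .merge(printers,  on="CustomerID", how="left"))

    # A non-null purchase date indicates the customer bought that item
    merged["bought_computer"] = merged["computer_purchase_date"].notna()
    merged["bought_printer"]  = merged["printer_purchase_date"].notna()

    dataiku.Dataset("customers_joined").write_with_schema(merged)

The same result can be obtained visually with a Join recipe using two left joins on CustomerID.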
-
reference global variables in prepare recipe
How to reference a global variable in a Prepare recipe to create an if statement? For example: if global_var = 1 then field A = 1, else field A = 0. Operating system used: Windows 11
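One possible approach (a minimal sketch, assuming global_var is defined as a project variable): in a Formula step of the Prepare recipe, project variables can be referenced with the variables[...] syntax, so column A could be computed as:

    if(variables["global_var"] == 1, 1, 0)

If the variable is stored as a string rather than a number, comparing against "1" may be needed instead.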
-
How can we append different builds of the same file?
Hi Dataiku community, I have created a Python recipe which takes 3 different datasets as input and produces one output after applying some data transformation steps. Now this output needs to be built every day, and the build that runs at month end needs to be appended to another output which is basically created to keep…
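One possible pattern, as a minimal sketch inside the Python recipe; the dataset names are assumptions, and the month-end history output would need "Append instead of overwrite" enabled in the recipe's Inputs/Outputs tab so each run adds rows instead of replacing them.

    import dataiku
    import pandas as pd

    # Placeholder for the result of the existing transformation steps
    df = pd.DataFrame({"value": [1, 2, 3]})

    # Daily output, rebuilt (overwritten) every day
    dataiku.Dataset("daily_output").write_with_schema(df)

    # On the last day of the month, also write to the append-mode history output
    if pd.Timestamp.today().is_month_end:
        dataiku.Dataset("monthly_history").write_dataframe(df)

A scenario with a time-based trigger can then build the recipe daily, with the month-end append happening automatically.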
-
Random Survival Forest
Has anyone ever used a Random Survival Forest model in Dataiku? Here is a definition of this kind of algorithm: A Random Survival Forest ensures that individual trees are de-correlated by 1) building each tree on a different bootstrap sample of the original training data, and 2) at each node, only evaluating…
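As far as I know this algorithm is not part of the visual ML algorithms, but as a hedged sketch it can be used from a Python recipe through the scikit-survival package; the dataset and column names below are assumptions.

    import dataiku
    from sksurv.ensemble import RandomSurvivalForest   # from the scikit-survival package
    from sksurv.util import Surv

    # Dataset name and the "event" / "duration" columns are assumptions
    df = dataiku.Dataset("survival_train").get_dataframe()
    X = df.drop(columns=["event", "duration"])
    y = Surv.from_arrays(event=df["event"].astype(bool), time=df["duration"])

    # Fit the Random Survival Forest outside of the visual ML interface
    rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=10, random_state=0)
    rsf.fit(X, y)

    # Risk scores can then be written back to an output dataset
    df["risk_score"] = rsf.predict(X)
    dataiku.Dataset("survival_scored").write_with_schema(df)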
-
Write pandas dataframe to managed dataset using custom schema
The issue we are working through pertains to Dataiku's auto-detected column types when passing a pandas dataframe to Snowflake. The schema Dataiku creates based on the pandas dataframe converts object data types to VARCHAR, which is fine. The issue is the VARCHAR size. In our test example all dataframe objects are given a…
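One way to keep control of the string sizes, as a minimal sketch: declare the schema explicitly instead of letting Dataiku infer it from the pandas object dtype, then write the dataframe against that schema. The output name, column names and maxLength values are assumptions.

    import dataiku
    import pandas as pd

    df = pd.DataFrame({"customer_id": ["A1"], "comment": ["ok"], "amount": [9.99]})  # placeholder data

    output = dataiku.Dataset("snowflake_output")      # output dataset name is an assumption
    # Set the schema explicitly; "maxLength" on string columns is what drives the VARCHAR size
    output.write_schema([
        {"name": "customer_id", "type": "string", "maxLength": 50},
        {"name": "comment",     "type": "string", "maxLength": 2000},
        {"name": "amount",      "type": "double"},
    ])

    # Write the data against the schema declared above rather than an inferred one
    with output.get_writer() as writer:
        writer.write_dataframe(df)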
-
Is there an automated method to verify the current status (enabled or disabled) of scenario steps
We are in the process of automating prechecks performed prior to initiating an end-to-end project. As part of this effort, we are exploring automated methods to confirm the current status (enabled or disabled) of scenario steps in Dataiku. If such a method exists, we would greatly appreciate your assistance in guiding us…
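One possible approach through the public API, as a hedged sketch: the project key and scenario id are assumptions, and the "raw_steps" / "enabled" names reflect my understanding of the step-based scenario settings JSON, so they should be checked against your DSS version.

    import dataiku

    client = dataiku.api_client()
    project = client.get_project("MY_PROJECT")        # project key is an assumption
    scenario = project.get_scenario("MY_SCENARIO")    # scenario id is an assumption

    # For step-based scenarios, list each step with its enabled/disabled flag
    settings = scenario.get_settings()
    for step in settings.raw_steps:
        print(step.get("name"), "->", "enabled" if step.get("enabled") else "disabled")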
-
Sharepoint Files with Different Field Counts, but Same Field Names
Hello everyone, I am currently importing files from SharePoint that are sent to me via email on a weekly basis. Each file may have a different number of fields each time it is sent. I noticed that Dataiku is not appending on the name each time and is simply creating null values for the fields that aren't visible in each…
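One possible way to make the weekly files stack by column name rather than by position, as a minimal sketch; the managed folder name, the CSV format and the output dataset name are assumptions.

    import dataiku
    import pandas as pd

    folder = dataiku.Folder("sharepoint_files")       # managed folder name is an assumption

    frames = []
    for path in folder.list_paths_in_partition():
        with folder.get_download_stream(path) as stream:
            frames.append(pd.read_csv(stream))        # assumes the weekly files are CSVs

    # concat aligns columns by name, so a file missing a field simply gets nulls there
    combined = pd.concat(frames, ignore_index=True, sort=False)
    dataiku.Dataset("sharepoint_stacked").write_with_schema(combined)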