-
SocketTimeoutException in dataiku job
I currently use Redshift as the source dataset and use an Export to Folder recipe to export an Excel file to SharePoint; the destination dataset points to SharePoint. When I run the job, if the Excel file is less than 50 MB the job succeeds, but if it is more than 50 MB it reports the issue below: Oops: an unexpected error occurred Error in Python…
-
Unexpected “ghost” values in Window recipe (sum and count not matching)
Hey everyone, I'm new here. I'm facing a weird issue with the Window recipe while trying to compute rolling transaction metrics — specifically sum and count over 1-day, 7-day, and 30-day periods. Here's my setup: Partitioned by cc_num Ordered by trans_date_trans_time Using a window frame (either limit by range in days/hours…
-
Preparation for Core Designer certification
Hi: I took the 3 mandatory modules for Core Designer training. Is there any available material I can use to prepare for the certification? Operating system used: Windows
-
Upsert of daily data in Dataiku DSS
I work on data that is fed daily, and I am trying to find a way in Dataiku to do an upsert: update existing records and insert new records at the same time. What are the different ways to do this, please?
-
Prepare Recipe, formula. How to output null vs ""
Please advise how to set up a formula that outputs a null value instead of an empty string in the database (using a Redshift database). Simple example for ColumnB: if(isNull(ColumnA), "", ColumnA). What do I use instead of the double quotes?
-
Stacking Excel Datasets
Hi all, I had the great luck to receive the objective of creating a customs import volume forecast for my company on Dataiku as part of my annual target agreement. So now I've started my new Dataiku project. To be honest, I am a complete newbie to this platform. Personally I work daily with Excel, a very little bit with…
-
" Parse next line as column headers" option not working for csv files
When uploading a csv file, and ticking the " Parse next line as column headers" option, the created dataset doesn't have the column names contained in the first row of the file.
-
project export error
When I exported the project from dataiku, I couldn't complete the export due to the following error: "Invalid argument. An invalid argument has been encountered: Connection 'XXX' does not exist." Based on this error, I checked all the dataset connections and found that none of them require this connection. Operating system…
-
Meet "Connection xxx not found" when exporting a project
Thanks for your time at the beginning. I am currently exporting a project, only checking the 4 default options. However, it failed with a warning "An invalid argument has been encountered : Connection 'SF_VAW_PROD_ATP_MED' does not exist". Then I tried to use the Python API to find this connection, but failed again: import dataiku…
-
deploy local
How do I deploy a flow from the DSS design node to an automation node? Operating system used: LINUX
-
How to monitor resource usage of recipes by time zone
Hello. Is there a way to monitor the CPU, MEM, and other resource usage for each recipe in a project by time zone in Dataiku? I understand there's CRU, but can I also use it to monitor each recipe by time zone? Operating system used: RHEL 9.4
-
About how to trigger a scenario on workdays only
Hi, I would like to set up a trigger for a scenario that only fires on workdays. I have set up some global variables and I have set my custom trigger like this: from dataiku.scenario import Trigger t = Trigger() if variables['standard']['LastCalendarDay'] == variables['standard']['LastChinaWorkingDay'] or…
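A minimal sketch of how such a workday-only custom trigger could be completed, assuming project variables named LastCalendarDay and LastChinaWorkingDay as in the post; fetching the variables through dataiku.api_client() is one possible approach, not the only one:

import dataiku
from dataiku.scenario import Trigger

t = Trigger()
# Project variables, split into 'standard' (global) and 'local' sections
variables = dataiku.api_client().get_default_project().get_variables()

# Fire the scenario only when the last calendar day is also a China working day
if variables['standard']['LastCalendarDay'] == variables['standard']['LastChinaWorkingDay']:
    t.fire()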
-
Invalid Scenario step logic condition should cause scenario failure
I have noted a very dangerous behavior in the latest v12 release although I believe this has been in DSS for a long while. DSS will not cause a scenario failure or even a warning if you have an invalid Scenario step logic. For instance I created a scenario step and set two variables: {"var1": 123, "var2": 456} Then I…
-
Dashboard Plot Crash
Hello, I've recently stumbled upon an issue where plots in a dashboard crash frequently and randomly; upon refresh they alternate between reappearing and not appearing again. I've never seen this issue before. I've been using Dataiku for the past 2 years. Our company recently moved to a new server with an updated Dataiku version,…
-
How to Create Clickable Jira Issue Links in HTML Emails in Dataiku Scenario
I am working on automating the generation of email notifications within Dataiku, where the email body includes a table of Jira tickets. Each ticket has an "Issue Key" which I need to display as a clickable link in the email. The URL format for each ticket is: https://adsretailjira.atlassian.net/browse/{ISSUE_KEY}. I…
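A minimal sketch of one way to build such a body with pandas, assuming the tickets are in a DataFrame with an "Issue Key" column (the DataFrame contents here are illustrative):

import pandas as pd

# Illustrative ticket table
tickets = pd.DataFrame({"Issue Key": ["ABC-101", "ABC-102"],
                        "Summary": ["Fix login", "Update docs"]})

# Wrap each issue key in an HTML anchor pointing at the Jira browse URL from the post
base_url = "https://adsretailjira.atlassian.net/browse/"
tickets["Issue Key"] = tickets["Issue Key"].apply(
    lambda k: '<a href="{0}{1}">{1}</a>'.format(base_url, k))

# escape=False keeps the <a> tags intact when rendering the table as HTML
html_table = tickets.to_html(escape=False, index=False)
email_body = "<html><body><p>Open Jira tickets:</p>" + html_table + "</body></html>"
# email_body can then be used as the HTML message body of the scenario's mail reporter/step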
-
How to close an API connection and start a new one?
Using the Python API: import dataikuapi client = dataikuapi.DSSClient("https://some_host"….) …. process some things client = dataikuapi.DSSClient("https://some_other_host"….) …. process some things How can I close the connection so I can move on to another host? I was looking for client.close() or something similar. I…
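As far as I know there is no documented close() on DSSClient; a common pattern is simply to scope one client per host and drop the reference when done. A minimal sketch with placeholder hosts and API keys:

import dataikuapi

# Hypothetical host / API key pairs
hosts = [("https://some_host:11200", "key_for_host_1"),
         ("https://some_other_host:11200", "key_for_host_2")]

for url, api_key in hosts:
    client = dataikuapi.DSSClient(url, api_key)
    # ... process some things on that instance, e.g. list its projects
    print(url, client.list_project_keys())
    # Dropping the reference lets the underlying HTTP session be garbage-collected
    del client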
-
How do I reach Sales to ask a pricing question
I've done the "Contact Us" thing twice now over the last two weeks and still haven't heard from anyone. Does the Sales department still exist at Dataiku?
-
Is there a recipe that can be used to automatically print a dashboard?
Is there a recipe that can be used to automatically print a dashboard? The dashboard gets emailed daily and we would like it to automatically print once the workflow is done.
-
model’s performance on the train portion of the dataset
Hi everyone, I’m trying to figure out where I can find my model’s performance on the train portion of the dataset in Dataiku. I can easily access the test set metrics in the model interface, but I don’t see where Dataiku shows the scores computed on the training set (AUC, F1, etc.). Are these metrics available directly in…
-
HTML link for file in Managed Folder
Team, I have been coding in a Python recipe and am unable to create an HTML link that enables users to download an xlsx file in a managed folder. Could someone please assist?
-
DSS Engine error on visual recipes
Greetings all, please advise how to solve a DSS engine error on visual recipes: "Oops: an unexpected error occurred Unknown error Please see our options for getting help HTTP code: 500, type: unknown". As a result, the datasets after the visual recipes are not updated. We are experiencing the issues mostly on the…
-
How to embed Dataiku Dashboard in the fullscreen version
Embed the dashboard in fullscreen mode, without showing the "project name", "edit buttons", etc. Zoom in on the map in the middle like the maps on the left and right, and also make it stay zoomed in; it should not change even after refreshing the webapp.
-
Getting below error
-
Migration & Synchronize issue in automation node
Hi, I'm using Dataiku 12.5.2 in my workplace, and I found that synchronizing the design node and automation node is quite hard. Minor debugging issues that occur during ingestion are all processed by the automation node, while architectural modifications are built and tested in the design node. As time goes on, differences…
-
How can we use an AWS EFS file share for Dataiku APIs deployed to Kubernetes?
How can we use an AWS EFS file share for Dataiku APIs deployed to Kubernetes? We can configure the CSI driver and storage class, then create a PVC on the Kubernetes cluster. But how can we use it as a mount in deployments that get deployed through the DSS API deployer?
-
Using Dataiku to analyze content update patterns from streaming platforms
Hi everyone I’m experimenting with Dataiku to track and analyze data from a few drama and entertainment streaming platforms. The idea is to study how frequently new content (episodes or shows) gets updated and whether there are noticeable trends in release timing or viewer engagement. I’ve connected some APIs and CSV data…
-
Dataiku suddenly stopped working
Oops: an unexpected error occurred Unknown error. All recipes stopped working and the unexpected error occurred with HTTP code: 431, type: unknown. Operating system used: macOS
-
ETL
I propose developing a modern Lakehouse solution to accelerate the implementation of Data Science projects and optimize time. Connecting to the various data sources, cleaning them, and putting them into formats suitable for analysis and Machine Learning models can often be long and…
-
Best way to clean data for country meaning
I have a dataset which includes ISO 3166-1 alpha-3 country codes which are being detected properly, and full country names where about 14% are being flagged as not passing validation. Is there some place I can get the canonical list of country names dataiku uses for meaning? And if I can get that list, what's the most time…
-
Why can't I overwrite a table from Dataiku
Hello everybody, I have a table "my_table" in my database. I import my_table into Dataiku and perform an outer join to complete the values of two columns in my_table. As the output of the outer join, I create a filesystem dataset. After that, I used the Sync recipe to put the result of the join back into "my_table". The Sync recipe run…
-
How to quickly mass-change the database type of datasets for a project?
Thanks for your time firstly. I am currently changing the type of database from 'Snowflake' to 'Redshift' (except for source nodes). I could use the 'Change connection' function at the bottom-right border of DSS, but it's quite repetitive. Is there any efficient approach or Python API that could modify the connection in a…
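A hedged sketch of one possible approach with the public Python API, assuming SQL-style datasets that store their connection under params['connection']; the project key and connection names are placeholders, and retyping a dataset this way may require adjusting other params, so test on a copy first:

import dataiku

client = dataiku.api_client()
project = client.get_project("MY_PROJECT_KEY")

for ds in project.list_datasets():
    dataset = project.get_dataset(ds["name"])
    settings = dataset.get_settings()
    raw = settings.get_raw()
    # Only touch Snowflake datasets on the old connection; skip source datasets as needed
    if raw.get("type") == "Snowflake" and raw.get("params", {}).get("connection") == "OLD_SNOWFLAKE_CONN":
        raw["type"] = "Redshift"
        raw["params"]["connection"] = "NEW_REDSHIFT_CONN"
        settings.save()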
-
Code Envs - What is the difference between AUTOMATION_SINGLE and AUTOMATION_VERSIONED?
Hello Dataiku Community, In my Dataiku console, under Administration - Code Envs, I can see that several code envs are defined in General Env Info with the mode AUTOMATION_SINGLE and several with the mode AUTOMATION_VERSIONED. I can't find any explanation in the Dataiku documentation…
-
Using Deployment hook on multinode(2over automation)
Hello, my environment uses two automation nodes, so I'm using multinode deployment. I'm trying to use a deployment hook, but I get a message saying the required automation_client value was not passed to the execute function of the deployment hook. I tested the same deployment hook on a single node, and the issue persists.…
-
How to change connection for SQL Notebook
Hello all, Does anybody have any suggestions for how to change the connection for an existing SQL Notebook? I have a bunch of queries in a notebook and I'd like to be able to just change the connection on a copy of the notebook rather than copying and pasting each query to a new notebook. Thanks! Marlan
-
How to connect Dataiku with Azure data lake? Please provide step by step guide.
-
I integrated DSS Desktop as admin with SQL, but on running the SQL recipe, facing the below issue.
I integrated DSS Desktop as admin (Free edition) with SQL, but on running the SQL recipe I am facing the issue below. I need guidance to fix the issue. Thanks in advance.
-
How to integrate MySQL with a Dataiku Cloud free trial account
-
Sharepoint metadata
Hello, I wanted to know how to get the "modified by" column in Dataiku using my SharePoint connection. At this point in time I have the size and the modified time. Thanks,
-
Sharepoint plugin update
Add a "modified by" column, which is available in SharePoint, to the SharePoint plugin. It would help me get data on who actually uploaded datasets, so we can track which users are adjusting the files.
-
Broken pages
Hello everyone! I'm trying to obtain the Advanced Designer Certificate but several pages are broken; only the header seems to appear. I've tried from Firefox and Edge. Has anyone noticed the same problem?
-
Read xlsx file from managed folder using Python
Hi team, I'm using a Python recipe in Dataiku to read a specific .xlsx file from a managed folder, but I'm encountering an error when trying to load the file into a DataFrame. Here’s a simplified version of my code: folder = dataiku.Folder("FOLDER_ID") file_list = folder.list_paths_in_partition() last_month_str = "YYYYMM"…
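A minimal sketch of reading an .xlsx file from a managed folder through a download stream (works whether the folder is local or remote); the month string and file-matching logic are illustrative, and openpyxl must be available in the code env:

import dataiku
import pandas as pd

folder = dataiku.Folder("FOLDER_ID")
paths = folder.list_paths_in_partition()

# Illustrative: pick the first .xlsx file whose path contains the wanted month string
last_month_str = "202501"  # placeholder for the real "YYYYMM" value
target = next(p for p in paths if last_month_str in p and p.endswith(".xlsx"))

with folder.get_download_stream(target) as stream:
    df = pd.read_excel(stream)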
-
Configuring dev servers on a project- or user-level
Hi, my team at work is using Dataiku 13.5.5 and our typical setting is that we are collaborating on projects with external partners that have more limited access to data connections and other Dataiku features compared to us. Just today we were informed by our external partner that they are not able to run test queries to…
-
Cannot replicate a LGBM model
Hi, I have trained an LGBM model with visual ML and now I am trying to replicate this model in a local environment using the parameters estimated through model training. I cannot get identical probabilities. Is it somehow possible to locally replicate a model built on Dataiku?
-
Problem in Sign up (Local Free Edition)
Hello, when I try to sign up using the free edition of the local Dataiku Launcher (not the Enterprise 14-day trial; I used the other one, Free Edition), I fill in the info and when I hit NEXT I get this error: Network error An attempt to communicate with DSS failed. Please check your network connectivity. Network error: ( My internet…
-
Get a list of code environments that use a specific python package and the version specified?
Hi, We have an internal python package that gets used by dozens of code environments and we need to know which code environments those are and what version is present. Is there a way to do this with DSS? thx in advance Operating system used: Windows 10
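A hedged sketch using the admin API; the envLang/envName keys are what list_code_envs() returns in my experience, and specPackageList (the requested pip specs) is an assumption that may vary by DSS version, so inspect get_definition() on your instance first:

import dataiku

client = dataiku.api_client()
package_name = "our_internal_package"  # hypothetical package name

for env in client.list_code_envs():
    if env.get("envLang") != "PYTHON":
        continue
    definition = client.get_code_env("PYTHON", env["envName"]).get_definition()
    # specPackageList holds the requested package specs, one per line (assumption)
    for line in definition.get("specPackageList", "").splitlines():
        if line.strip().startswith(package_name):
            print(env["envName"], "->", line.strip())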
-
DSS code-env package extraction
Hi all, I am trying to extract the list of all installed packages for each and every code env. Does anyone have an idea how we can do so? Basically my objective is to resolve the list of vulnerable packages identified. To do so, can someone please suggest how I can find or search for a certain package within DSS? Can someone suggest…
-
Python Recipe
Is it possible to ignore the Python recipe input dataset validation? I want to run a Python script that reads dataset metadata (description, etc.) and outputs it to a folder. This would be dynamic, based on a user-provided dataset name, or all datasets if no name is provided. It works in a notebook, but validation creates this error…
-
How to create subfolders inside a Managed Folder using Python?
Hi everyone I’m currently working on a project where I need to organize files inside a Managed Folder in a hierarchical way (for example: mainfolder/subfolder/data.xlsx, mainfolder/REPORTS/report1.csv, etc.). Is there any supported or alternative way (via the API, SDK, or external libraries) to create real subfolders…
-
Mismatch in random ports for notebook kernels
When I try to use the notebook kernels in containerized environments, I get different ports stored in DSS and in the Kubernetes pod. Is that normal? The file jupyter-run/jupyter/runtime/kernel-c234fbb1-84dc-44ad-a18e-7ad0aff81702.json shows different ports: { "shell_port": 51875, "iopub_port": 38011, "stdin_port": 35689,…
-
Can the hyper-parameter change for each new training model with each new dataset
Dear Dataikuers, thanks for reading my question. My problem is that when I re-train my model with a different dataset (for example, my first dataset is from 12/10/2024 to 12/10/2025 and my second dataset is from 30/11/2024 to 30/11/2025) and then deploy the second model, I check the hyperparameters of each version and I see all of them…
-
How do I search discussions?
I want to ask something, but first I want to check if this has been asked (and maybe answered) before. How do I search in discussions? Ideally not only text search, but something a bit smarter (AI). Operating system used: Windows
-
Cartesian product detection in join recipe
What's your use case? A Cartesian product is a common issue when joining datasets with a bad key. It's not always easy to detect, and users can even forget to check for it because they think they know their data. What's your proposed solution? What I suggest is an option to check if there will be a Cartesian product on the…
-
Failed to read data from DB
I have an empty Prepare recipe used to copy a dataset from MySQL to an Oracle database. The job used to execute fine, but for the past week it has been failing. I tried a Sync recipe and I am getting this error: Failed to read data from DB, caused by: SQLException: Query failed : Query exceeded distributed user memory limit of…
-
How to retrieve input datasets for a specific dataset using the Python API?
Hi everyone, I'm trying to use the Dataiku Python API to identify which input datasets were used to create a specific dataset within a project. For example, in the project "PRISME_INTEGRATION_TABLES", I want to retrieve the direct input datasets that were used to generate the dataset "PRS_Decision_Complement". I attempted…
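One hedged approach is to scan the project's recipes for the one that outputs the target dataset and read its inputs from the raw recipe definition; the exact method names (get_recipe_raw_definition, dict-style access on list items) should be checked against the dataikuapi docs for your version:

import dataiku

client = dataiku.api_client()
project = client.get_project("PRISME_INTEGRATION_TABLES")
target = "PRS_Decision_Complement"

for r in project.list_recipes():
    recipe = project.get_recipe(r["name"])
    raw = recipe.get_settings().get_recipe_raw_definition()  # assumption: raw dict with 'inputs'/'outputs'
    outputs = [item["ref"] for role in raw.get("outputs", {}).values() for item in role.get("items", [])]
    if target in outputs:
        inputs = [item["ref"] for role in raw.get("inputs", {}).values() for item in role.get("items", [])]
        print("Recipe", r["name"], "reads:", inputs)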
-
set up redshift connection
Operating system used: RHEL8
-
USE OF MOCODO WITHIN DATAIKU
Hello, as a data steward I set up Entity-Relationship Models (MCD in French). I use the Mocodo tool for that purpose. I record my Mocodo script in a Jupyter notebook within Dataiku, and I get an SVG picture with my ERM (see the file attached here). I would like to save this picture in PDF or PNG form. Could someone support…
-
Querying data using dataiku api
I have the following code that works; this is running outside of the DSS instance: import dataikuapi import requests import urllib3 import pandas as pd # Disable SSL verification warnings (if needed) urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning) # Connect to the Dataiku DSS instance client =…
-
Dash and Standard Webapp
Hi community, I am currently using version 13.4 of Dataiku, where I don't have access to the API services. About my agent code: I have created a multi-agent system in Python where each agent performs a specific task sequentially: Agent 1 – Searches for relevant content based on user input. Agent 2 – Extracts the required context…
-
Efficient Management of Code Environments for Multiple Plugins
Since each plugin in Dataiku requires its own code environment, I wanted to ask if there is a recommended or best-practice approach to manage this more efficiently. Specifically, is it possible to use a previously built code environment and add only the custom libraries required for different plugins to avoid duplication…
-
Allow the window recipe to ignore partitioning.
I frequently use the window recipe along with partitioning. The window recipe currently has no option to ignore partitioning, which forces me to unpartition, then repartition to get the recipe to work correctly. This causes a host of other issues. Please give us the option to ignore partitioning for recipes that frequently…
-
Shared Secrets
As we've been developing plugins and for other more exotic use cases, we've seen the need for shared secrets in Dataiku. Teams share account credentials or plugins may rely on some group based credential (e.g. Box JWT tokens for a "team account"). We hack around this using FTP type connections and parsing their secrets or…
-
How to extract the project creation date and last activity on the project
Hi all, can anyone please suggest how I can get the project creation date and the last activity date of a project? We have a Dataiku instance, and I need to extract the project creation and last activity dates for it.
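A hedged sketch: client.list_projects() returns project summaries where creationTag carries the creation info; using versionTag as a proxy for the last modification is an assumption that may vary by DSS version (scenario runs and jobs are tracked elsewhere):

import dataiku
from datetime import datetime

client = dataiku.api_client()

def fmt(ms):
    return datetime.fromtimestamp(ms / 1000).isoformat() if ms else "unknown"

for p in client.list_projects():
    created = p.get("creationTag", {}).get("lastModifiedOn")    # creation timestamp, in ms
    modified = p.get("versionTag", {}).get("lastModifiedOn")    # last settings change, in ms (assumption)
    print(p["projectKey"], "created:", fmt(created), "last modified:", fmt(modified))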
-
Show webapps and lib in bundle
Hello, I am working on projects that are especially based on webapps and use libraries. It appears that when creating a bundle, they do not appear in it. My workaround is to download the bundle and manually check what's inside. Best regards, Simon
-
An easy way to download uploaded files
Hello all, Dataiku makes it very easy to upload files and create datasets based on these files. However, there isn't a clear and easy way (like a button) to download the said files, while it can be pretty useful (here, the file is on my coworker's machine, he uploaded it and I want to retrieve it… from what I understand, I…
-
How can I retrieve the list of foreign datasets used in my Dataiku project?
How can I retrieve the list of foreign datasets used in my Dataiku project that originate from other projects?
-
API Node - What is the use of the folder 'code-envs-cache' ?
Hello, On the API node we noticed that the folder 'code-envs-cache' requires a lot of disk space. Please, could you tell us how this folder is initialized and used? Thanks, Annie
-
Messaging Channels
How can I retrieve the list of channels used to send mail in scenarios from Python code? Operating system used: Windows
-
ERROR PI System - Could not connect. Error: (traverse) unsupported hash type md4
Getting this error after running the job [11:53:15] [INFO] [dku.utils] - 2025-10-07 11:53:15,160 ERROR PI System - Could not connect. Error: (traverse) unsupported hash type md4[11:53:15] [INFO] [dku.utils] - 2025-10-07 11:53:15,160 WARNING PI System - Error while traversing path \\ Could not connect. Error: (traverse)…
-
Enable/Disable Reporters in Deployer
I would really like to be able to manually enable reporters in a deployed project, even if they are disabled in the design node. I imagine this would work exactly like triggers being enabled/disabled. My use case for this is that I have triggers that send emails to end users when certain scenarios successfully complete their…
-
Is there a way to roll back the changes that we have done to a flow?
Hi all, when our flow involves a lot of transformations, there can come a point where something goes wrong and we want to revert to the point where everything was working. Is there a way I can check what changes were made to the flow, and go back to a previous version or branch out from there?
-
Efficiently applying window recipes across monthly database partitions
Hi all, I’m looking for best practices when applying window recipes that need to span multiple partitions. In my case, I have a dataset partitioned by month, and I’m using a window recipe with lead/lag functions to look ahead and behind by 3 months. To make this work, I currently: Unpartition the dataset using a sync…
-
SQL Query Recipes using API
I'm struggling to find good code examples on creating SQL query recipes via API version 14 (or compatible). I'm trying to get subsets of data pulled from a SQL table dataset into separate Azure blob datasets for consumption by other parts of our application. It seems like it should be straightforward to find examples, but…
-
Integrating Dataiku with Denodo
Has anyone created an integration with Denodo? If so, did you use a JDBC connection? How did you manage user permissions? Thanks! Operating system used: Red Hat
-
Changing Data Meaning and/or Storage Type Creates Recipe Step
Data meaning and storage types for columns are currently editable in the Explore tab for datasets OR inside a prepare recipe. From my experience, adding it in a prepare recipe DOES allow it to be re-changed when the recipe is re-run. However, this does not visibly appear as a processing step in the prepare recipe. Can that be…
-
"Enrich records with files info" in prepare recipes working only on csv files ?
Hi all, Working on a prepare recipe which is just after an initial dataset regrouping several files. I have no problem with the step when all my initial files are .csv. But when my files are .xml, the resulting column is empty. Same empty result when my files are "on data per-line" based. What am i missing here ? (in…
-
Search ability in the discussion and Product ideas
That will allow people to check if the question has been asked already, and maybe a solution/resolution already exists. It will help reduce duplicates and make it easier to work with.
-
Label task
Hello, I have created a label task for the evaluation of a query. The input is a sample of rows from a bigger dataset. With the label task you can then assign one of five categories. Using the label task has led to changes to the query that feeds the label task. Now I want to reset the task / remove the data associated…
-
Automated table definition not creating long enough varchar fields
Anyone have an idea why my tables are being created with only varchar(18) when the query produces text longer than that? Getting this error: ERROR: Value too long for character type Detail: ----------------------------------------------- error: Value too long for character type code: 8001 context: Value too long for type…
-
Data Catalog - Database Stored Procedures
Is there a way currently to have the dataiku catalog read the database stored procedures? It would be great if the procedure name and definition were searchable through the tool. Bonus points if the tool was able to read the procedures to determine input tables and output!
-
Can't get license
Hi, I have to take this ML cert as it is a requirement from my company. I have tried refreshing it and tried disconnecting and reconnecting my network, but there is still the error below. I hope I can get clarification on this. Operating system used: Windows
-
How to do Naive Bayes in Dataiku
-
Script to list all connections used in a project
Anyone have a script that lists all the connections used within a project?
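A hedged sketch that walks a project's datasets and collects the connection from each dataset's params (not every dataset type has a 'connection' param, e.g. uploaded files; managed folders and recipes would need a similar pass):

import dataiku

client = dataiku.api_client()
project = client.get_project("MY_PROJECT_KEY")  # hypothetical project key

connections = set()
for ds in project.list_datasets():
    params = project.get_dataset(ds["name"]).get_settings().get_raw().get("params", {})
    if params.get("connection"):
        connections.add(params["connection"])

print(sorted(connections))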
-
Change Email to Work Email
Hi, I need to change my email to my work email so the certification can be valid at work. But I used Google sign-in, so I can't change it in Edit Profile. What should I do?
-
Managed Folder tree-like view and preview for readable files
Hello, My team and I make massive use of managed folders for basically anything we do by code. Indeed, we have mixed data types (parquet, json, pickle, xlsx, etc.) and we found managed folders easy to use and work with in Python. We find it difficult to navigate in a managed folder (DSS 13.5.4) and we think that…
-
Error during evaluation recipe using Prophet time series forecasting model
I'm trying to evaluate the same dataset I used for training a Prophet time series forecasting model. However, if I choose "Refit model" in the Evaluation recipe, this error occurs: ** I'm using prophet==1.1.7, holidays==0.80, Python 3.9; DSS instance 14.1.1 Error in python process: <class 'RuntimeError'>: Error during…
-
Appending to table keeps overwriting
This is my first time writing into tables, other than creating a table. Thank you for your time in helping me! I have a scenario that looks for the oldest record in table B of Server B and then pulls new records from table A on Server A. My intent is to append those new records to table B. Table A has 4 columns.…
-
Dataiku DSS - Quick ML Pipeline Build, Require an Experienced Consultant
Hey data scientists! Need a Dataiku wizard to help us automate some machine learning workflows. What we're building: Custom data preparation recipes with advanced transformations and feature engineering Automated ML model training pipeline with cross-validation and hyperparameter tuning Simple deployment setup for model…
-
What is the best practice to share business-facing content with business users?
My team has developed several Dataiku Apps and Dashboards that are ready to be shared with business users. Currently, I am considering the following approach: Deploy the content from Designer to the Production Automation node. Share the Dataiku Apps and Dashboards through specific workspaces. I am wondering: Is it common…
-
Dark Mode
Every developer needs a dark mode. A dark theme for the flow, datasets, and recipe configs would go a long way toward making Dataiku fit into workflows that involve many other dark mode tools. Dataiku is definitely very bright when swapping from other tools which operate in dark mode. Extensions like Dark Reader do a pretty…
-
Add option to support non-pandas dataframes (e.g. polars) in Python recipes
Hi, There are many pandas alternatives. One that is new and very fast is polars. Polars is built on Rust so it is memory safe and runs in parallel by design. I use polars in one of my recipes but have to convert it to pandas to write the dataset. thx
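Until such an option exists, the conversion at the read/write boundary is the usual workaround; a minimal sketch of a Python recipe that keeps polars in the middle (dataset and column names are illustrative):

import dataiku
import polars as pl

# Read through pandas, then hand the data to polars
df = pl.from_pandas(dataiku.Dataset("my_input").get_dataframe())

# ... fast polars transformations here ...
result = df.filter(pl.col("amount") > 0)

# Convert back to pandas only at the write boundary
dataiku.Dataset("my_output").write_with_schema(result.to_pandas())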
-
Error "Code env not usable"
Hi everyone, A project I created is currently shared with other users in my department as an application, allowing them to instantiate the project. They can run it, but they are not able to make any changes to the flow or the recipes within. After setting a new code env in my project, when a colleague of mine tries to…
-
How to do multiple regression in Dataiku
Operating system used: Windows
-
Does DSS support deleting a bundle through the API?
The only info I could find in the documentation is here: Processing-tests-Delete-a-bundle-clean-exported-bundles-list "To remove a bundle created using the export_bundle method, you need to remove it from the Dataiku DATA_DIRECTORY on the server. The bundles are located in…
-
Exporting preprocessing alongside model?
Hi, I am currently trying to export and deploy an ML model (LGBM) created and trained in Dataiku to GitHub. It works just fine, but I'm not sure how to apply the preprocessor of the model to my validation dataset. I've tried searching online and have noticed there being some preprocessing functionality in the…
-
Dataiku DSS start failed: Download failed: unable to get local issuer certificate
Dataiku DSS start failed: Download failed: unable to get local issuer certificate. Dataiku is not starting on macOS 12.3.1. The installation file was downloaded again from the Dataiku website and I tried reinstalling multiple times. [2022-06-21 11:01:40.749 +0400] INFO: Getting latest Dataiku DSS info from…
-
install-graphics-export air-gapped
I want to install install-graphics-export so I can export my flow or a dashboard to PDF. I’m in an air-gapped environment. When I start the installation, I get the following error message: ./dssadmin install-graphics-export [+] Saving installation log to /data/dataiku/data/run/install.log [+] Checking dependencies +…
-
Align and enrich the editing features of different recipe code types
Hello, Currently, there are differences in the editing functions offered by the different code recipe types. Of course, Jupyter notebooks are better off. Our users ask that basic functions, such as "find & replace", be available regardless of the type of code recipe used (SQL, Python, Shell, Notebooks). Thank you in advance
-
How can I access Recipe Run Options on a free tier?
Hello! I am pursuing the Advanced Designer learning path and at the Partitioning module I need to access Recipe run options, which is supposed to be next to the Run button, but I can't find it. Is it possible that this option might not be available on a free tier? Thank you in advance. Operating system used: Windows…
-
Root path does not exist Root path of the dataset Server_Doc does not exist
For some reason I received this error today. It worked fine yesterday. I tested and checked the files and they both came up blank. We did not edit anything in Dataiku or delete the document. What could be the issue? Operating system used: Windows 11