-
How to automate flow build?
I have multiple Python scripts that I use to ingest data from a DB and perform ETL steps. At the moment all the ETL logic runs on an on-premise server (mainly using Python/cron). Since we have had Dataiku available for a while, I'm thinking about migrating all our ETL to Dataiku (flows using Python recipes). Yet, it is…
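One pattern that makes this kind of migration easier is keeping the transform logic in a pure function, so the same body can later be pasted into a Dataiku Python recipe with the file/DB I/O swapped for `dataiku.Dataset` reads and writes. A minimal sketch with hypothetical column names (not from the original post):

```python
# Hypothetical cron-style ETL step: the transform is kept pure so the same
# logic can later be wrapped in a Dataiku Python recipe (reading/writing
# dataiku.Dataset objects instead of a local DB cursor or CSV).

def transform(rows):
    """Normalize column names and drop rows with a missing 'id' field."""
    cleaned = []
    for row in rows:
        # Lower-case keys and strip surrounding whitespace from string values
        norm = {k.strip().lower(): (v.strip() if isinstance(v, str) else v)
                for k, v in row.items()}
        if norm.get("id"):          # skip rows without an id
            cleaned.append(norm)
    return cleaned

# Example input, shaped like what a cron job might fetch from the DB
raw = [
    {"ID": " 1 ", "Name ": "alice"},
    {"ID": "",    "Name ": "bob"},     # dropped: empty id
    {"ID": "3",   "Name ": " carol "},
]

print(transform(raw))
```

With this split, the cron entry point and the future recipe body only differ in where `raw` comes from and where the cleaned rows go.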
-
Compatibility of the Foreach and Transpose transformations
Hi, I have a project which uses a Foreach step followed immediately by a Transpose. The result of the output should be 26, as you can see in the picture, but we get 19 for some reason. When we split the Prepare step in two, moving the Transpose into a separate Prepare, the output was correct. I do believe it has something to do with…
-
Performance issue in Dataiku
Hi, I am new to Dataiku and am creating a pipeline like this: Databricks read-only dataset -> Prepare recipe (Databricks dataset) -> (Sync) Databricks dataset -> (Sync) Azure dataset, and then further processing. In the Prepare recipe I am keeping only the required columns and renaming them so that there are no spaces. The pipeline is as…
-
Stuck at starting
My Dataiku instance has been stuck at 'Starting Dataiku DSS...' for hours when I start it, but it was working fine just yesterday. It's running on Windows and I haven't had problems so far. Operating system used: Windows
-
Ollama on DSS
Pardon my ignorance on the subject, but is there a way to connect a model like llama2 or mistral, running on Ollama locally or on another k8s cluster, to DSS? Operating system used: SUSE 15
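For context, Ollama exposes a plain HTTP API (by default on port 11434), so one low-tech way to reach a local or in-cluster instance from a DSS Python recipe or notebook is a direct HTTP call; whether this works depends on network access from the DSS host to the Ollama service. A minimal sketch, assuming the default endpoint and the `llama2` model tag:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_payload(model, prompt):
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask_ollama(model, prompt, url=OLLAMA_URL, timeout=30):
    req = urllib.request.Request(
        url,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]

# The call itself only succeeds where the Ollama service is reachable:
try:
    print(ask_ollama("llama2", "Say hello in one word."))
except OSError as exc:
    print("Ollama not reachable from here:", exc)
```

For a k8s-hosted Ollama, the URL would point at the service's cluster address instead of localhost; DSS also has LLM Mesh connections for managed setups, but the raw HTTP route is the simplest thing to verify first.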
-
as_core_knowledge_bank not working on dataiku-api-client==14.2.3
To convert a DSSKnowledgeBank to a KnowledgeBank using the dataiku API, we have to use the as_core_knowledge_bank() function (https://developer.dataiku.com/latest/tutorials/genai/nlp/create-knowledge-bank/index.html). However, the instantiation of the KnowledgeBank fails because the context_project_key is missing…
-
API Returns Only Enabled Metrics After First Dataset Upload
When I upload a dataset for the first time, dataset.get_settings().get_raw().get('metrics') returns enabled metrics only by default. After toggling any metric in the UI (enable/disable), the API then returns ALL metrics, including disabled ones. Is there a way to programmatically initialize all metrics without UI…
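Whatever the root cause, code that reads this part of the raw settings has to tolerate the block being absent or partial on a freshly uploaded dataset. The payload below is a hypothetical stand-in for illustration, not the exact DSS schema; only the defensive navigation pattern is the point:

```python
# Hypothetical raw-settings payload: on a fresh dataset the 'metrics' block
# may be missing or list only enabled probes, so navigate it defensively.
raw_settings = {
    "metrics": {
        "probes": [
            {"type": "basic", "enabled": True},
            {"type": "col_stats", "enabled": True},
        ]
    }
}

def list_probes(raw, include_disabled=True):
    """Return probe dicts from a raw settings payload, tolerating gaps."""
    probes = (raw.get("metrics") or {}).get("probes") or []
    if include_disabled:
        return probes
    return [p for p in probes if p.get("enabled")]

print([p["type"] for p in list_probes(raw_settings)])
```

Reading via a helper like this at least makes the "enabled-only before first toggle" behaviour harmless to downstream code, even if it does not initialize the missing entries.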
-
How to use a remote MCP server tool in Dataiku
Is there documentation on how to do this in a recipe or notebook? I have created an MCP server following the tutorial here: Building your MCP Server in Dataiku - Dataiku Developer Guide. I have tested it via MCP Inspector. Now I'd like to see how I can call this, or an external MCP server, from a Dataiku project. The "wrapping…
-
How to convert Visual Recipes to Code Recipes?
I have a Dataiku project handed over from others that was built with visual recipes. As I'm used to using Python for implementation and maintenance, I would like to convert the project to code recipes. Does Dataiku provide a quick conversion for this scenario? Operating system used: Windows
-
Passing Previous Month Partition to workflow recipes
In the scenario build step, there is an option for selecting time partitions. This works great, but I want to know how to pass this variable to recipes such as Python and SQL recipes. I wasn't able to make this work until I looked at the log and saw the variable '$DKU_DST_YEAR_MONTH'; this appears to work, but I don't know if…
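For the Python side, DSS exposes the partition identifiers of the current activity as flow variables, reachable in a Python recipe through `dataiku.dku_flow_variables`. The sketch below falls back to a stand-in dict so it stays runnable outside DSS; the stand-in value and the `sales` table are hypothetical, while the variable name is the one from the log:

```python
# Inside a DSS Python recipe, partition identifiers such as
# DKU_DST_YEAR_MONTH are exposed as flow variables. Outside DSS we fall
# back to a stand-in dict so the sketch stays runnable.
try:
    from dataiku import dku_flow_variables as flow_vars
except ImportError:
    flow_vars = {"DKU_DST_YEAR_MONTH": "2024-05"}  # local stand-in value

year_month = flow_vars["DKU_DST_YEAR_MONTH"]

# The value can then drive a parameterized query; in a SQL recipe the
# rough equivalent is the ${DKU_DST_YEAR_MONTH} substitution syntax.
query = f"SELECT * FROM sales WHERE year_month = '{year_month}'"
print(query)
```

Compared to hard-coding `$DKU_DST_YEAR_MONTH` strings, going through the flow-variables dict keeps the recipe explicit about which partition dimension it depends on.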