Hi everyone, I’m currently working on a project where I need to organize files inside a Managed Folder hierarchically (for example: mainfolder/subfolder/data.xlsx, mainfolder/REPORTS/report1.csv, etc.). Is there any supported or alternative way (via the API, SDK, or external libraries) to create real subfolders…
Hello, as a data steward I set up Entity-Relationship Models (MCD in French). I use the Mocodo tool for this. I write my Mocodo script in a Jupyter notebook within Dataiku, and I get an SVG picture of my ERM (see the attached file). I would like to save this picture as a PDF or PNG. Could someone support…
Anyone have an idea why my tables are being created with only varchar(18) when the query produces text longer than that? I'm getting this error: ERROR: Value too long for character type Detail: ----------------------------------------------- error: Value too long for character type code: 8001 context: Value too long for type…
Hi, I need to change my email to my work email so the certification will be valid at work. But I signed up with Google Sign-In, so I can't change it under Edit Profile. What should I do?
I want to run install-graphics-export so I can export my flow or a dashboard to PDF. I'm in an air-gapped environment. When I start the installation, I get the following error message: ./dssadmin install-graphics-export [+] Saving installation log to /data/dataiku/data/run/install.log [+] Checking dependencies +…
Since column metadata can also be recorded in the Data context information of the SQL query tool, I was wondering: if the descriptions inside the dataset are sufficient on their own, is it fine to leave the SQL query tool's Data context information empty? Also, is there a way to check the queries that the SQL query tool has executed?
I have an Excel file whose format is somewhat unusual. I need to skip the first 9 lines and use the 10th line as the column names. I have tried using "Parse next line as column headers" but it is not working for me. Has anyone faced this type of issue? Please let me know how to resolve it. I am pasting the input file format below:
Hi all, I'm currently working on a Python endpoint on a Dataiku API node, where I need to dynamically generate and execute SQL queries against a Snowflake connection. These SQL queries are generated via OpenAI, so I cannot use a predefined SQL endpoint (those require fixed queries). I followed the official Dataiku…
In my Dataiku UI, I created two Python environments under Code Envs: one for Python 3.12 and one for Python 3.9. Each environment installs the Python libraries listed in "packages to install" by retrieving them from repositories hosted on an Artifactory. I imported the Artifactory certificate into the keystore of my Dataiku VM…
Operating system used: Windows