-
Testing SQL Connections
We want to test database connections in all our instances. Our internal security policy is to change database passwords yearly, and this invariably leads to some user database connections being missed, the password expiring, flows failing, data not being…
-
How to unzip files from a managed folder in Dataiku
I am getting the error: NotImplementedError: That compression method is not supported at zip_file.open(), even though the code is able to list the filenames in the zipped folder, as you can see: VBOX0001.vbo Operating system used: Windows
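A likely cause: `namelist()` only reads the zip's central directory, so listing always works, but `open()` raises `NotImplementedError` when an entry uses a compression method Python's `zipfile` does not support (for example Deflate64, method 9, which some Windows tools produce). A minimal diagnostic sketch using only the standard library — in DSS you would typically obtain the bytes from the managed folder (e.g. via a `get_download_stream` call on a `dataiku.Folder`, names here are illustrative):

```python
import io
import zipfile

def diagnose_zip(zip_bytes):
    """Report each entry's compression method and whether Python's
    zipfile module can decompress it. Listing names always works (it
    only reads the central directory), but open() raises
    NotImplementedError for unsupported methods such as Deflate64."""
    supported = {zipfile.ZIP_STORED, zipfile.ZIP_DEFLATED,
                 zipfile.ZIP_BZIP2, zipfile.ZIP_LZMA}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return {info.filename: (info.compress_type,
                                info.compress_type in supported)
                for info in zf.infolist()}

# Demonstrate on a small zip built in memory; in DSS the bytes would
# come from the managed folder instead of this buffer.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("VBOX0001.vbo", "example contents")
print(diagnose_zip(buf.getvalue()))
```

If the report shows an unsupported method, re-zipping the files with standard Deflate compression (or extracting with an external tool) is the usual workaround.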
-
Preformat and preprocess a registered Dataiku model through Python by using
I would like to use a registered model from a Python recipe and apply its preformatting and preprocessing to a huge dataframe connected to Databricks (which helps with memory issues). But it seems this is not possible without passing through a pandas dataframe. Does anybody know how to resolve this? The error…
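When the whole table does not fit in memory, the common workaround is to process it in chunks rather than as one dataframe; the dataiku Python API exposes chunked reads (e.g. `Dataset.iter_dataframes(chunksize=...)`) that follow this pattern. A dependency-free sketch of the chunking logic, with `score_fn` standing in for whatever the registered model's scoring call is:

```python
def iter_chunks(rows, chunksize):
    """Yield successive lists of at most `chunksize` rows, so only one
    chunk is ever materialized in memory at a time."""
    chunk = []
    for row in rows:
        chunk.append(row)
        if len(chunk) == chunksize:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

def score_in_chunks(rows, score_fn, chunksize=10000):
    """Apply a scoring function chunk by chunk; in a DSS recipe each
    scored chunk would be appended to the output dataset's writer."""
    for chunk in iter_chunks(rows, chunksize):
        for row in chunk:
            yield score_fn(row)

# Hypothetical usage with a toy scoring function
scored = list(score_in_chunks(range(25), lambda x: x * 2, chunksize=10))
```

This keeps peak memory proportional to the chunk size rather than the table size, though it does not avoid pandas entirely if the model's scoring API requires a dataframe.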
-
How do I get the average of a column to use in a DSS output file (not a metric)
I have a small table that gives me the average # of workdays, average daily value, and average forecast for the previous 3 months. I have to be able to calculate the average of each column so I can use the results in another prepare recipe (for example, the average of Workdays is 22). Is there a formula that will create…
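One visual option is a Group recipe with no group key, which produces the per-column averages as a one-row dataset you can then join back before the Prepare recipe. If a code step is acceptable, the computation itself is trivial; a minimal sketch with hypothetical column names:

```python
def column_averages(rows, columns):
    """Compute the mean of each named column over a list of row dicts."""
    sums = {c: 0.0 for c in columns}
    n = 0
    for row in rows:
        n += 1
        for c in columns:
            sums[c] += row[c]
    return {c: sums[c] / n for c in columns}

# Toy stand-in for the 3-month table described above
rows = [
    {"Workdays": 21, "DailyValue": 100.0},
    {"Workdays": 22, "DailyValue": 110.0},
    {"Workdays": 23, "DailyValue": 120.0},
]
print(column_averages(rows, ["Workdays", "DailyValue"]))
# {'Workdays': 22.0, 'DailyValue': 110.0}
```

The resulting one-row output can be joined onto the downstream dataset so the averages are available as ordinary columns in the next Prepare recipe.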
-
Keeping Conditional Formatting on Export Doesn't Work
I have an issue with applying conditional formatting on export: Dataiku does not apply the formatting when exporting. What is the issue, and how can I keep the conditional formatting?
-
Label Encoding Dataiku Recipe
Hello, I cannot find any recipe that is the equivalent of scikit-learn's LabelEncoder(). One-hot encoding is available in the Prepare recipe as the "Unfold" step, but a LabelEncoder equivalent (i.e. label_1 = 1, label_2 = 2 … label_N = N) is very well hidden. I could write a Python recipe, but I would prefer to do every task in…
-
Automation of insights export to PDF from notebook
I have a Dataiku notebook in a flow that creates various visualizations. After the visualizations are generated, I manually publish and export them as a PDF. I would like to automate this entire process, from generating the visualizations to publishing and exporting the PDF. Could someone guide me on how to automate this…
-
Modify "Answers" WebApp Plugin
Hello all, I have DSS v13.1 and I want to modify the "Answers" plugin to add a feature for uploading documents (PDFs, images, Excel, Word, etc.), like the one used in ChatGPT-4, for example. What is the best way to do so? Operating system used: Linux - Debian 11
-
If the dataset does not exist, I want to either catch the exception thrown or check whether the dataset exists
Exception: Reading dataset failed: b"Failed to read data from table, caused by: SnowflakeSQLException: SQL compilation error:\nObject 'PROD_.GLOBAL_PILOT' does not exist or not authorized."
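Since the read raises a plain `Exception` wrapping the Snowflake error, the straightforward options are to catch it around the read call, or to check the project's dataset list beforehand via the dataiku API client. A sketch of the try/except pattern — `fake_loader` here is a hypothetical stand-in for the real read (in DSS, something like `dataiku.Dataset(name).get_dataframe()`):

```python
def read_or_none(load_fn, name):
    """Attempt to load a dataset by name; return None instead of
    propagating the exception when the underlying table is missing."""
    try:
        return load_fn(name)
    except Exception as e:
        print(f"Could not read {name}: {e}")
        return None

def fake_loader(name):
    """Hypothetical stand-in that mimics the reader raising when the
    backing table does not exist or is not authorized."""
    tables = {"KNOWN_TABLE": [1, 2, 3]}
    if name not in tables:
        raise Exception(f"Object '{name}' does not exist or not authorized.")
    return tables[name]

print(read_or_none(fake_loader, "KNOWN_TABLE"))   # [1, 2, 3]
print(read_or_none(fake_loader, "MISSING_TABLE")) # None
```

Catching the broad `Exception` is unavoidable here because the dataiku reader does not raise a more specific type for a missing table; the existence pre-check avoids that at the cost of an extra API call.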
-
Dataiku visual Recipe Parallel
I am using Dataiku 12.5.2. How can I enable parallel processing when using a Sync recipe for the following cases: Filesystem dataset to Filesystem dataset, JDBC dataset to JDBC dataset, or between a Filesystem dataset and a JDBC dataset? Are the only available options duplicating the flow, partitioning the data, or using code…