-
2 Features I desperately need added to Dataiku
Appending to output dataset in Python code recipes. Currently the way to do this is with a checkbox in the recipe's settings that says "Append instead of overwrite". However, this is limiting and does not handle potential schema changes well (this checkbox has caused significant data loss…
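One defensive pattern (a sketch, not a DSS feature) is to read the existing output, align the new rows to its schema yourself, and write the combined frame back — so a schema drift surfaces as explicit NAs instead of a silent overwrite. The dataset name and the `append_aligned` helper below are illustrative assumptions.

```python
import pandas as pd

def append_aligned(existing_df, new_df):
    """Align new rows to the existing schema before appending.

    Columns missing from new_df are filled with NA; extra columns in
    new_df are dropped, so schema drift cannot silently clobber the
    output. (Hypothetical helper, not part of the dataiku API.)
    """
    aligned = new_df.reindex(columns=existing_df.columns)
    return pd.concat([existing_df, aligned], ignore_index=True)

# In a DSS Python recipe this could be used roughly like:
#   import dataiku
#   out = dataiku.Dataset("my_output")          # assumed dataset name
#   existing = out.get_dataframe()
#   out.write_with_schema(append_aligned(existing, new_rows))
```

Reading the whole output back is only practical for modestly sized datasets, but it gives full control over how schema changes are handled.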
-
Examples for custom prediction in API Designer
Are there any actually useful code examples of using custom prediction in Python? I have a model in my Flow and I want to use that model to make a prediction just like the Prediction model API endpoint would, and then add more custom code on top of that. The boilerplate code imports dataiku and…
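One way to layer custom logic on top of a Flow model (a sketch under assumptions — the saved-model id, the probability column name, and the `postprocess` helper are all illustrative) is to score with the model first and then run a plain Python post-processing step on the prediction dataframe:

```python
import pandas as pd

def postprocess(preds, threshold=0.7):
    """Custom logic layered on top of model predictions: flag
    low-confidence rows for manual review (illustrative only)."""
    out = preds.copy()
    # "proba_1" is an assumed probability column name
    out["needs_review"] = out["proba_1"] < threshold
    return out

# Inside DSS, the Flow model could be scored first (sketch):
#   import dataiku
#   predictor = dataiku.Model("saved_model_id").get_predictor()
#   preds = predictor.predict(input_df)
#   result = postprocess(preds)
```

Keeping the custom step as a pure function of the prediction dataframe also makes it easy to reuse the same logic in an API Designer custom endpoint.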
-
Using a variable or not depending on the scenario
Hi guys, I have a flow with 2 different scenarios. I have one variable v_idproduct used in the post-filter of a join recipe, in SQL (id_product IN v_id_product). In each scenario I have a different list of product IDs. I want to modify one of the scenarios so that this filter is no longer applied, allowing all product IDs…
-
How to create a conditional table in dataiku after flow is successful
Once the table load is done and the flow is successful, add "1" to a table value indicating the load is complete.
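One way to do this (a sketch under assumptions — the dataset name and columns are illustrative) is a final Python step in the scenario that writes a one-row flag dataset; since scenario steps only run after the preceding build steps succeed by default, the flag is written only on a successful load:

```python
import pandas as pd
from datetime import datetime, timezone

def load_complete_row():
    """One-row frame marking the load as complete (illustrative)."""
    return pd.DataFrame(
        {"load_complete": [1],
         "loaded_at": [datetime.now(timezone.utc).isoformat()]}
    )

# As a final "Execute Python code" scenario step (sketch):
#   import dataiku
#   flag = dataiku.Dataset("load_status")   # assumed dataset name
#   flag.write_with_schema(load_complete_row())
```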
-
Display HTML File Content in a Dash Webapp
Hello everyone, I am working on a Dash web app to create a chatbot using RAG. I have stored my HTML files in a Dataiku folder. When the LLM responds to the user, I display the sources of the documents (thanks to the metadata in the embed recipe) as file paths. However, I would like to display them as hyperlinks that open a…
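A browser cannot open a server-side file path directly, so one option (a sketch; the folder id and stream call in the comments are assumptions) is to embed the HTML file's content in the link itself as a base64 `data:` URI, which a Dash `html.A` can open in a new tab:

```python
import base64

def html_data_uri(html_bytes):
    """Turn raw HTML bytes into a data: URI a browser can open in a
    new tab, so a Dash html.A link works without exposing the
    server-side file path (sketch; very large files may exceed
    browser URI length limits)."""
    b64 = base64.b64encode(html_bytes).decode("ascii")
    return "data:text/html;base64," + b64

# In the Dash callback (sketch, assuming a folder id and file path):
#   import dataiku
#   from dash import html
#   folder = dataiku.Folder("folder_id")
#   with folder.get_download_stream(source_path) as s:
#       link = html.A("source", href=html_data_uri(s.read()),
#                     target="_blank")
```

For large documents, serving the files through a webapp backend route and linking to that URL would scale better than data URIs.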
-
Issue with Custom Plugin: “Argument List Too Long” When Passing Large Config Parameter
Hello Dataiku Community, I’m developing a custom plugin in Dataiku, and I’m facing an issue when passing a large config parameter. The parameter is a list with around 500 items, and I can see that:
• The list appears correctly in the plugin’s UI.
• It also logs correctly in the browser console using JavaScript.
• However,…
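"Argument list too long" is the OS rejecting an over-long process command line, which suggests the 500-item list is being passed as a literal argument. A common workaround (sketch — in a real plugin a managed folder or dataset would be a better home than a temp file) is to write the list somewhere and pass only its short path:

```python
import json
import tempfile

def write_config_list(items):
    """Persist a large config list to a JSON file and return its
    path, so only the short path travels on the command line
    (illustrative workaround)."""
    f = tempfile.NamedTemporaryFile(
        mode="w", suffix=".json", delete=False)
    json.dump(items, f)
    f.close()
    return f.name

def read_config_list(path):
    """Counterpart used inside the recipe/plugin backend."""
    with open(path) as f:
        return json.load(f)
```

The round-trip keeps the list out of any environment variable or argv, so its size is limited only by disk, not by the kernel's argument-size cap.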
-
Why is there a limit of 75 Columns for Power Bi Exports?
I have a dataset that I want to export to Power BI. It will not let me export a file with more than 75 columns. Do not answer "cut the file" — I am not doing that. There is no problem exporting a dataset as a Hyper file and using it as a data source on my Tableau server. There should not be a problem with Power BI.
-
How to setup Athena connection using s3 connection
Hello, I've already set up an S3 connection which is working perfectly. I'd like to use it to set up an Athena connection. My S3 connection uses the STS assume-role type. I use Dataiku Cloud Stacks (AWS). Can you tell me why this does not work? Do I need specific policies in the S3 role and instance profile? I've attached the error in screenshots.
-
Can you please help with the documentation on spark vs dss engine
Hi Team, requesting your urgent attention to help us with the official documentation on the Spark vs. DSS engine, with actual scenarios. Primarily we are looking at when to use which. Which recipes are a good fit for the Spark engine? Secondly, if a dataset is small, why does the Spark engine consume more time than the DSS engine?
-
Survival Analysis plugin - interpreting outputs
Hi! I am trying to implement the Survival Analysis plugin using the Weibull distribution. I am trying to replicate a model in Python where we use the predict_median method to get predictions from the model. Is there a way to do this in Dataiku?