-
Test Queries: batch prediction in the API Designer
Hi, can I enter multiple calls here to receive back multiple predictions? If so, how? I can only run one query at a time. Thank you!
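One way to get multiple predictions in a single call is to go through the API node's Python client rather than the Test Queries panel. The sketch below assumes the `dataikuapi` package and its `APINodeClient`, whose `predict_records()` sends a batch of records in one request; the host URL, service id, and endpoint id (`"my_service"`, `"predict"`) are placeholders for your own deployment.

```python
# Sketch: batch prediction against an API node endpoint via dataikuapi.
# Host/service/endpoint names below are hypothetical placeholders.

def rows_to_records(feature_names, rows):
    """Turn value tuples into the list-of-dicts shape batch prediction
    expects: [{"features": {...}}, ...]."""
    return [{"features": dict(zip(feature_names, row))} for row in rows]

def batch_predict(host, service_id, endpoint_id, records):
    # Imported here so the pure helper above works without the client installed.
    import dataikuapi
    client = dataikuapi.APINodeClient(host, service_id)
    # predict_records sends all records in one HTTP call and returns
    # one prediction per input record.
    return client.predict_records(endpoint_id, records)

records = rows_to_records(["age", "income"], [(31, 52000), (45, 87000)])
# batch_predict("https://api-node:12000", "my_service", "predict", records)
```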
-
How to save a Pyspark DataFrame to a managed folder
Hello, Community. May I know how I can use a PySpark recipe to save my PySpark DataFrame as a CSV file to an output managed folder? I have searched the community, but most posts only cover pandas DataFrames...
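A minimal sketch of one approach, assuming the `dataiku` package available inside a PySpark recipe: serialize the Spark DataFrame to CSV bytes, then stream them into the managed folder with `Folder.get_writer()`. This collects the data to the driver, so it is only suitable for DataFrames of modest size; the folder and file names are placeholders.

```python
# Sketch: write a (small) Spark DataFrame as one CSV file into a managed folder.
import csv
import io

def csv_bytes(header, rows):
    """Render header + rows as UTF-8 CSV bytes."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

def write_spark_df_to_folder(spark_df, folder_name, file_name="output.csv"):
    import dataiku
    folder = dataiku.Folder(folder_name)      # name of the output managed folder
    header = spark_df.columns
    rows = [tuple(r) for r in spark_df.collect()]  # fine for modest sizes only
    with folder.get_writer(file_name) as writer:
        writer.write(csv_bytes(header, rows))
```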
-
I have passed the correct API key, but it still shows an API key error
Even though I've entered the correct API key, it still indicates an API key error. How can this issue be resolved?
-
Is it possible to output model parameters to an Excel file in Dataiku?
I am using Dataiku to build models. After building them, I want to create a model report in Excel with multiple sheets. We plan to output details such as the model training data (IDV, DV, model predictions, errors) in one sheet, DV-IDV plots in another, and SHAP values, hyperparameters, etc. in others. Is…
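The multi-sheet part can be done with pandas' `ExcelWriter` in a Python recipe; the resulting file could then be written into a managed folder. A sketch, where the sheet contents and names are placeholders (Excel caps sheet names at 31 characters and forbids `[]:*?/\`, hence the helper):

```python
# Sketch: write several DataFrames into one Excel workbook, one per sheet.

def safe_sheet_name(name):
    r"""Excel sheet names are capped at 31 chars and cannot contain []:*?/\ ."""
    cleaned = "".join(c for c in name if c not in "[]:*?/\\")
    return cleaned[:31]

def write_report(sheets, path):
    """sheets: dict mapping sheet name -> DataFrame."""
    import pandas as pd
    with pd.ExcelWriter(path, engine="openpyxl") as writer:
        for name, df in sheets.items():
            df.to_excel(writer, sheet_name=safe_sheet_name(name), index=False)

# Hypothetical usage:
# write_report({"training data": train_df, "shap values": shap_df,
#               "hyperparameters": hp_df}, "model_report.xlsx")
```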
-
Save Static Insight using dkuSaveInsight()
Hi, I want to save an rpart plot (through an R recipe) as a static insight to publish it in a dashboard. I found in this documentation that by using dkuSaveInsight(), one can save any R plot as a static insight. However, the documentation provided is not very explanatory. Any ideas/help on this would be highly…
-
Scenario
I want to build a dataset on a monthly basis, but I want to save each result (the built dataset) separately in a folder. How do I do it? I do not want to append the data: each month's built dataset should be saved separately.
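One possible approach (a sketch, not the only way): let the scenario rebuild the dataset as usual, then add a Python step or recipe that archives each build as a month-stamped file in a managed folder. The dataset and folder names below are placeholders.

```python
# Sketch: archive the current build of a dataset as a month-stamped CSV
# in a managed folder, so each monthly result is kept separately.
from datetime import date

def month_stamped_name(base, d, ext="csv"):
    """'sales' + 2024-03-10 -> 'sales_2024-03.csv'"""
    return "%s_%04d-%02d.%s" % (base, d.year, d.month, ext)

def archive_current_build(dataset_name, folder_name="monthly_archive"):
    import dataiku
    df = dataiku.Dataset(dataset_name).get_dataframe()
    folder = dataiku.Folder(folder_name)   # hypothetical archive folder
    target = month_stamped_name(dataset_name, date.today())
    with folder.get_writer(target) as w:
        w.write(df.to_csv(index=False).encode("utf-8"))
```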
-
'Sub' scenario aborts 'main' scenario
I have created a scenario (Scenario A) that triggers another scenario (Scenario B). I want to abort Scenario B when a certain condition is met (which should also abort Scenario A). My thought is to introduce a Kill scenario step in Scenario B that kills Scenario A (and, by right, aborts Scenario B). Will this…
-
How to load a txt file delimited by * ?
I normally use the following to load txt files delimited by *: test_df = spark.read.format("csv").option("inferSchema", infer_schema).option("header", first_row_is_header).option("sep", "*").load(file_location) (note: the "csv" source is what accepts the "sep" option). But here in Dataiku I neither have the option to set the delimiter nor a way to use my own code. Does anyone know how to load using…
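If the dataset format settings don't expose the separator you need, a Python recipe can parse the raw file itself: the standard library's `csv.reader` accepts any single-character delimiter, including `*`. The folder and file names below are placeholders.

```python
# Sketch: read a '*'-delimited text file from a managed folder and parse it.
import csv
import io

def parse_star_delimited(text, delimiter="*"):
    """Parse '*'-separated text into a list of rows."""
    return list(csv.reader(io.StringIO(text), delimiter=delimiter))

def read_from_folder(folder_name, file_name):
    import dataiku
    folder = dataiku.Folder(folder_name)
    with folder.get_download_stream(file_name) as stream:
        return parse_star_delimited(stream.read().decode("utf-8"))
```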
-
Write an HTML file to Azure Blob Storage with content type text/html
Hello, I am trying to write an HTML file with the help of a Python recipe into a folder which is located on blob storage. The writing works so far, but unfortunately the content type of the file is not text/html but application/octet-stream. In the Element folder in Dataiku I have set the Content Type to text/html under…
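If the managed-folder writer keeps defaulting the blob to application/octet-stream, one workaround is to upload that file directly with the Azure SDK, which lets you set the Content-Type explicitly via `ContentSettings`. A sketch, assuming the `azure-storage-blob` package and a connection string; the container and blob names are placeholders.

```python
# Sketch: upload an HTML file to Azure Blob Storage with an explicit
# Content-Type, bypassing the managed-folder writer.
import mimetypes

def content_type_for(filename, default="application/octet-stream"):
    """Guess a MIME type from the file name ('report.html' -> 'text/html')."""
    guessed, _ = mimetypes.guess_type(filename)
    return guessed or default

def upload_html(conn_str, container, blob_name, html_text):
    from azure.storage.blob import BlobServiceClient, ContentSettings
    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container=container, blob=blob_name)
    blob.upload_blob(
        html_text.encode("utf-8"),
        overwrite=True,
        content_settings=ContentSettings(content_type=content_type_for(blob_name)),
    )
```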
-
Disable the Dash instance created before running a Dash project on Dataiku
Hello, I have developed a Dash web app using the Dash package, and it works well locally. However, once I uploaded it to Dataiku, it does not work as expected. My question is: how should I remove the "app = dash.Dash(...)" instance defined in Dataiku? Is there a way to disable it and enable the Dash instance declared in…
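My understanding (an assumption worth verifying against your instance) is that a Dataiku Dash webapp injects a pre-created `app` object that your code should configure, rather than replacing it with your own `dash.Dash(...)` call. A small guard like the sketch below lets the same file run both locally and inside Dataiku; `resolve_app` is a hypothetical helper, not a Dataiku API.

```python
# Sketch: reuse a pre-existing Dash `app` when one is injected (as in a
# Dataiku Dash webapp), otherwise create our own for local runs.

def resolve_app(namespace, factory):
    """Return namespace['app'] if the platform already created one,
    otherwise build a fresh instance with `factory`."""
    existing = namespace.get("app")
    return existing if existing is not None else factory()

# In the webapp code:
# import dash
# app = resolve_app(globals(), lambda: dash.Dash(__name__))
# app.layout = ...
```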