-
Do we know when Dataiku 13 will be released?
We are currently using Dataiku 11 Community edition. We plan to upgrade to Dataiku 12, but that process takes time for us due to internal procedures. Looking at the trends of previous releases, the Dataiku 13 release date might be really close. Do we have any idea, or an official announcement, of when that will…
-
Dataiku application using CSV input OR Dataiku dataset
Hi all, I am trying to build an application using the Dataiku Application Designer and need your help! A flow diagram is attached for additional reference. I have a flow with 4 inputs: input 1 folder, input 2 folder (both for CSV inputs), an editable dataset, and a Dataiku dataset. I've created a UI using the Application Designer.…
-
Long running job - where to find in logs?
Hi all, I was wondering where I can find the information displayed in the UI about long-running jobs. In the UI this section is under Administration -> Monitoring -> Running background tasks. Where can I find this information in the data directory? Additionally, is there a way to send a notification when a job is still running…
-
DSS Release Compatibility
Hi, I wonder how Dataiku release compatibility is supported. For example, does Dataiku allow a project bundled on a release 11 DSS instance to be used on a release 12 DSS instance? Thanks
-
Parquet format table redetected as CSV
Hey, I've been running into an issue where, after creating a dataset stored in Parquet using a PySpark recipe, the dataset is redetected as CSV with a very different schema. Here's the dataset before pressing redetect format: And after pressing redetect format, it goes from 18 to 75 columns: And the new…
-
Append Data without drop table with python
Hello, I am building a simple web app with some input fields in DSS in order to insert a single row of data into a SQL table for which I have created the dataset in DSS. I have successfully appended the data using the write-dataframe function and a pandas DataFrame. But this function drops the SQL table before…
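A minimal sketch, outside DSS, of the "append without drop" behavior the question is after: insert rows into an existing SQL table rather than recreating it on every write. The table name and helper below are illustrative, not a DSS API; on the DSS side, the usual fix (an assumption to verify on your version) is enabling the output dataset's "Append instead of overwrite" option before writing, so the write does not drop and recreate the table.

```python
import sqlite3

def append_row(conn, table, row):
    """INSERT one row; the existing table and its data are left untouched."""
    placeholders = ", ".join("?" for _ in row)
    conn.execute(f"INSERT INTO {table} VALUES ({placeholders})", row)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER, value REAL)")
append_row(conn, "readings", (1, 3.14))
append_row(conn, "readings", (2, 2.72))  # second call appends; no drop
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
```

The key design point is the same in either environment: the writer must issue plain INSERTs against the existing table instead of a drop-and-recreate cycle.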
-
Multi-label classification
Hello community! Is it currently possible to perform multi-label classification using the built-in recipes, the Lab, or any other features in DSS?
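When a platform only offers binary or multiclass models, a common workaround for multi-label problems is "binary relevance": train one independent binary classifier per label. A minimal stdlib sketch with a toy nearest-centroid base classifier; all class and variable names are illustrative, not a DSS API.

```python
class CentroidBinary:
    """Toy binary classifier: predict 1 if closer to the positive centroid."""
    def fit(self, X, y):
        pos = [x for x, t in zip(X, y) if t == 1]
        neg = [x for x, t in zip(X, y) if t == 0]
        self.pos_c = [sum(col) / len(pos) for col in zip(*pos)]
        self.neg_c = [sum(col) / len(neg) for col in zip(*neg)]
        return self

    def predict(self, x):
        dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
        return 1 if dist(self.pos_c) <= dist(self.neg_c) else 0

class BinaryRelevance:
    """Multi-label via one independent binary model per label column of Y."""
    def fit(self, X, Y):
        self.models = [CentroidBinary().fit(X, [row[j] for row in Y])
                       for j in range(len(Y[0]))]
        return self

    def predict(self, x):
        return [m.predict(x) for m in self.models]

X = [[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]]
Y = [[1, 0], [1, 0], [0, 1], [1, 1]]  # each sample carries two labels
clf = BinaryRelevance().fit(X, Y)
pred = clf.predict([0.05, 0.1])
```

In practice the per-label models would be proper classifiers (e.g. one DSS visual model per label column), but the decomposition itself is exactly what the sketch shows.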
-
Using a JAR-based Spark data source in a Dataiku recipe
Hi, I have a JAR file that I use as a Spark data source to read data from a URL. This is how I do it on my local machine: conf = SparkConf().setAll([("spark.jars", "path-to-jar/<jar_name>.jar"),]) df = spark.read.format("cloud.alpha.spark.providers.appobject.AppObjectTableProvider") \ .option("appName", "<name>") \…
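A hedged sketch of the DSS-side equivalent: rather than building a SparkConf by hand inside the recipe, DSS recipes typically pick up extra Spark properties from a named Spark configuration (key/value pairs under Administration > Settings > Spark, or an override in the recipe's Advanced tab; exact location may vary by version). The property itself is standard Spark; the jar path below is the placeholder from the question, not a real location.

```
spark.jars = path-to-jar/<jar_name>.jar
```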
-
Cannot Edit Tile Titles Since Update to 12.6.0
Morning all, it appears that since the recent upgrade to 12.6.0 it is no longer possible to directly edit the Title of a tile added to a dashboard. Usually we would edit the tile title to remove the appended dataset name on which the insight was originally based. I am able to click on the dashboard tile and see the Title…
-
Sequentially run a Dataiku workflow for each file within a managed dataset
.