-
GitLab integration via HTTPS
Hello, I've set up the Git HTTPS configuration (by logging in over SSH to the EC2 instance that runs Dataiku). I've created a token and added it to the gitconfig with the right settings. This setup works, as I tested it by cloning a project. In the Dataiku interface, however, I have an issue when I try to add the remote repository. What is the issue here? Not authorized…
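A minimal sketch of the command-line side of that setup, assuming GitLab over HTTPS with a personal access token and that it is run as the same Linux account that runs DSS (the host, repository path, and the "oauth2" username are placeholders/assumptions):

import os
import subprocess

GITLAB_HOST = "gitlab.example.com"   # hypothetical host
TOKEN = "<personal-access-token>"    # placeholder; keep real tokens out of code

# Tell git to read credentials from the plain-text store (~/.git-credentials).
subprocess.run(["git", "config", "--global", "credential.helper", "store"], check=True)

# Store format: one "https://<user>:<token>@<host>" entry per line. For a GitLab
# personal access token the username can be any non-blank value; "oauth2" is a
# common convention.
with open(os.path.expanduser("~/.git-credentials"), "a") as f:
    f.write(f"https://oauth2:{TOKEN}@{GITLAB_HOST}\n")

# Sanity check that non-interactive access works, which is what DSS needs.
subprocess.run(["git", "ls-remote", f"https://{GITLAB_HOST}/group/project.git"], check=True)

If the manual clone works but DSS still answers "Not authorized", it is worth checking that the credentials were stored for the same UNIX user and HOME directory that the DSS backend runs under.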
-
Can Dataiku output a transformed set of data into an encrypted Excel file?
I am looking to output a dataset after it has been transformed. I'd like to convert it into an Excel file and then zip and encrypt it in, say, WinZip... Does anyone know if this is possible? Operating system used: Windows
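One way to get there is a Python recipe. A minimal sketch, assuming the pyzipper package (for WinZip-compatible AES encryption) and openpyxl are available in the code environment, and that the recipe writes into a managed folder; the dataset name, folder name, and password are placeholders:

import os
import tempfile

import dataiku
import pyzipper  # third-party package for AES-encrypted, WinZip-compatible zips

# Read the transformed dataset (placeholder name).
df = dataiku.Dataset("transformed_data").get_dataframe()

with tempfile.TemporaryDirectory() as tmp:
    xlsx_path = os.path.join(tmp, "export.xlsx")
    zip_path = os.path.join(tmp, "export.zip")

    # Write the Excel file (pandas uses openpyxl for .xlsx).
    df.to_excel(xlsx_path, index=False)

    # Zip it with AES encryption; WinZip opens WZ_AES archives.
    with pyzipper.AESZipFile(zip_path, "w",
                             compression=pyzipper.ZIP_DEFLATED,
                             encryption=pyzipper.WZ_AES) as zf:
        zf.setpassword(b"change-me")  # placeholder password
        zf.write(xlsx_path, arcname="export.xlsx")

    # Push the encrypted archive into a managed folder (placeholder name).
    dataiku.Folder("exports").upload_file("export.zip", zip_path)

Note that classic zip "password protection" (ZipCrypto) is weak; AES is what WinZip uses for proper encryption, hence pyzipper rather than the standard-library zipfile, which cannot write encrypted archives.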
-
Do we know when Dataiku 13 will be released?
We are currently using Dataiku 11 Community edition. We plan to upgrade to Dataiku 12, but that's a process that takes time for us due to internal procedures. Looking at the trend of previous releases, the Dataiku 13 release date might be really close. Do we have any idea or an official announcement of when that will…
-
Dataiku Application using CSV input OR Dataiku dataset
Hi all, I am trying to build an application using the Dataiku Application Designer and need your help! A flow diagram is attached for additional reference. I have a flow with 4 inputs: input folder 1 and input folder 2 (both for CSV inputs), an editable dataset, and a Dataiku dataset. I've created a UI using the Application Designer…
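One pattern that may help for the "CSV input OR Dataiku dataset" part is a Python recipe that prefers an uploaded CSV when the input folder contains files and falls back to the regular dataset otherwise. A rough sketch; input_1, fallback_dataset, and combined_output are placeholder names standing in for the actual flow items:

import dataiku
import pandas as pd

folder = dataiku.Folder("input_1")           # placeholder folder name
paths = folder.list_paths_in_partition()     # files currently in the folder, if any

if paths:
    # A CSV was uploaded: read the first file from the managed folder.
    with folder.get_download_stream(paths[0]) as stream:
        df = pd.read_csv(stream)
else:
    # No upload: fall back to the regular Dataiku dataset input.
    df = dataiku.Dataset("fallback_dataset").get_dataframe()

dataiku.Dataset("combined_output").write_with_schema(df)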
-
Long-running job - where to find it in the logs?
Hi all, I was wondering where I can find the information displayed in the UI about long-running jobs. In the UI this section is under Administration -> Monitoring -> Running background tasks. Where can I find this information in the data directory? Additionally, is there a way to send a notification when a job is still running…
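For the notification part, one option is a scenario (or standalone script) that polls the public API for running jobs and alerts once a threshold is exceeded. A rough sketch using the Python API client; the exact keys of the job descriptors ("state", "startTime", "def") are assumptions to verify against what list_jobs() actually returns on your instance, and the webhook URL and project key are placeholders:

import time

import dataiku
import requests

THRESHOLD_MS = 2 * 60 * 60 * 1000                 # alert when a job runs for over 2 hours
WEBHOOK_URL = "https://hooks.example.com/notify"  # placeholder (Slack/Teams/...)

client = dataiku.api_client()
project = client.get_project("MY_PROJECT")        # placeholder project key
now_ms = int(time.time() * 1000)

for job in project.list_jobs():
    # Field names below are assumptions; print one job dict to confirm them.
    if job.get("state") == "RUNNING":
        started = job.get("startTime", now_ms)
        if now_ms - started > THRESHOLD_MS:
            job_id = job.get("def", {}).get("id", "unknown")
            requests.post(WEBHOOK_URL, json={
                "text": f"Job {job_id} has been running for more than 2 hours"
            })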
-
DSS Release Compatibility
Hi, I'm wondering how Dataiku release compatibility is supported. For example, does Dataiku allow a project bundled on a release 11 DSS instance to be used on a release 12 DSS instance? Thanks
-
Parquet format table redetected as CSV
Hey, I've been running into an issue where, after creating a dataset stored in Parquet with a PySpark recipe, the dataset is redetected as CSV with a very different schema. Here's the dataset before pressing "redetect format": And after pressing "redetect format", it goes from 18 to 75 columns: And the new…
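While investigating, the format a dataset is currently configured with can also be inspected (and, if necessary, forced back) through the public API instead of the "redetect format" button. A hedged sketch: "formatType"/"formatParams" are the keys I would expect in the raw settings of a files-based dataset, but verify them on your instance before saving anything, and the project/dataset names are placeholders:

import dataiku

client = dataiku.api_client()
ds = client.get_project("MY_PROJECT").get_dataset("my_parquet_dataset")

settings = ds.get_settings()
raw = settings.get_raw()

# Inspect what DSS currently thinks the storage format is.
print(raw.get("formatType"), raw.get("formatParams"))

# If redetection switched it to CSV, set the format back explicitly
# (only after confirming the files on storage really are Parquet).
if raw.get("formatType") == "csv":
    raw["formatType"] = "parquet"
    raw["formatParams"] = {}
    settings.save()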
-
Append data without dropping the table with Python
Hello, I am building a simple web app with some input fields in DSS in order to insert a single row of data into a SQL table for which I have created a dataset in DSS. I have successfully appended the data using the write-dataframe function and a pandas dataframe, but this function drops the SQL table before…
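Since the write described above recreates the table, one workaround for a single-row insert from a web app is to talk to the database directly. A hedged sketch with SQLAlchemy and pandas' to_sql(if_exists="append"); the connection string, table name, and columns are placeholders, and this bypasses the Dataiku-managed dataset definition, so the table schema must already match:

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string: reuse the same credentials as the DSS connection.
engine = create_engine("postgresql://user:password@db-host:5432/mydb")

# The single row collected from the web app form (placeholder columns).
new_row = pd.DataFrame([{"customer_id": 123, "comment": "example"}])

# if_exists="append" inserts into the existing table without dropping it.
new_row.to_sql("my_table", engine, if_exists="append", index=False)

Depending on the setup, the recipe/dataset "Append instead of overwrite" option may also be worth a look, though it applies to recipe runs rather than ad-hoc writes from a web app.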
-
Multi-label classification
Hello community! Is it currently possible to perform multi-label classification using the built-in recipes, the Lab, or any other features in DSS?
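As far as I know, the visual ML covers single-label (two-class/multiclass) targets, so a common fallback is a Python recipe or notebook with scikit-learn's multi-label tooling. A small self-contained sketch on synthetic data; in DSS you would read the features and label lists from a dataset instead:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import MultiOutputClassifier
from sklearn.preprocessing import MultiLabelBinarizer

# Toy example: each sample can carry several labels at once.
X = np.array([[0.1, 1.2], [1.4, 0.3], [0.2, 0.9], [1.1, 1.0]])
labels = [["sports"], ["politics", "economy"], ["sports", "economy"], ["politics"]]

# Turn the label lists into a binary indicator matrix (one column per label).
mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)

# One binary classifier per label, wrapped into a single estimator.
model = MultiOutputClassifier(LogisticRegression())
model.fit(X, Y)

# Predictions come back as an indicator matrix; map them back to label names.
pred = model.predict(np.array([[0.2, 1.1]]))
print(mlb.inverse_transform(pred))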
-
Using a JAR-based Spark data source in a Dataiku recipe
Hi, I have a JAR file that I use as a Spark data source to read data from a URL. This is how I do it locally:
conf = SparkConf().setAll([("spark.jars", "path-to-jar/<jar_name>.jar"),])
df = spark.read.format("cloud.alpha.spark.providers.appobject.AppObjectTableProvider") \
    .option("appName", "<name>") \…
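In a DSS PySpark recipe, one reliable pattern is to put spark.jars (or spark.jars.packages) in the recipe's Spark configuration rather than building a new SparkConf in code, since the JAR must be on the classpath before the Spark session starts. A hedged sketch of what the recipe body could then look like; the provider class and option are copied from the snippet above, and the output dataset name is a placeholder:

import dataiku
import dataiku.spark as dkuspark
from pyspark.sql import SparkSession

# The session created in the recipe already carries its Spark configuration,
# including spark.jars pointing at path-to-jar/<jar_name>.jar.
spark = SparkSession.builder.getOrCreate()

# Read through the custom data source shipped in the JAR.
df = (spark.read
      .format("cloud.alpha.spark.providers.appobject.AppObjectTableProvider")
      .option("appName", "<name>")
      .load())

# Write the result to the recipe's output dataset (placeholder name).
output = dataiku.Dataset("jar_source_output")
dkuspark.write_with_schema(output, df)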