Hi Team, I have created a scenario to run the pipeline every day. In that scenario I have created 4 steps, as below. 1) Step 1: check whether the latest data is available in the table; if available, pick the date_time of the latest date. 2) Step 2: set that date_time as a project variable. 3) Step 3: filter the data based on date_time…
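The three steps described above can be sketched in plain Python against an in-memory stand-in table. In a real Dataiku scenario these would be a SQL/check step, a "Set project variables" step, and a recipe filter; all names and values below are hypothetical.

```python
# Hypothetical in-memory stand-in for the table queried in step 1.
rows = [
    {"id": 1, "date_time": "2024-01-05 10:00:00", "value": 10},
    {"id": 2, "date_time": "2024-01-07 09:30:00", "value": 20},
    {"id": 3, "date_time": "2024-01-07 09:30:00", "value": 30},
]

# Step 1: check whether any data is available and pick the latest date_time.
# (ISO-formatted timestamps compare correctly as strings.)
latest = max((r["date_time"] for r in rows), default=None)

# Step 2: store it the way a "Set project variables" scenario step would.
project_variables = {"date_time": latest}

# Step 3: filter the data on the stored date_time.
filtered = [r for r in rows if r["date_time"] == project_variables["date_time"]]
```

In the real scenario, step 3 would read the variable back with something like `dataiku.get_custom_variables()["date_time"]` inside the recipe rather than a local dict.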
Hi Team, I am able to fetch the Presto connection details in a Jupyter notebook. But when I try to open a database connection using the same Presto details, I get the error "DSSConnection has no attribute cursor". My code is as below: import dataiku; client = dataiku.api_client(); connection_handle = client.get_connection('my_connection'); import…
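For context, that AttributeError is expected: in the Dataiku public API, `client.get_connection(...)` returns a handle on the connection's *settings*, not a live DB-API connection, so it has no `cursor()` method. A minimal stand-in class (all names and parameter values hypothetical, not the real Dataiku implementation) illustrates the distinction:

```python
class DSSConnectionStub:
    """Hypothetical stand-in for the handle returned by client.get_connection():
    it wraps connection *settings*, not an open database session."""

    def __init__(self, name, params):
        self.name = name
        self._params = params

    def get_definition(self):
        # Expose the stored configuration as a plain dict.
        return {"name": self.name, "params": self._params}


handle = DSSConnectionStub("my_connection",
                           {"host": "presto.example.com", "port": 8080})

# No cursor() here -- this is why calling handle.cursor() raises AttributeError.
print(hasattr(handle, "cursor"))

# What the handle *can* do is expose its settings, which you could then feed
# to an actual Presto client library to open a real connection.
params = handle.get_definition()["params"]
print(params["host"])
```

In a notebook, the usual Dataiku way to run SQL against a configured connection is `dataiku.SQLExecutor2(connection="...")` with `query_to_df(...)`, rather than a raw cursor on the settings handle.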
Hi dear support, My name is Hossein and I have been using Dataiku for a while. I just created a small database in Snowflake and wanted to read it from my Dataiku project. My Snowflake instance was launched and activated through the Dataiku partner program. When I go to import a dataset, the Snowflake option is not active; the link is disabled. Would you…
I am trying to install DSS on my desktop but am running into an error. Operating system used: Windows 10
Hi Team, As of now, a Presto connection is established in Dataiku to access the database. But how can we retrieve those connection details into Jupyter notebooks as variables, to use them for making other connections? Thanks in advance.
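One possible pattern, assuming the connection definition can be read as a dict (the exact dict shape and all values below are hypothetical; in Dataiku, reading connection parameters typically requires admin rights), is to pull the fields you need into plain variables and assemble a connection URL from them:

```python
# Hypothetical connection parameters, shaped like what a connection-definition
# API call might return.
presto_params = {
    "host": "presto.example.com",
    "port": 8080,
    "catalog": "hive",
    "schema": "default",
}

# Expose the details as ordinary notebook variables for reuse.
PRESTO_HOST = presto_params["host"]
PRESTO_PORT = presto_params["port"]

# Assemble a JDBC-style URL that another client library could consume.
presto_url = "jdbc:presto://{host}:{port}/{catalog}/{schema}".format(**presto_params)
print(presto_url)
```

Credentials are deliberately left out of the sketch: passwords are generally not readable through such APIs and should come from a secrets mechanism, not be hard-coded in a notebook.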
Hi, I am getting an error while running a Python recipe. The error I am getting is: "<class 'Exception'>: An error occurred during dataset write (tf7QiKhbhH): SQLException: [Teradata Database] [TeraJDBC 16.20.00.12] [Error 2644] [SQLState HY000] No more room in database". Can anyone help me out here ASAP?
How to update a flat file in Dataiku which already exists
I just tried writing data to Snowflake with the automatic fast-write, using an S3 connection that uses "Credentials STS with AssumeRole", and the job build logs contain the AWS credentials in plain text, like this: { "context": { "type": "JOB_ACTIVITY", "authIdentifier": "xxxxxx", "projectKey": "TESTSNOWFLAKEWRITE",…
Hi Everyone, I am trying to learn about computer vision on Dataiku (v12) by following the "Leaf Image Classification" tutorial. Unfortunately, I am not able to train my model because I get errors during training: "Subprocess failed to connect, it probably crashed at startup. Check the logs., caused by:…
Hello, If we externalize our runtime database with PostgreSQL and use PostgreSQL replication to ensure we have backups of each instance, is there still a need to back up the Dataiku DATADIR for disaster recovery purposes?