How can I do experiment tracking for a model based on R code/recipes? The documentation here: https://doc.dataiku.com/dss/latest/mlops/experiment-tracking/tracking.html only describes this scenario for Python code. My DSS version is 13. Operating system used: macOS
I get the following error when I run an if-then-else step on Spark. It runs fine on the local engine, and similar if-then-else statements run fine in other sections of my flow in DSS. How should I interpret this, and how do I debug it? Job failed: Task not serializable, caused by:…
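On Spark, "Task not serializable" usually means the function shipped to the executors captured something that cannot be serialized (a database connection, an open file handle, a lock, a client object). A quick local way to narrow down the culprit is to probe the captured state with the stdlib pickle module; a minimal sketch, where `is_picklable` and the sample state are illustrative, not part of any DSS or Spark API:

```python
import pickle
import threading

def is_picklable(obj):
    """Return True if `obj` survives pickle.dumps, else False."""
    try:
        pickle.dumps(obj)
        return True
    except Exception:
        return False

# Spark serializes the state captured by a closure before sending it to
# the executors. A thread lock stands in here for typical non-serializable
# state such as a DB connection or an open file handle.
bad_state = {"offset": 1, "conn": threading.Lock()}
good_state = {"offset": 1}

print(is_picklable(bad_state))   # False: locks/connections cannot be pickled
print(is_picklable(good_state))  # True: plain data serializes fine
```

To debug, read the full stack trace for the first class it reports as non-serializable; that is usually the captured object. The usual fix is to create such resources inside the executor-side function rather than capturing them from the driver.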
In containerized execution I can prepare the base container image and push it to a registry, so that containerized execution pods pull the image directly from my container registry. For the CDE image, however, I can build it using dssadmin build-base-image --type cde but I cannot push it to a registry. I found a command that does…
Hello, I'm trying to run a background task when my endpoint is called. Here is a simple example of what I am trying to do:

import asyncio

async def run_task_background(parameters):
    job_uid = await run_task(parameters)
    # add job_uid in a table

def api_py_function():
    asyncio.run(run_task_background(parameters))
    return…
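Worth noting: asyncio.run() blocks until the coroutine finishes, so the pattern above runs the "background" task synchronously before the endpoint returns. A minimal sketch of one workaround, assuming a plain-Python endpoint where a daemon thread is acceptable (`run_task`, the result list, and all names here are stand-ins, not the poster's actual code or a DSS API):

```python
import asyncio
import threading

async def run_task(parameters):
    # Stand-in for the long-running job; just echoes an id.
    await asyncio.sleep(0.01)
    return f"job-{parameters['name']}"

results = []   # stand-in for "add job_uid in a table"
_threads = []  # kept only so the demo can wait on the worker

def _runner(parameters):
    # Each worker thread gets its own event loop via asyncio.run().
    job_uid = asyncio.run(run_task(parameters))
    results.append(job_uid)

def api_py_function(parameters):
    # Fire and forget: the endpoint returns before the task finishes.
    t = threading.Thread(target=_runner, args=(parameters,), daemon=True)
    _threads.append(t)
    t.start()
    return {"status": "accepted"}

resp = api_py_function({"name": "demo"})
# A real endpoint would return here; the demo waits so it can observe the result.
_threads[-1].join(timeout=2)
print(resp, results)
```

The design trade-off: a daemon thread dies with the process, so for work that must survive the request, a queue or a DSS scenario trigger is safer than in-process threading.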
Hi all, I am new to the Dataiku world. I'd like to ask the right way to output data for a specific time range using partitioning. What I want to do is dynamically build the most recent 3 days of data from the input datasets (using time-range day partitioning). I've tested this, but the output seems to grow even more than I expect, and the…
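For context on that setup: a "build the last 3 days" job usually boils down to computing the recent day-partition identifiers and requesting exactly those partitions, so older partitions are left untouched rather than accumulated. A stdlib sketch of that computation, where the function name is invented and the yyyy-MM-dd identifier format is an assumption matching DSS's day-partitioning convention:

```python
from datetime import date, timedelta

def last_n_day_partitions(n, today=None):
    """Return the day-partition identifiers (yyyy-MM-dd, newest first)
    for the most recent `n` days, e.g. to feed a partition spec in a
    scenario build step."""
    today = today or date.today()
    return [(today - timedelta(days=i)).isoformat() for i in range(n)]

# With a fixed date so the output is reproducible:
print(last_n_day_partitions(3, today=date(2024, 5, 10)))
# ['2024-05-10', '2024-05-09', '2024-05-08']
```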
Hi everyone, I'm encountering an issue when trying to train a model in Dataiku DSS. The training fails with the following error: Training failed: Subprocess did not connect in 60000ms, it probably crashed at startup. Check the logs., caused by: SocketTimeoutException: Accept timed out. I am on a MacBook Air, macOS…
I've recently started to use the "Run integration test" scenario step for testing. It's definitely some work to create the test reference datasets, but once set up it's great to be able to run the test after later code changes to confirm the process still works as expected. Our flows mostly use SQL script recipes.…
Are there any actually useful code examples of using custom prediction in Python? I have a model that exists in my Flow, and I want to use that model to make a prediction just like the prediction model API endpoint would, and then add more custom code on top of that. The boilerplate code imports dataiku and…
While I am trying to add the values of some columns into a resultant column, I am getting the value NaN. Operating system used: Windows
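A NaN in a computed sum almost always means one of the input cells was missing or failed to parse as a number, since NaN propagates through arithmetic. A minimal stdlib sketch of the effect and the usual coalesce-to-zero fix (the columns and values are made up for illustration):

```python
import math

# NaN propagates: if any operand of an addition is NaN (e.g. an empty
# cell parsed as missing), the result is NaN too.
a = [1.0, 2.0, float("nan")]
b = [10.0, float("nan"), 30.0]

raw = [x + y for x, y in zip(a, b)]  # 2nd and 3rd sums come out as nan

# A common fix: coalesce missing values to 0 before adding.
def nz(x):
    return 0.0 if math.isnan(x) else x

fixed = [nz(x) + nz(y) for x, y in zip(a, b)]
print(fixed)  # [11.0, 2.0, 30.0]
```

In a Prepare recipe the equivalent fix is to fill or clear empty cells (and check the column's storage type) before the arithmetic step.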
Hello, I've encountered an unusual error while using the Group By recipe in Dataiku. Here's a summary of the issue. Context: I created a Group By recipe on three columns, applying three custom SQL aggregations. Input data: the recipe takes as input a PostgreSQL (PGSQL) table, which is the output of a JOIN operation…
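As an aside on the pattern itself: grouping on several columns with custom SQL aggregations is pushed down to the database as a single GROUP BY query. A self-contained illustration using sqlite3 for reproducibility (the table, columns, and aggregates are invented for the sketch, not the poster's schema):

```python
import sqlite3

# In-memory database standing in for the PostgreSQL input.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?)",
    [("EU", "a", 2, 10.0), ("EU", "a", 3, 10.0), ("US", "b", 1, 5.0)],
)

# Group on two columns with custom aggregation expressions.
rows = conn.execute("""
    SELECT region, product,
           SUM(qty)         AS total_qty,
           SUM(qty * price) AS revenue,
           COUNT(*)         AS n_rows
    FROM sales
    GROUP BY region, product
    ORDER BY region, product
""").fetchall()
print(rows)  # [('EU', 'a', 5, 50.0, 2), ('US', 'b', 1, 5.0, 1)]
```

When a Group By recipe fails on a database engine, running the generated SQL (visible in the job logs) directly against the database in this way is often the fastest route to the actual error.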