-
Show API deployment comments on "Last updates" tab
When you deploy an API or API version from the API Designer to the Local Deployer, you can enter a comment for this version. I would love to see this comment field represented on the new "Last updates" tab on the deployer for a service, in order to have a quick overview of why each update to the service was made or what…
-
Use pickle to create API service
I have a Python code recipe that already creates the pickle model and saves it to filesytem_folder, and it is already in a scenario for a weekly pickle update. Then I want to create an API that uses this pickle to classify. I use Custom prediction (Python) as the Endpoint type and set the Working folder (optional) to the…
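A minimal sketch of what such an endpoint can look like, assuming the ClassificationPredictor base class that DSS exposes for custom Python prediction endpoints, and a hypothetical model.pkl file name inside the working folder (use whatever file name your recipe actually writes):

```python
import os
import pickle

import pandas as pd
from dataiku.apinode.predict.predictor import ClassificationPredictor


class MyPredictor(ClassificationPredictor):
    """Custom prediction endpoint backed by a pre-trained pickled model."""

    def __init__(self, data_folder=None):
        # data_folder is the local path of the endpoint's working folder.
        # "model.pkl" is a hypothetical file name used only for illustration.
        with open(os.path.join(data_folder, "model.pkl"), "rb") as f:
            self.model = pickle.load(f)

    def predict(self, features_df):
        # Return (decisions, probabilities) for the incoming feature rows.
        decisions = pd.Series(self.model.predict(features_df))
        probas = pd.DataFrame(self.model.predict_proba(features_df))
        return (decisions, probas)
```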
-
Default project key is not specified (no DKU_CURRENT_PROJECT_KEY in env)
I'm creating a Python function endpoint with this script: And I don't know how to deal with this error: Dev server deployment FAILED Failed to initiate function server : <class 'Exception'> : Default project key is not specified (no DKU_CURRENT_PROJECT_KEY in env)
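One common workaround, sketched here on the assumption that the code only fails because the dataiku package cannot infer a project context: set the project key explicitly before it is needed, or fully qualify objects with the project key. The project key and dataset name below are placeholders, and whether dataset reads are available from the endpoint depends on your setup:

```python
import os

# Hypothetical project key: the error means the dataiku package could not
# find one in the environment, so set it explicitly before importing.
os.environ["DKU_CURRENT_PROJECT_KEY"] = "MY_PROJECT_KEY"

import dataiku


def api_py_function(some_param):
    # Alternatively, fully qualify the dataset with its project key so the
    # implicit project context is never needed.
    ds = dataiku.Dataset("MY_PROJECT_KEY.my_dataset")
    df = ds.get_dataframe()
    return int(len(df))
```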
-
Automated alerts from Unified Monitoring on bundle or API endpoint failure
We find the Unified Monitoring (UM) feature extremely useful, as it allows us to see the health of our bundle and real-time prediction APIs. However, there is no way to be alerted if a deployment fails or if an API endpoint is down. We currently have some Python scripts that scrape the data from UM and then identify any…
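Until something native exists, one lightweight pattern is to probe each endpoint on a schedule and push an alert on failure. A rough sketch, in which the endpoint URL, test payload and alerting webhook are all placeholder assumptions:

```python
import requests

# Placeholder values: replace with your API node URL, service/endpoint ids,
# a representative test payload, and your alerting webhook.
ENDPOINT_URL = "https://apinode.example.com/public/api/v1/my_service/my_endpoint/predict"
TEST_PAYLOAD = {"features": {"feature_a": 1.0, "feature_b": "x"}}
ALERT_WEBHOOK = "https://hooks.example.com/alerts"


def check_endpoint():
    try:
        resp = requests.post(ENDPOINT_URL, json=TEST_PAYLOAD, timeout=10)
        healthy = resp.status_code == 200
    except requests.RequestException:
        healthy = False
    if not healthy:
        # Minimal alert; swap in email, Slack, PagerDuty, etc. as needed.
        requests.post(
            ALERT_WEBHOOK,
            json={"text": f"Endpoint check failed for {ENDPOINT_URL}"},
            timeout=10,
        )


if __name__ == "__main__":
    check_endpoint()
```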
-
Add Seldon to deployment options
One of the deployment options in our company is Seldon (Seldon, MLOps for the Enterprise). It would be great if Dataiku had the option to deploy directly to Seldon, the way deployment to Kubernetes (K8s), AWS, Databricks or Azure is possible today. Seldon generally deploys MLflow artefacts.
-
Put stuff in the API logging without sending in the response
We often run into situations where we'd like to log information from our internal API workings - such as intermediate results for checking - without having to send it out in the response. It would be wonderful if there was an option to write things to the API log without them having to be part of either the request or the response.
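In the meantime, anything emitted with the standard logging module (or printed to stderr) from endpoint code typically lands in the service's backend log rather than in the HTTP response. A minimal sketch for a Python function endpoint, with the logger name and doubled value standing in for real internals:

```python
import logging

# Messages written here typically end up in the endpoint's backend log,
# not in the response returned to the caller.
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("my_endpoint")  # hypothetical logger name


def api_py_function(value):
    intermediate = value * 2  # stand-in for a real intermediate result
    logger.info("intermediate result for %r: %r", value, intermediate)
    return {"result": intermediate + 1}
```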
-
HTTP status codes returned by DSS API node/service
Two questions: 1. Will an exception, raised during execution of API service code, always result in a response with a 500 HTTP status code? 2. Aside from that, errors can occur outside of API service code execution (e.g. service is unavailable). Is there an overview available of which situations map to which HTTP status…
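While waiting for an authoritative mapping, client code can at least branch defensively on the status code. A sketch with a placeholder endpoint URL; the code-to-situation mapping below is handled generically rather than asserted:

```python
import requests

URL = "https://apinode.example.com/public/api/v1/my_service/my_endpoint/predict"  # placeholder

resp = requests.post(URL, json={"features": {"x": 1}}, timeout=10)

if resp.status_code == 200:
    result = resp.json()
elif 500 <= resp.status_code < 600:
    # Server-side failure, e.g. an exception in the endpoint code or an
    # unavailable service; the body usually carries an error message.
    raise RuntimeError(f"Server error {resp.status_code}: {resp.text}")
elif 400 <= resp.status_code < 500:
    # Client-side problem such as a malformed payload or unknown endpoint.
    raise ValueError(f"Client error {resp.status_code}: {resp.text}")
```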
-
What is the difference between the dataiku and the dataikuapi packages?
From what I could see, both seem to have the same functionality when I used them outside the Dataiku web interface. I see documentation for both packages, but they seem to conflict. For example, in this page section, in the illustrative code, we are importing dataiku, but just below it, it is written that a…
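In short, dataiku is the "inside DSS" package that relies on implicit context (the current project, datasets, folders), while dataikuapi is a plain REST client that always needs an explicit host and API key. A small sketch of the two side by side, with the host, API key, project key and dataset name as placeholders:

```python
# --- dataiku: runs inside DSS (recipes, notebooks, scenarios) with implicit context
import dataiku

df = dataiku.Dataset("my_dataset").get_dataframe()  # resolved against the current project

# --- dataikuapi: explicit REST client, usable from anywhere
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
project = client.get_project("MY_PROJECT_KEY")
dataset = project.get_dataset("my_dataset")
print(dataset.get_settings().get_raw()["type"])

# The dataiku package can also be pointed at a remote DSS from outside:
# dataiku.set_remote_dss("https://dss.example.com:11200", "YOUR_API_KEY")
```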
-
Dataiku Spark Remote Connection
Hello, I am using Dataiku 12.5.2 and currently running Spark 2.4. Dataiku is installed on a server named A, while Spark is installed on a server named B, configured as a standalone Spark installation without Hadoop. Both server A and server B are capable of TCP communication and allow SSH access. How can I use the Spark on…
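Leaving DSS-specific configuration aside, connecting to a standalone Spark master on another host is a matter of pointing the session at spark://<host>:<port>; in DSS, the same spark.* properties would go into a named Spark configuration under Administration. A plain PySpark sketch, with the master host name as a placeholder:

```python
from pyspark.sql import SparkSession

# "B" is the placeholder host name of the standalone Spark master;
# 7077 is the default standalone master port.
spark = (
    SparkSession.builder
    .master("spark://B:7077")
    .appName("remote-standalone-test")
    .getOrCreate()
)

print(spark.range(10).count())  # quick sanity check against the remote cluster
```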
-
remapping connections for API services
Good day! In the API Designer, we can define connections to use with SQL Query Endpoints. How do we remap these connections based on deployments to different API nodes? (i.e. use a different connection for deployments to a production API node vs. deployments to an acceptance API node) I don't see any option in the deployer UI…
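One place to look programmatically is the deployment's settings object on the API Deployer's public API. The sketch below only shows how to inspect and save those settings; it does not assume a specific key for connection remapping, and the Deployer URL, API key and deployment id are placeholders:

```python
import dataikuapi

client = dataikuapi.DSSClient("https://deployer.example.com:12000", "YOUR_API_KEY")
deployer = client.get_apideployer()

deployment = deployer.get_deployment("my-service-on-prod")  # placeholder id
settings = deployment.get_settings()

# Inspect the raw settings to see what the deployment actually carries;
# whether a connection remapping key exists there is not asserted here.
raw = settings.get_raw()
print(sorted(raw.keys()))

# After editing raw in place, persist the change:
# settings.save()
```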