-
How to retrieve the test dataset used by a trained model with Python?
Hello everyone, I am working in Dataiku, primarily through its API. I have trained my model and would like to retrieve the dataset that was used for testing via the API methods. Despite trying several methods, including get_train_info(), I am unable to obtain the test dataset. I don't want to export it; I just want to…
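A minimal sketch of how far the public API reaches here, assuming a visual ML task; the host, key, PROJECT_KEY, ANALYSIS_ID and MLTASK_ID are placeholders. get_train_info() returns metadata about the train/test split rather than the test rows themselves, so the split parameters in the ML task settings are the closest handle the API offers:

    import dataikuapi

    client = dataikuapi.DSSClient("https://DSS_HOST:PORT", "API_KEY")  # placeholders
    mltask = client.get_project("PROJECT_KEY").get_ml_task("ANALYSIS_ID", "MLTASK_ID")

    # Pick one trained model and inspect how its train/test split was built
    model_id = mltask.get_trained_models_ids()[0]
    details = mltask.get_trained_model_details(model_id)
    print(details.get_train_info())  # split metadata, not the test rows

    # The split parameters live in the raw ML task settings
    # (the exact key may vary across DSS versions)
    print(mltask.get_settings().get_raw().get("splitParams"))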
-
Use a pickle to create an API service
I have a Python code recipe that already creates the pickled model and saves it to filesytem_folder, and this is already in a scenario for the weekly pickle update. I then want to create an API that uses this pickle to classify. I use Custom prediction (Python) as the Endpoint type and set the Working folder (optional) to the…
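A minimal sketch of a Custom prediction (Python) endpoint that loads a pickle from the endpoint's working folder; model.pkl is a placeholder file name, and depending on the DSS version predict() may also return per-class probabilities:

    import os
    import pickle
    import pandas as pd
    from dataiku.apinode.predict.predictor import ClassificationPredictor

    class MyPredictor(ClassificationPredictor):
        def __init__(self, data_folder):
            # data_folder is the local path of the "Working folder (optional)"
            # configured on the endpoint
            with open(os.path.join(data_folder, "model.pkl"), "rb") as f:
                self.clf = pickle.load(f)

        def predict(self, features_df):
            # One decision per input row
            decisions = pd.Series(self.clf.predict(features_df))
            probas = None  # optionally a DataFrame of per-class probabilities
            return decisions, probas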
-
How can I access Dataiku Inbox using Dataiku Python API
I want to access the Dataiku Inbox to see who raised access requests, using the Dataiku Python API. Can you help me with how to do that? I want to approve requests in an automated way using Python.
-
ModuleNotFoundError When Accessing Knowledge Bank via REST API SDK
Hi everyone, I'm running into an issue while trying to access a Dataiku Knowledge Bank from an external Python script using the Dataiku REST API client. When I execute my script, I get the following error: ModuleNotFoundError: No module named 'dataiku.core.vector_stores.lifecycle'. Here are my environment details: Dataiku…
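A ModuleNotFoundError on an internal module like this often points to a version mismatch between the locally installed dataiku package and the DSS server. A sketch of the usual remedy (host and key are placeholders; the tarball URL follows the pattern documented for using the dataiku package outside DSS):

    # Reinstall the internal client from the same DSS instance you target:
    #   pip install --upgrade "https://DSS_HOST:PORT/public/packages/dataiku-internal-client.tar.gz"
    import dataiku

    # Point the dataiku package at the remote DSS instance
    dataiku.set_remote_dss("https://DSS_HOST:PORT", "YOUR_API_KEY")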
-
Calling APIs and licensing requirements
Hi all, we are building an API that could be called by many people running local apps and processes. I understand API keys are the way to go. The issue I'm having is that my security team wants separate keys for every user and process, plus a process for managing those keys. Another problem is that there is no guarantee that a…
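If the DSS version in play exposes personal API keys through dataikuapi (an assumption; check the API reference of your version for the exact calls), per-user key issuance could be scripted along these lines:

    import dataikuapi

    client = dataikuapi.DSSClient("https://DSS_HOST:PORT", "ADMIN_API_KEY")  # placeholders

    # Assumed call: mint one labelled key per user/process so the security
    # team can audit and revoke keys individually
    key = client.create_personal_api_key(label="app-x-user-jdoe")
    print(key)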
-
Refresh partitions in DSS via the API
Hi, we have used the Python API to add a new dataset to the project, pointing it at an existing HDFS location where partition folders are stored. (This location is managed by another DSS instance.) This kind of "import" of a read-only dataset works, but I did not find a way to "refresh" the list of partitions, i.e.…
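A minimal sketch, with my_hdfs_dataset as a placeholder name: listing partitions through the public API returns the partitions DSS detects for the dataset, which for a files-based dataset should reflect the partition folders currently present on HDFS:

    import dataikuapi

    client = dataikuapi.DSSClient("https://DSS_HOST:PORT", "API_KEY")  # placeholders
    dataset = client.get_project("PROJECT_KEY").get_dataset("my_hdfs_dataset")

    # Enumerate the dataset's partitions from the underlying storage
    print(dataset.list_partitions())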
-
Error while deploying to the Automation node
I am trying to deploy my project to the Automation node in my dev1/testing environment, but I am getting this error: I am not using any API, only a scenario with Python code. This is the code I am using: client = dataiku.api_client() project_key = dataiku.default_project_key() project = client.get_project(project_key) An error occurred on…
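For reference, the inline snippet reformatted as it would sit in a scenario Python step; both calls resolve the client and the current project from the environment DSS provides to the step, and fail when that environment is missing:

    import dataiku

    # Resolve the client and current project from the DSS-provided environment
    client = dataiku.api_client()
    project_key = dataiku.default_project_key()
    project = client.get_project(project_key)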
-
Default project key is not specified (no DKU_CURRENT_PROJECT_KEY in env)
I'm creating a Python function endpoint with this script: And I don't know how to deal with this error: Dev server deployment FAILED Failed to initiate function server : <class 'Exception'> : Default project key is not specified (no DKU_CURRENT_PROJECT_KEY in env)
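One common workaround (an assumption about this setup; PROJECT_KEY is a placeholder) is to pin the project explicitly at the top of the endpoint code, since API-node code does not run inside any project and therefore has no DKU_CURRENT_PROJECT_KEY:

    import dataiku

    # No DKU_CURRENT_PROJECT_KEY exists on the API node, so declare it explicitly
    dataiku.set_default_project_key("PROJECT_KEY")

    def api_py_function(param):
        # ... endpoint logic using project-scoped objects goes here ...
        return param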
-
How do I get created and modified dates for an artifact in the Dataiku Govern API?
I can see this information in the timeline in the UI, but it doesn't seem to be in the data returned by: ar = client.get_artifact('ar.26421') print(ar.get_definition().get_raw()) Operating system used: Windows
-
Python API for Fleet Manager
Hi all, I am trying to connect to the FMClientAWS using the documentation here: But I am not sure what to use as key_id and key_secret. I could not find anything related to this in the FM UI. Can anyone help me with how to get these two values for my FM? Thanks! Operating system used: linux
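For reference, connection code shaped like the documented pattern; key_id and key_secret are the two halves of a Fleet Manager API key, and the exact import path may differ across dataikuapi versions, so treat this as a sketch to verify against the FM API docs:

    import dataikuapi

    key_id = "YOUR_FM_KEY_ID"          # placeholder
    key_secret = "YOUR_FM_KEY_SECRET"  # placeholder

    # Connect to an AWS-hosted Fleet Manager with an FM API key pair
    client = dataikuapi.FMClientAWS("https://FM_HOST", key_id, key_secret)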