Interacting with non-deployed custom model

Solved!
fmonari
Level 2

Hi, 

I would like to load a custom model created within an analysis (not yet deployed) into a notebook in order to run a series of ad-hoc evaluations.

Is this possible, or do I need to deploy the model first?

Thanks!

Filippo

1 Solution
AlexandreL
Dataiker

Hi,

It is not possible to call the model directly from the analysis. However, you can use Python code to deploy it temporarily and clean up the deployment afterwards. The following code might help you: it temporarily deploys and then deletes each trained model in an ML task:

import dataiku

client = dataiku.api_client()
project = client.get_project(PROJECT_KEY)
analysis = project.get_analysis(ANALYSIS_ID)
ml_task = analysis.get_ml_task(ML_TASK_ID)
models_ids = ml_task.get_trained_models_ids()

for model_id in models_ids:
    # Temporarily deploy the trained model to the Flow as a saved model
    deployed_model_id, _ = ml_task.deploy_to_flow(model_id,
                                'temp_model', 'train_dataset_name').values()
    deployed_model = dataiku.Model(deployed_model_id, ignore_flow=True)
    predictor = deployed_model.get_predictor()

    # ... run your ad-hoc evaluations with the predictor here ...

    # Clean up: delete the temporary saved model from the Flow
    model = project.get_saved_model(deployed_model_id)
    model.delete()

Replace PROJECT_KEY, ANALYSIS_ID, ML_TASK_ID, and 'train_dataset_name' with the IDs of your project, your analysis, and your ML task, and the name of your train dataset, respectively.
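
For the ad-hoc evaluations mentioned in the question, the predictor can be used to score a pandas DataFrame inside the loop, between get_predictor() and the delete() call. A minimal sketch, assuming a dataset named eval_dataset_name in the same project (the name is a placeholder):

    # Hypothetical evaluation data: any dataset with the same feature columns as the train dataset
    eval_df = dataiku.Dataset("eval_dataset_name").get_dataframe()

    # Score the evaluation data with the temporarily deployed model
    predictions = predictor.predict(eval_df)
    print(predictions.head())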

Hoping this meets your needs,

Alex
