Hyper-parameter tuning and selection of winner model on the go

mansit_suman
Level 2

How can we design a flow where hyper-parameter tuning happens within a single model (let's say random forest) and training also happens across different algorithms? The winning algorithm, together with its tuned hyper-parameters, would then be selected and used in the prediction service.

pmasiphelps
Dataiker

Hi,

 

You can use the Python APIs to:

1. Retrieve the existing visual analysis and ML task behind a saved model that you've configured and deployed to the flow manually (https://doc.dataiku.com/dss/latest/python-api/ml.html#obtaining-a-handle-to-an-existing-ml-task), as in the sketch just below this list.
2. Edit the settings of this ML task object to enable certain algorithms and hyperparameter combinations.
3. Run the multi-model retraining / hyperparameter tuning.
4. Retrieve the best-performing model and redeploy it to the flow over your existing saved model.
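For step 1, here is a rough sketch, assuming you're running inside DSS (e.g. in a project notebook or scenario step); the analysis and ML task IDs below are placeholders, and project.list_ml_tasks() will show you which ones exist in your project:

import dataiku

# Get an API client from inside DSS (no host/API key needed in a notebook or scenario)
client = dataiku.api_client()
project = client.get_project(dataiku.default_project_key())

# List the ML tasks in the project to find the analysis ID / ML task ID you need
print(project.list_ml_tasks())

# Handles to the visual analysis and the ML task behind your deployed saved model
analysis = project.get_analysis("ANALYSIS_ID")   # placeholder
mltask = analysis.get_ml_task("ML_TASK_ID")      # placeholder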

Example code covering steps 2-4 can generally be found here: https://doc.dataiku.com/dss/latest/python-api/ml.html#the-whole-cycle
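Adapting that example to your case, steps 2-4 might look roughly like the sketch below. It assumes a classification task, uses AUC as the selection metric, and "SAVED_MODEL_ID" is a placeholder for your existing saved model's ID; the exact hyperparameter keys returned by get_algorithm_settings() depend on the algorithm and your DSS version, so inspect them before editing.

# 2) Enable the algorithms you want to compare in the ML task settings
settings = mltask.get_settings()
settings.set_algorithm_enabled("RANDOM_FOREST_CLASSIFICATION", True)
settings.set_algorithm_enabled("GBT_CLASSIFICATION", True)
settings.set_algorithm_enabled("LOGISTIC_REGRESSION", True)

# Hyperparameter grids live in the per-algorithm settings; print them first to
# see which keys your DSS version exposes, then edit and save
rf_settings = settings.get_algorithm_settings("RANDOM_FOREST_CLASSIFICATION")
print(rf_settings)
settings.save()

# 3) Train all enabled algorithms (the hyperparameter search runs per algorithm)
mltask.start_train()
mltask.wait_train_complete()

# 4) Pick the best trained model by a metric (AUC here) ...
best_id = None
best_auc = -1.0
for model_id in mltask.get_trained_models_ids():
    details = mltask.get_trained_model_details(model_id)
    auc = details.get_performance_metrics()["auc"]
    if auc > best_auc:
        best_id, best_auc = model_id, auc

# ... and redeploy the winner over the existing saved model in the flow
mltask.redeploy_to_flow(best_id, saved_model_id="SAVED_MODEL_ID")  # placeholder saved model ID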

 

Best,

Pat
