Model Performance Alerting

It would be nice if we could trigger automatic alerts from a model evaluation store with a simple click, directly in the evaluation store. Today you have to write a bunch of custom Python scripting to create a check, then schedule a scenario that reports it back out (a rough sketch of that scripting is below the list).

It should be as simple as this:

1. Create Evaluation Store

2. Select a metric in the evaluation store and pick a threshold (e.g. AUC < 0.8), then choose an action: alert, retrain, etc.

3. Pick the method of delivering the alert (Slack, email, etc.)
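For reference, here is roughly what that custom scripting looks like today in a scenario's Python step. The dataikuapi method names below are my best guess and should be checked against the API reference for your DSS version; "MES_ID" is a placeholder:

```python
# Hypothetical scenario step: read the latest evaluation from a
# Model Evaluation Store and fail the scenario when AUC drops below
# a threshold. Method names are assumptions -- verify them against
# the dataikuapi reference for your DSS version.
import dataiku

client = dataiku.api_client()
project = client.get_default_project()

mes = project.get_model_evaluation_store("MES_ID")  # placeholder store id
evaluations = mes.list_model_evaluations()
latest = evaluations[0]                              # assumed: newest first
metrics = latest.get_full_info().metrics             # assumed attribute

auc = metrics.get("auc")
if auc is not None and auc < 0.8:
    # Failing the step lets a scenario reporter (Slack, email, ...)
    # deliver the alert.
    raise Exception("AUC %.3f dropped below the 0.8 threshold" % auc)
```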

3 Comments
fsergot
Dataiker

Hello,

The logic for automating actions after model monitoring is based on the metrics & checks computed on the Model Evaluation Store. Those can then be used in Scenarios to build any logic (as explained in https://doc.dataiku.com/dss/latest/mlops/model-evaluations/automating.html). I am not sure in which cases you would need to write Python?

And from the current logic, this is very close to what you describe in your bullet list; the only difference is that the actions in points 2 and 3 are done in a scenario, not on the Model Evaluation Store.
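If you do want to drive that same check logic from code rather than from the scenario UI, here is a minimal sketch of a custom step, assuming the Model Evaluation Store exposes the same run_checks() helper that datasets do (the method name and result structure are assumptions to verify):

```python
# Hypothetical custom scenario step: run the checks defined on a
# Model Evaluation Store and fail the scenario if any check errors.
# run_checks() here mirrors the dataset API; verify it exists on
# DSSModelEvaluationStore in your DSS version.
import dataiku

client = dataiku.api_client()
project = client.get_default_project()
mes = project.get_model_evaluation_store("MES_ID")  # placeholder id

result = mes.run_checks()
failed = [
    r for r in result.get("results", [])
    if r.get("value", {}).get("outcome") == "ERROR"  # assumed structure
]
if failed:
    raise Exception("%d check(s) failed on the evaluation store" % len(failed))
```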

Status changed to: Gathering Input


M1-Nick
Level 3

Yeah, pretty much. You have to go into the scenario and run all the steps.

In an ideal state, I'd hit something like "monitor this model", set up some thresholds in the evaluation store, and it would alert me every time that evaluation store is built if any of the metrics drop below their thresholds.


NR
Level 3

Hello,

Is there any way to get model performance exported? I'm interested in both training session results and deployed model performance.

Here are the screenshots:

(Screenshot attached: Sans titre2.png)

Thanks
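One workaround sketch for the deployed-model side, assuming the public API exposes per-version details on saved models (method names should be verified against the dataikuapi reference; ids are placeholders):

```python
# Hypothetical export of per-version performance for a saved model
# via the public API. Verify method names against the dataikuapi
# reference for your DSS version.
import dataiku

client = dataiku.api_client()
project = client.get_default_project()

sm = project.get_saved_model("SAVED_MODEL_ID")   # placeholder id
rows = []
for version in sm.list_versions():               # one row per kept training session
    details = sm.get_version_details(version["id"])
    perf = details.get_performance_metrics()     # assumed helper
    perf["version"] = version["id"]
    rows.append(perf)

# rows can now be written out as CSV, to a dataset, etc.
print(rows)
```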
