Applying SHAP on Isolation forest | Model Interpretability

adf057
Level 2

Hello,

In Dataiku, SHAP-based model interpretability (individual prediction explanations) is generated for classification and regression problems.


For Isolation Forest (anomaly detection), there is no model interpretability. Do I need to write my own SHAP tree explainer in Python?
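For reference, here is a minimal sketch of what I mean, assuming shap's TreeExplainer can consume a fitted scikit-learn IsolationForest directly (the data below is just a stand-in for my real features):

```python
import numpy as np
import shap
from sklearn.ensemble import IsolationForest

# Stand-in data for the real feature matrix
rng = np.random.RandomState(0)
X = rng.normal(size=(500, 5))

# Fit the anomaly detector
iso = IsolationForest(n_estimators=100, random_state=0).fit(X)

# Explain the anomaly scores: each SHAP value is a per-feature
# contribution to a point's score
explainer = shap.TreeExplainer(iso)
shap_values = explainer.shap_values(X)

# Global view of which features drive the anomaly scores
shap.summary_plot(shap_values, X)
```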


Initially I was thinking of further training a model on my data using the cluster labels, and since it's imbalanced, I would have to oversample the data to do so.

Any thoughts would be appreciated.

1 Reply
louisplt
Dataiker

Hello @adf057,

You are right: there is no way to compute explanations (Shapley values, in your case) for clustering models in Dataiku DSS.

Note that if you train a multi-class classification model on your data using the cluster labels as the target, DSS will activate class weights by default, which are meant to correct possible imbalances between classes, so oversampling should not be necessary. A rough equivalent outside the visual ML interface is sketched below.
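For illustration, here is a minimal sketch of that approach with scikit-learn, where class_weight="balanced" plays the role of DSS's class weights (the data and model choice are placeholders, not your actual setup):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Placeholder features and imbalanced cluster labels
rng = np.random.RandomState(0)
X = rng.normal(size=(500, 5))
labels = (rng.rand(500) < 0.05).astype(int)  # ~5% minority class

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, stratify=labels, random_state=0
)

# class_weight="balanced" reweights each class inversely to its
# frequency, correcting the imbalance without oversampling
clf = RandomForestClassifier(class_weight="balanced", random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```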

Hope this helps,

Louis Pouillot
