Applying SHAP on Isolation Forest | Model Interpretability

adf057

Hello,

In Dataiku, SHAP-based model interpretability is generated for classification and regression problems.


For Isolation Forest (anomaly detection) we do not have model interpretability. Should I write my own SHAP tree explainer in Python?


Initially I was thinking of training a classifier on the cluster labels, and since the labels are imbalanced, I would have to oversample the data to do so.

Any thoughts would be appreciated.

Answers

  • louisplt (Dataiker)

    Hello @adf057,

    You are right: there is no way to compute explanations (Shapley values, in your case) for clustering models in Dataiku DSS.
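
    If you do go down the "write my own explainer" route, the shap package can handle this directly: `shap.TreeExplainer` accepts a fitted scikit-learn `IsolationForest` and explains its anomaly score. A minimal sketch, with synthetic data standing in for your dataset:

    ```python
    # Minimal sketch: explaining an Isolation Forest with SHAP outside DSS.
    # Assumes the shap and scikit-learn packages; the data is synthetic.
    import numpy as np
    import shap
    from sklearn.ensemble import IsolationForest

    rng = np.random.RandomState(42)
    X = rng.normal(size=(500, 4))   # mostly "normal" points
    X[:10] += 6                     # inject a few obvious anomalies

    iso = IsolationForest(n_estimators=100, random_state=42).fit(X)

    # TreeExplainer accepts a fitted IsolationForest; the quantity being
    # explained is the model's anomaly score (lower = more anomalous).
    explainer = shap.TreeExplainer(iso)
    shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

    # Per-feature contributions for the first (anomalous) point
    print(shap_values[0])
    ```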

    Note that if you train a model on the cluster labels as a multi-class classification task, DSS will activate class weights by default, which are meant to correct possible imbalances between classes, so you should not need to oversample. A sketch of the same idea outside DSS follows below.
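
    For reference, here is a minimal scikit-learn sketch of that workaround (the synthetic data and the choice of `RandomForestClassifier` are illustrative assumptions): `class_weight="balanced"` reweights each class inversely to its frequency, similar in spirit to the class weights DSS activates by default.

    ```python
    # Minimal sketch: train on the anomaly labels as a classification task,
    # letting class weights correct the imbalance instead of oversampling.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.RandomState(0)
    X = rng.normal(size=(500, 4))
    X[:10] += 6
    y = np.zeros(500, dtype=int)
    y[:10] = 1   # imbalanced labels, e.g. produced by an Isolation Forest

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, stratify=y, random_state=0
    )

    # "balanced" gives rare classes proportionally larger weights during
    # training, so explicit oversampling is often unnecessary.
    clf = RandomForestClassifier(
        n_estimators=200, class_weight="balanced", random_state=0
    ).fit(X_train, y_train)

    print(clf.score(X_test, y_test))
    ```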

    Hope this helps,

    Louis Pouillot
