Visualizing SHAP Values for Model Feature Importance

Code introduction


This function uses the SHAP library to visualize feature importance for a given dataset and fitted model, rendering a summary (beeswarm) plot of the per-feature SHAP value distributions.


Technology Stack : SHAP library, NumPy, scikit-learn, Matplotlib

Code Type : Function

Code Difficulty : Intermediate


import numpy as np
import shap
import matplotlib.pyplot as plt

def visualize_shap_values(X, y, model, feature_names=None):
    """
    Visualize the SHAP values for a given dataset and model.

    :param X: Input features as a NumPy array.
    :param y: Target variable as a NumPy array (not used by SHAP itself,
              but kept for a consistent interface).
    :param model: A fitted scikit-learn compatible model.
    :param feature_names: Optional list of feature names; a NumPy array
                          has no column labels, so pass them explicitly.
    """
    # Build an explainer around the model, using X as background data.
    explainer = shap.Explainer(model, X)
    shap_values = explainer(X)  # returns a shap.Explanation object

    # Summary (beeswarm) plot of the SHAP value distribution per feature.
    shap.summary_plot(shap_values.values, X, feature_names=feature_names)
    plt.show()