XGBoost Feature Importance Extraction

Code introduction


This function uses the XGBoost library to train a regression model and return its feature importances.


Technology Stack : XGBoost, NumPy

Code Type : Function

Code Difficulty : Intermediate


import xgboost as xgb

def random_xgb_feature_importance(X_train, y_train):
    """
    Train an XGBoost model and return its feature importances
    as a dict mapping feature name to importance score.
    """
    # Wrap the training data in a DMatrix
    dtrain = xgb.DMatrix(X_train, label=y_train)

    # Set the training parameters
    params = {
        'max_depth': 3,
        'eta': 0.1,
        'objective': 'reg:squarederror',
        'eval_metric': 'rmse'
    }

    # Train the model
    model = xgb.train(params, dtrain, num_boost_round=10)

    # xgb.train returns a Booster, which has no feature_importances_
    # attribute (that belongs to the sklearn wrapper XGBRegressor);
    # use get_score() to retrieve importances instead.
    importance = model.get_score(importance_type='gain')

    return importance
              