vip: Variable Importance Plots

Overview

vip is an R package for constructing variable importance plots (VIPs). VIPs are part of a larger framework referred to as interpretable machine learning (IML), which includes (but is not limited to) partial dependence plots (PDPs) and individual conditional expectation (ICE) curves. While PDPs and ICE curves (available in the R package pdp) help visualize feature effects, VIPs help visualize feature impact (either locally or globally). An in-progress, but comprehensive, overview of IML can be found at https://github.com/christophM/interpretable-ml-book.

Many supervised learning algorithms can naturally emit some measure of importance for the features used in the model, and these approaches are embedded in many different packages. The downside, however, is that each package uses a different function and interface, and it can be challenging (and distracting) to have to remember each one (e.g., remembering to use xgb.importance() for xgboost models but summary() for gbm models). With vip you get a single, consistent interface for computing variable importance for many types of supervised learning models across a number of packages. Additionally, vip offers a number of model-agnostic procedures for computing feature importance (see the next section), as well as an experimental function for quantifying the strength of potential interaction effects. For details and example usage, visit the vip package website.
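
For example, here is a minimal sketch of that interface, assuming the ranger package is installed (any supported model type works the same way):

# The same vi()/vip() calls work regardless of the underlying model
# (here, a ranger random forest fit to the built-in iris data)
library(vip)
library(ranger)

rfo <- ranger(Species ~ ., data = iris, importance = "impurity")
vi(rfo)   # tibble of variable importance scores
vip(rfo)  # corresponding variable importance plot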

Features

  • Model-based variable importance - Compute variable importance specific to a particular model (e.g., random forests, gradient boosted decision trees, or multivariate adaptive regression splines) from a wide range of R packages (e.g., randomForest, ranger, xgboost, and many more). Also supports the caret and parsnip (starting with version 0.0.4) packages.

  • Permutation-based variable importance - An efficient implementation of the permutation feature importance algorithm discussed in Christoph Molnar’s Interpretable Machine Learning book (a short sketch follows this list).

  • Shapley-based variable importance - An efficient implementation of feature importance based on the popular Shapley values via the fastshap package.

  • Variance-based variable importance - Compute variable importance using a simple feature importance ranking measure (FIRM) approach. For details, see Greenwell et al. (2018) and Scholbeck et al. (2019).
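
As a rough sketch of the model-agnostic route (argument names as in vip 0.3.2), permutation importance can be computed for any fitted model given a prediction wrapper; the example below pairs the package's built-in gen_friedman() benchmark data with a plain linear model:

# Permutation-based (model-agnostic) importance: shuffle each feature
# in turn and measure the resulting increase in mean squared error
library(vip)

friedman <- gen_friedman(seed = 101)  # data frame with response column "y"
lmo <- lm(y ~ ., data = friedman)

vi_permute(lmo, target = "y", metric = "mse",
           pred_wrapper = predict, train = friedman)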

Installation

# The easiest way to get vip is to install it from CRAN:
install.packages("vip")

# Alternatively, you can install the development version from GitHub:
if (!requireNamespace("remotes", quietly = TRUE)) {
  install.packages("remotes")
}
remotes::install_github("koalaverse/vip")

Monthly Downloads: 8,869
Version: 0.3.2
License: GPL (>= 2)
Maintainer: Brandon Greenwell
Last Published: December 17th, 2020

Functions in vip (0.3.2)

  • list_metrics - List metrics
  • vi_permute - Permutation-based variable importance
  • vi_firm - Variance-based variable importance
  • vint - Interaction effects
  • vip - Variable importance plots
  • vi - Variable importance
  • vi_model - Model-specific variable importance
  • vi_shap - SHAP-based variable importance
  • add_sparklines - Add sparklines
  • bin - Bin a numeric vector
  • get_formula - Extract model formula
  • metric_mse - Model metrics
  • get_feature_names - Extract feature names
  • reexports - Objects exported from other packages
  • grid.arrange - Arrange multiple grobs on a page
  • gen_friedman - Friedman benchmark data
  • vi_pdp - Deprecated