⚠️ A newer version (0.11.4) of this package is available.

iml (version 0.3.0)

Interpretable Machine Learning

Description

Interpretability methods to analyze the behavior and predictions of any machine learning model. Implemented methods are: feature importance as described by Fisher et al. (2018), partial dependence plots as described by Friedman (2001), individual conditional expectation ('ice') plots as described by Goldstein et al. (2013), local models (a variant of 'lime') as described by Ribeiro et al. (2016), the Shapley value as described by Strumbelj et al. (2014), and tree surrogate models.
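
A minimal sketch of the typical workflow, shown here with a linear model on the built-in mtcars data. The constructor arguments used below (data, y, loss, x.interest) follow the class index of this version but are assumptions; check ?Predictor, ?FeatureImp and ?Shapley for the exact signatures.

library("iml")

# Fit any supervised model; a linear model keeps the example self-contained.
mod <- lm(mpg ~ ., data = mtcars)

# Wrap model and data in a Predictor object, the common interface
# that all interpretability methods in iml operate on.
predictor <- Predictor$new(mod, data = mtcars[, -1], y = mtcars$mpg)

# Permutation feature importance (Fisher et al. 2018).
imp <- FeatureImp$new(predictor, loss = "mae")
plot(imp)

# Shapley values explaining a single prediction (Strumbelj et al. 2014).
shap <- Shapley$new(predictor, x.interest = mtcars[1, -1])
plot(shap)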

Install

install.packages('iml')

Monthly Downloads

6,125

Version

0.3.0

License

MIT + file LICENSE

Maintainer

Christoph Molnar

Last Published

April 10th, 2018

Functions in iml (0.3.0)

Ice: Individual conditional expectations (Ice)
plot.FeatureImp: Plot Feature Importance
Shapley: Prediction explanations with game theory
FeatureImp: Feature importance
iml-package: Make machine learning models and predictions interpretable
PartialDependence: Partial Dependence Plot
Partial: Partial Dependence and Individual Conditional Expectation
LocalModel: LocalModel
Predictor: Predictor object
TreeSurrogate: Decision tree surrogate model
plot.Ice: Plot ICE (Individual Conditional Expectation)
plot.TreeSurrogate: Plot Tree Surrogate
predict.TreeSurrogate: Predict Tree Surrogate
predict.LocalModel: Predict LocalModel
plot.PartialDependence: Plot Partial Dependence
plot.Shapley: Plot Shapley
plot.LocalModel: Plot Local Model
plot.Partial: Plot Partial Dependence
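
The plot and predict entries above are S3 methods, so the base plot() and predict() generics dispatch to them once the corresponding object has been created. A short sketch follows, reusing the predictor object from the earlier example; whether feature is given by name or by column index, and the exact argument names maxdepth and newdata, are assumptions for this version, so check the class help pages.

# Partial dependence (Friedman 2001) and ICE curves (Goldstein et al. 2013).
pdp <- PartialDependence$new(predictor, feature = "wt")
plot(pdp)    # dispatches to plot.PartialDependence

ice <- Ice$new(predictor, feature = "wt")
plot(ice)    # dispatches to plot.Ice

# Global surrogate: approximate the model with a short decision tree.
tree <- TreeSurrogate$new(predictor, maxdepth = 2)
plot(tree)                             # dispatches to plot.TreeSurrogate
predict(tree, newdata = mtcars[, -1])  # dispatches to predict.TreeSurrogate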