xgboost (version 0.4-1)

xgb.plot.tree: Plot a boosted tree model

Description

Read a tree model text dump and plot the model. Plotting only works for boosted tree models (not for linear models).

Usage

xgb.plot.tree(feature_names = NULL, filename_dump = NULL, model = NULL,
  n_first_tree = NULL, CSSstyle = NULL, width = NULL, height = NULL)

Arguments

feature_names
names of each feature as a character vector. Can be extracted from a sparse matrix (see the example below). If the model dump already contains feature names, this argument should be NULL.
filename_dump
the path to the text file storing the model. The model dump must include the gain per feature and per tree (parameter with.stats = T in the xgb.dump function). Alternatively, a model can be provided directly (see the model argument); a short sketch of the dump-file workflow follows this argument list.
model
the model generated by the xgb.train function. Providing it avoids the creation of a dump file.
n_first_tree
limit the plot to the first n trees. If NULL, all trees of the model are plotted. Performance can be low for huge models.
CSSstyle
a character vector storing a CSS style to customize the appearance of nodes. See the Mermaid wiki (https://github.com/knsv/mermaid/wiki) for more information. An illustrative style string appears in the sketch after this list.
width
the width of the diagram in pixels.
height
the height of the diagram in pixels.
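The sketch below is a minimal, hedged illustration of the dump-file workflow and of the sizing and styling arguments above. It assumes a fitted booster bst (as in the Examples section); the temporary file and the CSS string are placeholders, not values taken from this documentation.

# Write a text dump that includes gain and cover statistics,
# then plot from that file instead of passing the model object.
dump_path <- tempfile(fileext = ".dump")
xgb.dump(bst, fname = dump_path, with.stats = TRUE)
xgb.plot.tree(filename_dump = dump_path)

# Restrict the plot to the first tree, set the widget size in pixels,
# and pass a Mermaid classDef string (purely illustrative; the class
# names used by the generated diagram may differ).
xgb.plot.tree(model = bst, n_first_tree = 1, width = 800, height = 600,
              CSSstyle = "classDef greenNode fill:#A2EB86,stroke:#04C4AB")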

Value

  • A DiagrammeR object rendering the model.

Details

The content of each node is organised as follows:

  • feature: the name of the feature used for the split and the split value ;
  • cover: the sum of second order gradients of the training data classified to the branch; if it is square loss, this simply corresponds to the number of instances in that branch. The deeper in the tree a node is, the lower this metric will be ;
  • gain: the metric measuring the importance of the node in the model.

Each branch finishes with a leaf. For each leaf, only the cover is indicated. The diagram is rendered with the Mermaid library (https://github.com/knsv/mermaid/).
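The cover and gain values displayed in the nodes come from the statistics-enabled text dump of the model. As a minimal sketch (assuming a booster bst fitted as in the Examples section), the same figures can be inspected directly:

# Dump the model with per-node statistics and look at the first lines:
# internal nodes report gain and cover, leaves report cover only.
dump_path <- tempfile(fileext = ".dump")
xgb.dump(bst, fname = dump_path, with.stats = TRUE)
head(readLines(dump_path), 10)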

Examples

library(xgboost)
data(agaricus.train, package='xgboost')

# The dataset is a list with two items, a sparse matrix and labels
# (labels = outcome column which will be learned).
# Each column of the sparse matrix is a feature in one-hot encoding format.
train <- agaricus.train

bst <- xgboost(data = train$data, label = train$label, max.depth = 2,
               eta = 1, nthread = 2, nround = 2, objective = "binary:logistic")

# agaricus.train$data@Dimnames[[2]] represents the column names of the sparse matrix.
xgb.plot.tree(feature_names = agaricus.train$data@Dimnames[[2]], model = bst)
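The returned DiagrammeR object is an htmlwidget, so as a hedged follow-up it can be stored and, assuming the htmlwidgets package is installed, written to a standalone HTML file (the file name below is only a placeholder):

p <- xgb.plot.tree(feature_names = agaricus.train$data@Dimnames[[2]],
                   model = bst, n_first_tree = 1)
# Save the interactive diagram for viewing outside the R session.
htmlwidgets::saveWidget(p, "xgb_tree.html")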
