Note: a newer version (1.7.10.1) of this package is available.

xgboost (version 1.3.1.1)

Extreme Gradient Boosting

Description

Extreme Gradient Boosting is an efficient implementation of the gradient boosting framework from Chen & Guestrin (2016). This package is its R interface. It includes an efficient linear model solver and tree learning algorithms, and it can automatically run parallel computation on a single machine, which can make it more than 10 times faster than existing gradient boosting packages. It supports various objective functions, including regression, classification, and ranking, and it is designed to be extensible, so that users can easily define their own objectives.
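
For example, a minimal training run on the bundled Mushroom (agaricus) data looks like the sketch below; the parameter values are illustrative only, not tuned recommendations.

library(xgboost)
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")
# Fit a small binary classifier on the sparse feature matrix
bst <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max_depth = 2, eta = 1, nrounds = 2, nthread = 2,
               objective = "binary:logistic")
# Predicted probabilities for the held-out test split
pred <- predict(bst, agaricus.test$data)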

Install: install.packages('xgboost')
Monthly Downloads: 56,736
Version: 1.3.1.1
License: Apache License (== 2.0) | file LICENSE
Maintainer: Tong He
Last Published: January 5th, 2021

Functions in xgboost (1.3.1.1)

prepare.ggplot.shap.data: Combine and melt feature values and SHAP contributions for sample observations.
dim.xgb.DMatrix: Dimensions of xgb.DMatrix
dimnames.xgb.DMatrix: Handling of column names of xgb.DMatrix
getinfo: Get information of an xgb.DMatrix object
cb.save.model: Callback closure for saving a model file.
print.xgb.cv.synchronous: Print xgb.cv result
print.xgb.DMatrix: Print xgb.DMatrix
cb.early.stop: Callback closure to activate early stopping.
xgb.create.features: Create new features from a previously learned model
xgb.config: Accessors for model parameters as a JSON string.
xgb.cv: Cross Validation
xgb.dump: Dump an xgboost model in text format.
xgb.model.dt.tree: Parse a boosted tree model text dump
xgb.parameters<-: Accessors for model parameters.
xgb.ggplot.deepness: Plot model tree depth
xgb.ggplot.importance: Plot feature importance as a bar graph
predict.xgb.Booster: Predict method for eXtreme Gradient Boosting model
xgb.plot.multi.trees: Project all trees on one tree and plot it
normalize: Scale feature values to have mean 0 and standard deviation 1
xgb.DMatrix.save: Save xgb.DMatrix object to a binary file
xgb.attr: Accessors for serializable attributes of a model.
xgb.serialize: Serialize the booster instance into R's raw vector. This differs from xgb.save.raw, which saves only the model and not its parameters. The serialization format is not stable across xgboost versions.
xgb.shap.data: Prepare data for SHAP plots, to be used in xgb.plot.shap, xgb.plot.shap.summary, etc. Internal utility function.
slice: Get a new DMatrix containing the specified rows of the original xgb.DMatrix object
setinfo: Set information of an xgb.DMatrix object
xgb.ggplot.shap.summary: SHAP contribution dependency summary plot
xgb.gblinear.history: Extract gblinear coefficients history.
xgb.plot.shap: SHAP contribution dependency plots
xgb.save: Save xgboost model to a binary file
xgb.save.raw: Save xgboost model to R's raw vector; use xgb.load.raw to load the model back from the raw vector
xgb.importance: Importance of features in a model.
xgb.load.raw: Load serialized xgboost model from R's raw vector
xgb.load: Load xgboost model from a binary file
xgb.plot.tree: Plot a boosted tree model
xgb.DMatrix: Construct xgb.DMatrix object
print.xgb.Booster: Print xgb.Booster
xgb.train: eXtreme Gradient Boosting Training
xgb.unserialize: Load the booster instance from the raw vector produced by xgb.serialize
xgb.Booster.complete: Restore missing parts of an incomplete xgb.Booster object.
xgboost-deprecated: Deprecation notices.
cb.cv.predict: Callback closure for returning cross-validation based predictions.
callbacks: Callback closures for booster training.
agaricus.train: Training part of the Mushroom Data Set
agaricus.test: Test part of the Mushroom Data Set
a-compatibility-note-for-saveRDS-save: Do not use saveRDS or save for long-term archival of models; use xgb.save or xgb.save.raw instead (see the sketch after this list).
cb.print.evaluation: Callback closure for printing the result of evaluation
cb.gblinear.history: Callback closure for collecting the model coefficients history of a gblinear booster during its training.
cb.reset.parameters: Callback closure for resetting the booster's parameters at each iteration.
cb.evaluation.log: Callback closure for logging the evaluation history
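
As noted in the a-compatibility-note-for-saveRDS-save entry above, xgb.save and xgb.load are the preferred way to persist models across sessions and xgboost versions. A minimal sketch, assuming a booster bst trained as in the example under Description (the file name is illustrative):

# Write the trained booster to xgboost's own binary format
xgb.save(bst, "xgboost.model")
# Restore it later and use it like any other booster
bst2 <- xgb.load("xgboost.model")
pred2 <- predict(bst2, agaricus.test$data)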