adaboost

AdaBoost Classifier

An implementation of the AdaBoost algorithm of Freund and Schapire (1997) applied to decision tree classifiers.

Usage
adaboost(X, y, tree_depth = 3, n_rounds = 100, verbose = FALSE,
  control = NULL)
Arguments
X

A matrix of continuous predictors.

y

A vector of responses with entries in c(-1, 1).

tree_depth

The depth of the base tree classifier to use.

n_rounds

The number of rounds of boosting to use.

verbose

Whether to print the number of iterations.

control

An rpart.control list that controls properties of the fitted decision trees (see the sketch below).
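
For example, a control list built with rpart::rpart.control() can be supplied to change how the base trees are grown. The following is a minimal sketch only; the placeholder objects X and y, and the assumption that a supplied control list overrides the default tree settings, are illustrative rather than documented behaviour.

library(rpart)

# Illustrative: grow unpruned base trees of depth at most 2
ctrl = rpart.control(cp = 0, minsplit = 10, maxdepth = 2)
fit = adaboost(X, y, n_rounds = 50, control = ctrl)  # X, y as in Usage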

Value

Returns an object of class adaboost containing the following values:

alphas

Weights computed in the adaboost fit.

trees

The trees constructed in each round of boosting. Storing the trees allows predictions to be made on new data (see the sketch after this list).

confusion_matrix

A confusion matrix for the in-sample fits.
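
Conceptually, a prediction for a new point is the sign of the alpha-weighted sum of the individual tree votes, which is why both the trees and the alphas are stored. A minimal sketch of inspecting these components on a fitted object, assuming the component names listed above and placeholder data X and y:

ada = adaboost(X, y, tree_depth = 2, n_rounds = 100)  # X, y as in Usage
ada$alphas            # one weight per boosting round
length(ada$trees)     # one stripped-down tree per round
ada$confusion_matrix  # confusion matrix for the in-sample fits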

Note

Trees are grown using the CART algorithm implemented in the rpart package. To conserve memory, only the parts of each fitted tree that are essential for prediction are retained. In practice, the number of boosting rounds is chosen by cross-validation (see the sketch below).
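
As a sketch, a single train/validation split can stand in for full cross-validation when picking n_rounds; the grid of candidate values below is an illustrative assumption, and circle_data() is the simulator used again in the Examples.

set.seed(1)
dat = circle_data(n = 500)
idx = sample(1:500, 400)
val_err = sapply(c(50, 100, 200, 400), function(m) {
  fit = adaboost(dat$X[idx, ], dat$y[idx], tree_depth = 2, n_rounds = m)
  mean(predict(fit, dat$X[-idx, ]) != dat$y[-idx])
})
val_err  # choose the n_rounds with the smallest held-out error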

References

Freund, Y. and Schapire, R. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55: 119-139.

Aliases
  • adaboost
Examples
## Not run:
library(JOUSBoost)

# Generate data from the circle model
set.seed(111)
dat = circle_data(n = 500)
train_index = sample(1:500, 400)

ada = adaboost(dat$X[train_index,], dat$y[train_index], tree_depth = 2,
               n_rounds = 200, verbose = TRUE)
print(ada)
yhat_ada = predict(ada, dat$X[-train_index,])

# calculate misclassification rate
mean(dat$y[-train_index] != yhat_ada)
## End(Not run)
Documentation reproduced from package JOUSBoost, version 2.1.0, License: MIT + file LICENSE
