An implementation of the AdaBoost algorithm of Freund and Schapire (1997), applied to decision tree classifiers.

```
adaboost(X, y, tree_depth = 3, n_rounds = 100, verbose = FALSE,
control = NULL)
```

X

A matrix of continuous predictors.

y

A vector of responses with entries in `c(-1, 1)`.

tree_depth

The depth of the base tree classifier to use.

n_rounds

The number of rounds of boosting to use.

verbose

Whether to print the number of iterations.

control

An `rpart.control` list that controls properties of the fitted decision trees.
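The roles of `tree_depth` and `n_rounds` can be illustrated by sketching one boosting round: fit a depth-limited `rpart` tree to weighted data, score it by its weighted error, and re-weight the observations. This is a minimal, hypothetical sketch (the variable names and `rpart.control` settings are illustrative, not the package's internals):

```r
library(rpart)

# Toy data: circle-like labels in c(-1, 1).
set.seed(1)
n <- 200
X <- matrix(rnorm(2 * n), ncol = 2)
y <- ifelse(X[, 1]^2 + X[, 2]^2 > 2, 1, -1)
w <- rep(1 / n, n)                       # uniform initial observation weights

dat <- data.frame(X, y = as.factor(y))

# One weak learner; tree_depth bounds maxdepth, n_rounds repeats this block.
fit <- rpart(y ~ ., data = dat, weights = w,
             control = rpart.control(maxdepth = 2, cp = 0))
pred <- ifelse(predict(fit, dat, type = "class") == "1", 1, -1)

err   <- sum(w * (pred != y)) / sum(w)   # weighted training error
alpha <- 0.5 * log((1 - err) / err)      # Freund-Schapire vote weight
w <- w * exp(-alpha * y * pred)          # up-weight misclassified points
w <- w / sum(w)                          # renormalize for the next round
```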

Returns an object of class `adaboost` containing the following values:

Weights computed in the adaboost fit.

The trees constructed in each round of boosting. Storing trees allows one to make predictions on new data.

A confusion matrix for the in-sample fits.
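Because the fit stores both the per-round trees and their weights, a prediction on new data is the sign of the weighted vote over rounds. A hedged sketch of that aggregation (the function and component names here are assumptions for illustration, not the package's exported API):

```r
library(rpart)

# sign(sum_m alpha_m * h_m(x)): combine per-round trees with their weights.
predict_boost <- function(alphas, trees, newdata) {
  scores <- rep(0, nrow(newdata))
  for (m in seq_along(trees)) {
    h <- ifelse(predict(trees[[m]], newdata, type = "class") == "1", 1, -1)
    scores <- scores + alphas[m] * h     # weighted vote of round m
  }
  ifelse(scores >= 0, 1, -1)             # final label in c(-1, 1)
}

# Toy demonstration with two depth-1 trees and equal vote weights.
set.seed(2)
d <- data.frame(x = rnorm(50))
d$y <- as.factor(ifelse(d$x > 0, 1, -1))
t1 <- rpart(y ~ x, data = d, control = rpart.control(maxdepth = 1, cp = 0))
t2 <- rpart(y ~ x, data = d, control = rpart.control(maxdepth = 1, cp = 0))
yhat <- predict_boost(c(0.5, 0.5), list(t1, t2), d)
```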

Freund, Y. and Schapire, R. (1997). A decision-theoretic generalization of on-line learning and an application to boosting, Journal of Computer and System Sciences 55: 119-139.

```
# Generate data from the circle model
set.seed(111)
dat = circle_data(n = 500)
train_index = sample(1:500, 400)

ada = adaboost(dat$X[train_index, ], dat$y[train_index], tree_depth = 2,
               n_rounds = 200, verbose = TRUE)
print(ada)

yhat_ada = predict(ada, dat$X[-train_index, ])

# calculate misclassification rate
mean(dat$y[-train_index] != yhat_ada)
```
