MachineShop (version 3.7.0)

AdaBoostModel: Boosting with Classification Trees

Description

Fits the AdaBoost.M1 (Freund and Schapire, 1996) and SAMME (Zhu et al., 2009) algorithms, using classification trees as the base classifiers.

Usage

AdaBoostModel(
  boos = TRUE,
  mfinal = 100,
  coeflearn = c("Breiman", "Freund", "Zhu"),
  minsplit = 20,
  minbucket = round(minsplit/3),
  cp = 0.01,
  maxcompete = 4,
  maxsurrogate = 5,
  usesurrogate = 2,
  xval = 10,
  surrogatestyle = 0,
  maxdepth = 30
)

Value

MLModel class object.

Arguments

boos

if TRUE, then bootstrap samples are drawn from the training set using the observation weights at each iteration. If FALSE, then all observations are used with their weights.

mfinal

number of iterations for which boosting is run.

coeflearn

weight-updating coefficient of the learning algorithm: "Breiman" uses alpha = (1/2) ln((1 - err)/err) and "Freund" uses alpha = ln((1 - err)/err), both under the AdaBoost.M1 algorithm; "Zhu" runs SAMME with alpha = ln((1 - err)/err) + ln(nclasses - 1). See also the sketch following these arguments.

minsplit

minimum number of observations that must exist in a node in order for a split to be attempted.

minbucket

minimum number of observations in any terminal node.

cp

complexity parameter: any split that does not decrease the overall lack of fit by at least a factor of cp is not attempted.

maxcompete

number of competitor splits retained in the output.

maxsurrogate

number of surrogate splits retained in the output.

usesurrogate

how to use surrogates in the splitting process: 0 means surrogates are displayed only, and observations missing the primary split variable are not sent further down the tree; 1 means surrogates are used, in order, to split such observations; 2 additionally sends observations missing all surrogates in the majority direction.

xval

number of cross-validations.

surrogatestyle

controls the selection of a best surrogate: 0 ranks candidate surrogates by the total number of correct classifications, 1 by the percent correct calculated over their non-missing values.

maxdepth

maximum depth of any node of the final tree, with the root node counted as depth 0.
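
The remaining tree-control arguments (minsplit through maxdepth) are the rpart.control() options applied to the component classification trees. A minimal sketch of setting a few of them (the particular values are illustrative assumptions, not recommended defaults):

library(MachineShop)

model <- AdaBoostModel(
  mfinal = 50,        # 50 boosting iterations
  coeflearn = "Zhu",  # SAMME weight-updating coefficient
  maxdepth = 3,       # shallow component trees
  cp = 0              # grow to maxdepth without complexity-based pruning
)
fit(Species ~ ., data = iris, model = model)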

Details

Response types: factor

Automatic tuning of grid parameters: mfinal, maxdepth, coeflearn*

* excluded from grids by default

Further model details can be found in the adabag boosting documentation referenced under See Also.
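
Since mfinal and maxdepth are in the automatic tuning grid, the model can be tuned over them with MachineShop's TunedModel(); a sketch under the package's default grid and resampling control (both assumed here):

library(MachineShop)

tuned_fit <- fit(Species ~ ., data = iris, model = TunedModel(AdaBoostModel))
## the selected mfinal and maxdepth values can then be inspected,
## e.g. with as.MLModel(tuned_fit)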

See Also

boosting, fit, resample

Examples

## Requires prior installation of suggested package adabag to run
library(MachineShop)

fit(Species ~ ., data = iris, model = AdaBoostModel(mfinal = 5))
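
## The call above returns an MLModelFit object. A sketch of downstream
## use with MachineShop's predict() and resample(); the CVControl()
## resampling choice here is an illustrative assumption:
model_fit <- fit(Species ~ ., data = iris, model = AdaBoostModel(mfinal = 5))
predict(model_fit, newdata = head(iris))   # predicted factor levels
resample(Species ~ ., data = iris,
         model = AdaBoostModel(mfinal = 5),
         control = CVControl())            # cross-validated performance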
