MachineShop (version 3.7.0)

GLMBoostModel: Gradient Boosting with Linear Models

Description

Gradient boosting for optimizing arbitrary loss functions, in which component-wise linear models are used as base learners.

Usage

GLMBoostModel(
  family = NULL,
  mstop = 100,
  nu = 0.1,
  risk = c("inbag", "oobag", "none"),
  stopintern = FALSE,
  trace = FALSE
)

Value

MLModel class object.

Arguments

family

optional Family object. Set automatically according to the class type of the response variable.

mstop

number of initial boosting iterations.

nu

step size or shrinkage parameter between 0 and 1.

risk

method to use in computing the empirical risk for each boosting iteration.

stopintern

logical indicating whether the boosting algorithm stops internally when the out-of-bag risk increases at a subsequent iteration.

trace

logical indicating whether status information is printed during the fitting process.
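The arguments above are passed directly to the model constructor. A minimal sketch with non-default settings (the specific values here are illustrative, not recommendations):

```r
library(MachineShop)

## Construct a GLMBoostModel with more boosting iterations and a
## smaller step size than the defaults listed in the Usage section
model <- GLMBoostModel(mstop = 500, nu = 0.05, risk = "inbag",
                       trace = FALSE)
```

The resulting MLModel object can then be supplied to fit or resample like any other MachineShop model.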

Details

Response types:

binary factor, BinomialVariate, NegBinomialVariate, numeric, PoissonVariate, Surv

Automatic tuning of grid parameter:

mstop

Default argument values and further model details can be found in the source See Also links below.
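Because mstop is the automatically tuned grid parameter, it can be optimized with MachineShop's TunedModel wrapper. A sketch assuming the Pima.tr data from the Examples section (the grid size of 5 is illustrative):

```r
## Requires prior installation of suggested packages mboost and MASS
library(MachineShop)
data(Pima.tr, package = "MASS")

## Tune mstop over an automatically generated grid of 5 values
tuned_fit <- fit(type ~ ., data = Pima.tr,
                 model = TunedModel(GLMBoostModel, grid = 5))
```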

See Also

glmboost, Family, fit, resample

Examples

# \donttest{
## Requires prior installation of suggested package mboost to run

data(Pima.tr, package = "MASS")

fit(type ~ ., data = Pima.tr, model = GLMBoostModel)
# }
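As the resample entry under See Also suggests, predictive performance can also be estimated by resampling rather than a single fit. A minimal sketch, again assuming the Pima.tr data (the choice of 5-fold cross-validation is illustrative):

```r
## Requires prior installation of suggested packages mboost and MASS
library(MachineShop)
data(Pima.tr, package = "MASS")

## Estimate predictive performance with 5-fold cross-validation
res <- resample(type ~ ., data = Pima.tr, model = GLMBoostModel,
                control = CVControl(folds = 5))
summary(res)
```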
