Gradient boosting for optimization of arbitrary loss functions, with component-wise linear models used as base learners.
GLMBoostModel(
  family = NULL,
  mstop = 100,
  nu = 0.1,
  risk = c("inbag", "oobag", "none"),
  stopintern = FALSE,
  trace = FALSE
)
family: optional Family object. Set automatically according to the class type of the response variable.
mstop: number of initial boosting iterations.
nu: step size or shrinkage parameter between 0 and 1.
risk: method to use in computing the empirical risk for each boosting iteration.
stopintern: logical indicating whether the boosting algorithm stops internally when the out-of-bag risk increases at a subsequent iteration.
trace: logical indicating whether status information is printed during the fitting process.
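To illustrate these arguments, the following sketch (not part of the original page) constructs the model with non-default values for mstop, nu, and trace before fitting; the MASS Pima.tr data set is used purely as an example.

library(MASS)
library(MachineShop)

## Illustrative sketch: smaller step size and more boosting iterations
model <- GLMBoostModel(mstop = 200, nu = 0.05, trace = TRUE)
model_fit <- fit(type ~ ., data = Pima.tr, model = model)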
Value: MLModel class object.
Response types: binary, numeric, Surv
Automatic tuning of grid parameters: mstop
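As a sketch of how the mstop grid parameter might be tuned, the example below uses MachineShop's TunedModel with an arbitrary grid of candidate values; it is an illustration under those assumptions rather than a recommended setting.

library(MASS)
library(MachineShop)

## Tune mstop over a small, arbitrary grid of candidate values
tuned <- TunedModel(GLMBoostModel, grid = expand.grid(mstop = c(50, 100, 200)))
tuned_fit <- fit(type ~ ., data = Pima.tr, model = tuned)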
Default values for the NULL arguments and further model details can be found in the source links below.
## Not run:
library(MASS)
library(MachineShop)

fit(type ~ ., data = Pima.tr, model = GLMBoostModel)
## End(Not run)
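A possible follow-up to the example above, assuming the MachineShop predict and varimp methods: generate predictions on the MASS Pima.te holdout set and inspect variable importance.

## Store the fit, predict on holdout data, and compute variable importance
pima_fit <- fit(type ~ ., data = Pima.tr, model = GLMBoostModel)
predict(pima_fit, newdata = Pima.te)
varimp(pima_fit)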