mboost (version 0.4-6)

methods: Methods for Gradient Boosting Objects

Description

Methods for models fitted by boosting algorithms.

Usage

## S3 method for class 'glmboost':
print(x, ...)
## S3 method for class 'gamboost':
print(x, ...)
## S3 method for class 'glmboost':
coef(object, ...)
## S3 method for class 'gb':
AIC(object, method = c("corrected", "classical"), ...)
## S3 method for class 'gbAIC':
mstop(object, ...)
## S3 method for class 'gb':
predict(object, newdata = NULL, type = c("lp", "response"), ...)
## S3 method for class 'gb':
fitted(object, type = c("lp", "response"), ...)
## S3 method for class 'gb':
logLik(object, ...)

Arguments

object
objects of class glmboost, gamboost or gbAIC.
x
objects of class glmboost or gamboost.
newdata
optionally, a data frame in which to look for variables with which to predict.
type
a character indicating whether the linear predictor ("lp") or the response should be returned; for classification problems, type = "response" yields the predicted classes.
method
a character specifying whether the corrected AIC criterion or the classical one (-2 * logLik + 2 * df) should be computed.
...
additional arguments passed to callees.

Details

These functions can be used to extract details from fitted models. print shows a dense representation of the model fit, and coef extracts the regression coefficients of a linear model fitted via glmboost. The predict function computes predictions of the response for new observations, whereas fitted extracts the regression fit for the observations in the learning sample.
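
For instance (a minimal sketch, assuming the mboost package is attached; the object mod and the newdata values are purely illustrative):

    library("mboost")
    ### fit a linear model to the built-in cars data by boosting
    mod <- glmboost(dist ~ speed, data = cars)
    print(mod)                 ### dense summary of the model fit
    coef(mod)                  ### boosted regression coefficients
    head(fitted(mod))          ### regression fit on the learning sample
    ### predictions of the response for new observations
    predict(mod, newdata = data.frame(speed = c(10, 20)))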

For (generalized) linear and additive models, the AIC function can be used to compute both the classical AIC and the corrected AIC of Hurvich et al. (1998); the latter is only available when family = GaussReg() was used. Both criteria are useful for determining the optimal number of boosting iterations, which can be extracted via mstop.
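
Continuing the sketch above (the corrected criterion is only defined for family = GaussReg(), on which the sketch relies as the default):

    AIC(mod, method = "corrected")   ### corrected AIC (Hurvich et al., 1998)
    AIC(mod, method = "classical")   ### classical -2 * logLik + 2 * df
    mstop(AIC(mod))                  ### iteration minimising the corrected AIC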

Note that logLik and AIC only make sense when the corresponding Family implements the appropriate loss function.
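
Continuing the sketch, the Gaussian regression family implements the squared-error loss with a genuine log-likelihood, so both quantities are available:

    logLik(mod)                      ### log-likelihood at the current mstop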

References

Clifford M. Hurvich, Jeffrey S. Simonoff and Chih-Ling Tsai (1998), Smoothing parameter selection in nonparametric regression using an improved Akaike information criterion. Journal of the Royal Statistical Society, Series B, 60(2), 271--293.

Peter Bühlmann and Torsten Hothorn (2006), Boosting: A statistical perspective. Submitted manuscript.

Examples

    ### a simple two-dimensional example: cars data
    library("mboost")
    cars.gb <- glmboost(dist ~ speed, data = cars,
                        control = boost_control(mstop = 2000))
    cars.gb

    ### AIC criterion
    aic <- AIC(cars.gb, method = "corrected")
    aic

    ### coefficients for optimal number of boosting iterations
    coef(cars.gb[mstop(aic)])
    plot(cars$dist, predict(cars.gb[mstop(aic)]), 
         ylim = range(cars$dist))
    abline(a = 0, b = 1)
