mboost (version 1.0-1)

gamboost: Gradient Boosting with Smooth Components

Description

Gradient boosting for optimizing arbitrary loss functions, where component-wise smoothing procedures are utilized as base learners.

Usage

## S3 method for class 'formula':
gamboost(formula, data = list(), weights = NULL, 
        na.action = na.omit, ...)
## S3 method for class 'matrix':
gamboost(x, y, weights = NULL, ...)
gamboost_fit(object, baselearner = c("bss", "bbs", "bols", "bns"), 
             dfbase = 4, family = GaussReg(), 
             control = boost_control(), weights = NULL)
## S3 method for class 'gamboost':
plot(x, which = NULL, ask = TRUE && dev.interactive(), 
    type = "b", ylab = expression(f[partial]), add_rug = TRUE, ...)

Arguments

formula
a symbolic description of the model to be fit.
data
a data frame containing the variables in the model.
weights
an optional vector of weights to be used in the fitting process.
na.action
a function which indicates what should happen when the data contain NAs.
x
design matrix (for gamboost.matrix) or an object returned by gamboost to be plotted via plot.
y
vector of responses.
object
an object of class boost_data, see boost_dpp.
baselearner
a character specifying the component-wise base learner to be used: bss means smoothing splines (see Buhlmann and Yu 2003), bbs P-splines with a B-spline basis (see Schmid and Hothorn 2007), bols linear models, and bns penalized natural splines.
dfbase
an integer vector giving the degrees of freedom for the smoothing spline, either globally for all variables (when its length is one) or separately for each covariate.
family
an object of class boost_family-class implementing the negative gradient corresponding to the loss function to be optimized; by default, squared error loss (GaussReg()) is used.
control
an object of class boost_control.
which
if a subset of the plots is required, specify a subset of the variables.
ask
logical; if TRUE, the user is asked before each plot, see par(ask=.).
type
what type of plot should be drawn: see plot.
ylab
a title for the y axis: see title.
add_rug
logical; if TRUE, rugs are added.
...
additional arguments passed to callees.

Value

  • An object of class gamboost with print, AIC and predict methods being available.

Details

A (generalized) additive model is fitted using a boosting algorithm based on component-wise univariate base learners. The base learner can either be specified via the formula object or via the baselearner argument (see bbs for an example). If the base learners specified in formula differ from baselearner, the latter argument will be ignored.
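For example, the base learner can be selected via the baselearner argument as follows; this is an illustrative sketch only, where df, y, x1 and x2 are hypothetical data and variable names:

```r
## sketch: fit with P-spline ("bbs") base learners instead of the
## default smoothing splines ("bss"); df, y, x1, x2 are hypothetical
library("mboost")
mod <- gamboost(y ~ x1 + x2, data = df, baselearner = "bbs",
                control = boost_control(mstop = 100))
```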

The function gamboost_fit provides access to the fitting procedure without data pre-processing, e.g. for cross-validation.
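A minimal sketch of such a direct call, assuming the pre-processed boost_data object is produced by boost_dpp (see the object argument above); the arguments shown are illustrative:

```r
## sketch: bypass formula pre-processing and call the fitting
## procedure directly, e.g. inside a cross-validation loop
library("mboost")
dpp <- boost_dpp(dist ~ speed, data = cars)  # data pre-processing
fit <- gamboost_fit(dpp, baselearner = "bss", dfbase = 4,
                    control = boost_control(mstop = 50))
```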

References

Peter Buhlmann and Bin Yu (2003), Boosting with the L2 loss: regression and classification. Journal of the American Statistical Association, 98, 324--339.

Peter Buhlmann and Torsten Hothorn (2007), Boosting algorithms: regularization, prediction and model fitting. Statistical Science, accepted. http://www.imstat.org/sts/future_papers.html

Thomas Kneib, Torsten Hothorn and Gerhard Tutz (2007), Variable selection and model choice in geoadditive regression models. Technical Report No. 3, Institut fuer Statistik, LMU Muenchen. http://epub.ub.uni-muenchen.de/2063/

Matthias Schmid and Torsten Hothorn (2007), Boosting additive models using component-wise P-splines as base-learners. Technical Report No. 2, Institut fuer Statistik, LMU Muenchen. http://epub.ub.uni-muenchen.de/2057/

Examples

### a simple two-dimensional example: cars data
    cars.gb <- gamboost(dist ~ speed, data = cars, dfbase = 4, 
                        control = boost_control(mstop = 50))
    cars.gb
    AIC(cars.gb, method = "corrected")

    ### plot fit for mstop = 1, ..., 50
    plot(dist ~ speed, data = cars)    
    tmp <- sapply(1:mstop(AIC(cars.gb)), function(i)
        lines(cars$speed, predict(cars.gb[i]), col = "red"))          
    lines(cars$speed, predict(smooth.spline(cars$speed, cars$dist),
                              cars$speed)$y, col = "green")

    ### artificial example: sine transformation
    x <- sort(runif(100)) * 10
    y <- sin(x) + rnorm(length(x), sd = 0.25)
    plot(x, y)
    ### linear model
    lines(x, fitted(lm(y ~ sin(x) - 1)), col = "red")
    ### GAM
    lines(x, fitted(gamboost(y ~ x - 1, 
                    control = boost_control(mstop = 500))), 
          col = "green")
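
    ### plot method: partial contributions of the base learners
    ### (a minimal sketch using the plot arguments documented above)
    cars.gb <- gamboost(dist ~ speed, data = cars,
                        control = boost_control(mstop = 50))
    plot(cars.gb, ask = FALSE)   # partial fit f(speed) with rug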
