GAMBoost (version 1.2-3)

cv.GLMBoost: Cross-validation for GLMBoost fits

Description

A convenience wrapper around cv.GAMBoost that performs K-fold cross-validation for a GLMBoost fit in search of the optimal number of boosting steps.

Usage

cv.GLMBoost(x,y,penalty=length(y),just.criterion=TRUE,...)

Arguments

x
n * q matrix of covariates with linear influence.
y
response vector of length n.
penalty
penalty for the covariates with linear influence.
just.criterion
logical value indicating whether a list with goodness-of-fit information should be returned, or a GLMBoost fit with the optimal number of boosting steps.
...
parameters to be passed to cv.GAMBoost or, subsequently, GAMBoost.

Value

GLMBoost fit with the optimal number of boosting steps or list with the following components:
criterion
vector with the goodness-of-fit criterion for boosting steps 1, ..., maxstep.
se
vector with standard error estimates for the goodness-of-fit criterion in each boosting step.
selected
index of the optimal boosting step.
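
When just.criterion=TRUE, the components above can be inspected directly, e.g. by plotting the criterion with standard-error bands and marking the selected step. A minimal sketch, assuming cv.res is the list returned by cv.GLMBoost (the name cv.res is illustrative, not part of the package):

```r
## Sketch: visualize the cross-validation criterion per boosting step.
## Assumes `cv.res` is the list returned by cv.GLMBoost(..., just.criterion=TRUE).
steps <- seq_along(cv.res$criterion)
plot(steps, cv.res$criterion, type="l",
     xlab="boosting step", ylab="goodness-of-fit criterion")
## dashed lines at +/- one standard error around the criterion
lines(steps, cv.res$criterion + cv.res$se, lty=2)
lines(steps, cv.res$criterion - cv.res$se, lty=2)
## vertical line at the selected (optimal) boosting step
abline(v=cv.res$selected, col="red")
```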

See Also

GLMBoost, cv.GAMBoost, GAMBoost

Examples

## Not run:
##  Generate some data
x <- matrix(runif(100*8,min=-1,max=1),100,8)
eta <- -0.5 + 2*x[,1] + 4*x[,3]
y <- rbinom(100,1,binomial()$linkinv(eta))

##  Fit the model with only linear components
gb1 <- GLMBoost(x,y,penalty=100,stepno=100,trace=TRUE,family=binomial())

##  10-fold cross-validation with prediction error as a criterion
gb1.crit <- cv.GLMBoost(x,y,penalty=100,maxstepno=100,trace=TRUE,
                        family=binomial(),
                        K=10,type="error")

##  Compare AIC and estimated prediction error
which.min(gb1$AIC)
which.min(gb1.crit$criterion)
## End(Not run)
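
With just.criterion=FALSE, cv.GLMBoost instead returns a GLMBoost fit with the cross-validated number of boosting steps directly. A sketch, using x and y as generated in the example above (the result name gb1.opt is illustrative):

```r
## Sketch: obtain the refitted model directly instead of the criterion list.
## Uses `x` and `y` as generated in the example above.
gb1.opt <- cv.GLMBoost(x, y, penalty=100, maxstepno=100,
                       family=binomial(), K=10, type="error",
                       just.criterion=FALSE)
## gb1.opt is a GLMBoost fit with the optimal number of boosting steps
```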
