
bst (version 0.3-2)

cv.bst: Cross-Validation for Binary HingeBoost

Description

Cross-validated estimation of the empirical risk for boosting parameter selection.

Usage

cv.bst(x, y, K = 10, cost = 0.5, family = c("hinge", "gaussian"), 
learner = c("tree", "ls", "sm"), ctrl = bst_control(), 
type = c("risk", "misc"), plot.it = TRUE, se = TRUE, ...)

Arguments

x
a data frame containing the variables in the model.
y
vector of responses. y must be in {1, -1} for family = "hinge".
K
the number of folds for K-fold cross-validation.
cost
price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1 - cost.
family
family = "hinge" for hinge loss and family = "gaussian" for squared error loss. Boosting minimizes the chosen loss by descending along its negative gradient. By default, hinge loss for responses coded +1/-1.
learner
a character string specifying the component-wise base learner: "ls" (linear models), "sm" (smoothing splines) or "tree" (regression trees).
ctrl
an object of class bst_control.
type
cross-validation criterion. For family = "hinge", type = "risk" gives the hinge risk and type = "misc" the misclassification error. For family = "gaussian", only the empirical risk is available.
plot.it
a logical value; if TRUE, plot the estimated cross-validated risks.
se
a logical value; if TRUE, add standard error bars to the plot.
...
additional arguments.
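As a minimal sketch of the response coding and the cost argument (hypothetical data, not from this page): y must be recoded to {1, -1} before calling cv.bst with family = "hinge", and cost sets the relative price of the two error types.

```r
## Hypothetical 0/1 labels, recoded to the {1, -1} coding required by
## family = "hinge".
y01 <- c(1, 0, 1, 1, 0)
y <- ifelse(y01 == 1, 1, -1)  # {0, 1} -> {-1, 1}

## With cost = 0.7, a false positive costs 0.7 and a false negative
## costs 1 - 0.7 = 0.3, e.g.:
## cv.bst(x, y, cost = 0.7, family = "hinge", learner = "ls")
```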

Value

An object containing:
  • residmat: empirical risks for each cross-validation fold at the boosting iterations
  • fraction: abscissa values at which the CV curve is computed
  • cv: the CV curve at each value of fraction
  • cv.error: the standard error of the CV curve
  • ...
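One common use of these components is picking the boosting iteration that minimizes the CV curve. The sketch below uses a mock list with the same component names (`fraction`, `cv`, `cv.error`) standing in for an actual cv.bst() result.

```r
## Mock cv.bst()-style result for illustration only.
res <- list(fraction = 1:5,
            cv       = c(0.40, 0.32, 0.28, 0.30, 0.35),
            cv.error = rep(0.02, 5))

## Iteration with the smallest cross-validated risk.
best <- res$fraction[which.min(res$cv)]
best
```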

See Also

bst

Examples

library("bst")
x <- matrix(rnorm(100 * 5), ncol = 5)
c <- 2 * x[, 1]                   # true linear predictor
p <- exp(c) / (exp(c) + exp(-c))  # P(y = 1)
y <- rbinom(100, 1, p)
y[y != 1] <- -1                   # recode to {1, -1} for hinge loss
x <- as.data.frame(x)
cv.bst(x, y, ctrl = bst_control(mstop = 50), family = "hinge", learner = "ls")
cv.bst(x, y, ctrl = bst_control(mstop = 50), family = "hinge", learner = "ls",
       type = "misc")
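To make the two criteria for family = "hinge" concrete, here is a base-R illustration (a sketch, ignoring the cost weighting) of the hinge risk versus the misclassification error on a toy prediction vector f and labels y:

```r
y <- c(1, -1, 1, -1)          # true labels in {1, -1}
f <- c(0.8, -0.3, -0.2, 0.4)  # toy boosted predictions

hinge_risk <- mean(pmax(0, 1 - y * f))  # type = "risk" (unweighted hinge)
misc <- mean(sign(f) != y)              # type = "misc"
```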
