mpath (version 0.4-2.21)

cv.ccsvm: Cross-validation for ccsvm

Description

Does k-fold cross-validation for ccsvm.

Usage

# S3 method for formula
cv.ccsvm(formula, data, weights, contrasts=NULL, ...)
# S3 method for matrix
cv.ccsvm(x, y, weights, ...)
# S3 method for default
cv.ccsvm(x, ...)

Arguments

formula

symbolic description of the model to be fit; see Details.

data

argument controlling formula processing via model.frame.

x

x matrix as in ccsvm.

y

response y as in ccsvm.

weights

Observation weights; defaults to 1 per observation.

contrasts

the contrasts corresponding to levels from the respective models.

...

Other arguments that can be passed to ccsvm.

Value

An object containing a list of cross-validation results, including the optimal tuning parameters.

residmat

matrix of cross-validation results. For kernel="linear", the rows are s, cost, error, and k, where k is the index of the cross-validation fold. For nonlinear kernels, the rows are s, gamma, cost, error, and k.

cost

the value of cost that gives the minimum cross-validated error in ccsvm.

gamma

the value of gamma that gives the minimum cross-validated error in ccsvm.

s

the value of s for cfun that gives the minimum cross-validated error in ccsvm.

Details

Performs k-fold cross-validation to determine the optimal tuning parameters for the SVM: cost, and also gamma if the kernel is nonlinear. It can also select the value of s used in cfun.
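As a sketch of the nonlinear case, the call below tunes gamma and cost jointly with a radial kernel. It assumes the mpath package is installed; the candidate grids for gamma and cost are illustrative choices for this small example, not package defaults.

```r
library(mpath)
set.seed(195)
# Two slightly separated Gaussian classes, as in the Examples below
x <- matrix(rnorm(40*2), ncol=2)
y <- c(rep(-1, 20), rep(1, 20))
x[y==1, ] <- x[y==1, ] + 1
# Radial kernel: both gamma and cost are cross-validated
# (grids here are illustrative assumptions)
fit <- cv.ccsvm(x, y, type="C-classification", s=1,
                kernel="radial", gamma=2^(-2:2), cost=2^(-2:2),
                cfun="acave")
fit$gamma  # gamma minimizing the cross-validated error
fit$cost   # cost minimizing the cross-validated error
```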

References

Zhu Wang (2020) Unified Robust Estimation via the COCO, arXiv e-prints, https://arxiv.org/abs/2010.02848

See Also

ccsvm

Examples

# Two slightly separated Gaussian classes
x <- matrix(rnorm(40*2), ncol=2)
y <- c(rep(-1, 20), rep(1, 20))
x[y==1, ] <- x[y==1, ] + 1  # shift the +1 class to separate it
# Linear kernel: cross-validate cost and s (gamma is not tuned)
ccsvm.opt <- cv.ccsvm(x, y, type="C-classification", s=1,
                      kernel="linear", cfun="acave")
ccsvm.opt$cost
ccsvm.opt$gamma
ccsvm.opt$s