glpls1a.cv.error

Leave-one-out cross-validation error using IRWPLS and IRWPLSF models

Computes the leave-one-out cross-validation (LOOCV) classification error on the training set when an IRWPLS or IRWPLSF model is fitted for two-group classification.

Keywords
regression
Usage
glpls1a.cv.error(train.X, train.y, K.prov=NULL, eps=1e-3, lmax=100, family="binomial", link="logit", br=TRUE)
Arguments
train.X
n by p design matrix (with no intercept term) for training set
train.y
response vector (0 or 1) for training set
K.prov
number of PLS components; the default is the rank of train.X
eps
tolerance for convergence
lmax
maximum number of iterations allowed
family
GLM family; "binomial" is the only one relevant here
link
link function; "logit" is the only one currently implemented
br
logical; if TRUE, Firth's bias reduction procedure is used
Details
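For each training observation the model is refitted on the remaining n - 1 observations and then used to classify the observation that was left out; the reported error is the fraction of training observations misclassified in this way. Below is a minimal sketch of that procedure, using glpls1a as the fitter; the component name fit$coefficients and the intercept-first coefficient ordering are assumptions made for illustration only.

 loocv.sketch <- function(X, y, K = NULL, br = TRUE) {
   n <- nrow(X)
   miss <- logical(n)
   for (i in seq_len(n)) {
     ## refit on all observations except the i-th
     fit <- glpls1a(X[-i, , drop = FALSE], y[-i], K.prov = K, br = br)
     ## linear predictor for the left-out observation
     ## (assumes an intercept as the first fitted coefficient)
     eta <- sum(c(1, X[i, ]) * fit$coefficients)
     ## classify by the sign of the logit and record misclassifications
     miss[i] <- (eta > 0) != (y[i] == 1)
   }
   list(error = mean(miss), error.obs = which(miss))
 }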

Value

error
LOOCV training error
error.obs
indices of the misclassified training observations

References

  • Ding, B.Y. and Gentleman, R. (2003) Classification using generalized partial least squares.
  • Marx, B.D. (1996) Iteratively reweighted partial least squares estimation for generalized linear regression. Technometrics 38(4): 374-381.

See Also

glpls1a.train.test.error, glpls1a.mlogit.cv.error, glpls1a, glpls1a.mlogit, glpls1a.logit.all

Aliases
  • glpls1a.cv.error
Examples
 library(gpls)

 ## simulate a small two-group training set: 10 observations, 2 predictors
 set.seed(1)
 x <- matrix(rnorm(20), ncol = 2)
 y <- sample(0:1, 10, replace = TRUE)

 ## LOOCV training error without bias reduction
 glpls1a.cv.error(x, y, br = FALSE)
 ## LOOCV training error with Firth's bias reduction and 1 PLS component
 glpls1a.cv.error(x, y, K.prov = 1, br = TRUE)
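 ## inspect the returned components (assuming, as described under Value,
 ## that the result is a list with components `error` and `error.obs`)
 res <- glpls1a.cv.error(x, y, K.prov = 1, br = TRUE)
 res$error      # LOOCV misclassification rate on the training set
 res$error.obs  # indices of the misclassified training observations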
Documentation reproduced from package gpls, version 1.44.0, License: Artistic-2.0
