bestglm (version 0.13)

CVDH: Adjusted K-fold Cross-Validation

Description

An adjustment to K-fold cross-validation is made to reduce the bias of the estimated prediction error.

Usage

CVDH(X, y, K = 10, REP = 1)

Arguments

X
matrix or data frame of training inputs
y
vector of training outputs (the response)
K
number of folds
REP
number of replications of the K-fold cross-validation
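
A minimal usage sketch with simulated data follows; the input names here are illustrative, and any numeric data frame of inputs with a matching response vector will do.

library(bestglm)
set.seed(1)
X <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
y <- 1 + X$x1 + rnorm(50)
CVDH(X, y, K = 10, REP = 1)   #returns c(cross-validation MSE, sd)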

Value

  • Vector of two components: the cross-validation MSE and its standard deviation, based on the MSEs in the individual validation samples.

Details

Algorithm 6.5 of Davison and Hinkley (1997, p. 295), an adjusted cross-validation estimate of prediction error, is implemented.
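
For intuition, below is a rough sketch of the adjustment for squared-error loss and a linear model, based on our reading of Algorithm 6.5: the raw K-fold estimate is corrected by adding the apparent error of the full-data fit and subtracting a fold-size-weighted average of the errors of each fold-deleted fit evaluated on all of the data. This is an illustration of the idea, not a transcription of CVDH's internals, and the helper name adjCV is made up.

#Illustrative sketch of adjusted K-fold CV (squared-error loss, linear model).
adjCV <- function(X, y, K = 10) {
  n <- length(y)
  dat <- data.frame(X, y = y)
  folds <- sample(rep(1:K, length.out = n))   #random balanced fold labels
  fitFull <- lm(y ~ ., data = dat)
  appErr <- mean(residuals(fitFull)^2)        #apparent error of full-data fit
  cvRaw <- 0
  adj <- 0
  for (k in 1:K) {
    inFold <- folds == k
    fitK <- lm(y ~ ., data = dat[!inFold, ])  #fit with fold k deleted
    predAll <- predict(fitK, newdata = dat)
    #hold-out squared error for fold k (raw K-fold CV contribution)
    cvRaw <- cvRaw + sum((y[inFold] - predAll[inFold])^2)
    #fold-deleted fit evaluated on *all* data, weighted by fold size
    adj <- adj + (sum(inFold) / n) * mean((y - predAll)^2)
  }
  cvRaw / n + appErr - adj   #adjusted CV estimate
}

Evaluating each fold-deleted fit on all n cases, rather than only on its own hold-out fold, is what distinguishes the adjusted estimate from ordinary K-fold CV.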

References

Davison, A.C. and Hinkley, D.V. (1997). Bootstrap Methods and their Application. Cambridge University Press.

See Also

bestglm, CVHTF, CVd, LOOCV

Examples

#Example 1. Variability in 10-fold CV with the Davison-Hinkley algorithm.
#Plot the CVs obtained by using 10-fold CV on the best subset
#model of size 2 for the prostate data. We assume the best model is
#the model with the first two inputs and then compute the CVs
#using 10-fold CV, 100 times. The result is summarized by a boxplot
#as well as the sd.
library(bestglm)
NUMSIM <- 100
data(zprostate)
#column 10 of zprostate flags the training cases; drop it after subsetting
train <- zprostate[zprostate[, 10], -10]
X <- train[, 1:2]   #first two inputs: lcavol, lweight
y <- train[, 9]     #response: lpsa
cvs <- numeric(NUMSIM)
set.seed(123321123)
for (isim in 1:NUMSIM)
    cvs[isim] <- CVDH(X, y, K = 10, REP = 1)[1]
boxplot(cvs)
sd(cvs)
#The CV MSE is about 60.4 with sd 0.129;
#the resulting 95% c.i. is (60.2, 60.7)
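
For comparison, the same simulation can be run with the unadjusted K-fold estimator CVHTF listed in See Also. We assume here that its first four arguments match CVDH's; check ?CVHTF for the exact signature before relying on this.

#Unadjusted 10-fold CV via CVHTF (argument layout assumed; see ?CVHTF)
cvsHTF <- numeric(NUMSIM)
set.seed(123321123)
for (isim in 1:NUMSIM)
    cvsHTF[isim] <- CVHTF(X, y, K = 10, REP = 1)[1]
sd(cvsHTF)   #compare location and spread with the CVDH results above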
