
fda.usc (version 0.9.4)

GCV.S: The generalized cross-validation (GCV) score.

Description

Computes the generalized cross-validation (GCV) score for a matrix of curves smoothed with a given smoothing matrix S.

Usage

GCV.S(y, S, criteria = "Rice", W = diag(1, ncol = ncol(S), nrow = nrow(S)),
      trim = 0, draw = FALSE, ...)

Arguments

y
Matrix of observed values with dimension (n x m), where n is the number of curves and m is the number of points observed on each curve.
S
Smoothing matrix.
criteria
Penalizing function. The default is "Rice". Possible values are "GCV", "AIC", "FPE", "Shibata" and "Rice".
W
Matrix of weights.
trim
The alpha of the trimming (proportion of curves trimmed).
draw
If TRUE, draws the curves, the sample median and the trimmed mean.
...
Further arguments passed to or from other methods.
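
As an illustration of the W and trim arguments, a hedged sketch using the phoneme data from the Examples below; the particular weights are hypothetical, chosen only to show the call shape (by default W is the identity matrix with the same dimension as S):

library(fda.usc)
data(phoneme)
tt <- phoneme$learn[["argvals"]]
S <- S.NW(tt, 2.5)                  # Nadaraya-Watson smoothing matrix
# Hypothetical weights: down-weight the first five evaluation points
w <- rep(1, nrow(S))
w[1:5] <- 0.5
GCV.S(phoneme$learn, S, criteria = "GCV", W = diag(w), trim = 0.05)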

Value

  • res: the GCV score calculated for the input parameters.

Details

The GCV score is

$$GCV(h) = p(h) \, \Xi(n^{-1}h^{-1})$$

where

$$p(h) = \frac{1}{n} \sum_{i=1}^{n}{\Big(y_i - r_{i}(x_i)\Big)^{2} w(x_i)}$$

and the penalty function $$\Xi()$$ can be selected from the following criteria:

Generalized cross-validation (GCV): $$\Xi_{GCV}(n^{-1}h^{-1}) = (1 - n^{-1}S_{ii})^{-2}$$

Akaike's information criterion (AIC): $$\Xi_{AIC}(n^{-1}h^{-1}) = \exp(2n^{-1}S_{ii})$$

Finite prediction error (FPE): $$\Xi_{FPE}(n^{-1}h^{-1}) = \frac{1 + n^{-1}S_{ii}}{1 - n^{-1}S_{ii}}$$

Shibata's model selector (Shibata): $$\Xi_{Shibata}(n^{-1}h^{-1}) = 1 + 2n^{-1}S_{ii}$$

Rice's bandwidth selector (Rice): $$\Xi_{Rice}(n^{-1}h^{-1}) = (1 - 2n^{-1}S_{ii})^{-1}$$
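
For concreteness, a minimal base-R sketch of this computation (not the package's implementation; it reads $$n^{-1}S_{ii}$$ as tr(S)/n, an assumption made for illustration):

set.seed(1)
x <- seq(0, 1, length.out = 50)
y <- sin(2 * pi * x) + rnorm(50, sd = 0.2)
h <- 0.1
K <- exp(-outer(x, x, "-")^2 / (2 * h^2))  # Gaussian kernel weights
S <- K / rowSums(K)                        # smoothing matrix: rows sum to 1
n <- length(y)
p.h <- mean((y - S %*% y)^2)               # p(h): mean squared residual
xi.gcv <- (1 - sum(diag(S)) / n)^(-2)      # GCV penalty Xi
p.h * xi.gcv                               # the GCV score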

References

Wasserman, L. (2006). All of Nonparametric Statistics. Springer Texts in Statistics.

Hardle, W. (1994). Applied Nonparametric Regression. Cambridge University Press.

See Also

See also min.np. Alternative method: CV.S.

Examples

library(fda.usc)
data(phoneme)
mlearn <- phoneme$learn              # fdata object containing the learning curves
tt <- mlearn[["argvals"]]            # points at which the curves are observed
# Smoothing matrices for three nonparametric smoothers, bandwidth 2.5
S1 <- S.NW(tt, 2.5)                  # Nadaraya-Watson kernel smoother
S2 <- S.LLR(tt, 2.5)                 # local linear regression
S3 <- S.KNN(tt, 2.5)                 # k-nearest-neighbour smoother
# GCV scores under the default "Rice" criterion
gcv1 <- GCV.S(mlearn, S1)
gcv2 <- GCV.S(mlearn, S2)
gcv3 <- GCV.S(mlearn, S3)
gcv4 <- GCV.S(mlearn, S1, criteria = "AIC")           # AIC penalty instead
gcv5 <- GCV.S(mlearn, S1, trim = 0.01, draw = TRUE)   # trimmed score, with plot
gcv1; gcv2; gcv3; gcv4; gcv5
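
A natural follow-up is to select the bandwidth by minimizing the score over a grid, which min.np automates; a hand-rolled sketch, assuming GCV.S returns a single numeric score as described under Value:

h.grid <- seq(0.5, 10, by = 0.5)     # candidate bandwidths
scores <- sapply(h.grid, function(h) GCV.S(mlearn, S.NW(tt, h)))
h.opt <- h.grid[which.min(scores)]   # bandwidth minimizing the GCV score
h.opt; min(scores)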
