fda.usc (version 1.5.0)

GCV.S: The generalized cross-validation (GCV) score.

Description

Computes the generalized cross-validation (GCV) score of a smoothing matrix applied to a set of curves.

Usage

GCV.S(y,S,criteria="GCV",W=NULL,trim=0,
      draw=FALSE,metric=metric.lp,...)

Arguments

y

Matrix of curves with dimension (n x m), where n is the number of curves and m is the number of points observed in each curve.

S

Smoothing matrix; see S.NW, S.LLR or S.KNN.

criteria

The penalizing function. By default the "GCV" criterion is used. Possible values are "GCV", "AIC", "FPE", "Shibata" and "Rice".

W

Matrix of weights.

trim

The trimming proportion (alpha).

draw

If TRUE, plots the curves, the sample median and the trimmed mean.

metric

Metric function, by default metric.lp.

...

Further arguments passed to or from other methods.

Value

res

The GCV score computed for the input parameters.

Details

$$GCV(h)=p(h) \Xi(n^{-1}h^{-1})$$

Where:

A. If trim = 0: $$p(h)=\left\|\sqrt{W}\left(y_i-\hat{y}_{i}\right)\right\|$$

B. If trim > 0: $$p(h)=\frac{1}{l} \sum_{i=1}^{l}{\Big(y_i-r_{i}(x_i)\Big)^{2}w(x_i)}$$ where \(l\) indexes the (1-trim) proportion of curves with the smallest errors.

Here \(h\) is the bandwidth parameter, \(w\) are the weights, and the penalty function \(\Xi\) can be selected from the following criteria:

  • Generalized Cross-validation (GCV): $$\Xi_{GCV}(n^{-1}h^{-1})=(1-n^{-1}S_{ii})^{-2}$$

  • Akaike's Information Criterion (AIC): $$\Xi_{AIC}(n^{-1}h^{-1})=exp(2n^{-1}S_{ii})$$

  • Finite Prediction Error (FPE): $$\Xi_{FPE}(n^{-1}h^{-1})=\frac{(1+n^{-1}S_{ii})}{(1-n^{-1}S_{ii})}$$

  • Shibata's model selector (Shibata): $$\Xi_{Shibata}(n^{-1}h^{-1})=(1+2n^{-1}S_{ii})$$

  • Rice's bandwidth selector (Rice): $$\Xi_{Rice}(n^{-1}h^{-1})=(1-2n^{-1}S_{ii})^{-1}$$

where \(S_{ii}\) is the \(i\)th diagonal element of the smoothing matrix \(S\); see S.NW, S.LLR or S.KNN.
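
As a rough illustration of how these pieces fit together, the sketch below computes a GCV-type score by hand. It is an assumption-laden approximation, not the package implementation: it takes trim = 0, unit weights, curves stored as the rows of y, and reads the \(n^{-1}S_{ii}\) term as the average diagonal element of S.

gcv_sketch <- function(y, S, criteria = "GCV") {
  y.hat <- y %*% t(S)                 # smoothed curves (one per row of y)
  p.h <- mean((y - y.hat)^2)          # residual term p(h), unit weights
  nu <- mean(diag(S))                 # stands in for n^{-1} S_{ii} (assumed normalization)
  pen <- switch(criteria,             # penalty Xi from the list above
                GCV     = (1 - nu)^(-2),
                AIC     = exp(2 * nu),
                FPE     = (1 + nu) / (1 - nu),
                Shibata = 1 + 2 * nu,
                Rice    = (1 - 2 * nu)^(-1))
  p.h * pen                           # GCV(h) = p(h) * Xi
}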

References

Wasserman, L. All of Nonparametric Statistics. Springer Texts in Statistics, 2006.

Hardle, W. Applied Nonparametric Regression. Cambridge University Press, 1994.

Febrero-Bande, M., Oviedo de la Fuente, M. (2012). Statistical Computing in Functional Data Analysis: The R Package fda.usc. Journal of Statistical Software, 51(4), 1-28. http://www.jstatsoft.org/v51/i04/

See Also

See also min.np. Alternative method: CV.S.

Examples

data(phoneme)
mlearn <- phoneme$learn                      # phoneme log-periodogram curves
tt <- 1:ncol(mlearn)                         # discretization points
S1 <- S.NW(tt, 2.5)                          # Nadaraya-Watson smoothing matrix, h = 2.5
S2 <- S.LLR(tt, 2.5)                         # local linear smoothing matrix, h = 2.5
gcv1 <- GCV.S(mlearn, S1)                    # default "GCV" criterion
gcv2 <- GCV.S(mlearn, S2)
gcv3 <- GCV.S(mlearn, S1, criteria = "AIC")
gcv4 <- GCV.S(mlearn, S2, criteria = "AIC")
gcv1; gcv2; gcv3; gcv4
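
# A further sketch (not part of the original example): choosing the bandwidth
# by minimizing the GCV score over a grid of candidate values. The grid is
# illustrative; min.np automates this kind of search.
h.grid <- seq(1, 10, by = 0.5)
gcv.nw <- sapply(h.grid, function(h) GCV.S(mlearn, S.NW(tt, h)))
h.grid[which.min(gcv.nw)]                    # bandwidth with the smallest GCV score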
 