plsRglm (version 0.3.3)

OLD_PLS_v2_vc: Former plsRglm function

Description

This function is a beta version of plsRglm with leave-one-out cross validation.

Usage

OLD_PLS_v2_vc(dataY, dataX, nt = 2, limQ2set = 0.0975, dataPredictY = dataX,
  modele = "pls", family = NULL, typeVC = "none", EstimXNA = FALSE,
  scaleX = TRUE, scaleY = NULL, pvals.expli = FALSE,
  alpha.pvals.expli = 0.05)

Arguments

dataY
response (training) dataset
dataX
predictor(s) (training) dataset
nt
number of components to be extracted
limQ2set
limit value for the Q2
dataPredictY
predictor(s) (testing) dataset
modele
name of the PLS model to be fitted ("pls", "pls-glm-gaussian", "pls-glm-logistic", "pls-glm-polr").
family
for the time being, the family argument is ignored and is set according to the value of modele.
typeVC
type of cross validation (only leave-one-out is available for now). Several procedures are available and may be forced: "none", "standard", "missingdata" and "adaptative".
EstimXNA
only for modele="pls". Set whether the missing X values have to be estimated.
scaleX
scale the predictor(s): must be set to TRUE for modele="pls" and should also be set to TRUE for the PLS-GLM models.
scaleY
scale the response: yes/no. Ignored since it is not always possible for glm responses.
pvals.expli
should individual p-values be reported to tune model selection?
alpha.pvals.expli
level of significance for predictors when pvals.expli=TRUE
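A minimal sketch of how the main arguments combine, assuming the pine data set shipped with the package (as used in the Examples below):

library(plsRglm)
data(pine)
# plain PLS with 2 components and "standard" leave-one-out cross validation
res <- OLD_PLS_v2_vc(dataY=log(pine[,11]), dataX=pine[,1:10], nt=2,
                     typeVC="standard", modele="pls")
res$CVinfos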

Value

  • Depends on the chosen model.

Warning

Deprecated function.

Details

Cross validation is not carried out the same way for PLS as for PLS-GLMs: for PLS, cross validation only assesses the effect of the last component, as in former SIMCA versions. There are four different models available: "pls", "pls-glm-gaussian", "pls-glm-logistic" and "pls-glm-polr".
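This is mirrored in the Examples below: assuming the pine data set is loaded, the same response can be fitted as a plain PLS model (leave-one-out results returned in $CVinfos) or as a gaussian PLS-GLM (information criteria returned in $InfCrit):

library(plsRglm)
data(pine)
# SIMCA-style leave-one-out CV on the last component for the plain PLS fit
OLD_PLS_v2_vc(log(pine[,11]), pine[,1:10], 4, typeVC="standard", modele="pls")$CVinfos
# same data fitted as a gaussian PLS-GLM; cross validation is handled differently
OLD_PLS_v2_vc(log(pine[,11]), pine[,1:10], 4, typeVC="standard", modele="pls-glm-gaussian")$InfCrit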

References

Nicolas Meyer, Myriam Maumy-Bertrand et Frédéric Bertrand (2010). Comparaison de la régression PLS et de la régression logistique PLS : application aux données d'allélotypage. Journal de la Société Française de Statistique, 151(2), pages 1-18. http://smf4.emath.fr/Publications/JSFdS/151_2/pdf/sfds_jsfds_151_2_1-18.pdf

See Also

plsR, plsRglm and PLS_glm_kfoldcv.

Examples

library(plsRglm)

# plsR and gaussian plsRglm on the pine data, with leave-one-out cross validation
data(pine)
Xpine<-pine[,1:10]
ypine<-pine[,11]
OLD_PLS_v2_vc(log(ypine),Xpine,4,typeVC="standard",modele="pls")$CVinfos
OLD_PLS_v2_vc(log(ypine),Xpine,4,typeVC="missingdata",modele="pls")$CVinfos

OLD_PLS_v2_vc(log(ypine),Xpine,4,typeVC="standard",modele="pls-glm-gaussian")$InfCrit
OLD_PLS_v2_vc(log(ypine),Xpine,4,typeVC="missingdata",modele="pls-glm-gaussian")$InfCrit


# logistic plsRglm on the aze_compl data
data(aze_compl)
Xaze_compl<-aze_compl[,2:34]
yaze_compl<-aze_compl$y
OLD_PLS_v2_vc(yaze_compl,Xaze_compl,10,modele="pls-glm-logistic",typeVC="none")$InfCrit
OLD_PLS_v2_vc(yaze_compl,Xaze_compl,10,modele="pls-glm-logistic",typeVC="standard")$InfCrit
OLD_PLS_v2_vc(yaze_compl,Xaze_compl,10,modele="pls-glm-logistic",typeVC="none",pvals.expli=TRUE)$valpvalstep


# ordinal regression (pls-glm-polr) on the bordeaux data
data(bordeaux)
Xbordeaux<-bordeaux[,1:4]
ybordeaux<-factor(bordeaux$Quality,ordered=TRUE)
OLD_PLS_v2_vc(as.numeric(ybordeaux),Xbordeaux,4,modele="pls",typeVC="standard")$CVinfos
OLD_PLS_v2_vc(ybordeaux,Xbordeaux,4,modele="pls-glm-polr",typeVC="none")$InfCrit


# plsR and gaussian plsRglm with missing data
XpineNAX21 <- Xpine
XpineNAX21[1,2] <- NA
cbind((log(ypine)),
      OLD_PLS_v2_vc(log(ypine),XpineNAX21,4,typeVC="none",modele="pls")$YChapeau,
      OLD_PLS_v2_vc(log(ypine),XpineNAX21,4,typeVC="none",modele="pls-glm-gaussian")$YChapeau)
cbind((log(ypine)),
      OLD_PLS_v2_vc(log(ypine),XpineNAX21,4,typeVC="none",modele="pls")$ValsPredictY,
      OLD_PLS_v2_vc(log(ypine),XpineNAX21,4,typeVC="none",modele="pls-glm-gaussian")$ValsPredictY)
rm("ypine","Xpine","XpineNAX21","yaze_compl","Xaze_compl","ybordeaux","Xbordeaux")