
pls (version 0.1-2)

svdpls1a: Univariate Partial Least Squares Regression

Description

Performs univariate partial least squares (PLS) regression of a vector on a matrix of explanatory variables using the Orthogonal Scores Algorithm.

Usage

svdpls1a(X, y, K=r)

Arguments

X
Matrix of explanatory variables. Each column represents a variable and each row an observation. The columns of this matrix are assumed to have been centred. The number of rows of X should equal the number of observations in y.
y
Vector of responses. y is assumed to have been centred. NAs and Infs are not allowed.
K
Number of PLS factors to fit in the PLS regression. This must be less than or equal to the rank of X.
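Since K may not exceed the rank of X, one way to guard against an invalid choice is to compute the numerical rank of the centred matrix first. The lines below are only an illustrative sketch (using the USArrests data from the Examples section), not part of the package:

X <- scale(as.matrix(USArrests[, c("Murder", "Assault", "UrbanPop")]), scale = FALSE)
r <- qr(X)$rank   # numerical rank of the centred explanatory matrix
K <- min(2, r)    # never request more PLS factors than the rank allows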

Value

  • a vector of regression coefficients
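Because X and y are assumed centred, the returned coefficients relate the centred response to the centred explanatory variables. Assuming beta is the vector returned by svdpls1a, fitted values on the original scale can be recovered by adding back the response mean, as in this minimal sketch:

X0 <- scale(as.matrix(USArrests[, c("Murder", "Assault", "UrbanPop")]), scale = FALSE)
y0 <- USArrests$Rape - mean(USArrests$Rape)        # centred response
beta <- svdpls1a(X0, y0, 2)                        # vector of regression coefficients
yhat <- drop(X0 %*% beta) + mean(USArrests$Rape)   # fitted values on the original scale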

Details

Univariate Partial Least Squares Regression is an example of a regularised regression method. Like Principal Components Regression, it constructs a lower-dimensional representation of the original explanatory variables and uses this representation in an ordinary least squares regression of the response. Unlike Principal Components Regression, however, PLS regression chooses the lower-dimensional representation with reference to the response variable y.
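
For intuition, the following is a compact sketch of univariate PLS in the orthogonal scores (NIPALS) form, written in plain R. It is only illustrative; svdpls1a itself uses an SVD-based variant, so this is not the package's implementation.

pls1_scores <- function(X, y, K) {
  X <- as.matrix(X); y <- as.numeric(y)            # assumed already centred
  W <- P <- matrix(0, ncol(X), K)                  # weight and loading vectors
  q <- numeric(K)
  Xk <- X; yk <- y
  for (k in seq_len(K)) {
    w <- drop(crossprod(Xk, yk)); w <- w / sqrt(sum(w^2))   # weight vector
    tk <- drop(Xk %*% w)                                    # score vector
    p <- drop(crossprod(Xk, tk)) / sum(tk^2)                # X loadings
    q[k] <- sum(tk * yk) / sum(tk^2)                        # y loading
    Xk <- Xk - tcrossprod(tk, p)                            # deflate X
    yk <- yk - tk * q[k]                                    # deflate y
    W[, k] <- w; P[, k] <- p
  }
  drop(W %*% solve(crossprod(P, W), q))   # coefficients for the centred data
}

For a fixed number of factors the PLS1 coefficients do not depend on the algorithm used to compute them, so pls1_scores applied to the centred USArrests data should agree, up to numerical error, with the svdpls1a call in the Examples.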

References

Denham, M. C. (1994). Implementing partial least squares. Statistics and Computing (to appear).

Helland, I. S. (1988). On the structure of partial least squares regression. Communications in Statistics, 17, pp. 581-607.

Martens, H. and Naes, T. (1989). Multivariate Calibration. Wiley, New York.

See Also

pls1a, pls1b, pls1c, svdpls1b, svdpls1c

Examples

library(pls)   # package providing svdpls1a
data(USArrests)
attach(USArrests)
# centre (but do not rescale) the explanatory variables and the response,
# then fit a univariate PLS regression with two factors
svdpls1a(scale(cbind(Murder, Assault, UrbanPop), scale = FALSE),
         scale(Rape, scale = FALSE), 2)
