pls1b: Univariate Partial Least Squares Regression
Description
Performs univariate partial least squares (PLS) regression of a response
vector on a matrix of explanatory variables using the Orthogonal Loadings
Algorithm.
Usage
pls1b(X, y, K=min(dx[1]-1,dx[2]))
Arguments
X
Matrix of explanatory variables. Each column represents a variable and
each row an observation. The columns of this matrix are assumed to have been
centred. The number of rows of X should equal the number of observations in
y.
y
Vector of responses. y is assumed to have been centred.
NAs and Infs are not allowed.
K
Number of PLS factors to fit in the PLS regression. This must
be less than or equal to the rank of X. The default is
min(dx[1]-1, dx[2]), where dx = dim(X).
Value
A vector of regression coefficients, one for each column of X.
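A minimal illustration of the calling convention above. The simulated data,
the random seed and the choice K = 3 are purely illustrative; note that X and
y are centred explicitly before the call, as the function assumes.

  set.seed(1)
  X <- matrix(rnorm(20 * 5), nrow = 20, ncol = 5)    # 20 observations, 5 variables
  y <- drop(X %*% c(2, -1, 0, 0, 1)) + rnorm(20, sd = 0.1)

  Xc <- scale(X, center = TRUE, scale = FALSE)       # centre the columns of X
  yc <- y - mean(y)                                  # centre the response

  beta <- pls1b(Xc, yc, K = 3)                       # vector of regression coefficients
  fitted <- Xc %*% beta                              # fitted values for the centred data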
Details
Univariate Partial Least Squares Regression is an example of a
regularised regression method. It creates a lower dimensional
representation of the original explanatory variables and uses this
representation in an ordinary least squares regression of the response
variable (cf. Principal Components Regression). Unlike Principal
Components Regression, PLS regression chooses this lower dimensional
representation with reference to the response variable y.
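To make the construction concrete, the following is a short R sketch of the
closely related orthogonal scores (NIPALS) recursion for univariate PLS. It
illustrates the idea described above and is not the code used by pls1b; the
function name pls1.sketch is invented for this example. Up to numerical
differences, the orthogonal scores and orthogonal loadings formulations yield
the same regression coefficients.

  pls1.sketch <- function(Xc, yc, K) {
    ## Xc: centred matrix of explanatory variables, yc: centred response
    p <- ncol(Xc)
    W <- matrix(0, p, K)          # weight vectors
    P <- matrix(0, p, K)          # X loadings
    q <- numeric(K)               # y loadings
    E <- Xc
    f <- yc
    for (k in 1:K) {
      w <- drop(crossprod(E, f))  # direction chosen with reference to the response
      w <- w / sqrt(sum(w^2))
      t_k <- drop(E %*% w)        # score: one column of the lower dimensional representation
      P[, k] <- drop(crossprod(E, t_k)) / sum(t_k^2)
      q[k] <- sum(f * t_k) / sum(t_k^2)    # least squares fit of the residual response on the score
      E <- E - tcrossprod(t_k, P[, k])     # deflate the explanatory variables
      f <- f - t_k * q[k]                  # deflate the response
      W[, k] <- w
    }
    drop(W %*% solve(crossprod(P, W), q))  # coefficients on the original centred variables
  }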
References
Denham, M. C. (1994).
Implementing partial least squares.
Statistics and Computing (to appear).
Helland, I. S. (1988).
On the structure of partial least squares regression.
Communications in Statistics, 17, 581-607.
Martens, H. and Naes, T. (1989).
Multivariate Calibration.
Wiley, New York.