gpls (version 1.44.0)

gpls: A function to fit generalized partial least squares models.

Description

Partial least squares is a commonly used dimension reduction technique. The paradigm can be extended to include generalized linear models in several different ways. The code in this function uses the extension proposed by Ding and Gentleman, 2004.

Usage

gpls(x, ...)
"gpls"(x, y, K.prov=NULL, eps=1e-3, lmax=100, b.ini=NULL, denom.eps=1e-20, family="binomial", link=NULL, br=TRUE, ...)
"gpls"(formula, data, contrasts=NULL, K.prov=NULL, eps=1e-3, lmax=100, b.ini=NULL, denom.eps=1e-20, family="binomial", link=NULL, br=TRUE, ...)

Arguments

x
The matrix of covariates.
formula
A formula of the form 'y ~ x1 + x2 + ...', where y is the response and the other terms are covariates.
y
The vector of responses.
data
A data.frame in which to resolve the formula, if one is supplied.
K.prov
The number of PLS components; the default is the rank of X.
eps
The tolerance used to declare convergence.
lmax
The maximum number of iterations allowed.
b.ini
Initial values for the regression coefficients.
denom.eps
A small quantity used to guarantee a nonzero denominator when deciding convergence.
family
The glm family; binomial is the only one relevant here.
link
The link function; logit is the only one practically implemented at present.
br
If TRUE, Firth's bias reduction procedure is used.
...
Additional arguments.
contrasts
An optional list. See the contrasts.arg argument of model.matrix.default.
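
Both interfaces take the same fitting arguments. The sketch below is not part of the package documentation: the object names fit.f, fit.m, x and y are illustrative, and the explicit 0/1 recoding of the response for the matrix interface is a cautious assumption rather than a documented requirement. It assumes the gpls and MASS packages are installed.

library(gpls)
library(MASS)                              # provides the Pima.tr data

## Formula interface: response and covariates are resolved from a data.frame
fit.f <- gpls(type ~ ., data = Pima.tr, K.prov = 3)

## Matrix interface: numeric covariate matrix plus a response vector
x <- as.matrix(Pima.tr[, names(Pima.tr) != "type"])
y <- as.numeric(Pima.tr$type) - 1          # two classes coded as 0/1
fit.m <- gpls(x, y, K.prov = 3)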

Value

An object of class gpls with the following components:
coefficients
The estimated coefficients.
convergence
A logical value indicating whether convergence was achieved.
niter
The total number of iterations.
bias.reduction
A logical value indicating whether Firth's bias reduction procedure was used.
family
The family argument that was passed in.
link
The link argument that was passed in.
terms
The constructed terms object.
call
The call.
levs
The factor levels for prediction.
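
The components listed above can be read from the fitted object directly. The sketch below assumes list-style $ extraction (an assumption about the object's internal representation, suggested by the component list) and refits the model used in the Examples section.

library(gpls)
library(MASS)                         # provides the Pima.tr data
m1 <- gpls(type ~ ., data = Pima.tr, K.prov = 3)

m1$coefficients       # estimated regression coefficients
m1$convergence        # whether the iterative fit converged
m1$niter              # number of iterations used
m1$bias.reduction     # whether Firth's procedure was applied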

Details

This is an alternative interface to the functionality provided by glpls1a. The interface is intended to be simpler to use and more consistent with other machine learning code in R.

The methodology is intended for two-class problems where there are more predictors than cases. If a response variable (y) with more than two levels is supplied, the behavior may be unexpected.
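
A rough, invented illustration of that two-class, more-predictors-than-cases setting is sketched below; the simulated data, variable names and choice of K.prov are not from the package documentation, and on noisy random data the fit may need a larger lmax or a different K.prov to converge.

library(gpls)
set.seed(1)
n <- 20; p <- 50                       # fewer cases than predictors
x <- matrix(rnorm(n * p), n, p)
colnames(x) <- paste0("v", seq_len(p))
y <- factor(rep(c("a", "b"), length.out = n))
dat <- data.frame(y = y, x)

## PLS compresses the 50 covariates into 2 components before the
## (optionally bias-reduced) logistic fit, so p > n is not fatal
fit <- gpls(y ~ ., data = dat, K.prov = 2)
fit$convergence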

References

  • Ding, B.Y. and Gentleman, R. (2003) Classification using generalized partial least squares.
  • Marx, B.D. (1996) Iteratively reweighted partial least squares estimation for generalized linear regression. Technometrics 38(4): 374-381.

See Also

glpls1a

Examples

library(gpls)
library(MASS)                          # provides the Pima.tr data
m1 <- gpls(type ~ ., data = Pima.tr, K.prov = 3)
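
A possible continuation, assuming the package provides a predict method for gpls objects (the levs component in the Value section suggests one exists); the structure of the returned prediction is not documented here, so it is inspected with str() rather than assumed.

p1 <- predict(m1, Pima.te)             # Pima.te is the MASS test set
str(p1)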
