# glpls1a.mlogit


##### Fit MIRWPLS and MIRWPLSF model

Fit multi-logit Iteratively ReWeighted Least Squares (MIRWPLS) with an option of Firth's bias reduction procedure (MIRWPLSF) for multi-group classification

Keywords
regression
##### Usage
glpls1a.mlogit(x, y, K.prov = NULL, eps = 0.001, lmax = 100, b.ini = NULL, denom.eps = 1e-20, family = "binomial", link = "logit", br = T)
##### Arguments
x
n by p design matrix (with intercept term)
y
response vector with class labels 1 to C+1 for (C+1)-group classification; the baseline class should be 1
K.prov
number of PLS components
eps
tolerance for convergence
lmax
maximum number of iteration allowed
b.ini
initial value of regression coefficients
denom.eps
small quantity to guarantee a nonzero denominator when deciding convergence
family
glm family; "binomial" (i.e. multinomial here) is the only relevant one here
link
link function; "logit" is the only one practically implemented now
br
TRUE if Firth's bias reduction procedure is used

##### Value

coefficients
regression coefficient matrix
convergence
whether convergence is achieved
niter
total number of iterations
bias.reduction
whether Firth's procedure is used

##### References

• Ding, B.Y. and Gentleman, R. (2003) Classification using generalized partial least squares.
• Marx, B.D (1996) Iteratively reweighted partial least squares estimation for generalized linear regression. Technometrics 38(4): 374-381.

##### See Also

glpls1a, glpls1a.mlogit.cv.error, glpls1a.train.test.error, glpls1a.cv.error

##### Aliases
• glpls1a.mlogit
##### Examples
x <- matrix(rnorm(20), ncol = 2)
y <- sample(1:3, 10, TRUE)
## no bias reduction and 1 PLS component
glpls1a.mlogit(cbind(rep(1, 10), x), y, K.prov = 1, br = FALSE)
## bias reduction
glpls1a.mlogit(cbind(rep(1, 10), x), y, br = TRUE)
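The components documented under "Value" can be inspected on the returned object. A minimal sketch, assuming the fit is stored and accessed as a list with those component names:

```r
## Reproducible toy data: 10 observations, 2 predictors, 3 classes
set.seed(1)
x <- matrix(rnorm(20), ncol = 2)
y <- sample(1:3, 10, TRUE)

## Fit with 1 PLS component and no bias reduction, keeping the result
fit <- glpls1a.mlogit(cbind(rep(1, 10), x), y, K.prov = 1, br = FALSE)

fit$coefficients   # regression coefficient matrix (one column per non-baseline class)
fit$convergence    # whether convergence was achieved within lmax iterations
fit$niter          # total number of iterations used
```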

Documentation reproduced from package gpls, version 1.44.0, License: Artistic-2.0
