OrdinalLogisticBiplot (version 0.4)

pordlogist: Ordinal logistic regression with ridge penalization

Description

This function fits a logistic regression of a dependent ordinal variable y on a set of independent variables x, using ridge penalization to overcome the separation problem.

Usage

pordlogist(y, x, penalization = 0.1, tol = 1e-04, maxiter = 200, show = FALSE)

Arguments

y
Dependent variable.
x
A matrix with the independent variables.
penalization
Penalization used to avoid singularities.
tol
Tolerance for the iterations.
maxiter
Maximum number of iterations.
show
Should the iteration history be printed?
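As a sketch of how these arguments fit together (using hypothetical simulated data; `pordlogist` itself comes from this package):

```r
library(OrdinalLogisticBiplot)

# Hypothetical data: an ordinal response with 3 categories and
# two continuous predictors.
set.seed(1)
x <- matrix(rnorm(200), ncol = 2)
y <- cut(x[, 1] + rnorm(100), breaks = 3, labels = FALSE)

# Mild ridge penalization; show = TRUE prints the iteration history.
fit <- pordlogist(y, x, penalization = 0.1, tol = 1e-4,
                  maxiter = 200, show = TRUE)
```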

Value

An object of class "pordlogist". This has components:
nobs
Number of observations
J
Maximum value of the dependent variable
nvar
Number of independent variables
fitted.values
Matrix with the fitted probabilities
pred
Predicted values for each item
Covariances
Covariance matrix of the estimates
clasif
Classification matrix of the items
PercentClasif
Percentage of correct classifications
coefficients
Estimated coefficients for the ordinal logistic regression
thresholds
Thresholds of the estimated model
logLik
Logarithm of the likelihood
penalization
Penalization used to avoid singularities
Deviance
Deviance of the model
DevianceNull
Deviance of the null model
Dif
Difference between the two deviance values
df
Degrees of freedom
pval
p-value of the test
CoxSnell
Cox-Snell pseudo R squared
Nagelkerke
Nagelkerke pseudo R squared
MacFaden
McFadden pseudo R squared
iter
Number of iterations made
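The components above can be inspected directly on the returned object; for instance (assuming `model` was fitted as in the Examples section):

```r
# Hypothetical inspection of a fitted "pordlogist" object:
model$coefficients   # estimated slopes
model$thresholds     # category thresholds
model$PercentClasif  # percentage of correct classifications
model$Nagelkerke     # Nagelkerke pseudo R squared
```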

Details

The problem of the existence of maximum likelihood estimators in logistic regression is studied in Albert & Anderson (1984); a solution for the binary case, based on Firth's bias-reduction method (Firth, 1993), is proposed by Heinze & Schemper (2002). Those procedures were initially developed to remove bias, but they also work well to avoid the separation problem. Here we adopt a simpler solution based on ridge estimators for logistic regression (Le Cessie & Van Houwelingen, 1992).

Rather than maximizing $L_j(\mathbf{G} \mid \mathbf{b}_{j0}, \mathbf{B}_j)$, we maximize the penalized likelihood

$$L_j(\mathbf{G} \mid \mathbf{b}_{j0}, \mathbf{B}_j) - \lambda \left( \left\| \mathbf{b}_{j0} \right\|^2 + \left\| \mathbf{B}_j \right\|^2 \right)$$

Varying $\lambda$ yields slightly different solutions that are not affected by the separation problem.
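The effect of $\lambda$ can be explored by refitting with different `penalization` values and comparing the estimates (a sketch, reusing the data and biplot fit from the Examples section):

```r
# Compare coefficient estimates across several ridge penalties.
data(LevelSatPhd)
datanom <- CheckDataSet(LevelSatPhd)$datanom
olb <- OrdinalLogBiplotEM(datanom, dim = 2, nnodos = 10,
                          tol = 0.001, maxiter = 100, penalization = 0.2)
for (lambda in c(0.05, 0.1, 0.5)) {
  m <- pordlogist(datanom[, 1], olb$RowCoordinates, penalization = lambda)
  cat("lambda =", lambda, " coefficients:",
      round(m$coefficients, 3), "\n")
}
```

Larger penalties shrink the coefficients toward zero, which is what keeps the estimates finite under separation.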

References

Albert, A. & Anderson, J. A. (1984), On the existence of maximum likelihood estimates in logistic regression models, Biometrika 71(1), 1--10.

Bull, S. B., Mak, C. & Greenwood, C. M. (2002), A modified score function for multinomial logistic regression, Computational Statistics and Data Analysis 39, 57--74.

Firth, D. (1993), Bias reduction of maximum likelihood estimates, Biometrika 80(1), 27--38.

Heinze, G. & Schemper, M. (2002), A solution to the problem of separation in logistic regression, Statistics in Medicine 21, 2409--2419.

Le Cessie, S. & Van Houwelingen, J. (1992), Ridge estimators in logistic regression, Applied Statistics 41(1), 191--201.

See Also

OrdinalLogBiplotEM, CheckDataSet

Examples

data(LevelSatPhd)
# Check and recode the data set before fitting.
dataSet = CheckDataSet(LevelSatPhd)
datanom = dataSet$datanom
# Estimate the ordinal logistic biplot by the EM algorithm.
olb = OrdinalLogBiplotEM(datanom, dim = 2, nnodos = 10,
            tol = 0.001, maxiter = 100, penalization = 0.2)
# Regress the first variable on the row coordinates of the biplot.
model = pordlogist(datanom[, 1], olb$RowCoordinates, tol = 0.001,
        maxiter = 100, penalization = 0.2)
model
