BayesLogit (version 0.6.1)

mlogit: Bayesian Multinomial Logistic Regression

Description

Run a Bayesian multinomial logistic regression.

Usage

mlogit(y, X, n=rep(1,nrow(as.matrix(y))),
       m.0=array(0, dim=c(ncol(X), ncol(y))),
       P.0=array(0, dim=c(ncol(X), ncol(X), ncol(y))),
       samp=1000, burn=500)

Arguments

y

An N x J-1 dimensional matrix; \(y_{ij}\) is the average response for category j at \(x_i\).

X

An N x P dimensional design matrix; \(x_i\) is the ith row.

n

An N dimensional vector; \(n_i\) is the total number of observations at each \(x_i\).

m.0

A P x J-1 matrix of prior means; column j is the prior mean of \(\beta_j\).

P.0

A P x P x J-1 array of prior precision matrices; slice j is the prior precision of \(\beta_j\). A construction sketch follows this argument list.

samp

The number of MCMC iterations saved.

burn

The number of MCMC iterations discarded.
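
For instance, a weakly informative normal prior on each \(\beta_j\) can be assembled as follows (a minimal sketch of the expected shapes; the sizes P and J and the precision 0.01 are arbitrary illustrative choices, not package defaults):

## Illustrative prior construction for P predictors and J categories.
P = 5; J = 3                                   # arbitrary sizes for the sketch
m.0 = array(0, dim=c(P, J-1))                  # prior mean of each beta_j is the zero vector
P.0 = array(0, dim=c(P, P, J-1))
for (j in 1:(J-1)) P.0[,,j] = diag(0.01, P)    # precision 0.01 * I, i.e. a N(0, 100 I) prior
## out = mlogit(y, X, m.0=m.0, P.0=P.0)        # y and X as in the Examples section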

Value

mlogit returns a list.

beta

A samp x P x J-1 array; the posterior sample of the regression coefficients (a summary sketch follows this list).

w

A samp x N' x J-1 array; the posterior sample of the latent variables. WARNING: N' may be less than N if data are combined.

y

The response matrix; may differ from the input if data are combined.

X

The design matrix; may differ from the input if data are combined.

n

The number of responses at each \(x_i\); may differ from the input if data are combined.
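
For example, posterior means and credible intervals of the coefficients can be read off the returned beta array (a sketch assuming out is the list returned by the call in the Examples section):

## Summarize the samp x P x J-1 array of posterior draws.
beta.mean = apply(out$beta, c(2,3), mean)       # P x J-1 matrix of posterior means
beta.ci   = apply(out$beta, c(2,3), quantile,   # 2 x P x J-1 array of 95% interval endpoints
                  probs=c(0.025, 0.975))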

Details

Multinomial logistic regression is a classification mechanism. Given the multinomial data \(\{y_i\}\) with J categories and the P-dimensional predictor variables \(\{x_i\}\), one wants to forecast the category of a future data point \(y^*\) at the predictor \(x^*\). Multinomial logistic regression stipulates that the probability of observing category j after rolling the multinomial die \(n^*=1\) time is governed by

$$ P(y^* = j | x^*, \beta, n^*=1) = e^{x^* \beta_j} / \sum_{k=1}^J e^{x^* \beta_k}. $$

Instead of representing data as the total number of responses in each category, one may record the average number of responses in each category and the total number of responses \(n_i\) at \(x_i\). We follow this method of encoding data.
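
For example, raw data given as an N x J matrix of per-category counts can be put into this encoding as follows (a sketch; counts is a hypothetical matrix of counts, not an object produced by the package):

## Convert hypothetical per-category counts to average responses.
n = rowSums(counts)                  # total number of responses at each x_i
y = (counts / n)[, -ncol(counts)]    # average response per category; drop category J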

We assume that \(\beta_J = 0\) for purposes of identification!

You may use mlogit for binary logistic regression with a normal prior.
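
Because \(\beta_J = 0\), a posterior draw of the category probabilities at a new predictor can be recovered by appending a zero column to the sampled coefficients (a sketch; out is the fitted object from the Examples section and x.star is a hypothetical 1 x P predictor row):

## Category probabilities at x.star under the s-th posterior draw.
s      = 1
beta.s = cbind(out$beta[s, , ], 0)       # P x J: append the zero coefficients of category J
eta    = as.numeric(x.star %*% beta.s)   # linear predictor for each category
probs  = exp(eta) / sum(exp(eta))        # multinomial logit probabilities (sum to one)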

References

Nicholas G. Polson, James G. Scott, and Jesse Windle. Bayesian inference for logistic models using Polya-Gamma latent variables. http://arxiv.org/abs/1205.0310

See Also

rpg, logit.EM, logit

Examples

## Use the iris dataset.
data(iris)
N = nrow(iris)                # number of observations
J = nlevels(iris$Species)     # number of response categories

X     = model.matrix(Species ~ ., data=iris)     # N x P design matrix (intercept + predictors)
P     = ncol(X)                                  # number of coefficients per category
y.all = model.matrix(~ Species - 1, data=iris)   # N x J matrix of category indicators
y     = y.all[,-J]                               # drop category J (its coefficients are fixed at 0)

out = mlogit(y, X, samp=1000, burn=100)
