MCMCpack (version 0.4-5)

MCMCordfactanal: Markov chain Monte Carlo for Ordinal Data Factor Analysis Model

Description

This function generates a posterior density sample from an ordinal data factor analysis model. Normal priors are assumed on the factor loadings and factor scores while improper uniform priors are assumed on the cutpoints. The user supplies data and parameters for the prior distributions, and a sample from the posterior density is returned as an mcmc object, which can be subsequently analyzed with functions provided in the coda package.

Usage

MCMCordfactanal(x, factors, lambda.constraints=list(),
                data=list(), burnin = 1000, mcmc = 10000,
                thin=5, tune=NA, verbose = FALSE, seed = 0,
                lambda.start = NA, l0=0, L0=0,
                store.lambda=TRUE, store.scores=FALSE,
                drop.constantvars=TRUE, ... )

Arguments

x
Either a formula or a numeric matrix containing the manifest variables.
factors
The number of factors to be fitted.
lambda.constraints
List of lists specifying possible equality or simple inequality constraints on the factor loadings. A typical entry in the list has one of three forms: varname=list(d,c), which constrains the dth loading for the variable named varname to be equal to c; varname=list(d,"+"), which constrains the dth loading for the variable named varname to be positive; and varname=list(d,"-"), which constrains the dth loading for the variable named varname to be negative.
data
A data frame.
burnin
The number of burn-in iterations for the sampler.
mcmc
The number of iterations for the sampler.
thin
The thinning interval used in the simulation. The number of iterations must be divisible by this value.
tune
The tuning parameter for the Metropolis-Hastings sampling. Can be either a scalar or a $k$-vector. Must be strictly positive.
verbose
A switch which determines whether or not the progress of the sampler is printed to the screen. If TRUE, the iteration number and the Metropolis-Hastings acceptance rate are printed to the screen.
seed
The seed for the random number generator. The code uses the Mersenne Twister, which requires an integer as an input. If nothing is provided, the Scythe default seed is used.
lambda.start
Starting values for the factor loading matrix Lambda. If lambda.start is set to a scalar, the starting value for all unconstrained loadings will be set to that scalar. If lambda.start is a matrix of the same dimensions as Lambda, then that matrix is used as the starting values, with the exception of equality-constrained elements, which are set to their constrained values.
l0
The means of the independent Normal prior on the factor loadings. Can be either a scalar or a matrix with the same dimensions as Lambda.
L0
The precisions (inverse variances) of the independent Normal prior on the factor loadings. Can be either a scalar or a matrix with the same dimensions as Lambda.
store.lambda
A switch that determines whether or not to store the factor loadings for posterior analysis. By default, the factor loadings are all stored.
store.scores
A switch that determines whether or not to store the factor scores for posterior analysis. NOTE: This takes an enormous amount of memory, so it should only be used if the chain is thinned heavily, or for applications with a small number of observations.
drop.constantvars
A switch that determines whether or not observations and manifest variables that have no variation should be deleted before fitting the model. Default = TRUE.
...
Further arguments to be passed.
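As a sketch, a lambda.constraints list using all three constraint forms might look like the following; the variable names X1, X2, X3 and the loading index 2 are hypothetical:

```r
## Hypothetical constraint list (variable names and index are illustrative only)
constraints <- list(X1 = list(2, 0),    # fix X1's 2nd loading to 0 (equality)
                    X2 = list(2, "+"),  # constrain X2's 2nd loading to be positive
                    X3 = list(2, "-"))  # constrain X3's 2nd loading to be negative
```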

Value

  • An mcmc object that contains the posterior density sample. This object can be summarized by functions provided by the coda package.
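A minimal sketch of coda-based analysis of an mcmc object; a toy two-parameter chain stands in here for actual MCMCordfactanal output:

```r
library(coda)
set.seed(1)
## Toy chain standing in for model output; column names are illustrative
chain <- mcmc(matrix(rnorm(2000), ncol = 2,
                     dimnames = list(NULL, c("lambda1", "lambda2"))))
summary(chain)        # posterior means, standard deviations, and quantiles
effectiveSize(chain)  # effective sample size per parameter
geweke.diag(chain)    # Geweke convergence diagnostic
```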

Details

The model takes the following form:

Let $i=1,\ldots,N$ index observations and $j=1,\ldots,K$ index response variables within an observation. The typical observed variable $x_{ij}$ is ordinal with a total of $C_j$ categories. The distribution of $X$ is governed by a $N \times K$ matrix of latent variables $X^*$ and a series of cutpoints $\gamma$. $X^*$ is assumed to be generated according to: $$x^*_i = \Lambda \phi_i + \epsilon_i$$ $$\epsilon_i \sim \mathcal{N}(0,I)$$

where $x^*_i$ is the $k$-vector of latent variables specific to observation $i$, $\Lambda$ is the $k \times d$ matrix of factor loadings, and $\phi_i$ is the $d$-vector of latent factor scores. It is assumed that the first element of $\phi_i$ is equal to 1 for all $i$.

The probability that the $j$th variable in observation $i$ takes the value $c$ is:

$$\pi_{ijc} = \Phi(\gamma_{jc} - \Lambda'_j\phi_i) - \Phi(\gamma_{j(c-1)} - \Lambda'_j\phi_i)$$ The implementation used here assumes independent conjugate priors for each element of $\Lambda$ and each $\phi_i$. More specifically we assume:

$$\Lambda_{ij} \sim \mathcal{N}(l_{0_{ij}}, L_{0_{ij}}^{-1}), i=1,\ldots,k, j=1,\ldots,d$$

$$\phi_i \sim \mathcal{N}(0, I), i=1,\dots,n$$

The standard two-parameter item response theory model with probit link is a special case of the model sketched above. MCMCordfactanal simulates from the posterior density using standard Metropolis-Hastings within Gibbs sampling. The algorithm employed is based on work by Cowles (1996). Note that the first element of $\phi_i$ is a 1; as a result, the first column of $\Lambda$ can be interpreted as item difficulty parameters. Further, the first element $\gamma_1$ is normalized to zero and is thus not returned in the mcmc object. The simulation proper is done in compiled C++ code to maximize efficiency. Please consult the coda documentation for a comprehensive list of functions that can be used to analyze the posterior density sample.
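The category probability $\pi_{ijc}$ defined above can be sketched numerically with pnorm; the loadings, scores, and cutpoints below are purely illustrative values, not model output:

```r
## Illustrative values for one item j and one observation i
Lambda_j <- c(-0.5, 1.1)          # row of loadings; first entry acts as the difficulty term
phi_i    <- c(1, 0.3)             # factor scores; first element fixed at 1
eta      <- sum(Lambda_j * phi_i) # linear predictor Lambda_j' phi_i
gamma    <- c(-Inf, 0, 1.2, Inf)  # cutpoints for a 3-category item; first finite cutpoint is 0
## pi_ijc = Phi(gamma_jc - eta) - Phi(gamma_j(c-1) - eta)
pi_ijc   <- pnorm(gamma[-1] - eta) - pnorm(gamma[-length(gamma)] - eta)
sum(pi_ijc)                       # the category probabilities sum to 1
```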

References

Shawn Treier and Simon Jackman. 2003. ``Democracy as a Latent Variable." Paper presented at the Midwest Political Science Association Annual Meeting.

M. K. Cowles. 1996. ``Accelerating Monte Carlo Markov Chain Convergence for Cumulative-link Generalized Linear Models." Statistics and Computing. 6: 101-110.

Valen E. Johnson and James H. Albert. 1999. Ordinal Data Modeling. Springer: New York.

Andrew D. Martin, Kevin M. Quinn, and Daniel Pemstein. 2003. Scythe Statistical Library 0.4. http://scythe.wustl.edu.

Martyn Plummer, Nicky Best, Kate Cowles, and Karen Vines. 2002. Output Analysis and Diagnostics for MCMC (CODA). http://www-fis.iarc.fr/coda/.

See Also

plot.mcmc, summary.mcmc, factanal, MCMCfactanal, MCMCirt1d, MCMCirtKd

Examples

data(painters)
   new.painters <- painters[,1:4]
   cuts <- apply(new.painters, 2, quantile, c(.25, .50, .75))
   ## Recode each score into four ordered categories by quartile. The original
   ## scores are all below 100, so each assignment only touches values that
   ## have not yet been recoded:
   for (i in 1:4){
      new.painters[new.painters[,i]<cuts[1,i],i] <- 100  # below 1st quartile
      new.painters[new.painters[,i]<cuts[2,i],i] <- 200  # 1st to 2nd quartile
      new.painters[new.painters[,i]<cuts[3,i],i] <- 300  # 2nd to 3rd quartile
      new.painters[new.painters[,i]<100,i] <- 400        # above 3rd quartile
   }

   posterior <- MCMCordfactanal(~Composition+Drawing+Colour+Expression,
                        data=new.painters, factors=1,
                        lambda.constraints=list(Drawing=list(2,"+")),
                        burnin=5000, mcmc=500000, thin=200, verbose=TRUE,
                        L0=0.5, store.lambda=TRUE,
                        store.scores=TRUE, tune=.6)
   plot(posterior)
   summary(posterior)
