rrum (version 0.2.0)

rrum_helper: Gibbs sampler to estimate the rRUM

Description

Obtains samples from the posterior distribution for the reduced Reparametrized Unified Model (rRUM).

Usage

rrum_helper(Y, Q, delta0, chain_length = 10000L, as = 1, bs = 1,
  ag = 1, bg = 1)

Value

A list containing:

  • PISTAR A matrix where each column represents one draw from the posterior distribution of pistar.

  • RSTAR A \(J x K x chain_length\) array, where \(J\) represents the number of items and \(K\) represents the number of attributes. Each slice represents one draw from the posterior distribution of rstar.

  • PI A matrix where each column represents one draw from the posterior distribution of pi.

  • ALPHA An \(N x K x chain_length\) array, where \(N\) represents the number of individuals and \(K\) represents the number of attributes. Each slice represents one draw from the posterior distribution of alpha.

Arguments

Y

A matrix with \(N\) rows and \(J\) columns indicating the individuals' responses to each of the items.

Q

A matrix with \(J\) rows and \(K\) columns indicating which attributes are required to answer each of the items. An entry of 1 indicates attribute \(k\) is required to answer item \(j\); an entry of 0 indicates attribute \(k\) is not required.
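For concreteness, a minimal Q matrix might be built as below (illustrative values only, with \(J = 3\) items and \(K = 2\) attributes; this mirrors, in smaller form, the Q matrix in the Examples section):

```r
# Illustrative Q matrix: 3 items, 2 attributes
Q <- rbind(c(1, 0),  # item 1 requires attribute 1 only
           c(0, 1),  # item 2 requires attribute 2 only
           c(1, 1))  # item 3 requires both attributes

rowSums(Q)  # number of attributes required per item: 1 1 2
```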

chain_length

A numeric indicating the number of iterations of the Gibbs sampler to be run. The default is 10000.

as

A numeric, parameter for the prior distribution of pistar. High values of as encourage higher values of pistar and lower values of rstar.

bs

A numeric, parameter for the prior distribution of pistar. High values of bs encourage lower values of pistar and higher values of rstar.

ag

A numeric, parameter for the prior distribution of rstar. High values of ag encourage higher values of rstar.

bg

A numeric, parameter for the prior distribution of rstar. High values of bg encourage lower values of rstar.
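This page does not spell out the prior family, but in the rRUM literature the pairs (as, bs) and (ag, bg) are used as Beta shape parameters; under that assumption, their ratio controls where the prior mass sits, as sketched below:

```r
# Sketch assuming Beta(as, bs) on pistar and Beta(ag, bg) on rstar
# (the usual choice in the rRUM literature); the prior mean is a / (a + b).
beta_mean <- function(a, b) a / (a + b)

beta_mean(1, 1)  # 0.5: the defaults as = bs = ag = bg = 1 give a uniform prior
beta_mean(5, 1)  # raising `as` pulls the pistar prior toward 1
beta_mean(1, 5)  # raising `bg` pulls the rstar prior toward 0
```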

delta0

A vector, parameters for the Dirichlet prior on pi.
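Since pi is a distribution over the \(2^K\) latent attribute classes, the Dirichlet parameter vector needs one entry per class; a flat prior is a common default (illustrative sketch only):

```r
# Flat Dirichlet prior over the 2^K attribute classes (K = 2 here)
K <- 2
delta0 <- rep(1, 2 ^ K)
delta0  # 1 1 1 1: each of the four classes is weighted equally a priori
```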

Author

Steven Andrew Culpepper, Aaron Hudson, and James Joseph Balamuta

References

Culpepper, S. A. & Hudson, A. (In Press). An improved strategy for Bayesian estimation of the reduced reparameterized unified model. Applied Psychological Measurement.

Hudson, A., Culpepper, S. A., & Douglas, J. (2016, July). Bayesian estimation of the generalized NIDA model with Gibbs sampling. Paper presented at the annual International Meeting of the Psychometric Society, Asheville, North Carolina.

Examples

# Set seed for reproducibility
set.seed(217)

## Define Simulation Parameters

N = 1000 # Number of Individuals
J = 6    # Number of Items
K = 2    # Number of Attributes

# Matrix where rows represent attribute classes
As = attribute_classes(K) 

# Latent Class probabilities
pis = c(.1, .2, .3, .4) 

# Q Matrix
Q = rbind(c(1, 0),
          c(0, 1),
          c(1, 0),
          c(0, 1),
          c(1, 1),
          c(1, 1)
    )
    
# The probabilities of answering each item correctly for individuals
# who do not lack any required attribute
pistar = rep(.9, J)

# Penalties for failing to have each of the required attributes
rstar  = .5 * Q

# Randomized alpha profiles (there are 2^K possible attribute classes)
alpha  = As[sample(1:(2 ^ K), N, replace = TRUE, prob = pis), ]

# Simulate data
rrum_items = simcdm::sim_rrum_items(Q, rstar, pistar, alpha)

if (FALSE) {
# Note: This portion of the code is computationally intensive.

# Recover simulation parameters with Gibbs Sampler
Gibbs.out = rrum(rrum_items, Q)

# Iterations to be discarded from chain as burnin
burnin = 1:5000 

# Calculate summaries of posterior distributions
rstar.mean  = with(Gibbs.out, apply(RSTAR[, , -burnin], c(1, 2), mean))
pistar.mean = with(Gibbs.out, apply(PISTAR[, -burnin], 1, mean))
pis.mean    = with(Gibbs.out, apply(PI[, -burnin], 1, mean))
}
