
bayesm (version 1.1-2)

rhierBinLogit: MCMC Algorithm for Hierarchical Binary Logit

Description

rhierBinLogit implements an MCMC algorithm for a hierarchical binary logit model with a normal heterogeneity distribution. This is a hybrid sampler with a random-walk (RW) Metropolis step for the unit-level logit parameters. rhierBinLogit is designed for use on choice-based conjoint data with partial profiles, where the design matrix is based on the differences of the characteristics of the two alternatives. See Appendix A of Bayesian Statistics and Marketing for details.
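
For example, with a paired-comparison task each row of a unit's design matrix X can be formed as the difference between the two alternatives' attribute vectors. A minimal sketch with made-up attribute codings:

altA=c(1,0,1)      ## hypothetical attribute coding of alternative A
altB=c(0,1,1)      ## hypothetical attribute coding of alternative B
xrow=altA-altB     ## one row of X: attributes of A minus attributes of B (y=1 if A is chosen)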

Usage

rhierBinLogit(Data, Prior, Mcmc)

Arguments

Data
list(lgtdata,Z) (note: Z is optional)
Prior
list(Deltabar,ADelta,nu,V) (note: all are optional)
Mcmc
list(sbeta,R,keep) (note: all but R are optional)
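
Only lgtdata and R are required; all other elements fall back to their defaults. A sketch of a minimal call, assuming lgtdata already holds each unit's y and X as described under Details:

## not run: lgtdata must be constructed first (see Details and Examples)
## out=rhierBinLogit(Data=list(lgtdata=lgtdata), Mcmc=list(R=10000))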

Value

a list containing:
  • Deltadraw: R/keep x nz*nvar matrix of draws of Delta
  • betadraw: nlgt x nvar x R/keep array of draws of the unit-level betas
  • Vbetadraw: R/keep x nvar*nvar matrix of draws of Vbeta
  • llike: R/keep vector of log-likelihood values
  • reject: R/keep vector of rejection rates over the nlgt units
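
For example, assuming out holds the result of the call in the Examples below, posterior means of the unit-level coefficients can be computed from the betadraw array (the 10% burn-in fraction here is only an illustrative choice):

burnin=floor(.1*dim(out$betadraw)[3])                   ## discard early draws (illustrative burn-in)
betabar=apply(out$betadraw[,,-(1:burnin)],c(1,2),mean)  ## nlgt x nvar matrix of posterior means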

concept

  • bayes
  • MCMC
  • hierarchical models
  • binary logit

Details

Model:

$y_{hi} = 1$ with probability $p_{hi} = \exp(x_{hi}'\beta_h)/(1+\exp(x_{hi}'\beta_h))$. $\beta_h$ is nvar x 1. h = 1, ..., length(lgtdata) cross-sectional units (or "respondents" for survey data).

$\beta_h = ZDelta[h,] + u_h$. Note: here ZDelta refers to Z%*%Delta and ZDelta[h,] is the hth row of this product. Delta is an nz x nvar matrix. $u_h \sim N(0, V_{\beta})$.

Priors:

$\delta = vec(\Delta) \sim N(vec(Deltabar), V_{\beta} \otimes ADelta^{-1})$

$V_{\beta} \sim IW(\nu, V)$

The list arguments contain:

Data = list(lgtdata, Z)
  • lgtdata: list of lists with the data for each cross-sectional unit
  • lgtdata[[h]]$y: $n_h$ vector of binary outcomes (0,1) for the hth unit
  • lgtdata[[h]]$X: $n_h$ x nvar design matrix for the hth unit
  • Z: nlgt x nz matrix of unit characteristics (optional)

Prior = list(Deltabar, ADelta, nu, V)
  • Deltabar: nz x nvar matrix of prior means (def: 0)
  • ADelta: prior precision matrix (def: .01*I)
  • nu: d.f. parameter for the IW prior on $V_{\beta}$ (def: nvar+3)
  • V: pds location parameter for the IW prior on $V_{\beta}$ (def: nu*I)

Mcmc = list(sbeta, R, keep)
  • sbeta: scaling parameter for the RW Metropolis (def: .2)
  • R: number of MCMC draws
  • keep: MCMC thinning parameter, keep every keep-th draw (def: 1)
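
As a reference, here is a sketch of the default prior settings listed above, written out for an illustrative problem with nvar=5 coefficients and nz=2 mixing-distribution regressors:

nvar=5; nz=2                          ## illustrative sizes only
Deltabar=matrix(0,nrow=nz,ncol=nvar)  ## prior mean of Delta (default: 0)
ADelta=.01*diag(nz)                   ## prior precision matrix (default: .01*I)
nu=nvar+3                             ## IW degrees of freedom (default: nvar+3)
V=nu*diag(nvar)                       ## IW location matrix (default: nu*I)
Prior=list(Deltabar=Deltabar,ADelta=ADelta,nu=nu,V=V)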

References

For further discussion, see Bayesian Statistics and Marketing by Rossi, Allenby, and McCulloch, Chapter 5. http://gsbwww.uchicago.edu/fac/peter.rossi/research/bsm.html

See Also

rhierMnlRwMixture

Examples

##  
## run a short test chain unless the LONG_TEST environment variable is set
if(nchar(Sys.getenv("LONG_TEST")) != 0) {R=10000} else {R=10}

set.seed(66)
nvar=5                           ## number of coefficients
nlgt=1000                        ## number of cross-sectional units
nobs=10                          ## number of observations per unit
nz=2                             ## number of regressors in mixing distribution

## set hyper-parameters
##     B=ZDelta + U  

Z=matrix(c(rep(1,nlgt),runif(nlgt,min=-1,max=1)),nrow=nlgt,ncol=nz)
Delta=matrix(c(-2,-1,0,1,2,-1,1,-.5,.5,0),nrow=nz,ncol=nvar)
iota=matrix(1,nrow=nvar,ncol=1)
Vbeta=diag(nvar)+.5*iota%*%t(iota)

## simulate data
lgtdata=NULL

for (i in 1:nlgt) 
{ beta=t(Delta)%*%Z[i,]+as.vector(t(chol(Vbeta))%*%rnorm(nvar))
  X=matrix(runif(nobs*nvar),nrow=nobs,ncol=nvar)
  prob=exp(X%*%beta)/(1+exp(X%*%beta)) 
  unif=runif(nobs,0,1)
  y=ifelse(unif<prob,1,0)
  lgtdata[[i]]=list(y=y,X=X,beta=beta)
}

Data1=list(lgtdata=lgtdata,Z=Z)
Mcmc1=list(R=R)
out=rhierBinLogit(Data=Data1,Mcmc=Mcmc1)

cat("Deltadraws ",fill=TRUE)
mat=apply(out$Deltadraw,2,quantile,probs=c(.01,.05,.5,.95,.99))
mat=rbind(as.vector(Delta),mat); rownames(mat)[1]="delta"; print(mat)
cat("Vbetadraws ",fill=TRUE)
mat=apply(out$Vbetadraw,2,quantile,probs=c(.01,.05,.5,.95,.99))
mat=rbind(as.vector(Vbeta),mat); rownames(mat)[1]="Vbeta"; print(mat)

## optional: trace plots of the Delta and Vbeta draws (not run by default)
if(0){
td=as.vector(Delta)
par(mfrow=c(2,2))
matplot(out$Deltadraw[,(1:nvar)],type="l")
abline(h=td[1:nvar],col=(1:nvar))
matplot(out$Deltadraw[,((nvar+1):(2*nvar))],type="l")
abline(h=td[(nvar+1):(2*nvar)],col=(1:nvar))
matplot(out$Vbetadraw[,c(1,7,13,19,25)],type="l")   ## diagonal elements of Vbeta (true value 1.5)
abline(h=1.5)
matplot(out$Vbetadraw[,-c(1,7,13,19,25)],type="l")  ## off-diagonal elements of Vbeta (true value .5)
abline(h=.5)
}
