mcgibbsit(data, q=0.025, r=0.0125, s=0.95, converge.eps=0.001, correct.cor=TRUE)
mcgibbsit returns an object of class mcgibbsit.

mcgibbsit computes the minimum run length $Nmin$, required burn-in $M$, total run length $N$, run-length inflation due to auto-correlation, $I$, and run-length inflation due to between-chain correlation, $R$, for a set of exchangeable MCMC simulations which need not be independent.

The normal usage is to perform an initial MCMC run of some pre-determined length (e.g., 300 iterations) for each of a set of $k$ (e.g., $k=20$) MCMC samplers. The output from these samplers is then read in to create an mcmc.list object, and mcgibbsit is run specifying the desired accuracy of estimation for the quantiles of interest. This returns the minimum number of iterations needed to achieve the specified error bound. The set of MCMC samplers is then run so that the total number of iterations exceeds this minimum, and mcgibbsit is called again. This should continue until the number of iterations already complete exceeds the minimum number computed by mcgibbsit.
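The iterative workflow above can be sketched as follows. This is a minimal sketch, not the package's documented recipe: run_sampler is a hypothetical stand-in for a real MCMC sampler, and it assumes the returned object exposes a resmatrix component with a "Total" column (analogous to coda::raftery.diag); check the actual component names on your installed version.

```r
library(coda)        # mcmc(), mcmc.list()
library(mcgibbsit)   # mcgibbsit()

set.seed(1)
# Toy stand-in for a real MCMC sampler: n draws for two parameters.
run_sampler <- function(n) {
  matrix(rnorm(2 * n), ncol = 2,
         dimnames = list(NULL, c("alpha", "beta")))
}

k      <- 20    # number of exchangeable samplers
n.iter <- 300   # pilot run length per sampler

repeat {
  # Collect the k pilot runs into an mcmc.list and run the diagnostic.
  chains <- mcmc.list(lapply(1:k, function(i) mcmc(run_sampler(n.iter))))
  res    <- mcgibbsit(chains)

  # Assumed component: a resmatrix with a "Total" column giving the
  # required total run length for each monitored parameter.
  needed <- max(res$resmatrix[, "Total"])

  if (k * n.iter >= needed) break   # enough iterations already complete
  n.iter <- ceiling(needed / k)     # otherwise lengthen the runs and repeat
}
```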
If the initial number of iterations in data is too small to perform the calculations, an error message is printed indicating the minimum pilot run length.
The parameters q, r, s, converge.eps, and correct.cor can be supplied as vectors. This causes mcgibbsit to produce a list of results, with one element for each set of values. For example, setting q=c(0.025, 0.975), r=c(0.0125, 0.005) will yield a list containing two mcgibbsit objects, one computed with parameters q=0.025, r=0.0125, and the other with q=0.975, r=0.005.
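The vectorized call described above can be sketched as follows (a minimal sketch: the mcmc.list here is built from random draws purely so the call is self-contained; a real analysis would use sampler output, e.g. from read.mcmc):

```r
library(coda)        # mcmc(), mcmc.list()
library(mcgibbsit)   # mcgibbsit()

set.seed(1)
# Three chains of 2000 draws for two parameters, as an mcmc.list.
data <- mcmc.list(lapply(1:3, function(i) {
  mcmc(matrix(rnorm(4000), ncol = 2,
              dimnames = list(NULL, c("alpha", "beta"))))
}))

# One call with vectorized q and r: yields a list of two mcgibbsit
# objects, the first computed with q=0.025, r=0.0125 and the second
# with q=0.975, r=0.005.
results <- mcgibbsit(data, q = c(0.025, 0.975), r = c(0.0125, 0.005))
length(results)   # 2
```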
Warnes, G.W. (2004). The Normal Kernel Coupler: An adaptive MCMC method for efficiently sampling from multi-modal distributions. https://digital.lib.washington.edu/researchworks/handle/1773/9541

Warnes, G.W. (2000). Multi-Chain and Parallel Algorithms for Markov Chain Monte Carlo. Dissertation, Department of Biostatistics, University of Washington. https://www.stat.washington.edu/research/reports/2001/tr395.pdf
Raftery, A.E. and Lewis, S.M. (1992). One long run with diagnostics: Implementation strategies for Markov chain Monte Carlo. Statistical Science, 7, 493-497.
Raftery, A.E. and Lewis, S.M. (1995). The number of iterations, convergence diagnostics and generic Metropolis algorithms. In Practical Markov Chain Monte Carlo (W.R. Gilks, D.J. Spiegelhalter and S. Richardson, eds.). London, U.K.: Chapman and Hall.
See also: read.mcmc
# this is a totally useless example, but it does exercise the code:
# write 20 chains of correlated draws to CSV files, read them back in
# as an mcmc.list, and run the diagnostic
set.seed(42)
for(i in 1:20){
  x <- matrix(rnorm(1000), ncol=4)
  # make the fourth column depend on the first three
  x[,4] <- x[,4] + 1/3 * (x[,1] + x[,2] + x[,3])
  colnames(x) <- c("alpha", "beta", "gamma", "nu")
  write.csv(x, file=paste("mcmc", i, "csv", sep="."), row.names=FALSE)
}
data <- read.mcmc(20, "mcmc.#.csv", sep=",")
mcgibbsit(data)