calibrator (version 1.2-8)

MH: Very basic implementation of the Metropolis-Hastings algorithm

Description

A very basic implementation of the Metropolis-Hastings algorithm, using a multivariate Gaussian proposal distribution. Useful for sampling from p.eqn8.supp().

Usage

MH(n, start, sigma, pi)

Arguments

n

Number of samples to take

start

Starting point of the chain (a numeric vector)

sigma

Variance matrix of the Gaussian proposal kernel

pi

Function proportional to the desired sampling pdf (it need not be normalized)

Value

Returns a matrix whose rows are samples from \(\pi()\). Note that the first few rows are “burn-in” and should be discarded
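
For example (a minimal sketch; the burn-in length of 200 is an assumption, and in practice should be chosen by inspecting trace plots of the chain), the burn-in rows can be dropped like this, reusing pi.gaussian from the Examples below:

x <- MH(n=5000, start=c(0,0,0), sigma=diag(3), pi=pi.gaussian)
burn <- 200                      # assumed burn-in length
x.post <- x[-seq_len(burn), ]    # discard the burn-in rows
colMeans(x.post)                 # estimate of the mean of pi()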

Details

This is a basic implementation. The proposal distribution is Gaussian and centred on the current point: \(q(\cdot\mid X)=N(X,\Sigma)\), where \(\Sigma\) is the variance matrix sigma
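
The algorithm itself is short to state. The following is a minimal sketch of a random-walk Metropolis sampler with such a Gaussian proposal, illustrative only and not the package's actual source; the name rw.metropolis is made up here:

rw.metropolis <- function(n, start, sigma, pi) {
  d <- length(start)
  out <- matrix(NA, nrow = n, ncol = d)
  current <- start
  R <- chol(sigma)                              # t(R) %*% R == sigma
  for (i in seq_len(n)) {
    proposal <- current + drop(rnorm(d) %*% R)  # draw from N(current, sigma)
    alpha <- pi(proposal) / pi(current)         # symmetric proposal, so plain Metropolis ratio
    if (runif(1) < alpha) current <- proposal   # accept with probability min(1, alpha)
    out[i, ] <- current                         # record the state, accepted or not
  }
  out
}

This is the standard random-walk Metropolis recipe; the package's MH() may differ in details such as how it initialises or stores the chain.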

References

  • W. R. Gilks et al. 1996. Markov Chain Monte Carlo in Practice. Chapman and Hall. ISBN 0-412-05551-1

  • N. Metropolis et al. 1953. Equation of state calculations by fast computing machines. The Journal of Chemical Physics, volume 21, number 6, pages 1087-1092

See Also

p.eqn8.supp

Examples

# First, a trivariate Gaussian:
A <- diag(3) + 0.7
quad.form <- function(M,x){drop(crossprod(crossprod(M,x),x))}
pi.gaussian <- function(x){exp(-quad.form(A/2,x))}
x.gauss <- MH(n=1000, start=c(0,0,0),sigma=diag(3),pi=pi.gaussian)
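# The target exp(-quad.form(A/2,x)) = exp(-x'Ax/2) is proportional to a
# N(0, solve(A)) density, so cov(x.gauss) should be close to solve(A) and
# the elementwise ratio below should be roughly 1 throughout: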
cov(x.gauss)/solve(A)  # Should be (approximately) a matrix of 1s.


# Now something a bit weirder:
pi.triangle <- function(x){
  1*as.numeric( (abs(x[1])<1.0) & (abs(x[2])<1.0) ) +
  5*as.numeric( (abs(x[1])<0.5) & (abs(x[2])<0.5) ) *
    as.numeric(x[1]>x[2])
}
x.tri <- MH(n=100,start=c(0,0),sigma=diag(2),pi=pi.triangle)
plot(x.tri,main="Try with a higher n")


# Now a Gaussian mixture model:
pi.2gauss <- function(x){
  exp(-quad.form(A/2,x)) +
  exp(-quad.form(A/2,x+c(2,2,2)))
}
x.2 <- MH(n=100,start=c(0,0,0),sigma=diag(3),pi=pi.2gauss)

# Not run automatically: 3D scatterplot of the mixture samples
p3d(x.2, theta=44, d=1e4, d0=1, main="Try with more points")
