
sns (version 1.0.0)

sns: Stochastic Newton Sampler (SNS)

Description

SNS is a Metropolis-Hastings MCMC sampler with a multivariate Gaussian proposal resulting from a local, second-order Taylor-series expansion of the log-density. The mean of the Gaussian proposal is identical to the full Newton-Raphson step from the current point. During burn-in, Newton-Raphson optimization can be performed to get close to the mode of the pdf, which is unique due to log-concavity, resulting in faster convergence. For high-dimensional densities, state-space partitioning can be used to improve mixing.
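The proposal construction described above can be sketched in a few lines of base R. This is a conceptual illustration, not the package internals: given the log-density f, gradient g, and Hessian h at the current point x, the proposal is N(mu, Sigma) with Sigma = -h^{-1} and mu equal to the Newton step x - h^{-1} g. The function and variable names below are illustrative.

```r
# Sketch of the SNS proposal at a point x (assumes h is negative
# definite, i.e. the log-density is concave at x):
sns.proposal <- function(x, fghEval, ...) {
  fgh <- fghEval(x, ...)
  Sigma <- solve(-fgh$h)          # proposal covariance: -H^{-1}
  mu <- x + Sigma %*% fgh$g       # Newton step: x - H^{-1} g
  list(mu = as.vector(mu), Sigma = Sigma)
}

# For a quadratic log-density (a Gaussian target N(m, I)), the
# Newton step lands exactly on the mode m:
fgh.quad <- function(x) {
  m <- c(1, -2)
  list(f = -0.5 * sum((x - m)^2), g = -(x - m), h = -diag(2))
}
prop <- sns.proposal(c(0, 0), fgh.quad)
```

For this Gaussian target the second-order expansion is exact, so `prop$mu` equals the mode `c(1, -2)` and `prop$Sigma` the identity matrix.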

Usage

sns(x, fghEval, rnd=TRUE, gfit=NULL, mh.diag = FALSE, part = NULL, ...)
sns.run(init, fghEval, niter = 100, nnr = min(10, round(niter/4))
  , mh.diag = FALSE, part = NULL
  , print.level = 0, report.progress = ceiling(niter/10), ...)
## S3 method for class 'sns':
print(x, ...)

Arguments

x
For sns, the current state vector. For print.sns, an object of class sns.
fghEval
Function that evaluates the log-density, its gradient, and its Hessian at any point. Must return a list with named elements: f, the log-probability density; g, the gradient vector; h, the Hessian matrix.
rnd
If TRUE, runs one Metropolis-Hastings iteration (stochastic or 'mcmc' mode) to draw a sample. If FALSE, runs one iteration of the Newton-Raphson optimization method (non-stochastic or 'nr' mode).
gfit
Gaussian fit at the current state x. If NULL, then sns will compute a Gaussian fit at x.
mh.diag
Boolean flag, indicating whether detailed MH diagnostics such as components of acceptance test must be returned or not.
part
List describing partitioning of the state space into subsets. Each element of the list must be an integer vector containing a set of indexes (between 1 and length(x) or length(init)) indicating which dimensions of the state space belong to that subset.
init
Starting state for sns.run.
niter
Number of iterations to perform (in 'nr' and 'mcmc' mode).
nnr
Number of Newton-Raphson (non-stochastic) iterations to perform at the beginning.
print.level
If greater than 0, print sampling progress report.
report.progress
Number of sampling iterations to wait before printing progress reports.
...
Arguments passed to fghEval in the case of sns and sns.run.
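To make the fghEval and part arguments concrete, here is an illustrative fghEval for a standard-normal log-density (up to an additive constant), together with a hand-built partition list. The names fghEval.norm and part are hypothetical, chosen for this sketch.

```r
# fghEval must return a list with labels f, g, and h:
fghEval.norm <- function(x) {
  list(f = -0.5 * sum(x^2),    # log-density (up to a constant)
       g = -x,                 # gradient vector
       h = -diag(length(x)))   # Hessian matrix
}

# A partition of a 5-dimensional state space into two subsets;
# each element lists the dimensions that are updated together:
part <- list(1:2, 3:5)
```

With this structure, `sns(x, fghEval.norm, part = part)` would update dimensions 1:2 and 3:5 in separate Metropolis-Hastings steps.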

Value

  • sns.run returns an object of class sns with elements:
  • samplesMat: A matrix object with nsample rows and K columns.
  • acceptance: Metropolis proposal acceptance percentage.
  • burn.iters: Number of burn-in iterations.
  • sample.time: Time in seconds spent in sampling.
  • burnin.time: Time in seconds spent in burn-in.
  • sns returns the sample drawn as a vector, with attributes:
  • accept: A boolean indicating whether the proposal was accepted.
  • ll: Value of the log-pdf at the sampled point.
  • gfit: List containing the Gaussian fit to the pdf at the sampled point.

References

Mahani, Alireza S. and Sharabiani, Mansour T.A. (2013) Metropolis-Hastings Sampling Using Multivariate Gaussian Tangents. http://arxiv.org/pdf/1308.0657v1.pdf

Qi, Y. and Minka, T.P. (2002) Hessian-Based Markov Chain Monte Carlo Algorithms.

See Also

ess, summary.sns, plot.sns, predict.sns

Examples

# use this library for constructing the
# log-likelihood and its gradient and Hessian
library(RegressionFactory)

# simulate data for logistic regression
N <- 1000 # number of observations
K <- 5 # number of attributes
X <- matrix(runif(N*K, min = -0.5, max = +0.5), ncol = K)
beta <- runif(K, min = -0.5, max = +0.5)
y <- 1*(runif(N) < 1/(1 + exp(- X %*% beta)))

# define log-likelihood using expander framework of
# RegressionFactory package
loglike <- function(beta, X, y) {
  regfac.expand.1par(beta, X, y, fbase1.binomial.logit, fgh = 2, n = 1)
}

# SNS sampling
nsmp <- 100
beta.init <- rep(0.0, K)
sns.out <- sns.run(beta.init, loglike, niter = 200, X = X, y = y)
summary(sns.out)
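The Metropolis-Hastings step that sns performs can also be illustrated with a hand-rolled loop in base R, without the package. This is a didactic sketch under one simplifying assumption: the target is exactly Gaussian, so the second-order Taylor expansion is exact, the proposal coincides with the target, and every draw is accepted. All names below are illustrative.

```r
# Hand-rolled SNS-style chain on a bivariate standard-normal target.
set.seed(1)
fgh <- function(x) list(f = -0.5 * sum(x^2), g = -x, h = -diag(2))

x <- c(3, -3); accepts <- 0; niter <- 500
smp <- matrix(NA_real_, niter, 2)
# log proposal density up to a constant (determinant terms cancel
# here because the proposal covariance is the same at every point):
lq <- function(p, m, S) -0.5 * t(p - m) %*% solve(S) %*% (p - m)
for (i in 1:niter) {
  cur <- fgh(x)
  Sigma <- solve(-cur$h); mu <- x + Sigma %*% cur$g
  z <- as.vector(mu + t(chol(Sigma)) %*% rnorm(2))  # draw from proposal
  prp <- fgh(z)
  Sigma.z <- solve(-prp$h); mu.z <- z + Sigma.z %*% prp$g
  log.alpha <- prp$f - cur$f + lq(x, mu.z, Sigma.z) - lq(z, mu, Sigma)
  if (log(runif(1)) <= log.alpha) { x <- z; accepts <- accepts + 1 }
  smp[i, ] <- x
}
```

Because the Gaussian fit is exact here, log.alpha is always zero and the acceptance rate is 100%; for non-Gaussian targets, acceptance depends on how well the local quadratic approximates the log-density.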
