VARshrink (version 0.3.1)

lm_full_Bayes_SR: Full Bayesian Shrinkage Estimation Method for Multivariate Regression

Description

Estimate the regression coefficients and the scale matrix of the noise using a Gibbs MCMC algorithm. The function assumes 1) a multivariate t-distribution for the noise as the sampling distribution, and 2) noninformative priors for the regression coefficients and the noise scale matrix.

Usage

lm_full_Bayes_SR(Y, X, dof = Inf, burnincycle = 1000,
  mcmccycle = 2000)

Arguments

Y

An N x K matrix of dependent variables.

X

An N x M matrix of regressors.

dof

Degrees of freedom of the multivariate t-distribution for the noise. If dof = Inf (default), a multivariate normal distribution is used for the noise and the weight vector q is not estimated. If dof = NULL or dof <= 0, then both dof and q are estimated automatically. If dof is a positive finite number, q is estimated for that fixed dof. See the sketch after this argument list for example calls.

burnincycle, mcmccycle

burnincycle is the number of initial MCMC samples to discard (burn-in), and mcmccycle is the number of subsequent MCMC samples used to compute the estimates.
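
As an illustration of the three dof settings, here is a minimal sketch on simulated data; the dimensions, seed, and data-generating values below are arbitrary assumptions for illustration, not part of the package:

  set.seed(1)
  N <- 100; K <- 2; M <- 3
  X <- matrix(rnorm(N * M), N, M)                    # N x M regressors
  Psi_true <- matrix(rnorm(M * K, sd = 0.5), M, K)   # true M x K coefficients
  Y <- X %*% Psi_true + matrix(rnorm(N * K, sd = 0.3), N, K)

  library(VARshrink)
  fit_normal <- lm_full_Bayes_SR(Y, X, dof = Inf)    # Gaussian noise, q not estimated
  fit_fixdof <- lm_full_Bayes_SR(Y, X, dof = 6)      # fixed dof, q estimated
  fit_estdof <- lm_full_Bayes_SR(Y, X, dof = NULL)   # dof and q estimated automatically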

Value

A list object containing the estimated parameters: Psi, Sigma, dof, delta (the reciprocal of lambda), and lambda. Additional components are se.param (standard errors of the parameters) and LINEXVARmodel (estimates under the LINEX loss).
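
Assuming a fitted object such as fit_normal from the sketch above, the documented components can be inspected directly:

  fit <- fit_normal
  fit$Psi        # estimated regression coefficients (M x K)
  fit$Sigma      # estimated scale matrix (K x K)
  fit$dof        # degrees of freedom (fixed or estimated)
  fit$lambda     # shrinkage parameter; delta = 1 / lambda
  fit$se.param   # standard errors of the parameters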

Details

Consider the multivariate regression model $$Y = X \Psi + e, \quad e \sim mvt(0, dof, \Sigma),$$ where Psi is an M x K matrix of regression coefficients and Sigma is a K x K scale matrix of the multivariate t-distribution for the noise.

The sampling distribution for the noise e is a multivariate t-distribution with degrees of freedom dof and scale matrix Sigma: e ~ mvt(0, dof, Sigma). The priors are noninformative: 1) a shrinkage prior for the regression coefficients Psi, and 2) the reference prior for the scale matrix Sigma.

The function implements a Gibbs MCMC algorithm for estimating the regression coefficients Psi and the scale matrix Sigma.
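
As a sketch of this model, the noise can be drawn from a multivariate t-distribution and the fit repeated; the use of the external mvtnorm package and the values below are assumptions for illustration, reusing X and Psi_true from the earlier sketch:

  library(mvtnorm)                             # assumed external helper for t-distributed noise
  dof <- 5
  Sigma_true <- 0.2 * diag(K)
  e <- rmvt(N, sigma = Sigma_true, df = dof)   # e ~ mvt(0, dof, Sigma)
  Y_t <- X %*% Psi_true + e
  fit_t <- lm_full_Bayes_SR(Y_t, X, dof = dof, burnincycle = 1000, mcmccycle = 2000)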

References

S. Ni and D. Sun (2005). Bayesian estimates for vector autoregressive models. Journal of Business & Economic Statistics 23(1), 105-117.