
bbemkr (version 1.1)

np_gibbs: Estimating bandwidths of the regressor and the variance of the error density

Description

Implements the random-walk Metropolis algorithm to tune the optimal bandwidths of the regressors and the variance of the error density for a finite sample size.

Usage

np_gibbs(data_x, data_y, xh, inicost, prior_p = 2, sizep)

Arguments

data_x
Regressors
data_y
Response variable
xh
Log bandwidths of the regressors
inicost
Initial cost value
prior_p
Tuning parameter in the prior
sizep
Tuning parameter in the random-walk Metropolis algorithm. A large value of sizep decreases the acceptance rate, whereas a small value of sizep increases the acceptance rate
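The trade-off described for sizep can be illustrated on a toy target. The sketch below is a hedged, self-contained example using a one-dimensional standard-normal target rather than the bbemkr posterior; the function name accept_rate is hypothetical.

```r
# Hedged illustration of the sizep / acceptance-rate trade-off on a toy
# standard-normal target (not the bbemkr posterior).
accept_rate <- function(sizep, n = 5000) {
  x <- 0
  accepts <- 0
  for (i in seq_len(n)) {
    prop <- x + sizep * rnorm(1)               # random-walk proposal
    if (log(runif(1)) < (x^2 - prop^2) / 2) {  # log acceptance ratio for N(0, 1)
      x <- prop
      accepts <- accepts + 1
    }
  }
  accepts / n
}

set.seed(42)
small <- accept_rate(0.1)  # small step size: most proposals accepted
large <- accept_rate(10)   # large step size: most proposals rejected
```

Consistent with the argument description, the small step size yields a high acceptance rate and the large step size a low one.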

Value

  • xh: Log bandwidths of the regressors
  • sigma: Variance of the error density
  • inicost: Initial cost value
  • accept_h: Acceptance rate of the random-walk Metropolis algorithm

Details

1) The log bandwidths of the regressors are initialized using the normal reference rule.
2) Conditional on the variance parameter of the error density, the random-walk Metropolis algorithm updates the bandwidths in order to achieve the optimal cost value.
3) The variance parameter of the error density can be sampled directly.
4) Steps 2) and 3) are iterated until the cost value is minimized.
5) Convergence of the parameters is checked by examining the simulation inefficiency factor (SIF): the smaller the SIF value, the better the convergence of the parameters.
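A single random-walk Metropolis update of the kind used in step 2) can be sketched generically. The function below is a minimal illustration, not the package's internal code; logpost stands in for a hypothetical log-posterior of the log bandwidths, and sizep scales the Gaussian proposal as in the sizep argument.

```r
# Minimal sketch of one random-walk Metropolis step on the log bandwidths.
# `logpost` is a hypothetical log-posterior function; `sizep` scales the
# symmetric Gaussian proposal, so the acceptance ratio is a ratio of posteriors.
rw_metropolis_step <- function(xh, logpost, sizep) {
  proposal <- xh + sizep * rnorm(length(xh))    # propose new log bandwidths
  log_ratio <- logpost(proposal) - logpost(xh)  # symmetric proposal cancels
  if (log(runif(1)) < log_ratio) {
    list(xh = proposal, accepted = TRUE)        # move to the proposal
  } else {
    list(xh = xh, accepted = FALSE)             # keep the current bandwidths
  }
}

# Illustration with a toy log-posterior (standard normal in two dimensions):
set.seed(1)
out <- rw_metropolis_step(c(0, 0), function(x) -sum(x^2) / 2, sizep = 1.2)
```

Tracking the accepted flag across iterations gives the acceptance rate reported as accept_h.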

References

X. Zhang, R. D. Brooks and M. L. King (2009) A Bayesian approach to bandwidth selection for multivariate kernel regression with an application to state-price density estimation, Journal of Econometrics, 153, 21-32.

See Also

bbeMCMCrecording, bbelogdensity

Examples

# Simulated data for illustration; replace with your own regressors and response
data_x = matrix(rnorm(100), 50, 2)
data_y = rnorm(50)
inicost = bbecost(data_x, data_y, nrr(data_x))
np_gibbs(data_x, data_y, nrr(data_x), inicost, prior_p = 2, sizep = 1.2)
