bvar

bvar is used to create objects of class "bvar". The predict method produces forecasts for a Bayesian VAR object of class "bvar" with credible bands.
Usage

bvar(
  data = NULL,
  exogen = NULL,
  y = NULL,
  x = NULL,
  A0 = NULL,
  A = NULL,
  B = NULL,
  C = NULL,
  Sigma = NULL
)

# S3 method for bvar
predict(object, ..., n.ahead = 10, new_x = NULL, new_D = NULL, ci = 0.95)
Arguments

data: the original time-series object of endogenous variables.
exogen: the original time-series object of unmodelled variables.
y: a time-series object of endogenous variables, usually the result of a call to gen_var.
x: a time-series object of (lagged) regressor variables, usually the result of a call to gen_var.
A0: a matrix of MCMC coefficient draws of structural parameters.
A: a matrix of MCMC coefficient draws of lagged endogenous variables.
B: a matrix of MCMC coefficient draws of unmodelled, non-deterministic variables.
C: a matrix of MCMC coefficient draws of deterministic terms.
Sigma: a matrix of MCMC draws of the variance-covariance matrix.
object: an object of class "bvar", usually the result of a call to bvar or bvec_to_bvar.
...: additional arguments.
n.ahead: number of steps ahead at which to predict.
new_x: a matrix of new non-deterministic, exogenous variables. Must have n.ahead rows.
new_D: a matrix of new deterministic variables. Must have n.ahead rows.
ci: a numeric between 0 and 1 specifying the probability mass covered by the credible intervals. Defaults to 0.95.
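As a hypothetical standalone illustration (not the package internals), a credible band at level ci can be read off a vector of MCMC forecast draws with quantiles:

```r
# Sketch: a ci-level credible band from simulated stand-in forecast draws.
set.seed(1)
ci <- 0.95
draws <- rnorm(5000)                         # stand-in for MCMC forecast draws
probs <- c((1 - ci) / 2, 0.5, (1 + ci) / 2)  # lower bound, median, upper bound
band <- quantile(draws, probs)
```

With ci = 0.95 this keeps the central 95% of the draws, i.e. cuts 2.5% from each tail.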
Value

bvar returns an object of class "bvar" containing the following components, if specified:

data: the original time-series object of endogenous variables.
exogen: the original time-series object of unmodelled variables.
y: a time-series object of endogenous variables.
x: a time-series object of regressor variables.
A0: the MCMC draws of structural parameters.
A: the MCMC draws of the coefficients of lagged endogenous variables.
B: the MCMC draws of the coefficients of unmodelled, non-deterministic variables.
C: the MCMC draws of the coefficients of deterministic terms.
Sigma: the MCMC draws of the variance-covariance matrix.
specifications: a list containing information on the model specification.

predict returns a time-series object of class "bvarprd".
Details

For the VARX model

  A0 y_t = A_1 y_{t-1} + ... + A_p y_{t-p} + B_0 x_t + ... + B_s x_{t-s} + C d_t + u_t,

the draws of the different coefficient matrices provided in A0, A, B, C and Sigma have to correspond to the same MCMC iteration.
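This alignment requirement can be sketched with stand-in draw matrices (the dimensions and objects below are illustrative only): any subsetting, such as thinning, must use the same column index for every matrix so that column s of each object still comes from Gibbs iteration s.

```r
# Stand-in draw matrices: column s of each must come from the same iteration.
set.seed(1)
S <- 400                                   # stored post-burn-in draws
draws_a <- matrix(rnorm(21 * S), 21, S)    # stand-in coefficient draws
draws_sigma <- matrix(rnorm(9 * S), 9, S)  # stand-in covariance draws
keep <- seq(1, S, by = 4)                  # one thinning index ...
a_thin <- draws_a[, keep]                  # ... applied to every matrix,
sigma_thin <- draws_sigma[, keep]          # so columns stay iteration-aligned
```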
For the VAR model, the predict method produces n.ahead forecasts.
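The recursion behind such forecasts can be sketched for a reduced-form VAR(p) with a constant. This is a minimal hypothetical implementation using a single coefficient set; the package applies the recursion across MCMC draws to obtain credible bands.

```r
# Minimal h-step recursion for y_t = A1 y_{t-1} + ... + Ap y_{t-p} + C + u_t.
var_forecast <- function(A, C, y, n.ahead) {
  k <- nrow(y)                  # number of endogenous variables
  p <- ncol(A) / k              # lag order implied by A (a k x kp matrix)
  fcst <- matrix(NA_real_, k, n.ahead)
  y_hist <- y                   # most recent observation in the last column
  for (h in seq_len(n.ahead)) {
    z <- c(y_hist[, ncol(y_hist) - 0:(p - 1)])  # stacked lags, newest first
    fcst[, h] <- A %*% z + C                    # point forecast for step h
    y_hist <- cbind(y_hist, fcst[, h])          # feed the forecast back in
  }
  fcst
}
```

For example, var_forecast(matrix(0.5), 0, matrix(2), 2) yields the forecasts 1 and 0.5, each step feeding the previous forecast back into the lag vector.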
References

Lütkepohl, H. (2007). New Introduction to Multiple Time Series Analysis (2nd ed.). Berlin: Springer.
Examples
data("e1")
e1 <- diff(log(e1))
data <- gen_var(e1, p = 2, deterministic = "const")
y <- data$Y[, 1:73]
x <- data$Z[, 1:73]
set.seed(1234567)
iter <- 500 # Number of iterations of the Gibbs sampler
# Chosen number of iterations should be much higher, e.g. 30000.
burnin <- 100 # Number of burn-in draws
store <- iter - burnin
t <- ncol(y) # Number of observations
k <- nrow(y) # Number of endogenous variables
m <- k * nrow(x) # Number of estimated coefficients
# Set (uninformative) priors
a_mu_prior <- matrix(0, m) # Vector of prior parameter means
a_v_i_prior <- diag(0, m) # Inverse of the prior covariance matrix
u_sigma_df_prior <- 0 # Prior degrees of freedom
u_sigma_scale_prior <- diag(0, k) # Prior scale matrix
u_sigma_df_post <- t + u_sigma_df_prior # Posterior degrees of freedom
# Initial values
u_sigma_i <- diag(.00001, k)
u_sigma <- solve(u_sigma_i)
# Data containers for posterior draws
draws_a <- matrix(NA, m, store)
draws_sigma <- matrix(NA, k^2, store)
# Start Gibbs sampler
for (draw in 1:iter) {
# Draw conditional mean parameters
a <- post_normal(y, x, u_sigma_i, a_mu_prior, a_v_i_prior)
# Draw variance-covariance matrix
u <- y - matrix(a, k) %*% x # Obtain residuals
u_sigma_scale_post <- solve(u_sigma_scale_prior + tcrossprod(u))
u_sigma_i <- matrix(rWishart(1, u_sigma_df_post, u_sigma_scale_post)[,, 1], k)
u_sigma <- solve(u_sigma_i) # Invert Sigma_i to obtain Sigma
# Store draws
if (draw > burnin) {
draws_a[, draw - burnin] <- a
draws_sigma[, draw - burnin] <- u_sigma
}
}
# Generate bvar object
bvar_est <- bvar(y = y, x = x, A = draws_a[1:18,],
C = draws_a[19:21, ], Sigma = draws_sigma)
# Generate forecasts
bvar_pred <- predict(bvar_est, n.ahead = 10, new_D = rep(1, 10))
# Plot forecasts
plot(bvar_pred)