bayess (version 1.6)

ModChoBayesReg: Bayesian model choice procedure for the linear model

Description

This function computes the posterior probabilities of the submodels obtained by eliminating some covariates: all of them when there are fewer than 15 covariates, or the most probable ones when there are more than 15 covariates.

Usage

ModChoBayesReg(y, X, g = length(y), betatilde = rep(0, dim(X)[2]), 
niter = 1e+05, prt = TRUE)

Value

top10models

models with the ten largest posterior probabilities

postprobtop10

posterior probabilities of those ten most likely models

Arguments

y

response variable

X

covariate matrix

g

constant \(g\) in the \(G\)-prior

betatilde

prior expectation of the regression coefficient vector \(\beta\)

niter

number of Gibbs iterations used when there are more than 15 covariates

prt

logical flag indicating whether the standard output should be printed

Details

When a conjugate prior such as the \(G\)-prior is used for the linear model, the marginal likelihood, and hence the evidence of each submodel, is available in closed form. If the number of explanatory variables is fewer than 15, the posterior probabilities of all submodels can be computed exactly, since \(2^{15}=32768\) terms remain tractable. When the number of explanatory variables gets larger, a random exploration of the collection of submodels becomes necessary, as explained in the book (Chapter 3): a proposal to flip one variable-inclusion indicator is made at random, and the move is accepted or rejected in a Metropolis-Hastings step.
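
As an illustration of both regimes, the sketch below enumerates all \(2^p\) submodels for small \(p\) and runs a random indicator-flipping exploration for larger \(p\). The helpers logmarg, enumerate_models and mh_explore are not part of bayess; the log marginal likelihood uses a standard closed form for the Zellner \(G\)-prior with betatilde equal to zero and a uniform prior over submodels, which may differ in normalising constants from the expression used inside ModChoBayesReg.

# Illustrative helper (not bayess code): log marginal likelihood of a submodel
# under a Zellner G-prior with betatilde = 0, up to an additive constant,
# keeping the intercept in every submodel.
logmarg <- function(y, Xg, g) {
  n <- length(y)
  W <- cbind(rep(1, n), Xg)               # intercept plus selected covariates
  P <- W %*% solve(crossprod(W), t(W))    # projection on the column space of W
  s <- sum(y^2) - g / (g + 1) * sum(y * (P %*% y))
  -(ncol(W) / 2) * log(g + 1) - (n / 2) * log(s)
}

# Exact enumeration of the 2^p submodels (feasible for fewer than 15 covariates)
# under a uniform prior over submodels.
enumerate_models <- function(y, X, g = length(y)) {
  p <- ncol(X)
  gammas <- as.matrix(expand.grid(rep(list(c(FALSE, TRUE)), p)))
  lml <- apply(gammas, 1, function(gam) logmarg(y, X[, gam, drop = FALSE], g))
  post <- exp(lml - max(lml))
  post <- post / sum(post)                # posterior probabilities of all submodels
  ord <- order(post, decreasing = TRUE)
  list(models = gammas[ord[1:10], ], postprob = post[ord[1:10]])
}

# Random exploration for larger p: flip one inclusion indicator chosen at random
# and accept the move with the Metropolis-Hastings ratio of marginal likelihoods
# (the flip proposal is symmetric, so it cancels in the acceptance probability).
mh_explore <- function(y, X, g = length(y), niter = 1e4) {
  p <- ncol(X)
  gam <- rep(TRUE, p)                     # start from the full model
  lcur <- logmarg(y, X, g)
  keys <- character(niter)
  for (it in 1:niter) {
    prop <- gam
    j <- sample(p, 1)
    prop[j] <- !prop[j]                   # flip one indicator at random
    lprop <- logmarg(y, X[, prop, drop = FALSE], g)
    if (log(runif(1)) < lprop - lcur) { gam <- prop; lcur <- lprop }
    keys[it] <- paste(as.integer(gam), collapse = "")
  }
  head(sort(table(keys), decreasing = TRUE) / niter, 10)  # most visited submodels
}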

Examples

library(bayess)                    # provides ModChoBayesReg and the caterpillar data
data(caterpillar)
y=log(caterpillar$y)               # log-transformed response
X=as.matrix(caterpillar[,1:8])     # eight covariates
res2=ModChoBayesReg(y,X)

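Assuming the value returned by ModChoBayesReg is a list with the components described under Value, the ten most probable submodels and their posterior probabilities can then be inspected with:

res2$top10models     # the ten submodels with the largest posterior probabilities
res2$postprobtop10   # their posterior probabilities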