eplogprob.marg calculates approximate marginal posterior
inclusion probabilities from p-values computed from a series of simple
linear regression models using a lower bound approximation to Bayes
factors. Used to order variables and, if appropriate, to obtain initial
inclusion probabilities for sampling with Bayesian Adaptive Sampling
(bas.lm).
eplogprob.marg(Y, X, thresh = 0.5, max = 0.99, int = TRUE)

The max argument caps the initial inclusion probabilities; it is used with the
bas.lm function to keep initial inclusion probabilities away from 1.

eplogprob.marg returns a vector of marginal posterior inclusion
probabilities for each of the variables in the linear model. If int =
TRUE, then the inclusion probability for the intercept is set to 1.
The inclusion probabilities are based on the p-value calibration

BF(p) = -e p log(p)

which provides a lower bound on the Bayes factor for comparing H0: beta = 0 versus H1: beta not equal to 0 when the p-value p is less than 1/e. Using equal prior odds on the hypotheses H0 and H1, the approximate marginal posterior inclusion probability is
p(beta != 0 | data ) = 1/(1 + BF(p))
When p > 1/e, the lower bound no longer applies, so the marginal inclusion
probability is set to the value given by thresh (0.5 by default).
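As a concrete illustration of this mapping, here is a minimal R sketch built directly from the formulas above; it is not the package's internal code, and the function name pval_to_prob and its defaults (mirroring the usage shown earlier) are hypothetical.

# Minimal sketch (not the BAS internals): turn p-values into approximate
# marginal inclusion probabilities via the -e p log(p) lower bound.
pval_to_prob <- function(p, thresh = 0.5, max = 0.99) {
  prob <- rep(thresh, length(p))        # default when the bound does not apply
  ok <- p < 1 / exp(1)                  # the bound is valid only for p < 1/e
  bf <- -exp(1) * p[ok] * log(p[ok])    # BF(p) = -e p log(p), H0 relative to H1
  prob[ok] <- 1 / (1 + bf)              # equal prior odds on H0 and H1
  pmin(prob, max)                       # keep initial probabilities away from 1
}
pval_to_prob(c(0.001, 0.05, 0.5))       # roughly 0.98, 0.71, 0.50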
For eplogprob.marg, the marginal p-values are obtained from the p simple
linear regressions of Y on each column X_j of the design matrix,

p_j = P(F > (n - 2) R^2_j / (1 - R^2_j))

where F has an F(1, n - 2) distribution and R^2_j is the squared correlation coefficient between Y and X_j.
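A hedged sketch of how such marginal p-values could be computed in R is given below; it assumes X holds only the non-constant predictor columns and is illustrative rather than the package's code.

# Sketch: marginal p-values from the p simple linear regressions of Y on each X_j,
# using the F statistic (n - 2) R^2_j / (1 - R^2_j) with an F(1, n - 2) reference.
marginal_pvalues <- function(Y, X) {
  n <- length(Y)
  r2 <- apply(X, 2, function(xj) cor(Y, xj)^2)   # squared correlation with each column
  fstat <- (n - 2) * r2 / (1 - r2)
  pf(fstat, df1 = 1, df2 = n - 2, lower.tail = FALSE)
}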
See also bas.lm.
library(BAS)                          # provides eplogprob and eplogprob.marg
library(MASS)                         # provides the UScrime data
data(UScrime)
UScrime[, -2] <- log(UScrime[, -2])   # log-transform all variables except the binary indicator So
eplogprob(lm(y ~ ., data = UScrime))
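The example above uses the lm-interface variant eplogprob; a direct call to eplogprob.marg, following the usage shown earlier, could look like the sketch below. Building X with model.matrix, including its intercept column, is an assumption made for illustration.

# Illustrative call with an explicit response vector and design matrix
# (model.matrix is an assumed way to construct X; with int = TRUE the
# intercept's inclusion probability is set to 1).
Y <- UScrime$y
X <- model.matrix(y ~ ., data = UScrime)
eplogprob.marg(Y, X, thresh = 0.5, max = 0.99, int = TRUE)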