Usage

bas.glm(formula, data, family = binomial(link = "logit"), n.models = NULL,
        betaprior = CCH(alpha = 0.5, beta = nrow(data), s = 0),
        modelprior = beta.binomial(1, 1), initprobs = "Uniform",
        method = "MCMC", update = NULL, bestmodel = NULL, prob.rw = 0.5,
        MCMC.iterations = NULL, control = glm.control(),
        offset = rep(0, nobs), weights = rep(1, nobs), laplace = FALSE)

Arguments

modelprior: the family of prior distributions on the models. Choices include
uniform, Bernoulli, beta.binomial, the truncated Beta-Binomial
tr.beta.binomial, and the truncated power family tr.power.prior.

initprobs: the option "eplogp" uses the eplogprob function to approximate the
Bayes factor from p-values and find initial marginal inclusion probabilities;
models are then sampled without replacement using these inclusion
probabilities, which may be updated using estimates of the marginal inclusion
probabilities. "eplogp" assumes that MLEs from the full model exist; for
problems where that is not the case or p is large, initial sampling
probabilities may be obtained using eplogprob.marg, which fits a model to each
predictor separately. For variables that should always be included, set the
corresponding initprobs to 1. To run a Markov chain to provide initial
estimates of the marginal inclusion probabilities, use method = "MCMC+BAS".

control: parameters that control the GLM fitting process; defaults to
glm.control().

Value

bas.glm returns an object of class BMA, a list containing at least the sampled
models, their posterior probabilities, and the marginal inclusion probabilities
of the predictors.

Details

Sampling without replacement uses the initial sampling probabilities given by
initprobs, which may impact the results in high-dimensional problems.
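For illustration, a minimal sketch (not part of the package's own examples) of combining the options described above: p-value-based starting probabilities via initprobs = "eplogp", refined by a Markov chain run before the sampling-without-replacement pass via method = "MCMC+BAS". The prior settings and number of MCMC iterations below are arbitrary choices for the Pima.tr data.

library(MASS)
data(Pima.tr)
# initial inclusion probabilities from eplogprob() applied to the full-model
# p-values, refined by a short MCMC run before sampling without replacement;
# hyperparameters here are illustrative only
fit.mcmc.bas <- bas.glm(type ~ ., data = Pima.tr, family = binomial(),
                        betaprior = CCH(alpha = 0.5, beta = nrow(Pima.tr), s = 0),
                        modelprior = beta.binomial(1, 1),
                        initprobs = "eplogp",
                        method = "MCMC+BAS",
                        MCMC.iterations = 5000)
summary(fit.mcmc.bas)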
The deterministic sampler provides a list of the top models, ranked under an
approximation of independence using the provided initprobs. This
may be effective after running the other algorithms to identify high
probability models and works well if
the correlations of variables are small to modest. The priors on
coefficients are mixtures of g-priors that provide approximations to the
power prior.

References

Clyde, M., Ghosh, J. and Littman, M. (2010) Bayesian Adaptive Sampling for Variable Selection and Model Averaging. Journal of Computational and Graphical Statistics, 20: 80-101. http://dx.doi.org/10.1198/jcgs.2010.09049
Raftery, A.E., Madigan, D. and Hoeting, J.A. (1997) Bayesian Model Averaging for Linear Regression Models. Journal of the American Statistical Association, 92: 179-191.
Examples
library(MASS)
data(Pima.tr)
# sample all 2^7 models without replacement, with a CCH mixture-of-g prior
# on the coefficients and a Beta-Binomial(1, 1) prior over models
out <- bas.glm(type ~ ., data = Pima.tr, n.models = 2^7, method = "BAS",
               betaprior = CCH(alpha = 1, beta = 532/2, s = 0),
               family = binomial(),
               modelprior = beta.binomial(1, 1), laplace = FALSE)
summary(out)
image(out)
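As a further sketch (again not from the package's own examples), the deterministic sampler described in the details can be used to list the top models under an approximation of independence; the settings below simply reuse the coefficient and model priors from the example above, and other mixture-of-g priors or the truncated model priors could be substituted.

# top 2^7 models from the deterministic sampler, using the same
# CCH coefficient prior and Beta-Binomial model prior as above
out.det <- bas.glm(type ~ ., data = Pima.tr, n.models = 2^7,
                   method = "deterministic",
                   betaprior = CCH(alpha = 1, beta = 532/2, s = 0),
                   family = binomial(),
                   modelprior = beta.binomial(1, 1))
summary(out.det)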