bas.glm(formula, data,
family = binomial(link = "logit"),
n.models = NULL, betaprior = CCH(alpha = 0.5, beta = nrow(data), s = 0),
modelprior = beta.binomial(1,1),
initprobs = "Uniform", method = "MCMC", update = NULL,
bestmodel = NULL, prob.rw = 0.5,
MCMC.iterations = NULL, control = glm.control(),
offset = rep(0, nobs), weights = rep(1, nobs), laplace = FALSE)

The modelprior argument takes a family of prior distributions on the models: uniform, Bernoulli or beta.binomial. The control argument is a list of parameters that control the GLM fitting process, as returned by glm.control().

bas.glm returns an object of class BMA. An object of class BMA is a list containing at least the following components: the sampled models, their prior and posterior probabilities, the log marginal likelihoods, and the estimated marginal inclusion probabilities.
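
A minimal sketch (not part of the original examples) of swapping the model prior among the families named above; it assumes the BAS package, which provides bas.glm, also exports uniform() and Bernoulli() constructors for the alternative model priors:

library(BAS)
library(MASS)
data(Pima.tr)

# Same coefficient prior, two different priors on the model space.
fit.bb <- bas.glm(type ~ ., data = Pima.tr, family = binomial(), method = "BAS",
                  betaprior = CCH(alpha = 0.5, beta = nrow(Pima.tr), s = 0),
                  modelprior = beta.binomial(1, 1))
fit.unif <- bas.glm(type ~ ., data = Pima.tr, family = binomial(), method = "BAS",
                    betaprior = CCH(alpha = 0.5, beta = nrow(Pima.tr), s = 0),
                    modelprior = uniform())

# Compare posterior summaries under the two model priors.
summary(fit.bb)
summary(fit.unif)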
The choice of initprobs may impact the results in high-dimensional problems.
The deterministic sampler provides a list of the top models in order of an
approximation of independence using the provided initprobs. This
may be effective after running the other algorithms to identify high
probability models (see the sketch after the example below) and works well if
the correlations of variables are small to modest. The priors on
coefficients are mixtures of g-priors that provide approximations to the
power prior.

Clyde, M., Ghosh, J. and Littman, M. (2010) Bayesian Adaptive Sampling
for Variable Selection and Model Averaging. Journal of Computational and
Graphical Statistics, 20, 80-101.

Raftery, A.E., Madigan, D. and Hoeting, J.A. (1997) Bayesian Model Averaging
for Linear Regression Models. Journal of the American Statistical
Association, 92, 179-191.
# Bayesian model averaging for logistic regression on the Pima Indians
# diabetes training data (MASS::Pima.tr)
library(MASS)
data(Pima.tr)
out <- bas.glm(type ~ ., data = Pima.tr, n.models = 2^7, method = "BAS",
               betaprior = CCH(a = 1, b = 532/2, s = 0), family = binomial(),
               modelprior = beta.binomial(1, 1), laplace = FALSE)
summary(out)
image(out)
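
As noted above, the deterministic sampler may be most effective after another algorithm has identified the high-probability models. The following is a minimal sketch of that two-stage workflow, not taken from the original help page; it assumes the deterministic sampler is selected with method = "deterministic" and that the fitted object stores the estimated marginal inclusion probabilities in a probne0 component.

# Stage 1 (assumed workflow): estimate marginal inclusion probabilities by MCMC.
fit.mcmc <- bas.glm(type ~ ., data = Pima.tr, family = binomial(),
                    method = "MCMC", MCMC.iterations = 10000,
                    betaprior = CCH(a = 1, b = 532/2, s = 0),
                    modelprior = beta.binomial(1, 1))

# Stage 2: reuse the estimated inclusion probabilities (probne0 is an assumed
# component name) as initprobs for the deterministic sampler.
fit.det <- bas.glm(type ~ ., data = Pima.tr, family = binomial(),
                   method = "deterministic", initprobs = fit.mcmc$probne0,
                   betaprior = CCH(a = 1, b = 532/2, s = 0),
                   modelprior = beta.binomial(1, 1))
summary(fit.det)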