eha (version 2.4-5)

glmmML: Generalized Linear Models with random intercept

Description

Fits GLMs with random intercept by Maximum Likelihood and numerical integration via Gauss-Hermite quadrature.

Usage

glmmML(formula, family = binomial, data, cluster, weights,
       cluster.weights, subset, na.action, offset,
       prior = c("gaussian", "logistic", "cauchy", "gamma"),
       start.coef = NULL, start.sigma = NULL, fix.sigma = FALSE, x = FALSE,
       control = list(epsilon = 1e-08, maxit = 200, trace = FALSE),
       method = c("Laplace", "ghq"), n.points = 8, boot = 0)

Arguments

formula
a symbolic description of the model to be fit. The details of model specification are given below.
family
Currently, the only valid values are binomial and poisson. The binomial family allows for the logit and cloglog links.
data
an optional data frame containing the variables in the model. By default the variables are taken from environment(formula), typically the environment from which glmmML is called.
cluster
Factor indicating which items are correlated.
weights
Case weights. Defaults to one.
cluster.weights
Cluster weights. Defaults to one.
subset
an optional vector specifying a subset of observations to be used in the fitting process.
na.action
See glm.
start.coef
starting values for the parameters in the linear predictor. Defaults to zero.
start.sigma
starting value for the mixing standard deviation. Defaults to 0.5.
fix.sigma
Should sigma be fixed at start.sigma?
x
If TRUE, the design matrix is returned (as x).
offset
this can be used to specify an a priori known component to be included in the linear predictor during fitting.
prior
Which "prior" distribution (for the random effects)? Possible choices are "gaussian" (default), "logistic", and "cauchy". For the poisson family, it is possible to use the conjugate "gamma" prior, which avoids numerical integration.
control
Controls the convergence criteria. See glm.control for details.
method
There are two choices: "Laplace" (default) and "ghq" (Gauss-Hermite quadrature).
n.points
Number of points in the Gauss-Hermite quadrature. If n.points == 1, Gauss-Hermite quadrature coincides with the Laplace approximation. If method is set to "Laplace", this parameter is ignored.
boot
Do you want a bootstrap estimate of the cluster effect? The default is no (boot = 0). To request a bootstrap, enter a positive integer equal to the number of bootstrap samples you want to draw. A recommended absolute minimum value is boot = 2000.
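
For illustration, a minimal sketch of how the method, n.points, and boot arguments are used (the objects dat, y, x, and id are assumed to be defined as in the Examples section below; the names fit.ghq and fit.boot are hypothetical):

## Adaptive Gauss-Hermite quadrature with 16 points instead of the default Laplace approximation
fit.ghq <- glmmML(y ~ x, data = dat, cluster = id,
                  method = "ghq", n.points = 16)
## Bootstrap test of the cluster effect with 2000 replicates
fit.boot <- glmmML(y ~ x, data = dat, cluster = id, boot = 2000)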

Value

The return value is a list, an object of class 'glmmML'. The components are:
boot
Number of bootstrap replicates
converged
Logical; TRUE if the fitting procedure converged
coefficients
Estimated regression coefficients
coef.sd
Their standard errors
sigma
The estimated random effects' standard deviation
sigma.sd
Its standard error
variance
The estimated variance-covariance matrix. The last column/row corresponds to the standard deviation of the random effects (sigma)
aic
AIC
bootP
Bootstrap p value from testing the null hypothesis of no random effect (sigma = 0)
deviance
Deviance
mixed
Logical
df.residual
Residual degrees of freedom
cluster.null.deviance
Deviance from a glm with no clustering. Subtracting the deviance of the mixed model gives a test statistic for the null hypothesis of no clustering. Its asymptotic distribution is a symmetric mixture of a point mass at zero and a chi-squared distribution with one degree of freedom; the printed p-value is based on this (see the sketch after this list of components).
cluster.null.df
Its degrees of freedom
posterior.modes
Estimated posterior modes of the random effects
terms
The terms object
info
From the Hessian inversion. Should be 0; if not, no variances could be estimated. You could try fixing sigma at the estimated value and rerunning.
prior
Which prior was used?
call
The function call
x
The design matrix if asked for, otherwise not present
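
As a rough sketch (assuming fit is an object returned by glmmML, with dat, y, x, and id as in the Examples section), the clustering test described under cluster.null.deviance can be reproduced by hand; the halved chi-squared tail probability reflects the mixture distribution:

fit <- glmmML(y ~ x, data = dat, cluster = id)
lrt <- fit$cluster.null.deviance - fit$deviance   # likelihood-ratio statistic
p.value <- 0.5 * pchisq(lrt, df = 1, lower.tail = FALSE)
p.value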

Details

The integrals in the log likelihood function are evaluated by the Laplace approximation (default) or Gauss-Hermite quadrature. The latter is now fully adaptive; however, only approximate estimates of variances are available for the Gauss-Hermite (n.points > 1) method.

For the binomial families, the response can be a two-column matrix, see the help page for glm for details.
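
For instance, a minimal sketch of a two-column (successes, failures) response; the grouped data, and the names gdat, grp, succ, fail, and z, are made up for illustration only:

grp <- factor(rep(1:10, each = 2))        # 10 clusters, 2 rows each
n.trials <- rep(10, 20)                   # 10 Bernoulli trials per row
succ <- rbinom(20, size = n.trials, prob = rep(runif(10), each = 2))
gdat <- data.frame(succ = succ, fail = n.trials - succ,
                   z = rnorm(20), grp = grp)
glmmML(cbind(succ, fail) ~ z, data = gdat, cluster = grp)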

References

Broström, G. (2003). Generalized linear models with random intercepts. http://www.stat.umu.se/forskning/reports/glmmML.pdf

See Also

glmmboot, glm, optim, lmer in the package lme4 and glmmPQL in the package MASS.

Examples

## 20 clusters of 5 observations each, sharing a cluster-specific success probability
id <- factor(rep(1:20, rep(5, 20)))
y <- rbinom(100, prob = rep(runif(20), rep(5, 20)), size = 1)
x <- rnorm(100)
dat <- data.frame(y = y, x = x, id = id)
glmmML(y ~ x, data = dat, cluster = id)
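
A further sketch, reusing x and id from above, shows the poisson family with the conjugate "gamma" prior mentioned under the prior argument; the counts and the names cnt and pdat are simulated/invented for illustration only:

cnt <- rpois(100, lambda = exp(0.2 * x + rep(rnorm(20, sd = 0.5), rep(5, 20))))
pdat <- data.frame(cnt = cnt, x = x, id = id)
glmmML(cnt ~ x, family = poisson, data = pdat, cluster = id, prior = "gamma")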
