
defm (version 0.1-1)

logodds: Maximum Likelihood Estimation of DEFM

Description

Fits a Discrete Exponential-Family Model using Maximum Likelihood.

Usage

logodds(m, par, i, j)

defm_mle(object, start, lower, upper, ...)

summary_table(object, as_texreg = FALSE, ...)

Value

  • logodds returns a numeric vector with the log-odds for each observation in the data.

  • defm_mle returns an object of class stats4::mle.

  • summary_table returns a table with the estimates, standard errors, and p-values (or a texreg object when as_texreg = TRUE).

Arguments

m

An object of class DEFM.

par

The parameters of the model.

i, j

The row and column of the array to turn on when computing the log-odds.

object

An object of class DEFM.

start

Double vector. Starting point for the MLE.

lower, upper

Lower and upper limits for the optimization (passed to stats4::mle); a usage sketch follows this argument list.

...

Further arguments passed to stats4::mle.

as_texreg

When TRUE, the result is wrapped in a texreg object.
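
As a minimal sketch of how start, lower, and upper fit together (the values below are placeholders, and the four-term model mirrors the one built in the Examples):

defm_mle(
  logit_0,
  start = rep(0, 4),   # one starting value per model term
  lower = rep(-5, 4),  # box constraints forwarded to stats4::mle
  upper = rep(5, 4)
)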

Details

The likelihood function of the DEFM is closely related to that of the Exponential-Family Random Graph Model (ERGM); indeed, the DEFM can be treated as a generalization of the ERGM. The model implemented here can be viewed as an ERGM for a bipartite network in which the actors are individuals and the events are the binary outcomes.

If the model features no Markov terms, i.e., terms that depend on more than one output, then the model is equivalent to a logistic regression. The example below shows this equivalence.

The function summary_table computes p-values and returns a table with the estimates, standard errors, and p-values. If as_texreg = TRUE, it returns a texreg object instead.
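
For instance, with the fitted model res_0 from the Examples below (and assuming the texreg package is installed for the second call):

summary_table(res_0)                   # table of estimates, s.e., and p-values
summary_table(res_0, as_texreg = TRUE) # texreg object, e.g. for texreg::screenreg()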

References

Vega Yon, G. G., Pugh, M. J., & Valente, T. W. (2022). Discrete Exponential-Family Models for Multivariate Binary Outcomes (arXiv:2211.00627). arXiv. https://arxiv.org/abs/2211.00627

See Also

DEFM for objects of class DEFM and loglike_defm() for the log-likelihood function of DEFMs.

Examples

# Using Valente's SNS data
data(valentesnsList)

# Creating the DEFM object
logit_0 <- new_defm(
  id = valentesnsList$id,
  X = valentesnsList$X,
  Y = valentesnsList$Y[,1,drop=FALSE],
  order = 0
)

# Building the model
term_defm_logit_intercept(logit_0)
term_defm_logit_intercept(logit_0, idx = "Hispanic")
term_defm_logit_intercept(
  logit_0, idx = "exposure_smoke",
  vname = "Smoke Exp"
)
term_defm_logit_intercept(logit_0, idx = "Grades")
init_defm(logit_0) # Needs to be initialized

# Fitting the model
res_0 <- defm_mle(logit_0)

# Refitting the model using GLM
res_glm <- with(
  valentesnsList,
  glm(Y[,1] ~ X[,1] + X[,3] + X[,7], family = binomial())
)

# Comparing results
summary_table(res_0)
summary(res_glm)

# Comparing the logodds
head(logodds(logit_0, par = coef(res_0), i = 0, j = 0))
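
# For comparison, predict() on the GLM fit returns the linear predictor by
# default (type = "link"), i.e., log-odds on the same scale as logodds() above:
head(predict(res_glm))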
