psychonetrics (version 0.15)

find_penalized_lambda: Automated Lambda Selection for Penalized Estimation

Description

Selects the optimal penalty strength (lambda) for penalized maximum likelihood (PML) or penalized full-information maximum likelihood (PFIML) models via EBIC-based grid search. A log-spaced grid of lambda values is searched from lambda_max (the smallest lambda for which all penalized parameters are exactly zero, derived from KKT conditions) down to lambda_max * lambda_min_ratio. Each lambda is optimized using ISTA (Iterative Shrinkage-Thresholding Algorithm) with warm starts from the previous solution, and the model with the best EBIC is selected. After selection, a beta_min threshold is applied to set near-zero penalized parameters exactly to zero.

This function is called automatically by runmodel when the model contains auto-select penalty parameters (i.e., penalty_lambda = NA in the parameter table, which is the default when using estimator = "PML" or estimator = "PFIML"). It can also be called directly for more control over the search.

Usage

find_penalized_lambda(model, criterion = "ebic.5", nLambda = 50,
                      lambda_min_ratio = 1e-4,
                      beta_min = c("numerical", "theoretical"),
                      verbose)

Value

An object of class psychonetrics (psychonetrics-class) with:

  • The selected lambda assigned to all auto-select penalty parameters

  • Parameters optimized at the best lambda

  • Near-zero penalized parameters set to exactly zero (via beta_min)

  • Search metadata stored in model@optim$lambda_search

After running find_penalized_lambda, use refit to obtain valid standard errors and fit indices via post-selection inference.

Arguments

model

A psychonetrics model with estimator = "PML" or estimator = "PFIML". The model should contain parameters with penalty_lambda = NA (auto-select), which is the default when specifying estimator = "PML" or "PFIML" in model constructors such as ggm, lvm, etc.

criterion

Character string specifying the information criterion used to select the best lambda. One of:

"bic"

BIC (equivalent to EBIC with gamma = 0)

"ebic.25"

EBIC with gamma = 0.25

"ebic.5"

EBIC with gamma = 0.5 (default; recommended for network models)

"ebic.75"

EBIC with gamma = 0.75

"ebic1"

EBIC with gamma = 1.0

Higher gamma values penalize model complexity more strongly and tend to produce sparser solutions. Gamma = 0.5 is widely recommended for network models (Foygel & Drton, 2010).

nLambda

Integer, the number of lambda values in the search grid. Default is 50. The grid is log-spaced between lambda_max and lambda_max * lambda_min_ratio.
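A log-spaced grid of this form can be built in a few lines of base R. The sketch below uses a hypothetical lambda_max of 0.5; in the actual function, lambda_max is derived from the KKT conditions as described under Details.

```r
lambda_max <- 0.5        # hypothetical upper bound (derived from KKT in practice)
nLambda <- 50
lambda_min_ratio <- 1e-4

# Log-spaced sequence from lambda_max down to lambda_max * lambda_min_ratio:
lambda_grid <- exp(seq(log(lambda_max),
                       log(lambda_max * lambda_min_ratio),
                       length.out = nLambda))
```

Because the spacing is uniform on the log scale, the grid covers small lambda values much more densely than a linear grid would.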

lambda_min_ratio

Numeric, the ratio of the smallest to the largest lambda in the grid. Default is 1e-4, so that lambda_min = lambda_max * 1e-4.

beta_min

Threshold for zeroing out small penalized parameter estimates. Parameters with absolute value below beta_min are set to zero, both during EBIC evaluation (for parameter counting) and in the final model. Can be:

"numerical"

Uses a fixed threshold of 1e-05 (default).

"theoretical"

Uses sqrt(log(p) / n) where p is the number of penalized parameters and n is the total sample size.

numeric value

Uses the provided value directly as the threshold.
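For illustration, the three threshold options could be resolved along the following lines. The helper resolve_beta_min is hypothetical (not part of the package API), and p and n stand in for the number of penalized parameters and the total sample size.

```r
# Hypothetical helper illustrating the three beta_min options:
resolve_beta_min <- function(beta_min = c("numerical", "theoretical"), p, n) {
  if (is.numeric(beta_min)) return(beta_min)   # numeric value used directly
  beta_min <- match.arg(beta_min)
  switch(beta_min,
         numerical   = 1e-5,                   # fixed default threshold
         theoretical = sqrt(log(p) / n))       # scales with p and n
}

resolve_beta_min("numerical", p = 10, n = 500)    # 1e-5
resolve_beta_min("theoretical", p = 10, n = 500)  # sqrt(log(10)/500)
resolve_beta_min(0.01, p = 10, n = 500)           # 0.01
```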

verbose

Logical, should progress messages be printed? Defaults to the model's verbose setting.

Author

Sacha Epskamp

Details

The search proceeds as follows:

  1. Compute lambda_max: The analytical upper bound is derived from KKT (Karush-Kuhn-Tucker) conditions at the null-penalized model (all penalized parameters fixed at zero). Specifically, lambda_max = max(|grad_j|) / alpha where grad_j are the gradient elements for penalized parameters at the null solution and alpha is the elastic net mixing parameter.

  2. Build lambda grid: A log-spaced sequence of nLambda values from lambda_max down to lambda_max * lambda_min_ratio.

  3. Grid search with warm starts: For each lambda (from largest to smallest), the model is optimized using ISTA. The solution from the previous lambda serves as the starting point (warm start), which substantially reduces computation time. The EBIC (Extended Bayesian Information Criterion) is computed using the unpenalized log-likelihood (Convention A), counting only parameters with |estimate| >= beta_min as non-zero.

  4. Beta-min thresholding: Penalized parameters with |estimate| < beta_min are set exactly to zero in the final model.
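Step 1 can be illustrated with a toy gradient vector. Both grad and alpha below are hypothetical stand-ins for the penalized-parameter gradient at the null solution and the elastic net mixing parameter:

```r
# Gradient elements for the penalized parameters at the null solution
# (hypothetical values):
grad  <- c(0.12, -0.48, 0.03, 0.30)
alpha <- 1  # elastic net mixing parameter (1 = pure lasso)

# Smallest lambda at which soft-thresholding drives every penalized
# parameter exactly to zero:
lambda_max <- max(abs(grad)) / alpha
lambda_max  # 0.48
```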

The EBIC is computed as: $$EBIC = -2 \cdot LL + k \cdot \log(n) + 4 \cdot k \cdot \gamma \cdot \log(p_v)$$ where \(LL\) is the unpenalized log-likelihood, \(k\) is the effective number of parameters (unpenalized free parameters plus non-zero penalized parameters), \(n\) is the sample size, \(\gamma\) is the EBIC tuning parameter, and \(p_v\) is the number of observed variables.
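The formula translates directly into R. The fit values below are hypothetical and serve only to show the calculation:

```r
# EBIC as defined above: unpenalized log-likelihood LL, effective parameter
# count k, sample size n, EBIC tuning parameter gamma, p_v observed variables.
ebic <- function(LL, k, n, gamma, p_v) {
  -2 * LL + k * log(n) + 4 * k * gamma * log(p_v)
}

# Hypothetical fit: log-likelihood -1500, 8 effective parameters,
# n = 500 observations, 5 observed variables, gamma = 0.5:
ebic(LL = -1500, k = 8, n = 500, gamma = 0.5, p_v = 5)
```

Note that setting gamma = 0 recovers the ordinary BIC, consistent with the criterion = "bic" option above.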

The returned model stores metadata about the search in model@optim$lambda_search, a list containing: criterion, gamma, best_lambda, best_criterion, beta_min, lambda_max, nLambda, and n_thresholded.

References

Foygel, R., & Drton, M. (2010). Extended Bayesian Information Criteria for Gaussian Graphical Models. In Advances in Neural Information Processing Systems (Vol. 23).

See Also

penalize for manually setting penalty parameters, runmodel for running models (calls find_penalized_lambda automatically when auto-select penalties are present), refit for post-selection inference after penalized estimation.

Examples

# \donttest{
# Load bfi data from psych package:
library("psychTools")
library("dplyr")
data(bfi)

ConsData <- bfi %>%
  select(A1:A5) %>%
  na.omit

vars <- names(ConsData)

# Automatic lambda selection (called implicitly by runmodel):
mod <- ggm(ConsData, vars = vars, estimator = "PML")
mod <- mod %>% runmodel  # automatically searches for best lambda

# Check lambda search results:
mod@optim$lambda_search

# Post-selection refit for standard errors:
mod_refit <- mod %>% refit
mod_refit %>% parameters

# Direct call with custom criterion:
mod2 <- ggm(ConsData, vars = vars, estimator = "PML")
mod2 <- find_penalized_lambda(mod2, criterion = "ebic1",
                              nLambda = 100)
# }