
grpnet (version 1.0)

family.grpnet: Prepare 'family' Argument for grpnet

Description

Takes in the family argument from grpnet and returns a list containing the information needed for fitting and/or tuning the model.

Usage

family.grpnet(object, theta = 1)

Value

List with components:

family

same as input object, i.e., character specifying the family

linkinv

function for computing inverse of link function

dev.resids

function for computing deviance residuals
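
For illustration, a minimal sketch of inspecting these components (assuming, as with base R family objects, that linkinv takes the linear predictor as input):

fam <- family.grpnet("poisson")
fam$family       # "poisson"
fam$linkinv(0)   # inverse of the log link: exp(0) = 1
str(fam)         # view all components of the returned list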

Arguments

object

two options: (1) an object of class "grpnet" or "cv.grpnet"; or (2) a character specifying the exponential family: "gaussian", "multigaussian", "svm1", "svm2", "logit", "binomial", "multinomial", "poisson", "negative.binomial", "Gamma", "inverse.gaussian"

theta

positive scalar that serves as an additional hyperparameter for various loss functions.

svm1: additional parameter that controls the smoothing rate for the hinge loss function (see Note below).

negative.binomial: size parameter such that the variance function is defined as \(V(\mu) = \mu + \mu^2/ \theta\)
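
For illustration, a minimal sketch of how theta enters the negative binomial variance function (plain R; nb_var is a hypothetical helper, not part of grpnet):

nb_var <- function(mu, theta) mu + mu^2 / theta   # V(mu) = mu + mu^2 / theta
nb_var(mu = 5, theta = 10)                        # 5 + 25/10 = 7.5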

Author

Nathaniel E. Helwig <helwig@umn.edu>

Details

There is only one available link function for each family:
* gaussian (identity): \(\mu = \mathbf{X}^\top \boldsymbol\beta\)
* multigaussian (identity): \(\mu = \mathbf{X}^\top \boldsymbol\beta\)
* svm1/svm2 (identity): \(\mu = \mathbf{X}^\top \boldsymbol\beta\)
* binomial/logit (logit): \(\log(\frac{\pi}{1 - \pi}) = \mathbf{X}^\top \boldsymbol\beta\)
* multinomial (symmetric): \(\pi_\ell = \frac{\exp(\mathbf{X}^\top \boldsymbol\beta_\ell)}{\sum_{l = 1}^m \exp(\mathbf{X}^\top \boldsymbol\beta_l)}\)
* poisson (log): \(\log(\mu) = \mathbf{X}^\top \boldsymbol\beta\)
* negative.binomial (log): \(\log(\mu) = \mathbf{X}^\top \boldsymbol\beta\)
* Gamma (log): \(\log(\mu) = \mathbf{X}^\top \boldsymbol\beta\)
* inverse.gaussian (log): \(\log(\mu) = \mathbf{X}^\top \boldsymbol\beta\)
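
As a quick check of the logit link, the following sketch compares the inverse link returned by family.grpnet("binomial") with the closed form \(\pi = 1/(1 + \exp(-\eta))\) (again assuming linkinv takes the linear predictor, as in base R families):

fam <- family.grpnet("binomial")
eta <- c(-2, 0, 2)
fam$linkinv(eta)      # inverse logit of eta
1 / (1 + exp(-eta))   # 0.1192029 0.5000000 0.8807971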

References

Helwig, N. E. (2025). Versatile descent algorithms for group regularization and variable selection in generalized linear models. Journal of Computational and Graphical Statistics, 34(1), 239-252. doi:10.1080/10618600.2024.2362232

See Also

visualize.loss for plotting loss functions

grpnet for fitting group elastic net regularization paths

cv.grpnet for k-fold cross-validation of lambda

Examples

family.grpnet("gaussian")

family.grpnet("multigaussian")

family.grpnet("svm1", theta = 0.1)

family.grpnet("svm2")

family.grpnet("logit")

family.grpnet("binomial")

family.grpnet("multinomial")

family.grpnet("poisson")

family.grpnet("negative.binomial", theta = 10)

family.grpnet("Gamma")

family.grpnet("inverse.gaussian")
