
msaenet (version 2.1)

aenet: Adaptive Elastic-Net

Description

Adaptive elastic-net model fitting.

Usage

aenet(
  x,
  y,
  family = c("gaussian", "binomial", "poisson", "multinomial", "cox", "mgaussian"),
  init = c("enet", "ridge"),
  nfolds = 5L,
  alphas = seq(0.05, 0.95, 0.05),
  gamma = 1,
  rule = c("lambda.min", "lambda.1se"),
  seed = 1001,
  parallel = FALSE,
  verbose = FALSE
)

Arguments

x
Data matrix.
y
Response vector.
family
Response type. See glmnet for details.
init
Type of the penalty used in the initial estimation step. Can be "enet" or "ridge".
nfolds
Number of cross-validation folds.
alphas
Vector of candidate alphas to use in cv.glmnet.
gamma
Scaling factor for adaptive weights: weights = coefs^(-gamma). A sketch illustrating this rule follows the argument list.
rule
Model selection criterion, "lambda.min" or "lambda.1se". See cv.glmnet for details.
seed
Random seed for cross-validation fold division.
parallel
Logical. Whether to enable parallel parameter tuning; default is FALSE. To enable parallel tuning, load the doParallel package and run registerDoParallel() with the number of CPU cores before calling this function, as sketched at the end of the Examples section.
verbose
Should we print out the estimation progress?
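
The weighting rule for gamma can be illustrated with plain glmnet calls. The following is a minimal sketch, not the package's exact implementation: it assumes the adaptive weights derived from an initial elastic-net fit enter a second fit through glmnet's penalty.factor argument, and the small offset added before exponentiation (to avoid infinite weights for zero coefficients) is an arbitrary choice for illustration.

library("glmnet")

# Hypothetical illustration of weights = coefs^(-gamma):
# fit an initial elastic-net, turn its coefficients into adaptive
# weights, then refit with those weights as penalty factors.
set.seed(1001)
x0 = matrix(rnorm(100 * 20), 100, 20)
y0 = rnorm(100)

init.fit  = cv.glmnet(x0, y0, alpha = 0.5)
init.coef = abs(as.vector(as.matrix(coef(init.fit, s = "lambda.min"))))[-1]

gamma = 1
w = (init.coef + 1e-4)^(-gamma)  # small offset avoids infinite weights

adaptive.fit = cv.glmnet(x0, y0, alpha = 0.5, penalty.factor = w)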

Value

A list containing the estimated coefficients (beta) and the fitted glmnet model object (model).

References

Zou, Hui, and Hao Helen Zhang. (2009). On the adaptive elastic-net with a diverging number of parameters. The Annals of Statistics 37(4), 1733--1751.

Examples

library("msaenet")

# simulate Gaussian data: 500 predictors, the first 5 with true effects
dat = msaenet.sim.gaussian(n = 150, p = 500, rho = 0.6,
                           coef = rep(1, 5), snr = 2, p.train = 0.7,
                           seed = 1001)

# fit the adaptive elastic-net on the training set
aenet.fit = aenet(dat$x.tr, dat$y.tr,
                  alphas = seq(0.2, 0.8, 0.2), seed = 1002)

print(aenet.fit)
msaenet.nzv(aenet.fit)          # indices of selected (non-zero) variables
msaenet.fp(aenet.fit, 1:5)      # number of false positive selections
msaenet.tp(aenet.fit, 1:5)      # number of true positive selections

# predict on the test set and evaluate
aenet.pred = predict(aenet.fit, dat$x.te)
msaenet.rmse(dat$y.te, aenet.pred)
plot(aenet.fit)
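
As noted for the parallel argument, a parallel backend must be registered before parallel tuning takes effect. A minimal sketch, assuming the doParallel package is installed (the worker count of 2 is arbitrary):

library("doParallel")
registerDoParallel(2)  # register a parallel backend with 2 workers

# same fit as above, with the candidate alphas tuned in parallel
aenet.fit.par = aenet(dat$x.tr, dat$y.tr,
                      alphas = seq(0.2, 0.8, 0.2),
                      seed = 1002, parallel = TRUE)

stopImplicitCluster()  # release the workers when done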
