lassoshooting (version 0.1.5-1.1)

lassoshooting: Lasso Shooting

Description

Efficient estimates of sparse regression coefficients with a lasso (L1) penalty

Usage

lassoshooting(X=NULL, y=NULL, lambda, XtX=NULL, Xty=NULL, thr=1.0e-6,
              maxit=1e4, nopenalize=NULL, penaltyweight=NULL, trace=0, ...)

Arguments

X

Design matrix: N by p matrix of p explanatory variables

y

Response vector of length N (a single response variable)

XtX

X'X (p by p); may be supplied together with X'y in place of X and y (see the sketch at the end of this argument list)

Xty

X'y (length p); may be supplied together with X'X in place of X and y

lambda

(Non-negative) regularization parameter for lasso. lambda=0 means no regularization.

thr

Threshold for convergence. Iterations stop when the maximum absolute parameter change is less than thr. Default is 1e-6 (see Usage)

maxit

Maximum number of iterations of the outer loop. Default is 10,000

nopenalize

Indices of coefficients that should not be penalized; indexing starts at 0, so 0 refers to the first column of X

penaltyweight

Vector of p weights, one per variable; each weight is multiplied by the overall lambda penalty

trace

Level of detail for printing out information as iterations proceed. Default 0 -- no information

...

Reserved for experimental options
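
As noted for XtX and Xty above, the cross-products can be precomputed and passed in place of X and y, which avoids recomputing them when fitting several values of lambda on the same data. A minimal sketch of this usage (the simulated X, y and the value of lambda here are illustrative, not part of the package documentation):

library(lassoshooting)
set.seed(1)
X <- matrix(rnorm(50*5), 50, 5)        # small simulated design matrix
y <- X %*% rnorm(5) + rnorm(50)        # simulated response

XtX <- crossprod(X)                    # X'X, p by p
Xty <- crossprod(X, y)                 # X'y, length p

fit1 <- lassoshooting(X=X, y=y, lambda=5)
fit2 <- lassoshooting(XtX=XtX, Xty=Xty, lambda=5)
all.equal(fit1$coefficients, fit2$coefficients)   # the two calls should agree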

Value

A list with components

coefficients

Estimated regression coefficient vector

iterations

Number of outer-loop iterations used by the algorithm

delta

Change in parameter value at convergence

infnorm

||X'y||_inf, the infinity norm of X'y
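
The components are ordinary list elements; a brief illustration (fit is assumed to be the result of a call such as those in the sketch above or the Examples below):

fit <- lassoshooting(X=X, y=y, lambda=5)
fit$coefficients   # estimated coefficient vector
fit$iterations     # outer-loop iterations used
fit$delta          # parameter change at convergence
fit$infnorm        # infinity norm of X'y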

Details

Estimates a sparse regression coefficient vector under a lasso (L1) penalty using cyclic coordinate descent (the "shooting" algorithm). See the references for details.

The solver does NOT include an intercept; add a column of ones to X if your data are not centered.
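
For intuition, a naive R version of the coordinate descent ("shooting") update is sketched below. It is an illustration only, not the package's internal implementation, and the scaling of lambda relative to the residual sum of squares may differ from the package's convention:

softthresh <- function(z, g) sign(z) * pmax(abs(z) - g, 0)

lasso_cd <- function(X, y, lambda, thr=1e-6, maxit=1e4) {
  p <- ncol(X)
  XtX <- crossprod(X)
  Xty <- crossprod(X, y)
  beta <- rep(0, p)
  for (it in 1:maxit) {
    maxchange <- 0
    for (j in 1:p) {
      # fit term for coordinate j, holding all other coefficients fixed
      zj <- Xty[j] - sum(XtX[j, -j] * beta[-j])
      bj <- softthresh(zj, lambda) / XtX[j, j]
      maxchange <- max(maxchange, abs(bj - beta[j]))
      beta[j] <- bj
    }
    if (maxchange < thr) break    # converged: largest coefficient change below thr
  }
  list(coefficients=beta, iterations=it, delta=maxchange)
}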

References

Rebecka Jörnsten, Tobias Abenius, Teresia Kling, Linnéa Schmidt, Erik Johansson, Torbjörn Nordling, Bodil Nordlander, Chris Sander, Peter Gennemark, Keiko Funa, Björn Nilsson, Linda Lindahl, Sven Nelander. (2011) Network modeling of the transcriptional effects of copy number aberrations in glioblastoma. Molecular Systems Biology 7 (to appear)

Friedman J, Hastie T, et al. (2007) Pathwise coordinate optimization. Ann Appl Stat 1: 302--332

Fu WJ (1998) Penalized regressions: the bridge versus the lasso. J Comput Graph Statist 7: 397--416

Examples

set.seed(42)

n <- 100
p <- 10
b <- rep(3, p)                        # true coefficient vector
X <- matrix(rnorm(n*p), n, p)
noise <- as.matrix(rnorm(n, sd=0.1))
y <- X %*% b + noise

require(lassoshooting)
# FIXME: write proper example using R built in dataset
#add intercept column to the design matrix
Xdesign <- cbind(1,X)
lambda <- 20
#don't penalize the intercept
bhat <- lassoshooting(X=Xdesign, y=y, lambda=lambda, nopenalize=0)$coefficients

# the call above is equivalent to the call below (intercept weight 0, all other weights 0.5)
bhat1 <- lassoshooting(X=Xdesign, y=y, lambda=2*lambda,
                       penaltyweight=c(0, rep(0.5, p)))$coefficients

T1 <- all(abs(bhat1-bhat) < 1e-20)

w <- 10   # heavier penalty weight for the last five variables
bhat2 <- lassoshooting(X=Xdesign, y=y, lambda=lambda,
                       penaltyweight=c(0, 1, 1, 1, 1, 1, w, w, w, w, w))$coefficients

T2 <- all(bhat2[2:6] > bhat2[7:11])
T1 && T2
  