
DESP (version 0.2-2)

sqR_Lasso: computation of the beta that minimizes |Y-X*beta|_2 + lambda*|beta|_1 (square-root Lasso)

Description

This function estimates the vector of regression coefficients under a sparsity constraint, using the square-root Lasso. That is, it computes the $\beta$ that minimizes $${\|Y-X \beta\|}_2 + \lambda {\|\beta\|}_1.$$

Usage

sqR_Lasso(X, Y, lambda, solver = 'CD', sto = '0')

Arguments

X
The matrix of explanatory variables (must be a double-precision matrix).
Y
The response variable.
lambda
The penalization parameter.
solver
A string indicating the solver to use; see Details for the available solvers.

The default is "CD" (coordinate descent).

sto
Indicates whether a randomized algorithm (stochastic coordinate descent) has to be used when the coordinate descent method is chosen. The default is '0', meaning that the coordinates are updated cyclically, in the order in which the corresponding variables appear in X. With '2', all coordinates are updated at each pass, but in a uniformly random order. With '1' (experimental), a single coordinate, chosen uniformly at random, is updated at each iteration.
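For instance, the three coordinate-update strategies can be selected as follows (a sketch; X, Y and lambda are assumed to be defined as in the Examples section below):

```r
library(DESP)
## '0' (default): cyclic updates, in the column order of X
beta_cyclic <- sqR_Lasso(X, Y, lambda, solver = "CD", sto = "0")
## '2': all coordinates updated at each pass, in uniformly random order
beta_perm   <- sqR_Lasso(X, Y, lambda, solver = "CD", sto = "2")
## '1' (experimental): one uniformly random coordinate per iteration
beta_single <- sqR_Lasso(X, Y, lambda, solver = "CD", sto = "1")
```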

Value

The coefficient vector.

Details

This method can use the MOSEK solver, the Gurobi solver, the SCS solver, or (by default) coordinate descent.
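Switching solvers is a matter of changing the `solver` argument; a sketch, with X, Y and lambda as in the Examples section below. The "mosek" and "gurobi" values are assumptions based on the See Also list, and those solvers require the corresponding R interface packages and licenses to be installed:

```r
library(DESP)
beta_cd  <- sqR_Lasso(X, Y, lambda, solver = "CD")   ## coordinate descent (default)
beta_scs <- sqR_Lasso(X, Y, lambda, solver = "SCS")  ## SCS conic solver
## beta_msk <- sqR_Lasso(X, Y, lambda, solver = "mosek")   ## requires Rmosek
## beta_grb <- sqR_Lasso(X, Y, lambda, solver = "gurobi")  ## requires gurobi
```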

See Also

mosek, gurobi, scsSOCP

Examples

## set the design matrix
X <- matrix(c(1,0,2,2,1,0,-1,1,1,2,0,1),4,3,byrow=TRUE)
## set the vector of observations
Y <- c(1,0,2,1)
## set the penalty level
lambda <- 1
## compute the square-root Lasso estimate using SCS
## get beta, the vector of the coefficients of regression
sqR_Lasso(X, Y, lambda, solver="SCS")
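As a sanity check, the square-root Lasso objective from the Description can be evaluated at the returned vector (a sketch; `beta_hat` is a hypothetical name for the result of the call above):

```r
library(DESP)
beta_hat <- sqR_Lasso(X, Y, lambda, solver = "SCS")
## square-root Lasso objective: ||Y - X beta||_2 + lambda * ||beta||_1
obj <- sqrt(sum((Y - X %*% beta_hat)^2)) + lambda * sum(abs(beta_hat))
obj
```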
