selectiveInference (version 1.2.5)

lar: Least angle regression

Description

This function implements least angle regression, for use in the selectiveInference package.

Usage

lar(x, y, maxsteps=2000, minlam=0, intercept=TRUE, normalize=TRUE,
    verbose=FALSE)

Arguments

x

Matrix of predictors (n by p)

y

Vector of outcomes (length n)

maxsteps

Maximum number of steps to take

minlam

Minimum value of lambda to consider

intercept

Should an intercept be included in the model? Default is TRUE

normalize

Should the predictors be normalized? Default is TRUE

verbose

Print out progress along the way? Default is FALSE
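
For illustration, a call that overrides several of the defaults above might look as follows (a sketch only; x and y are assumed to be a numeric predictor matrix and response vector as described above):

larfit = lar(x, y, maxsteps=20, minlam=0.1, intercept=TRUE,
             normalize=TRUE, verbose=TRUE)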

Value

lambda

Values of lambda (knots) visited along the path

action

Vector of predictors in order of entry

sign

Signs of coefficients of predictors, upon entry

df

Degrees of freedom of each active model

beta

Matrix of regression coefficients for each model along the path, one model per column

completepath

Was the complete least angle regression path computed?

bls

If completepath is TRUE, the full least squares coefficients

Gamma

Matrix that captures the polyhedral selection at each step

nk

Number of polyhedral constraints at each step in the path

vreg

Matrix of linear contrasts giving the coefficients of the variables as they enter along the path

mp

Value of M+ (for internal use with the spacing test)

x

Matrix of predictors used

y

Vector of outcomes used

bx

Vector of column means of original x

by

Mean of original y

sx

Norm of each column of original x

intercept

Was an intercept included?

normalize

Were the predictors normalized?

call

The call to lar

Details

The least angle regression algorithm is described in detail by Efron et al. (2004). This function should match, in terms of its output, the implementation in the lars package, but it returns additional information (namely, the polyhedral constraints) needed for the selective inference calculations.
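
For example, the polyhedral constraints are returned in the Gamma and nk components described above. A minimal sketch of inspecting them on simulated data (illustrative only; not part of the original documentation):

set.seed(1)
x = matrix(rnorm(50*10), 50, 10)
y = x[,1] + rnorm(50)
fit = lar(x, y, maxsteps=5)
fit$action      # order in which predictors entered the model
fit$lambda      # knot values of lambda along the path
dim(fit$Gamma)  # stacked polyhedral constraint matrix
fit$nk          # number of constraints accumulated at each step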

References

Bradley Efron, Trevor Hastie, Iain Johnstone, and Robert Tibshirani (2004). Least angle regression. Annals of Statistics, 32(2): 407-499 (with discussion).

See also the descriptions in Trevor Hastie, Robert Tibshirani, and Jerome Friedman (2001, 2009). The Elements of Statistical Learning. Springer.

See Also

larInf, predict.lar, coef.lar, plot.lar

Examples

set.seed(43)
n = 50
p = 10
sigma = 1
x = matrix(rnorm(n*p),n,p)
beta = c(3,2,rep(0,p-2))
y = x%*%beta + sigma*rnorm(n)

# run LAR, plot results
larfit = lar(x,y)
plot(larfit)

# compute sequential p-values and confidence intervals
# (sigma estimated from full model)
out = larInf(larfit)
out
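
# A further sketch (not in the original example): extract coefficients and
# predictions along the path via the coef.lar and predict.lar methods listed
# under See Also; the s and mode arguments used here are assumptions -- check
# those methods' own help pages for the exact interface.
coef(larfit, s=2, mode="step")
predict(larfit, newx=x, s=2, mode="step")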