
liso (version 0.2)

liso.backfit: Function to fit penalized additive isotonic models

Description

Fits penalized additive isotonic models using a total variation penalty.

Usage

liso.backfit(x, y, lambda = 0, givebeta = FALSE, tol.target = 1e-04, weights = rep(1, length(y)), covweights = rep(1, ncol(x)), feed, trace = FALSE, monotone = TRUE, randomise = FALSE, huber = Inf)

Arguments

x
Design matrix (without intercept).
y
Response vector.
lambda
Value of the penalty parameter lambda. Can be either a single value or a vector, in which case the calculations are done sequentially, using the previous calculation as the feed input.
givebeta
If TRUE, output the result as a coefficient vector instead of a multistep object.
tol.target
Threshold at which Liso loss change is considered small enough for convergence.
weights
Observation weights. Should be a vector of length equal to the number of observations.
covweights
Covariate weights. Should be a vector of length equal to the number of covariates, or more if different weights are to be applied to positive and negative fits of non-monotone components.
feed
Initial values for backfitting calculation. By default, the zero fit is used. Any multistep fit can be used instead.
trace
If TRUE, print diagnostic information as calculation is done.
monotone
Monotonicity pattern. Can be a single value, or a vector of length equal to the number of covariates. Takes values -1, 0, or 1, indicating monotonically decreasing, non-monotone, and monotonically increasing fits, respectively.
randomise
If TRUE, randomly permute the order of backfitting in each cycle. Usually slower, but possibly more stable.
huber
If less than Inf, the Huberization parameter for Huberized Liso. (Experimental.)
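
The monotone argument can mix patterns across covariates. A minimal sketch, assuming the liso package is installed (the data here are illustrative, not from the package):

```r
## Sketch: mixing monotonicity patterns across covariates
## (assumes the liso package is installed)
library(liso)

set.seed(1)
x <- matrix(runif(200 * 3, min = -1, max = 1), ncol = 3)
y <- x[, 1] - 2 * x[, 2] + rnorm(200, sd = 0.3)

## Increasing fit for column 1, decreasing for column 2,
## non-monotone for column 3
fit <- liso.backfit(x, y, lambda = 1,
                    monotone = c(1, -1, 0),
                    weights = rep(1, nrow(x)))
```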

Value

  • With a single value of lambda, a lisofit object is returned, which inherits from class multistep. With more than one value, a list of lisofit objects is returned. plot, summary, print, `*` and other methods exist.
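
The returned object supports the methods listed above. A minimal sketch, assuming the liso package is installed:

```r
## Sketch: using methods on a lisofit object
## (assumes the liso package is installed)
library(liso)

set.seed(2)
x <- matrix(runif(100 * 2, min = -1, max = 1), ncol = 2)
y <- x[, 1]^3 + rnorm(100, sd = 0.2)

fit <- liso.backfit(x, y, lambda = 0.5)  # single lambda: one lisofit object
print(fit)       # print method
summary(fit)     # summary method
plot(fit)        # plot the fitted component functions
yhat <- fit * x  # `*` method gives fitted values for a design matrix
```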

References

Zhou Fang and Nicolai Meinshausen (2009), Liso for High Dimensional Additive Isotonic Regression, available at http://blah.com

See Also

cv.liso

Examples

## Use the method on a simulated data set

set.seed(79)
n <- 100; p <- 50

## Simulate design matrix and response
x <- matrix(runif(n * p, min = -2.5, max = 2.5), nrow = n, ncol = p)
y <- scale(3 * (x[, 1] < 0), scale = FALSE) + x[, 2]^3 + rnorm(n)

## Try lambda = 2, then lambda = 1
fits <- liso.backfit(x, y, c(2, 1), monotone = c(-1, rep(1, 49)))

## Plot the result for the second lambda value (lambda = 1)
plot(fits[[2]])

## Plot fitted values against y
plot(y, fits[[2]] * x)
