Local Polynomial Regression Fitting
Fit a polynomial surface determined by one or more numerical predictors, using local fitting.
loess(formula, data, weights, subset, na.action, model = FALSE,
      span = 0.75, enp.target, degree = 2, parametric = FALSE,
      drop.square = FALSE, normalize = TRUE,
      family = c("gaussian", "symmetric"),
      method = c("loess", "model.frame"),
      control = loess.control(...), ...)
- formula: a formula specifying the numeric response and one to four numeric predictors (best specified via an interaction, but can also be specified additively). Will be coerced to a formula if necessary.
- data: an optional data frame, list or environment (or object coercible by as.data.frame to a data frame) containing the variables in the model. If not found in data, the variables are taken from environment(formula), typically the environment from which loess is called.
- weights: optional weights for each case.
- subset: an optional specification of a subset of the data to be used.
- na.action: the action to be taken with missing values in the response or predictors. The default is given by getOption("na.action").
- model: should the model frame be returned?
- span: the parameter $\alpha$ which controls the degree of smoothing.
- enp.target: an alternative way to specify span, as the approximate equivalent number of parameters to be used.
- degree: the degree of the polynomials to be used, normally 1 or 2. (Degree 0 is also allowed, but see the 'Note' below.)
- parametric: should any terms be fitted globally rather than locally? Terms can be specified by name, number or as a logical vector of the same length as the number of predictors.
- drop.square: for fits with more than one predictor and degree = 2, should the quadratic term be dropped for particular predictors? Terms are specified in the same way as for parametric.
- normalize: should the predictors be normalized to a common scale if there is more than one? The normalization used is to set the 10% trimmed standard deviation to one. Set to false for spatial coordinate predictors and others known to be on a common scale.
- family: if "gaussian" fitting is by least-squares, and if "symmetric" a re-descending M estimator is used with Tukey's biweight function. Can be abbreviated.
- method: fit the model or just extract the model frame. Can be abbreviated.
- control: control parameters: see loess.control.
- ...: control parameters can also be supplied directly (if control is not specified).
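For example (a sketch using the built-in cars data set), enp.target can be given in place of span, and control parameters such as surface can be passed through ... when control is not specified:

```r
# enp.target: smoothing specified as ~5 equivalent parameters instead of span
fit_enp <- loess(dist ~ speed, cars, enp.target = 5)

# a control parameter supplied directly; forwarded to loess.control()
fit_dir <- loess(dist ~ speed, cars, surface = "direct")
```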
Fitting is done locally. That is, for the fit at point $x$, the fit is made using points in a neighbourhood of $x$, weighted by their distance from $x$ (with differences in 'parametric' variables being ignored when computing the distance). The size of the neighbourhood is controlled by $\alpha$ (set by span or enp.target). For $\alpha < 1$, the neighbourhood includes proportion $\alpha$ of the points, and these have tricubic weighting (proportional to $(1 - (\mathrm{dist}/\mathrm{maxdist})^3)^3$). For $\alpha > 1$, all points are used, with the 'maximum distance' assumed to be $\alpha^{1/p}$ times the actual maximum distance for $p$ explanatory variables.
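The tricubic weighting above can be sketched directly (a minimal illustration of the weight function, not the internal C implementation used by loess):

```r
# Tricube weights: proportional to (1 - (dist/maxdist)^3)^3,
# falling to zero at the edge of the neighbourhood.
tricube <- function(dist, maxdist) {
  u <- pmin(abs(dist) / maxdist, 1)
  (1 - u^3)^3
}

x  <- c(1, 2, 3, 5, 9)
x0 <- 3                                  # point at which the local fit is made
w  <- tricube(x - x0, max(abs(x - x0)))
# the point at x0 itself gets weight 1; the farthest point gets weight 0
```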
For the default family, fitting is by (weighted) least squares. For
family="symmetric" a few iterations of an M-estimation
procedure with Tukey's biweight are used. Be aware that as the initial
value is the least-squares fit, this need not be a very resistant fit.
It can be important to tune the control list to achieve acceptable speed: see loess.control for details.
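A sketch of the two fitting modes side by side (using the built-in cars data with an artificially injected outlier; the object names fit_ls and fit_rob are illustrative):

```r
d <- cars
d$dist[10] <- 200                                        # inject an outlier

fit_ls  <- loess(dist ~ speed, d)                        # default: (weighted) least squares
fit_rob <- loess(dist ~ speed, d, family = "symmetric")  # M-estimation, Tukey's biweight
```

Because the robust iterations start from the least-squares fit, a gross enough outlier can still leave its mark on the "symmetric" fit.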
An object of class "loess".
As this is based on cloess, it is similar to but not identical to the loess function of S. In particular, conditioning is not implemented.
The memory usage of this implementation of
loess is roughly
quadratic in the number of points, with 1000 points taking about 10Mb.
degree = 0, local constant fitting, is allowed in this
implementation but not documented in the reference. It seems very little
tested, so use with caution.
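Given that caution, one quick sanity check is to compare a degree = 0 fit against a local linear fit on the same data (illustrative sketch):

```r
fit0 <- loess(dist ~ speed, cars, degree = 0)  # local constant fitting
fit1 <- loess(dist ~ speed, cars, degree = 1)  # local linear, for comparison
# both return "loess" objects with one fitted value per observation
```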
The 1998 version of the cloess package of Cleveland, Grosse and Shyu. A later version is available as dloess.
W. S. Cleveland, E. Grosse and W. M. Shyu (1992) Local regression models. Chapter 8 of Statistical Models in S eds J.M. Chambers and T.J. Hastie, Wadsworth & Brooks/Cole.
lowess, the ancestor of loess (with different defaults!).
cars.lo <- loess(dist ~ speed, cars)
predict(cars.lo, data.frame(speed = seq(5, 30, 1)), se = TRUE)
# to allow extrapolation
cars.lo2 <- loess(dist ~ speed, cars,
                  control = loess.control(surface = "direct"))
predict(cars.lo2, data.frame(speed = seq(5, 30, 1)), se = TRUE)