lmrob(formula, data, subset, weights, na.action, method = "MM",
      model = TRUE, x = !control$compute.rd, y = FALSE,
      singular.ok = TRUE, contrasts = NULL, offset = NULL,
      control = NULL, init = NULL, ...)
Arguments

data: an optional data frame, list or environment (or object coercible by
    as.data.frame to a data frame) containing the variables in the model.
    If not found in data, the variables are taken from environment(formula),
    typically the environment from which lmrob is called.

na.action: a function which indicates what should happen when the data
    contain NAs.  The default is set by the na.action setting of options,
    and is na.fail if that is unset.

method: string specifying the estimator-chain.  MM is interpreted as SM.
    See Details, notably the currently recommended setting = "KS2014".

model, x, y: logicals.  If TRUE the corresponding components of the fit
    (the model frame, the model matrix, the response) are returned.

singular.ok: logical.  If FALSE (the default in S but not in R) a singular
    fit is an error.

contrasts: an optional list.  See the contrasts.arg of model.matrix.default.

offset: this can be used to specify an a priori known component to be
    included in the linear predictor during fitting.  An offset term can be
    included in the formula instead or as well, and if both are specified
    their sum is used.

control: a list specifying control parameters; use the function
    lmrob.control(.) and see its help page.

...: additional arguments can be used to specify control parameters
    directly instead of (but not in addition to) via control.
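For illustration, control parameters are usually collected with lmrob.control() and passed as a list via the control argument. The following minimal sketch is not part of the original page; it assumes the robustbase package and its coleman data (used in the Examples below), and the object names ctrl and fmc are ours:

require("robustbase")
data(coleman)
## build a control list: recommended setting, allow more IRWLS iterations
ctrl <- lmrob.control(setting = "KS2014", max.it = 500)
fmc  <- lmrob(Y ~ ., data = coleman, control = ctrl)
fmc$converged   # TRUE if the IRWLS iterations converged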
Value

An object of class lmrob; a list including the following components:

converged: TRUE if the IRWLS iterations have converged.

init.S: the list returned by lmrob.S or lmrob.M.S (for MM-estimates only).

na.action: (where relevant) information returned by model.frame on the
    special handling of NAs.

terms: the terms object used.

In addition, non-null fits will have components assign and qr relating to
the linear fit, for use by extractor functions such as summary.
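As a brief sketch of how such components can be accessed (not part of the original page; it again assumes the robustbase package and its coleman data):

require("robustbase")
data(coleman)
fm <- lmrob(Y ~ ., data = coleman)  # default MM- (i.e. SM-) estimate
fm$converged     # TRUE if the IRWLS iterations converged
fm$scale         # the robust scale estimate used in the M step
names(fm)        # all returned components, including terms, assign and qr
summary(fm)      # extractor function relying on these components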
References

Koller, M. (2012), Nonsingular subsampling for S-estimators with categorical predictors, ArXiv e-prints, arXiv:1208.5595v1.
Koller, M. and Stahel, W.A. (2011), Sharpening Wald-type inference in robust regression for small samples, Computational Statistics & Data Analysis 55(8), 2504--2515.
Maronna, R.A. and Yohai, V.J. (2000), Robust regression with both continuous and categorical predictors, Journal of Statistical Planning and Inference 89, 197--214.
Rousseeuw, P.J. and Yohai, V.J. (1984), Robust regression by means of S-estimators, in Robust and Nonlinear Time Series, J. Franke, W. Härdle and R.D. Martin (eds.), Lecture Notes in Statistics 26, 256--272, Springer Verlag, New York.
Salibian-Barrera, M. and Yohai, V.J. (2006), A fast algorithm for S-regression estimates, Journal of Computational and Graphical Statistics 15(2), 414--427.
Yohai, V.J. (1987), High breakdown-point and high efficiency estimates for regression, The Annals of Statistics 15, 642--656.
See Also

lmrob.control; for the algorithms lmrob.S, lmrob.M.S and lmrob.fit; and for
methods, summary.lmrob (for the extra statistics it provides), predict.lmrob,
print.lmrob, plot.lmrob, and weights.lmrob.

Examples

data(coleman)
set.seed(0)
## Default for a very long time:
summary( m1 <- lmrob(Y ~ ., data=coleman) )
## Nowadays strongly recommended for routine use:
summary( m2 <- lmrob(Y ~ ., data=coleman, setting = "KS2014") )
plot(residuals(m2) ~ weights(m2, type="robustness")) ##-> weights.lmrob()
abline(h=0, lty=3)
data(starsCYG, package = "robustbase")
## Plot simple data and fitted lines
plot(starsCYG)
lmST <- lm(log.light ~ log.Te, data = starsCYG)
(RlmST <- lmrob(log.light ~ log.Te, data = starsCYG))
abline(lmST, col = "red")
abline(RlmST, col = "blue")
summary(RlmST)
vcov(RlmST)
stopifnot(all.equal(fitted(RlmST),
                    predict(RlmST, newdata = starsCYG),
                    tolerance = 1e-14))
## --- init argument
## string
set.seed(0)
m3 <- lmrob(Y ~ ., data=coleman, init = "S")
stopifnot(all.equal(m1[-18], m3[-18]))
## function
initFun <- function(x, y, control, mf) {
  init.S <- lmrob.S(x, y, control)
  list(coefficients = init.S$coef, scale = init.S$scale)
}
set.seed(0)
m4 <- lmrob(Y ~ ., data=coleman, method = "M", init = initFun)
## list
m5 <- lmrob(Y ~ ., data=coleman, method = "M",
            init = list(coefficients = m3$init$coef, scale = m3$scale))
stopifnot(all.equal(m4[-17], m5[-17]))