Implemented regression methods
rf(y, x, ...)
survforest(y, x, ...)
qrf(y, x, ...)
lrm(y, x, ...)
glrm(y, x, ...)
lasso(y, x, s = "lambda.min", ...)
ridge(y, x, s = "lambda.min", ...)
postlasso(y, x, s = "lambda.min", ...)
cox(y, x, ...)
tuned_rf(
y,
x,
max.depths = 1:5,
mtrys = list(1, function(p) ceiling(sqrt(p)), identity),
verbose = FALSE,
...
)
xgb(y, x, nrounds = 2L, verbose = 0L, ...)
tuned_xgb(
y,
x,
nfold,
folds,
etas = c(0.1, 0.5, 1),
max_depths = 1:5,
nrounds = c(2, 10, 50),
verbose = 0,
metrics = list("rmse"),
...
)
lgbm(y, x, nrounds = 100L, verbose = -1L, ...)
y: Vector (or matrix) of response values.

x: Design matrix of predictors.

...: Additional arguments passed to the underlying regression method.
In case of "rf", "tuned_rf", "survforest" and "qrf", this is
ranger. In case of "lasso" and "ridge", this is glmnet. In case
of "cox", this is coxph. In case of "xgb" and "tuned_xgb", this is
xgboost.

s: Which lambda to use for prediction, defaults to
"lambda.min". See cv.glmnet.

max.depths: Values for max.depth to tune out-of-bag. See
ranger.

mtrys: Values for mtry to tune out-of-bag. See
ranger.

nrounds, verbose: See xgboost.

nfold: Number of folds for cross-validation.

folds: Specify folds for cross-validation.

etas: Values for eta to cross-validate. See
xgboost.

max_depths: Values for max_depth to cross-validate. See
xgboost.

metrics: See xgboost.
The implemented choices are "rf" for random forests as implemented in
ranger, "lasso" for cross-validated Lasso regression (using the
one-standard-error rule), "ridge" for cross-validated ridge regression
(using the one-standard-error rule), "cox" for the Cox proportional
hazards model as implemented in survival, and "qrf" or "survforest"
for quantile and survival random forests, respectively. The
"postlasso" option refers to a cross-validated Lasso (using the
one-standard-error rule) and a subsequent OLS regression. The "lrm"
option implements a standard linear regression model. The "xgb" and
"tuned_xgb" options require the xgboost package.
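As a sketch (assuming the functions listed above are exported by the package and that ranger and glmnet are installed; the simulated data and the num.trees argument forwarded via ... are illustrative), the methods are called directly on a response vector and a design matrix:

```r
# Illustrative only: assumes the package exporting rf() and lasso() is attached.
set.seed(1)
n <- 100
x <- matrix(rnorm(n * 4), n, 4)
y <- x[, 1] + rnorm(n)

fit_rf <- rf(y, x, num.trees = 50)          # extra args forwarded to ranger
fit_lasso <- lasso(y, x, s = "lambda.min")  # lambda used for prediction

# Residuals are what the GCM/PCM tests consume
res <- residuals(fit_rf, response = y, data = x)
```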
The "tuned_rf" regression method tunes the mtry and
max.depth parameters in ranger out-of-bag.
The "tuned_xgb" regression method uses k-fold cross-validation to
tune the nrounds, eta and max_depth parameters in
xgb.cv.
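A hedged sketch of the tuned variants (the tuning grids below are illustrative, not package defaults; y and x are a response vector and design matrix as above, and tuned_xgb additionally requires xgboost):

```r
# Out-of-bag tuning of mtry and max.depth via ranger
fit_trf <- tuned_rf(y, x,
                    max.depths = 1:3,
                    mtrys = list(1, function(p) ceiling(sqrt(p))))

# k-fold cross-validation over eta, max_depth and nrounds via xgb.cv
fit_txgb <- tuned_xgb(y, x, nfold = 5,
                      etas = c(0.1, 0.3),
                      max_depths = 1:3,
                      nrounds = c(10, 50))
```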
New regression methods can also be implemented and supplied; they need
the following structure. A regression method "custom_reg" needs to take
arguments y, x, ..., fit the model using y and x as
matrices, and return an object of a user-specified class, for instance,
'custom'. For the GCM test, implementing a residuals.custom
method is sufficient; it should take arguments
object, response = NULL, data = NULL, .... For the PCM test, a
predict.custom method is necessary for out-of-sample prediction
and computation of residuals.
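A minimal sketch of a custom method following this contract (the method name, its OLS internals, and the data argument of predict.custom are illustrative assumptions, not part of the package):

```r
# Illustrative custom regression method: ordinary least squares
custom_reg <- function(y, x, ...) {
  fit <- stats::lm.fit(x = cbind(1, as.matrix(x)), y = as.matrix(y))
  structure(list(coef = fit$coefficients), class = "custom")
}

# Sufficient for the GCM test: residuals on (possibly new) data
residuals.custom <- function(object, response = NULL, data = NULL, ...) {
  response - cbind(1, as.matrix(data)) %*% object$coef
}

# Needed for the PCM test: out-of-sample prediction
predict.custom <- function(object, data = NULL, ...) {
  cbind(1, as.matrix(data)) %*% object$coef
}
```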