Fit an XGBoost model

Usage:
tl_fit_xgboost(
  data,
  formula,
  is_classification = FALSE,
  nrounds = 100,
  max_depth = 6,
  eta = 0.3,
  subsample = 1,
  colsample_bytree = 1,
  min_child_weight = 1,
  gamma = 0,
  alpha = 0,
  lambda = 1,
  early_stopping_rounds = NULL,
  nthread = NULL,
  verbose = 0,
  ...
)

Value:

A fitted XGBoost model.
Arguments:

data                   A data frame containing the training data.
formula                A formula specifying the model (response on the left, predictors on the right).
is_classification      Logical indicating whether this is a classification problem (default: FALSE).
nrounds                Number of boosting rounds (default: 100).
max_depth              Maximum depth of each tree (default: 6).
eta                    Learning rate (default: 0.3).
subsample              Subsample ratio of the training observations (default: 1).
colsample_bytree       Subsample ratio of columns when constructing each tree (default: 1).
min_child_weight       Minimum sum of instance weight needed in a child node (default: 1).
gamma                  Minimum loss reduction required to make a further partition on a leaf node (default: 0).
alpha                  L1 regularization term on weights (default: 0).
lambda                 L2 regularization term on weights (default: 1).
early_stopping_rounds  Stop training if the evaluation metric does not improve for this many rounds (default: NULL, no early stopping).
nthread                Number of threads (default: NULL, uses the maximum available).
verbose                Verbosity of training output (default: 0, silent).
...                    Additional arguments passed to xgb.train().
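A minimal usage sketch based on the signature above. The data set (`mtcars`), the response variables, and the specific hyperparameter values are illustrative assumptions, not part of this documentation:

```r
# Regression sketch: model mpg from all other columns of mtcars
# (assumes the package providing tl_fit_xgboost is loaded)
fit <- tl_fit_xgboost(
  data = mtcars,
  formula = mpg ~ .,
  nrounds = 200,              # more rounds paired with a smaller learning rate
  eta = 0.1,
  max_depth = 4,
  early_stopping_rounds = 10  # stop if the metric stalls for 10 rounds
)

# Classification sketch: derive a binary label and set is_classification = TRUE
mtcars$high_mpg <- as.factor(mtcars$mpg > 20)
clf <- tl_fit_xgboost(
  data = mtcars,
  formula = high_mpg ~ cyl + hp + wt,
  is_classification = TRUE
)
```

Arguments not recognized by the wrapper can be forwarded to xgb.train() through `...`.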