E2E (version 0.1.2)

gbm_dia: Train a Gradient Boosting Machine (GBM) Model for Classification

Description

Trains a Gradient Boosting Machine (GBM) model for binary classification using caret::train.
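The documented behavior can be sketched as a thin wrapper around caret::train. The following is an illustrative approximation only, not the actual gbm_dia source; it assumes the caret and gbm packages are installed, and the fixed hyperparameter values in the tune = FALSE branch are placeholders, not the package's real defaults.

```r
# Illustrative sketch only -- not the actual gbm_dia implementation.
# Assumes the caret and gbm packages are available.
library(caret)

gbm_dia_sketch <- function(X, y, tune = FALSE, cv_folds = 5, tune_length = 10) {
  ctrl <- trainControl(
    method = "cv",
    number = cv_folds,
    classProbs = TRUE,                      # enable class probabilities
    search = if (tune) "random" else "grid"
  )
  if (tune) {
    # Random search over interaction.depth, n.trees, shrinkage, etc.
    train(x = X, y = y, method = "gbm", trControl = ctrl,
          tuneLength = tune_length, verbose = FALSE)
  } else {
    # Fixed hyperparameters (placeholder values) when tuning is disabled
    grid <- expand.grid(interaction.depth = 3, n.trees = 100,
                        shrinkage = 0.1, n.minobsinnode = 10)
    train(x = X, y = y, method = "gbm", trControl = ctrl,
          tuneGrid = grid, verbose = FALSE)
  }
}
```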

Usage

gbm_dia(X, y, tune = FALSE, cv_folds = 5, tune_length = 10)

Value

A caret::train object representing the trained GBM model.

Arguments

X

A data frame of features.

y

A factor vector of class labels.

tune

Logical. If TRUE, performs hyperparameter tuning over interaction.depth, n.trees, and shrinkage; if FALSE, uses fixed values.

cv_folds

An integer, the number of cross-validation folds for caret.

tune_length

An integer, the number of random parameter combinations to try when tune = TRUE. Only used with random search (search = "random"). Default is 10.
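Taken together, a tuned fit with more folds could be requested as below; this is an illustrative call and the argument values are arbitrary:

```r
# 10-fold cross-validation, random search over 15 hyperparameter combinations
model <- gbm_dia(X, y, tune = TRUE, cv_folds = 10, tune_length = 15)
```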

Examples

# \donttest{
set.seed(42)

# Simulate a small toy dataset with two numeric features
n_obs <- 200
X_toy <- data.frame(
  FeatureA = rnorm(n_obs),
  FeatureB = runif(n_obs, 0, 100)
)

# Binary class labels as a factor
y_toy <- factor(sample(c("Control", "Case"), n_obs, replace = TRUE),
                levels = c("Control", "Case"))

# Train the model with default parameters
gbm_model <- gbm_dia(X_toy, y_toy)
print(gbm_model)

# Train with extensive tuning (random search)
gbm_model_tuned <- gbm_dia(X_toy, y_toy, tune = TRUE, tune_length = 30)
print(gbm_model_tuned)
# }