Learn R Programming

IBLM (version 1.0.2)

train_xgb_as_per_iblm: Train XGBoost Model Using the IBLM Model Parameters

Description

Trains an XGBoost model using parameters extracted from the booster residual component of the iblm model. This is a convenient way to fit a standalone XGBoost model for direct comparison with a fitted iblm model.

Usage

train_xgb_as_per_iblm(iblm_model, ...)

Value

Trained XGBoost model object (class "xgb.Booster").

Arguments

iblm_model

An ensemble model object of class "iblm" containing GLM and XGBoost model components, along with the data that was used to train it.

...

Optional arguments passed on to xgb.train(). Note that supplying these will cause the fitted model to deviate from the settings used to train `iblm_model`.

See Also

xgb.train

Examples

df_list <- freMTPLmini |> split_into_train_validate_test(seed = 9000)

# training with plenty of rounds allowed
iblm_model1 <- train_iblm_xgb(
  df_list,
  response_var = "ClaimRate",
  family = "poisson",
  params = list(max_depth = 6),
  nrounds = 1000
)

xgb1 <- train_xgb_as_per_iblm(iblm_model1)

# training with severe restrictions (expected poorer results)
iblm_model2 <- train_iblm_xgb(
  df_list,
  response_var = "ClaimRate",
  family = "poisson",
  params = list(max_depth = 1),
  nrounds = 5
)

xgb2 <- train_xgb_as_per_iblm(iblm_model2)

# comparison shows the poor training mirrored in second set:
get_pinball_scores(
  df_list$test,
  iblm_model1,
  trim = NA_real_,
  additional_models = list(iblm2 = iblm_model2, xgb1 = xgb1, xgb2 = xgb2)
)
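Conceptually, train_xgb_as_per_iblm is equivalent to calling xgb.train() yourself with the settings recorded in the iblm object. A minimal sketch of that idea follows; the slot names (`booster`, `params`, `nrounds`, `train_dmatrix`) are illustrative assumptions, not documented accessors of the "iblm" class, so consult the fitted object's structure (e.g. via str(iblm_model1)) before relying on them.

```r
library(xgboost)

# Hypothetical sketch: refit XGBoost manually using settings stored in
# the iblm object. All slot names below are assumptions for illustration.
manual_xgb <- xgboost::xgb.train(
  params  = iblm_model1$booster$params,   # assumed slot: booster parameters
  data    = iblm_model1$train_dmatrix,    # assumed slot: training xgb.DMatrix
  nrounds = iblm_model1$booster$nrounds   # assumed slot: boosting rounds
)
```

In practice, prefer train_xgb_as_per_iblm(iblm_model1), which guarantees the extracted settings match those used during iblm training.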
