
RemixAutoML (version 0.11.0)

AutoH2oGBMRegression: AutoH2oGBMRegression is an automated H2O modeling framework with grid-tuning and model evaluation

Description

AutoH2oGBMRegression is an automated H2O modeling framework with grid-tuning and model evaluation that runs a variety of steps. First, the function will run a random grid tune over N number of models and find which model is the best (a default model is always included in that set). Once the model is identified and built, several other outputs are generated: validation data with predictions, evaluation plot, evaluation boxplot, evaluation metrics, variable importance, partial dependence calibration plots, partial dependence calibration box plots, and column names used in model fitting.
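
For orientation, here is a minimal grid-tuning sketch. The argument names come from the Usage section below; the data.table df and the target name "Target" are placeholders, not objects created by the package.

library(RemixAutoML)
Output <- AutoH2oGBMRegression(
  data = df,                                   # placeholder data.table
  ValidationData = NULL,
  TestData = NULL,
  TargetColumnName = "Target",
  FeatureColNames = setdiff(names(df), "Target"),
  Distribution = "gaussian",
  eval_metric = "RMSE",                        # metric used to pick the best grid model
  GridTune = TRUE,                             # run the random grid tune
  MaxModelsInGrid = 10,                        # number of candidate models to test
  ModelID = "GBM_GridTuned")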

Usage

AutoH2oGBMRegression(data, TrainOnFull = FALSE, ValidationData,
  TestData = NULL, TargetColumnName = NULL, FeatureColNames = NULL,
  TransformNumericColumns = NULL, Alpha = NULL,
  Distribution = "poisson", eval_metric = "RMSE", Trees = 50,
  GridTune = FALSE, MaxMem = "32G", NThreads = max(1,
  parallel::detectCores() - 2), MaxModelsInGrid = 2, model_path = NULL,
  metadata_path = NULL, ModelID = "FirstModel", NumOfParDepPlots = 3,
  ReturnModelObjects = TRUE, SaveModelObjects = FALSE,
  IfSaveModel = "mojo", H2OShutdown = TRUE, Methods = c("BoxCox",
  "Asinh", "Asin", "Log", "LogPlus1", "Logit", "YeoJohnson"))

Arguments

data

This is your data set for training and testing your model

TrainOnFull

Set to TRUE to train on full data

ValidationData

This is your holdout data set used in modeling to refine your hyperparameters.

TestData

This is your holdout data set. Because both the training and validation data are used in the model-building process, you should evaluate out-of-sample performance with this data set.

TargetColumnName

Either supply the target column name OR the column number where the target is located (but not mixed types).

FeatureColNames

Either supply the feature column names OR the column numbers where the features are located (but not mixed types).

TransformNumericColumns

Set to NULL to do nothing; otherwise supply the column names of numeric variables you want transformed

Alpha

This is the quantile value you want to use for quantile regression. Must be a decimal between 0 and 1.

Distribution

Choose from gaussian", "poisson", "gamma", "tweedie", "laplace", "quantile", "huber"

eval_metric

This is the metric used to identify the best grid-tuned model. Choose from "MSE", "RMSE", "MAE", "RMSLE"

Trees

The maximum number of trees you want in your models

GridTune

Set to TRUE to run a grid tuning procedure. Set a number in MaxModelsInGrid to tell the procedure how many models you want to test.

MaxMem

Set the maximum amount of memory you'd like to dedicate to the model run. E.g. "32G"

NThreads

Set to the maximum number of threads you want to use for this function

MaxModelsInGrid

Number of models to test from grid options (1080 total possible options)

model_path

A character string of the file path to where you want your output saved

metadata_path

A character string of the file path to where you want your model evaluation output saved. If left NULL, all output will be saved to model_path.

ModelID

A character string to name your model and output

NumOfParDepPlots

Tell the function the number of partial dependence calibration plots you want to create. Calibration boxplots will only be created for numerical features (not dummy variables)

ReturnModelObjects

Set to TRUE to return all modeling objects (e.g., plots and evaluation metrics) to your environment

SaveModelObjects

Set to TRUE to save all modeling objects to the model_path (and metadata_path) you supply

IfSaveModel

Set to "mojo" to save a mojo file, otherwise "standard" to save a regular H2O model object

H2OShutdown

Set to FALSE to keep H2O running after you build your model

Methods

Default is all transformation methods. You can select a subset of them; the available choices are listed in the default argument shown under Usage. See the sketch below for an example.
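
As a sketch of how the transformation and quantile-regression arguments above combine: df and "Target" are placeholders, and supplying the target itself to TransformNumericColumns is shown only as an illustration of passing numeric column names.

TestModel <- AutoH2oGBMRegression(
  data = df,                                    # placeholder data.table
  ValidationData = NULL,
  TestData = NULL,
  TargetColumnName = "Target",
  FeatureColNames = 2:ncol(df),
  TransformNumericColumns = "Target",           # numeric column(s) to transform
  Methods = c("BoxCox", "Log", "YeoJohnson"),   # subset of the available methods
  Distribution = "quantile",                    # quantile regression
  Alpha = 0.90)                                 # 90th percentile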

Value

Saved to file and returned in a list: VariableImportance.csv, Model, ValidationData.csv, EvaluationPlot.png, EvaluationBoxPlot.png, EvaluationMetrics.csv, ParDepPlots.R (a named list of features with partial dependence calibration plots), ParDepBoxPlots.R, GridCollect, GridList, and metadata
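
With ReturnModelObjects = TRUE, those outputs come back in the returned list. A sketch of pulling out a few pieces; the element names are assumptions inferred from the output names listed above:

FittedModel   <- TestModel$Model               # the H2O model object
EvalMetrics   <- TestModel$EvaluationMetrics   # evaluation metrics table
VarImportance <- TestModel$VariableImportance  # variable importance table
print(TestModel$EvaluationPlot)                # evaluation / calibration plot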

See Also

Other Automated Regression: AutoCatBoostHurdleModel, AutoCatBoostRegression, AutoH2oDRFHurdleModel, AutoH2oDRFRegression, AutoH2oGBMHurdleModel, AutoNLS, AutoXGBoostHurdleModel, AutoXGBoostRegression

Examples

# NOT RUN {
Correl <- 0.85
N <- 1000
data <- data.table::data.table(Target = runif(N))
data[, x1 := qnorm(Target)]
data[, x2 := runif(N)]
data[, Independent_Variable1 := log(pnorm(Correl * x1 +
                                            sqrt(1-Correl^2) * qnorm(x2)))]
data[, Independent_Variable2 := (pnorm(Correl * x1 +
                                         sqrt(1-Correl^2) * qnorm(x2)))]
data[, Independent_Variable3 := exp(pnorm(Correl * x1 +
                                            sqrt(1-Correl^2) * qnorm(x2)))]
data[, Independent_Variable4 := exp(exp(pnorm(Correl * x1 +
                                                sqrt(1-Correl^2) * qnorm(x2))))]
data[, Independent_Variable5 := sqrt(pnorm(Correl * x1 +
                                             sqrt(1-Correl^2) * qnorm(x2)))]
data[, Independent_Variable6 := (pnorm(Correl * x1 +
                                         sqrt(1-Correl^2) * qnorm(x2)))^0.10]
data[, Independent_Variable7 := (pnorm(Correl * x1 +
                                         sqrt(1-Correl^2) * qnorm(x2)))^0.25]
data[, Independent_Variable8 := (pnorm(Correl * x1 +
                                         sqrt(1-Correl^2) * qnorm(x2)))^0.75]
data[, Independent_Variable9 := (pnorm(Correl * x1 +
                                         sqrt(1-Correl^2) * qnorm(x2)))^2]
data[, Independent_Variable10 := (pnorm(Correl * x1 +
                                          sqrt(1-Correl^2) * qnorm(x2)))^4]
data[, Independent_Variable11 := as.factor(
  ifelse(Independent_Variable2 < 0.20, "A",
         ifelse(Independent_Variable2 < 0.40, "B",
                ifelse(Independent_Variable2 < 0.6,  "C",
                       ifelse(Independent_Variable2 < 0.8,  "D", "E")))))]
data[, ':=' (x1 = NULL, x2 = NULL)]
TestModel <- AutoH2oGBMRegression(data,
                                  TrainOnFull = FALSE,
                                  ValidationData = NULL,
                                  TestData = NULL,
                                  TargetColumnName = "Target",
                                  FeatureColNames = 2:ncol(data),
                                  TransformNumericColumns = NULL,
                                  Alpha = NULL,
                                  Distribution = "poisson",
                                  eval_metric = "RMSE",
                                  Trees = 50,
                                  GridTune = FALSE,
                                  MaxMem = "32G",
                                  NThreads = max(1,parallel::detectCores()-2),
                                  MaxModelsInGrid = 10,
                                  model_path = NULL,
                                  metadata_path = NULL,
                                  ModelID = "FirstModel",
                                  NumOfParDepPlots = 3,
                                  ReturnModelObjects = TRUE,
                                  SaveModelObjects = FALSE,
                                  IfSaveModel = "mojo",
                                  H2OShutdown = TRUE,
                                  Methods = c("BoxCox", 
                                              "Asinh", 
                                              "Asin",
                                              "Log",
                                              "LogPlus1", 
                                              "Logit", 
                                              "YeoJohnson"))
# }
