Checks whether the regression model contains hyperparameters that are to be tuned using cross-validation. See tidymodels for the default model hyperparameters.
regression.get_tune(regression.model, regression.tune_values, x_train)
A logical (boolean) indicating whether the regression model is to be tuned, that is, whether it contains hyperparameters marked for tuning; see the sketch after the argument descriptions below.
regression.model: A tidymodels object of class model_spec. The default is a linear regression model, i.e., parsnip::linear_reg(). See tidymodels for all possible models, and see the vignette for how to add new/custom models. Note that, to make it easier to call explain() from Python, the regression.model parameter can also be a string specifying the model, which will then be parsed and evaluated. For example, "parsnip::rand_forest(mtry = hardhat::tune(), trees = 100, engine = 'ranger', mode = 'regression')" is also valid input. It is essential to include the package prefix if the package is not loaded.
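As an illustration, the following is a minimal sketch of supplying regression.model to explain() with one of the regression-based approaches. The objects model, x_explain, x_train, and y_train are assumed to already exist, and argument names such as phi0 (called prediction_zero in older versions of shapr) as well as the approach name "regression_separate" may differ between package versions.

library(shapr)

# A non-default regression model: a random forest with 100 trees and no tuning.
rf_spec <- parsnip::rand_forest(trees = 100, engine = "ranger", mode = "regression")

explanation <- explain(
  model = model,                    # the model whose predictions are explained
  x_explain = x_explain,            # the observations to explain
  x_train = x_train,                # the training data
  approach = "regression_separate", # a regression-based approach
  phi0 = mean(y_train),             # the reference prediction
  regression.model = rf_spec
)

# Equivalent string form, convenient when calling explain() from Python:
# regression.model = "parsnip::rand_forest(trees = 100, engine = 'ranger', mode = 'regression')"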
regression.tune_values: Either NULL (default), a data.frame/data.table/tibble, or a function. The data.frame must contain the possible hyperparameter value combinations to try, and its column names must match the names of the tunable parameters specified in regression.model. If regression.tune_values is a function, it should take one argument x, the training data for the current coalition, and return a data.frame/data.table/tibble with the properties described above. Using a function allows the hyperparameter values to change based on the size of the coalition. See the regression vignette for several examples. Note that, to make it easier to call explain() from Python, regression.tune_values can also be a string containing an R function. For example, "function(x) return(dials::grid_regular(dials::mtry(c(1, ncol(x))), levels = 3))" is also valid input. It is essential to include the package prefix if the package is not loaded.
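A minimal sketch of the two forms described above: a fixed grid whose column name matches a tunable parameter (here tree_depth, assuming a parsnip::decision_tree() specification with tree_depth = hardhat::tune()), and a function of the coalition's training data x that builds a grid for a tunable mtry.

# Fixed grid of hyperparameter values to try during cross-validation.
regression.tune_values <- data.frame(tree_depth = c(1, 2, 3))

# Function form: the grid adapts to the number of features in the current coalition.
regression.tune_values <- function(x) {
  dials::grid_regular(dials::mtry(c(1, ncol(x))), levels = 3)
}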
x_train: A data.table containing the training data.
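Finally, a sketch of the check itself, assuming the function is accessed as an internal function via shapr::: and behaves as documented, i.e., returns TRUE when a hyperparameter is marked for tuning with hardhat::tune().

# A model with a tunable hyperparameter and a matching grid of candidate values.
model_spec <- parsnip::decision_tree(tree_depth = hardhat::tune(), engine = "rpart", mode = "regression")
tune_values <- data.frame(tree_depth = c(1, 2, 3))
x_train <- data.table::as.data.table(mtcars)

shapr:::regression.get_tune(model_spec, tune_values, x_train)  # expected: TRUE

# The default linear regression model has no hyperparameters marked for tuning.
shapr:::regression.get_tune(parsnip::linear_reg(), NULL, x_train)  # expected: FALSE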
Author: Lars Henry Berge Olsen