
Given a set of possible hyperparameter values, the function trains a model for each possible combination of the provided values and evaluates it with the chosen metric.
gridSearch(model, hypers, metric, test = NULL, env = NULL,
parallel = FALSE, save_models = TRUE)
model: SDMmodel or SDMmodelCV object.

hypers: named list containing the values of the hyperparameters that should be tuned, see details.

metric: character. The metric used to evaluate the models, possible values are: "auc", "tss" and "aicc".

test: SWD object. Test dataset used to evaluate the model, not used with aicc and SDMmodelCV objects, default is NULL.

env: stack containing the environmental variables, used only with "aicc", default is NULL.

parallel: logical, if TRUE it uses parallel computation, default is FALSE. Used only with metric = "aicc", see details.

save_models: logical, if FALSE the models are not saved and the output contains only a data frame with the metric values for each hyperparameter combination. Default is TRUE; set it to FALSE when there are many combinations, to avoid R crashing for memory overload.
To know which hyperparameters can be tuned you can use the output of the function get_tunable_args. Hyperparameters not included in the hypers argument take the value that they have in the passed model.
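As an illustration, a minimal sketch of building the hypers list (assuming a Maxnet SDMmodel named model and a test SWD object, as in the example below; the exact tunable arguments depend on the model method):

```r
# List the hyperparameters that gridSearch can tune for this model
get_tunable_args(model)

# Tune only the regularization multiplier; "fc" is not included,
# so it keeps the value it has in the passed model
h <- list(reg = seq(0.5, 2, 0.5))
output <- gridSearch(model, hypers = h, metric = "auc", test = test)
```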
Parallel computation is used only during the execution of the predict function and increases speed only for large datasets. For small datasets it may result in a longer execution time, due to the overhead of creating the cluster.
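As a sketch, parallel computation could be enabled for the AICc case like this (assuming the model, h, and predictors objects defined in the example below):

```r
# parallel = TRUE is honoured only with metric = "aicc"; for small
# datasets the cluster set-up overhead can outweigh the speed-up
output <- gridSearch(model, hypers = h, metric = "aicc", env = predictors,
                     parallel = TRUE)
```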
# Acquire environmental variables
files <- list.files(path = file.path(system.file(package = "dismo"), "ex"),
pattern = "grd", full.names = TRUE)
predictors <- raster::stack(files)
# Prepare presence and background locations
p_coords <- virtualSp$presence
bg_coords <- virtualSp$background
# Create SWD object
data <- prepareSWD(species = "Virtual species", p = p_coords, a = bg_coords,
env = predictors, categorical = "biome")
# Split presence locations in training (80%) and testing (20%) datasets
datasets <- trainValTest(data, test = 0.2, only_presence = TRUE)
train <- datasets[[1]]
test <- datasets[[2]]
# Train a model
model <- train(method = "Maxnet", data = train, fc = "l")
# Define the hyperparameters to test
h <- list(reg = 1:2, fc = c("lqp", "lqph"))
# Run the function using the AUC as metric
output <- gridSearch(model, hypers = h, metric = "auc", test = test)
output@results
output@models
# Order results by highest test AUC
head(output@results[order(-output@results$test_AUC), ])
# Run the function using the AICc as metric and without saving the trained
# models, helpful when numerous hyperparameters are tested to avoid memory
# problems
output <- gridSearch(model, hypers = h, metric = "aicc", env = predictors,
save_models = FALSE)
output@results