
Subclass for random search tuning.
The random points are sampled by paradox::generate_design_random().
This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():
mlr_tuners$get("random_search")
tnr("random_search")
In order to support general termination criteria and parallelization, we evaluate points in a batch-fashion of size batch_size. Larger batches allow more parallelization, while smaller batches imply more fine-grained checking of the termination criteria. A batch consists of batch_size times resampling$iters jobs. For example, if you set a batch size of 10 points and do a 5-fold cross-validation, you can utilize up to 50 cores.
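For illustration, a sketch of combining batch_size with a future parallelization backend. The worker count of 4 and the tuning budget of 20 evaluations below are arbitrary choices for this sketch, not part of this page:

```r
# A sketch only: assumes packages mlr3, mlr3tuning, paradox and future are installed.
library(mlr3)
library(mlr3tuning)
library(paradox)

future::plan("multisession", workers = 4)  # run resampling jobs in parallel

instance = TuningInstanceSingleCrit$new(
  task = tsk("iris"),
  learner = lrn("classif.rpart"),
  resampling = rsmp("cv", folds = 5),
  measure = msr("classif.ce"),
  search_space = ParamSet$new(list(
    ParamDbl$new("cp", lower = 0.001, upper = 0.1)
  )),
  terminator = trm("evals", n_evals = 20)
)

# each batch proposes 10 random points; with 5-fold cross-validation this
# yields 10 * 5 = 50 resampling jobs per batch that can run in parallel
tuner = tnr("random_search", batch_size = 10)
tuner$optimize(instance)
```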
Parallelization is supported via package future (see mlr3::benchmark()'s section on parallelization for more details).
All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
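For example, to make tuning less verbose one can raise the logger's threshold, a minimal sketch using the lgr API:

```r
library(lgr)

# fetch the logger used by bbotk (and hence by all Tuners) ...
logger = lgr::get_logger("bbotk")

# ... and only emit warnings and errors during tuning
logger$set_threshold("warn")
```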
batch_size
integer(1)
Maximum number of points to try in a batch.
$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using package progress as backend; enable with progressr::handlers("progress").
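A minimal sketch of enabling progress bars this way, assuming a tuner and tuning instance set up as in the Examples section below:

```r
library(progressr)

# use package progress as the backend for rendering progress bars
progressr::handlers("progress")

# wrap the optimization call to display a progress bar; `tuner` and
# `instance` are assumed to be constructed as in the Examples section
progressr::with_progress({
  tuner$optimize(instance)
})
```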
mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerRandomSearch
new()
Creates a new instance of this R6 class.
TunerRandomSearch$new()
clone()
The objects of this class are cloneable with this method.
TunerRandomSearch$clone(deep = FALSE)
deep
Whether to make a deep clone.
Package mlr3hyperband for hyperband tuning.
Other Tuner: mlr_tuners_cmaes, mlr_tuners_design_points, mlr_tuners_gensa, mlr_tuners_grid_search, mlr_tuners_nloptr
library(mlr3)
library(paradox)
search_space = ParamSet$new(list(
  ParamDbl$new("cp", lower = 0.001, upper = 0.1)
))
terminator = trm("evals", n_evals = 3)
instance = TuningInstanceSingleCrit$new(
  task = tsk("iris"),
  learner = lrn("classif.rpart"),
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  search_space = search_space,
  terminator = terminator
)
tt = tnr("random_search")
# modifies the instance by reference
tt$optimize(instance)
# returns best configuration and best performance
instance$result
# allows access to the data.table with the full path of all evaluations
instance$archive