mlr3mbo (version 0.2.9)

mlr_optimizers_adbo: Asynchronous Decentralized Bayesian Optimization

Description

OptimizerADBO class that implements Asynchronous Decentralized Bayesian Optimization (ADBO). ADBO is a variant of Asynchronous Model Based Optimization (AMBO) that uses AcqFunctionStochasticCB with exponential lambda decay.

Currently, only single-objective optimization is supported. OptimizerADBO is considered an experimental feature and its API may be subject to change.
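
For orientation, a minimal sketch of how the optimizer is obtained from the bbotk dictionary, assuming bbotk, mlr3mbo, and their dependencies are installed; the printed param_set contains the parameters documented below.

library(bbotk)
library(mlr3mbo)

# Retrieve OptimizerADBO from the bbotk optimizer dictionary ...
optimizer = opt("adbo")

# ... and inspect its configuration space (lambda, rate, period,
# initial_design, design_size, design_function, n_workers, ...).
optimizer$param_set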

Parameters

lambda

numeric(1)
Value used to sample each worker's lambda from an exponential distribution.

rate

numeric(1)
Rate of the exponential decay.

period

integer(1)
Period of the exponential decay.

initial_design

data.table::data.table()
Initial design of the optimization. If NULL, a design of size design_size is generated with the specified design_function. Default is NULL.

design_size

integer(1)
Size of the initial design if it is to be generated. Default is 100.

design_function

character(1)
Sampling function used to generate the initial design. Can be "random" (paradox::generate_design_random), "lhs" (paradox::generate_design_lhs), or "sobol" (paradox::generate_design_sobol). Default is "sobol". A configuration sketch follows this parameter list.

n_workers

integer(1)
Number of parallel workers. If NULL, all rush workers specified via rush::rush_plan() are used. Default is NULL.
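
A sketch of how these parameters might be set, assuming bbotk, mlr3mbo, and paradox (with its Sobol backend) are installed. All numeric values shown are illustrative, not recommended defaults; the manually generated Sobol design mirrors what design_function = "sobol" produces internally.

library(bbotk)
library(paradox)
library(mlr3mbo)

domain = ps(x = p_dbl(lower = -10, upper = 10))

# Configure the lambda decay and let the optimizer generate its own
# initial design (values below are illustrative only).
optimizer = opt("adbo",
  lambda = 1.96,
  rate = 0.1,
  period = 25,
  design_size = 16,
  design_function = "sobol")

# Alternatively, pass a precomputed design; "sobol" corresponds to
# paradox::generate_design_sobol().
design = generate_design_sobol(domain, n = 16)$data
optimizer = opt("adbo", initial_design = design)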

Super classes

bbotk::Optimizer -> bbotk::OptimizerAsync -> mlr3mbo::OptimizerAsyncMbo -> OptimizerADBO
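
The inheritance chain can be checked directly on a constructed optimizer; a small sketch, assuming bbotk and mlr3mbo are loaded.

library(bbotk)
library(mlr3mbo)

optimizer = opt("adbo")

# Each documented superclass appears in the S3 class vector of the R6 object.
class(optimizer)
inherits(optimizer, c("OptimizerAsyncMbo", "OptimizerAsync", "Optimizer"), which = TRUE)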

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

OptimizerADBO$new()


Method optimize()

Performs the optimization on a bbotk::OptimInstanceAsyncSingleCrit until termination. The individual evaluations are written into the bbotk::ArchiveAsync. The result is written into the instance object.

Usage

OptimizerADBO$optimize(inst)

Arguments

inst

(bbotk::OptimInstanceAsyncSingleCrit).
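
A sketch of the typical call sequence, assuming instance is an already configured bbotk::OptimInstanceAsyncSingleCrit with a rush plan in place (the Examples section below shows the full setup).

# `instance` is assumed to be a configured bbotk::OptimInstanceAsyncSingleCrit
# and rush workers have been planned (see the Examples section).
optimizer = opt("adbo", design_size = 4, n_workers = 2)
optimizer$optimize(instance)

# The best point found is written into the instance ...
instance$result

# ... and every single evaluation is stored in the asynchronous archive.
instance$archive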


Method clone()

The objects of this class are cloneable with this method.

Usage

OptimizerADBO$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

References

  • Egelé, Romain, Guyon, Isabelle, Vishwanath, Venkatram, Balaprakash, Prasanna (2023). “Asynchronous Decentralized Bayesian Optimization for Large Scale Hyperparameter Optimization.” In 2023 IEEE 19th International Conference on e-Science (e-Science), 1-10.

Examples

if (requireNamespace("rush") &
    requireNamespace("mlr3learners") &
    requireNamespace("DiceKriging") &
    requireNamespace("rgenoud")) {

  if (redis_available()) {

    library(bbotk)
    library(paradox)
    library(mlr3learners)

    fun = function(xs) {
      list(y = xs$x ^ 2)
    }
    domain = ps(x = p_dbl(lower = -10, upper = 10))
    codomain = ps(y = p_dbl(tags = "minimize"))
    objective = ObjectiveRFun$new(fun = fun, domain = domain, codomain = codomain)

    instance = OptimInstanceAsyncSingleCrit$new(
      objective = objective,
      terminator = trm("evals", n_evals = 10))

    rush::rush_plan(n_workers = 2)

    optimizer = opt("adbo", design_size = 4, n_workers = 2)

    optimizer$optimize(instance)
  } else {
    message("Redis server is not available.\nPlease set up Redis prior to running the example.")
  }
}
