
mlr3tuning

Package website: release | dev

mlr3tuning is the hyperparameter optimization package of the mlr3 ecosystem. It features highly configurable search spaces via the paradox package and finds optimal hyperparameter configurations for any mlr3 learner. mlr3tuning works with several optimization algorithms, e.g. Random Search, Iterated Racing, Bayesian Optimization (in mlr3mbo), and Hyperband (in mlr3hyperband). Moreover, it can automatically optimize learners and estimate the performance of optimized models with nested resampling. The package is built on the optimization framework bbotk.

Extension packages

mlr3tuning is extended by the following packages.

  • mlr3tuningspaces is a collection of search spaces from scientific articles for commonly used learners.
  • mlr3hyperband adds the Hyperband and Successive Halving algorithms.
  • mlr3mbo adds Bayesian Optimization methods.
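As an illustration of the first extension, a predefined search space from mlr3tuningspaces can be attached to a learner in a single call. The sketch below assumes the `lts()` sugar function and the default rpart tuning space shipped with that package; check `?lts` for the exact interface.

```r
library(mlr3tuningspaces)

# lts() looks up a predefined tuning space and attaches it to the learner
# by setting to_tune() tokens on the affected hyperparameters.
learner = lts(lrn("classif.rpart"))

# inspect which hyperparameters are now marked for tuning
learner$param_set$values
```

The returned learner can be passed to ti() or tune() like any manually configured learner.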

Resources

There are several sections about hyperparameter optimization in the mlr3book.

The gallery features a collection of case studies and demos about optimization.

The cheatsheet summarizes the most important functions of mlr3tuning.

Installation

Install the latest release from CRAN:

install.packages("mlr3tuning")

Install the development version from GitHub:

# install.packages("pak")
pak::pak("mlr-org/mlr3tuning")

Examples

We optimize the cost and gamma hyperparameters of a support vector machine on the Sonar data set.

library("mlr3learners")
library("mlr3tuning")

learner = lrn("classif.svm",
  cost  = to_tune(1e-5, 1e5, logscale = TRUE),
  gamma = to_tune(1e-5, 1e5, logscale = TRUE),
  kernel = "radial",
  type = "C-classification"
)

We construct a tuning instance with the ti() function. The tuning instance describes the tuning problem: the task, the learner with its search space, the resampling strategy, the performance measure, and the termination criterion.

instance = ti(
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("none")
)
instance
## 
## ── <TuningInstanceBatchSingleCrit> ─────────────────────────────────────────────────────────────────
## • State: Not optimized
## • Objective: <ObjectiveTuningBatch>
## • Search Space:
##       id    class     lower    upper nlevels
## 1:  cost ParamDbl -11.51293 11.51293     Inf
## 2: gamma ParamDbl -11.51293 11.51293     Inf
## • Terminator: <TerminatorNone>
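The bounds in this printout are already log-transformed: with logscale = TRUE, the tuner searches on the natural-log scale and proposed values are exponentiated before they reach the learner, so the interval [1e-5, 1e5] appears as roughly ±11.51. This can be verified in base R:

```r
# the printed search-space bounds are log(1e-5) and log(1e5)
log(1e-5)
#> [1] -11.51293
log(1e5)
#> [1] 11.51293

# values proposed by the tuner are mapped back with exp()
exp(log(1e5))
#> [1] 1e+05
```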

We select a simple grid search as the optimization algorithm.

tuner = tnr("grid_search", resolution = 5)
tuner
## 
## ── <TunerBatchGridSearch>: Grid Search ─────────────────────────────────────────────────────────────
## • Parameters: batch_size=1, resolution=5
## • Parameter classes: <ParamLgl>, <ParamInt>, <ParamDbl>, and <ParamFct>
## • Properties: dependencies, single-crit, and multi-crit
## • Packages: mlr3tuning and bbotk

To start the tuning, we simply pass the tuning instance to the tuner.

tuner$optimize(instance)
##        cost     gamma learner_param_vals  x_domain classif.ce
## 1: 5.756463 -5.756463          <list[4]> <list[2]>  0.1828847

The tuner returns the best hyperparameter configuration and the corresponding measured performance.

The archive contains all evaluated hyperparameter configurations.

as.data.table(instance$archive)[, .(cost, gamma, classif.ce, batch_nr, resample_result)]
##           cost      gamma classif.ce batch_nr  resample_result
##  1:  -5.756463   5.756463  0.4663216        1 <ResampleResult>
##  2:   5.756463  -5.756463  0.1828847        2 <ResampleResult>
##  3:  11.512925   5.756463  0.4663216        3 <ResampleResult>
##  4:   5.756463  11.512925  0.4663216        4 <ResampleResult>
##  5: -11.512925 -11.512925  0.4663216        5 <ResampleResult>
## ---                                                           
## 21:  -5.756463  -5.756463  0.4663216       21 <ResampleResult>
## 22:  11.512925  11.512925  0.4663216       22 <ResampleResult>
## 23: -11.512925  11.512925  0.4663216       23 <ResampleResult>
## 24:  11.512925  -5.756463  0.1828847       24 <ResampleResult>
## 25:   0.000000  -5.756463  0.2402346       25 <ResampleResult>
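The best configuration can also be read off the instance directly, without scanning the full archive table. The fields below and the archive's $best() method come from the underlying bbotk classes; treat the exact names as an assumption and check the TuningInstanceBatchSingleCrit documentation. The sketch continues the tuning run above.

```r
# best hyperparameters on the original (back-transformed) scale
instance$result_x_domain

# performance of the best configuration
instance$result_y

# best row of the archive
instance$archive$best()
```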

The mlr3viz package visualizes tuning results.

library(mlr3viz)

autoplot(instance, type = "surface")

We fit a final model with optimized hyperparameters to make predictions on new data.

learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("sonar"))
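The tuning loop above can also be wrapped into a self-tuning learner, and the generalization error of the tuned model can be estimated with nested resampling, as mentioned in the introduction. The sketch below uses the auto_tuner() and tune_nested() shortcut functions; the argument names follow the current mlr3tuning API, but treat the exact signatures as an assumption and consult ?auto_tuner and ?tune_nested.

```r
library(mlr3learners)
library(mlr3tuning)

learner = lrn("classif.svm",
  cost   = to_tune(1e-5, 1e5, logscale = TRUE),
  gamma  = to_tune(1e-5, 1e5, logscale = TRUE),
  kernel = "radial",
  type   = "C-classification"
)

# AutoTuner: tunes itself during $train() and then behaves
# like a regular learner with the best hyperparameters set
at = auto_tuner(
  tuner = tnr("grid_search", resolution = 5),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measure = msr("classif.ce")
)
at$train(tsk("sonar"))

# nested resampling: unbiased performance estimate of the tuned learner
rr = tune_nested(
  tuner = tnr("grid_search", resolution = 5),
  task = tsk("sonar"),
  learner = learner,
  inner_resampling = rsmp("cv", folds = 3),
  outer_resampling = rsmp("cv", folds = 3),
  measure = msr("classif.ce")
)
rr$aggregate(msr("classif.ce"))
```

The aggregated score from the outer resampling is the performance estimate to report, since the inner tuning scores are optimistically biased.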


Package details

  • Version: 1.5.1
  • License: LGPL-3
  • Maintainer: Marc Becker
  • Last published: December 14th, 2025
  • Monthly downloads: 11,082

Functions in mlr3tuning (1.5.1)

  • CallbackBatchTuning: Create Batch Tuning Callback
  • CallbackAsyncTuning: Asynchronous Tuning Callback
  • ArchiveAsyncTuning: Rush Data Storage
  • ArchiveBatchTuning: Class for Logging Evaluated Hyperparameter Configurations
  • ArchiveAsyncTuningFrozen: Frozen Rush Data Storage
  • AutoTuner: Class for Automatic Tuning
  • TuningInstanceBatchMultiCrit: Class for Multi Criteria Tuning
  • TuningInstanceBatchSingleCrit: Class for Single Criterion Tuning
  • ObjectiveTuningBatch: Class for Tuning Objective
  • TuningInstanceAsyncMultiCrit: Multi-Criteria Tuning with Rush
  • as_search_space: Convert to a Search Space
  • Tuner: Tuner
  • callback_batch_tuning: Create Batch Tuning Callback
  • extract_inner_tuning_archives: Extract Inner Tuning Archives
  • assert_async_tuning_callback: Assertions for Callbacks
  • assert_batch_tuning_callback: Assertions for Callbacks
  • TunerBatchFromOptimizerBatch: TunerBatchFromOptimizerBatch
  • TunerBatch: Class for Batch Tuning Algorithms
  • mlr3tuning.backup: Backup Benchmark Result Callback
  • mlr3tuning.measures: Measure Callback
  • mlr3tuning.one_se_rule: One Standard Error Rule Callback
  • mlr3tuning_assertions: Assertions for mlr3tuning Objects
  • callback_async_tuning: Create Asynchronous Tuning Callback
  • auto_tuner: Function for Automatic Tuning
  • TuningInstanceAsyncSingleCrit: Single Criterion Tuning with Rush
  • extract_inner_tuning_results: Extract Inner Tuning Results
  • mlr3tuning-package: mlr3tuning: Hyperparameter Optimization for 'mlr3'
  • TunerAsync: Class for Asynchronous Tuning Algorithms
  • TuningInstanceSingleCrit: Single Criterion Tuning Instance for Batch Tuning
  • TuningInstanceMultiCrit: Multi Criteria Tuning Instance for Batch Tuning
  • TunerAsyncFromOptimizerAsync: TunerAsyncFromOptimizerAsync
  • mlr3tuning.async_save_logs: Save Logs Callback
  • mlr_tuners_async_random_search: Hyperparameter Tuning with Asynchronous Random Search
  • mlr_tuners_async_grid_search: Hyperparameter Tuning with Asynchronous Grid Search
  • mlr3tuning.async_freeze_archive: Freeze Archive Callback
  • mlr_tuners: Dictionary of Tuners
  • mlr_tuners_async_design_points: Hyperparameter Tuning with Asynchronous Design Points
  • mlr_tuners_irace: Hyperparameter Tuning with Iterated Racing
  • mlr_tuners_internal: Hyperparameter Tuning with Internal Tuning
  • mlr_tuners_nloptr: Hyperparameter Tuning with Non-linear Optimization
  • mlr3tuning.async_default_configuration: Default Configuration Callback
  • mlr_tuners_random_search: Hyperparameter Tuning with Random Search
  • mlr3tuning.asnyc_mlflow: MLflow Connector Callback
  • as_tuner: Convert to a Tuner
  • mlr_tuners_gensa: Hyperparameter Tuning with Generalized Simulated Annealing
  • mlr_tuners_grid_search: Hyperparameter Tuning with Grid Search
  • ti: Syntactic Sugar for Tuning Instance Construction
  • ti_async: Syntactic Sugar for Asynchronous Tuning Instance Construction
  • tnr: Syntactic Sugar for Tuning Objects Construction
  • tune_nested: Function for Nested Resampling
  • mlr_tuners_cmaes: Hyperparameter Tuning with Covariance Matrix Adaptation Evolution Strategy
  • tune: Function for Tuning a Learner
  • mlr_tuners_design_points: Hyperparameter Tuning with Design Points
  • set_validate.AutoTuner: Configure Validation for AutoTuner
  • reexports: Objects Exported from Other Packages
  • ObjectiveTuning: Class for Tuning Objective
  • ObjectiveTuningAsync: Class for Tuning Objective
  • ContextAsyncTuning: Asynchronous Tuning Context
  • ContextBatchTuning: Batch Tuning Context