
mlr3tuning

Package website: release | dev

This package provides hyperparameter tuning for mlr3. It offers various tuning methods, e.g. grid search, random search, and generalized simulated annealing, and allows different termination criteria to be set and combined. AutoTuner provides a convenient way to perform nested resampling in combination with mlr3. The package is built on bbotk, which provides a common framework for optimization.
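
As a sketch of the AutoTuner workflow mentioned above (the learner, search space, and resampling choices below are illustrative assumptions, not prescribed by the package):

library("mlr3")
library("mlr3tuning")
library("paradox")

# Wrap a learner so tuning runs automatically during train():
# hyperparameters are tuned on an inner resampling, then the
# learner is refit on the full training data
at = AutoTuner$new(
  learner = lrn("classif.rpart"),
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  search_space = ParamSet$new(list(
    ParamDbl$new("cp", lower = 0.001, upper = 0.1)
  )),
  terminator = trm("evals", n_evals = 10),
  tuner = tnr("random_search")
)

# Nested resampling: the outer cross-validation evaluates the tuned learner
rr = resample(tsk("pima"), at, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))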

Installation

CRAN version

install.packages("mlr3tuning")

Development version

remotes::install_github("mlr-org/mlr3tuning")

Example

library("mlr3")
library("mlr3tuning")
library("paradox")

task = tsk("pima")
learner = lrn("classif.rpart")
resampling = rsmp("holdout")
measure = msr("classif.ce")

# Create the search space with lower and upper bounds
search_space = ParamSet$new(list(
  ParamDbl$new("cp", lower = 0.001, upper = 0.1),
  ParamInt$new("minsplit", lower = 1, upper = 10)
))

# Define termination criterion
terminator = trm("evals", n_evals = 20)

# Create tuning instance
instance = TuningInstanceSingleCrit$new(task = task,
  learner = learner,
  resampling = resampling,
  measure = measure,
  terminator = terminator,
  search_space = search_space)

# Load tuner
tuner = tnr("grid_search", resolution = 5)

# Trigger optimization
tuner$optimize(instance)

# View results
instance$result
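
The best configuration can then be applied to the learner for a final fit on the full task (a common follow-up step, shown here as a sketch):

# Set the best hyperparameters found during tuning and retrain
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)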

Resources

Further documentation can be found in the mlr3book and the mlr3tuning cheatsheet. Tutorials are available in the mlr3gallery.
