12.2 Tuners

The exemplary tuner function in this tutorial is called blackBoxFun.

A new tuner consists of an objective function and settings. The objective function is the heart of the tuner and must meet the requirements described in the following section.

12.2.1 Objective function

A possible implementation could look as follows:

blackBoxFun = function(x, pe) {
  # name the parameter values after the parameters in the ParamSet
  x = mlr3misc::set_names(x, nm = pe$param_set$ids())
  # evaluate the learner with these hyperparameter values
  pe$eval(x)
  # extract the performance of the most recent evaluation
  performance = unlist(pe$bmr$data[.N]$performance)[[1]]
  # the optimizer minimizes, so negate measures that should be maximized
  if (!pe$task$measures[[1]]$minimize) {
    return(-performance)
  }
  return(performance)
}

With pe being the mlr3tuning::PerformanceEvaluator object, blackBoxFun() should be able to do the following (similar to mlr3tuning::TunerRandom or mlr3tuning::TunerGenSA):

blackBoxFun(c(cp = 0.05), pe)
pe$bmr$aggregated()
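Since optimizers such as GenSA::GenSA() always minimize, blackBoxFun() negates the performance whenever the measure should be maximized. The pattern in isolation (as_minimization is a hypothetical helper for illustration, not part of mlr3tuning):

```r
# Hypothetical helper illustrating the sign-flip pattern from blackBoxFun():
# a measure that should be maximized (e.g. AUC) is negated before being
# handed back to the minimizing optimizer.
as_minimization = function(performance, minimize) {
  if (minimize) performance else -performance
}

as_minimization(0.10, minimize = TRUE)   # classification error stays 0.10
as_minimization(0.90, minimize = FALSE)  # AUC 0.90 becomes -0.90
```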

12.2.2 Tuner class

Now, to actually call the optimizer, we wrap blackBoxFun() inside a dedicated R6 Tuner class. This can be done either by modifying an existing class or by creating a new R6 Tuner class.

In this example we replace the private .$tune_step() method of mlr3tuning::TunerGenSA with the new objective function that we defined above.

TunerGenSA = R6Class("TunerGenSA",
  inherit = Tuner,
  public = list(
    GenSA_res = NULL,
    initialize = function(pe, evals, ...) {
      # GenSA can only optimize numeric parameters
      if (any(pe$param_set$storage_type != "numeric")) {
        stop("Parameter types need to be numeric")
      }
      checkmate::assert_integerish(evals, lower = 1L)
      super$initialize(id = "GenSA", pe = pe,
        terminator = TerminatorEvaluations$new(evals),
        settings = list(max.call = evals, ...))
    }
  ),
  private = list(
    tune_step = function() {
      # the objective function defined above
      blackBoxFun = function(x, pe) {
        x = mlr3misc::set_names(x, nm = pe$param_set$ids())
        pe$eval(x)
        performance = unlist(pe$bmr$data[.N]$performance)[[1]]
        if (!pe$task$measures[[1]]$minimize) {
          return(-performance)
        }
        return(performance)
      }
      self$GenSA_res = GenSA::GenSA(fn = blackBoxFun,
        lower = self$pe$param_set$lower,
        upper = self$pe$param_set$upper,
        control = self$settings, pe = self$pe)
    }
  )
)

Note that the private method must always be named .$tune_step(), as it is called by the .$tune() method of the Tuner class.
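To illustrate why this name matters, here is a hypothetical, heavily simplified sketch of the dispatch. TunerSketch, steps_done and max_steps are invented names for illustration only, not the actual mlr3tuning internals:

```r
library(R6)

# Simplified sketch (assumed, not the real implementation): the public
# tune() loop repeatedly invokes the private tune_step() method, which a
# concrete tuner overrides with its actual optimization step.
TunerSketch = R6Class("TunerSketch",
  public = list(
    steps_done = 0,
    max_steps = NULL,
    initialize = function(max_steps) {
      self$max_steps = max_steps
    },
    tune = function() {
      # tune() only knows the name "tune_step"; subclasses supply the body
      while (self$steps_done < self$max_steps) {
        private$tune_step()
      }
      invisible(self)
    }
  ),
  private = list(
    tune_step = function() {
      # stand-in for one optimization step
      self$steps_done = self$steps_done + 1
    }
  )
)

tuner = TunerSketch$new(max_steps = 3)
tuner$tune()
tuner$steps_done  # 3
```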

12.2.3 Example

Now that the “new” mlr3tuning::TunerGenSA tuner has been defined, we can test it in a small use case:

# does not work currently
task = mlr3::mlr_tasks$get("spam")
learner = mlr3::mlr_learners$get("classif.rpart")
learner$predict_type = "prob"
resampling = mlr3::mlr_resamplings$get("holdout")
measures = mlr3::mlr_measures$mget(c("classif.auc", "classif.ce"))
param_set = paradox::ParamSet$new(
  params = list(
    paradox::ParamDbl$new("cp", lower = 0.001, upper = 0.1)
  )
)

pe = PerformanceEvaluator$new(task, learner, resampling, param_set)
tuner = TunerGenSA$new(pe, 60L)
tuner$tune()
## Error in pe$eval(x): Assertion on 'dt' failed: Must be a data.table, not double.

tuner$pe$bmr$aggregated()
## Error in eval(expr, envir, enclos): attempt to apply non-function
tuner$tune_result()
## Error in tuner$tune_result(): attempt to apply non-function
str(tuner$GenSA_res)
##  NULL