## 6.2 Error Handling

To demonstrate how to properly deal with misbehaving learners, mlr3 ships with the learner `classif.debug`:

```r
task = mlr_tasks$get("spam")
learner = mlr_learners$get("classif.debug")
print(learner)
#> <LearnerClassifDebug:classif.debug>
#> Model: -
#> Parameters: list()
#> Packages: -
#> Predict Type: response
#> Feature types: logical, integer, numeric, character, factor, ordered
#> Properties: missings, multiclass, twoclass
```

This learner comes with special hyperparameters that let us control

1. what conditions should be signaled (message, warning, error), and
2. during which stage the conditions should be signaled (train or predict).
```r
learner$param_set
#> ParamSet:
#>                   id    class lower upper     levels     default value
#>  1:    message_train ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  2:  message_predict ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  3:    warning_train ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  4:  warning_predict ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  5:      error_train ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  6:    error_predict ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  7:   segfault_train ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  8: segfault_predict ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#>  9:  predict_missing ParamDbl     0     1                      0
#> 10:       save_tasks ParamLgl    NA    NA TRUE,FALSE <NoDefault>
#> 11:                x ParamDbl     0     1            <NoDefault>
```

Alternatively, we can tell the learner to provoke a segfault which tears down the complete R session. With its default settings, it will do nothing special: it learns a random label which is used to create constant predictions.

### 6.2.1 Encapsulation

By default, mlr3 does not catch conditions such as warnings or errors. Thus, the exception raised by the debug learner stops the execution, allowing us to `traceback()` the error:

```r
task = mlr_tasks$get("spam")
learner = mlr_learners$get("classif.debug")
learner$param_set$values = list(error_train = TRUE)
learner$train(task)
#> Error in learner$train_internal(task = task): Error from classif.debug->train()
```

The learner execution can be encapsulated, though. With encapsulation, exceptions do not stop the program flow, and any output is logged to the learner instead of just being printed to the console. One way to encapsulate the execution is provided by the package evaluate. The encapsulation can be enabled via `mlr_control()`:

```r
task = mlr_tasks$get("spam")
learner = mlr_learners$get("classif.debug")
learner$param_set$values = list(warning_train = TRUE, error_train = TRUE)
ctrl = mlr_control(encapsulate_train = "evaluate")
learner$train(task, ctrl = ctrl)

learner$log
#>    stage   class                                 msg
#> 1: train warning Warning from classif.debug->train()
#> 2: train   error   Error from classif.debug->train()
learner$errors
#> [1] "Error from classif.debug->train()"
```

You can also enable the encapsulation for the predict step of a learner by setting `encapsulate_predict` in `mlr_control()`.
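For instance, encapsulating both stages could look like the following sketch (hedged: this assumes `predict()` accepts a `ctrl` argument analogously to `train()`, which may differ between versions):

```r
task = mlr_tasks$get("spam")
learner = mlr_learners$get("classif.debug")
learner$param_set$values = list(warning_predict = TRUE)

# encapsulate both the train and the predict step with evaluate
ctrl = mlr_control(
  encapsulate_train = "evaluate",
  encapsulate_predict = "evaluate"
)
learner$train(task, ctrl = ctrl)
learner$predict(task, ctrl = ctrl)

# the log now also contains conditions signaled during predict
learner$log
```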

Another way to encapsulate the execution is to run everything in a separate R session via the package callr. callr spawns a new R process, and thus even guards the current session from segfaults. On the downside, starting new processes comes with computational overhead.

```r
ctrl = mlr_control(encapsulate_train = "callr")
task = mlr_tasks$get("spam")
learner = mlr_learners$get("classif.debug")
learner$param_set$values = list(segfault_train = TRUE)
learner$train(task = task, ctrl = ctrl)

learner$errors
#> [1] "callr exited with status -11"
```

Without a model, it is not possible to predict:

```r
learner$predict(task)
#> Error: No model available, call train() first
```
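Since a failed (or crashed) training run leaves the learner without a model, a defensive check before predicting can avoid this error. A small sketch, assuming `learner$model` holds the fitted model (cf. the `Model: -` line in the print output above):

```r
# predict only if training actually produced a model
if (is.null(learner$model)) {
  message("No model available, skipping prediction")
} else {
  prediction = learner$predict(task)
}
```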

### 6.2.2 Fallback learners

The purpose of fallback learners is to continue the computation in cases where a Learner or a Measure misbehaves in some sense. Some typical examples include:

- The learner fails to fit a model during training. This can happen if some convergence criterion is not met or the learner runs out of memory.
- The learner fails to predict for some or all observations. A typical case could be new factor levels in the test data which the model cannot handle.

The fallback learner from the package mlr3pipelines can be used for these scenarios. This is still work in progress.
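Until that functionality is finished, a manual fallback can be sketched with base R's `tryCatch()`, e.g. by falling back to the featureless learner, which predicts the majority class. Note that this is only an illustration, not the planned mlr3pipelines interface:

```r
task = mlr_tasks$get("spam")
learner = mlr_learners$get("classif.debug")
learner$param_set$values = list(error_train = TRUE)

# train the primary learner; on error, train a featureless fallback instead
fitted = tryCatch({
  learner$train(task)
  learner
}, error = function(e) {
  fallback = mlr_learners$get("classif.featureless")
  fallback$train(task)
  fallback
})

# predict with whichever learner ended up trained
prediction = fitted$predict(task)
```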