
mlr3 (version 1.0.0)

score_roc_measures: Calculate ROC Measures

Description

Calculate a set of ROC performance measures based on the confusion matrix of a classification prediction.

  • tpr True positive rate (Sensitivity, Recall)

  • fpr False positive rate (Fall-out)

  • fnr False negative rate (Miss rate)

  • tnr True negative rate (Specificity)

  • ppv Positive predictive value (Precision)

  • fomr False omission rate

  • lrp Positive likelihood ratio (LR+)

  • fdr False discovery rate

  • npv Negative predictive value

  • acc Accuracy

  • lrm Negative likelihood ratio (LR-)

  • dor Diagnostic odds ratio
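All of the measures above are simple ratios of the four confusion-matrix cells. As an illustration (not the package's implementation), a few of them can be computed directly from raw counts in base R; the cell counts below are made up:

```r
# Hypothetical confusion-matrix cell counts for a binary task:
tp = 40; fn = 10  # truth: positive class
fp = 5;  tn = 45  # truth: negative class

tpr = tp / (tp + fn)                   # true positive rate (sensitivity, recall)
fpr = fp / (fp + tn)                   # false positive rate (fall-out)
tnr = tn / (fp + tn)                   # true negative rate (specificity)
ppv = tp / (tp + fp)                   # positive predictive value (precision)
acc = (tp + tn) / (tp + fn + fp + tn)  # accuracy
lrp = tpr / fpr                        # positive likelihood ratio (LR+)
lrm = (1 - tpr) / tnr                  # negative likelihood ratio (LR-)
dor = lrp / lrm                        # diagnostic odds ratio

c(tpr = tpr, fpr = fpr, acc = acc, dor = dor)
# tpr = 0.8, fpr = 0.1, acc = 0.85, dor = 36
```

Note that the diagnostic odds ratio also equals (tp * tn) / (fp * fn), here 1800 / 50 = 36, which is a quick consistency check on the likelihood ratios.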

Usage

score_roc_measures(pred)

Value

list()

A list with two elements: confusion_matrix, the 2 x 2 confusion matrix of absolute frequencies, and measures, a list of the measures listed above.

Arguments

pred

(PredictionClassif)
The prediction object.

Examples

# train on the training split and predict on the held-out split
learner = lrn("classif.rpart", predict_type = "prob")
task = tsk("pima")
splits = partition(task, ratio = 0.7)
learner$train(task, row_ids = splits$train)
pred = learner$predict(task, row_ids = splits$test)
score_roc_measures(pred)
