E2E (version 0.1.2)

calculate_metrics_at_threshold_dia: Calculate Classification Metrics at a Specific Threshold

Description

Calculates various classification performance metrics (Accuracy, Precision, Recall, F1-score, Specificity, True Positives, etc.) for binary classification at a given probability threshold.

Usage

calculate_metrics_at_threshold_dia(
  prob_positive,
  y_true,
  threshold,
  pos_class,
  neg_class
)

Value

A list containing:

  • Threshold: The threshold used.

  • Accuracy: Overall prediction accuracy.

  • Precision: Precision for the positive class.

  • Recall: Recall (Sensitivity) for the positive class.

  • F1: F1-score for the positive class.

  • Specificity: Specificity for the negative class.

  • TP, TN, FP, FN, N: Counts of True Positives, True Negatives, False Positives, False Negatives, and total samples.
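For reference, these quantities follow directly from the confusion matrix and can be reproduced with base R alone. The sketch below is a hypothetical illustration of the metric definitions, not the package's implementation; it assumes a prediction is labelled positive when its probability strictly exceeds the threshold, as the description of the threshold argument suggests.

```r
# Hypothetical base-R sketch of the metric definitions (not E2E package code).
y_true <- factor(c("Negative", "Positive", "Positive", "Negative", "Positive"),
                 levels = c("Negative", "Positive"))
prob   <- c(0.1, 0.8, 0.6, 0.3, 0.9)
thr    <- 0.5

# Classify as positive when probability exceeds the threshold (assumed rule).
pred <- ifelse(prob > thr, "Positive", "Negative")

# Confusion-matrix counts.
TP <- sum(pred == "Positive" & y_true == "Positive")
TN <- sum(pred == "Negative" & y_true == "Negative")
FP <- sum(pred == "Positive" & y_true == "Negative")
FN <- sum(pred == "Negative" & y_true == "Positive")

# Metrics as returned in the list above.
accuracy    <- (TP + TN) / length(y_true)
precision   <- TP / (TP + FP)
recall      <- TP / (TP + FN)        # sensitivity
f1          <- 2 * precision * recall / (precision + recall)
specificity <- TN / (TN + FP)
```

On this toy data every prediction is correct (TP = 3, TN = 2, FP = FN = 0), so all five metrics equal 1; note that precision and F1 are undefined (NaN) whenever no sample is predicted positive.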

Arguments

prob_positive

A numeric vector of predicted probabilities for the positive class.

y_true

A factor vector of true class labels.

threshold

A numeric value between 0 and 1, the probability threshold above which a prediction is considered positive.

pos_class

A character string, the label for the positive class.

neg_class

A character string, the label for the negative class.

Examples

y_true_ex <- factor(c("Negative", "Positive", "Positive", "Negative", "Positive"),
                    levels = c("Negative", "Positive"))
prob_ex <- c(0.1, 0.8, 0.6, 0.3, 0.9)
metrics <- calculate_metrics_at_threshold_dia(
  prob_positive = prob_ex,
  y_true = y_true_ex,
  threshold = 0.5,
  pos_class = "Positive",
  neg_class = "Negative"
)
print(metrics)