Compute confusion matrices from multi-model annotations
Usage:

compute_confusion_matrices(
  annotations,
  gold = NULL,
  pairwise = TRUE,
  label_levels = NULL,
  sample_col = "sample_id",
  model_col = "model_id",
  label_col = "label",
  truth_col = "truth"
)

Arguments:

annotations: Output from [explore()] or a compatible data frame with at least `sample_id`, `model_id`, and `label` columns.

gold: Optional vector of gold labels. Overrides the `truth` column when supplied.

pairwise: When `TRUE`, cross-model confusion tables are returned even if no gold labels exist.

label_levels: Optional factor levels to enforce a consistent ordering in the resulting tables.

sample_col, model_col, label_col, truth_col: Column names to use when `annotations` is a custom data frame.

Value:

A list with elements `vs_gold` (named list of confusion matrices, one per model) and `pairwise` (list of pairwise confusion tables).
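Examples:

A minimal sketch of what the returned tables contain, built with base R only. The toy `annotations` data frame below is invented for illustration; the `lapply`/`table()` and `reshape()` calls mirror the shape of the `vs_gold` and `pairwise` elements rather than reproducing the function's internals.

```r
# Toy annotations: two models labeling the same three samples (illustrative data)
annotations <- data.frame(
  sample_id = rep(c("s1", "s2", "s3"), times = 2),
  model_id  = rep(c("m1", "m2"), each = 3),
  label     = c("pos", "neg", "pos", "pos", "pos", "neg"),
  truth     = rep(c("pos", "neg", "pos"), times = 2)
)

# Per-model confusion table against the truth column; each entry has the
# same shape as one element of `vs_gold`.
vs_gold <- lapply(split(annotations, annotations$model_id), function(d) {
  table(predicted = d$label, truth = d$truth)
})
vs_gold$m1

# Pairwise confusion between two models, as one element of `pairwise`
# would hold: align labels by sample, then cross-tabulate.
wide <- reshape(annotations[, c("sample_id", "model_id", "label")],
                idvar = "sample_id", timevar = "model_id", direction = "wide")
table(m1 = wide$label.m1, m2 = wide$label.m2)
```

With `annotations` in the column layout above, the equivalent call would be `compute_confusion_matrices(annotations)`; supply `sample_col` and friends when your data frame uses different column names.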