Create a confusion matrix
Calculates a cross-tabulation of observed and predicted classes with associated statistics.
## S3 method for class 'default':
confusionMatrix(data, reference, positive = NULL,
                dnn = c("Prediction", "Reference"), ...)
- data: a factor of predicted classes
- reference: a factor of classes to be used as the true results
- positive: an optional character string for the factor level that corresponds to a "positive" result (if that makes sense for your data). If there are only two factor levels, the first level will be used as the "positive" result.
- dnn: a character vector of dimnames for the table
- ...: options to be passed to table. NOTE: do not include dnn here
The function requires that the factors have exactly the same levels.
For two-class problems, the sensitivity, specificity, positive
predictive value and negative predictive value are calculated using the
positive argument. For more than two classes, these results are
calculated by comparing each factor level to the remaining levels
(i.e. a "one versus all" approach). In each case, the overall accuracy and Kappa statistic are calculated.
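The two-class statistics described above can be sketched by hand from a 2x2 cross-tabulation. The data below are made up for illustration, with level "a" playing the role of the positive argument:

```r
# Two-class sketch: sensitivity, specificity, PPV and NPV from a 2x2 table,
# treating level "a" as the "positive" result.
pred <- factor(c("a", "a", "b", "b", "a", "b"), levels = c("a", "b"))
obs  <- factor(c("a", "b", "b", "b", "a", "a"), levels = c("a", "b"))

tab <- table(Prediction = pred, Reference = obs)

TP <- tab["a", "a"]   # predicted positive, observed positive
FN <- tab["b", "a"]   # predicted negative, observed positive
FP <- tab["a", "b"]   # predicted positive, observed negative
TN <- tab["b", "b"]   # predicted negative, observed negative

sensitivity <- TP / (TP + FN)   # fraction of observed positives found
specificity <- TN / (TN + FP)   # fraction of observed negatives found
ppv <- TP / (TP + FP)           # positive predictive value
npv <- TN / (TN + FN)           # negative predictive value
```

For more than two classes, the same four quantities are computed once per level by collapsing the remaining levels into a single "negative" class.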
The overall accuracy rate is computed along with a 95 percent confidence interval for this rate (using
binom.test), as well as a one-sided test of whether the accuracy is better than the "no information rate," which is taken to be the largest class percentage in the data.
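A minimal sketch of that inference, using made-up counts (42 correct predictions out of 50, with the largest class making up 60 percent of the data):

```r
# Sketch of the accuracy inference: a confidence interval for the
# accuracy rate, plus a one-sided comparison to the no-information rate.
correct <- 42     # made-up number of correct predictions
n <- 50           # made-up total number of predictions
nir <- 0.60       # made-up largest class percentage (no information rate)

# 95 percent confidence interval for the overall accuracy rate
ci <- binom.test(correct, n)$conf.int

# one-sided test: is the accuracy better than the no-information rate?
p_val <- binom.test(correct, n, p = nir,
                    alternative = "greater")$p.value
```

A small p-value here indicates that the model's accuracy is better than simply predicting the largest class every time.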
- a list with elements:
  - table: the cross-tabulation of data and reference (the results of table)
  - positive: the positive result level
  - overall: a numeric vector with overall accuracy and Kappa statistic values
  - byClass: the sensitivity, specificity, positive predictive value and negative predictive value for each class. For two-class systems, this is calculated once using the positive argument.
numLlvs <- 4
confusionMatrix(
  factor(sample(rep(letters[1:numLlvs], 200), 50)),
  factor(sample(rep(letters[1:numLlvs], 200), 50)))

numLlvs <- 2
confusionMatrix(
  factor(sample(rep(letters[1:numLlvs], 200), 50)),
  factor(sample(rep(letters[1:numLlvs], 200), 50)))