rfUtilities (version 2.0-0)

accuracy: Accuracy

Description

Classification accuracy measures including percent correctly classified (PCC), Cohen's kappa, user's accuracy, and producer's accuracy

Usage

accuracy(x, y)

Arguments

x
vector of predicted data, or a table/matrix contingency table
y
vector of observed data; ignored if x is a contingency table

Value

A list class object with the following components:
  • PCC percent correctly classified (accuracy)
  • users.accuracy The user's accuracy
  • producers.accuracy The producer's accuracy
  • kappa Cohen's Kappa (chance corrected accuracy)
  • sensitivity Sensitivity
  • specificity Specificity
  • plr Positive Likelihood Ratio
  • nlr Negative Likelihood Ratio
  • typeI.error Type I error
  • typeII.error Type II error
  • gain Information gain
  • f.score F-score
  • auc Area Under the ROC Curve
  • confusion A confusion matrix
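
Several of these components follow directly from the confusion matrix. As an illustrative sketch using the standard definitions (not necessarily the exact internals of rfUtilities::accuracy), a few of the metrics can be derived by hand from a 2 x 2 contingency table:

```r
# Hypothetical 2x2 confusion matrix (rows = predicted, columns = observed),
# matching the contingency-table example below
cm <- matrix(c(15, 11, 2, 123), nrow = 2,
             dimnames = list(predicted = c("Pres", "Abs"),
                             observed  = c("Pres", "Abs")))
n  <- sum(cm)
tp <- cm[1, 1]; fp <- cm[1, 2]
fn <- cm[2, 1]; tn <- cm[2, 2]

pcc         <- (tp + tn) / n   # percent correctly classified
sensitivity <- tp / (tp + fn)  # true positive rate
specificity <- tn / (tn + fp)  # true negative rate

# Cohen's kappa: observed agreement corrected for chance agreement,
# where chance agreement comes from the row and column marginals
p.obs    <- (tp + tn) / n
p.chance <- ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n^2
kappa    <- (p.obs - p.chance) / (1 - p.chance)
```

These hand-computed values should match the corresponding components of the list returned by accuracy() for the same table.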

References

Cohen, J. (1960) A coefficient of agreement for nominal scales. Educational and Psychological Measurement 20(1):37-46.

Cohen, J. (1968) Weighted kappa: Nominal scale agreement with provision for scaled disagreement or partial credit. Psychological Bulletin 70(4):213-220.

Powers, D.M.W. (2011) Evaluation: From Precision, Recall and F-Measure to ROC, Informedness, Markedness & Correlation. Journal of Machine Learning Technologies 2(1):37-63.

Examples

 # Two classes (vector)
 observed <- sample(c(rep("Pres", 50), rep("Abs", 50)), 100, replace = TRUE)
 accuracy(observed[sample(1:length(observed))], observed)

 # Two classes (contingency table)
 accuracy(cbind(c(15, 11), c(2, 123)))

 # Multiple classes
 accuracy(iris[sample(1:150), ]$Species, iris$Species)