calcStats calculates the performance of a deployed model.
calcStats(object, aucSkip = FALSE, plotSkip = FALSE, verbose = TRUE)

# S4 method for ExprsPredict
calcStats(object, aucSkip = FALSE,
plotSkip = FALSE, verbose = TRUE)
# S4 method for RegrsPredict
calcStats(object, aucSkip = FALSE,
plotSkip = FALSE, verbose = TRUE)
object: An ExprsPredict or RegrsPredict object.
aucSkip: A logical scalar. Toggles whether to calculate the area under the receiver operating characteristic (ROC) curve. See Details.
plotSkip: A logical scalar. Toggles whether to plot the ROC curve. See Details.
verbose: A logical scalar. Toggles whether to print the results of model performance to the console.
Returns a data.frame of performance metrics.
ExprsPredict: Method to calculate performance for classification models.
RegrsPredict: Method to calculate performance for continuous outcome models.
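A typical call might look like the following (a hedged sketch: the object `pred` is hypothetical, standing in for an ExprsPredict returned by a prior prediction step in an exprso workflow):

```r
# 'pred' is a hypothetical ExprsPredict object from an earlier predict() call
stats <- calcStats(pred, aucSkip = FALSE, plotSkip = TRUE, verbose = FALSE)
stats  # a data.frame of performance metrics
```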
For classification, if aucSkip = FALSE AND the ExprsArray
object was an ExprsBinary object with at least one case and one control AND
the ExprsPredict object contains a coherent @probability slot, calcStats
will calculate classifier performance using the area under the receiver operating
characteristic (ROC) curve via the ROCR package. Otherwise, calcStats
will calculate classifier performance traditionally using a confusion matrix.
Note that accuracies calculated using ROCR may differ from those calculated
using a confusion matrix because ROCR adjusts the discrimination threshold to
optimize sensitivity and specificity. This threshold is chosen automatically as the
point along the ROC curve that minimizes the Euclidean distance from the point (0, 1).
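The threshold rule described above can be sketched in base R (a minimal illustration, independent of calcStats itself; the `fpr` and `tpr` vectors are hypothetical ROC coordinates, not output of the ROCR package):

```r
# Hypothetical ROC coordinates (false positive rate, true positive rate)
fpr <- c(0, 0.1, 0.2, 0.4, 1)
tpr <- c(0, 0.6, 0.8, 0.9, 1)

# Pick the point that minimizes Euclidean distance from the ideal corner (0, 1)
dist <- sqrt(fpr^2 + (1 - tpr)^2)
best <- which.min(dist)
c(fpr = fpr[best], tpr = tpr[best])
```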
For regression, accuracy is defined as the R-squared of the fitted regression. This
ranges from 0 to 1 for use with pl and pipe. Note that
the aucSkip and plotSkip arguments are ignored for regression.
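For reference, the R-squared that defines regression accuracy can be computed from observed and predicted values as follows (a base R sketch; `y` and `y_hat` are hypothetical vectors, not slots of RegrsPredict):

```r
y     <- c(2.1, 3.4, 4.8, 6.0)   # observed outcomes (hypothetical)
y_hat <- c(2.0, 3.6, 4.5, 6.2)   # model predictions (hypothetical)

# R-squared: 1 - residual sum of squares / total sum of squares
rsq <- 1 - sum((y - y_hat)^2) / sum((y - mean(y))^2)
rsq
```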