This function plots the test ROC curve of each model found in the cross-validation process. It also aggregates the models into a single prediction and plots the resulting ROC curve (model coherence). Furthermore, it plots the mean sensitivity at a given set of specificities.
plotModels.ROC(modelPredictions,
               number.of.models = 0,
               specificities = c(0.975, 0.95, 0.90, 0.80, 0.70, 0.60, 0.50, 0.40, 0.30, 0.20, 0.10, 0.05),
               theCVfolds = 1,
               predictor = "Prediction",
               cex = 1.0,
               ...)
modelPredictions: A data frame returned by the crossValidationFeatureSelection_Bin function: either the Models.testPrediction, FullBSWiMS.testPrediction, Models.CVtestPredictions, TestRetrained.blindPredictions, KNN.testPrediction, or LASSO.testPredictions value
number.of.models: The maximum number of models to plot
specificities: Vector containing the specificities at which the ROC sensitivities will be calculated
theCVfolds: The number of folds performed in the cross-validation experiment
predictor: The name of the column to be plotted
cex: Controls the font size of the text inside the plots
...: Additional parameters passed to the roc function (pROC package)
The function returns a list containing:
- A vector with the AUC of each ROC
- A vector with the mean sensitivity at the specificities given by specificities
- A matrix where each row contains the sensitivities at the specificities given by specificities for a different ROC
- The specificities used to calculate the sensitivities
- The AUC of the ROC curve obtained from the mean sensitivities
- The confusion matrix between the outcome and the ensemble prediction
- The ensemble (median) of the repeated predictions
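A minimal usage sketch, assuming cvObject is the result of a prior call to crossValidationFeatureSelection_Bin on a binary-outcome data set (the variable name cvObject is hypothetical; the Models.testPrediction component and the parameters shown are those documented above):

```r
library(FRESA.CAD)

# Hedged sketch: cvObject is assumed to hold the output of
# crossValidationFeatureSelection_Bin(...); its Models.testPrediction
# component is one of the accepted inputs listed above.
rocResults <- plotModels.ROC(cvObject$Models.testPrediction,
                             theCVfolds = 5,           # folds used in the CV experiment
                             predictor = "Prediction", # column holding the predicted values
                             cex = 0.8)

# The returned list includes the per-model AUCs, the mean sensitivities
# at the requested specificities, and the ensemble prediction.
str(rocResults)
```

Additional arguments (e.g. smooth or ci) would be forwarded through ... to pROC's roc function.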