AutoScore STEP(v): Evaluate the final score with ROC analysis (AutoScore Module 6)
Usage

AutoScore_testing(
  test_set,
  final_variables,
  cut_vec,
  scoring_table,
  threshold = "best",
  with_label = TRUE,
  metrics_ci = TRUE
)

Value

A data frame with the predicted score and the outcome for downstream visualization.
Arguments

test_set: A processed data.frame containing the data used for testing. It should have the same format as train_set (same variable names and outcome).

final_variables: A vector containing the list of selected variables, generated from STEP(ii) AutoScore_parsimony. Run vignette("Guide_book", package = "AutoScore") to see the guidebook or vignette.

cut_vec: Generated from STEP(iii) AutoScore_weighting. Please follow the guidebook.

scoring_table: The final scoring table after fine-tuning, generated from STEP(iv) AutoScore_fine_tuning. Please follow the guidebook.

threshold: Score threshold for the ROC analysis to generate sensitivity, specificity, etc. If set to "best", the optimal threshold will be calculated (default: "best").

with_label: Set to TRUE if the test_set contains labels, in which case performance will be evaluated accordingly (default: TRUE). Set to FALSE if there is no "label" column in the test_set; the final predicted scores will then be returned without performance evaluation.

metrics_ci: Whether to calculate confidence intervals for metrics such as sensitivity and specificity.
References

Xie F, Chakraborty B, Ong MEH, Goldstein BA, Liu N. AutoScore: A Machine Learning-Based Automatic Clinical Score Generator and Its Application to Mortality Prediction Using Electronic Health Records. JMIR Medical Informatics 2020;8(10):e21798.
See Also

AutoScore_rank, AutoScore_parsimony, AutoScore_weighting, AutoScore_fine_tuning, print_roc_performance. Run vignette("Guide_book", package = "AutoScore") to see the guidebook or vignette.
Examples

## Please see the guidebook or vignettes
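A minimal sketch of how this evaluation step might be called, assuming the objects final_variables, cut_vec, and scoring_table were produced by the earlier AutoScore steps (ii)-(iv) on the same training data; the object names here are illustrative, not part of the package:

```r
library(AutoScore)

## Hypothetical usage: evaluate the final score on a held-out test set.
## test_set must share variable names and the outcome column with train_set.
pred_score <- AutoScore_testing(
  test_set        = test_set,
  final_variables = final_variables,  # from STEP(ii) AutoScore_parsimony
  cut_vec         = cut_vec,          # from STEP(iii) AutoScore_weighting
  scoring_table   = scoring_table,    # from STEP(iv) AutoScore_fine_tuning
  threshold       = "best",           # derive the optimal cut-off from the ROC curve
  with_label      = TRUE,             # test_set contains the outcome label
  metrics_ci      = TRUE              # confidence intervals for sensitivity etc.
)

## Predicted scores alongside the true outcome, ready for visualization
head(pred_score)
```

If the test set is unlabeled, setting with_label = FALSE returns only the predicted scores, skipping the ROC-based performance evaluation.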