This function lets the user get a confusion matrix and accuracy, and, for binary classification models, AUC, Precision, Sensitivity, and Specificity.
model_metrics(tag, score, multis = NA, abc = TRUE, thresh = 10,
thresh_cm = 0.5, plots = TRUE, subtitle = NA)
tag: Vector. Real known labels.
score: Vector. Predicted values or model's results.
multis: Data.frame. Contains columns with each category's score (only used when more than 2 categories coexist).
abc: Boolean. Arrange columns and rows alphabetically when values are categorical?
thresh: Integer. Threshold for deciding between binary and regression models: if 'tag' has more than this many unique values, the model is treated as regression; otherwise, as classification.
thresh_cm: Numeric. Value at which to split the results for the confusion matrix. Range of values: (0, 1).
plots: Boolean. Include plots?
subtitle: Character. Subtitle for plots.
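Below is a minimal sketch of a binary-classification call, following the Usage signature above. The data is simulated purely for illustration, and it assumes 'tag' may be passed as a character vector of class labels and 'score' as numeric predicted probabilities.

library(lares)

set.seed(123)
n <- 500
tag <- sample(c("yes", "no"), n, replace = TRUE)                # real known labels
score <- ifelse(tag == "yes", rbeta(n, 4, 2), rbeta(n, 2, 4))   # simulated model scores

# Binary case: confusion matrix, accuracy, AUC, Precision, Sensitivity, Specificity
results <- model_metrics(tag, score, thresh_cm = 0.5, plots = TRUE,
                         subtitle = "Simulated binary example")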
Other Machine Learning: ROC, clusterKmeans, conf_mat, export_results, gain_lift, h2o_automl, h2o_predict_API, h2o_predict_MOJO, h2o_predict_binary, h2o_predict_model, h2o_selectmodel, impute, iter_seeds, mplot_conf, mplot_cuts_error, mplot_cuts, mplot_density, mplot_full, mplot_gain, mplot_importance, mplot_lineal, mplot_metrics, mplot_response, mplot_roc, mplot_splits, msplit
Other Calculus: ROC, conf_mat, corr, deg2num, dist2d, errors, loglossBinary, mae, mape, mse, quants, rmse, rsqa, rsq