Bayes factor for Model Comparison
If all models were fit to the same data, compare_performance()
  returns an additional column named BF, containing the Bayes factor
  (see bayesfactor_models) of each model against
  the denominator model. The first model serves as the denominator,
  and its Bayes factor is set to NA to indicate that it is the reference model.
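A minimal sketch of this behavior, assuming the performance package (and bayestestR, which supplies the Bayes factor computation) is installed, using the built-in mtcars data:

```r
library(performance)

# Two models fit to the same data
m1 <- lm(mpg ~ wt, data = mtcars)
m2 <- lm(mpg ~ wt + cyl, data = mtcars)

# The BF column compares each model against m1 (the denominator);
# m1's own BF is NA, marking it as the reference model
compare_performance(m1, m2)
```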
  
Ranking Models
When rank = TRUE, a new column Performance_Score is returned. This
  score ranges from 0% to 100%, with higher values indicating better model
  performance. It is calculated by normalizing all indices (i.e. rescaling
  them to a range from 0 to 1) and taking the mean of all indices for each
  model. This is a rather quick heuristic, but it may be helpful as an
  exploratory index.
In particular, when models are of different types (e.g. mixed models,
  classical linear models, logistic regression, ...), not all indices can be
  computed for each model. If an index cannot be calculated for a specific
  model type, that model receives an NA value for it. Any index containing
  NA values is excluded from the calculation of the performance score.
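For example, ranking three nested models fit to the same data (a sketch, assuming the performance package is installed):

```r
library(performance)

m1 <- lm(mpg ~ wt, data = mtcars)
m2 <- lm(mpg ~ wt + cyl, data = mtcars)
m3 <- lm(mpg ~ wt + cyl + hp, data = mtcars)

# Performance_Score is the mean of all normalized indices,
# expressed as a percentage; higher = better
compare_performance(m1, m2, m3, rank = TRUE)
```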
There is a plot()-method for compare_performance(),
  which creates a "spiderweb" plot, in which the different indices are
  normalized and larger values indicate better model performance.
  Hence, points closer to the center indicate worse fit indices
  (see the online documentation
  for more details).
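A sketch of the plot method (assuming the performance package is installed, along with the see package, which provides the plot methods for easystats objects):

```r
library(performance)

m1 <- lm(mpg ~ wt, data = mtcars)
m2 <- lm(mpg ~ wt + cyl, data = mtcars)

result <- compare_performance(m1, m2)

# Spiderweb plot of normalized indices; points closer to the
# center indicate worse values on that index
plot(result)
```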