SpecsVerification (version 0.5-3)

ScoreDiff: Calculate average score difference and assess uncertainty

Description

Calculate the score difference (mean score of the reference forecast) minus (mean score of the forecast). For negatively oriented scores such as the CRPS, a positive difference indicates that the forecast outperforms the reference. Uncertainty is assessed by the Diebold-Mariano test for equality of predictive accuracy.

Usage

ScoreDiff(
  scores,
  scores.ref,
  N.eff = NA,
  conf.level = 0.95,
  handle.na = "na.fail"
)

Arguments

scores

vector of verification scores

scores.ref

vector of verification scores of the reference forecast, must be of the same length as `scores`

N.eff

user-defined effective sample size to be used in hypothesis test and for confidence bounds; if NA, the length of `scores` is used; default: NA

conf.level

confidence level for the confidence interval; default: 0.95

handle.na

how missing values in the scores vectors should be handled; possible values are 'na.fail' (stop with an error if any value is missing) and 'use.pairwise.complete' (drop every pair with a missing value in either vector); default: 'na.fail'
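The pairwise-complete filtering can be sketched in base R; this is an illustration of the behaviour, not the package's internal code, and the example vectors are made up:

```r
# Two score vectors with missing values at different positions
scores     <- c(1.2, NA, 1.5, 1.1)
scores.ref <- c(1.4, 1.0, NA, 1.3)

# 'use.pairwise.complete': keep only indices where both values are present
keep <- !is.na(scores) & !is.na(scores.ref)
scores.kept     <- scores[keep]      # 1.2, 1.1
scores.ref.kept <- scores.ref[keep]  # 1.4, 1.3
```

Note that dropping pairs shortens the vectors, which also reduces the effective sample size used in the test unless `N.eff` is set explicitly.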

Value

vector with mean score difference, estimated standard error of the mean, one-sided p-value of the Diebold-Mariano test, and the user-specified confidence interval
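The returned quantities can be reproduced in base R. A minimal sketch under the default settings, assuming a normal approximation for the Diebold-Mariano statistic and `N.eff` equal to the vector length; all variable names and the example data are illustrative, not those used internally by SpecsVerification:

```r
# Illustrative score vectors (lower score = better forecast)
scores     <- c(1.2, 0.8, 1.5, 1.1, 0.9)  # forecast
scores.ref <- c(1.4, 1.0, 1.6, 1.3, 1.2)  # reference forecast

d      <- scores.ref - scores     # per-instance score differences
n.eff  <- length(scores)          # default effective sample size
d.mean <- mean(d)                 # mean score difference
d.se   <- sd(d) / sqrt(n.eff)     # standard error of the mean
# one-sided p-value (normal approximation to the DM statistic)
p.val  <- 1 - pnorm(d.mean / d.se)
# two-sided confidence interval at conf.level = 0.95
ci     <- d.mean + qnorm(c(0.025, 0.975)) * d.se
```

Here `d.mean > 0` means the forecast achieves a lower mean score than the reference.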

References

Diebold, F.X. and Mariano, R.S. (1995): Comparing Predictive Accuracy. Journal of Business & Economic Statistics, 13(3), 253-263. https://www.jstor.org/stable/1392185

See Also

SkillScore

Examples

data(eurotempforecast)
# CRPS difference: full ensemble (scores) vs. a two-member sub-ensemble (reference)
ScoreDiff(EnsCrps(ens, obs), EnsCrps(ens[, 1:2], obs))