ForeCA (version 0.2.6)

Omega: Estimate forecastability of a time series

Description

An estimator for the forecastability \(\Omega(x_t)\) of a univariate time series \(x_t\). Currently it uses a discrete plug-in estimator based on the empirical spectrum (periodogram).

Usage

Omega(
  series = NULL,
  spectrum.control = list(),
  entropy.control = list(),
  mvspectrum.output = NULL
)

Arguments

series

a univariate time series; if it is multivariate, then Omega works component-wise (i.e., same as apply(series, 2, Omega)); see the multivariate example in Examples below.

spectrum.control

list; control settings for spectrum estimation. See complete_spectrum_control for details.

entropy.control

list; control settings for entropy estimation. See complete_entropy_control for details.

mvspectrum.output

an object of class "mvspectrum" representing the multivariate spectrum of \(\mathbf{X}_t\) (not necessarily normalized).

Value

A real value between \(0\) and \(100\) (in %): \(0\) means not forecastable (white noise); \(100\) means perfectly forecastable (a sinusoid).

Details

The forecastability of a stationary process \(x_t\) is defined as (see References)

$$ \Omega(x_t) = 1 - \frac{ - \int_{-\pi}^{\pi} f_x(\lambda) \log f_x(\lambda) d \lambda }{\log 2 \pi} \in [0, 1] $$ where \(f_x(\lambda)\) is the normalized spectral density of \(x_t\). In particular \( \int_{-\pi}^{\pi} f_x(\lambda) d\lambda = 1\).

For white noise \(\varepsilon_t\), forecastability is \(\Omega(\varepsilon_t) = 0\); for a sum of sinusoids it equals \(100\%\). Empirically, however, the estimate reaches \(100\%\) only if the estimated spectrum has exactly one peak at some \(\omega_j\) and \(\widehat{f}(\omega_k) = 0\) for all \(k \neq j\).
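The white-noise case follows directly from the definition: white noise has a flat normalized spectral density \(f_\varepsilon(\lambda) = \frac{1}{2\pi}\), so

$$ - \int_{-\pi}^{\pi} \frac{1}{2\pi} \log \frac{1}{2\pi} \, d\lambda = \log 2\pi, \quad \textrm{hence} \quad \Omega(\varepsilon_t) = 1 - \frac{\log 2\pi}{\log 2\pi} = 0. $$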

In practice, a time series of length \(T\) has \(T\) Fourier frequencies, which represent a discrete probability distribution. The entropy of \(f_x(\lambda)\) must therefore be normalized by \(\log T\), not by \(\log 2 \pi\).
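As an illustration of this discrete plug-in idea, the sketch below estimates the spectrum with base R's spec.pgram() and normalizes the discrete entropy by the log of the number of frequency bins. The omega_plugin() helper is hypothetical and only approximates what Omega() does: Omega() itself estimates the spectrum via mvspectrum() according to spectrum.control and computes the entropy according to entropy.control.

omega_plugin <- function(x) {
  # raw (one-sided) periodogram; Omega() uses mvspectrum() instead
  pg <- spec.pgram(x, taper = 0, plot = FALSE)$spec
  f.hat <- pg / sum(pg)                 # normalize to a discrete distribution
  nfreq <- length(f.hat)                # number of Fourier frequencies used
  f.hat <- f.hat[f.hat > 0]             # drop zero bins before taking logs
  H <- -sum(f.hat * log(f.hat))         # discrete (Shannon) entropy
  100 * (1 - H / log(nfreq))            # normalize by log(# frequencies), in %
}

omega_plugin(rnorm(200))                        # small for white noise (0 in theory)
omega_plugin(sin(2 * pi * seq_len(200) / 20))   # near 100 for a pure sinusoid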

Several smoothing techniques can also be used to obtain a lower-variance estimate of \(f_x(\lambda)\).

References

Goerg, G. M. (2013). “Forecastable Component Analysis”. Journal of Machine Learning Research (JMLR) W&CP 28(2): 64-72. Available at http://jmlr.org/proceedings/papers/v28/goerg13.html.

See Also

spectral_entropy, discrete_entropy, continuous_entropy

Examples

# NOT RUN {
nn <- 100
eps <- rnorm(nn)  # white noise has Omega() = 0 in theory
Omega(eps, spectrum.control = list(method = "direct"))
# smoothing makes it closer to 0
Omega(eps, spectrum.control = list(method = "wosa"))

xx <- sin(seq_len(nn) * pi / 10)
Omega(xx, spectrum.control = list(method = "direct"))
Omega(xx, entropy.control = list(threshold = 1/40))
Omega(xx, spectrum.control = list(method = "wosa"),
      entropy.control = list(threshold = 1/20))

# an AR(1) with phi = 0.5
yy <- arima.sim(n = nn, model = list(ar = 0.5))
Omega(yy, spectrum.control = list(method = "wosa"))

# an AR(1) with phi = 0.9 is more forecastable
yy <- arima.sim(n = nn, model = list(ar = 0.9))
Omega(yy, spectrum.control = list(method = "wosa"))
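
# Per the 'series' argument, Omega also works component-wise on multivariate
# input, i.e., the same as apply(cbind(eps, xx), 2, Omega); the column names
# here are purely illustrative.
Omega(cbind(noise = eps, sinusoid = xx))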

# }
