sfa performs Slow Feature Analysis (SFA) on a K-dimensional time series with T observations.
Important: This implementation of SFA is just the most basic version; it is merely included here for convenience in initialize_weightvector. If you want to actually use the full functionality of SFA in R, use the rSFA package, which has much more advanced and efficient implementations. sfa() here corresponds to sfa1 in the rSFA package.
sfa(series, ...)
series: a T x K array with T observations from the K-dimensional time series. Can be a matrix, a data.frame, or a multivariate ts object.
...: additional arguments
An object of class sfa which inherits methods from princomp. Signals are ordered from slowest to fastest.
Slow Feature Analysis (SFA) finds slow signals (see References below). The problem has an analytic solution and thus can be computed quickly using generalized eigen-value solvers. For ForeCA it is important to know that SFA is equivalent to finding a linear combination signal with the largest lag 1 autocorrelation.

The disadvantage of SFA for forecasting is that, e.g., white noise (WN) is ranked higher than an AR(1) with a negative autocorrelation coefficient, even though the AR(1) is the more forecastable signal. Note though that maximizing (or minimizing) the lag 1 autocorrelation does not necessarily yield the most forecastable signal (as measured by Omega), but it is a good start.
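This equivalence can be sketched in a few lines of base R. The snippet below is not the package implementation; the helper name sfa_sketch is made up for illustration and it assumes a numeric T x K input with K >= 2. It whitens the series and eigen-decomposes the covariance of the first differences; since E[(y_t - y_{t-1})^2] = 2 * (1 - rho_1) for a unit-variance signal y, the smallest eigenvalue picks out the slowest signal, i.e. the one with the largest lag 1 autocorrelation.

# Sketch only (not the package code): linear SFA via whitening + eigen-decomposition.
sfa_sketch <- function(X) {
  X <- scale(as.matrix(X), center = TRUE, scale = FALSE)   # demean
  eC <- eigen(cov(X), symmetric = TRUE)                    # whiten so that cov(Z) = I
  S <- eC$vectors %*% diag(1 / sqrt(eC$values)) %*% t(eC$vectors)
  Z <- X %*% S
  eD <- eigen(cov(diff(Z)), symmetric = TRUE)              # slowness of whitened signals
  ord <- order(eD$values)                                  # smallest eigenvalue = slowest
  W <- S %*% eD$vectors[, ord, drop = FALSE]
  list(weights = W, scores = X %*% W)                      # slowest to fastest
}

Applied to the EuStockMarkets returns from the examples below, the scores of this sketch should essentially match ss$scores, up to sign and numerical details of the whitening.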
Laurenz Wiskott and Terrence J. Sejnowski (2002). “Slow Feature Analysis: Unsupervised Learning of Invariances”, Neural Computation 14:4, 715-770.
XX <- diff(log(EuStockMarkets[-c(1:100),])) * 100
plot(ts(XX))
ss <- sfa(XX[,1:4])
summary(ss)
plot(ss)
plot(ts(ss$scores))
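# lag 1 autocorrelation of each extracted signal, ordered from slowest to fastest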
apply(ss$scores, 2, function(x) acf(x, plot = FALSE)$acf[2])
biplot(ss)
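To illustrate the caveat from the details above, the snippet below (not part of the package examples; data are simulated here) applies sfa() to white noise and an AR(1) with a negative coefficient. The slowest extracted signal tracks the white noise, even though the AR(1) is the more forecastable series.

# Sketch only: white noise vs. AR(1) with a negative coefficient (simulated data)
set.seed(1)
YY <- cbind(WN = rnorm(500),
            AR1.neg = as.numeric(arima.sim(list(ar = -0.7), n = 500)))
sf <- sfa(YY)
# slowest component has lag 1 autocorrelation near 0 (the white noise),
# the fastest has a clearly negative one (the AR(1) signal)
apply(sf$scores, 2, function(x) acf(x, plot = FALSE)$acf[2])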