This function is a wrapper for mgcv's gam and its siblings, fitting models of the general form
$Y_i(t) = \mu(t) + \int X_i(s)\beta(s,t)\,ds + f(z_{1i}, t) + f(z_{2i}) + z_{3i} \beta_3(t) + \dots + E_i(t)$
with a functional (but not necessarily continuous) response $Y(t)$,
(optional) smooth intercept $\mu(t)$, (multiple) functional covariates $X(t)$ and scalar covariates
$z_1$, $z_2$, etc. The residual functions $E_i(t) \sim GP(0, K(t,t'))$ are assumed to be i.i.d.
realizations of a Gaussian process. An estimate of the covariance operator $K(t,t')$ evaluated on yind
has to be supplied in the hatSigma-argument.

Usage:

pffrGLS(formula, yind, hatSigma, algorithm = NA, method = "REML",
  tensortype = c("te", "t2"),
  bs.yindex = list(bs = "ps", k = 5, m = c(2, 1)),
  bs.int = list(bs = "ps", k = 20, m = c(2, 1)),
  cond.cutoff = 500, ...)

Arguments:

formula, yind: see pffr.
hatSigma: (an estimate of) the covariance operator $K(t,t')$ of the residual process, evaluated on yind. See Details.
algorithm, method, tensortype, bs.yindex, bs.int, ...: see pffr.
cond.cutoff: if the condition number of hatSigma is greater than this, hatSigma is made ``more'' positive-definite via nearPD to ensure a condition number equal to cond.cutoff. Defaults to 500.

Value:

A fitted pffr-object, see pffr.

Details:

hatSigma has to be positive definite. If hatSigma is close to positive semi-definite or badly conditioned,
estimated standard errors become unstable (typically much too small). pffrGLS will try to diagnose this and issue a warning.
The danger is especially great if the number of functional observations is smaller than the number of gridpoints
(i.e., length(yind)), since the raw covariance estimate will not have full rank.
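For instance, a raw covariance estimate computed from fewer curves than gridpoints is rank-deficient. The following is a minimal sketch of diagnosing and regularizing such an estimate; the toy curves, the grid size, and the explicit nearPD call with posd.tol = 1/500 (chosen here to mirror the default cond.cutoff) are illustrative assumptions, not code from the package:

library(Matrix)                        # for nearPD()

set.seed(1)
n     <- 20                            # fewer curves than gridpoints
tgrid <- seq(0, 1, length.out = 50)    # evaluation grid of the response (yind)

## toy residual curves drawn from a Gaussian process with exponential covariance
Ktrue <- outer(tgrid, tgrid, function(a, b) exp(-abs(a - b) / 0.2))
E     <- MASS::mvrnorm(n, mu = rep(0, length(tgrid)), Sigma = Ktrue)

hatSigma <- cov(E)    # raw estimate: rank at most n - 1 < length(tgrid)
kappa(hatSigma)       # condition number is astronomically large

## lift the tiny eigenvalues so the condition number is roughly bounded by 500
## before handing the matrix to pffrGLS
hatSigma <- as.matrix(nearPD(hatSigma, posd.tol = 1/500)$mat)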
Please see pffr for details on model specification and
implementation.
THIS IS AN EXPERIMENTAL VERSION AND NOT WELL TESTED YET -- USE AT YOUR OWN RISK.

See also: pffr, fpca.sc
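As an illustration (an untested sketch, not taken from the package's examples): simulate function-on-function data with correlated residual curves and fit the model above. The simulated data, the grid sizes, and the shortcut of plugging in the known simulation covariance as hatSigma are assumptions made purely for demonstration; a real analysis would estimate the residual covariance, e.g. via fpca.sc.

library(refund)

set.seed(2)
n     <- 80
sgrid <- seq(0, 1, length.out = 30)    # grid of the functional covariate
tgrid <- seq(0, 1, length.out = 40)    # grid of the functional response (yind)

## smooth random covariate curves X_i(s) plus a little noise
X <- outer(rnorm(n), sin(2 * pi * sgrid)) +
     outer(rnorm(n), cos(2 * pi * sgrid)) +
     matrix(rnorm(n * length(sgrid), sd = 0.2), n, length(sgrid))

## coefficient surface beta(s, t)
beta <- outer(sgrid, tgrid, function(s, t) 5 * sin(2 * pi * s) * cos(2 * pi * t))

## correlated residual curves E_i(t) ~ GP(0, K(t, t'))
K <- outer(tgrid, tgrid, function(a, b) 0.5 * exp(-abs(a - b) / 0.1))
E <- MASS::mvrnorm(n, mu = rep(0, length(tgrid)), Sigma = K)

## response: smooth intercept + \int X_i(s) beta(s, t) ds (Riemann sum) + E_i(t)
mu <- sin(2 * pi * tgrid)
Y  <- matrix(mu, n, length(tgrid), byrow = TRUE) +
      X %*% beta * (sgrid[2] - sgrid[1]) + E

## plug-in covariance for the residual process; here we cheat and use the
## known K, in practice this would be estimated from the data
hatSigma <- K

fit <- pffrGLS(Y ~ ff(X, xind = sgrid), yind = tgrid, hatSigma = hatSigma)
summary(fit)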