This function computes factor scores for observations. Using factor
scores, we can represent the original data point \(y_j\) in a
q-dimensional reduced space. This is only meaningful in the case of
mcfa or mctfa models, as the factor scores for mfa and mtfa are
white noise.
The estimated conditional expectation of the unobservable factor
\(U_{ij}\), given \(y_j\) and its membership of the \(i\)th
component, can be expressed as
$$
\hat{u}_{ij} = E_{\hat{\Psi}}\{U_{ij} \mid y_j, z_{ij} = 1\}.
$$
The estimated mean of \(U_{ij}\) (over the
component memberships of \(y_j\))
is given by
$$
\hat{u}_{j} = \sum_{i=1}^g \tau_i(y_j; \hat{\Psi}) \hat{u}_{ij},
$$
where \(\tau_i(y_j; \hat{\Psi})\) is the
estimated posterior probability that \(y_j\)
belongs to the \(i\)th component.
An alternative estimate of \(u_j\), the posterior expectation
of the factor corresponding to the \(j\)th observation \(y_j\), is
obtained by replacing \(\tau_i(y_j;\,\hat{\Psi})\) with \(\hat{z}_{ij}\),
where
\(\hat{z}_{ij} = 1\) if
\(\hat{\tau}_i(y_j; \hat{\Psi}) \geq \hat{\tau}_h(y_j; \hat{\Psi})\
(h=1,\,\dots,\,g;\ h \neq i)\), and
\(\hat{z}_{ij} = 0\) otherwise:
$$
\hat{u}_{j}^C = \sum_{i=1}^g \hat{z}_{ij} \hat{u}_{ij}.
$$
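As a minimal numerical sketch of combining the component-specific scores, the posterior-mean score \(\hat{u}_j\) weights each \(\hat{u}_{ij}\) by \(\tau_i\), while the clustered score \(\hat{u}_j^C\) uses the 0/1 indicator \(\hat{z}_{ij}\). The values below are hypothetical toy inputs, not output from any fitted model:

```python
import numpy as np

# Hypothetical example for one observation y_j: g = 2 components, q = 3 factors.
# Row i of u_ij holds the component-specific score \hat{u}_{ij};
# tau[i] is the estimated posterior probability tau_i(y_j; Psi-hat).
u_ij = np.array([[0.5, -1.2, 0.3],
                 [1.1,  0.4, -0.7]])   # shape (g, q)
tau = np.array([0.8, 0.2])             # posterior probabilities, sum to 1

# Posterior-mean score: \hat{u}_j = sum_i tau_i * \hat{u}_{ij}
u_mean = tau @ u_ij

# Clustered score: replace tau_i by the indicator \hat{z}_{ij} of the
# most probable component, so \hat{u}_j^C = \hat{u}_{ij} for that component.
z = np.zeros_like(tau)
z[np.argmax(tau)] = 1.0
u_clustered = z @ u_ij
```

Note that \(\hat{u}_j^C\) simply selects the score of the maximum-posterior component, whereas \(\hat{u}_j\) blends all \(g\) component scores.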
For MFA, we have
$$
\hat{u}_{ij} = \hat{\beta}_i^T (y_j - \hat{\mu}_i),
$$
and
$$
\hat{u}_{j} = \sum_{i=1}^g \tau_i(y_j; \hat{\Psi}) \hat{\beta}_i^T
(y_j - \hat{\mu}_i)
$$
for \(j = 1, \dots, n\) where
\(\hat{\beta}_i = (\hat{B}_i\hat{B}_i^T + \hat{D}_i)^{-1} \hat{B}_i\).
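The MFA score for one component can be sketched as follows; all parameter values here are hypothetical stand-ins for the estimates \(\hat{B}_i\), \(\hat{D}_i\), and \(\hat{\mu}_i\):

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 5, 2          # observed and factor dimensions (toy values)

# Hypothetical estimates for one component i of an MFA model.
B = rng.standard_normal((p, q))          # factor loadings B_i-hat
D = np.diag(rng.uniform(0.5, 1.5, p))    # diagonal error covariance D_i-hat
mu = rng.standard_normal(p)              # component mean mu_i-hat

# beta_i = (B_i B_i^T + D_i)^{-1} B_i, computed via a linear solve
# rather than an explicit inverse for numerical stability.
beta = np.linalg.solve(B @ B.T + D, B)   # shape (p, q)

y = rng.standard_normal(p)               # one observation y_j
u_ij = beta.T @ (y - mu)                 # \hat{u}_{ij}, shape (q,)
```

The mean score \(\hat{u}_j\) would then be the \(\tau_i\)-weighted sum of these `u_ij` over the \(g\) components.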
For MCFA,
$$
\hat{u}_{ij} = \hat{\xi}_i + \hat{\gamma}_i^T (y_j -\hat{A}\hat{\xi}_i),
$$
$$
\hat{u}_{j} = \sum_{i=1}^g\tau_i(y_j; \hat{\Psi})
\{\hat{\xi}_i + \hat{\gamma}_i^T(y_j -\hat{A}\hat{\xi}_i)\},
$$
where \(\hat{\gamma}_i = (\hat{A} \hat{\Omega}_i \hat{A}^T + \hat{D})^{-1} \hat{A} \hat{\Omega}_i\).
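The corresponding MCFA computation can be sketched the same way; again the parameter values are hypothetical placeholders for the estimates \(\hat{A}\), \(\hat{\Omega}_i\), \(\hat{D}\), and \(\hat{\xi}_i\):

```python
import numpy as np

rng = np.random.default_rng(1)
p, q = 6, 2          # observed and factor dimensions (toy values)

# Hypothetical estimates for one component i of an MCFA model.
A = rng.standard_normal((p, q))          # common loading matrix A-hat
Omega = np.eye(q) + 0.1                  # factor covariance Omega_i-hat (PD)
D = np.diag(rng.uniform(0.5, 1.5, p))    # diagonal error covariance D-hat
xi = rng.standard_normal(q)              # factor mean xi_i-hat

# gamma_i = (A Omega_i A^T + D)^{-1} A Omega_i, via a linear solve.
gamma = np.linalg.solve(A @ Omega @ A.T + D, A @ Omega)   # shape (p, q)

y = rng.standard_normal(p)               # one observation y_j
u_ij = xi + gamma.T @ (y - A @ xi)       # \hat{u}_{ij}, shape (q,)
```

Unlike MFA, the MCFA score is centred on the component factor mean \(\hat{\xi}_i\) and uses the shared loading matrix \(\hat{A}\) for every component.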
With MtFA and MCtFA, the estimated factor scores
\(\hat{u}_{ij}\) and \(\hat{u}_{j}\)
have the same form as those of MFA and MCFA, respectively.