Assume that a \(p\)-variate time series \({\bf X}_t\) with \(T\) observations is first whitened, i.e. \({\bf Y}_t={\bf S}^{-1/2}\left({\bf X}_t - \frac{1}{T}\sum_{t=1}^T {\bf X}_{t}\right)\), for \(t = 1, \ldots, T,\)
where \({\bf S}\) is the sample covariance matrix of \({\bf X}\).
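The whitening step can be sketched as follows; this is a minimal NumPy illustration, not the package's implementation, and the function name `whiten` is only for this example. The inverse symmetric square root \({\bf S}^{-1/2}\) is formed via an eigendecomposition of \({\bf S}\).

```python
import numpy as np

def whiten(X):
    """Whiten a (T, p) data matrix: center it and rotate/scale it so
    that the sample covariance of the result is the identity matrix."""
    Xc = X - X.mean(axis=0)                  # subtract the sample mean
    S = np.cov(X, rowvar=False)              # sample covariance matrix (p, p)
    # inverse symmetric square root S^{-1/2} via eigendecomposition
    vals, vecs = np.linalg.eigh(S)
    S_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Y = Xc @ S_inv_sqrt                      # whitened observations (T, p)
    return Y, S_inv_sqrt
```

After this step `np.cov(Y, rowvar=False)` is the identity matrix up to floating-point error.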
The whitened values \({\bf Y}_t\) are then split into \(K\) disjoint intervals \(T_i\), \(i = 1, \ldots, K\). The algorithm first computes the matrix
$${\bf M} = \frac{1}{T}\sum_{i = 1}^K \left({\bf m}_{T_i} {\bf m}_{T_i}^T + \frac{1}{2} {\bf S}_{T_i} {\bf S}_{T_i}^T\right) - \frac{1}{2} {\bf I},$$
where \(K\) is the number of intervals, \({\bf I}\) is the \(p \times p\) identity matrix, and \({\bf m}_{T_i}\) and \({\bf S}_{T_i}\) are the sample mean vector and the sample covariance matrix of the values of \({\bf Y}_t\) that belong to the interval \(T_i\).
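The construction of \({\bf M}\) can be sketched as below. This is a hedged transcription of the displayed formula, assuming whitened data `Y` of shape `(T, p)`; equal-length intervals are used here purely for illustration, and the helper name `build_M` is hypothetical.

```python
import numpy as np

def build_M(Y, K):
    """Compute M = (1/T) * sum_i (m_i m_i^T + 0.5 * S_i S_i^T) - 0.5 * I
    from whitened data Y (T, p) split into K disjoint intervals.
    Equal-length intervals are an assumption of this sketch."""
    T, p = Y.shape
    M = np.zeros((p, p))
    for block in np.array_split(Y, K):
        m = block.mean(axis=0, keepdims=True)   # interval mean m_{T_i} (1, p)
        S = np.cov(block, rowvar=False)         # interval covariance S_{T_i} (p, p)
        M += m.T @ m + 0.5 * S @ S.T
    return M / T - 0.5 * np.eye(p)
```

Note that \({\bf M}\) is symmetric by construction, since both \({\bf m}_{T_i}{\bf m}_{T_i}^T\) and \({\bf S}_{T_i}{\bf S}_{T_i}^T\) are symmetric.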
The algorithm then finds an orthogonal matrix \({\bf U}\) via the eigendecomposition
$${\bf M} = {\bf UDU}^T.$$
The final unmixing matrix is then \({\bf W} = {\bf U}^T {\bf S}^{-1/2}\). The first \(k\) columns of \({\bf U}\) are the eigenvectors corresponding to the non-zero eigenvalues of \({\bf M}\), and the remaining \(p-k\) columns correspond to the zero eigenvalues. Accordingly, the first \(k\) rows of \({\bf W}\) project the observed time series onto the subspace of non-stationary components, and the last \(p-k\) rows onto the subspace of stationary components.
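A sketch of this final step, assuming the eigenvectors are the columns returned by `numpy.linalg.eigh` and sorting the eigenvalues in decreasing order so that the non-stationary directions come first; the function name `unmixing` and the explicit split at `k` are illustrative choices, not part of the original description.

```python
import numpy as np

def unmixing(M, S_inv_sqrt, k):
    """Eigendecompose M = U D U^T (eigenvalues sorted in decreasing
    order, so eigenvectors of the largest eigenvalues come first) and
    form the rows projecting onto the two subspaces."""
    vals, vecs = np.linalg.eigh(M)
    order = np.argsort(vals)[::-1]      # decreasing eigenvalue order
    U = vecs[:, order]                  # columns of U are eigenvectors
    W = U.T @ S_inv_sqrt                # rows of W act as projection directions
    # first k rows: non-stationary subspace; last p - k rows: stationary
    return W[:k], W[k:]
```

In practice \(k\) is either assumed known or estimated from the eigenvalues of \({\bf M}\), since the zero eigenvalues identify the stationary directions.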