Common examples of such features include the first-passage time of random walkers on Markovian graphs, thermodynamic properties of lattice models in statistical physics (e.g. the Ising model), and the epidemic threshold of epidemic network models (SIR and SIS models).
SCG differs from traditional clustering schemes by producing a coarse-grained graph (not just a partition of the vertices), representative of the original one. As shown in [1], Principal Component Analysis can be viewed as a particular SCG, called exact SCG, where the matrix to be coarse-grained is the covariance matrix of some data set.
SCG should be of interest to practitioners in various fields dealing with problems where matrix eigenpairs play an important role, as is the case, for instance, for dynamical processes on networks.
Given $M\in\mathbf{C}^{n\times n}$ and $(\lambda,v)$ a (right) eigenpair of $M$ to be preserved by the coarse graining, the problem is to find a projector $\widehat{P}$ solving $$\min_{P\in \Omega} \Vert e_{P}(v)\Vert,$$ where $e_{P}(v)=v-Pv$ is the coarse-graining error on $v$ and $\Omega$ is a set of projectors in $\mathbf{C}^{n\times n}$ described by some ad hoc constraints $c_{1},\dots,c_{r}$ (e.g. $c_{1}:P\in\mathbf{R}^{n\times n}$, $c_{2}:P=P^{t}$, $c_{3}:P_{ij}\ge 0$, etc.).
Choosing pertinent constraints to solve the SCG problem is of great importance in applications. For instance, in the absence of constraints the SCG problem is solved trivially by $\widehat{P}=vv^*$ ($v$ is assumed normalized). We have designed a particular constraint, called homogeneous mixing, which ensures that vertices belonging to the same group are merged consistently from a physical point of view (see [1] for details). Under this constraint the SCG problem reduces to finding the partition of $\{1,\dots,n\}$ (labeling the original vertices) minimizing $$\Vert e_P(v)\Vert^{2} = \sum_{\alpha=1}^{\tilde{n}}\sum_{i\in\alpha}[v(i)-(Pv)(i)]^{2},$$ where $\alpha$ denotes a group (i.e. a block) in a partition of $\{1,\dots,n\}$, $|\alpha|$ is the number of elements in $\alpha$, and $(Pv)(i)=\frac{1}{|\alpha|}\sum_{j\in\alpha}v(j)$ is the average of $v$ over the group containing $i$.
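As an illustration of the objective above, the following NumPy sketch checks that the unconstrained optimum $\widehat{P}=vv^*$ reproduces $v$ exactly, and evaluates $\Vert e_P(v)\Vert^{2}$ under homogeneous mixing for an arbitrary made-up partition (the matrix and group labels are hypothetical, not the igraph implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up symmetric matrix and a normalized eigenvector v to preserve.
M = rng.standard_normal((6, 6))
M = (M + M.T) / 2
eigenvalues, eigenvectors = np.linalg.eigh(M)
v = eigenvectors[:, -1]

# Without constraints, P = v v^T solves the problem exactly: Pv = v.
P_free = np.outer(v, v)
print(np.allclose(P_free @ v, v))    # True

# Under homogeneous mixing, Pv replaces each entry of v by the average
# over its group.  The partition (three groups of two) is arbitrary.
groups = np.array([0, 0, 1, 1, 2, 2])
Pv = np.empty_like(v)
for a in np.unique(groups):
    members = groups == a
    Pv[members] = v[members].mean()

err = np.sum((v - Pv) ** 2)          # ||e_P(v)||^2 as in the sum above
print(err)                           # nonnegative; smaller is better
```

A grouping algorithm would search over partitions to minimize `err`; here the partition is fixed by hand purely for illustration.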
If $M$ is symmetric or stochastic, for instance, then it may be desirable (or mandatory) to choose $L$ and $R$ so that $\widetilde M$ is symmetric or stochastic as well. This structural constraint has led to the construction of particular semi-projectors for symmetric [1], stochastic [3] and Laplacian [2] matrices, all of which are made available.
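For the symmetric case, one standard choice (cf. [1]) takes $L=R$ with rows equal to normalized group indicators, which makes $\widetilde M = LML^{t}$ symmetric whenever $M$ is. The sketch below assumes this construction and a hand-picked partition; it is an illustration of the idea, not the package's code:

```python
import numpy as np

def symmetric_semi_projector(groups):
    """Normalized group indicators: L[a, i] = 1/sqrt(|alpha|) if vertex i
    belongs to group alpha, else 0 (one common choice for symmetric M)."""
    groups = np.asarray(groups)
    labels = np.unique(groups)
    L = np.zeros((labels.size, groups.size))
    for row, a in enumerate(labels):
        members = np.flatnonzero(groups == a)
        L[row, members] = 1.0 / np.sqrt(members.size)
    return L

rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))
M = (M + M.T) / 2                       # symmetric input

L = symmetric_semi_projector([0, 0, 1, 1, 2, 2])
M_tilde = L @ M @ L.T                   # coarse-grained matrix, L = R here

print(np.allclose(M_tilde, M_tilde.T))  # True: symmetry is preserved
print(np.allclose(L @ L.T, np.eye(3)))  # True: rows of L are orthonormal
```

Because the rows of $L$ have disjoint supports and unit norm, $LL^{t}=I$, which is the semi-projector property used throughout.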
In short, the coarse graining of matrices and graphs is handled by the function scg. This function carries out all the steps involved in the Spectral Coarse Graining (SCG) of some particular matrices and graphs, as described above and in reference [1]. In more detail, scg computes some prescribed eigenpairs of a matrix or a graph matrix (for now adjacency, Laplacian and stochastic matrices are available), works out an optimal partition to preserve the eigenpairs, and finally outputs a coarse-grained matrix or graph along with other useful information.
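The workflow just described (compute eigenpairs, find a partition, build the coarse-grained matrix) can be sketched numerically. The following is an illustrative NumPy mock-up with a hand-picked partition, standing in for the igraph functions, not reproducing them:

```python
import numpy as np

# (1) Adjacency matrix of a made-up graph: two triangles joined by an
# edge (a stand-in for get.adjacency on an igraph object).
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

# (2) Eigenpairs of A, here via numpy instead of eigen/arpack.
eigenvalues, eigenvectors = np.linalg.eigh(A)

# (3) A hand-picked partition standing in for a grouping algorithm:
# one group per triangle.
groups = np.array([0, 0, 0, 1, 1, 1])

# (4) Semi-projectors for the symmetric case (L = R, normalized group
# indicators), then the coarse-grained matrix M_tilde = L A R^T.
L = np.zeros((2, 6))
for a in (0, 1):
    members = np.flatnonzero(groups == a)
    L[a, members] = 1.0 / np.sqrt(members.size)
M_tilde = L @ A @ L.T

print(M_tilde)                      # [[2, 1/3], [1/3, 2]]
print(np.linalg.eigvalsh(M_tilde))  # coarse spectrum, to compare with
print(eigenvalues)                  # the spectrum of A
```

Each diagonal entry of the $2\times 2$ coarse matrix aggregates the edges inside one triangle, and the off-diagonal entry the single bridge edge, rescaled by the group sizes.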
These steps can also be carried out independently: (1) Use get.adjacency, graph.laplacian or get.stochastic to compute a matrix $M$. (2) Work out some prescribed eigenpairs of $M$, e.g. by means of eigen or arpack. (3) Invoke one of the four algorithms of the function scgGrouping to get a partition that will preserve the eigenpairs in the coarse-grained matrix. (4) Compute the semi-projectors $L$ and $R$ using scgSemiProjectors and from there the coarse-grained matrix $\widetilde{M}=LMR^*$. If necessary, construct a coarse-grained graph from $\widetilde{M}$ (e.g. as in [1]).

[1] D. Gfeller and P. De Los Rios, Spectral Coarse Graining of Complex Networks, Physical Review Letters 99(3), 2007.