The Kullback-Leibler divergence measures the dissimilarity between two APFA models; it is zero if the two models define the same distribution.
Usage
KL(A,B)
Arguments
A
APFA igraph object
B
APFA igraph object
Value
Returns the KL-divergence.
Details
A and B must be commensurate, i.e., defined on the same variable set. Note that the KL-divergence is not a true distance measure, since it is not
symmetric in A and B. For large APFAs the computation of the KL-divergence may be prohibitive in both time and memory.
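The divergence is computed between the distributions over strings that the two models represent. As a rough illustration only (not using the package), the sketch below computes the same quantity for two small hypothetical discrete distributions p and q; KL(A,B) applies the analogous sum to the string distributions defined by the APFA models A and B.

## Illustration only: KL divergence between two hypothetical discrete
## distributions p and q over a small set of strings.
p <- c(aa = 0.5, ab = 0.3, bb = 0.2)
q <- c(aa = 0.4, ab = 0.4, bb = 0.2)
sum(p * log(p / q))   ## equals 0 when p and q coincide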
References
Thollard, F., Dupont, P. and de la Higuera, C. (2000). Probabilistic DFA inference using Kullback-Leibler divergence and minimality. In: Proceedings of the 17th International Conference on Machine Learning (ICML 2000), pp. 975-982.