Entropy and the Kullback-Leibler divergence for Bayesian networks: Computational complexity and efficient implementation.
An object of class bn.fit. Refer to the documentation of the bnlearn package for details.
A conditional linear Gaussian Bayesian network used to illustrate the algorithms developed in the associated paper (Figure 3, top). The parameters were obtained from a repository. The vertices are:
(a, b);
(c, d);
(e, f);
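The fitted network above can be inspected with standard bnlearn accessors. A minimal sketch, assuming the object has been loaded into the session under a name such as `clgbn` (the actual object name is not stated here, so it is a placeholder):

```r
library(bnlearn)

# 'clgbn' is a hypothetical name for the bn.fit object described above
nodes(clgbn)    # list the vertices of the network
arcs(clgbn)     # list the directed arcs
clgbn$a         # print the local distribution of node 'a'
```

For a conditional linear Gaussian network, printing an individual node shows either a conditional probability table (discrete node) or the regression coefficients and residual standard deviation of its local linear model (continuous node).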
Scutari, M. (2024). Entropy and the Kullback-Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation. Algorithms, 17(1), 24.