KL.CI
returns the Kullback-Leibler (KL) divergence between an object of class CI
and its update after a model-preserving parameter variation.
# S3 method for CI
KL(x, type, entry, delta, ...)
A data frame whose first column contains the variations performed and whose remaining columns report the corresponding KL divergences for the chosen model-preserving co-variations.
x: object of class CI.
type: character string. Type of model-preserving co-variation: either "total", "partial", "row", "column" or "all". If "all", the KL divergence is computed for every type of co-variation matrix.
entry: a vector of length 2 indicating the entry of the covariance matrix to vary.
delta: numeric vector with positive elements, giving the variation parameters, which act multiplicatively.
...: additional arguments for compatibility.
Computation of the KL divergence between a Bayesian network and its updated version after a model-preserving variation.
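As a rough sketch (not the package's own code), the divergence involved is the KL divergence between two Gaussian distributions that share the same mean and differ only in their covariance matrix, as happens under a model-preserving co-variation. The names below (kl_gaussian, Sigma0, Sigma1) are illustrative, and the direction of the divergence is chosen for exposition only.

# Illustrative sketch: KL divergence between two Gaussians with a common mean,
# where Sigma0 is the original covariance matrix and Sigma1 its varied version.
kl_gaussian <- function(Sigma0, Sigma1) {
  k <- nrow(Sigma0)
  0.5 * (sum(diag(solve(Sigma1) %*% Sigma0)) - k + log(det(Sigma1) / det(Sigma0)))
}

# Example: multiply entry (1,1) of a 2x2 identity covariance by 1.1
Sigma0 <- diag(2)
Sigma1 <- Sigma0
Sigma1[1, 1] <- 1.1 * Sigma1[1, 1]
kl_gaussian(Sigma0, Sigma1)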
C. Görgen & M. Leonelli (2020), Model-preserving sensitivity analysis for families of Gaussian distributions. Journal of Machine Learning Research, 21: 1-32.
KL.GBN, Fro.CI, Fro.GBN, Jeffreys.GBN, Jeffreys.CI
KL(synthetic_ci, "total", c(1,1), seq(0.9,1.1,0.01))
KL(synthetic_ci, "partial", c(1,4), seq(0.9,1.1,0.01))
KL(synthetic_ci, "column", c(1,2), seq(0.9,1.1,0.01))
KL(synthetic_ci, "row", c(3,2), seq(0.9,1.1,0.01))
KL(synthetic_ci, "all", c(3,2), seq(0.9,1.1,0.01))