bnmonitor (version 0.2.2)

KL_bounds: Bounds for the KL-divergence

Description

Computation of the bounds of the KL-divergence for variations of each parameter of a CI object.

Usage

KL_bounds(ci, delta)

Value

A data frame including the KL-divergence bound for each co-variation scheme (model-preserving and standard) and every entry of the covariance matrix. For variations leading to a non-positive semidefinite matrix, the data frame reports NA.

Arguments

ci

object of class CI.

delta

multiplicative variation coefficient applied to each entry of the covariance matrix.

Details

Let \(\Sigma\) be the covariance matrix of a Gaussian Bayesian network with \(n\) vertices. Let \(D\) and \(\Delta\) be variation matrices acting additively on \(\Sigma\). Let also \(\tilde\Delta\) be a model-preserving co-variation matrix. Denote by \(Y\) and \(\tilde{Y}\) the original and the perturbed random vectors. Then for a standard sensitivity analysis $$KL(\tilde{Y}||Y)\leq 0.5n\max\left\{f(\lambda_{\max}(D\Sigma^{-1})),f(\lambda_{\min}(D\Sigma^{-1}))\right\}$$ whilst for a model-preserving one $$KL(\tilde{Y}||Y)\leq 0.5n\max\left\{f(\lambda_{\max}(\tilde\Delta\circ\Delta)),f(\lambda_{\min}(\tilde\Delta\circ\Delta))\right\}$$ where \(\lambda_{\max}\) and \(\lambda_{\min}\) are the largest and smallest eigenvalues, respectively, \(f(x)=\ln(1+x)-x/(1+x)\) and \(\circ\) denotes the Schur (element-wise) product.
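
To make the standard bound concrete, the sketch below evaluates the right-hand side of the first inequality directly from the eigenvalues of \(D\Sigma^{-1}\). The matrices Sigma and D are hypothetical illustrative choices, not objects from the package.

# Minimal sketch of the standard bound for a hypothetical 2x2 example
Sigma <- matrix(c(2, 1, 1, 3), nrow = 2)       # covariance matrix of Y (illustrative)
D     <- diag(0.1, 2)                          # additive variation matrix (illustrative)
f      <- function(x) log(1 + x) - x / (1 + x)
lambda <- eigen(D %*% solve(Sigma))$values     # eigenvalues of D Sigma^{-1}
0.5 * nrow(Sigma) * max(f(max(lambda)), f(min(lambda)))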

References

C. Görgen & M. Leonelli (2020), Model-preserving sensitivity analysis for families of Gaussian distributions. Journal of Machine Learning Research, 21: 1-32.

See Also

KL.CI

Examples

library(bnmonitor)
KL_bounds(synthetic_ci, 1.05)  # 5% multiplicative variation of each covariance entry
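
The result can also be stored and screened for variations that break positive semidefiniteness; a sketch, assuming only that the return value is a data frame with NA entries for such variations:

res <- KL_bounds(synthetic_ci, 1.05)
head(res)                        # bounds for the entries of the covariance matrix
res[!complete.cases(res), ]      # variations yielding a non-positive semidefinite matrix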

