
flexmix (version 1.1-2)

KLdiv: Kullback-Leibler Divergence

Description

Estimate the Kullback-Leibler divergence of several distributions.

Usage

KLdiv(object, ...)
## S3 method for class 'matrix'
KLdiv(object, eps = 1e-4, ...)

Arguments

object
see Methods section below
eps
probabilities below this threshold are discarded for numerical stability
...
Passed to the matrix method.

Value

  • A matrix of KL divergences, where the rows correspond to using the respective distribution as $f()$ in the formula given in the Details section.

Details

Estimates $$\int f(x) (\log f(x) - \log g(x)) dx$$ for distributions with densities $f()$ and $g()$.
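For column vectors of density values on a common grid, the integral above can be approximated by a discrete sum. The sketch below is an illustration of that approximation, not the package's exact internals; the helper `kl_discrete` and its renormalisation step are assumptions for this example.

```r
## Hypothetical helper illustrating the discrete KL approximation:
## treat f and g as density values on a common grid, drop entries
## below eps (cf. the eps argument), renormalise, and sum f * log(f/g).
kl_discrete <- function(f, g, eps = 1e-4) {
  keep <- f > eps & g > eps        # discard tiny values for stability
  f <- f[keep] / sum(f[keep])      # renormalise to discrete probabilities
  g <- g[keep] / sum(g[keep])
  sum(f * (log(f) - log(g)))
}

x <- (1:100) / 100
kl_discrete(dnorm(x), dt(x, df = 10))  # small: the densities are close
kl_discrete(dnorm(x), dunif(x))        # larger: uniform is farther away
kl_discrete(dnorm(x), dnorm(x))        # 0: KL of a density with itself
```

Note that the divergence is not symmetric: swapping `f` and `g` generally changes the value, which is why the result of `KLdiv` is a full matrix rather than a triangle.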

References

S. Kullback and R. A. Leibler. On information and sufficiency. The Annals of Mathematical Statistics 22(1), pages 79-86, 1951. Friedrich Leisch. Exploring the structure of mixture model components. In Jaromir Antoch, editor, Compstat 2004 - Proceedings in Computational Statistics, pages 1405-1412. Physika Verlag, Heidelberg, Germany, 2004. ISBN 3-7908-1554-3.

Examples

x <- (1:100)/100
## Gaussian and Student t are much closer to each other than
## to the uniform:
KLdiv(cbind(u=dunif(x), n=dnorm(x), t=dt(x, df=10)))
