flexmix (version 2.3-9)

KLdiv: Kullback-Leibler Divergence

Description

Estimate the Kullback-Leibler divergence of several distributions.

Usage

## S3 method for class 'matrix'
KLdiv(object, eps = 10^-4, overlap = TRUE, ...)
## S3 method for class 'flexmix'
KLdiv(object, method = c("continuous", "discrete"), ...)

Arguments

object
For the matrix method, a matrix of density values with one row per observation and one column per distribution; for the flexmix method, a fitted "flexmix" model whose mixture components are compared.
method
The method to be used: "continuous" determines the Kullback-Leibler divergence between the unweighted theoretical component distributions, whereas "discrete" uses the unweighted posterior probabilities at the observed points.
eps
Probabilities below this threshold are replaced by this threshold for numerical stability, as sketched below.
overlap
Logical; if TRUE, the KL divergence is not determined for those pairs where, at each point, at least one of the densities has a value smaller than eps.
...
Passed to the matrix method.
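
A minimal sketch of the eps clamping used by the matrix method (an illustration of the documented behaviour, not the package's internal code; the variable names are made up for this example):

## clamp small density values before taking logs (hand-rolled sketch)
x <- seq(-3, 3, length = 200)
y <- cbind(u = dunif(x), n = dnorm(x))
eps <- 10^-4
y[y < eps] <- eps   # values below the threshold are replaced by it
range(y)            # no value below eps remains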

Value

  • A matrix of KL divergences, where the rows correspond to using the respective distribution as $f()$ in the formula given in the Details section below.

Details

Estimates $$\int f(x) (\log f(x) - \log g(x)) dx$$ for distributions with densities $f()$ and $g()$.
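
As a quick numerical sanity check of this formula (a hand-rolled Riemann sum over a grid, not the estimator KLdiv itself uses), the divergence between a standard Gaussian and a t distribution can be approximated as:

## approximate KL(f || g) with f = N(0, 1) and g = t(df = 10)
f <- dnorm
g <- function(x) dt(x, df = 10)
x <- seq(-10, 10, length = 4000)
dx <- x[2] - x[1]
sum(f(x) * (log(f(x)) - log(g(x)))) * dx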

References

S. Kullback and R. A. Leibler. On information and sufficiency. The Annals of Mathematical Statistics, 22(1), 79--86, 1951.

Friedrich Leisch. Exploring the structure of mixture model components. In Jaromir Antoch, editor, Compstat 2004 -- Proceedings in Computational Statistics, pages 1405--1412. Physica Verlag, Heidelberg, Germany, 2004. ISBN 3-7908-1554-3.

Examples

## Gaussian and Student t are much closer to each other than
## to the uniform:

x <- seq(-3, 3, length=200)
y <- cbind(u=dunif(x), n=dnorm(x), t=dt(x, df=10))

matplot(x, y, type="l")
KLdiv(y)

if (require("mlbench")) {
set.seed(2606)
x <-  mlbench.smiley()$x
model1 <- flexmix(x~1, k=9, model=FLXmclust(diag=FALSE),
                  control = list(minprior=0))
plotEll(model1, x)
KLdiv(model1)
}
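
## The "discrete" method compares the unweighted posterior probabilities
## at the observed points instead (see the 'method' argument):
if (require("mlbench")) {
  KLdiv(model1, method = "discrete")
}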
