monomvn (version 1.2)

kl.norm: KL Divergence Between Two Multivariate Normal Distributions

Description

Returns the Kullback-Leibler (KL) divergence (a.k.a. distance) between two multivariate normal (MVN) distributions, each described by its mean vector and covariance matrix.

Usage

kl.norm(mu1, S1, mu2, S2, quiet=FALSE, symm=FALSE)

Arguments

mu1
mean vector of first MVN
S1
covariance matrix of first MVN
mu2
mean vector of second MVN
S2
covariance matrix of second MVN
quiet
when FALSE (default), a warning is given if posdef.approx finds a non-positive-definite covariance matrix
symm
when TRUE, a symmetrized version of the KL divergence is computed. See the note below

Value

  • Returns a non-negative real number giving the KL divergence between the two normal distributions.

Details

The KL divergence is given by the formula: $$D_{\mbox{\tiny KL}}(N_1 \| N_2) = \frac{1}{2} \left( \log \left( \frac{|\Sigma_2|}{|\Sigma_1|} \right) + \mbox{tr} \left( \Sigma_2^{-1} \Sigma_1 \right) + \left( \mu_2 - \mu_1\right)^\top \Sigma_2^{-1} ( \mu_2 - \mu_1 ) - N \right)$$ where $N$ is length(mu1).
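The formula above can be implemented directly in a few lines of base R. The sketch below (a hypothetical helper, not the package's internal code) is useful as a sanity check: the divergence of a distribution from itself is zero, and for univariate N(0,1) vs. N(1,1) the formula reduces to 1/2.

```r
## Direct implementation of the KL formula above (a sketch for
## illustration; kl.norm itself may differ in edge-case handling).
kl_manual <- function(mu1, S1, mu2, S2) {
  N <- length(mu1)
  S2inv <- solve(S2)
  d <- mu2 - mu1
  0.5 * (log(det(S2) / det(S1)) +
         sum(diag(S2inv %*% S1)) +          ## tr(S2^{-1} S1)
         drop(t(d) %*% S2inv %*% d) - N)
}

kl_manual(c(0, 1), diag(2), c(0, 1), diag(2))   ## identical MVNs: 0
kl_manual(0, matrix(1), 1, matrix(1))           ## N(0,1) vs N(1,1): 0.5
```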

As a preprocessing step, S1 and S2 are checked for positive definiteness via posdef.approx. If either is not positive definite, and the accuracy package is installed, it can be coerced to the nearest positive-definite matrix.
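A simple eigenvalue test illustrates the kind of check described above (this is a minimal sketch, not the implementation of posdef.approx): a symmetric matrix is positive definite exactly when all of its eigenvalues are strictly positive.

```r
## Sketch of a positive-definiteness check via eigenvalues
## (hypothetical helper; posdef.approx in monomvn may differ).
is_posdef <- function(S, tol = 1e-8) {
  ev <- eigen(S, symmetric = TRUE, only.values = TRUE)$values
  all(ev > tol)
}

is_posdef(diag(3))          ## TRUE: identity is positive definite
is_posdef(matrix(1, 2, 2))  ## FALSE: rank 1, has a zero eigenvalue
```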

References

http://www.statslab.cam.ac.uk/~bobby/monomvn.html

See Also

posdef.approx

Examples

## draw a random mean vector and a random positive-definite
## covariance matrix (crossproduct of a 20x5 Gaussian matrix)
mu1 <- rnorm(5)
s1 <- matrix(rnorm(100), ncol=5)
S1 <- t(s1) %*% s1

mu2 <- rnorm(5)
s2 <- matrix(rnorm(100), ncol=5)
S2 <- t(s2) %*% s2

## the KL divergence is not symmetric in its arguments
kl.norm(mu1,S1,mu2,S2)
kl.norm(mu2,S2,mu1,S1)

## symmetrized version: average of the two directed divergences
0.5 *(kl.norm(mu1,S1,mu2,S2) + kl.norm(mu2,S2,mu1,S1))
