
spatialEco (version 0.1-5)

kl.divergence: Kullback-Leibler divergence (relative entropy)

Description

Calculates the Kullback-Leibler divergence (relative entropy) between unweighted theoretical component distributions. Divergence is calculated as ∫ f(x) (log f(x) - log g(x)) dx for distributions with densities f() and g().
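
For intuition, the following is a minimal sketch of the discrete form of this integral for two density vectors evaluated on a common grid. It is an illustration only, not the spatialEco implementation; kl_sketch is a hypothetical helper name.

kl_sketch <- function(f, g, eps = 10^-4) {
  f <- pmax(f / sum(f), eps)   # normalize to probabilities, floor at eps
  g <- pmax(g / sum(g), eps)
  sum(f * (log(f) - log(g)))   # the sum approximates the integral
}
kl_sketch(dnorm(seq(-3, 3, length = 200)),
          dt(seq(-3, 3, length = 200), df = 10))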

Usage

kl.divergence(object, eps = 10^-4, overlap = TRUE)

Arguments

object
A matrix or data frame with >= 2 columns
eps
Probabilities below this threshold are replaced by this threshold for numerical stability.
overlap
Logical. If TRUE, the KL divergence is not computed for pairs in which, at every point, at least one of the two densities has a value smaller than eps (illustrated in the sketch after this list).
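
A hypothetical illustration (not the package source) of how the eps floor and the overlap screen behave for one pair of density vectors f and g:

f <- c(1e-6, 0.2, 0.8); g <- c(0.3, 2e-6, 0.7); eps <- 10^-4
skip <- all(f < eps | g < eps)   # at every point at least one density < eps
f <- pmax(f, eps)                # probabilities below eps replaced by eps
g <- pmax(g, eps)
if (!skip) sum(f * (log(f) - log(g)))   # divergence only for overlapping pairs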

Value

A matrix of pairwise Kullback-Leibler divergence indices

References

Kullback, S., and R. A. Leibler (1951) On information and sufficiency. The Annals of Mathematical Statistics 22(1):79-86.

Examples

library(spatialEco)

x <- seq(-3, 3, length = 200)
y <- cbind(n = dnorm(x), t = dt(x, df = 10))
matplot(x, y, type = 'l')
kl.divergence(y)

# extract the divergence for the (1, 2) pair
# (third element of the 2 x 2 result, column-major)
kl.divergence(y[, 1:2])[3]
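
# KL divergence is asymmetric, so the two off-diagonal entries of the
# returned matrix generally differ (assuming entry [i, j] holds the
# divergence of column i relative to column j)
d <- kl.divergence(y)
d[1, 2]
d[2, 1]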
