kl.divergence
From spatialEco v1.3-2
by Jeffrey S Evans
Kullback-Leibler divergence (relative entropy)
Calculates the Kullback-Leibler divergence (relative entropy) between unweighted theoretical component distributions. The divergence is calculated as KL(f || g) = ∫ f(x) (log f(x) - log g(x)) dx for distributions with densities f() and g().
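A discrete approximation of this integral makes the computation concrete. The sketch below is not the package's implementation; the helper name kl_sketch and its grid-spacing argument are assumptions for illustration only.

# Illustrative sketch (not the spatialEco implementation): a discrete
# approximation of KL(f || g) for densities f and g sampled on a common
# grid with spacing dx. The eps thresholding mirrors the function's
# handling of small probabilities.
kl_sketch <- function(f, g, dx, eps = 10^-4) {
  f <- pmax(f, eps)                 # replace values below eps for stability
  g <- pmax(g, eps)
  sum(f * (log(f) - log(g)) * dx)   # f(x) (log f(x) - log g(x)) dx, summed
}

x <- seq(-3, 3, length = 200)
kl_sketch(dnorm(x), dt(x, df = 10), dx = diff(x)[1])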
Usage
kl.divergence(object, eps = 10^-4, overlap = TRUE)
Arguments
- object
A matrix or data.frame with two or more columns, each giving a distribution (e.g., density values evaluated on a common grid)
- eps
Probabilities below this threshold are replaced by this threshold for numerical stability.
- overlap
Logical; if TRUE, the KL divergence is not computed for pairs in which, at every point, at least one of the two densities falls below eps (i.e., the distributions do not overlap); see the sketch following this list.
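As a hedged illustration of how eps and overlap can be set (the input densities and column names below are invented for the example):

# Hypothetical call showing the eps and overlap arguments; the input
# densities are made up for illustration.
x <- seq(-3, 3, length = 200)
d <- cbind(a = dnorm(x), b = dunif(x, -3, 3))
kl.divergence(d, eps = 10^-6, overlap = FALSE)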
Value
A matrix of pairwise Kullback-Leibler divergence values
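Individual pairwise values can then be pulled out by row and column index; a minimal sketch, assuming three input densities with invented column names:

# Hypothetical indexing of the pairwise result matrix.
x <- seq(-3, 3, length = 200)
m <- kl.divergence(cbind(a = dnorm(x), b = dt(x, df = 10), c = dcauchy(x)))
m[1, 2]   # divergence for the pair formed by the first and second columns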
References
Kullback S., and R. A. Leibler (1951) On information and sufficiency. The Annals of Mathematical Statistics 22(1):79-86
Examples
x <- seq(-3, 3, length=200)
y <- cbind(n=dnorm(x), t=dt(x, df=10))
matplot(x, y, type='l')
kl.divergence(y)
# extract the pairwise divergence between the two columns
kl.divergence(y[,1:2])[1, 2]