# KL.dist: Kullback-Leibler Divergence

## Description

Compute the Kullback-Leibler symmetric distance between two samples.

## Usage

KL.dist(X, Y, k = 10, algorithm = c("kd_tree", "cover_tree", "brute"))
KLx.dist(X, Y, k = 10, algorithm = "kd_tree")

## Arguments

X

An input data matrix.

Y

An input data matrix.

k

The maximum number of nearest neighbors to search. The default value is 10.

algorithm

Nearest neighbor search algorithm.

## Value

Returns the Kullback-Leibler distance between `X` and `Y`.

## Details

The Kullback-Leibler distance is the sum of the divergence of `q(x)` from `p(x)` and the divergence of `p(x)` from `q(x)`.

`KL.*` versions return distances from `C` code to `R`, but `KLx.*` versions do not.
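For densities `p(x)` and `q(x)`, the two divergences being summed are the standard Kullback-Leibler divergences; a sketch of the definition in conventional notation (not taken verbatim from this package's documentation):

```latex
\mathrm{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,
\qquad
d_{\mathrm{KL}}(p, q) = \mathrm{KL}(p \,\|\, q) + \mathrm{KL}(q \,\|\, p).
```

For two exponential densities with rates \(\lambda_1\) and \(\lambda_2\), the log terms cancel in the symmetric sum, giving the closed form \(\lambda_1/\lambda_2 + \lambda_2/\lambda_1 - 2 = (\lambda_1 - \lambda_2)^2/(\lambda_1 \lambda_2)\); with \(\lambda_1 = 0.2\) and \(\lambda_2 = 0.4\) this is 0.5, the theoretical value quoted in the Examples section.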

## References

S. Boltz, E. Debreuve and M. Barlaud (2007).
“kNN-based high-dimensional Kullback-Leibler distance for tracking”.
*Image Analysis for Multimedia Interactive Services, 2007. WIAMIS '07. Eighth International Workshop on*.

S. Boltz, E. Debreuve and M. Barlaud (2009).
“High-dimensional statistical measure for region-of-interest tracking”.
*Trans. Img. Proc.*, **18**:6, 1266--1283.

## Examples

# NOT RUN {
set.seed(1000)
X <- rexp(10000, rate = 0.2)
Y <- rexp(10000, rate = 0.4)
KL.dist(X, Y, k = 5)
KLx.dist(X, Y, k = 5)
# theoretical distance = (0.2 - 0.4)^2 / (0.2 * 0.4) = 0.5
# }