FNN (version 1.1.4)

KL.dist: Kullback-Leibler Divergence

Description

Compute the symmetric Kullback-Leibler distance between two data sets.

Usage

KL.dist(X, Y, k = 10, algorithm = c("kd_tree", "cover_tree", "brute"))
KLx.dist(X, Y, k = 10, algorithm = "kd_tree")

Value

Returns the Kullback-Leibler distance between X and Y.

Arguments

X

An input data matrix, sampled from the distribution p(x).

Y

An input data matrix, sampled from the distribution q(x).

k

The maximum number of nearest neighbors to search. The default value is set to 10.

algorithm

The nearest neighbor search algorithm to use: one of "kd_tree", "cover_tree", or "brute".

Author

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com

Details

The Kullback-Leibler distance is the symmetrized divergence: the sum of the divergence of q(x) from p(x) and the divergence of p(x) from q(x).

The KL.* versions return the distances from the C code to R, while the KLx.* versions do not.
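Since the symmetric distance is defined as the sum of the two directed divergences, it can be checked against KL.divergence (see See Also). A minimal sketch, assuming the FNN package is installed; the sample sizes and normal densities below are illustrative choices:

```r
library(FNN)

set.seed(42)
# Samples from two univariate densities p(x) and q(x)
p <- rnorm(5000)            # p(x): N(0, 1)
q <- rnorm(5000, mean = 1)  # q(x): N(1, 1)

d.sym <- KL.dist(p, q, k = 5)        # symmetric distance
d.pq  <- KL.divergence(p, q, k = 5)  # divergence of q(x) from p(x)
d.qp  <- KL.divergence(q, p, k = 5)  # divergence of p(x) from q(x)

# The symmetric distance should agree with the sum of the two divergences
d.sym
d.pq + d.qp
```

For these two unit-variance normals the true divergence is 0.5 in each direction, so the estimates should be in the vicinity of 1.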

References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. Image Analysis for Multimedia Interactive Services, 2007. WIAMIS '07. Eighth International Workshop on.

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. Trans. Img. Proc., 18:6, 1266--1283.

See Also

KL.divergence.

Examples

    set.seed(1000)
    X <- rexp(10000, rate = 0.2)
    Y <- rexp(10000, rate = 0.4)

    KL.dist(X, Y, k = 5)
    KLx.dist(X, Y, k = 5)
    # theoretical distance = (0.2-0.4)^2/(0.2*0.4) = 0.5
    
