Returns the Z estimator of Kullback-Leibler Divergence, which has exponentially decaying bias. See Zhang and Grabchak (2014b) for details.
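For reference, the target quantity is the Kullback-Leibler divergence between two discrete distributions p and q on a common alphabet,

$$D(p \,\|\, q) = \sum_{k} p_k \log \frac{p_k}{q_k},$$

where p_k and q_k are the letter probabilities of the first and second distributions.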
KL.z(x, y)
x: Vector of counts from the first distribution. Must be integer valued. Each entry represents the number of observations of a distinct letter.
y: Vector of counts from the second distribution. Must be integer valued. Each entry represents the number of observations of a distinct letter.
Lijuan Cao and Michael Grabchak
Z. Zhang and M. Grabchak (2014b). Nonparametric Estimation of Kullback-Leibler Divergence. Neural Computation, 26(11): 2570-2593.
x = c(1, 3, 7, 4, 8)
y = c(2, 5, 1, 3, 6)
KL.z(x, y)
# KL divergence is not symmetric, so reversing the arguments gives a different value
KL.z(y, x)
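For comparison, the following is a minimal sketch of the naive plug-in estimator of the same divergence (not part of this package; the helper name KL.plugin is illustrative only). It assumes x and y are aligned counts over the same alphabet with no zero entries; KL.z estimates the same target but with exponentially decaying bias.

# Naive plug-in estimate of KL divergence, shown only for comparison with KL.z.
# Assumes x and y are aligned counts over the same alphabet with no zero entries.
KL.plugin <- function(x, y) {
  p <- x / sum(x)       # empirical probabilities for the first sample
  q <- y / sum(y)       # empirical probabilities for the second sample
  sum(p * log(p / q))   # sum over letters of p_k * log(p_k / q_k)
}
KL.plugin(c(1, 3, 7, 4, 8), c(2, 5, 1, 3, 6))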