Computes the Kullback-Leibler divergence based on kernel density estimates of two samples.

Usage

KL_div(x, y, from_a, to_b)
Value

A numeric value: the Kullback-Leibler divergence.

Arguments

x: numeric, the values from a sample p.

y: numeric, the values from a sample q.

from_a: numeric, the lower limit of the integration.

to_b: numeric, the upper limit of the integration.
Details

The Kullback-Leibler divergence is defined as $$D_{KL}(P||Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} dx$$
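The KDE-based computation can be sketched in base R. The helper below, `kl_div_kde`, is a hypothetical illustration (not the package's actual implementation) that estimates both densities with `density()`, interpolates them with `approxfun()`, and numerically integrates the integrand above with `integrate()`:

```r
# Sketch of a KDE-based KL divergence; illustrative only, and it assumes
# finite integration limits (the interpolated densities have constant,
# nonzero tails under rule = 2, so infinite limits would not converge here).
kl_div_kde <- function(x, y, from_a, to_b) {
  dp <- density(x)  # kernel density estimate of sample p
  dq <- density(y)  # kernel density estimate of sample q
  fp <- approxfun(dp$x, dp$y, rule = 2)  # p(x) as a function
  fq <- approxfun(dq$x, dq$y, rule = 2)  # q(x) as a function
  integrand <- function(t) {
    px <- fp(t)
    qx <- fq(t)
    # Guard against zero densities, where the integrand is taken as 0
    ifelse(px > 0 & qx > 0, px * log(px / qx), 0)
  }
  integrate(integrand, lower = from_a, upper = to_b)$value
}

set.seed(123)
p <- rnorm(100)
q <- rnorm(100)
kl_div_kde(p, q, min(c(p, q)), max(c(p, q)))
```

The result depends on the bandwidth chosen by `density()` and on the integration limits, so it will approximate but not exactly reproduce the values returned by `KL_div`.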
Examples

set.seed(123)
p <- rnorm(100)
q <- rnorm(100)
KL_div(p, q, -Inf, Inf)
# 0.07579204
q <- rnorm(100, 10, 4)
KL_div(p, q, -Inf, Inf)
# 7.769912