YEAB (version 1.0.6)

KL_div: Computes the Kullback-Leibler divergence based on kernel density estimates

Description

Computes the Kullback-Leibler divergence based on kernel density estimates of two samples.

Usage

KL_div(x, y, from_a, to_b)

Value

A numeric value: the estimated Kullback-Leibler divergence between the densities of the two samples.

Arguments

x

numeric, the values of a sample from the distribution P

y

numeric, the values of a sample from the distribution Q

from_a

numeric, the lower limit of the integration

to_b

numeric, the upper limit of the integration

Details

The Kullback-Leibler divergence between two distributions P and Q is defined as $$D_{KL}(P||Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx$$ where p and q are the density functions of P and Q. Here, p and q are approximated by kernel density estimates of the samples x and y, and the integral is evaluated over [from_a, to_b].
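
The package's source is not reproduced here, but one common way to realize this definition is to estimate each density with stats::density(), interpolate the estimates, and integrate numerically. The sketch below follows that recipe; kl_div_sketch, the grid size, the clamping of infinite limits, and the epsilon guard are illustrative assumptions, not YEAB's actual implementation.

# A minimal sketch of the KDE-plus-numerical-integration idea
# (illustrative only; not the package's actual code)
kl_div_sketch <- function(x, y, from_a, to_b) {
  # clamp infinite limits to the range actually covered by the samples
  lo <- max(from_a, min(c(x, y)))
  hi <- min(to_b, max(c(x, y)))
  # kernel density estimates of the two samples on a common grid
  dp <- density(x, from = lo, to = hi, n = 1024)
  dq <- density(y, from = lo, to = hi, n = 1024)
  p <- approxfun(dp$x, dp$y, rule = 2)
  q <- approxfun(dq$x, dq$y, rule = 2)
  integrand <- function(t) {
    pt <- pmax(p(t), .Machine$double.eps)  # guard against log(0)
    qt <- pmax(q(t), .Machine$double.eps)  # guard against division by zero
    pt * log(pt / qt)
  }
  integrate(integrand, lo, hi)$value
}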

Examples

library(YEAB)  # provides KL_div
set.seed(123)
# Two samples from the same standard normal: divergence is near zero
p <- rnorm(100)
q <- rnorm(100)
KL_div(p, q, -Inf, Inf) # 0.07579204
# A sample from a distant, wider normal: divergence is large
q <- rnorm(100, 10, 4)
KL_div(p, q, -Inf, Inf) # 7.769912
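
Because from_a and to_b set the integration limits, a finite window can also be supplied. The call below is an illustrative extension of the example above, not output reported by the package:

# Integrate over a finite window only (result will differ from the
# full-range values above)
KL_div(p, q, -5, 5)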