
kldest (version 1.0.0)

kld_est_kde1: 1-D kernel density-based estimation of Kullback-Leibler divergence

Description

This estimation method approximates the densities of the unknown distributions \(P\) and \(Q\) by kernel density estimates, using the function 'density' from package 'stats'. Only the two-sample problem, not the one-sample problem, is implemented.
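
To make the description concrete, here is a minimal sketch of such an estimator (illustrative only, not the code used by kld_est_kde1; the grid size and the handling of zero density values are assumptions):

# Sketch: estimate both densities with stats::density on a common grid,
# then integrate p(x) * log(p(x)/q(x)) with the trapezoidal rule.
kld_kde1_sketch <- function(X, Y, ...) {
  lo <- min(X, Y); hi <- max(X, Y)
  dX <- density(X, from = lo, to = hi, n = 512, ...)
  dY <- density(Y, from = lo, to = hi, n = 512, ...)
  p <- dX$y
  q <- dY$y
  keep <- p > 0 & q > 0                      # avoid log(0) and 0 * log(0)
  f <- p[keep] * log(p[keep] / q[keep])
  x <- dX$x[keep]
  sum(diff(x) * (head(f, -1) + tail(f, -1)) / 2)  # trapezoidal rule
}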

Usage

kld_est_kde1(X, Y, MC = FALSE, ...)

Value

A scalar, the estimated Kullback-Leibler divergence \(\hat D_{KL}(P||Q)\).

Arguments

X, Y

Numeric vectors or single-column matrices, representing samples from the true distribution \(P\) and the approximate distribution \(Q\), respectively.

MC

A boolean: should a Monte Carlo approximation be used instead of numerical integration via the trapezoidal rule (default: FALSE)? See the sketch after this argument list.

...

Further parameters passed on to stats::density (e.g., argument bw).
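
For illustration, a minimal sketch of the Monte Carlo idea behind MC = TRUE (an assumed reading, not the package's exact code): average the log density ratio of the two kernel density estimates over the sample points of X.

# Sketch of the Monte Carlo variant (assumption about the approach):
kld_kde1_mc_sketch <- function(X, Y, ...) {
  dX <- density(X, ...)
  dY <- density(Y, ...)
  pX <- approx(dX$x, dX$y, xout = X)$y   # estimated p evaluated at X
  qX <- approx(dY$x, dY$y, xout = X)$y   # estimated q evaluated at X
  mean(log(pX / qX), na.rm = TRUE)
}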

Examples

# KL divergence between two samples from 1-D Gaussians:
set.seed(0)
X <- rnorm(100)
Y <- rnorm(100, mean = 1, sd = 2)
kld_gaussian(mu1 = 0, sigma1 = 1, mu2 = 1, sigma2 = 2^2)  # analytical value for comparison
kld_est_kde1(X, Y)                                        # numerical integration (trapezoidal rule)
kld_est_kde1(X, Y, MC = TRUE)                             # Monte Carlo approximation
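
Further arguments are passed through to stats::density; for example, a different bandwidth selector can be requested (bw = "SJ" is just one of the selectors stats::density accepts):

kld_est_kde1(X, Y, bw = "SJ")  # Sheather-Jones bandwidth passed via '...'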
