ShapePattern (version 3.1.0)

KLPQ: Computes and returns the Kullback-Leibler divergence between two probability distributions

Description

Given two probability distributions (vectors) of the same length, this function computes the Kullback-Leibler divergence (relative entropy) between them. Because the divergence involves logarithms of ratios of the entries, zero entries in either distribution are problematic; the argument JITTER can be used to add a tiny offset in such cases.
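The divergence is computed as D(P||Q) = sum over i of P[i] * log(P[i] / Q[i]). A minimal sketch of that computation in base R follows; it is illustrative only, not the package source, and the function name klpq_sketch is hypothetical. It assumes the natural logarithm and the usual convention that terms with P[i] = 0 contribute nothing to the sum.

```r
# Illustrative sketch of a KL divergence computation (hypothetical helper,
# not the ShapePattern source). Assumes natural log; terms where P[i] = 0
# are treated as contributing 0 to the sum.
klpq_sketch <- function(P, Q, JITTER = 0) {
  P <- P + JITTER  # optional tiny offset to guard against zero entries
  Q <- Q + JITTER
  sum(ifelse(P > 0, P * log(P / Q), 0))
}

klpq_sketch(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333))  # positive
klpq_sketch(P = c(0.5, 0.5), Q = c(0.5, 0.5))                     # identical: 0
```

Note that the divergence is not symmetric: klpq_sketch(P, Q) and klpq_sketch(Q, P) generally differ.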

Usage

KLPQ(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333), JITTER = 0.0)

Value

The function returns a numeric value that is the Kullback-Leibler divergence. If this value is 0 (zero), then the two distributions are identical.

Arguments

P

A first probability distribution vector.

Q

A second probability distribution vector.

JITTER

An optional tiny value to be added to the probabilities to avoid zero entries (e.g., 0.000000001).

Author

Tarmo K. Remmel

References

None currently.

See Also

patternbits

Examples

KLPQ(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333), JITTER = 0.0)