mlf (version 1.2.1)

kld: Kullback-Leibler Divergence

Description

Estimates the Kullback-Leibler divergence between two probability distributions, i.e. the difference between the cross-entropy of the two distributions and the entropy of the first.
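For two discrete distributions p and q over the same outcomes, this difference works out to D(p || q) = sum_i p_i * log(p_i / q_i). A minimal sketch of that computation in plain R, independent of mlf (the helper name `kl_manual` is hypothetical, and p and q are assumed to be probability vectors with q > 0 wherever p > 0):

```r
# Hand-rolled KL divergence for two discrete probability vectors.
kl_manual <- function(p, q) {
  keep <- p > 0  # terms with p_i = 0 contribute 0 by convention
  sum(p[keep] * log(p[keep] / q[keep]))
}

p <- c(0.5, 0.5)
q <- c(0.75, 0.25)
kl_manual(p, q)  # cross-entropy of (p, q) minus entropy of p
```

Note that the divergence is zero only when the two distributions coincide, and it is not symmetric: kl_manual(p, q) and kl_manual(q, p) generally differ.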

Usage

kld(x, y, bins)

Arguments

x, y

numeric or discrete data vectors

bins

number of bins used to discretize numeric input; not needed for discrete (factor) input, as in the second example below

Examples

# Sample numeric vectors
set.seed(1)  # for reproducible draws
a <- rnorm(25, 80, 35)
b <- rnorm(25, 90, 35)
mlf::kld(a, b, bins = 2)

# Sample discrete vectors
a <- as.factor(c(1,1,2,2))
b <- as.factor(c(1,1,1,2))
mlf::kld(a, b)
