philentropy (version 0.2.0)

JE: Shannon's Joint-Entropy H(X,Y)

Description

This function computes Shannon's Joint-Entropy H(X,Y) = - ∑ P(x,y) * log2(P(x,y)), summed over all pairs (x,y), based on a given joint-probability vector P(X,Y).
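For intuition, the same quantity can be reproduced in a few lines of base R. The following is a minimal sketch of the formula above, not necessarily the package's internal implementation; the helper name joint_entropy is made up for illustration:

joint_entropy <- function(p) {
  # p: a joint-probability vector P(X,Y); must be non-negative and sum to 1
  stopifnot(all(p >= 0), abs(sum(p) - 1) < 1e-8)
  p <- p[p > 0]          # by convention, 0 * log2(0) contributes 0
  -sum(p * log2(p))      # H(X,Y) = - sum P(x,y) * log2(P(x,y))
}

joint_entropy(1:100/sum(1:100))   # compare with JE(1:100/sum(1:100))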

Usage

JE(x, unit = "log2")

Arguments

x

a numeric joint-probability vector P(X,Y) for which Shannon's Joint-Entropy H(X,Y) shall be computed.

unit

a character string specifying the logarithm unit that shall be used to compute the joint entropy. Options are unit = "log" (natural logarithm), unit = "log2" (default), and unit = "log10".
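As a brief illustration of the unit argument (assuming the option names listed above), switching the unit changes only the base of the logarithm and hence the scale of the result:

P_xy <- 1:100/sum(1:100)
JE(P_xy, unit = "log2")   # joint entropy in bit
JE(P_xy, unit = "log")    # joint entropy in nat (natural logarithm)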

Value

a numeric value representing Shannon's Joint-Entropy; with the default unit = "log2" the value is given in bit.

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See Also

H, CE, KL, JSD, gJSD, distance

Examples

# Shannon's Joint-Entropy of an example joint-probability vector
JE(1:100/sum(1:100))
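Since the joint distribution is passed as a single flattened vector, the result coincides numerically with the plain Shannon entropy H of that vector (H is part of the same package; see See Also):

P_xy <- 1:100/sum(1:100)
JE(P_xy)   # joint entropy of P(X,Y)
H(P_xy)    # same value: H applied to the flattened joint vector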
