Compute Shannon's Conditional-Entropy via the chain rule \(H(X | Y)
= H(X,Y) - H(Y)\) for a given joint-probability vector \(P(X,Y)\) and
probability vector \(P(Y)\).
Usage
CE(xy, y, unit = "log2")
Value
Shannon's Conditional-Entropy, in bits when unit = "log2".
Arguments
xy
a numeric joint-probability vector \(P(X,Y)\)
for which Shannon's Joint-Entropy \(H(X,Y)\) shall be computed.
y
a numeric probability vector \(P(Y)\) for which
Shannon's Entropy \(H(Y)\) (as part of the chain rule) shall be computed.
Note that this vector must be the probability distribution of the random
variable Y itself, i.e. the marginal \(P(Y)\) from which \(H(Y)\) is computed.
unit
a character string specifying the logarithm unit that shall be used to compute the entropy values entering the chain rule.
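As a sanity check on the two arguments, the marginal \(P(Y)\) can be recovered from the joint vector by summing over the values of X. A small sketch, assuming a 2 x 2 joint distribution flattened column-wise (one column per value of Y); the layout is illustrative, not prescribed by this function:

```r
# Hypothetical layout: xy is a 2 x 2 joint table P(X,Y) flattened
# column-wise, so each column of the matrix corresponds to one value of Y.
xy <- c(0.1, 0.3, 0.2, 0.4)      # joint distribution P(X,Y)
joint <- matrix(xy, nrow = 2)    # rows index X, columns index Y
y <- colSums(joint)              # marginal P(Y): sum over X for each Y
y                                # 0.4 0.6
```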
Author
Hajk-Georg Drost
Details
This function might be useful to quickly compute Shannon's
Conditional-Entropy for any given joint-probability vector and probability
vector.
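The chain rule underlying this function can be sketched directly in base R; the helper names below are illustrative and not part of this package, and base-2 logarithms (unit = "log2") are assumed:

```r
# Shannon entropy of a probability vector; 0 * log(0) is treated as 0.
shannon_entropy <- function(p, base = 2) {
  p <- p[p > 0]
  -sum(p * log(p, base = base))
}

# Chain rule: H(X|Y) = H(X,Y) - H(Y).
conditional_entropy <- function(xy, y, base = 2) {
  shannon_entropy(xy, base) - shannon_entropy(y, base)
}

# Example: X and Y are independent fair coins, so H(X,Y) = 2 bits,
# H(Y) = 1 bit, and hence H(X|Y) = H(X) = 1 bit.
xy <- rep(0.25, 4)   # joint distribution P(X,Y)
y  <- c(0.5, 0.5)    # marginal distribution P(Y)
conditional_entropy(xy, y)   # 1
```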
References
Shannon, Claude E. 1948. "A Mathematical Theory of
Communication". Bell System Technical Journal 27 (3): 379-423.