
RelativeEntropy

RelativeEntropy computes the relative entropy (Kullback-Leibler divergence) between two probability distributions.

Usage

RelativeEntropy(p, q, group.index = NULL)
Value

A non-negative number (possibly +Inf) if group.index is not given. A numeric vector if group.index is given.

Details

If there is an index i where q[i] == 0 but p[i] > 0, then the relative entropy is Inf. Mathematically, this happens when p is not absolutely continuous with respect to q.
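The behavior described above can be sketched with a small helper. Note that rel_entropy below is a hypothetical stand-in written for illustration, not the package's implementation; it follows the conventions 0 * log(0/q) = 0 and Inf when q[i] == 0 but p[i] > 0.

```r
# Hypothetical sketch of the relative entropy sum(p * log(p / q)).
rel_entropy <- function(p, q) {
  # p is not absolutely continuous w.r.t. q: relative entropy is Inf
  if (any(q == 0 & p > 0)) return(Inf)
  idx <- p > 0  # terms with p[i] == 0 contribute 0 by convention
  sum(p[idx] * log(p[idx] / q[idx]))
}

rel_entropy(c(0.3, 0.3, 0.4), c(0.5, 0.3, 0.2))  # finite, positive
rel_entropy(c(0.5, 0.5), c(1, 0))                # Inf: q[2] == 0 but p[2] > 0
```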
If group.index is provided, the relative entropy will be decomposed using the chain rule stated in Lemma 3.1(i) of Pal and Wong (2013); see equation (23) there. In this case the output has 1 + 1 + m components, where m is the number of groups defined by group.index. The first component is the left-hand side of (23). The second component is the first term on the right-hand side of (23). The remaining m components are the terms in the sum on the right-hand side of (23).

See Also

ShannonEntropy
Examples

p <- c(0.3, 0.3, 0.4)
q <- c(0.5, 0.3, 0.2)
RelativeEntropy(p, q)
RelativeEntropy(q, p) # relative entropy is not symmetric