tfd_kl_divergence: Computes the Kullback-Leibler divergence.
Description
Denote this distribution by p and the other distribution by q.
Assuming p, q are absolutely continuous with respect to reference measure r,
the KL divergence is defined as:
KL[p, q] = E_p[log(p(X)/q(X))]
         = -int_F p(x) log q(x) dr(x) + int_F p(x) log p(x) dr(x)
         = H[p, q] - H[p]
where F denotes the support of the random variable X ~ p, H[., .]
denotes (Shannon) cross entropy, and H[.] denotes (Shannon) entropy.
Usage
tfd_kl_divergence(distribution, other, name = "kl_divergence")
Value
A Tensor of dtype self$dtype with shape [B1, ..., Bn], representing n different
calculations of the Kullback-Leibler divergence.
Arguments
distribution
The distribution p, on the left-hand side of the divergence.
other
A tfp$distributions$Distribution instance; the distribution q, on the right-hand side of the divergence.
name
String prepended to names of ops created by this function.
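Examples
A minimal sketch of computing KL[p, q] between two univariate normal distributions. It assumes the tfprobability package and a TensorFlow backend are installed; the specific locations and scales are illustrative choices, not part of the API.

```r
library(tfprobability)

# Two univariate normals: p = N(0, 1), q = N(1, 2)
p <- tfd_normal(loc = 0, scale = 1)
q <- tfd_normal(loc = 1, scale = 2)

# KL[p, q]; a scalar Tensor since both distributions have empty batch shape
kl <- p %>% tfd_kl_divergence(q)

# For normals this should agree with the closed form
# log(s_q / s_p) + (s_p^2 + (mu_p - mu_q)^2) / (2 * s_q^2) - 1/2
analytic_kl <- log(2 / 1) + (1^2 + (0 - 1)^2) / (2 * 2^2) - 1 / 2
```

Note that KL is not symmetric: tfd_kl_divergence(p, q) and tfd_kl_divergence(q, p) generally differ.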