Calculates the Kullback-Leibler divergence (relative entropy) between
unweighted theoretical component distributions. For two distributions
with densities f and g, the divergence is
KL(f, g) = int f(x) (log f(x) - log g(x)) dx.
x <- seq(-3, 3, length = 200)
y <- cbind(n = dnorm(x), t = dt(x, df = 10))
matplot(x, y, type = "l")
kl.divergence(y)
# extract value for last column
kl.divergence(y[, 1:2])[3]
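
For intuition, the integral above can also be approximated directly by
numerical quadrature in base R. The following is a minimal sketch,
assuming a fixed grid and trapezoidal integration (not the package's
own implementation), computing KL(f, g) for the same normal and t(10)
densities:

x <- seq(-10, 10, length = 10001)    # integration grid (assumed bounds)
f <- dnorm(x)                        # density f: standard normal
g <- dt(x, df = 10)                  # density g: Student's t, 10 df
integrand <- f * (log(f) - log(g))   # f(x) (log f(x) - log g(x))
# trapezoidal rule: sum of panel areas over the grid
kl <- sum(diff(x) * (integrand[-1] + integrand[-length(integrand)]) / 2)
kl                                   # small and non-negative, as expected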