Computes the information entropy H = -sum(p * log_b(p)), also known as Shannon entropy, of a probability vector p.
entropy(p, b = exp(1), normalize = TRUE)
vector of probabilities; typically normalized so that sum(p) = 1.
base of the logarithm (default is exp(1), i.e. the natural logarithm).
logical flag. If TRUE (default), the vector p is rescaled to sum to 1 before the entropy is computed.
Returns the information entropy in units that depend on b: bits if b = 2, nats if b = exp(1), and dits (hartleys) if b = 10.
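A minimal sketch of the computation described above, mirroring the usage line. This is an illustration of the formula, not the package's actual implementation; the handling of zero probabilities (taking 0 * log(0) as 0) is an assumption.

```r
# Sketch of entropy(p, b, normalize); assumes 0 * log(0) is treated as 0.
entropy <- function(p, b = exp(1), normalize = TRUE) {
  if (normalize) p <- p / sum(p)   # rescale so that sum(p) = 1
  p <- p[p > 0]                    # drop zeros: 0 * log(0) contributes 0
  -sum(p * log(p, base = b))      # H = -sum(p * log_b(p))
}

entropy(c(0.5, 0.5), b = 2)       # fair coin: 1 bit
```

For example, a uniform distribution over four outcomes gives entropy(rep(0.25, 4), b = 2) = 2 bits, the maximum for four outcomes.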