entropy (version 1.1.3)

mi.plugin: Plug-In Mutual Information Estimator

Description

mi.plugin computes the mutual information of two discrete random variables from the specified matrix of joint bin frequencies.

Usage

mi.plugin(freqs2d, unit=c("log", "log2", "log10"))

Arguments

freqs2d
matrix of joint bin frequencies for the two variables (rows index the first variable, columns the second).
unit
the unit in which entropy is measured: "log" for nats (natural logarithm), "log2" for bits, "log10" for dits.
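The unit only rescales the result: entropies (and hence mutual information) in different log bases differ by a constant factor. A base-R sketch of this relationship (illustrative only, not the package's code):

```r
# entropy of a toy distribution in nats vs. bits
p <- c(0.5, 0.25, 0.25)
h.nats <- -sum(p * log(p))    # unit = "log"
h.bits <- -sum(p * log2(p))   # unit = "log2"

# the two differ exactly by a factor of log(2)
all.equal(h.bits, h.nats / log(2))
```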

Value

  • mi.plugin returns the mutual information.

Details

The mutual information of two discrete random variables $X$ and $Y$ is defined as $MI = H(X) + H(Y) - H(X, Y)$, where $H(X)$ and $H(Y)$ are the marginal entropies and $H(X, Y)$ is the joint entropy. The plug-in estimator applies this identity to the empirical bin frequencies, normalized to sum to one.
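The identity above can be verified directly in base R. The sketch below defines a hypothetical helper, plugin.entropy (not part of the entropy package), that computes a plug-in entropy in nats, and combines the marginal and joint entropies as in the formula:

```r
# plug-in entropy of a frequency vector/matrix, in nats (illustrative helper)
plugin.entropy <- function(freqs) {
  p <- freqs / sum(freqs)   # normalize frequencies to probabilities
  p <- p[p > 0]             # drop empty bins: 0 * log(0) is taken as 0
  -sum(p * log(p))
}

# joint bin frequencies of two discrete variables
freqs2d <- rbind(c(0.2, 0.1, 0.15), c(0, 0.3, 0.25))

# MI = H(X) + H(Y) - H(X, Y), with marginals from row and column sums
mi <- plugin.entropy(rowSums(freqs2d)) +
      plugin.entropy(colSums(freqs2d)) -
      plugin.entropy(freqs2d)
mi
```

For this table the result is about 0.199 nats; mi.plugin(freqs2d) computes the same quantity.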

See Also

entropy.plugin.

Examples

# load entropy library
library("entropy")

# joint distribution of two discrete variables
freqs2d <- rbind(c(0.2, 0.1, 0.15), c(0, 0.3, 0.25))

# and corresponding mutual information
mi.plugin(freqs2d)