Local Nearest Neighbor (LNN) mutual information estimator of Gao et al. (2017). This estimator plugs the LNN entropy estimator (lnn_entropy) into the mutual information identity I(X;Y) = H(X) + H(Y) - H(X,Y).
Usage
lnn_mi(data, splits, k = 5, tr = 30)
Arguments
data
Matrix of sample observations, each row is an observation.
splits
A vector describing which sets of columns in data to compute the mutual information between. For example, to compute the mutual information between two scalar variables, use splits = c(1,1). To compute the redundancy among all variables, use splits = rep(1, ncol(data)). To compute the mutual information between two random vectors, list the dimensions of each vector.
k
Order of the local kNN bandwidth selection.
tr
Order of truncation (number of neighbors to include in the local density estimation).
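The splits argument above can be illustrated with a small sketch. The example below assumes lnn_mi is available as documented (e.g. via the rmi package); the data-generating choices (sample size, noise level) are illustrative only.

```r
# Hedged sketch: estimate MI between two correlated scalar variables.
# Assumes lnn_mi() with the signature documented above.
set.seed(1)
n <- 1000
x <- rnorm(n)
y <- x + rnorm(n)            # y is correlated with x, so MI > 0
data <- cbind(x, y)

# splits = c(1,1): MI between two 1-dimensional variables
lnn_mi(data, splits = c(1, 1))

# For a 5-column matrix holding a 2-dim and a 3-dim random vector,
# the call would be lnn_mi(data5, splits = c(2, 3)).
```

Increasing k smooths the local bandwidth selection, while tr controls how many neighbors enter each local density estimate; both trade bias against variance.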
References
Gao, W., Oh, S., & Viswanath, P. (2017). Density functional estimators with k-nearest neighbor bandwidths. IEEE International Symposium on Information Theory - Proceedings, 1, 1351–1355.