
copent (version 0.5)

entknn: Estimating entropy from data with the kNN method

Description

Estimating entropy from data with the kNN method.

Usage

entknn(x, k = 3, dt = 2)

Value

The function returns the estimated entropy value of data x.

Arguments

x

the data with each row as a sample

k

kth nearest neighbour, default = 3

dt

the type of distance between samples: 1 for Euclidean distance; any other value (default 2) for Maximum distance

Details

This function estimates entropy from data with the kNN method proposed in Kraskov et al. (2004). The algorithm is the second step of the copula entropy estimation implemented in copent.

The argument x is the data, with each row a sample from the random variables. The arguments k and dt are used by the kNN method: k is the order of the nearest neighbour (default = 3), and dt is the type of distance between samples, which currently has two options (1 for Euclidean distance, and 2, the default, for Maximum distance).
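
A minimal sketch of the underlying kNN (Kozachenko-Leonenko / Kraskov) estimator is given below; it only illustrates the formula the method is based on, is not the package's actual implementation, and the helper name entknn_sketch is hypothetical.

entknn_sketch <- function(x, k = 3, dt = 2) {
  # H ~= -digamma(k) + digamma(N) + log(c_d) + (d/N) * sum(log(eps_i)),
  # where eps_i is twice the distance from sample i to its k-th nearest
  # neighbour and c_d is the volume of the d-dimensional ball of unit
  # diameter under the chosen norm (Kraskov et al. 2004, Eq. 20)
  x <- as.matrix(x)
  N <- nrow(x); d <- ncol(x)
  dmat <- as.matrix(dist(x, method = if (dt == 1) "euclidean" else "maximum"))
  eps <- 2 * apply(dmat, 1, function(r) sort(r)[k + 1])  # k-th neighbour, self excluded
  logcd <- if (dt == 1) (d / 2) * log(pi) - lgamma(1 + d / 2) - d * log(2) else 0
  -digamma(k) + digamma(N) + logcd + (d / N) * sum(log(eps))
}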

References

Kraskov, A., Stögbauer, H., & Grassberger, P. (2004). Estimating Mutual Information. Physical Review E, 69(6), 066138.

Examples


library(copent)  # provides entknn
library(mnormt)  # provides rmnorm
# draw 500 samples from a bivariate normal with correlation 0.5
rho <- 0.5
sigma <- matrix(c(1, rho, rho, 1), 2, 2)
x <- rmnorm(500, c(0, 0), sigma)
# estimate the entropy of the sample with the kNN method
xent1 <- entknn(x)
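
For reference, the exact differential entropy of this bivariate normal is 0.5 * log(det(2 * pi * e * sigma)). Assuming entknn reports entropy in nats, the estimate above should be close to this value for a sample of this size; the comparison below is only an illustrative sketch.

true_ent <- 0.5 * log(det(2 * pi * exp(1) * sigma))  # exact entropy in nats
xent1 - true_ent  # estimation error of the kNN estimate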
