entropy.estimate: Vasicek estimate of differential Shannon entropy
Description
Computes the Vasicek estimate of differential Shannon entropy from a numeric sample.
Usage
entropy.estimate(x, window)
Arguments
x
(numeric, vector) the numeric sample.
window
(numeric, single value) an integer between 1 and half of the sample size, specifying the window size used to compute the Vasicek estimate. See Details for additional information.
Value
A single numeric value representing the Vasicek estimate of the differential Shannon entropy of the sample.
Details
The Vasicek estimator of Shannon entropy is defined, for a random sample \(X_1, \dots, X_n\), by
$$\frac{1}{n}\sum_{i=1}^{n} \log\left(\frac{n}{2m}\left[X_{(i+m)}-X_{(i-m)}\right]\right),$$
where \(X_{(i)}\) denotes the \(i\)-th order statistic, \(m < n/2\) is the window size, and, by convention, \(X_{(i)} = X_{(1)}\) for \(i < 1\) and \(X_{(i)} = X_{(n)}\) for \(i > n\).
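The formula above can be written out directly in a few lines of R. The helper name vasicek below is illustrative only (it is not part of the package) and serves as a sketch of the computation that entropy.estimate performs:

```r
# Sketch of the Vasicek estimator, following the Details formula.
# Boundary convention: X_(i) = X_(1) for i < 1, X_(i) = X_(n) for i > n.
vasicek <- function(x, window) {
  n  <- length(x)
  xs <- sort(x)                            # order statistics X_(1), ..., X_(n)
  hi <- pmin(seq_len(n) + window, n)       # index i + m, clamped at n
  lo <- pmax(seq_len(n) - window, 1)       # index i - m, clamped at 1
  mean(log(n / (2 * window) * (xs[hi] - xs[lo])))
}

set.seed(2)
samp <- rnorm(100)
vasicek(samp, window = 8)  # should agree with entropy.estimate(x = samp, window = 8)
```

For a standard normal sample the estimate should be near the true entropy log(2*pi*exp(1))/2, with the downward bias typical of the raw Vasicek estimator at moderate sample sizes.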
References
Vasicek, O. (1976). A test for normality based on sample entropy. Journal of the Royal Statistical Society, Series B, 38(1), 54-59.
See Also
vs.test, which performs Vasicek-Song goodness-of-fit tests for the specified maximum-entropy distribution family.
Examples
set.seed(2)
samp <- rnorm(100, mean = 0, sd = 1)
entropy.estimate(x = samp, window = 8)
log(2 * pi * exp(1)) / 2  # true entropy of the standard normal distribution