SpatEntropy (version 0.1.0)

shannonZ_sq: Shannon's entropy of \(Z\) with a squared information function.

Description

This function computes Shannon's entropy of \(Z\) with the square of the information function.

Usage

shannonZ_sq(shannZ)

Arguments

shannZ

Output of shannonZ()

Value

Estimated probabilities for all \(Z\) categories (data pairs), and Shannon's entropy of \(Z\) with a squared information function.

Details

This computes a version of Shannon's entropy of \(Z\) (see shannonZ()) in which the information function \(\log(1/p(z_r))\) is squared: $$H(Z)_2=\sum_r p(z_r)\left[\log\left(1/p(z_r)\right)\right]^2$$ It is useful for estimating the variance of the maximum likelihood estimator of Shannon's entropy given by shannonZ().
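
To see what the formula amounts to, the short R sketch below evaluates \(H(Z)_2\) directly from a vector of estimated category probabilities. The helper name h2_direct is hypothetical and the sketch is not the package's internal implementation; in practice shannonZ_sq() should be called on the output of shannonZ().

# A minimal sketch of the formula above, computed from a vector of
# estimated probabilities p(z_r); NOT the package's internal code,
# and h2_direct is a hypothetical name used only for illustration.
h2_direct <- function(p) {
  p <- p[p > 0]            # categories with zero estimated probability contribute nothing
  sum(p * log(1 / p)^2)    # H(Z)_2 = sum_r p(z_r) * [log(1/p(z_r))]^2
}

# Toy check: with 4 equally likely categories every term is 0.25*log(4)^2,
# so H(Z)_2 = log(4)^2
h2_direct(rep(0.25, 4))

Together with \(H(Z)\) returned by shannonZ(), this quantity enters the standard large-sample approximation of the variance of the plug-in entropy estimator, roughly \((H(Z)_2-H(Z)^2)/n\), which is the use referred to above.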

Examples

library(SpatEntropy)
library(spatstat) #provides runifpoint(), square() and marks() used below

#NON SPATIAL DATA
shZ=shannonZ(sample(1:5, 50, replace=TRUE)) #50 draws from 5 categories
shannonZ_sq(shZ)

#POINT DATA
data.pp=runifpoint(100, win=square(10)) #100 uniform random points in a 10x10 window
marks(data.pp)=sample(c("a","b","c"), 100, replace=TRUE) #attach categorical marks
shZ=shannonZ(marks(data.pp))
shannonZ_sq(shZ)

#LATTICE DATA
data.lat=matrix(sample(c("a","b","c"), 100, replace=TRUE), nrow=10) #10x10 grid of categories
shZ=shannonZ(data.lat)
shannonZ_sq(shZ)
