SpatEntropy (version 0.1.0)

shannonX_sq: Shannon's entropy with a squared information function.

Description

This function computes Shannon's entropy of \(X\) with the square of the information function.

Usage

shannonX_sq(data)

Arguments

data

A data matrix or vector; it may be numeric, factor, character, etc. If the dataset is a point pattern, data is the mark vector.

Value

Estimated probabilities for all data categories, and Shannon's entropy of \(X\) with a squared information function.

Details

This computes a version of Shannon's entropy (see shannonX()) where the information function \(\log(1/p(x_i))\) is squared: $$H(X)_2=\sum_i p(x_i)\,[\log(1/p(x_i))]^2$$ It is useful for estimating the variance of the maximum likelihood estimator of Shannon's entropy given by shannonX().
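
For illustration, here is a minimal sketch of the computation in plain R, assuming plug-in probabilities estimated from relative frequencies; the last line applies the classical first-order approximation to the variance of the plug-in entropy estimator and is not part of shannonX_sq()'s output.

x <- sample(c("a","b","c"), 50, replace=TRUE)
p <- table(x)/length(x)          # plug-in probability estimates
H <- sum(p*log(1/p))             # Shannon's entropy, as computed by shannonX()
H2 <- sum(p*log(1/p)^2)          # entropy with squared information function, as computed by shannonX_sq()
var.H <- (H2 - H^2)/length(x)    # approximate variance of the plug-in estimator (assumed formula)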

Examples

library(SpatEntropy)
library(spatstat)   # provides runifpoint(), square() and marks()

# NON-SPATIAL DATA
shannonX_sq(sample(1:5, 50, replace=TRUE))

# POINT DATA: for a point pattern, pass the mark vector
data.pp=runifpoint(100, win=square(10))
marks(data.pp)=sample(c("a","b","c"), 100, replace=TRUE)
shannonX_sq(marks(data.pp))

# LATTICE DATA: a matrix of categories
data.lat=matrix(sample(c("a","b","c"), 100, replace=TRUE), nrow=10)
shannonX_sq(data.lat)

