entropart (version 1.1.3)

Hqz: Similarity-based entropy of a community

Description

Calculates the entropy of order $q$ of a probability vector according to a similarity matrix.

Usage

Hqz(Ps, q, Z, CheckArguments = TRUE)

Arguments

Ps
A probability vector, summing to 1.
q
A number: the order of entropy.
Z
A relatedness matrix, i.e. a square matrix whose terms are all non-negative, with strictly positive values on the diagonal.
CheckArguments
Logical; if TRUE, the function arguments are verified. Should be set to FALSE to save time when the arguments have been checked elsewhere.

Value

  • A number equal to the calculated entropy.

Details

Entropy is calculated following Leinster and Cobbold (2012) after Ricotta and Szeidl (2006): it is the entropy of order q of the community, using species ordinariness as the information function. A similarity matrix is used (as for Dqz), not a distance matrix as in Ricotta and Szeidl (2006). See the example.
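Concretely, for $q \ne 1$ the entropy is $\frac{1 - \sum_s p_s (Zp)_s^{q-1}}{q-1}$, and for $q = 1$ it is $-\sum_s p_s \ln (Zp)_s$, where $(Zp)_s$ is the ordinariness of species $s$ (its expected similarity to an individual drawn at random from the community). A minimal base-R sketch of this calculation, using a toy probability vector and similarity matrix rather than entropart data:

```r
# Sketch of similarity-based entropy (Ricotta & Szeidl 2006; Leinster & Cobbold 2012).
# Toy values for illustration only, not taken from entropart.
Ps <- c(0.5, 0.3, 0.2)
Z  <- matrix(c(1.0, 0.5, 0.2,
               0.5, 1.0, 0.4,
               0.2, 0.4, 1.0), nrow = 3)
q <- 2
# Species ordinariness: expected similarity of each species to the community
Ordinariness <- as.vector(Z %*% Ps)
# Generalized entropy of order q of the ordinariness
if (q == 1) {
  H <- -sum(Ps * log(Ordinariness))
} else {
  H <- (1 - sum(Ps * Ordinariness^(q - 1))) / (q - 1)
}
H
# 0.382
```

With `Z` equal to the identity matrix, the ordinariness reduces to `Ps` itself and the formula reduces to ordinary Tsallis entropy.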

References

Leinster, T. and Cobbold, C. (2012). Measuring diversity: the importance of species similarity. Ecology 93(3): 477-489. Ricotta, C. and Szeidl, L. (2006). Towards a unifying approach to diversity measures: Bridging the gap between the Shannon entropy and Rao's quadratic index. Theoretical Population Biology 70(3): 237-243.

See Also

Dqz, PhyloEntropy

Examples

  # Load Paracou data (number of trees per species in two 1-ha plots of a tropical forest)
  data(Paracou618)
  # Prepare a distance matrix between species
  DistanceMatrix <- as.matrix(EightSpTree$Wdist^2/2)
  # Similarity can be 1 minus normalized distances between species
  Z <- 1 - DistanceMatrix/max(DistanceMatrix)
  # Calculate diversity of order 2
  Ps <- EightSpAbundance/sum(EightSpAbundance)
  Hqz(Ps, 2, Z)
  # Equal to normalized Rao quadratic entropy when q=2
  Rao(Ps, EightSpTree)/max(DistanceMatrix)
  # But different from PhyloEntropy for all other q, e.g. 1
  Hqz(Ps, 1, Z)
  summary(PhyloEntropy(Ps, 1, EightSpTree))