
entropart (version 1.2.0)

Hqz: Similarity-based entropy of a community

Description

Calculates the entropy of order $q$ of a probability vector according to a similarity matrix.

Usage

Hqz(Ps, q = 1, Z = diag(length(Ps)), CheckArguments = TRUE)
bcHqz(Ns, q = 1, Z = diag(length(Ns)), Correction = "Best", CheckArguments = TRUE)

Arguments

Ps
A probability vector, summing to 1.
Ns
A numeric vector containing species abundances.
q
A number: the order of entropy. Default is 1.
Z
A relatedness matrix, i.e. a square matrix whose terms are all positive and strictly positive on the diagonal. Generally, the matrix is a similarity matrix, i.e. the diagonal terms equal 1 and the other terms are between 0 and 1. Default is the identity matrix (as in the Usage section), which reduces the calculation to the classical, similarity-free entropy.
Correction
A string containing one of the possible corrections: "None" (no correction), "ChaoShen", "MarconZhang" or "Best", the default value. The "MarconZhang" correction assumes a similarity matrix.
CheckArguments
Logical; if TRUE, the function arguments are verified. Should be set to FALSE to save time when the arguments have been checked elsewhere.

Value

  • A number equal to the calculated entropy.

Details

Entropy is calculated following Leinster and Cobbold (2012) after Ricotta and Szeidl (2006): it is the entropy of order q of the community, using species ordinariness as the information function. A similarity matrix is used (as for Dqz), not a distance matrix as in Ricotta and Szeidl (2006); see the example. Bias correction requires the number of individuals: use bcHqz and choose the Correction. Correction techniques are from Marcon and Zhang (2014). Currently, the "Best" correction is the maximum of the "ChaoShen" and "MarconZhang" values.
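The Leinster and Cobbold (2012) calculation can be sketched in a few lines of base R: species ordinariness is the expected similarity of an individual to the community, Z %*% Ps, and the entropy of order q is taken over it. This is a minimal illustration only, not the package's implementation (no bias correction, no argument checking), and the helper name HqzSketch is hypothetical:

```r
# Minimal sketch of similarity-based entropy (Leinster & Cobbold, 2012).
# HqzSketch is a hypothetical name, not a function of entropart.
HqzSketch <- function(Ps, q = 1, Z = diag(length(Ps))) {
  # Species ordinariness: expected similarity of an individual to the community
  Zp <- as.vector(Z %*% Ps)
  if (q == 1) {
    # Limit case q -> 1: similarity-weighted Shannon entropy
    -sum(Ps * log(Zp))
  } else {
    (1 - sum(Ps * Zp^(q - 1))) / (q - 1)
  }
}

# With the default identity matrix, q = 1 reduces to Shannon entropy:
Ps <- c(0.5, 0.3, 0.2)
HqzSketch(Ps, 1)  # equals -sum(Ps * log(Ps))
```

With the identity matrix and q = 2, the sketch likewise reduces to the Simpson-like 1 - sum(Ps^2), which is why a similarity matrix is what makes Hqz differ from the neutral entropies.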

References

Leinster, T. and Cobbold, C. (2012). Measuring diversity: the importance of species similarity. Ecology 93(3): 477-489.

Marcon, E. and Zhang, Z. (2014). The decomposition of similarity-based diversity and its bias correction. HAL hal-00989454 (version 1).

Ricotta, C. and Szeidl, L. (2006). Towards a unifying approach to diversity measures: Bridging the gap between the Shannon entropy and Rao's quadratic index. Theoretical Population Biology 70(3): 237-243.

See Also

Dqz, PhyloEntropy

Examples

# Load Paracou data (number of trees per species in two 1-ha plots of a tropical forest)
data(Paracou618)
# Prepare the similarity matrix
DistanceMatrix <- as.matrix(EightSpTree$Wdist^2/2)
# Similarity can be 1 minus normalized distances between species
Z <- 1 - DistanceMatrix/max(DistanceMatrix)
# Calculate entropy of order 2
Ps <- EightSpAbundance/sum(EightSpAbundance)
Hqz(Ps, 2, Z)
# Equal to normalized Rao quadratic entropy when q=2
Rao(Ps, EightSpTree)/max(DistanceMatrix)
# But different from PhyloEntropy for all other q, e.g. 1
Hqz(Ps, 1, Z)
summary(PhyloEntropy(Ps, 1, EightSpTree))
