TreeDist (version 1.1.1)

SplitEntropy: Entropy of two splits

Description

Calculate the entropy, joint entropy, entropy distance (variation of information) and mutual information of two splits, treating each split as a division of n leaves into two groups. Further details are available in the package vignette, and in MacKay (2003) and Meilă (2007).
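
The entropy of a single split is the Shannon entropy, in bits, of the two-way partition that it induces on the leaves. As a sketch of that definition, the helper SplitH below (a hypothetical name, not a TreeDist function) computes it directly:

SplitH <- function(split) {
  # Proportion of leaves on each side of the split
  p <- c(mean(split), 1 - mean(split))
  p <- p[p > 0]  # treat 0 * log(0) as zero
  -sum(p * log2(p))
}
SplitH(c(TRUE, TRUE, TRUE, FALSE, FALSE, FALSE))  # even 3|3 split: 1 bit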

Usage

SplitEntropy(split1, split2 = split1)

Arguments

split1, split2

Logical vectors listing leaves in a consistent order, identifying each leaf as a member of the ingroup (TRUE) or outgroup (FALSE) of the split in question.
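
For example, on a tree whose six leaves are listed in the order t1, ..., t6, the split t1 t2 t3 | t4 t5 t6 would be encoded as (a purely illustrative assignment):

split1 <- c(TRUE, TRUE, TRUE, FALSE, FALSE, FALSE)  # t1-t3 form the ingroup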

Value

A numeric vector listing, in bits:

  • H1 The entropy of split 1;

  • H2 The entropy of split 2;

  • H12 The joint entropy of both splits;

  • I The mutual information of the splits;

  • Hd The entropy distance (variation of information) of the splits.
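
These values satisfy I = H1 + H2 - H12 and Hd = H12 - I. As a sketch of where H12 comes from, the helper JointEntropy below (a hypothetical name, not part of TreeDist) computes the joint entropy from the 2 x 2 contingency table of the two splits:

JointEntropy <- function(s1, s2) {
  # Joint distribution over the four ingroup/outgroup combinations
  p <- as.vector(table(s1, s2)) / length(s1)
  p <- p[p > 0]  # treat 0 * log(0) as zero
  -sum(p * log2(p))
}
JointEntropy(c(TRUE, TRUE, TRUE, FALSE, FALSE, FALSE),
             c(TRUE, TRUE, FALSE, FALSE, FALSE, FALSE))  # about 1.46 bits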

References

MacKay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge University Press, Cambridge.

Meilă, M. (2007). Comparing clusterings – an information based distance. Journal of Multivariate Analysis, 98(5), 873–895. doi:10.1016/j.jmva.2006.11.013

See Also

Other information functions: ClusteringEntropy(), SplitSharedInformation(), SplitwiseInfo()

Examples

# Encode each split as a logical vector: TRUE = ingroup, FALSE = outgroup
A <- TRUE
B <- FALSE
# Compare a 3|3 split with a 2|4 split on the same six leaves
SplitEntropy(c(A, A, A, B, B, B), c(A, A, B, B, B, B))
# Returns, in bits: H1 = 1, H2 ≈ 0.92, H12 ≈ 1.46, I ≈ 0.46, Hd = 1