Calculate the entropy, joint entropy, entropy distance and information content of two splits, treating each split as a division of n leaves into two groups. Further details are available in a vignette, and in MacKay (2003) and Meila (2007).
Value
A numeric vector listing, in bits:
H1: The entropy of split 1;
H2: The entropy of split 2;
H12: The joint entropy of both splits;
I: The mutual information of the splits;
Hd: The entropy distance (variation of information) of the splits.
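As an illustration of how these five values relate, here is a minimal Python sketch (not the package's R implementation; the function name `split_entropy` and the boolean-vector encoding are illustrative assumptions) that computes them from two splits of the same leaves:

```python
from math import log2


def entropy(counts):
    """Shannon entropy, in bits, of a partition given its group sizes."""
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c)


def split_entropy(in1, in2):
    """Entropy statistics for two splits of the same n leaves.

    in1, in2: boolean vectors; True marks leaves on one side of that split.
    (Hypothetical helper, not the package's API.)
    """
    n = len(in1)
    h1 = entropy([sum(in1), n - sum(in1)])
    h2 = entropy([sum(in2), n - sum(in2)])
    # Joint partition: the four cells obtained by overlaying both splits
    cells = [
        sum(a and b for a, b in zip(in1, in2)),
        sum(a and not b for a, b in zip(in1, in2)),
        sum(not a and b for a, b in zip(in1, in2)),
        sum(not a and not b for a, b in zip(in1, in2)),
    ]
    h12 = entropy(cells)
    i = h1 + h2 - h12  # mutual information
    hd = h12 - i       # entropy distance (variation of information)
    return {"H1": h1, "H2": h2, "H12": h12, "I": i, "Hd": hd}
```

For two identical even splits of four leaves, I = 1 bit and Hd = 0; for two "perpendicular" splits such as AB|CD and AC|BD, the joint entropy is 2 bits, so I = 0 and Hd = 2.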
References
MacKay DJC (2003).
Information Theory, Inference, and Learning Algorithms.
Cambridge University Press, Cambridge.
https://www.inference.org.uk/itprnn/book.pdf.
Meila M (2007).
“Comparing clusterings—an information based distance.”
Journal of Multivariate Analysis, 98(5), 873–895.
doi:10.1016/j.jmva.2006.11.013.
See also
Other information functions:
SplitSharedInformation(),
TreeInfo