Entropy and information provide natural measures of correlation among elements in a network. We construct here the information theoretic analog of connected correlation functions: irreducible N-point correlation is measured by a decrease in entropy for the joint distribution of N variables relative to the maximum entropy allowed by all the observed (N-1)-variable distributions. We calculate the "connected information" terms for several examples and show that this construction also enables the decomposition of the information that is carried by a population of elements about an outside source.
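The construction above can be sketched numerically for three binary variables. A minimal illustration, under assumptions not specified in the abstract: the maximum-entropy distribution consistent with all pairwise marginals is found by iterative proportional fitting (a standard algorithm, not necessarily the authors' choice), and the connected k-point information is the entropy drop between successive maximum-entropy fits. The XOR example and all function names are illustrative.

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a joint distribution given as an array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def maxent_pairwise(p, n_iter=200):
    """Iterative proportional fitting: maximum-entropy joint distribution
    over three binary variables consistent with all pairwise marginals of p."""
    q = np.full_like(p, 1.0 / p.size)  # start from the uniform distribution
    for i, j in [(0, 1), (0, 2), (1, 2)]:
        pass  # pairs of axes whose marginals are constrained
    for _ in range(n_iter):
        for i, j in [(0, 1), (0, 2), (1, 2)]:
            k = 3 - i - j                  # the remaining (marginalized) axis
            target = p.sum(axis=k)         # observed pairwise marginal
            current = q.sum(axis=k)
            ratio = np.divide(target, current,
                              out=np.ones_like(target), where=current > 0)
            q = q * np.expand_dims(ratio, axis=k)
    return q

# Illustrative example: x3 = x1 XOR x2, with x1, x2 uniform and independent.
# All pairwise marginals are uniform, so the pairwise maximum-entropy fit
# carries no trace of the three-way dependence.
p = np.zeros((2, 2, 2))
for x1, x2 in product([0, 1], repeat=2):
    p[x1, x2, x1 ^ x2] = 0.25

q2 = maxent_pairwise(p)  # maxent given pairwise marginals

# Maxent given only single-variable marginals: the product distribution.
marg = [p.sum(axis=tuple(a for a in range(3) if a != i)) for i in range(3)]
q1 = marg[0][:, None, None] * marg[1][None, :, None] * marg[2][None, None, :]

I2 = entropy(q1) - entropy(q2)  # connected pairwise information (0 bits here)
I3 = entropy(q2) - entropy(p)   # connected triplet information (1 bit here)
```

For the XOR distribution, all correlation is irreducibly three-body: the pairwise connected information vanishes while the triplet term carries one full bit, matching the idea that the connected terms decompose the total multi-information order by order.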