Dual total correlation
In information theory, dual total correlation,[1] information rate,[2] excess entropy,[3][4] or binding information[5] is one of several known non-negative generalizations of mutual information. While the total correlation is bounded above by the sum of the entropies of the n elements, the dual total correlation is bounded above by the joint entropy of the n elements. Although well behaved, dual total correlation has received much less attention than the total correlation. A measure known as "TSE-complexity" defines a continuum between the total correlation and dual total correlation.[3]
Definition
For a set of n random variables [math]\displaystyle{ \{X_1,\ldots,X_n\} }[/math], the dual total correlation [math]\displaystyle{ D(X_1,\ldots,X_n) }[/math] is given by
- [math]\displaystyle{ D(X_1,\ldots,X_n) = H\left( X_1, \ldots, X_n \right) - \sum_{i=1}^n H\left( X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n \right) , }[/math]
where [math]\displaystyle{ H(X_{1},\ldots,X_{n}) }[/math] is the joint entropy of the variable set [math]\displaystyle{ \{X_{1},\ldots,X_{n}\} }[/math] and [math]\displaystyle{ H(X_i \mid \cdots ) }[/math] is the conditional entropy of variable [math]\displaystyle{ X_{i} }[/math], given the rest.
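For discrete variables with a known joint distribution, both terms can be computed directly from the joint probability table. The following Python sketch (not part of the original article; the function names are illustrative) computes the dual total correlation in bits:

```python
import numpy as np

def joint_entropy(p):
    """Shannon entropy (in bits) of a joint pmf given as an n-dimensional array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def dual_total_correlation(p):
    """D(X_1,...,X_n) = H(X_1,...,X_n) - sum_i H(X_i | all other variables)."""
    n = p.ndim
    h_joint = joint_entropy(p)
    d = h_joint
    for i in range(n):
        # H(X_i | rest) = H(X_1,...,X_n) - H(rest); summing axis i out of
        # the joint table gives the marginal over the remaining variables.
        h_rest = joint_entropy(p.sum(axis=i))
        d -= h_joint - h_rest
    return d

# Two perfectly correlated fair bits: each conditional entropy vanishes,
# so D equals the joint entropy, 1 bit.
p = np.zeros((2, 2))
p[0, 0] = p[1, 1] = 0.5
print(dual_total_correlation(p))  # 1.0
```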
Normalized
The dual total correlation normalized to the interval [0,1] is simply the dual total correlation divided by its maximum value [math]\displaystyle{ H(X_{1}, \ldots, X_{n}) }[/math],
- [math]\displaystyle{ ND(X_1,\ldots,X_n) = \frac{D(X_1,\ldots,X_n)}{H(X_1,\ldots,X_n)} . }[/math]
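Continuing the sketch above, the normalization is a single division (defined whenever the joint entropy is positive):

```python
def normalized_dtc(p):
    """ND = D / H, in [0, 1]; 0 by convention when the joint entropy is zero."""
    h = joint_entropy(p)
    return dual_total_correlation(p) / h if h > 0 else 0.0

print(normalized_dtc(p))  # 1.0 for the two copied bits above
```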
Relationship with total correlation
Dual total correlation is non-negative and bounded above by the joint entropy [math]\displaystyle{ H(X_1, \ldots, X_n) }[/math].
- [math]\displaystyle{ 0 \leq D(X_1, \ldots, X_n) \leq H(X_1, \ldots, X_n) . }[/math]
Dual total correlation also has a close relationship with the total correlation [math]\displaystyle{ C(X_1, \ldots, X_n) }[/math], and can be written in terms of differences between the total correlation of the whole set and the total correlations of all subsets of size [math]\displaystyle{ n-1 }[/math]:[6]
- [math]\displaystyle{ D(\textbf{X}) = (n-1)\,C(\textbf{X}) - \sum_{i=1}^{n} C(\textbf{X}^{-i}) , }[/math]
where [math]\displaystyle{ \textbf{X} = \{X_1,\ldots,X_n\} }[/math] and [math]\displaystyle{ \textbf{X}^{-i} = \{X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n\} }[/math].
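This identity can be checked numerically. The sketch below reuses the helpers above; `total_correlation` is an assumed helper implementing C as the sum of marginal entropies minus the joint entropy:

```python
def total_correlation(p):
    """C(X_1,...,X_n) = sum_i H(X_i) - H(X_1,...,X_n)."""
    n = p.ndim
    h_marginals = sum(
        joint_entropy(p.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n))
    return h_marginals - joint_entropy(p)

rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()                 # random joint pmf over three binary variables
n = p.ndim
lhs = dual_total_correlation(p)
rhs = (n - 1) * total_correlation(p) - sum(
    total_correlation(p.sum(axis=i)) for i in range(n))
print(np.isclose(lhs, rhs))  # True
```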
Furthermore, the total correlation and dual total correlation are related by the following bounds:
- [math]\displaystyle{ \frac{C(X_1, \ldots, X_n)}{n-1} \leq D(X_1, \ldots, X_n) \leq (n-1) \; C(X_1, \ldots, X_n) . }[/math]
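Both bounds can be confirmed on the same random distribution from the check above:

```python
c = total_correlation(p)
d = dual_total_correlation(p)
print(c / (n - 1) <= d <= (n - 1) * c)  # True
```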
Finally, the difference between the total correlation and the dual total correlation defines a measure of higher-order information-sharing known as the O-information:[7]
- [math]\displaystyle{ \Omega(\textbf{X}) = C(\textbf{X}) - D(\textbf{X}) }[/math].
The O-information (first introduced as the "enigmatic information" by James and Crutchfield[8]) is a signed measure that quantifies the extent to which the information in a multivariate random variable is dominated by synergistic interactions (in which case [math]\displaystyle{ \Omega(\textbf{X}) \lt 0 }[/math]) or redundant interactions (in which case [math]\displaystyle{ \Omega(\textbf{X}) \gt 0 }[/math]).
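The sign convention is easy to illustrate with two extreme three-bit distributions, again reusing the helpers above (a sketch, not code from the cited papers): the XOR distribution is purely synergistic, while three copies of one bit are purely redundant.

```python
def o_information(p):
    """O-information: total correlation minus dual total correlation."""
    return total_correlation(p) - dual_total_correlation(p)

# Synergy: X_3 = X_1 XOR X_2 with X_1, X_2 independent fair bits.
p_xor = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p_xor[x1, x2, x1 ^ x2] = 0.25
print(o_information(p_xor))   # -1.0 bits (synergy-dominated)

# Redundancy: three identical copies of one fair bit.
p_copy = np.zeros((2, 2, 2))
p_copy[0, 0, 0] = p_copy[1, 1, 1] = 0.5
print(o_information(p_copy))  # +1.0 bits (redundancy-dominated)
```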
History
Han (1978) originally defined the dual total correlation as,
- [math]\displaystyle{ \begin{align} & D(X_1,\ldots,X_n) \\[10pt] \equiv {} & \left[ \sum_{i=1}^n H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n ) \right] - (n-1) \; H(X_1, \ldots, X_n) \; . \end{align} }[/math]
However, Abdallah and Plumbley (2010) showed its equivalence to the easier-to-understand form, the joint entropy minus the sum of conditional entropies, via the following derivation:
- [math]\displaystyle{ \begin{align} & D(X_1,\ldots,X_n) \\[10pt] \equiv {} & \left[ \sum_{i=1}^n H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n ) \right] - (n-1) \; H(X_1, \ldots, X_n) \\ = {} & \left[ \sum_{i=1}^n H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n ) \right] + (1-n) \; H(X_1, \ldots, X_n) \\ = {} & H(X_1, \ldots, X_n) + \left[ \sum_{i=1}^n H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n ) - H(X_1, \ldots, X_n) \right] \\ = {} & H\left( X_1, \ldots, X_n \right) - \sum_{i=1}^n H\left( X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n \right)\; . \end{align} }[/math]
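The equivalence is also easy to confirm numerically, evaluating Han's original form directly on the random distribution `p` from the earlier check:

```python
han_form = sum(joint_entropy(p.sum(axis=i)) for i in range(n)) \
           - (n - 1) * joint_entropy(p)
print(np.isclose(han_form, dual_total_correlation(p)))  # True
```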
Bibliography
Footnotes
- ↑ Han, Te Sun (1978). "Nonnegative entropy measures of multivariate symmetric correlations". Information and Control 36 (2): 133–156. doi:10.1016/S0019-9958(78)90275-9.
- ↑ Dubnov, Shlomo (2006). "Spectral Anticipations". Computer Music Journal 30 (2): 63–83. doi:10.1162/comj.2006.30.2.63.
- ↑ Ay, Nihat; Olbrich, E.; Bertschinger, N. (2001). "A unifying framework for complexity measures of finite systems". European Conference on Complex Systems.
- ↑ Olbrich, E.; Bertschinger, N.; Ay, N.; Jost, J. (2008). "How should complexity scale with system size?". The European Physical Journal B 63 (3): 407–415. doi:10.1140/epjb/e2008-00134-9. Bibcode: 2008EPJB...63..407O.
- ↑ Abdallah, Samer A.; Plumbley, Mark D. (2010). "A measure of statistical complexity based on predictive information". arXiv:1012.1890v1 [math.ST].
- ↑ Varley, Thomas F.; Pope, Maria; Faskowitz, Joshua; Sporns, Olaf (24 April 2023). "Multivariate information theory uncovers synergistic subsystems of the human cerebral cortex". Communications Biology 6 (1). doi:10.1038/s42003-023-04843-w.
- ↑ Rosas, Fernando E.; Mediano, Pedro A. M.; Gastpar, Michael; Jensen, Henrik J. (13 September 2019). "Quantifying high-order interdependencies via multivariate extensions of the mutual information". Physical Review E 100 (3). doi:10.1103/PhysRevE.100.032305.
- ↑ James, Ryan G.; Ellison, Christopher J.; Crutchfield, James P. (1 September 2011). "Anatomy of a bit: Information in a time series observation". Chaos: An Interdisciplinary Journal of Nonlinear Science 21 (3). doi:10.1063/1.3637494.
References
- Fujishige, Satoru (1978). "Polymatroidal dependence structure of a set of random variables". Information and Control 39: 55–72. doi:10.1016/S0019-9958(78)91063-X.
- Varley, Thomas F.; Pope, Maria; Faskowitz, Joshua; Sporns, Olaf (2023). "Multivariate information theory uncovers synergistic subsystems of the human cerebral cortex". Communications Biology 6 (1). doi:10.1038/s42003-023-04843-w.
Original source: https://en.wikipedia.org/wiki/Dual_total_correlation