Partial information decomposition
Partial information decomposition is an extension of information theory that aims to generalize the pairwise relations described by classical information theory to interactions among multiple variables.[1]
Motivation
Information theory can quantify the amount of information a single source variable $X_1$ has about a target variable $Y$ via the mutual information $I(X_1; Y)$. If we now consider a second source variable $X_2$, classical information theory can only describe the mutual information of the joint variable $\{X_1, X_2\}$ with $Y$, given by $I(X_1, X_2; Y)$. In general, however, it would be interesting to know how exactly the individual variables $X_1$ and $X_2$ and their interactions relate to $Y$.
Consider that we are given two source variables $X_1, X_2 \in \{0,1\}$ and a target variable $Y = \mathrm{XOR}(X_1, X_2)$. In this case the total mutual information is $I(X_1, X_2; Y) = 1$ bit, while the individual mutual informations are $I(X_1; Y) = I(X_2; Y) = 0$ bit. That is, there is synergistic information arising from the interaction of $X_1$ and $X_2$ about $Y$, which cannot easily be captured with classical information-theoretic quantities.
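The numbers in this example can be checked directly with a short script; the `mutual_information` helper below is an illustrative plug-in estimator written for this toy case, not part of any particular library.

```python
import numpy as np
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(A; B) in bits from a list of (a, b) samples."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * np.log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

# The four equally likely input combinations of the XOR gate.
samples = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

print(mutual_information([(x1, y) for x1, _, y in samples]))         # I(X1;Y)    = 0.0
print(mutual_information([(x2, y) for _, x2, y in samples]))         # I(X2;Y)    = 0.0
print(mutual_information([((x1, x2), y) for x1, x2, y in samples]))  # I(X1,X2;Y) = 1.0
```

The output, 0, 0 and 1 bit, reproduces the values stated above.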
Because mutual information captures non-linear and non-monotonic statistical relationships between variables, the PID framework can quantify multivariate statistical dependencies in arbitrarily complex systems far more generally than, for example, the correlation coefficient, and in particular can distinguish between different kinds of interaction between variables.
Definition
Partial information decomposition further decomposes the mutual information between the source variables $\{X_1, X_2\}$ and the target variable $Y$ as

$$I(X_1, X_2; Y) = \mathrm{Unq}(X_1; Y \setminus X_2) + \mathrm{Unq}(X_2; Y \setminus X_1) + \mathrm{Syn}(X_1, X_2; Y) + \mathrm{Red}(X_1, X_2; Y)$$

Here the individual information atoms are defined as
- $\mathrm{Unq}(X_1; Y \setminus X_2)$ is the unique information that $X_1$ has about $Y$ which is not in $X_2$ (and analogously for $\mathrm{Unq}(X_2; Y \setminus X_1)$)
- $\mathrm{Syn}(X_1, X_2; Y)$ is the synergistic information about $Y$ that arises only from the interaction of $X_1$ and $X_2$
- $\mathrm{Red}(X_1, X_2; Y)$ is the redundant information about $Y$ that is contained in both $X_1$ and $X_2$
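These atoms are not independent of the classical quantities: in the Williams-Beer framework,[1] each single-source mutual information splits into a unique and a redundant part,

$$I(X_1; Y) = \mathrm{Unq}(X_1; Y \setminus X_2) + \mathrm{Red}(X_1, X_2; Y), \qquad I(X_2; Y) = \mathrm{Unq}(X_2; Y \setminus X_1) + \mathrm{Red}(X_1, X_2; Y)$$

Together with the decomposition above, this gives three equations for four unknown atoms, which is why an additional definition of one atom (usually the redundancy) is needed to determine the others.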
There is, thus far, no universal agreement on how these terms should be defined, with different approaches that decompose information into redundant, unique, and synergistic components appearing in the literature.[1][2][3][4]
However, once an appropriate definition of redundant information has been chosen, the decomposition reduces to a Möbius inversion[5] and can be calculated using the Fast Möbius Transform[6].
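As a concrete illustration, the following sketch computes the two-source decomposition of a toy distribution using the redundancy measure $I_{\min}$ proposed by Williams and Beer[1] (the expected minimum specific information over sources). The helper names and the AND-gate distribution are illustrative choices only; other redundancy measures from the literature would yield different atoms.

```python
import numpy as np
from itertools import product

# Toy joint distribution p(x1, x2, y): an AND gate with uniform binary inputs.
# Any dictionary mapping outcomes (x1, x2, y) to probabilities would work here.
p = {(x1, x2, x1 & x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def marginal(dist, idx):
    """Marginalize the joint distribution onto the given coordinate indices."""
    out = {}
    for k, v in dist.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + v
    return out

def mi(dist, a_idx, b_idx):
    """Mutual information I(A; B) in bits between two groups of coordinates."""
    pa, pb = marginal(dist, a_idx), marginal(dist, b_idx)
    pab = marginal(dist, a_idx + b_idx)
    return sum(v * np.log2(v / (pa[k[:len(a_idx)]] * pb[k[len(a_idx):]]))
               for k, v in pab.items() if v > 0)

def specific_info(dist, y, src_idx, y_idx=(2,)):
    """Specific information I(Y=y; X) = sum_x p(x|y) * log2( p(y|x) / p(y) )."""
    py = marginal(dist, y_idx)[(y,)]
    px = marginal(dist, src_idx)
    total = 0.0
    for k, v in marginal(dist, src_idx + y_idx).items():
        if k[-1] == y and v > 0:
            total += (v / py) * np.log2((v / px[k[:-1]]) / py)
    return total

# Williams-Beer redundancy I_min: expected minimum specific information over sources.
red = sum(py * min(specific_info(p, y, (0,)), specific_info(p, y, (1,)))
          for (y,), py in marginal(p, (2,)).items())

unq1 = mi(p, (0,), (2,)) - red                    # unique information in X1
unq2 = mi(p, (1,), (2,)) - red                    # unique information in X2
syn = mi(p, (0, 1), (2,)) - unq1 - unq2 - red     # synergy closes the balance

print(f"Red = {red:.3f}, Unq1 = {unq1:.3f}, Unq2 = {unq2:.3f}, Syn = {syn:.3f}")
# For the AND gate this prints Red ≈ 0.311, Unq1 = Unq2 = 0, Syn = 0.5 bits.
```

Running the same sketch on the XOR gate from the motivation section (replace `x1 & x2` with `x1 ^ x2`) yields $\mathrm{Red} = \mathrm{Unq} = 0$ and $\mathrm{Syn} = 1$ bit, i.e. the purely synergistic case.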
Applications
Despite the lack of universal agreement, partial information decomposition has been applied to diverse fields, including climatology,[7] neuroscience,[8][9][10] sociology,[11] and machine learning.[12] Partial information decomposition has also been proposed as a possible foundation on which to build a mathematically robust definition of emergence in complex systems[13] and may be relevant to formal theories of consciousness.[14]
References
1. Williams PL, Beer RD (2010-04-14). "Nonnegative Decomposition of Multivariate Information". arXiv:1004.2515 [cs.IT].
2. "Quantifying Synergistic Information Using Intermediate Stochastic Variables". Entropy 19 (2): 85. February 2017. doi:10.3390/e19020085. ISSN 1099-4300.
3. "An operational information decomposition via synergistic disclosure". Journal of Physics A: Mathematical and Theoretical 53 (48): 485001. 2020-12-04. doi:10.1088/1751-8121/abb723. ISSN 1751-8113. Bibcode: 2020JPhA...53V5001R.
4. "A Novel Approach to the Partial Information Decomposition". Entropy 24 (3): 403. March 2022. doi:10.3390/e24030403. PMID 35327914. Bibcode: 2022Entrp..24..403K.
5. "Mereological approach to higher-order structure in complex systems: From macro to micro with Möbius". Physical Review Research 7 (2). 2025. doi:10.1103/PhysRevResearch.7.023016. Bibcode: 2025PhRvR...7b3016J.
6. "Fast Möbius transform: An algebraic approach to information decomposition". Physical Review Research 7 (3). 2025. doi:10.1103/PhysRevResearch.7.033049.
7. "Debates—Does Information Theory Provide a New Paradigm for Earth Science? Causality, Interaction, and Feedback". Water Resources Research 56 (2). February 2020. doi:10.1029/2019WR024940. ISSN 0043-1397. Bibcode: 2020WRR....5624940G.
8. "Revealing the Dynamics of Neural Information Processing with Multivariate Information Decomposition". Entropy 24 (7): 930. July 2022. doi:10.3390/e24070930. PMID 35885153. Bibcode: 2022Entrp..24..930N.
9. "A synergistic core for human brain evolution and cognition". Nature Neuroscience 25 (6): 771–782. June 2022. doi:10.1038/s41593-022-01070-0. PMID 35618951.
10. "Partial information decomposition as a unified approach to the specification of neural goal functions". Brain and Cognition 112: 25–38. March 2017. doi:10.1016/j.bandc.2015.09.004. PMID 26475739.
11. "Untangling Synergistic Effects of Intersecting Social Identities with Partial Information Decomposition". Entropy 24 (10): 1387. October 2022. doi:10.3390/e24101387. ISSN 1099-4300. PMID 37420406. Bibcode: 2022Entrp..24.1387V.
12. "The Partial Information Decomposition of Generative Neural Network Models". Entropy 19 (9): 474. September 2017. doi:10.3390/e19090474. ISSN 1099-4300. Bibcode: 2017Entrp..19..474T.
13. "Greater than the parts: a review of the information decomposition approach to causal emergence". Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences 380 (2227). July 2022. doi:10.1098/rsta.2021.0246. PMID 35599558.
14. "What it is like to be a bit: an integrated information decomposition account of emergent mental phenomena". Neuroscience of Consciousness 2021 (2). 2021. doi:10.1093/nc/niab027. PMID 34804593.
