Entropy of network ensembles
A set of networks that satisfies given structural characteristics can be treated as a network ensemble.[1] Introduced by Ginestra Bianconi in 2007, the entropy of a network ensemble measures the level of order or uncertainty of the ensemble.[2]
The entropy of a network ensemble is the logarithm of the number of graphs in the ensemble.[3] Entropy can also be defined for a single network; for example, the basin entropy of a Boolean network is the logarithm of the number of its attractors.[4]
Employing approaches from statistical mechanics, the complexity, uncertainty, and randomness of networks can be described by network ensembles subject to different types of constraints.[5]
Gibbs and Shannon entropy
By analogy with statistical mechanics, microcanonical and canonical ensembles of networks are introduced to implement this description. A partition function Z of an ensemble can be defined as: [math]\displaystyle{ Z = \sum_{\mathbf{a}} \delta \left[\vec{F}(\mathbf{a})-\vec{C}\right] \exp\left(\sum_{ij}\left[h_{ij}\Theta(a_{ij}) + r_{ij}a_{ij}\right]\right) }[/math]
where [math]\displaystyle{ \vec{F}(\mathbf{a})=\vec{C} }[/math] is the constraint, and [math]\displaystyle{ a_{ij} }[/math] ([math]\displaystyle{ a_{ij} \geq 0 }[/math]) are the elements of the adjacency matrix, with [math]\displaystyle{ a_{ij} \gt 0 }[/math] if and only if there is a link between nodes i and j. [math]\displaystyle{ \Theta(x) }[/math] is the step function, with [math]\displaystyle{ \Theta(x) = 1 }[/math] if [math]\displaystyle{ x \gt 0 }[/math] and [math]\displaystyle{ \Theta(x) = 0 }[/math] if [math]\displaystyle{ x = 0 }[/math]. The auxiliary fields [math]\displaystyle{ h_{ij} }[/math] and [math]\displaystyle{ r_{ij} }[/math] are introduced by analogy with the bath in classical statistical mechanics.
For simple undirected networks, the partition function can be simplified as[6]
[math]\displaystyle{ Z = \sum_{\{a_{ij}\}} \prod_{k}\delta(\textrm{constraint}_{k}(\{a_{ij}\})) \exp\left(\sum_{i\lt j}\sum_{\alpha}h_{ij}(\alpha)\delta_{a_{ij},\alpha}\right) }[/math]
where [math]\displaystyle{ \alpha }[/math] indexes the possible values of the weight [math]\displaystyle{ a_{ij} }[/math]; for a simple network [math]\displaystyle{ \alpha\in\{0,1\} }[/math].
Microcanonical and canonical ensembles are demonstrated below for simple undirected networks.
For a microcanonical ensemble, the Gibbs entropy [math]\displaystyle{ \Sigma }[/math] is defined by:
[math]\displaystyle{ \begin{align} \Sigma &= \frac{1}{N} \log\mathcal{N} \\ &= \frac{1}{N} \log Z|_{h_{ij}(\alpha)=0\forall(i,j,\alpha)} \end{align} }[/math]
where [math]\displaystyle{ \mathcal{N} }[/math] indicates the cardinality of the ensemble, i.e., the total number of networks in the ensemble.
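As a concrete illustration, the cardinality [math]\displaystyle{ \mathcal{N} }[/math] and the Gibbs entropy of the ensemble of simple undirected graphs with N nodes and exactly L links can be computed by brute-force enumeration. The following Python sketch is illustrative only (the function name and enumeration strategy are not from the cited references); it also verifies the closed-form binomial count used in the next subsection.

```python
import math
from itertools import combinations

def gibbs_entropy(n_nodes: int, n_links: int) -> float:
    """Sigma = (1/N) * log(cardinality) for the microcanonical ensemble G(N, L)."""
    pairs = list(combinations(range(n_nodes), 2))         # all possible links
    # Each member of the ensemble places exactly L links among the
    # N(N-1)/2 possible pairs, so the cardinality is the number of choices.
    cardinality = sum(1 for _ in combinations(pairs, n_links))
    assert cardinality == math.comb(len(pairs), n_links)  # closed-form check
    return math.log(cardinality) / n_nodes

print(gibbs_entropy(5, 4))   # (1/5) * log C(10, 4) = log(210) / 5 ≈ 1.07
```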
The probability of having a link of weight [math]\displaystyle{ \alpha }[/math] between nodes i and j is given by:
[math]\displaystyle{ \pi_{ij}(\alpha) = \frac{\partial \log Z}{\partial h_{ij}(\alpha)} }[/math]
For a canonical ensemble, the entropy takes the form of a Shannon entropy:
[math]\displaystyle{ {S}=-\sum_{i\lt j}\sum_{\alpha} \pi_{ij}(\alpha) \log \pi_{ij}(\alpha) }[/math]
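For a simple network whose links are independent, with [math]\displaystyle{ \pi_{ij}(1)=\pi_{ij} }[/math] and [math]\displaystyle{ \pi_{ij}(0)=1-\pi_{ij} }[/math], the sum over [math]\displaystyle{ \alpha\in\{0,1\} }[/math] reduces to a binary entropy per node pair. A minimal sketch, assuming this independent-link form (the function name is illustrative):

```python
import math

def shannon_entropy(link_probs) -> float:
    """S = -sum_{i<j} sum_alpha pi_ij(alpha) * log pi_ij(alpha): each node
    pair contributes the binary entropy of its link probability."""
    s = 0.0
    for p in link_probs:             # one probability pi_ij per pair i < j
        if 0.0 < p < 1.0:            # pairs with p in {0, 1} contribute zero
            s -= p * math.log(p) + (1.0 - p) * math.log(1.0 - p)
    return s

# G(N, p): all N(N-1)/2 pairs share the same link probability p.
n, p = 100, 0.1
print(shannon_entropy([p] * (n * (n - 1) // 2)))   # equals M * H(p), M = 4950
```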
Relation between Gibbs and Shannon entropy
The network ensemble [math]\displaystyle{ G(N,L) }[/math] with a given number of nodes [math]\displaystyle{ N }[/math] and links [math]\displaystyle{ L }[/math], and its conjugate canonical ensemble [math]\displaystyle{ G(N,p) }[/math], are characterized as microcanonical and canonical ensembles, with Gibbs entropy [math]\displaystyle{ \Sigma }[/math] and Shannon entropy S, respectively. The Gibbs entropy of the [math]\displaystyle{ G(N,L) }[/math] ensemble is given by:[7]
[math]\displaystyle{ N\Sigma = \log\binom{\frac{N(N-1)}{2}}{L} }[/math]
For the [math]\displaystyle{ G(N,p) }[/math] ensemble, the probability of each link is
[math]\displaystyle{ {p}_{ij} = p = \cfrac{2L}{N(N-1)} }[/math]
Inserting [math]\displaystyle{ p_{ij} }[/math] into the Shannon entropy and comparing it with the Gibbs entropy gives:[6]
[math]\displaystyle{ \Sigma = S/N+\cfrac{1}{2N}\left[\log\left( \cfrac{N(N-1)}{2L} \right) - \log\left(\cfrac{N(N-1)}{2}-L\right)\right] }[/math]
The relation indicates that the Gibbs entropy [math]\displaystyle{ \Sigma }[/math] and the Shannon entropy per node S/N of random graphs are equal in the thermodynamic limit [math]\displaystyle{ N\to\infty }[/math].
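This convergence can be checked numerically. In the sketch below (illustrative; the log-binomial is evaluated through the log-gamma function to avoid overflow for large N), the microcanonical entropy [math]\displaystyle{ \Sigma }[/math] and the canonical entropy per node S/N are compared for conjugate ensembles at fixed link density:

```python
import math

def log_binom(m: int, k: int) -> float:
    """log C(m, k), evaluated via log-gamma to avoid huge intermediates."""
    return math.lgamma(m + 1) - math.lgamma(k + 1) - math.lgamma(m - k + 1)

p = 0.1
for n in (10, 100, 1000):
    m = n * (n - 1) // 2                 # number of possible links
    l = round(p * m)                     # links of the conjugate G(N, L)
    gibbs = log_binom(m, l) / n          # Sigma = (1/N) * log C(M, L)
    q = l / m                            # realized link density
    shannon = -m * (q * math.log(q) + (1 - q) * math.log(1 - q)) / n   # S/N
    print(f"N={n}: Sigma={gibbs:.4f}, S/N={shannon:.4f}")
```

The gap between the two values shrinks as [math]\displaystyle{ O(\log N/N) }[/math], consistent with the relation above.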
Von Neumann entropy
Von Neumann entropy is the extension of the classical Gibbs entropy to the quantum context. This entropy is constructed from a density matrix [math]\displaystyle{ \rho }[/math]; historically, the first candidate proposed for such a density matrix was the Laplacian matrix L of the network, rescaled to unit trace. The average von Neumann entropy of an ensemble is calculated as:[8]
[math]\displaystyle{ {S}_{VN} = -\langle\mathrm{Tr}\rho\log(\rho)\rangle }[/math]
For the random network ensemble [math]\displaystyle{ G(N,p) }[/math], the relation between [math]\displaystyle{ S_{VN} }[/math] and [math]\displaystyle{ S }[/math] is nonmonotonic when the average connectivity [math]\displaystyle{ p(N-1) }[/math] is varied.
For canonical power-law network ensembles, the two entropies are instead linearly related:[6]
[math]\displaystyle{ {S}_{VN} = \eta {S/N} + \beta }[/math]
Studies of networks with given expected degree sequences suggest that heterogeneity in the expected degree distribution implies an equivalence between the quantum and the classical descriptions of networks, which correspond respectively to the von Neumann and the Shannon entropy.[9]
This definition of the von Neumann entropy can also be extended to multilayer networks with a tensorial approach[10] and has been used successfully to reduce their dimensionality from a structural point of view.[11]
However, it has been shown that this definition of entropy does not satisfy the property of sub-additivity (see the subadditivity of von Neumann entropy), which is expected to hold theoretically. A better-grounded definition, satisfying this fundamental property, has been introduced by De Domenico and Biamonte[12] as a quantum-like Gibbs state:
[math]\displaystyle{ \rho(\beta)=\frac{e^{-\beta L}}{Z(\beta)} }[/math]
where [math]\displaystyle{ Z(\beta)=\mathrm{Tr}[e^{-\beta L}] }[/math] is a normalizing factor that plays the role of the partition function, and [math]\displaystyle{ \beta }[/math] is a tunable parameter that allows multi-resolution analysis. If [math]\displaystyle{ \beta }[/math] is interpreted as a temporal parameter, this density matrix is formally proportional to the propagator of a diffusive process on the network.
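Because [math]\displaystyle{ \rho(\beta) }[/math] shares the eigenvectors of the Laplacian, its von Neumann entropy can be computed directly from the Laplacian spectrum. A minimal sketch, assuming an undirected graph and using plain NumPy (this is not the authors' reference implementation):

```python
import numpy as np

def von_neumann_entropy(adjacency: np.ndarray, beta: float) -> float:
    """S(beta) = -Tr[rho log rho] with rho = exp(-beta * L) / Z(beta)."""
    laplacian = np.diag(adjacency.sum(axis=1)) - adjacency
    eigvals = np.linalg.eigvalsh(laplacian)  # L is symmetric for undirected graphs
    weights = np.exp(-beta * eigvals)
    probs = weights / weights.sum()          # eigenvalues of rho(beta)
    probs = probs[probs > 0]                 # guard against numerical underflow
    return float(-np.sum(probs * np.log(probs)))

# Example: a 4-node path graph probed at two resolutions.
a = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(von_neumann_entropy(a, beta=0.1), von_neumann_entropy(a, beta=10.0))
```

In the large-β limit only the Laplacian zero modes survive, so the entropy tends to the logarithm of the number of connected components, while small β spreads [math]\displaystyle{ \rho(\beta) }[/math] over the whole spectrum.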
This diffusion interpretation has been used to build a statistical field theory of complex information dynamics, where the density matrix can be interpreted in terms of a superposition of stream operators whose action is to activate information flows among nodes.[13] The framework has been successfully applied to analyze the protein-protein interaction networks of virus-human interactomes, including that of SARS-CoV-2, to unravel the systemic features of infection at microscopic, mesoscopic, and macroscopic scales,[14] as well as to assess the importance of nodes for integrating information flows within the network and the role they play in network robustness.[15]
This approach has been generalized to deal with other types of dynamics, such as random walks, on multilayer networks, providing an effective way to reduce the dimensionality of such systems without altering their structure.[16] Using both classical and maximum-entropy random walks, the corresponding density matrices have been used to encode the network states of the human brain and to assess, at multiple scales, the connectome's information capacity at different stages of dementia.[17]
References
- ↑ Levin, E.; Tishby, N.; Solla, S.A. (October 1990). "A statistical approach to learning and generalization in layered neural networks". Proceedings of the IEEE 78 (10): 1568–1574. doi:10.1109/5.58339. ISSN 1558-2256.
- ↑ Bianconi, Ginestra (2008). "The entropy of randomized network ensembles" (in en). EPL (Europhysics Letters) 81 (2): 28005. doi:10.1209/0295-5075/81/28005. ISSN 0295-5075. Bibcode: 2008EL.....8128005B.
- ↑ Menichetti, Giulia; Remondini, Daniel (2014). "Entropy of a network ensemble: definitions and applications to genomic data". Theoretical Biology Forum 107 (1–2): 77–87. ISSN 0035-6050. PMID 25936214.
- ↑ Krawitz, Peter; Shmulevich, Ilya (27 September 2007). "Entropy of complex relevant components of Boolean networks". Physical Review E 76 (3): 036115. doi:10.1103/PhysRevE.76.036115. PMID 17930314. Bibcode: 2007PhRvE..76c6115K.
- ↑ Bianconi, Ginestra (27 March 2009). "Entropy of network ensembles". Physical Review E 79 (3): 036114. doi:10.1103/PhysRevE.79.036114. PMID 19392025. Bibcode: 2009PhRvE..79c6114B.
- ↑ 6.0 6.1 6.2 Anand, Kartik; Bianconi, Ginestra (13 October 2009). "Entropy measures for networks: Toward an information theory of complex topologies". Physical Review E 80 (4): 045102. doi:10.1103/PhysRevE.80.045102. PMID 19905379. Bibcode: 2009PhRvE..80d5102A.
- ↑ Bogacz, Leszek; Burda, Zdzisław; Wacław, Bartłomiej (1 July 2006). "Homogeneous complex networks" (in en). Physica A: Statistical Mechanics and Its Applications 366: 587–607. doi:10.1016/j.physa.2005.10.024. ISSN 0378-4371. Bibcode: 2006PhyA..366..587B. https://www.sciencedirect.com/science/article/pii/S0378437105011180.
- ↑ Du, Wenxue; Li, Xueliang; Li, Yiyang; Severini, Simone (30 December 2010). "A note on the von Neumann entropy of random graphs" (in en). Linear Algebra and Its Applications 433 (11): 1722–1725. doi:10.1016/j.laa.2010.06.040. ISSN 0024-3795.
- ↑ Anand, Kartik; Bianconi, Ginestra; Severini, Simone (18 March 2011). "Shannon and von Neumann entropy of random networks with heterogeneous expected degree". Physical Review E 83 (3): 036109. doi:10.1103/PhysRevE.83.036109. PMID 21517560. Bibcode: 2011PhRvE..83c6109A.
- ↑ De Domenico, Manlio; Solé-Ribalta, Albert; Cozzo, Emanuele; Kivelä, Mikko; Moreno, Yamir; Porter, Mason A.; Gómez, Sergio; Arenas, Alex (4 December 2013). "Mathematical Formulation of Multilayer Networks". Physical Review X 3 (4): 041022. doi:10.1103/PhysRevX.3.041022. Bibcode: 2013PhRvX...3d1022D.
- ↑ De Domenico, Manlio; Nicosia, Vincenzo; Arenas, Alex; Latora, Vito (23 April 2015). "Structural reducibility of multilayer networks". Nature Communications 6: 6864. doi:10.1038/ncomms7864. PMID 25904309. Bibcode: 2015NatCo...6.6864D. http://deim.urv.cat/%7Ealephsys/papers/reducibility.pdf.
- ↑ De Domenico, Manlio; Biamonte, Jacob (21 December 2016). "Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison". Physical Review X 6 (4): 041062. doi:10.1103/PhysRevX.6.041062. Bibcode: 2016PhRvX...6d1062D.
- ↑ Ghavasieh, Arsham; Nicolini, Carlo; De Domenico, Manlio (10 November 2020). "Statistical physics of complex information dynamics". Physical Review E 102 (5): 052304. doi:10.1103/PhysRevE.102.052304. PMID 33327131. Bibcode: 2020PhRvE.102e2304G.
- ↑ Ghavasieh, Arsham; Bontorin, Sebastiano; Artime, Oriol; Verstraete, Nina; De Domenico, Manlio (23 April 2021). "Multiscale statistical physics of the pan-viral interactome unravels the systemic nature of SARS-CoV-2 infections". Communications Physics 4 (1): 83. doi:10.1038/s42005-021-00582-8. Bibcode: 2021CmPhy...4...83G.
- ↑ Ghavasieh, Arsham; Stella, Massimo; Biamonte, Jacob; De Domenico, Manlio (10 June 2021). "Unraveling the effects of multiscale network entanglement on empirical systems". Communications Physics 4 (1): 129. doi:10.1038/s42005-021-00633-0. Bibcode: 2021CmPhy...4..129G.
- ↑ Ghavasieh, Arsham; De Domenico, Manlio (13 February 2020). "Enhancing transport properties in interconnected systems without altering their structure". Physical Review Research 2 (1): 13–15. doi:10.1103/PhysRevResearch.2.013155. Bibcode: 2020PhRvR...2a3155G.
- ↑ Benigni, Barbara; Ghavasieh, Arsham; Corso, Alessandra; D'Andrea, Valeria; De Domenico, Manlio (22 June 2021). "Persistence of information flow: a multiscale characterization of human brain". Network Neuroscience 5 (3): 831–850. doi:10.1162/netn_a_00203. PMID 34746629.