Network entropy

From HandWiki

In network science, network entropy is a disorder measure derived from information theory that describes the level of randomness and the amount of information encoded in a graph.[1] It is a relevant metric for quantitatively characterizing real complex networks and can also be used to quantify network complexity.[1][2]

Formulations

According to a publication from Entropy, a journal published by MDPI, there are several formulations for measuring network entropy and, as a rule, each requires a particular property of the graph to be singled out, such as the adjacency matrix, degree sequence, degree distribution or number of bifurcations, which may lead to values of entropy that are not invariant under the chosen network description.[3]

Degree Distribution Shannon Entropy

The Shannon entropy can be measured for the network degree probability distribution as an average measure of the heterogeneity of the network.

[math]\displaystyle{ \mathcal{H} = - \sum_{k=1}^{N - 1} P(k) \ln{P(k)} }[/math]
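As a minimal sketch (not taken from the cited sources), this entropy can be estimated from a degree sequence by building the empirical degree distribution [math]\displaystyle{ P(k) }[/math]; the example star graph is an arbitrary illustration:

```python
import numpy as np

def degree_distribution_entropy(degrees):
    """Shannon entropy of the empirical degree distribution P(k)."""
    degrees = np.asarray(degrees)
    # Empirical probability of each distinct degree value.
    _, counts = np.unique(degrees, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

# Example: a star graph on 5 nodes (one hub of degree 4, four leaves of degree 1),
# so P(k) takes the values {4/5, 1/5}.
h_star = degree_distribution_entropy([4, 1, 1, 1, 1])

# A regular graph has a single degree value, hence zero entropy.
h_regular = degree_distribution_entropy([2, 2, 2, 2])  # -> 0.0
```

A homogeneous (regular) network thus has vanishing degree-distribution entropy, while heterogeneous degree sequences yield positive values.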

This formulation has limited use with regard to complexity, information content, causation and temporal information. Algorithmic complexity, by contrast, can characterize any general or universal property of a graph or network, and it has been proven that graphs with low entropy have low algorithmic complexity, because the statistical regularities found in a graph help computer programs to recreate it. The converse does not hold for high-entropy networks, which may take any value of algorithmic complexity.[3]

Random Walker Shannon Entropy

Due to the limits of the previous formulation, it is possible to take a different approach while still using the original Shannon entropy equation.

Consider a random walker that travels around the graph, going from a node [math]\displaystyle{ i }[/math] to any node [math]\displaystyle{ j }[/math] adjacent to [math]\displaystyle{ i }[/math] with equal probability. The probability distribution [math]\displaystyle{ p_{ij} }[/math] that describes the behavior of this random walker would thus be

[math]\displaystyle{ p_{ij} = \begin{cases} \frac{1}{k_i}, & \text{if } A_{ij} = 1 \\ 0, & \text{if } A_{ij} = 0 \\ \end{cases} }[/math],

where [math]\displaystyle{ (A_{ij}) }[/math] is the graph adjacency matrix and [math]\displaystyle{ k_i }[/math] is the node [math]\displaystyle{ i }[/math] degree.

From that, the Shannon entropy from each node [math]\displaystyle{ \mathcal{S}_i }[/math] can be defined as

[math]\displaystyle{ \mathcal{S}_i = - \sum_{j = 1}^{N - 1} p_{ij} \ln{p_{ij}} = \ln{k_i} }[/math]

and, since [math]\displaystyle{ max(k_i) = N - 1 }[/math], the normalized node entropy [math]\displaystyle{ \mathcal{H}_i }[/math] is calculated as

[math]\displaystyle{ \mathcal{H}_i = \frac{\mathcal{S}_i}{max(\mathcal{S}_i)} = \frac{\ln{k_i}}{\ln(max(k_i))} = \frac{\ln{k_i}}{\ln(N - 1)} }[/math]

This leads to a normalized network entropy [math]\displaystyle{ \mathcal{H} }[/math], calculated by averaging the normalized node entropy over the whole network[4]

[math]\displaystyle{ \mathcal{H} = \frac{1}{N} \sum_{i = 1}^N \mathcal{H}_i = \frac{1}{N \ln(N - 1)} \sum_{i = 1}^N \ln{k_i} }[/math]

The normalized network entropy is maximal, [math]\displaystyle{ \mathcal{H} = 1 }[/math], when the network is fully connected and decreases toward [math]\displaystyle{ \mathcal{H} = 0 }[/math] as the network becomes sparser. Notice that isolated nodes ([math]\displaystyle{ k_i = 0 }[/math]) do not have their probability [math]\displaystyle{ p_{ij} }[/math] defined and are therefore not considered when measuring the network entropy. This formulation of network entropy has low sensitivity to hubs, due to the logarithmic factor, and is more meaningful for weighted networks,[4] which ultimately makes it hard to differentiate scale-free networks using this measure alone.[2]
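A short sketch of this computation (an illustrative implementation, not from the cited sources) takes an adjacency matrix, skips isolated nodes as described above, and averages the normalized node entropies:

```python
import numpy as np

def normalized_network_entropy(A):
    """Normalized random-walker entropy: average of ln(k_i) / ln(N - 1)
    over the nodes, excluding isolated nodes (k_i = 0) as in the text."""
    A = np.asarray(A, dtype=float)
    N = A.shape[0]
    k = A.sum(axis=1)       # node degrees
    k = k[k > 0]            # isolated nodes are not considered
    return float(np.mean(np.log(k)) / np.log(N - 1))

# Fully connected graph on 4 nodes: every k_i = N - 1, so H = 1 (maximal).
K4 = np.ones((4, 4)) - np.eye(4)
h_full = normalized_network_entropy(K4)   # -> 1.0

# A sparser graph (path on 4 nodes) gives a lower value.
P4 = np.array([[0, 1, 0, 0],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [0, 0, 1, 0]])
h_path = normalized_network_entropy(P4)
```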

Random Walker Kolmogorov–Sinai Entropy

The limitations of the random walker Shannon entropy can be overcome by adapting it to use a Kolmogorov–Sinai entropy. In this context, network entropy is the entropy of a stochastic matrix associated with the graph adjacency matrix [math]\displaystyle{ (A_{ij}) }[/math], and the random walker Shannon entropy is called the dynamic entropy of the network. From that, let [math]\displaystyle{ \lambda }[/math] be the dominant eigenvalue of [math]\displaystyle{ (A_{ij}) }[/math]. It has been proven that [math]\displaystyle{ \ln \lambda }[/math] satisfies a variational principle[5] and is equivalent to the dynamic entropy for unweighted networks, i.e., networks whose adjacency matrix consists exclusively of Boolean values. The topological entropy is therefore defined as

[math]\displaystyle{ \mathcal{H} = \ln \lambda }[/math]
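As a sketch (an illustrative implementation, not from the cited sources), the topological entropy follows directly from the dominant eigenvalue of the adjacency matrix:

```python
import numpy as np

def topological_entropy(A):
    """H = ln(lambda), with lambda the dominant (Perron) eigenvalue of the
    adjacency matrix.  A is assumed symmetric and unweighted."""
    A = np.asarray(A, dtype=float)
    lam = np.max(np.linalg.eigvalsh(A))  # eigvalsh: symmetric matrices
    return float(np.log(lam))

# Complete graph on N nodes: the dominant eigenvalue is N - 1, so H = ln(N - 1).
N = 5
K = np.ones((N, N)) - np.eye(N)
h = topological_entropy(K)   # -> ln 4
```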

This formulation is important for the study of network robustness, i.e., the capacity of the network to withstand random structural changes. Robustness is difficult to measure numerically, whereas this entropy can be easily calculated for any network, which is especially important in the context of non-stationary networks. The entropic fluctuation theorem shows that this entropy is positively correlated with robustness, and hence with a greater insensitivity of an observable to dynamic or structural perturbations of the network. Moreover, the eigenvalues are inherently related to the multiplicity of internal pathways, leading to a negative correlation between the topological entropy and the shortest average path length.[6]

Beyond that, the Kolmogorov entropy is related to the Ricci curvature of the network,[7] a metric that has been used to differentiate stages of cancer from gene co-expression networks,[8] as well as to give hallmarks of financial crashes from stock correlation networks.[9]

Von Neumann entropy

The von Neumann entropy is the extension of the classical Gibbs entropy to a quantum context. This entropy is constructed from a density matrix [math]\displaystyle{ \rho }[/math]: historically, the first proposed candidate for such a density matrix was an expression of the Laplacian matrix L associated with the network. The average von Neumann entropy of an ensemble is calculated as:[10]

[math]\displaystyle{ {S}_{VN} = -\langle\mathrm{Tr}\rho\log(\rho)\rangle }[/math]

For the random network ensemble [math]\displaystyle{ G(N,p) }[/math], the relation between [math]\displaystyle{ S_{VN} }[/math] and [math]\displaystyle{ S }[/math] is non-monotonic when the average connectivity [math]\displaystyle{ p(N-1) }[/math] is varied.

For canonical power-law network ensembles, the two entropies are linearly related.[11]

[math]\displaystyle{ {S}_{VN} = \eta {S/N} + \beta }[/math]

Results for networks with given expected degree sequences suggest that heterogeneity in the expected degree distribution implies an equivalence between a quantum and a classical description of networks, corresponding respectively to the von Neumann and the Shannon entropy.[12]

This definition of the von Neumann entropy can also be extended to multilayer networks with a tensorial approach[13] and has been used successfully to reduce their dimensionality from a structural point of view.[14]

However, it has been shown that this definition of entropy does not satisfy the property of sub-additivity (see von Neumann entropy's subadditivity), which is expected to hold theoretically. A more grounded definition satisfying this fundamental property has been introduced by De Domenico and Biamonte[15] as a quantum-like Gibbs state

[math]\displaystyle{ \rho(\beta)=\frac{e^{-\beta L}}{Z(\beta)} }[/math]

where [math]\displaystyle{ Z(\beta)=Tr[e^{-\beta L}] }[/math] is a normalizing factor that plays the role of the partition function, and [math]\displaystyle{ \beta }[/math] is a tunable parameter that allows multi-resolution analysis. If [math]\displaystyle{ \beta }[/math] is interpreted as a temporal parameter, this density matrix is formally proportional to the propagator of a diffusive process on top of the network.
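A minimal sketch of this spectral entropy (an illustrative implementation, not from the cited sources): since the Laplacian [math]\displaystyle{ L }[/math] is symmetric, the eigenvalues of [math]\displaystyle{ \rho(\beta) }[/math] are [math]\displaystyle{ e^{-\beta\mu_i}/Z }[/math] for Laplacian eigenvalues [math]\displaystyle{ \mu_i }[/math], so the von Neumann entropy reduces to a Shannon entropy over that spectrum:

```python
import numpy as np

def spectral_entropy(A, beta):
    """Von Neumann entropy of rho(beta) = exp(-beta * L) / Z, with
    L = D - A the graph Laplacian, computed through the Laplacian spectrum."""
    A = np.asarray(A, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    mu = np.linalg.eigvalsh(L)          # Laplacian eigenvalues
    w = np.exp(-beta * mu)
    p = w / w.sum()                     # eigenvalues of the density matrix
    return float(-np.sum(p * np.log(p)))

# Triangle graph: Laplacian eigenvalues are {0, 3, 3}.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
S = spectral_entropy(A, beta=1.0)
```

In the limit [math]\displaystyle{ \beta \to 0 }[/math] the density matrix becomes maximally mixed and the entropy approaches [math]\displaystyle{ \ln N }[/math]; increasing [math]\displaystyle{ \beta }[/math] probes the network at coarser diffusion scales.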

This feature has been used to build a statistical field theory of complex information dynamics, where the density matrix can be interpreted in terms of the superposition of stream operators whose action is to activate information flows among nodes.[16] The framework has been successfully applied to analyze the protein-protein interaction networks of virus-human interactomes, including that of SARS-CoV-2, to unravel the systemic features of its infection at microscopic, mesoscopic and macroscopic scales,[17] as well as to assess the importance of nodes for integrating information flows within the network and the role they play in network robustness.[18]

This approach has been generalized to deal with other types of dynamics, such as random walks, on top of multilayer networks, providing an effective way to reduce the dimensionality of such systems without altering their structure.[19] Using both classical and maximum-entropy random walks, the corresponding density matrices have been used to encode the network states of the human brain and to assess, at multiple scales, the connectome's information capacity at different stages of dementia.[20]

Maximum Entropy Principle

The maximum entropy principle is a variational principle stating that the probability distribution best representing the current state of a system is the one which maximizes the Shannon entropy.[21] This concept can be used to generate an ensemble of random graphs with given structural properties derived from the maximum entropy approach, which in turn describes the most probable network configuration: the maximum entropy principle allows for maximally unbiased inference when complete knowledge is lacking (the microscopic configuration is not accessible, e.g. the adjacency matrix is unknown). On the other hand, this ensemble serves as a null model when the actual microscopic configuration of the network is known, allowing one to assess the significance of empirical patterns found in the network.[22]

Complex Network Ensembles

It is possible to extend the network entropy formulations to instead measure the ensemble entropy. Let [math]\displaystyle{ \Omega }[/math] be the ensemble of all graphs [math]\displaystyle{ G }[/math] with the same number [math]\displaystyle{ N }[/math] of nodes and type of links, and let [math]\displaystyle{ P(G) }[/math] be the occurrence probability of a graph [math]\displaystyle{ G \in \Omega }[/math]. From the maximum entropy principle, the goal is to find the [math]\displaystyle{ P(G) }[/math] that maximizes the Shannon entropy

[math]\displaystyle{ \mathcal{H} = - \sum_{G \in \Omega} P(G) \ln P(G) }[/math]

Moreover, constraints must be imposed in order to draw conclusions. Soft constraints (constraints set over the whole ensemble [math]\displaystyle{ \Omega }[/math]) lead to the canonical ensemble, whereas hard constraints (constraints set on each individual graph [math]\displaystyle{ G }[/math]) define the microcanonical ensemble. In the end, these ensembles lead to models that can validate a range of network patterns and even reconstruct graphs from limited knowledge, showing that maximum entropy models based on local constraints can quantify the relevance of a set of observed features and extract meaningful information from huge data streams in real time and from high-dimensional noisy data.[22]
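As a concrete sketch of the canonical (soft-constrained) case: with a single constraint fixing the expected number of links, the maximum-entropy solution is the Erdős–Rényi ensemble [math]\displaystyle{ G(N,p) }[/math], whose links are independent, so the ensemble entropy is a sum of per-link Bernoulli entropies. This illustrative implementation is not from the cited sources:

```python
import numpy as np

def er_canonical_entropy(N, L_obs):
    """Shannon entropy of the canonical ensemble constraining only the
    expected number of links to L_obs: the Erdos-Renyi ensemble G(N, p)
    with p = L_obs / binom(N, 2).  Since links are independent, the
    ensemble entropy is binom(N, 2) times the per-link Bernoulli entropy."""
    pairs = N * (N - 1) // 2
    p = L_obs / pairs
    return float(pairs * (-p * np.log(p) - (1 - p) * np.log(1 - p)))

# Example: 10 nodes with 15 expected links -> p = 1/3.
H = er_canonical_entropy(10, 15)
```

Ensembles with more informative constraints (e.g. a full expected degree sequence) have lower entropy; the entropy drop quantifies how much the imposed features constrain the network.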

See also Entropy of network ensembles.

References

  1. 1.0 1.1 Anand, Kartik; Krioukov, Dmitri; Bianconi, Ginestra (2014). "Entropy distribution and condensation in random networks with a given degree distribution". Physical Review E 89 (6): 062807. doi:10.1103/PhysRevE.89.062807. PMID 25019833. Bibcode2014PhRvE..89f2807A. https://journals.aps.org/pre/abstract/10.1103/PhysRevE.89.062807. 
  2. 2.0 2.1 Freitas, Cristopher GS; Aquino, Andre LL; Ramos, Heitor S; Frery, Alejandro C; Rosso, Osvaldo A (2019). "A detailed characterization of complex networks using Information Theory". Scientific Reports 9 (1): 16689. doi:10.1038/s41598-019-53167-5. PMID 31723172. Bibcode2019NatSR...916689F. 
  3. 3.0 3.1 Zenil, Hector; Kiani, Narsis A; Tegnér, Jesper (2018). "A review of graph and network complexity from an algorithmic information perspective". Entropy 20 (8): 551. doi:10.3390/e20080551. PMID 33265640. Bibcode2018Entrp..20..551Z. 
  4. 4.0 4.1 Small, Michael (2013). "Complex networks from time series: Capturing dynamics". 2013 IEEE International Symposium on Circuits and Systems (ISCAS2013). pp. 2509–2512. doi:10.1109/ISCAS.2013.6572389. ISBN 978-1-4673-5762-3. https://ieeexplore.ieee.org/document/6572389. 
  5. Arnold, Ludwig; Gundlach, Volker Matthias; Demetrius, Lloyd (1994). "Evolutionary formalism for products of positive random matrices". The Annals of Applied Probability 4 (3): 859–901. doi:10.1214/aoap/1177004975. https://www.jstor.org/stable/2245067. 
  6. Demetrius, Lloyd; Manke, Thomas (2005). "Robustness and network evolution—an entropic principle". Physica A: Statistical Mechanics and Its Applications 346 (3): 682–696. doi:10.1016/j.physa.2004.07.011. Bibcode2005PhyA..346..682D. https://www.sciencedirect.com/science/article/abs/pii/S0378437104009975. 
  7. Lott, J.; Villani, C. (2009). "Ricci curvature for metric-measure spaces via optimal transport". Annals of Mathematics 169 (3): 903–991. doi:10.4007/annals.2009.169.903. 
  8. Sandhu, R.; Georgiou, T.; Reznik, E.; Zhu, L.; Kolesov, I.; Senbabaoglu, Y.; Tannenbaum, A. (2015). "Graph curvature for differentiating cancer networks". Scientific Reports 5: 12323. doi:10.1038/srep12323. PMID 26169480. Bibcode2015NatSR...512323S. 
  9. Sandhu, Romeil S; Georgiou, Tryphon T; Tannenbaum, Allen R (2016). "Ricci curvature: An economic indicator for market fragility and systemic risk". Science Advances 2 (5): e1501495. doi:10.1126/sciadv.1501495. PMID 27386522. Bibcode2016SciA....2E1495S. 
  10. Du, Wenxue; Li, Xueliang; Li, Yiyang; Severini, Simone (30 December 2010). "A note on the von Neumann entropy of random graphs" (in en). Linear Algebra and Its Applications 433 (11): 1722–1725. doi:10.1016/j.laa.2010.06.040. ISSN 0024-3795. 
  11. Anand, Kartik; Bianconi, Ginestra (13 October 2009). "Entropy measures for networks: Toward an information theory of complex topologies". Physical Review E 80 (4): 045102. doi:10.1103/PhysRevE.80.045102. PMID 19905379. Bibcode2009PhRvE..80d5102A. 
  12. Anand, Kartik; Bianconi, Ginestra; Severini, Simone (18 March 2011). "Shannon and von Neumann entropy of random networks with heterogeneous expected degree". Physical Review E 83 (3): 036109. doi:10.1103/PhysRevE.83.036109. PMID 21517560. Bibcode2011PhRvE..83c6109A. 
  13. De Domenico, Manlio; Solé-Ribalta, Albert; Cozzo, Emanuele; Kivelä, Mikko; Moreno, Yamir; Porter, Mason A.; Gómez, Sergio; Arenas, Alex (4 December 2013). "Mathematical Formulation of Multilayer Networks". Physical Review X 3 (4): 041022. doi:10.1103/PhysRevX.3.041022. Bibcode2013PhRvX...3d1022D. 
  14. De Domenico, Manlio; Nicosia, Vincenzo; Arenas, Alex; Latora, Vito (23 April 2015). "Structural reducibility of multilayer networks". Nature Communications 6: 6864. doi:10.1038/ncomms7864. PMID 25904309. Bibcode2015NatCo...6.6864D. http://deim.urv.cat/%7Ealephsys/papers/reducibility.pdf. 
  15. De Domenico, Manlio; Biamonte, Jacob (21 December 2016). "Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison". Physical Review X 6 (4): 041062. doi:10.1103/PhysRevX.6.041062. Bibcode2016PhRvX...6d1062D. 
  16. Ghavasieh, Arsham; Nicolini, Carlo; De Domenico, Manlio (10 November 2020). "Statistical physics of complex information dynamics". Physical Review E 102 (5): 052304. doi:10.1103/PhysRevE.102.052304. PMID 33327131. Bibcode2020PhRvE.102e2304G. 
  17. Ghavasieh, Arsham; Bontorin, Sebastiano; Artime, Oriol; Verstraete, Nina; De Domenico, Manlio (23 April 2021). "Multiscale statistical physics of the pan-viral interactome unravels the systemic nature of SARS-CoV-2 infections". Communications Physics 4 (1): 83. doi:10.1038/s42005-021-00582-8. Bibcode2021CmPhy...4...83G. 
  18. Ghavasieh, Arsham; Stella, Massimo; Biamonte, Jacob; De Domenico, Manlio (10 June 2021). "Unraveling the effects of multiscale network entanglement on empirical systems". Communications Physics 4 (1): 129. doi:10.1038/s42005-021-00633-0. Bibcode2021CmPhy...4..129G. 
  19. Ghavasieh, Arsham; De Domenico, Manlio (13 February 2020). "Enhancing transport properties in interconnected systems without altering their structure". Physical Review Research 2 (1): 13–15. doi:10.1103/PhysRevResearch.2.013155. Bibcode2020PhRvR...2a3155G. 
  20. Benigni, Barbara; Ghavasieh, Arsham; Corso, Alessandra; D'Andrea, Valeria; De Domenico, Manlio (22 June 2021). "Persistence of information flow: a multiscale characterization of human brain". Network Neuroscience 5 (3): 831–850. doi:10.1162/netn_a_00203. PMID 34746629. 
  21. Jaynes, E. T. (1957). "Information Theory and Statistical Mechanics". Physical Review. Series II 106 (4): 620–630. doi:10.1103/PhysRev.106.620. Bibcode1957PhRv..106..620J. http://bayes.wustl.edu/etj/articles/theory.1.pdf. 
  22. 22.0 22.1 Cimini, Giulio; Squartini, Tiziano; Saracco, Fabio; Garlaschelli, Diego; Gabrielli, Andrea; Caldarelli, Guido (2019). "The statistical physics of real-world networks". Nature Reviews Physics 1 (1): 58–71. doi:10.1038/s42254-018-0002-6. Bibcode2019NatRP...1...58C.