Infomax

Infomax is an optimization principle for artificial neural networks and other information processing systems. It prescribes that a function that maps a set of input values I to a set of output values O should be chosen or learned so as to maximize the average Shannon mutual information between I and O, subject to a set of specified constraints and/or noise processes. Infomax algorithms are learning algorithms that perform this optimization process. The principle was described by Linsker in 1988.[1] Infomax, in its zero-noise limit, is related to the principle of redundancy reduction proposed for biological sensory processing by Horace Barlow in 1961,[2] and applied quantitatively to retinal processing by Atick and Redlich.[3]
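
In a common formulation, the output is Y = f(X) corrupted by a fixed noise process, and the mutual information decomposes as

    I(X; Y) = H(Y) - H(Y | X).

When the conditional entropy H(Y | X) is determined by the noise alone and does not depend on the parameters of the mapping f, maximizing I(X; Y) reduces to maximizing the output entropy H(Y). This reduction, stated here in standard notation rather than the notation of the cited papers, underlies the entropy-maximizing algorithms discussed below.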

One application of infomax is an independent component analysis (ICA) algorithm that finds statistically independent source signals by maximizing the joint entropy of a network's nonlinearly transformed outputs. Infomax-based ICA was described by Bell and Sejnowski in 1995,[4] and by Nadal and Parga.[5]
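
The gradient-ascent rule of Bell and Sejnowski, in the natural-gradient form later given by Amari and colleagues, can be stated concretely. The sketch below is a minimal illustration rather than the authors' original code: the Laplacian toy sources, mixing matrix, learning rate, and batch size are assumptions chosen for the demonstration, and the logistic nonlinearity limits it to super-Gaussian sources.

    # Minimal infomax ICA sketch: Bell-Sejnowski entropy-maximization
    # objective with the natural-gradient update
    #   Delta W = lr * (I + (1 - 2*y) u^T) W,  y = logistic(u), u = W x.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: two super-Gaussian (Laplacian) sources, linearly mixed.
    n = 5000
    S = rng.laplace(size=(2, n))            # unknown sources
    A = np.array([[1.0, 0.6],
                  [0.5, 1.0]])              # unknown mixing matrix
    X = A @ S                               # observed mixtures
    X -= X.mean(axis=1, keepdims=True)      # center the observations

    W = np.eye(2)                           # unmixing matrix to be learned
    lr, batch = 0.01, 100

    for epoch in range(200):
        for i in range(0, n, batch):
            x = X[:, i:i + batch]
            u = W @ x                       # current source estimates
            y = 1.0 / (1.0 + np.exp(-u))    # logistic output units
            # Ascend the joint output entropy H(y); at the maximum the
            # components of u are as independent as possible.
            W += lr * (np.eye(2) + (1.0 - 2.0 * y) @ u.T / batch) @ W

    U = W @ X                               # recovered sources, up to
                                            # permutation and scaling

After convergence, the product W A is approximately a scaled permutation matrix, the usual success criterion for blind source separation.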

References

  1. Linsker R (1988). "Self-organization in a perceptual network". IEEE Computer 21 (3): 105–17. doi:10.1109/2.36. 
  2. Barlow H (1961). "Possible principles underlying the transformations of sensory messages". in Rosenblith, W. Sensory Communication. Cambridge, MA: MIT Press. pp. 217–234. 
  3. Atick JJ, Redlich AN (1992). "What does the retina know about natural scenes?". Neural Computation 4 (2): 196–210. doi:10.1162/neco.1992.4.2.196. 
  4. Bell AJ, Sejnowski TJ (November 1995). "An information-maximization approach to blind separation and blind deconvolution". Neural Comput 7 (6): 1129–59. doi:10.1162/neco.1995.7.6.1129. PMID 7584893. 
  5. Nadal JP, Parga N (1999). "Sensory coding: information maximization and redundancy reduction". in Burdet, G.; Combe, P.; Parodi, O. Neural Information Processing. World Scientific Series in Mathematical Biology and Medicine. 7. Singapore: World Scientific. pp. 164–171. http://www.lps.ens.fr/~nadal/documents/proceedings/carg97/carg97.html.