A Mathematical Theory of Communication
"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.[1][2][3][4] It was renamed The Mathematical Theory of Communication in the 1949 book of the same name,[5] a small but significant title change after realizing the generality of this work. It has tens of thousands of citations which is rare for a scientific article and gave rise to the field of information theory. Scientific American referred to the paper as the "Magna Carta of the Information Age".[6]
Publication
The article was the founding work of the field of information theory. It was later published in 1949 as a book titled The Mathematical Theory of Communication (ISBN 0-252-72546-8), which was published as a paperback in 1963 (ISBN 0-252-72548-4). The book contains an additional article by Warren Weaver, providing an overview of the theory for a more general audience.
Contents
[Figure: Schematic diagram of Shannon's general communication system.]
This work is known for introducing the concepts of channel capacity as well as the noisy channel coding theorem.
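In modern textbook notation (a summary sketch, not the exact notation or derivation used in the 1948 paper), the channel capacity and the statement of the noisy channel coding theorem can be written as follows; the binary symmetric channel is a standard illustrative example rather than one quoted from the article:

```latex
% Channel capacity: the maximum mutual information between the
% channel input X and output Y over all input distributions p(x).
C = \max_{p(x)} I(X;Y)

% Noisy channel coding theorem (informal): for every rate R < C there exist
% coding schemes whose probability of decoding error can be made arbitrarily
% small; for rates above C, reliable communication is impossible.

% Standard example: a binary symmetric channel that flips each bit with
% probability p has capacity
C_{\mathrm{BSC}} = 1 - H_b(p),
\qquad H_b(p) = -p\log_2 p - (1-p)\log_2(1-p).
```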
Shannon's article laid out the basic elements of communication, illustrated by the sketch after this list:
- An information source that produces a message
- A transmitter that operates on the message to create a signal which can be sent through a channel
- A channel, which is the medium over which the signal, carrying the information that composes the message, is sent
- A receiver, which transforms the signal back into the message intended for delivery
- A destination, which can be a person or a machine, for whom or which the message is intended
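The chain of elements can be made concrete with a minimal Python sketch (illustrative only; the function names and the binary symmetric channel noise model are assumptions made for this example, not constructs from the paper):

```python
import random

def information_source():
    """Information source: produces the message to be communicated."""
    return "HELLO WORLD"

def transmitter(message):
    """Transmitter: operates on the message to produce a signal,
    here a sequence of bits (8 bits per ASCII character)."""
    return [int(b) for byte in message.encode("ascii") for b in format(byte, "08b")]

def channel(signal, flip_prob=0.01):
    """Channel: the medium carrying the signal. Noise is modelled as a
    binary symmetric channel that flips each bit with probability flip_prob."""
    return [bit ^ (random.random() < flip_prob) for bit in signal]

def receiver(signal):
    """Receiver: transforms the received signal back into a message
    for the destination (errors may remain if noise corrupted bits)."""
    chars = [chr(int("".join(map(str, signal[i:i + 8])), 2))
             for i in range(0, len(signal), 8)]
    return "".join(chars)

if __name__ == "__main__":
    message = information_source()              # information source
    signal = transmitter(message)               # transmitter
    received = channel(signal, flip_prob=0.02)  # noisy channel
    print("Destination received:", receiver(received))  # receiver -> destination
```

The noise introduced by the channel is what motivates the coding theorem above: redundancy added by the transmitter can drive the error probability as low as desired, provided the transmission rate stays below the channel capacity.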
It also developed the concepts of information entropy and redundancy, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information. The paper additionally proposed the Shannon–Fano coding technique, developed in conjunction with Robert Fano.
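As an illustration of these ideas, the following Python sketch computes the entropy of a symbol distribution and builds a Shannon–Fano code by the commonly described recursive splitting rule (a sketch with invented function names, not code or notation from the paper):

```python
from collections import Counter
from math import log2

def entropy(probabilities):
    """Shannon entropy H = -sum p_i * log2(p_i), in bits per symbol."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

def shannon_fano(symbols):
    """Build a prefix code from (symbol, probability) pairs sorted by
    decreasing probability: split the list where the left part's total
    probability is closest to half, recurse, and prefix 0 and 1."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    best_split, best_diff, running = 1, float("inf"), 0.0
    for i, (_, p) in enumerate(symbols[:-1], start=1):
        running += p
        diff = abs(total / 2 - running)
        if diff < best_diff:
            best_split, best_diff = i, diff
    code = {s: "0" + c for s, c in shannon_fano(symbols[:best_split]).items()}
    code.update({s: "1" + c for s, c in shannon_fano(symbols[best_split:]).items()})
    return code

if __name__ == "__main__":
    text = "ABRACADABRA"
    probs = {s: n / len(text) for s, n in Counter(text).items()}
    print("Entropy:", round(entropy(probs.values()), 3), "bits/symbol")
    ordered = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    print("Shannon-Fano code:", shannon_fano(ordered))
```

For any such prefix code, the average codeword length can approach but never fall below the entropy of the source, which is the sense in which entropy measures information content.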
References
- ↑ "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379–423. July 1948. doi:10.1002/j.1538-7305.1948.tb01338.x. http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf. "The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey.".
- ↑ "A Mathematical Theory of Communication". Bell System Technical Journal 27 (4): 623–656. October 1948. doi:10.1002/j.1538-7305.1948.tb00917.x.
- ↑ Ash, Robert B. (1966). Information Theory: Tracts in Pure & Applied Mathematics. New York: John Wiley & Sons Inc. ISBN 0-470-03445-9.
- ↑ Yeung, Raymond W. (2008). "The Science of Information". Information Theory and Network Coding. Springer. pp. 1–4. doi:10.1007/978-0-387-79234-7_1. ISBN 978-0-387-79233-0. https://archive.org/details/informationtheor00yeun.
- ↑ Shannon, Claude E.; Weaver, Warren (1949). The Mathematical Theory of Communication. University of Illinois Press. ISBN 0-252-72548-4. http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf.
- ↑ Goodman, Rob; Soni, Jimmy (2018). "Genius in Training". https://alumni.umich.edu/michigan-alum/genius-in-training/.
External links
- (PDF) "A Mathematical Theory of Communication" by C. E. Shannon (reprint with corrections) hosted by the Harvard Mathematics Department, at Harvard University
- Original publications: The Bell System Technical Journal 1948-07: Vol 27 Iss 3. Internet Archive. AT&T Bell Laboratories. 1948-07-01. pp. 379–423. https://archive.org/details/sim_att-technical-journal_1948-07_27_3/page/n2/; The Bell System Technical Journal 1948-10: Vol 27 Iss 4. Internet Archive. AT&T Bell Laboratories. 1948-10-01. pp. 623–656. https://archive.org/details/sim_att-technical-journal_1948-10_27_4/page/623/.
- Khan Academy video about "A Mathematical Theory of Communication"