Limits of computation
The limits of computation are governed by several factors. In particular, there are physical and practical limits to the amount of computation or data storage that can be performed with a given amount of mass, volume, or energy.
Hardware limits or physical limits
Processing and memory density
- The Bekenstein bound limits the amount of information that can be stored within a spherical volume to the entropy of a black hole with the same surface area; a worked numerical example follows this list.
- Thermodynamics limits the data storage of a system based on its energy, number of particles, and particle modes. In practice, it is a stronger bound than the Bekenstein bound.[1]
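For a sphere of radius R containing energy E = mc², the Bekenstein bound can be written as I ≤ 2πRE/(ħc ln 2) bits. The following Python sketch evaluates it; the one-kilogram, five-centimeter example is an illustrative assumption, not a figure from the sources cited above.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def bekenstein_bound_bits(mass_kg: float, radius_m: float) -> float:
    """Upper bound on the information storable in a sphere of radius R
    containing mass-energy E = m*c^2: I <= 2*pi*R*E / (hbar*c*ln 2)."""
    energy_j = mass_kg * C**2
    return 2 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

# Illustrative example: 1 kg of matter confined to a 5 cm sphere
print(f"{bekenstein_bound_bits(1.0, 0.05):.2e} bits")  # ~1.3e42 bits
```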
Processing speed
- Bremermann's limit is the maximum computational speed of a self-contained system in the material universe, and is based on mass–energy versus quantum uncertainty constraints.
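Bremermann's limit evaluates to mc²/h bits per second for a system of mass m, about 1.36 × 10⁵⁰ bits per second per kilogram. A minimal check in Python, with the one-kilogram mass chosen purely for illustration:

```python
H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def bremermann_limit_bits_per_s(mass_kg: float) -> float:
    """Bremermann's limit: maximum computational speed of a
    self-contained system, m*c^2/h bits per second."""
    return mass_kg * C**2 / H

print(f"{bremermann_limit_bits_per_s(1.0):.2e} bits/s")  # ~1.36e50
```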
Communication delays
- The Margolus–Levitin theorem sets a bound on the maximum computational speed per unit of energy: 6 × 10³³ operations per second per joule. This bound, however, can be avoided if there is access to quantum memory. Computational algorithms can then be designed that require arbitrarily small amounts of energy/time per elementary computation step.[2][3]
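The 6 × 10³³ figure follows from the theorem's bound of at least h/(4E) seconds to evolve between orthogonal states, i.e. at most 4E/h elementary operations per second, which per joule of energy is 4/h. A quick check:

```python
H = 6.62607015e-34  # Planck constant, J*s

def margolus_levitin_ops_per_s(energy_j: float) -> float:
    """Margolus-Levitin bound: at most 4*E/h orthogonal state
    transitions (elementary operations) per second at mean energy E."""
    return 4 * energy_j / H

print(f"{margolus_levitin_ops_per_s(1.0):.2e} ops/s per joule")  # ~6.0e33
```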
Energy supply
- Landauer's principle defines a lower theoretical limit for energy consumption: kT ln 2 joules consumed per irreversible state change, where k is the Boltzmann constant and T is the operating temperature of the computer.[4] Reversible computing is not subject to this lower bound. T cannot, even in theory, be made lower than about 3 kelvins, the approximate temperature of the cosmic microwave background radiation, without spending more energy on cooling than is saved in computation. However, on a timescale of 10⁹–10¹⁰ years, the cosmic microwave background radiation will decline exponentially, which has been argued to eventually enable 10³⁰ times as many computations per unit of energy.[5] Parts of this argument have been disputed.[6]
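For concreteness, the Landauer energy kT ln 2 can be evaluated at room temperature and near the cosmic microwave background temperature; both temperatures here are illustrative choices:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_energy_j(temperature_k: float) -> float:
    """Minimum energy dissipated per irreversible bit operation: k*T*ln 2."""
    return K_B * temperature_k * math.log(2)

print(f"{landauer_energy_j(300.0):.2e} J")  # ~2.87e-21 J at room temperature
print(f"{landauer_energy_j(2.7):.2e} J")    # ~2.58e-23 J near the CMB temperature
```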
Building devices that approach physical limits
Several methods have been proposed for producing computing devices or data storage devices that approach physical and practical limits:
- A cold degenerate star could conceivably be used as a giant data storage device, by carefully perturbing it to various excited states, in the same manner as an atom or quantum well used for these purposes. Such a star would have to be constructed artificially, since no naturally occurring degenerate star will cool to this temperature for an extremely long time. It is also possible that nucleons on the surface of neutron stars could form complex "molecules",[7] which some have suggested might be used for computing purposes,[8] creating a type of computronium based on femtotechnology, which would be faster and denser than computronium based on nanotechnology.
- It may be possible to use a black hole as a data storage or computing device, if a practical mechanism for extracting the contained information can be found. Such extraction may in principle be possible (Stephen Hawking's proposed resolution to the black hole information paradox). This would achieve storage density exactly equal to the Bekenstein bound. Seth Lloyd calculated[9] the computational abilities of an "ultimate laptop" formed by compressing a kilogram of matter into a black hole of radius 1.485 × 10⁻²⁷ meters, concluding that it would last only about 10⁻¹⁹ seconds before evaporating due to Hawking radiation, but that during this brief time it could compute at a rate of about 5 × 10⁵⁰ operations per second, ultimately performing about 10³² operations on 10¹⁶ bits (~1 PB); a sketch reproducing these figures follows this list. Lloyd notes that "Interestingly, although this hypothetical computation is performed at ultra-high densities and speeds, the total number of bits available to be processed is not far from the number available to current computers operating in more familiar surroundings."[10]
- In The Singularity Is Near, Ray Kurzweil cites the calculations of Seth Lloyd that a universal-scale computer is capable of 10⁹⁰ operations per second. The mass of the universe can be estimated at 3 × 10⁵² kilograms. If all matter in the universe were turned into a black hole, it would have a lifetime of 2.8 × 10¹³⁹ seconds before evaporating due to Hawking radiation. During that lifetime such a universal-scale black hole computer would perform 2.8 × 10²²⁹ operations.[11]
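The headline figures in the last two items can be reproduced from the Margolus–Levitin bound applied to the system's full mass-energy, multiplied by the available lifetime. A minimal sketch using the masses and lifetimes stated above (the function name is ours):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def margolus_levitin_rate(mass_kg: float) -> float:
    """Maximum operations per second when the entire mass-energy
    E = m*c^2 is devoted to computation: 2*E/(pi*hbar)."""
    return 2 * mass_kg * C**2 / (math.pi * HBAR)

# Lloyd's one-kilogram "ultimate laptop"
rate = margolus_levitin_rate(1.0)
print(f"{rate:.1e} ops/s")              # ~5.4e50, matching ~5 x 10^50
print(f"{rate * 1e-19:.1e} total ops")  # ~5.4e31, matching ~10^32

# Kurzweil's universal-scale computer: rate times lifetime
print(f"{1e90 * 2.8e139:.1e} total ops")  # 2.8e229
```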
Abstract limits in computer science
In the field of theoretical computer science, the computability and complexity of computational problems are central objects of study. Computability theory describes the degree to which problems are computable, whereas complexity theory describes the asymptotic resource consumption of their solutions. Computational problems are accordingly organized into complexity classes. The arithmetical hierarchy and the polynomial hierarchy classify the degree to which problems are, respectively, computable and computable in polynomial time. For instance, the level [math]\displaystyle{ \Sigma^0_0=\Pi^0_0=\Delta^0_0 }[/math] of the arithmetical hierarchy classifies the computable partial functions. Moreover, this hierarchy is strict: every higher level of the arithmetical hierarchy contains functions that are strictly uncomputable.
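A standard example at the first level above the computable one (ours, not from the sources cited here) is the halting problem, which is [math]\displaystyle{ \Sigma^0_1 }[/math]-complete: membership can be expressed with a single unbounded existential quantifier over a decidable predicate, yet no algorithm decides it:

[math]\displaystyle{ \mathrm{HALT} = \{ (e, x) : \exists s\; T(e, x, s) \} \in \Sigma^0_1, }[/math]

where T is Kleene's computable T-predicate, which holds exactly when s encodes a halting computation of program e on input x.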
Loose and tight limits
Many limits derived in terms of physical constants and abstract models of computation in computer science are loose.[12] Very few known limits directly obstruct leading-edge technologies, but many engineering obstacles currently cannot be explained by closed-form limits.
See also
- Digital physics
- Hypercomputation
- Matrioshka brain
- Physics of computation
- Programmable matter
- Quantum computing
- Supertask
- Transcomputational problem
References
- ↑ Sandberg, Anders (22 December 1999). "The Physics of Information Processing Superobjects: Daily Life Among the Jupiter Brains". Journal of Evolution and Technology. http://transhumanist.com/volume5/Brains2.pdf. Retrieved 30 May 2014.
- ↑ Jordan, Stephen P. (2017). "Fast quantum computation at arbitrarily low energy". Phys. Rev. A 95 (3): 032305. doi:10.1103/physreva.95.032305. Bibcode: 2017PhRvA..95c2305J.
- ↑ Sinitsyn, Nikolai A. (2018). "Is there a quantum limit on speed of computation?". Physics Letters A 382 (7): 477–481. doi:10.1016/j.physleta.2017.12.042. Bibcode: 2018PhLA..382..477S.
- ↑ Plenio, M. B.; Vitelli, V. (2001). "The physics of forgetting: Landauer's erasure principle and information theory". Contemporary Physics 42 (1): 25–60. doi:10.1080/00107510010018916. ISSN 0010-7514. Bibcode: 2001ConPh..42...25P. http://www3.imperial.ac.uk/pls/portallive/docs/1/55905.PDF.
- ↑ Sandberg, Anders; Armstrong, Stuart; Cirkovic, Milan M. (2017-04-27). "That is not dead which can eternal lie: the aestivation hypothesis for resolving Fermi's paradox". arXiv:1705.03394 [physics.pop-ph].
- ↑ Bennett, Charles H.; Hanson, Robin; Riedel, C. Jess (1 August 2019). "Comment on 'The Aestivation Hypothesis for Resolving Fermi's Paradox'". Foundations of Physics 49 (8): 820–829. doi:10.1007/s10701-019-00289-5. ISSN 1572-9516. Bibcode: 2019FoPh...49..820B. https://doi.org/10.1007/s10701-019-00289-5.
- ↑ "Life on neutron stars". The Internet Encyclopedia of Science. http://www.daviddarling.info/encyclopedia/N/neutronstarlife.html.
- ↑ "Femtotech? (Sub)Nuclear Scale Engineering and Computation". http://www.cs.usu.edu/~degaris/essays/femtotech.html.
- ↑ Lloyd, Seth (2000). "Ultimate physical limits to computation". Nature 406 (6799): 1047–1054. doi:10.1038/35023282. PMID 10984064. Bibcode: 2000Natur.406.1047L.
- ↑ Lloyd, Seth (2000). "Ultimate physical limits to computation". Nature 406 (6799): 1047–1054. doi:10.1038/35023282. PMID 10984064. Bibcode: 2000Natur.406.1047L. http://puhep1.princeton.edu/~mcdonald/examples/QM/lloyd_nature_406_1047_00.pdf.
- ↑ Kurzweil, Ray (2005). The Singularity is Near. New York: Viking. p. 911.
- ↑ Markov, Igor (2014). "Limits on Fundamental Limits to Computation". Nature 512 (7513): 147–154. doi:10.1038/nature13570. PMID 25119233. Bibcode: 2014Natur.512..147M.
Original source: https://en.wikipedia.org/wiki/Limits_of_computation