Biography:Peter Gacs

From HandWiki
Short description: Hungarian-American mathematician and computer scientist
Peter Gacs
Born: May 9, 1947, Budapest, Hungary
Nationality: Hungarian
Citizenship: Hungary, United States
Alma mater: Eötvös Loránd University; Goethe University Frankfurt; Stanford University
Known for: reliable cellular automata, GKL rule, Kolmogorov complexity
Scientific career
Fields: probability, complexity theory, information theory, cellular automata
Institutions: Hungarian Academy of Sciences; University of Rochester; Boston University
Doctoral advisor: Claus-Peter Schnorr

Péter Gács (Hungarian pronunciation: [ˈpeːtɛr ˈɡaːtʃ]; born May 9, 1947), professionally also known as Peter Gacs, is a Hungarian-American mathematician and computer scientist, professor, and an external member of the Hungarian Academy of Sciences. He is well known for his work in reliable computation, randomness in computing, algorithmic complexity, algorithmic probability, and information theory.

Career

Peter Gacs attended high school in his hometown, then obtained a diploma (M.S.) at Eötvös Loránd University in Budapest in 1970. Gacs started his career as a researcher at the Applied Mathematics Institute of the Hungarian Academy of Sciences.[1] He obtained his doctoral degree from Goethe University Frankfurt in 1978. During his studies he had the opportunity to visit Moscow State University and work with Andrey Kolmogorov and his student Leonid A. Levin. During 1979 he was a visiting research associate at Stanford University. He was an assistant professor at the University of Rochester from 1980 until 1984, when he moved to Boston University, where he received tenure in 1985. He has been a full professor there since 1992.[2]

Work

Gacs has made contributions to many fields of computer science. Gács and László Lovász were the first to bring the ellipsoid method to the attention of the international community, in August 1979, by publishing proofs and some improvements of it.[3][4][5] Gacs also contributed to the Sipser–Lautemann theorem.[6] His main research focus has been on cellular automata and Kolmogorov complexity.

Work on cellular automata

His most important contribution in the domain of cellular automata, besides the GKL rule (Gacs–Kurdyumov–Levin rule), is the construction of a reliable one-dimensional cellular automaton, which provided a counterexample to the positive rates conjecture.[7] His construction is multi-scale and intricate.[8] The same technique was later used in the construction of aperiodic tile sets.[9]
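The GKL rule itself is simple to state. The following Python sketch is an illustration based on the commonly cited formulation of the rule (not code from Gács's papers): a cell in state 0 takes the majority vote of itself and its neighbors at offsets −1 and −3, while a cell in state 1 takes the majority of itself and its neighbors at offsets +1 and +3.

```python
def gkl_step(s):
    """One synchronous update of the Gacs-Kurdyumov-Levin (GKL) rule
    on a cyclic binary configuration s (a list of 0s and 1s).
    Commonly cited formulation: a 0-cell looks left (offsets -1, -3),
    a 1-cell looks right (offsets +1, +3); each cell takes the
    majority of the three values inspected."""
    n = len(s)
    out = []
    for i in range(n):
        if s[i] == 0:
            votes = s[i] + s[(i - 1) % n] + s[(i - 3) % n]
        else:
            votes = s[i] + s[(i + 1) % n] + s[(i + 3) % n]
        out.append(1 if votes >= 2 else 0)
    return out

# Small islands of the minority symbol are eroded: starting from a
# ring with a clear majority of 1s, iteration reaches the all-1s
# fixed point.
config = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
for _ in range(20):
    config = gkl_step(config)
print(config)  # prints a list of fourteen 1s
```

The all-0 and all-1 configurations are fixed points, which is why the rule is studied as a "density classifier": on many initial configurations it converges to the fixed point matching the majority symbol, although it does not classify all configurations correctly.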

Work on algorithmic information theory and Kolmogorov complexity

Gacs authored several important papers in the field of algorithmic information theory and Kolmogorov complexity. Together with Leonid A. Levin, he established basic properties of prefix complexity, including the formula for the complexity of pairs,[10] and results on randomness deficiencies, including one rediscovered later and now known as the ample excess lemma.[11][12] He showed that the correspondence between complexity and a priori probability that holds for prefix complexity no longer holds for monotone complexity and continuous a priori probability.[13][14] In the related theory of algorithmic randomness, he proved that every sequence is Turing-reducible to a random one; the result is now known as the Gács–Kučera theorem, as it was proven independently by Antonín Kučera.[14] Later he and coauthors introduced the notion of information distance and proved its connection with conditional complexity.[15][14]
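In the standard notation of the field (as in Li and Vitányi), two of these results can be stated compactly; the following is a sketch of the usual statements, up to additive error terms:

```latex
% Symmetry of information for prefix complexity K (Gacs, 1974):
K(x, y) = K(x) + K\bigl(y \mid x, K(x)\bigr) + O(1)

% Information distance (Bennett, Gacs, Li, Vitanyi, Zurek, 1998),
% up to an additive logarithmic term:
E(x, y) = \max\bigl\{\, K(x \mid y),\; K(y \mid x) \,\bigr\}
```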

He was one of the pioneers of algorithmic statistics,[16] introduced one of the quantum versions of algorithmic complexity,[17] and studied the properties of algorithmic randomness for general spaces[18][19] and general classes of measures.[20] Some of these results are covered in his surveys of algorithmic information theory.[21][22] He also proved results on the boundary between classical and algorithmic information theory: with János Körner, he gave the seminal example showing the difference between common information and mutual information.[23] Together with Rudolf Ahlswede and Körner, he proved the blowing-up lemma.[24]
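The Gács–Körner result on common information can be summarized in standard information-theoretic notation; this is a sketch of the usual statement, not a quotation from the paper:

```latex
% Gacs-Korner common information never exceeds Shannon mutual information:
0 \le C_{\mathrm{GK}}(X; Y) \le I(X; Y)
% and the gap can be extreme: there exist pairs (X, Y) with
% I(X; Y) > 0 but C_{\mathrm{GK}}(X; Y) = 0.
```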

References

  1. "The list of people that worked at the Renyi Institute". https://old.renyi.hu/former_new.html. 
  2. "Bio". https://www.bu.edu/cs/profiles/peter-gacs/. 
  3. Kolata, Gina Bari (November 2, 1979). "Mathematicians Amazed by Russian's Discovery". Science 206 (4418): 545–546. doi:10.1126/science.206.4418.545. PMID 17759415. Bibcode1979Sci...206..545B. 
  4. Gács, Peter; Lovász, Laszlo (1981), König, H.; Korte, B.; Ritter, K., eds., "Khachiyan's algorithm for linear programming", Mathematical Programming at Oberwolfach (Berlin, Heidelberg: Springer Berlin Heidelberg) 14: pp. 61–68, doi:10.1007/bfb0120921, ISBN 978-3-642-00805-4, http://link.springer.com/10.1007/BFb0120921, retrieved 2020-11-27 
  5. Shenitzer, Abe, and John Stillwell, eds. Mathematical evolutions. MAA, 2002.
  6. Lautemann, Clemens (1983-11-08). "BPP and the polynomial hierarchy" (in en). Information Processing Letters 17 (4): 215–217. doi:10.1016/0020-0190(83)90044-3. ISSN 0020-0190. https://dx.doi.org/10.1016%2F0020-0190%2883%2990044-3. 
  7. Gács, Peter (2001-04-01). "Reliable Cellular Automata with Self-Organization" (in en). Journal of Statistical Physics 103 (1): 45–267. doi:10.1023/A:1004823720305. ISSN 1572-9613. Bibcode2001JSP...103...45G. https://doi.org/10.1023/A:1004823720305. 
  8. Gray, Lawrence F. (2001-04-01). "A Reader's Guide to Gacs's "Positive Rates" Paper" (in en). Journal of Statistical Physics 103 (1): 1–44. doi:10.1023/A:1004824203467. ISSN 1572-9613. Bibcode2001JSP...103....1G. https://doi.org/10.1023/A:1004824203467. 
  9. Durand, Bruno; Romashchenko, Andrei; Shen, Alexander (2012-05-01). "Fixed-point tile sets and their applications" (in en). Journal of Computer and System Sciences. In Commemoration of Amir Pnueli 78 (3): 731–764. doi:10.1016/j.jcss.2011.11.001. ISSN 0022-0000. 
  10. Peter Gacs. On the symmetry of algorithmic information. Doklady Akademii Nauk SSSR, 218(6):1265–1267, 1974. In Russian.
  11. Peter Gacs. Exact expressions for some randomness tests. Z. Math. Logik Grundlagen Math., 26:385–394, 1980.
  12. Downey, Rodney G., and Denis R. Hirschfeldt. Algorithmic randomness and complexity. Springer, 2010
  13. Peter Gacs. On the relation between descriptional complexity and algorithmic probability. Theoretical Computer Science, 22:71–93, 1983. Short version: Proc. 22nd IEEE FOCS (1981) 296-303
  14. Li, Ming, and Paul Vitányi. An Introduction to Kolmogorov Complexity and Its Applications. 3rd ed. New York: Springer, 2008.
  15. Charles H. Bennett, Peter Gacs, Ming Li, Paul M. B. Vitanyi, and Wojciech Zurek. Information distance. IEEE Transactions on Information Theory, 44(4):1407–1423, 1998. (Preliminary version appeared in STOC'97, arXiv:1006.3520.) According to Google Scholar, the paper had been cited 687 times by April 2021.
  16. Peter Gacs, John Tromp, and Paul M. B. Vitanyi. Algorithmic statistics. IEEE Transactions on Information Theory, 47:2443–2463, 2001. arXiv:math/0006233[math.PR]. Short version with similar title in Algorithmic Learning Theory, LNCS 1968/2000.
  17. Aditi Dhagat, Peter Gacs, and Peter Winkler. On playing "twenty questions" with a liar. In Proceedings of the third annual ACM-SIAM symposium on Discrete algorithms, SODA’92, pages 16–22, Philadelphia, PA, USA, 1992. Society for Industrial and Applied Mathematics.
  18. Peter Gacs. Uniform test of algorithmic randomness over a general space. Theoretical Computer Science, 341(1-3):91–137, 2005.
  19. Peter Gacs, Mathieu Hoyrup, and Cristobal Rojas. Randomness on computable probability spaces - a dynamical point of view. Theory of Computing Systems, 48:465–485, 2011. 10.1007/s00224-010-9263-x, arXiv:0902.1939. Appeared also in STACS 2009.
  20. Laurent Bienvenu, Peter Gacs, Mathieu Hoyrup, Cristobal Rojas, and Alexander Shen. Randomness with respect to a class of measures. Proceedings of the Steklov Institute of Mathematics, 274(1):34–89, 2011. In English and Russian, also in arXiv:1103.1529.
  21. Peter Gacs. Lecture notes on descriptional complexity and randomness. Technical report, Boston University, Computer Science Dept., Boston, MA 02215, 2009. www.cs.bu.edu/faculty/gacs/papers/ait-notes.pdf.
  22. Thomas M. Cover, Peter Gacs, and Robert M. Gray. Kolmogorov’s contributions to information theory and algorithmic complexity. The Annals of Probability, 17(3):840–865, 1989.
  23. Peter Gacs and Janos Korner. Common information is far less than mutual information. Problems of Control and Inf. Th., 2:149–162, 1973.
  24. Ahlswede, Rudolf; Gács, Peter; Körner, János. Bounds on conditional probabilities with applications in multi-user communication. Z. Wahrsch. und Verw. Gebiete, 34:157–177, 1976.