Biography:Yurii Nesterov

Short description: Russian mathematician
Yurii Nesterov
[Photograph: Nesterov at Oberwolfach, 2005]
Born: January 25, 1956, Moscow, USSR
Citizenship: Belgium
Alma mater: Moscow State University (1977)
Awards:
  • Dantzig Prize, 2000
  • John von Neumann Theory Prize, 2009
  • EURO Gold Medal, 2016
  • WLA Prize in Computer Science or Mathematics, 2023[1]
Scientific career
Fields: Convex optimization
Institutions: UCLouvain (Center for Operations Research and Econometrics)
Doctoral advisor: Boris Polyak

Yurii Nesterov is a Russian mathematician and an internationally recognized expert in convex optimization, in particular in the development of efficient algorithms and the analysis of numerical optimization methods. He is currently a professor at the University of Louvain (UCLouvain).

Biography

In 1977, Yurii Nesterov graduated in applied mathematics from Moscow State University. From 1977 to 1992 he was a researcher at the Central Economic Mathematical Institute of the Russian Academy of Sciences. Since 1993 he has worked at UCLouvain, in the Department of Mathematical Engineering of the Louvain School of Engineering and at the Center for Operations Research and Econometrics (CORE).

In 2000, Nesterov received the Dantzig Prize.[2]

In 2009, Nesterov won the John von Neumann Theory Prize.[3]

In 2016, Nesterov received the EURO Gold Medal.[4]

In 2023, Yurii Nesterov and Arkadi Nemirovski received the WLA Prize in Computer Science or Mathematics, "for their seminal work in convex optimization theory".[5]

Academic work

Nesterov is best known for his work in convex optimization, including his 2004 book, considered a canonical reference on the subject.[6] His main novel contribution is an accelerated version of gradient descent that converges considerably faster than ordinary gradient descent (commonly referred to as Nesterov momentum, Nesterov acceleration, or Nesterov accelerated gradient, NAG for short).[7][8][9][10][11] The method was further developed by Beck and Teboulle in their 2009 paper "A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems", where the resulting algorithm is known as FISTA.[12]
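
As an illustration of the acceleration idea, the following is a minimal sketch, not Nesterov's original presentation, of the accelerated gradient scheme for a smooth convex objective. It uses one common choice of momentum-parameter schedule; the quadratic test problem, the step size 1/L, and all function and variable names are assumptions made for the example.

    import numpy as np

    def nesterov_accelerated_gradient(grad, x0, step, n_iters=200):
        # grad: gradient of a smooth convex objective; step: e.g. 1/L for an L-smooth f.
        x = np.asarray(x0, dtype=float)
        y = x.copy()   # "look-ahead" (extrapolated) point
        t = 1.0        # momentum parameter
        for _ in range(n_iters):
            x_new = y - step * grad(y)                     # gradient step at the look-ahead point
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # extrapolation (momentum) step
            x, t = x_new, t_new
        return x

    # Illustrative use: minimize f(x) = 0.5 x^T A x - b^T x for a positive definite A.
    A = np.array([[3.0, 0.5], [0.5, 1.0]])
    b = np.array([1.0, -2.0])
    L = np.linalg.eigvalsh(A).max()                        # Lipschitz constant of the gradient
    x_min = nesterov_accelerated_gradient(lambda x: A @ x - b, np.zeros(2), 1.0 / L)

With step size 1/L, this scheme attains the [math]\displaystyle{ O(1/k^2) }[/math] convergence rate established in [7], compared with [math]\displaystyle{ O(1/k) }[/math] for plain gradient descent.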

His 1994 book with Arkadi Nemirovski[13] was the first to show that interior-point methods can solve general convex optimization problems, and the first to make a systematic study of semidefinite programming (SDP). In the same book they introduced self-concordant functions, which are useful in the analysis of Newton's method.[14]
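
For context, a standard form of the self-concordance condition for a three-times differentiable convex function of one variable (as presented in [14]; for multivariate functions the same inequality is required along every line through the domain) is

    [math]\displaystyle{ |f'''(x)| \le 2\, f''(x)^{3/2}. }[/math]

The logarithmic barrier [math]\displaystyle{ f(x) = -\log x }[/math] satisfies this inequality with equality, which is one reason Newton's method applied to such barrier functions admits the affine-invariant complexity analysis underlying interior-point methods.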

References

  1. "2023 WLA Prize Laureates". 2023. https://www.thewlaprize.org/Laureates/2023/Yurii_Nesterov/. 
  2. "The George B. Dantzig Prize". 2000. http://www.siam.org/prizes/sponsored/dantzig.php. 
  3. "John Von Neumann Theory Prize". 2009. https://www.informs.org/Recognize-Excellence/Award-Recipients/Yurii-Nesterov. 
  4. "EURO Gold Medal". 2016. https://www.euro-online.org/web/pages/608/last-activities-list. 
  5. "Laureates of the 2023 WLA Prize Announced". 2023. https://www.thewlaprize.org/PressRoom/News/2023/09/14/179.html. 
  6. Nesterov, Yurii (2004). Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers. ISBN 978-1402075537. 
  7. Nesterov, Y (1983). "A method for unconstrained convex minimization problem with the rate of convergence [math]\displaystyle{ O(1/k^2) }[/math]". Doklady AN USSR 269: 543–547. 
  8. Walkington, Noel J. (2023). "Nesterov's Method for Convex Optimization". SIAM Review 65 (2): 539–562. doi:10.1137/21M1390037. ISSN 0036-1445. https://epubs.siam.org/doi/10.1137/21M1390037. 
  9. Bubeck, Sebastien (April 1, 2013). "ORF523: Nesterov's Accelerated Gradient Descent". http://blogs.princeton.edu/imabandit/2013/04/01/acceleratedgradientdescent/. 
  10. Bubeck, Sebastien (March 6, 2014). "Nesterov's Accelerated Gradient Descent for Smooth and Strongly Convex Optimization". https://blogs.princeton.edu/imabandit/2014/03/06/nesterovs-accelerated-gradient-descent-for-smooth-and-strongly-convex-optimization/. 
  11. "The zen of gradient descent". http://blog.mrtz.org/2013/09/07/the-zen-of-gradient-descent.html. 
  12. Beck, Amir; Teboulle, Marc (2009-01-01). "A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems". SIAM Journal on Imaging Sciences 2 (1): 183–202. doi:10.1137/080716542. https://epubs.siam.org/doi/abs/10.1137/080716542. 
  13. Nesterov, Yurii; Nemirovskii, Arkadii (1995). Interior-Point Polynomial Algorithms in Convex Programming. Society for Industrial and Applied Mathematics. ISBN 978-0898715156. 
  14. Boyd, Stephen P.; Vandenberghe, Lieven (2004). Convex Optimization. Cambridge University Press. ISBN 978-0-521-83378-3. https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf. Retrieved October 15, 2011. 
