In 2023, Yurii Nesterov and Arkadi Nemirovski received the WLA Prize in Computer Science or Mathematics, "for their seminal work in convex optimization theory".[5]
Academic work
Nesterov is best known for his work in convex optimization, including his 2004 book, considered a canonical reference on the subject.[6] His main novel contribution is an accelerated version of gradient descent that converges considerably faster than ordinary gradient descent, commonly referred to as Nesterov momentum, Nesterov acceleration, or the Nesterov accelerated gradient (NAG).[7][8][9][10][11] Beck & Teboulle extended this acceleration technique to proximal gradient methods in their 2009 paper "A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems", where the resulting algorithm is known as FISTA.[12]
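A minimal sketch of the accelerated update, assuming f is convex with an L-Lipschitz-continuous gradient and using one common choice of momentum coefficient (the exact formulation varies across Nesterov's papers and later expositions), maintains an extrapolated point y_k alongside the iterate x_k:

\[
y_k = x_k + \frac{k-1}{k+2}\,(x_k - x_{k-1}), \qquad x_{k+1} = y_k - \frac{1}{L}\,\nabla f(y_k).
\]

With this scheme the objective gap decreases at the rate O(1/k^2), compared with the O(1/k) rate of ordinary gradient descent on the same problem class.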
^ Nesterov, Yurii; Nemirovskii, Arkadii (1995). Interior-Point Polynomial Algorithms in Convex Programming. Society for Industrial and Applied Mathematics. ISBN 978-0898715156.