Position: Doctoral Student

Current Institution: UC Berkeley

Abstract:
Accelerated gradient methods play a central role in optimization, achieving optimal rates in many settings. While many generalizations and extensions of Nesterov's original acceleration method have been proposed, it is not yet clear what the natural scope of the acceleration concept is. In this work, we study accelerated methods from a continuous-time perspective. We show that there is a Lagrangian functional, which we call the Bregman Lagrangian, that generates a large class of accelerated methods in continuous time, including (but not limited to) accelerated gradient descent, its non-Euclidean extension, and accelerated higher-order gradient methods. We show that the continuous-time limits of all of these methods correspond to traveling the same curve in spacetime at different speeds. From this perspective, Nesterov's technique and many of its generalizations can be viewed as a systematic way to go from the continuous-time curves generated by the Bregman Lagrangian to a family of discrete-time accelerated algorithms.
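For concreteness, here is a sketch of the form the Bregman Lagrangian takes in the notation commonly used for this construction; the scaling parameters $\alpha_t, \beta_t, \gamma_t$, the distance-generating function $h$, and its Bregman divergence $D_h$ are notational assumptions of this sketch rather than quantities defined in the abstract above:
\[
  \mathcal{L}(x, \dot{x}, t)
    = e^{\alpha_t + \gamma_t}
      \Big( D_h\big(x + e^{-\alpha_t}\dot{x},\, x\big) - e^{\beta_t} f(x) \Big),
  \qquad
  D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]
As one illustrative special case (again a sketch, not part of the abstract): with the Euclidean choice $h(x) = \tfrac{1}{2}\|x\|^2$ and a suitable polynomial choice of the scaling functions, the Euler--Lagrange equation of this functional reduces to the ODE
\[
  \ddot{x}_t + \frac{3}{t}\,\dot{x}_t + \nabla f(x_t) = 0,
\]
which is known to be the continuous-time limit of Nesterov's accelerated gradient descent.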

Bio:
I am a fourth-year doctoral student at UC Berkeley working with Michael Jordan and Benjamin Recht. I am broadly interested in applied math, dynamical systems, and optimization, and I am currently a member of the Statistical AI Lab and the AMPLab at Berkeley. Before starting graduate school, I graduated from Harvard University in 2011 with a bachelor's degree in applied mathematics and philosophy. During my year off, I worked with Professor Cynthia Rudin in the Predictions Analysis Lab at MIT.