Caltech

Electrical Engineering Systems Seminar

Thursday, February 25, 2021
12:00pm to 1:00pm
Online Event
Accelerated gradient methods on Riemannian manifolds - Recording available upon request
Suvrit Sra, Associate Professor, EECS, LIDS, IDSS, Massachusetts Institute of Technology

This talk lies at the interface of geometry and optimization. I'll talk about geodesically convex optimization problems, a rich class of non-convex optimization problems that admit tractable global optimization. I'll provide some background on this class along with some motivating examples. Beyond a general introduction to the topic area, I will dive deeper into a recent discovery of a long-sought result: an accelerated gradient method for Riemannian manifolds. Towards developing this method, we will revisit Nesterov's (Euclidean) estimate sequence technique and present a conceptually simpler alternative. We will then generalize this simpler alternative to the Riemannian setting. Combined with a new geometric inequality, this yields the first (global) accelerated Riemannian gradient method. I'll also comment on some very recent updates on this topic.

For more information, please contact Caroline Murphy by email at [email protected].