Caltech Young Investigators Lecture
Semidefinite programs (SDPs) are a fundamental class of optimization problems with important recent applications in approximation algorithms, quantum complexity, robust learning, algorithmic rounding, and adversarial deep learning. We present a faster interior point method to solve generic SDPs with variable size n × n and m constraints in time Õ(√n · (mn^2 + m^ω + n^ω) · log(1/δ)),
where ω is the exponent of matrix multiplication and δ is the relative accuracy. In the predominant case of m ≥ n, our runtime outperforms that of the previous fastest SDP solver, which is based on the cutting plane method. This is joint work with H. Jiang, T. Kathuria, Y. T. Lee, and Z. Song and was published in FOCS 2020.
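For context, the generic SDP referred to above can be written in the standard primal form below (this is the textbook formulation, stated here for the reader's convenience rather than quoted from the paper):

```latex
\max_{X \in \mathbb{S}^{n}} \;\; \langle C, X \rangle
\quad \text{subject to} \quad
\langle A_i, X \rangle = b_i \;\; (i = 1, \dots, m),
\qquad X \succeq 0,
```

where C and A_1, …, A_m are symmetric n × n matrices, ⟨·,·⟩ denotes the trace inner product, and X ⪰ 0 requires X to be positive semidefinite; n is the variable size and m the number of constraints in the runtime above.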
Shifting gears from convex programs to nonconvexity, we present a result with a nonasymptotic (finite-time) guarantee for first-order optimization over the class of nonsmooth nonconvex problems. While such guarantees have been extensively studied for decades for smooth or convex functions, comparatively little progress has been made for nonsmooth nonconvex functions, which have recently emerged as centrally important in deep learning. Our work makes progress towards closing this gap. This is joint work with D. Davis, D. Drusvyatskiy, Y. T. Lee, and G. Ye, and was published in NeurIPS 2022.
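To make "first-order method on a nonsmooth nonconvex function" concrete, here is a minimal subgradient-descent sketch on the toy function f(x) = ||x| − 1|, which is nonconvex (local maximum at 0) and nonsmooth (kinks at 0 and ±1). This is an illustrative sketch only, not the algorithm analyzed in the paper:

```python
def f(x):
    # f(x) = | |x| - 1 |: minimized at x = +1 and x = -1, kinks at 0 and +/-1.
    return abs(abs(x) - 1.0)

def subgradient(x):
    # One valid subgradient of f at x; at the kinks, 0 is a valid choice.
    sgn = lambda t: (t > 0) - (t < 0)
    return sgn(x) * sgn(abs(x) - 1.0)

def subgradient_descent(x0, step=0.05, iters=200):
    # Plain constant-step subgradient method: x_{k+1} = x_k - step * g_k.
    x = x0
    for _ in range(iters):
        x -= step * subgradient(x)
    return x

# Starting between the local max (0) and a minimizer (1), the iterates
# drift toward x = 1 and then oscillate within one step size of it.
x_final = subgradient_descent(0.5)
```

With a constant step the method cannot converge exactly on a nonsmooth function (the subgradient does not vanish near a kink), which is one reason finite-time stationarity guarantees for this problem class require more careful algorithm design and analysis.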
Swati is a final-year Ph.D. student at the University of Washington, Seattle, where she is advised by Prof. Yin Tat Lee. Her research focuses on developing provably efficient algorithms for structured classes of optimization problems, including semidefinite programs, nonsmooth problems (convex and nonconvex), problems in applied linear algebra, and online resource allocation. Prior to pursuing her Ph.D., she studied Electronics Engineering and worked as a signal processing engineer in the hardware electronics industry.
This lecture is part of the Young Investigators Lecture Series sponsored by the Caltech Division of Engineering & Applied Science.