
CMX Student/Postdoc Seminar

Friday, February 26, 2021
1:00pm to 2:00pm
Online Event
(1st talk) Neural operators and dynamical systems (2nd talk) Low-rank matrix manifold: Geometry and asymptotic behavior of optimization
(1st speaker) Zongyi Li (2nd speaker) Ziyun Zhang, Graduate Students, Applied and Computational Mathematics / Computing & Mathematical Sciences, Caltech

(1st talk) The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural operators that learn mappings between function spaces. For partial differential equations (PDEs), neural operators directly learn the mapping from any functional parametric dependence to the solution. Thus, they learn an entire family of PDEs, in contrast to classical methods, which solve one instance of the equation. In this work, we formulate a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture. We perform experiments on Burgers' equation, Darcy flow, and the Navier-Stokes equation (including the turbulent regime). Our Fourier neural operator shows state-of-the-art performance compared to existing neural network methodologies, and it is up to three orders of magnitude faster than traditional PDE solvers.
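As background for the abstract, the sketch below illustrates the core idea of a Fourier layer: transform the input to Fourier space, apply a learned linear map to a truncated set of low-frequency modes, and transform back. This is a minimal illustration of the general technique, not the speaker's implementation; the class name SpectralConv1d, the weight initialization, and the number of retained modes are assumptions.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    # Minimal 1D Fourier layer (illustrative sketch): FFT the input,
    # apply a learned complex linear map to the lowest `modes`
    # frequencies, zero the rest, and inverse-FFT back.
    def __init__(self, in_channels, out_channels, modes):
        super().__init__()
        self.modes = modes  # must satisfy modes <= n_grid // 2 + 1
        scale = 1.0 / (in_channels * out_channels)
        # Complex weights for the retained Fourier modes (assumed init).
        self.weight = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, modes,
                                dtype=torch.cfloat)
        )

    def forward(self, x):
        # x: (batch, in_channels, n_grid)
        x_ft = torch.fft.rfft(x)  # (batch, in_channels, n_grid//2 + 1)
        out_ft = torch.zeros(
            x.size(0), self.weight.size(1), x_ft.size(-1),
            dtype=torch.cfloat, device=x.device
        )
        # Pointwise multiplication in Fourier space on the low modes only.
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))  # back to physical space
```

Because the learned map acts on a fixed number of Fourier modes rather than on grid points, a layer of this form can be evaluated on inputs of any resolution, which is one reason operator-learning architectures are attractive for families of PDEs.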

(2nd talk) The low-rank matrix manifold is the Riemannian manifold of fixed-rank matrices whose rank is much smaller than the ambient dimension. It is popular in modern data science applications involving low-rank recovery because of the efficiency of manifold optimization algorithms, along with their nearly optimal theoretical guarantees. This talk is motivated by some recent findings about using Riemannian gradient descent to minimize the least-squares loss function on the low-rank matrix manifold. Our focus is to address the non-convexity and non-closedness of this manifold. I will first introduce the general theory of the asymptotic escape of strict saddle sets on Riemannian manifolds. Then I will discuss the so-called spurious critical points that are special to the low-rank matrix manifold, and new analytical techniques tailored to them. Together, these results pave the way for a thorough understanding of the global asymptotic behavior of Riemannian gradient descent on the low-rank matrix manifold.
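As context for the abstract, the following minimal sketch shows what Riemannian gradient descent on the fixed-rank manifold looks like for a simple least-squares loss: project the Euclidean gradient onto the tangent space at the current iterate, take a step, and retract back to the manifold via truncated SVD. The loss, step size, and helper names (truncated_svd, tangent_project, riemannian_gd) are illustrative assumptions, not the speaker's code.

```python
import numpy as np

def truncated_svd(A, r):
    # Rank-r truncated SVD, used as the retraction onto the manifold.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r], s[:r], Vt[:r, :]

def tangent_project(U, Vt, G):
    # Project the Euclidean gradient G onto the tangent space at
    # X = U diag(s) Vt:  P(G) = Pu G + G Pv - Pu G Pv,
    # where Pu = U U^T and Pv = Vt^T Vt are orthogonal projectors.
    PuG = U @ (U.T @ G)
    GPv = (G @ Vt.T) @ Vt
    return PuG + GPv - U @ (U.T @ G @ Vt.T) @ Vt

def riemannian_gd(M, r, step=0.5, iters=200):
    # Minimize f(X) = 0.5 * ||X - M||_F^2 over rank-r matrices
    # by Riemannian gradient descent with an SVD retraction.
    m, n = M.shape
    U, s, Vt = truncated_svd(np.random.randn(m, n), r)
    X = U @ np.diag(s) @ Vt
    for _ in range(iters):
        G = X - M                        # Euclidean gradient of f
        xi = tangent_project(U, Vt, G)   # Riemannian gradient
        U, s, Vt = truncated_svd(X - step * xi, r)  # retract to rank r
        X = U @ np.diag(s) @ Vt
    return X
```

For example, with M a random rank-r matrix, riemannian_gd(M, r) converges to (a best rank-r approximation of) M; the analysis in the talk concerns the global asymptotic behavior of iterations of exactly this form, including escape from strict saddle sets and behavior near spurious critical points.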

For more information, please contact Jolene Brink by email at [email protected] or visit the CMX website.