Caltech

CMX Student/Postdoc Seminar

Friday, November 20, 2020
1:00pm to 2:00pm
Online Event
Multiscale Computation and Parameter Learning for Kernels from PDEs: Two Provable Examples
Yifan Chen, Graduate Student, Applied and Computational Mathematics, Caltech

This talk concerns the computation and learning of kernel operators arising from PDEs. The standard mathematical model is Lu = f, where L is the inverse of some kernel operator; u and f are functions that may or may not be directly available, depending on the problem set-up.

In the first part, we consider the computation problem: given L and f, compute u. Here L can be a heterogeneous Laplacian or a Helmholtz operator in the high-frequency regime. For this problem, we develop a multiscale framework that achieves nearly exponential convergence of accuracy with respect to the computational degrees of freedom. The main innovation is an effective coarse-fine scale decomposition of the solution space that exploits local structures of both L and f.
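To make the "given L and f, compute u" setting concrete, a minimal sketch follows: a 1D heterogeneous elliptic model problem -(a(x)u'(x))' = f(x) solved by a plain finite-difference discretization. This is only the baseline problem setup, not the multiscale framework of the talk; the rough coefficient a(x) is a hypothetical example.

```python
import numpy as np

def solve_heterogeneous_laplacian(a, f, n=200):
    """Solve -(a u')' = f on (0, 1) with u(0) = u(1) = 0, n interior nodes."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)                       # interior grid points
    a_half = a(np.linspace(h / 2, 1 - h / 2, n + 1))   # coefficient at cell midpoints
    # Tridiagonal stiffness matrix from the flux (conservative) form of the operator.
    main = (a_half[:-1] + a_half[1:]) / h**2
    off = -a_half[1:-1] / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return x, np.linalg.solve(A, f(x))

# Hypothetical rough, multiscale coefficient and a constant source term.
a = lambda x: 1.0 + 0.5 * np.sin(40 * np.pi * x)
f = lambda x: np.ones_like(x)
x, u = solve_heterogeneous_laplacian(a, f)
```

A direct solve like this scales poorly as the coefficient roughens and the grid refines, which is what motivates coarse-fine decompositions of the solution space in the first place.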

In the second part, we consider the learning problem: given u at scattered points only, the task is to recover the full u and to learn the operator L that encodes the underlying physics. We approach this problem via Empirical Bayes and Kernel Flow methods. We establish, both theoretically and empirically, their consistency in the large-data limit, and explicitly identify their implicit bias in parameter learning, for a Matérn-like model on the torus.
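A toy version of this recovery problem can be sketched as kernel interpolation with a hyperparameter chosen by an Empirical Bayes criterion: observe u at scattered points, pick the kernel length-scale by maximizing the marginal likelihood, then interpolate. The Matérn-3/2 kernel, the target function, and the grid search below are illustrative assumptions, not the exact model or methods analyzed in the talk.

```python
import numpy as np

def matern32(X, Y, ell):
    """Matérn-3/2 kernel matrix between 1D point sets X and Y."""
    r = np.abs(X[:, None] - Y[None, :]) / ell
    return (1 + np.sqrt(3) * r) * np.exp(-np.sqrt(3) * r)

def neg_log_marginal_likelihood(ell, X, y, jitter=1e-6):
    """Negative log marginal likelihood of the GP model (up to a constant)."""
    K = matern32(X, X, ell) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

# Scattered observations of a smooth function (stand-in for u).
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 1, 30))
y = np.sin(2 * np.pi * X)

# Empirical Bayes: select the length-scale by a simple grid search.
ells = np.linspace(0.05, 1.0, 50)
ell = ells[np.argmin([neg_log_marginal_likelihood(e, X, y) for e in ells])]

# Kernel interpolation of u on a fine grid with the learned length-scale.
Xs = np.linspace(0, 1, 200)
K = matern32(X, X, ell) + 1e-6 * np.eye(len(X))
u_hat = matern32(Xs, X, ell) @ np.linalg.solve(K, y)
```

The Kernel Flow approach differs in its selection criterion (a cross-validation-type loss rather than the marginal likelihood), but the overall pipeline of learning kernel parameters from scattered data and then interpolating is the same.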

For more information, please contact Jolene Brink by phone at (626) 395-2813 or by email at [email protected], or visit the CMX website.