
CMX Student/Postdoc Seminar

Friday, October 22, 2021
2:00pm to 3:00pm
Online Event
Operator regression for forward and inverse problems
Nicholas Nelsen, Graduate Student, Department of Mechanical Engineering, Caltech

Operator learning has emerged as a key enabler for accelerating the computation of existing scientific models and for discovering new models from data when no model exists. In the first part of this talk, I will describe a fully data-driven methodology based on random features for regressing nonlinear operators between infinite-dimensional spaces of functions. Generalizing traditional random feature methods that operate in Euclidean spaces, this approach may be viewed as a random parametric (operator-valued) kernel method that enjoys several computational advantages over its nonparametric counterparts. The algorithm is deployed in practice to regress solution operators of parametric partial differential equations (PDEs), and I will also use the learned surrogate model to rapidly solve a PDE-based Bayesian inverse problem. The second part of my talk concerns recent theoretical results on the learnability of compact, bounded, and unbounded linear operators that define forward and inverse problems. Bayesian and learning-theoretic estimators for an unknown linear operator on an infinite-dimensional Hilbert space are derived from noisy input-output data, and, under suitable assumptions, convergence rates of these estimators are established in the infinite-data limit. I will conclude with numerical results on learning differential (unbounded), identity (bounded), and inverse differential (compact) operators that exhibit excellent agreement with the theory, and even in regimes beyond its assumptions.
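As a concrete illustration (not taken from the talk), the sketch below shows the classical Euclidean random feature method that the abstract generalizes to operators: random features are drawn once and then frozen, and only the linear coefficients are trained via a ridge-regression linear solve. Every name, bandwidth, and regularization value here is an illustrative assumption, not the speaker's actual algorithm or code.

# Minimal sketch of random feature ridge regression in the Euclidean setting.
# All problem data and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scalar regression problem: y = sin(2*pi*x) + noise
n, d, m = 200, 1, 500                   # samples, input dimension, number of random features
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * rng.standard_normal(n)

# Random Fourier features phi(x; w, b) = cos(w.x + b); (w, b) are drawn once and frozen
W = rng.standard_normal((m, d)) / 0.2   # random frequencies (bandwidth 0.2 is arbitrary)
b = rng.uniform(0.0, 2 * np.pi, size=m)
Phi = np.cos(X @ W.T + b)               # n x m feature matrix

# Only the linear coefficients are trained, by solving a regularized normal equation
lam = 1e-6
alpha = np.linalg.solve(Phi.T @ Phi + lam * m * np.eye(m), Phi.T @ y)

# Prediction at new inputs reuses the same frozen random features
X_test = np.linspace(-1.0, 1.0, 50)[:, None]
y_pred = np.cos(X_test @ W.T + b) @ alpha
print(float(np.mean((y_pred - np.sin(2 * np.pi * X_test[:, 0])) ** 2)))  # test MSE

In the function-space setting described in the abstract, the cosine feature map would be replaced by randomly drawn operator-valued features mapping between function spaces, but the structure of training only the linear coefficients is the same.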

For more information, please contact Jolene Brink by email at [email protected] or visit the CMX website.