
CMX Lunch Seminar

Wednesday, February 17, 2021
12:00pm to 1:00pm
Online Event
Approximation Theory and Metric Entropy of Neural Networks
Jonathan Siegel, Postdoctoral Scholar, Department of Mathematics, Penn State

We consider the problem of approximating high-dimensional functions using shallow neural networks. We begin by introducing natural spaces of functions that can be efficiently approximated by such networks, and we then derive the metric entropy of the unit balls in these spaces. Drawing upon recent work connecting stable approximation rates to metric entropy, we obtain the optimal approximation rates for these spaces. Next, we show that higher approximation rates can be achieved by further restricting the function class. In particular, for a restrictive but natural space of functions, shallow networks with the ReLU$^k$ activation function achieve an approximation rate of $O(n^{-(k+1)})$ in every dimension. Finally, we discuss connections between this surprising result and the finite element method.
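
For context, a shallow (single-hidden-layer) network of width $n$ with ReLU$^k$ activation can be written in the standard form below; the notation is ours, not taken from the talk:
\[
f_n(x) = \sum_{i=1}^{n} a_i \, \sigma_k(\omega_i \cdot x + b_i), \qquad \sigma_k(t) = \max(0, t)^k,
\]
with inner weights $\omega_i \in \mathbb{R}^d$, biases $b_i \in \mathbb{R}$, and outer coefficients $a_i \in \mathbb{R}$. The rate $O(n^{-(k+1)})$ quoted in the abstract is measured in the width $n$, i.e., the number of hidden neurons.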

For more information, please contact Jolene Brink by phone at (626) 395-2813 or by email at [email protected], or visit the CMX website.