Caltech

CMI Seminar: Gitta Kutyniok

Tuesday, March 5, 2019
4:00pm to 5:00pm
Gates-Thomas 135
Approximation Theory meets Deep Learning
Gitta Kutyniok, TU Berlin

PLEASE NOTE THE NEW LOCATION: Hall Auditorium, 135 Gates Thomas

Despite the outstanding success of deep neural networks in real-world applications, most of the related research is empirically driven, and a mathematical foundation is almost completely missing. One central task of a neural network is to approximate a function, which may, for instance, encode a classification task. In this talk, we will be concerned with the question of how well a function can be approximated by a neural network with sparse connectivity. Using methods from approximation theory and applied harmonic analysis, we will derive a fundamental lower bound on the sparsity of a neural network. By explicitly constructing neural networks based on certain representation systems, so-called $\alpha$-shearlets, we will then demonstrate that this lower bound can in fact be attained. Finally, we present numerical experiments which, surprisingly, show that even the standard backpropagation algorithm generates deep neural networks obeying those optimal approximation rates.
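The abstract's central objects are networks that approximate a target function under a sparse-connectivity constraint. As a purely illustrative sketch (not the talk's construction, and not using shearlets), the following hypothetical example trains a small dense ReLU network on a 1-D target, prunes the smallest-magnitude weights to impose sparsity, and fine-tunes under the fixed mask. All names and the target function $\sin(\pi x)$ are assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical illustration only: approximate f(x) = sin(pi * x) with a
# one-hidden-layer ReLU network, then sparsify its connectivity by
# magnitude pruning and fine-tune under the fixed sparsity mask.

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(256, 1))
y = np.sin(np.pi * x)

hidden = 64
W1 = rng.normal(0.0, 1.0, size=(1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 0.3, size=(hidden, 1)); b2 = np.zeros(1)

def forward(x, W1, b1, W2, b2):
    h = np.maximum(x @ W1 + b1, 0.0)          # ReLU hidden layer
    return h, h @ W2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

def train(steps, lr, mask1, mask2):
    """Full-batch gradient descent; masked updates preserve sparsity."""
    global W1, b1, W2, b2
    for _ in range(steps):
        h, pred = forward(x, W1, b1, W2, b2)
        g = 2.0 * (pred - y) / len(x)          # dL/dpred for the MSE loss
        gW2 = h.T @ g
        gb2 = g.sum(axis=0)
        gh = g @ W2.T
        gh[h <= 0] = 0.0                       # ReLU gradient
        gW1 = x.T @ gh
        gb1 = gh.sum(axis=0)
        W1 -= lr * gW1 * mask1                 # pruned weights stay zero
        b1 -= lr * gb1
        W2 -= lr * gW2 * mask2
        b2 -= lr * gb2

# Phase 1: dense training.
train(3000, 0.05, np.ones_like(W1), np.ones_like(W2))
loss_dense = mse(forward(x, W1, b1, W2, b2)[1], y)

# Phase 2: prune the smallest-magnitude ~50% of each weight matrix,
# then fine-tune the surviving (sparse) connections.
def prune_mask(W, keep=0.5):
    thresh = np.quantile(np.abs(W), 1.0 - keep)
    return (np.abs(W) >= thresh).astype(float)

m1, m2 = prune_mask(W1), prune_mask(W2)
W1 *= m1
W2 *= m2
train(3000, 0.05, m1, m2)
loss_sparse = mse(forward(x, W1, b1, W2, b2)[1], y)
print(loss_dense, loss_sparse)
```

The masked update is the key detail: gradients for pruned connections are zeroed, so the network's sparsity level is fixed during fine-tuning, mirroring the abstract's setting of approximation under a sparse-connectivity budget.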


For more information, please contact Linda Taddeo by phone at 626-395-6704 or by email at [email protected] or visit Mathematics of Information Seminar - Upcoming Events.