Tuesday, October 23, 2018
4:00 pm

CMI Seminar

Complexity, Noise, and Emergent Properties of Learning Deep Representations
Alessandro Achille, UCLA

I will show that information-theoretic quantities control and describe a large part of the training process of deep neural networks, and can be used to explain how properties such as invariance to nuisance variability and disentanglement of semantic factors emerge naturally in the learned representation. The resulting theory has connections with several fields, ranging from algorithmic complexity to variational inference. This framework not only predicts the asymptotic behavior of deep networks, but also shows that the initial learning transient has a large, irreversible effect on the outcome of training, giving rise to critical learning periods akin to those observed in biological systems. This urges us to study the complex, and so far neglected, initial phase of learning.

Contact: Linda Taddeo, ltaddeo@caltech.edu, 626-395-6704