TCS+ Talk

Wednesday, March 11, 2020
10:00am to 11:00am
Annenberg 322
Reasoning About Generalization via Conditional Mutual Information
Thomas Steinke, Researcher, IBM Almaden

Abstract: We provide an information-theoretic framework for studying the generalization properties of machine learning algorithms. Our framework ties together existing approaches, including uniform convergence bounds and recent methods for adaptive data analysis. Specifically, we use Conditional Mutual Information (CMI) to quantify how well the input (i.e., the training data) can be recognized given the output (i.e., the trained model) of the learning algorithm. We show that bounds on CMI can be obtained from VC dimension, compression schemes, differential privacy, and other methods. We then show that bounded CMI implies various forms of generalization.
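As a toy illustration (not taken from the talk itself), the CMI quantity can be sketched as follows: fix a "supersample" Z of n pairs of data points, let a uniformly random bit string S select one point from each pair to form the training set, and measure the mutual information between S and the algorithm's output, conditioned on Z. The pair values and the mean-output algorithm below are hypothetical choices made purely for illustration; since the output here is a deterministic function of S given Z, the conditional mutual information reduces to the entropy of the output distribution, and is bounded by n·ln 2 nats.

```python
import itertools
import math
from collections import Counter

# Hypothetical fixed supersample Z: n pairs of points; the bit string S
# picks one point from each pair to form the training set Z_S.
n = 3
Z = [(0.0, 0.5), (1.0, 1.5), (2.0, 2.5)]

def algorithm(selection):
    """A(Z_S): output the mean of the points selected by the bits of S."""
    return sum(Z[i][b] for i, b in enumerate(selection)) / n

# Enumerate all 2^n equally likely selections. Because W = A(Z_S) is a
# deterministic function of S once Z is fixed, I(S; W | Z = z) = H(W).
outputs = Counter(algorithm(s) for s in itertools.product((0, 1), repeat=n))
total = 2 ** n
mi_nats = -sum((c / total) * math.log(c / total) for c in outputs.values())

print(f"I(S; W | Z=z) = {mi_nats:.3f} nats; "
      f"trivial bound n*ln2 = {n * math.log(2):.3f} nats")
```

An algorithm that reveals little about which half of each pair it trained on (e.g., one whose output collides for many selections) has small CMI, which is the regime in which the talk's generalization bounds apply.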

Based on joint work with Lydia Zakynthinou.

For more information, please contact Bonnie Leung by email at [email protected].