Caltech

TCS+ Talk

Wednesday, November 28, 2018
10:00am to 11:00am
Annenberg 322
The Adaptive Complexity of Maximizing a Submodular Function
Eric Balkanski, Harvard University

Abstract: We introduce the study of submodular optimization in the adaptive complexity model to quantify the information-theoretic complexity of black-box optimization in a parallel computation model. Informally, the adaptivity of an algorithm is the number of sequential rounds it makes when each round can execute polynomially many function evaluations in parallel. Since submodular optimization is regularly applied on large datasets, we seek algorithms with low adaptivity to enable speedups via parallelization. For the canonical problem of maximizing a monotone submodular function under a cardinality constraint, all that was known until recently is that the adaptivity needed to obtain a constant approximation is between 1 and Ω(n). Our main result is a tight characterization showing that the adaptivity needed is, up to lower-order terms, Θ(log n):
- We describe an algorithm which requires O(log n) sequential rounds and achieves an approximation that is arbitrarily close to 1/3;
- We show that no algorithm can achieve an approximation better than O(1 / log n) with fewer than O(log n / log log n) rounds.
Thus, when allowing for parallelization, our algorithm achieves a constant-factor approximation exponentially faster than any previous algorithm for submodular maximization. I will conclude the talk by surveying very recent results on submodular optimization in the adaptive complexity model.
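To make the adaptivity model concrete, below is a minimal Python sketch, not the algorithm from the talk: a descending-thresholds greedy in which each sequential round issues one batch of marginal-gain queries that depend only on earlier rounds' answers, so the batch could be evaluated in parallel. The toy coverage function f, the data SETS, and the parameter eps are all illustrative assumptions.

import random

# Toy monotone submodular function: coverage of a random set system.
random.seed(0)
UNIVERSE = range(200)
SETS = [set(random.sample(UNIVERSE, 20)) for _ in range(100)]  # n = 100

def f(chosen):
    """f(S) = size of the union of the sets indexed by S (monotone submodular)."""
    covered = set()
    for i in chosen:
        covered |= SETS[i]
    return len(covered)

def maximize(n, k, eps=0.25):
    """Descending-thresholds greedy. The number of sequential rounds is
    roughly log(max singleton gain) / eps, rather than the k rounds of
    classical greedy (one round per element added)."""
    solution = set()
    rounds = 1
    tau = max(f({e}) for e in range(n))  # round 1: n singleton queries, batched
    while tau >= 1 and len(solution) < k:
        rounds += 1
        # One adaptive round: every query below depends only on the
        # solution as it stood at the start of the round.
        gains = {e: f(solution | {e}) - f(solution)
                 for e in range(n) if e not in solution}
        for e, g in gains.items():
            if g >= tau and len(solution) < k:
                solution.add(e)  # gains become stale within the round;
                                 # this simplification is exactly where real
                                 # low-adaptivity algorithms need sampling
                                 # arguments to certify the added value.
        tau *= 1 - eps  # lower the acceptance threshold for the next round
    return solution, rounds

sol, rounds = maximize(n=len(SETS), k=10)
print(f"f(solution) = {f(sol)}, sequential rounds = {rounds}")

The sketch only illustrates the round structure: batching all marginal-gain queries against a fixed solution is what keeps the round count logarithmic, while handling the within-round staleness of those gains is the technical heart of provable low-adaptivity algorithms.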

Joint work with Yaron Singer.


For more information, please contact Bonnie Leung by email at [email protected].