CMX Student/Postdoc Seminar

Friday, October 16, 2020
1:00pm to 2:00pm
Online Event
Competitive Optimization and Factorization by KL-Minimization
Florian Schaefer, Graduate Student, Computing and Mathematical Sciences, Caltech

The first part of this talk is concerned with competitive optimization, where multiple agents each try to minimize an objective that depends on all agents' actions. We propose competitive gradient descent (CGD) and show that it is a natural generalization of gradient descent to competitive optimization, with good practical performance. We then show how ideas from information geometry can be used to extend CGD to competitive mirror descent (CMD), which can incorporate a wide range of convex constraints.
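To make the idea concrete, below is a minimal NumPy sketch of the CGD update for the zero-sum bilinear game f(x, y) = xᵀAy (player x minimizes f, player y minimizes −f), where each player's step solves the Nash equilibrium of a local quadratic approximation of the game. The function name, the specific matrix A, the step size, and the iteration count are illustrative choices, not taken from the talk; on this game, simultaneous gradient descent spirals away from the equilibrium at (0, 0) while CGD contracts toward it.

```python
import numpy as np

def cgd_step(x, y, A, eta):
    """One competitive-gradient-descent step for f(x, y) = x^T A y,
    with player x minimizing f and player y minimizing -f.

    Solving each player's local quadratic game yields the
    preconditioned linear systems below (hypothetical helper name).
    """
    n, m = A.shape
    dx = np.linalg.solve(np.eye(n) + eta**2 * A @ A.T,
                         A @ y + eta * A @ A.T @ x)
    dy = np.linalg.solve(np.eye(m) + eta**2 * A.T @ A,
                         A.T @ x - eta * A.T @ A @ y)
    return x - eta * dx, y + eta * dy

# Illustrative toy game: the unique equilibrium is (0, 0).
A = np.array([[2.0, 1.0], [0.0, 2.0]])
x, y = np.ones(2), np.ones(2)
for _ in range(200):
    x, y = cgd_step(x, y, A, eta=0.2)
# After 200 steps both players are close to the equilibrium.
```

In the singular-value basis of A the update decouples into scalar games in which each iterate shrinks by a factor 1/√(1 + η²σ²), which is the sense in which CGD tames the rotational dynamics that make plain gradient descent cycle or diverge on such games.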

The second part of the talk is concerned with the approximate factorization of dense kernel matrices arising as Green's matrices of elliptic PDEs or covariance matrices of smooth Gaussian processes. For a given sparsity pattern, we show that the optimal (in KL-divergence) sparse inverse-Cholesky factor of the kernel matrix can be computed in closed form and in an embarrassingly parallel fashion. By exploiting the conditional independence properties of finitely smooth Gaussian processes, we show that these factors can be computed in near-linear complexity, improving the state of the art for fast solvers of Green's matrices of elliptic PDEs.
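The closed-form, columnwise nature of the KL-optimal factor can be sketched as follows. This is a minimal illustration, assuming the per-column formula L[sᵢ, i] = Θ[sᵢ, sᵢ]⁻¹e₁ / √(e₁ᵀΘ[sᵢ, sᵢ]⁻¹e₁) from the KL-minimization literature; the helper name, the exponential kernel, and the dense test pattern are my own choices for the sketch, not the talk's near-linear-complexity algorithm.

```python
import numpy as np

def kl_optimal_inverse_cholesky(theta, pattern):
    """Columnwise closed-form KL-optimal sparse inverse-Cholesky factor
    (hypothetical helper name).

    pattern[i] is the sparsity set s_i of column i, with i as its first
    entry. Each column depends only on the submatrix theta[s_i, s_i],
    so the loop below is embarrassingly parallel.
    """
    n = theta.shape[0]
    L = np.zeros((n, n))
    for i, s in enumerate(pattern):      # iterations are independent
        block = theta[np.ix_(s, s)]
        col = np.linalg.solve(block, np.eye(len(s))[:, 0])
        L[s, i] = col / np.sqrt(col[0])  # col[0] = e_1^T block^{-1} e_1
    return L

# Sanity check with a dense lower-triangular pattern, for which the
# factorization is exact: L @ L.T recovers theta^{-1}. The kernel is an
# exponential (Ornstein-Uhlenbeck) covariance on random 1-D points.
rng = np.random.default_rng(0)
pts = np.sort(rng.uniform(0.0, 1.0, 30))
theta = np.exp(-np.abs(pts[:, None] - pts[None, :]) / 0.2) + 1e-9 * np.eye(30)
pattern = [list(range(i, 30)) for i in range(30)]
L = kl_optimal_inverse_cholesky(theta, pattern)
```

With a genuinely sparse pattern the same loop runs on small submatrices only, which is where the conditional-independence (screening) structure of finitely smooth Gaussian processes enters to give the near-linear complexity mentioned above.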

For more information, please contact Jolene Brink by phone at (626) 395-2813 or by email at [email protected], or visit the CMX website.