Keller Colloquium in Computing and Mathematical Sciences
Eigenvalues of random matrices play a central role in many areas of applied mathematics and computer science. Asymptotic random matrix theory has been immensely successful at precisely explaining the limiting spectra of large random matrices with independent entries (or other symmetries). For more general models in finite dimensions, the picture is less crystalline, but tools such as the "Matrix Chernoff Bound" give useful coarse bounds on the extreme eigenvalues. I will describe an object which shares features of both these regimes --- the expected characteristic polynomials of finite random matrices --- and which can be used to show that some of the sharp bounds from the former setting hold with non-negligible probability in the latter. The technique is based on certain interlacing relations between polynomials with all real roots; it is elementary and should be accessible to a general audience. Based on joint work with Adam Marcus and Daniel Spielman.
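As a small illustration of the kind of interlacing relation the abstract refers to (this is a hedged sketch of the classical Cauchy interlacing theorem for a principal submatrix, a standard special case, not the specific construction from the talk), the sketch below checks numerically that the eigenvalues of an (n-1)-by-(n-1) principal submatrix of a random symmetric matrix interlace those of the full matrix; the matrix `A` and the seed are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility (illustrative choice)
n = 6

# Build a random symmetric matrix; its characteristic polynomial has all real roots.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

# Eigenvalues of the full matrix and of the top-left principal submatrix,
# both in ascending order (eigvalsh returns sorted eigenvalues).
full = np.linalg.eigvalsh(A)
sub = np.linalg.eigvalsh(A[:-1, :-1])

# Cauchy interlacing: full[k] <= sub[k] <= full[k+1] for k = 0, ..., n-2.
interlaces = all(full[k] <= sub[k] <= full[k + 1] for k in range(n - 1))
print("interlacing holds:", interlaces)
```

Interlacing of this type is what lets one pass from a family of real-rooted characteristic polynomials to conclusions about their average, since an interlacing family guarantees that some member has its largest root bounded by that of the expected polynomial.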