CMX Student/Postdoc Seminar
High-dimensional Bayesian inference problems pose a long-standing challenge for sample generation, especially when the posterior has multiple modes. For a wide class of Bayesian inference problems whose forward model has a multiscale structure, such that a coarse-scale, low-dimensional surrogate can approximate the original fine-scale, high-dimensional problem well, we propose to train a Multiscale Invertible Generative Network (MsIGN) for sample generation. We approximate the fine-scale posterior distribution by a fine-scale surrogate that decouples into the coarse-scale posterior and a conditional distribution under the prior. A novel prior conditioning layer is then designed to model this conditional distribution and bridge the scales, enabling coarse-to-fine multi-stage training. The fine-scale surrogate is further refined by the invertible generative network, and to avoid missing modes, we adopt the Jeffreys divergence as the training objective. On two high-dimensional Bayesian inverse problems, MsIGN approximates the posterior accurately and clearly captures multiple modes, showing superior performance compared with previous deep generative network approaches. On the natural image synthesis task, MsIGN achieves superior performance in bits-per-dimension among our baselines and yields highly interpretable neurons in its intermediate layers.
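As background for the training objective mentioned above: the Jeffreys divergence is the symmetrized Kullback-Leibler divergence, J(p, q) = KL(p || q) + KL(q || p). The forward term KL(p || q) blows up when q assigns little mass to a region where p has a mode, which is why this objective discourages mode missing. The sketch below is an illustrative toy computation on discrete distributions, not the authors' training code; the function names and the example distributions are invented for this example.

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions.
    # Terms with p[i] == 0 contribute 0 by the convention 0 * log 0 = 0.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys(p, q):
    # Jeffreys divergence: symmetrized KL, J(p, q) = KL(p || q) + KL(q || p).
    return kl(p, q) + kl(q, p)

# Toy example: a bimodal target p and a unimodal approximation q that
# misses both modes; the forward KL term penalizes this heavily.
p = np.array([0.45, 0.05, 0.05, 0.45])
q = np.array([0.10, 0.40, 0.40, 0.10])
print(jeffreys(p, q))
```

Unlike the plain forward or reverse KL, the Jeffreys divergence is symmetric in its arguments, so minimizing it penalizes both a q that misses modes of p and a q that spreads mass where p has none.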