Special ACM Seminar
Variational Bayes (VB) is a popular alternative to Monte Carlo (MC) simulation and its variants for estimating the posterior density functions of model parameters. Its popularity stems from the fact that the posterior densities are obtained by fast gradient-based optimization, as opposed to the comparatively inefficient sampling procedures of MC methods. In the treatment of Bayesian neural networks, however, VB approaches to posterior density estimation lead to intractable integrals that are typically approximated via MC integration. In this talk, I will present an approach to approximating the posterior density functions of the parameters of a simple Bayesian neural network with one hidden layer using fast approximate numerical integration techniques that do not require MC sampling.
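As a toy illustration of the idea (not the speaker's actual method), consider the Gaussian expectation of a sigmoid nonlinearity, an integral of the kind that arises when propagating a Gaussian posterior through a neural-network activation. The sketch below, using names and parameter values chosen purely for demonstration, contrasts a plain MC estimate with deterministic Gauss-Hermite quadrature, which needs only a handful of function evaluations instead of thousands of samples:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical Gaussian posterior over a single weight: w ~ N(mu, sigma^2).
mu, sigma = 0.5, 1.2

# Monte Carlo estimate of E[sigmoid(w)]: error decays only as O(1/sqrt(N)).
rng = np.random.default_rng(0)
mc_estimate = sigmoid(rng.normal(mu, sigma, size=100_000)).mean()

# Gauss-Hermite quadrature: the substitution w = mu + sqrt(2)*sigma*x maps
# the Gaussian expectation to (1/sqrt(pi)) * sum_i w_i * f(x_i) over the
# quadrature nodes x_i with weights w_i -- here just 20 evaluations.
nodes, weights = np.polynomial.hermite.hermgauss(20)
gh_estimate = np.sum(
    weights * sigmoid(mu + np.sqrt(2.0) * sigma * nodes)
) / np.sqrt(np.pi)

print(f"MC (100k samples):      {mc_estimate:.6f}")
print(f"Gauss-Hermite (20 pts): {gh_estimate:.6f}")
```

The two estimates agree closely, but the quadrature result is deterministic and orders of magnitude cheaper, which is the general trade-off the talk exploits.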