Abstract: There are two main strategies for Bayesian inference. Markov chain Monte Carlo (MCMC) draws samples via random walks, while variational inference (VI) finds the best approximation of the posterior within some tractable family. The two are often seen as a tradeoff: MCMC is the gold standard for accuracy, while VI is faster but less accurate. This talk will describe some recent methods that integrate Monte Carlo methods into VI, including importance sampling, sequential Monte Carlo, annealed importance sampling, and Hamiltonian Monte Carlo. The fundamental idea is to leverage randomness either to enlarge the variational family or to bring it closer to the posterior, or both.
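To illustrate the importance-sampling idea mentioned above, here is a minimal sketch (not from the talk; the model, variational family, and all parameter values are illustrative assumptions): averaging K importance weights inside the log yields a bound on log p(x) that tightens as K grows, which can be read as implicitly enlarging the variational family.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(0)

# Toy conjugate model (assumed for illustration):
#   prior      z ~ N(0, 1)
#   likelihood x | z ~ N(z, 1)
# so the marginal is x ~ N(0, 2) and log p(x) is known exactly.
x = 1.0
true_logpx = norm.logpdf(x, loc=0.0, scale=np.sqrt(2.0))

# A deliberately crude variational proposal q(z) = N(mu, sigma^2).
mu, sigma = 0.0, 1.0

def iw_elbo(K, n_rep=2000):
    """Monte Carlo estimate of the importance-weighted bound
    E[ log (1/K) sum_k p(x, z_k) / q(z_k) ],  z_k ~ q."""
    z = rng.normal(mu, sigma, size=(n_rep, K))
    logw = (norm.logpdf(z, 0.0, 1.0)          # log prior
            + norm.logpdf(x, z, 1.0)          # log likelihood
            - norm.logpdf(z, mu, sigma))      # minus log proposal
    return np.mean(logsumexp(logw, axis=1) - np.log(K))

b1 = iw_elbo(K=1)      # the standard ELBO
b100 = iw_elbo(K=100)  # importance-weighted bound, much tighter
# Both are lower bounds on true_logpx; b100 is closer to it than b1.
```

With K = 1 this reduces to the ordinary ELBO of VI; larger K recovers a strictly tighter bound even though q itself is unchanged, which is one concrete sense in which Monte Carlo "enlarges" the variational family.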
Justin Domke is a professor at the University of Massachusetts Amherst.
Time and place: Room T2 in the CS building at Aalto, 14:15-15:00, and also on Zoom: https://aalto.zoom.us/j/61438240521