Markov chain Monte Carlo (MCMC) provides the dominant methodology for inference over statistical models with non-conjugate priors. Despite a wealth of theoretical characterisation of mixing times, geometric ergodicity, and asymptotic step sizes, the design and implementation of MCMC methods remains something of an engineering art form. Addressing this issue in a systematic manner leads one to consider the geometry of probability distributions, as has been done previously in the study of, e.g., higher-order efficiency of statistical estimators. By exploiting the natural Riemannian geometry of probability distributions, MCMC proposal mechanisms based on Langevin diffusions, characterised by the metric tensor and its associated manifold connections, are proposed and studied. Furthermore, optimal proposals that follow the geodesic paths induced by the metric are defined via the Hamilton-Jacobi approach and are evaluated empirically on a number of challenging modern inference tasks.
Keywords: MCMC; Riemann manifold; Bayesian inference; Computational statistics
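To illustrate the flavour of metric-informed Langevin proposals mentioned above, the following is a minimal sketch of a Metropolis-adjusted Langevin sampler preconditioned by a *constant* metric tensor G. This is a deliberate simplification: with a position-dependent metric, the manifold Langevin diffusion acquires additional connection (Christoffel) correction terms, which vanish when G is constant. All function names and the example target below are illustrative, not taken from the abstract.

```python
import numpy as np

def preconditioned_mala(log_post, grad_log_post, G, theta0,
                        eps=0.5, n_iter=5000, rng=None):
    """Metropolis-adjusted Langevin sampling with a fixed metric tensor G.

    Proposal: theta' = theta + (eps^2 / 2) G^{-1} grad log pi(theta)
                      + eps * G^{-1/2} z,  z ~ N(0, I),
    followed by a Metropolis-Hastings accept/reject correction.
    """
    rng = np.random.default_rng(rng)
    Ginv = np.linalg.inv(G)
    L = np.linalg.cholesky(Ginv)          # G^{-1} = L L^T, for drawing noise

    def drift(theta):                     # mean of the Langevin proposal
        return theta + 0.5 * eps**2 * (Ginv @ grad_log_post(theta))

    def log_q(x, mu):                     # log N(x | mu, eps^2 G^{-1}), up to a constant
        d = x - mu
        return -0.5 * (d @ (G @ d)) / eps**2

    theta = np.asarray(theta0, dtype=float)
    samples, accepted = [], 0
    for _ in range(n_iter):
        prop = drift(theta) + eps * (L @ rng.standard_normal(theta.size))
        # Asymmetric proposal, so the Hastings ratio needs both q terms.
        log_alpha = (log_post(prop) - log_post(theta)
                     + log_q(theta, drift(prop)) - log_q(prop, drift(theta)))
        if np.log(rng.uniform()) < log_alpha:
            theta, accepted = prop, accepted + 1
        samples.append(theta)
    return np.array(samples), accepted / n_iter

# Usage: a strongly correlated Gaussian target, with the precision matrix
# playing the role of the (here exact) metric tensor.
Sigma = np.array([[1.0, 0.9], [0.9, 1.0]])
P = np.linalg.inv(Sigma)
log_post = lambda t: -0.5 * t @ P @ t
grad_log_post = lambda t: -P @ t
samples, rate = preconditioned_mala(log_post, grad_log_post, P,
                                    np.zeros(2), eps=0.8, n_iter=20000, rng=0)
```

Using the target's precision matrix as G rescales the proposal to the local shape of the distribution, which is precisely the intuition the abstract generalises to position-dependent metrics and geodesic flows.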
Biography: Mark Girolami holds a Chair of Statistics in the Department of Statistical Science at UCL, where he is Director of the Centre for Computational Statistics and Machine Learning and an Adjunct Professor in the Department of Computer Science. Prior to joining UCL, Mark held a Chair in Computing and Inferential Science at the University of Glasgow.