Mark Girolami sends along this article, “Riemann Manifold Langevin and Hamiltonian Monte Carlo,” by Ben Calderhead, Siu Chin, and himself:
This paper proposes Metropolis-adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlations. The methods provide fully automated adaptation mechanisms that circumvent the costly pilot runs required to tune proposal densities for Metropolis-Hastings, or indeed Hamiltonian Monte Carlo and Metropolis-Adjusted Langevin Algorithms. This allows for highly efficient sampling even in very high dimensions, where different scalings may be required for the transient and stationary phases of the Markov chain. The proposed methodology exploits the Riemannian geometry of the parameter space of statistical models and thus automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density. The performance of these Riemannian Manifold Monte Carlo methods is rigorously assessed by performing inference on logistic regression models, log-Gaussian Cox point processes, stochastic volatility models, and Bayesian estimation of dynamical systems described by nonlinear differential equations. Substantial improvements in the time-normalised Effective Sample Size are reported when compared to alternative sampling approaches.
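For readers who haven't seen the Langevin side of this before: the Metropolis-adjusted Langevin algorithm (MALA) that the paper builds on proposes moves by drifting along the gradient of the log-density and then applies a Metropolis correction for the discretization error. Here's a minimal sketch in plain (Euclidean) form — not the authors' Riemannian-manifold version, which additionally preconditions the drift and noise with a position-dependent metric — on a hypothetical toy target:

```python
import numpy as np

def mala_sample(log_p, grad_log_p, x0, eps=0.1, n_steps=5000, seed=0):
    """Metropolis-adjusted Langevin: gradient drift plus Gaussian noise,
    with a Metropolis accept/reject step to correct the discretization."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples, accepts = [], 0
    for _ in range(n_steps):
        # Langevin proposal: x' = x + (eps^2/2) grad log p(x) + eps * noise
        mu_x = x + 0.5 * eps**2 * grad_log_p(x)
        prop = mu_x + eps * rng.standard_normal(x.shape)
        mu_prop = prop + 0.5 * eps**2 * grad_log_p(prop)
        # Proposal is asymmetric, so include the q-ratio in the MH test
        log_q_rev = -np.sum((x - mu_prop) ** 2) / (2 * eps**2)
        log_q_fwd = -np.sum((prop - mu_x) ** 2) / (2 * eps**2)
        log_alpha = log_p(prop) - log_p(x) + log_q_rev - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x, accepts = prop, accepts + 1
        samples.append(x.copy())
    return np.array(samples), accepts / n_steps

# Toy target: a correlated 2-D Gaussian with precision matrix P
P = np.array([[2.0, 1.5], [1.5, 2.0]])
log_p = lambda x: -0.5 * x @ P @ x
grad_log_p = lambda x: -P @ x
samples, acc_rate = mala_sample(log_p, grad_log_p, np.zeros(2), eps=0.5)
```

The correlation in `P` is exactly the kind of structure that makes a single scalar step size `eps` awkward here — the Riemannian version in the paper replaces it with a local metric (e.g., the expected Fisher information) so the chain adapts automatically.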
Cool! And they have Matlab code so you can go try it out yourself. If anybody out there knows more about this (I’m looking at you, Radford and Christian), please let us know. I care a lot about this right now because we’re starting a big project on Bayesian computation for hierarchical regression models with deep interactions.
P.S. The tables are ugly but I forgive the authors because their graphs are so pretty; for example: