You can get a taste of Hamiltonian Monte Carlo (HMC) by reading the very gentle introduction in David MacKay’s general text on information theory:

- MacKay, D. 2003.
*Information Theory, Inference, and Learning Algorithms*. Cambridge University Press. [see Chapter 31, which is relatively standalone and can be downloaded separately.]

Follow this up with Radford Neal’s much more thorough introduction to HMC:

- Neal, R. 2011. MCMC Using Hamiltonian Dynamics. In Brooks, Gelman, Jones and Meng, eds.,
*Handbook of Markov Chain Monte Carlo*. Chapman and Hall/CRC Press.
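To make the algorithm Neal describes concrete, here is a minimal sketch of a single HMC transition with leapfrog integration. It assumes the target is supplied as a log density `log_prob` and its gradient `log_prob_grad` (the names and signatures are mine, not from any particular library), and it omits the refinements Neal covers, such as step-size jittering and trajectory-length tuning.

```python
import numpy as np

def hmc_sample(log_prob_grad, log_prob, current_q,
               n_leapfrog=20, step_size=0.1, rng=None):
    """One HMC transition: sample momentum, run leapfrog, accept/reject."""
    rng = np.random.default_rng() if rng is None else rng
    q = current_q.copy()
    p = rng.standard_normal(q.shape)      # auxiliary Gaussian momentum
    current_p = p.copy()

    # Leapfrog integration of the Hamiltonian dynamics
    p = p + 0.5 * step_size * log_prob_grad(q)   # initial half step (momentum)
    for _ in range(n_leapfrog - 1):
        q = q + step_size * p                    # full step (position)
        p = p + step_size * log_prob_grad(q)     # full step (momentum)
    q = q + step_size * p
    p = p + 0.5 * step_size * log_prob_grad(q)   # final half step (momentum)

    # Metropolis correction using the Hamiltonian (negative log joint)
    current_H = -log_prob(current_q) + 0.5 * current_p @ current_p
    proposed_H = -log_prob(q) + 0.5 * p @ p
    if np.log(rng.uniform()) < current_H - proposed_H:
        return q          # accept proposal
    return current_q      # reject; stay at current state
```

For a standard normal target, `log_prob = lambda q: -0.5 * q @ q` with gradient `lambda q: -q`, and iterating `hmc_sample` yields draws whose sample mean and standard deviation approach 0 and 1.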

To understand why HMC works and set yourself on the path to understanding generalizations like Riemann manifold HMC, you’ll need to know a bit about differential geometry. I really liked the combination of these two books:

- Magnus, J. R. and H. Neudecker. 2007.
*Matrix Differential Calculus with Applications in Statistics and Econometrics*. 3rd Edition. Wiley?

and

- Leimkuhler, B. and S. Reich. 2005.
*Simulating Hamiltonian Dynamics*. Cambridge University Press.

As a bonus, Magnus and Neudecker also provide an excellent introduction to matrix algebra and real analysis before mashing them up. The question mark after “Wiley” is there because the preface says the third edition is self-published, copyrighted by the authors, and available from the first author’s home page. It’s no longer available on Magnus’s home page, nor is it for sale by Wiley. It can be found in PDF form on the web, though; try Googling [matrix differential calculus magnus].

That’s chapter 30 in MacKay, not 31 — at least in my hardcopy; I don’t know if the downloadable version is different.

Thanks for pointing that out. The link on MacKay’s site is “chapter 31”, but it points to chapter 30.