Ted Dunning points me to this article by J. Andres Christen and Colin Fox:
We [Christen and Fox] develop a new general purpose MCMC sampler for arbitrary continuous distributions that requires no tuning. . . . The t-walk maintains two independent points in the sample space, and all moves are based on proposals that are then accepted with a standard Metropolis-Hastings acceptance probability on the product space. . . . In a series of test problems across dimensions we find that the t-walk is only a small factor less efficient than optimally tuned algorithms, but significantly outperforms general random-walk M-H samplers that are not tuned for specific problems. . . . Several examples are presented showing good mixing and convergence characteristics, varying in dimensions from 1 to 200 and with radically different scale and correlation structure, using exactly the same sampler. The t-walk is available for R, Python, MatLab and C++ at http://www.cimat.mx/~jac/twalk/.
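For readers curious about the mechanics, here is a minimal sketch (Python with NumPy) of the product-space idea the abstract describes: two points are maintained, and each proposal for one point borrows its per-coordinate scale from its distance to the other, so no step-size knob is exposed. To be clear, this is not the t-walk itself — the actual algorithm uses a specific set of carefully designed scale-free moves (see the paper and the packages at the URL above). The proposal below is a hypothetical stand-in, with the full Metropolis-Hastings correction made explicit, and the standard-normal target is just a placeholder.

```python
import numpy as np

def log_target(x):
    # Placeholder target for illustration: a standard multivariate normal.
    # Any log-density of a continuous distribution works here.
    return -0.5 * np.dot(x, x)

def two_point_mh(log_target, x0, y0, n_iter=40000, s=1.0, eps=1e-12, seed=1):
    """Two coupled points (x, y), each updated by Metropolis-Hastings on the
    product target pi(x)pi(y).  The proposal scale for one point is taken
    from its per-coordinate distance to the other point, so the sampler
    inherits the target's scale rather than needing a tuned step size.
    Illustration only -- NOT the t-walk's actual move set."""
    rng = np.random.default_rng(seed)
    x, y = np.array(x0, dtype=float), np.array(y0, dtype=float)
    lx, ly = log_target(x), log_target(y)
    draws = np.empty((n_iter, x.size))

    def log_q(to, frm, companion):
        # Log density of proposing `to` from `frm`; the per-coordinate
        # standard deviation comes from the distance to the companion point.
        sd = s * (np.abs(frm - companion) + eps)
        return np.sum(-np.log(sd) - 0.5 * ((to - frm) / sd) ** 2)

    for t in range(n_iter):
        if rng.random() < 0.5:          # update x, holding y fixed
            sd = s * (np.abs(x - y) + eps)
            xp = x + sd * rng.standard_normal(x.size)
            lp = log_target(xp)
            # Full Hastings ratio: the reverse move's scale is |x' - y|,
            # not |x - y|, so the proposal is asymmetric.
            if np.log(rng.random()) < lp - lx + log_q(x, xp, y) - log_q(xp, x, y):
                x, lx = xp, lp
        else:                            # update y, holding x fixed
            sd = s * (np.abs(y - x) + eps)
            yp = y + sd * rng.standard_normal(y.size)
            lp = log_target(yp)
            if np.log(rng.random()) < lp - ly + log_q(y, yp, x) - log_q(yp, y, x):
                y, ly = yp, lp
        draws[t] = x                     # record the x-chain only
    return draws

draws = two_point_mh(log_target, x0=np.array([3.0, -3.0]), y0=np.array([-3.0, 3.0]))
burned = draws[10000:]
print(np.round(burned.mean(axis=0), 2))  # posterior mean estimate (true value: 0)
print(np.round(burned.std(axis=0), 2))   # posterior sd estimate (true value: 1)
```

Note that the Hastings correction is not optional here: the reverse proposal's scale is |x′ − y| rather than |x − y|, so dropping the `log_q` terms would bias the sampler.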
This looks pretty cool to me! I asked Christian Robert what he thought, and he referred me to this blog entry of his, where he takes a somewhat skeptical, wait-and-see attitude:
The proposal of the authors is certainly interesting and widely applicable but to cover arbitrary distributions in arbitrary dimensions with no tuning and great performances sounds too much like marketing . . . there is no particular result in the paper showing an improvement in convergence time over more traditional samplers. . . . Since the authors developed a complete set of computer packages, including one in R, I figure people will start to test the method to check for possible improvement over the existing solutions. If the t-walk is indeed superior sui generis, we should hear more about it in the near future…
I have a lot of big models to fit, so I expect we’ll be trying out many of these different methods during the next year or so.
P.S. Update here.