Follow the Amazon link and check to see if it’s still on sale.
P.S. I don’t make any money through this link. We do get some royalties from the book, but only a very small amount. I’m pushing the Amazon link right now because (a) I think the book is great, and I want as many people as possible to have it, and (b) 40% off is a pretty good deal and I don’t know how long this will last.
P.P.S. Just so this post has some statistical content, here’s one of my favorite papers, Bayesian model-building by pure thought: some principles and examples. It’s from 1996, and here’s the abstract:
A kindle edition would be nice. Thanks.
It’s a really good article, but I didn’t get this quote:
“In this article, we discuss other ways in which probability models can be evaluated, using a variety of invariance principles, before seeing any data. (We do not go as far as some maximum entropy theorists (e.g. Skilling (1988)) who seek not just to restrict a model class, but actually to specify a model based on theoretical principles only.)”
Invariance principles imply a group of transformations. Whether they restrict possibilities to a class of models or to a specific model depends on properties of the group and isn’t up to the statistician.
Also, it’s interesting to note that in all my physics education, I never encountered a single probability distribution that was derived from data. They were all derived from theoretical considerations. That’s probably the biggest reason physicists can get by without taking statistics courses, which most never do.
http://en.wikipedia.org/wiki/Titius%E2%80%93Bode_law
Astronomy isn’t physics, but it’s a lot closer than, say, economics. The Titius-Bode law (inaccurate though it proved to be for n > 7) still has no obvious theoretical source that I’m aware of.
Also, weren’t there a fair number of 17th-19th Century physics laws that started as pure observation and then got a theoretical model much later? Just because we teach the theoretical model first doesn’t mean that that was the way the discovery worked.
It’s not really a “probability distribution” though is it? It’s more of a formula predicting how far out the nth planet from a star should be relative to some relevant scale.
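To make the “formula, not a distribution” point concrete, here is a short sketch of the Titius-Bode law in its standard published form, a = 0.4 + 0.3 · 2^m AU (with m = −∞ for Mercury, then 0, 1, 2, …), compared against approximate actual semi-major axes; note how the prediction fails at Neptune:

```python
def titius_bode(m):
    """Predicted semi-major axis in AU; m = None encodes the m = -infinity term (Mercury)."""
    return 0.4 if m is None else 0.4 + 0.3 * 2 ** m

# (name, m, approximate actual semi-major axis in AU)
bodies = [
    ("Mercury", None, 0.39),
    ("Venus",   0,    0.72),
    ("Earth",   1,    1.00),
    ("Mars",    2,    1.52),
    ("Ceres",   3,    2.77),
    ("Jupiter", 4,    5.20),
    ("Saturn",  5,    9.58),
    ("Uranus",  6,   19.2),
    ("Neptune", 7,   30.1),   # predicted 38.8 -- where the law breaks down
]

for name, m, actual in bodies:
    print(f"{name:8s} predicted {titius_bode(m):5.1f}  actual {actual:5.1f}")
```

The law fits remarkably well through Uranus (predicted 19.6 vs. actual 19.2) and then misses Neptune badly (predicted 38.8 vs. actual 30.1), which is the n > 7 failure mentioned above.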
Sure. But the errors of the Titius-Bode “model” can be given a data-based distribution.
I realize that, for example, the normal distribution underlying, say, Brownian motion comes from first principles. But I took Andrew’s point to be that whenever the error distribution can’t come from first principles, it has to come from the data.
In economics, we specify logit models on the assumption that latent underlying utility follows an extreme value distribution. That assumption is completely unmotivated (and wrong, but useful), of course; at least the Titius-Bode error distribution could be derived from observations.
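The extreme-value assumption in the comment above can be checked by simulation: the difference of two independent standard Gumbel draws is logistic, so choice probabilities come out in the familiar logit form. A minimal sketch (the systematic utilities v1 and v0 are arbitrary made-up numbers):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
v1, v0 = 1.2, 0.5  # hypothetical systematic utilities for two alternatives

# Add independent standard Gumbel (type I extreme value) noise to each utility
u1 = v1 + rng.gumbel(size=n)
u0 = v0 + rng.gumbel(size=n)

# Empirical probability that alternative 1 is chosen
p_sim = (u1 > u0).mean()

# Closed-form logit probability: 1 / (1 + exp(-(v1 - v0)))
p_logit = 1 / (1 + np.exp(-(v1 - v0)))

print(p_sim, p_logit)  # the two should agree to roughly two decimal places
```

This is exactly why the Gumbel assumption is “useful” despite being unmotivated: it is the one error distribution that makes the choice probabilities come out in closed form.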
Joseph:
Here’s what I meant. In my paper I used theoretical principles (self-consistency under rescaling, etc) to restrict certain model classes (for example, to restrict the weighting functions in a spatial smoother, so that a smoother that is a simple average of a point and its eight nearest neighbors is not allowed). But I did not attempt to use theoretical principles to choose the parametric form of the model class, just to make some restrictions within a given parametric form. In contrast, the max-ent people, following Jaynes, attempt to choose the parametric form itself based on their invariance principles.
The offer is still available… “Estimated delivery November 20-22”
It is still there!! And bought! :D
Unfortunately no deal is offered on the European sites of Amazon. Should we expect one closer to the release date?
So I’ve been meaning to ask: with the third edition of BDA coming out very soon, does that mean we can look forward to a new edition of multilevel modeling?
I think Andrew may write another regression book before revising Gelman and Hill. We are rewriting the Gelman and Hill regression book’s models in Stan. They’re in progress on GitHub at: https://github.com/stan-dev/rstanarm
I can’t wait until I’m surprised by my own gift!