David MacKay

I learned from this comment that David MacKay has passed away. Here’s an obituary, which has a lot of information, really much more than I could give because I only met MacKay a couple of times. The first time was when I was in Cambridge, England, for a conference, and I got there a day early and was walking in the park and came across some people playing frisbee, so I joined in. David was one of the leaders of the group and was spending a lot of time helping some of the younger and inexperienced players. The second time was at the conference itself, of course; it was only there that I realized the David from the frisbee game was this important scientist.

David had certain traits that I associate with physicists who go into statistics; some of this came up in our discussion on Occam’s razor. His book on information theory and inference was fun to read in part because he was rediscovering and deriving things from scratch, so we get to see the joy of discovery all over again, in ideas we had seen in other contexts. As the obituary notes, MacKay made many contributions in computer science and in recent years had moved into climate change education and policy. And here’s the most British thing about him.

I had heard that David was seriously ill so his death did not come as a surprise but it still saddens me greatly. The world would’ve benefited from another 48 years of him.

6 thoughts on “David MacKay”

  1. Yes, truly sad.
    He’d sent me a copy of his book (because he liked some blog comments), which was very good.
    Then, he gave a lecture at Stanford and I got to talk to him for a while.

    I’d heard he was (terminally) ill about 6 months ago, but also that he was quite open and blogging about it, right to the end.

  2. Recently I’ve been reading a lot of his 90’s work on the evidence framework, ARD, and Bayesian neural networks. He’s had an incredible influence in the community and has impacted much of my work as well. I find his most recent work on sustainable energy admirable for bringing scientific thought to policy makers and, as a student, I find it motivating that he could make such big contributions to a separate field. I hope the best for his family.

  3. I never had a chance to meet David MacKay, but I am very sad at his passing. David’s info-theory and inference textbook is a marvel. It’s not only a great resource for learning the math behind inference and machine learning, but it’s so readable that I accidentally ended up learning about coding and compression too. It’s free as a PDF, and it’s highly recommended.

  4. Andrew, I’m deeply saddened by this news, but grateful that you posted it; I may not have heard otherwise for some time.

    David and I met as new PhDs at MaxEnt meetings ca. 1990. We were learning Bayes at the same time, from the writings of Ed Jaynes, the Cambridge Bayesians (Gull & Skilling), and Larry Bretthorst (I’d added Berger, Zellner, Lindley, Good, etc., to my personal mix at the time). We were both equally full of excitement about it, and eagerly read each other’s early work. I still have the bound copy of his thesis that he sent me as a token of thanks for comments on drafts of some chapters.

    We kept in touch through the 90s, but only occasionally in the last 10-15 years. Despite a lapse in correspondence, he thoughtfully sent me a copy of his “hot air” book (triggering a reconnection). It was a few years ago that we last corresponded, and I was totally unaware of his cancer struggle until learning of his death here. It’s heartbreaking news.

    David was brilliant, energetic, inventive, and enthusiastic, and he has left behind a strong legacy, both in his published work and in the fine grad students that came out of his group. I’ll very much miss his influence on the field, and our always fun and educational, if lately rare, interactions (including at least one shared ultimate frisbee game).

    You described David as among “physicists who go into statistics.” In that community he was a bit of an outlier, in that his undergrad training was in physics, but his PhD was from the Caltech Computation and Neural Systems degree program. So he had some formal training in the information sciences, albeit not strictly in statistics.

    With your mention of David’s Occam’s razor ideas, I can’t resist briefly indulging in some Bayesian discussion in David’s honor.

    I once asked David how he came to his understanding of the Bayesian Occam’s razor, and he told me he was just generalizing from my 1989 MaxEnt paper (a review of Bayesian ideas for astronomers). That’s not a boast; there, all I was doing was passing along ideas I learned from Ed Jaynes and Steve Gull. In particular, in my opinion Ed was the first to clearly explain the so-called Bayesian Occam’s razor. I think his early contribution to this line of thought isn’t appreciated, because it appears in a 1979 book review:

    Reviewed Work: Inference, Method, and Decision: Towards a Bayesian Philosophy of Science by Roger D. Rosenkrantz
    Review by: Edwin T. Jaynes
    Journal of the American Statistical Association
    Vol. 74, No. 367 (Sep., 1979), pp. 740-741
    http://www.jstor.org/stable/2287026

    I say “the so-called Bayesian Occam’s razor” (even though I use the “Occam’s razor” expression myself) because already in 1979 Ed appreciated that marginal likelihoods don’t fully accord with a simple interpretation of William of Ockham’s rule. He starts,

    “Let us, however, examine only the most obvious example: that a hypothesis with fewer parameters seems intuitively simpler. It is interesting to see the mechanism by which Bayes’s theorem usually justifies—but in some cases modifies—this intuition.”

    After describing a setting where Bayesian model comparison performs in an intuitively Occam-esque manner, he writes,

    “But, having seen this mechanism, it is easy to invent cases (e.g., if introduction of the new parameter is accompanied by a drastic redistribution of marginal prior probability on the subspace of the old model) in which Bayes’s theorem will contradict Ockham because it is taking into account further circumstances undreamt of in Ockham’s philosophy…. It would be interesting to see how Ockham’s principle could be justified in orthodox statistical theory, which does not admit a prior probability for a hypothesis. Even in Bayesian theory, the question is subtle enough to have caused trouble.”
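    The mechanism in that first quote is easy to see in a toy computation. Here is a minimal sketch (my own illustration, not from Jaynes’s review or David’s book): compare the marginal likelihood of k heads in n coin flips under a zero-parameter model M0 (fair coin) versus a one-parameter model M1 (bias p with a uniform prior). M1 can fit any outcome, but it spreads its prior probability over all values of p, so when the data look fair, the simpler model earns the higher marginal likelihood.

    ```python
    from math import comb

    def ml_fair(n, k):
        # Marginal likelihood under M0: p = 0.5 fixed, no free parameters.
        return comb(n, k) * 0.5 ** n

    def ml_uniform(n, k):
        # Marginal likelihood under M1: p ~ Uniform(0, 1).
        # Integral of C(n,k) p^k (1-p)^(n-k) dp = C(n,k) * B(k+1, n-k+1) = 1/(n+1),
        # the same for every k: the flexible model dilutes its predictions.
        return 1.0 / (n + 1)

    # 5 heads in 10 flips: data look fair, so the simpler model wins.
    print(ml_fair(10, 5), ml_uniform(10, 5))   # ~0.246 vs ~0.091
    # 9 heads in 10 flips: data strain the fair-coin model, so M1 wins.
    print(ml_fair(10, 9), ml_uniform(10, 9))   # ~0.0098 vs ~0.091
    ```

    This automatic penalty on flexibility is the “razor”; Jaynes’s point in the second quote is that a sufficiently strange prior on the bigger model can overturn it.
    
    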

    Subtle enough that we’re still talking about it 700 years after Ockham, nearly a century after Jeffreys (whom Ed acknowledges in the review), decades after Jaynes, and a British dozen or so years after the insightful, quirky, and provocative Information theory, inference, and learning algorithms.
