
My wikipedia edit

The other day someone mentioned my complaint about the Wikipedia article on “Bayesian inference” (see footnote 1 of this article) and he said I should fix the Wikipedia entry myself.

And so I did. I didn’t have the energy to rewrite the whole article. In particular, all of its examples involve discrete parameters, whereas the Bayesian problems I work on generally have continuous parameters, and its “mathematical foundations” section focuses on “independent identically distributed observations x” rather than data y that can have different distributions. It’s just a wacky, unbalanced article. But I altered the first few paragraphs to get rid of the stuff about the posterior probability that a model is true.

I much prefer the Scholarpedia article on Bayesian statistics by David Spiegelhalter and Kenneth Rice, but I couldn’t bring myself to simply delete the Wikipedia article and replace it with the Scholarpedia content.

Just to be clear: I’m not at all trying to disparage the efforts of the Wikipedians. It’s only through putting stuff out there that it can be edited and improved.


  1. Adam Hyland says:

    If you convince Spiegelhalter and Rice to release that page under a GFDL-compatible license, you literally can replace chunks of the WP article with the Scholarpedia article. I actually just gave a presentation to the Chicago R Users group about improving WP content on statistics. The stats content right now is pretty spotty. I’m not surprised that a technically challenging but well-known (among academics and anyone tangentially involved in stats) subject like Bayesian inference would be in trouble.

  2. idiot says:

    Yeah, (also known as Andrew Gelman), I agree with you. According to this article ( ), it’s people like you, the non-core members of Wikipedia, who produce much of the content that the “core” editors improve upon. By editing and making your point clear, you are helping (not insulting) Wikipedia.

  3. jovo says:

    it seems like you wrote: “'''Bayesian inference''' is a method of [[statistical inference]] in which [[evidence]] is used to estimate parameters…” I’m a bit surprised by this, given that some of my hard-line “bayesian” friends claim, “bayesians don’t estimate parameters, they compute posteriors.” obviously, this is silly, because except in the most simple situations, at a minimum, bayesians *estimate* posteriors (using MCMC or variational inference, for example).

    • Andrew says:


      To me, a set of posterior simulations is a form of parameter estimate. It’s a distributional estimate, not a point estimate. But I don’t restrict the term “estimate” to point estimation.
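      A minimal sketch of what Andrew means, with hypothetical numbers: the collection of simulation draws itself serves as the estimate of the parameter’s posterior distribution, and point summaries are derived from it rather than being the estimate themselves. (Here the draws come from a known normal for illustration; in practice they would come from MCMC or similar.)

      ```python
      import random

      # Hypothetical posterior simulations of a parameter theta. In real use
      # these draws would come from MCMC; here we draw them from a normal
      # distribution purely for illustration.
      random.seed(1)
      sims = [random.gauss(1.46, 0.31) for _ in range(4000)]

      # The draws *are* the (distributional) estimate; point summaries
      # such as the median and a 50% interval are read off from them.
      sims.sort()
      post_median = sims[len(sims) // 2]
      interval_50 = (sims[len(sims) // 4], sims[3 * len(sims) // 4])
      print(post_median, interval_50)
      ```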

  4. Jonathan says:

    Andrew, I couldn’t agree more. I’ve had my eye on this article for a while and have meant to find time to sit down and go over it. (Although, I should add, I’m far from a statistics expert.)

    However, I don’t think it should immediately be replaced with the article on Scholarpedia. First, it’s good to have two different versions of the same information online. Second, the content of the various articles dealing with Bayesian statistics is currently being balanced, and some of the statistics articles generally are picking up a more consistent format. Finally, I think the Wikipedia article requires a slightly different approach with respect to its potential audience.

    I actually just spent some time this evening mainly revising the article structure. Still work to be done, of course.

    • K? O'Rourke says:

      I would agree, and stress that there cannot be one good article for all backgrounds.

      Actually, I posted here before arguing that many who would not get the “simple” continuous example would get the discrete examples in Wikipedia (and also complained about the ambiguity of epistemic/aleatory for random effects).

      Someone responded directly with an argument that continuity is needed to write about Bayesian statistics (as opposed to just Bayes’ theorem), though I fail to see why.

      Especially given what Stigler (JRSSA, 2010) has reported on Galton’s attempt with just a two-stage quincunx.
      (And is possibly talking about at Duke today.)


  5. greg says:

    Why not use ‘improve a wikipedia article’ as a class assignment? I think that is a nice and actually very useful assignment… I am sure wikipedia would benefit from the many smart students making changes, which might also get reviewed by the professor or TA.

  6. Av says:

    I don’t mean to defend the Wikipedia content (which does seem unbalanced), but the Scholarpedia article seems to be written for a more educated audience. For instance, its introductory sentence talks of “epistemological uncertainty”, and the third formula includes a gamma function without any explanation.

    Adam: Wikipedia only accepts CC-BY-SA licensed contributions (or more liberal licenses, e.g. CC-BY, or a dual CC-BY-SA & GFDL license). GFDL is not free enough on its own, and nor is CC-BY-NC-ND (which is the license Spiegelhalter and Rice have chosen for their article).

    Greg: it’s a nice idea, perhaps not that easy to grade though. There’s some relevant information at

    • K? O'Rourke says:

      As mentioned above, the argument seems to be that continuity is needed to write about Bayesian statistics sensibly so a certain facility/grasp of calculus and comfort with functions/distributions that happen to be conjugate is required.

      Those without this background are then thought not to be capable of grasping Bayesian statistics, and hence are not an audience to address.

      That Stigler has recently pointed out how far Galton got with informative Bayesian analyses using only a discrete machine offers some hope that the above argument will not continue to stand for much longer.

      (Also check out Charles Geyer’s web site on discrete approaches that are fully rigorous.)


      • Andrew says:


        I have no problem with discrete examples. I just don’t think all the examples should be discrete. The focus on discrete binary parameter spaces gives an unbalanced view of what Bayesian inference is like. Also, you can do a continuous example without calculus by just giving the weighted average formula. Finally, I think it’s appropriate for a wikipedia article to reach multiple audiences.
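        One way to read Andrew’s weighted-average point, sketched with hypothetical numbers: for a normal prior and normal data with known variances, the posterior mean is just a precision-weighted average of the prior mean and the data mean, so no calculus is needed to present it.

        ```python
        # Continuous Bayesian example with no calculus: the posterior mean is a
        # weighted average of the prior mean and the data mean, with weights
        # given by their precisions (inverse variances). Numbers are hypothetical.
        prior_mean, prior_sd = 0.0, 2.0        # prior: theta ~ Normal(0, 2)
        data_mean, data_sd, n = 1.5, 1.0, 10   # data summary: ybar = 1.5, sigma = 1, n = 10

        prior_precision = 1 / prior_sd**2
        data_precision = n / data_sd**2

        post_mean = (prior_precision * prior_mean + data_precision * data_mean) / (
            prior_precision + data_precision
        )
        post_sd = (prior_precision + data_precision) ** -0.5
        print(post_mean, post_sd)
        ```

        The posterior mean necessarily falls between the prior mean and the data mean, pulled toward whichever carries more precision.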

        • K? O'Rourke says:

          Andrew: Agree that not all examples should be discrete even if they could be and that there should be different material/approaches for different audiences.

          As for giving weighted-average formulas, it’s largely what I think that has done in meta-analysis that motivates me to find other, more general but transparent ways to show how statistics works.

  7. [...] Posted by Andrew on 25 November 2011, 2:22 pmI checked and somebody went in and screwed up my fixes to the wikipedia page on Bayesian inference. I give up. Filed under Bayesian Statistics [...]

  8. [...] Bayesian Statistics – Scholarpedia Entry (Recommended by Prof. Andrew Gelman) [...]