Bayes related

Dave Decker writes:

I’ve seen some Bayes related things recently that might make for interesting fodder on your blog.

There are two books, teaching Bayesian analysis from a programming perspective.

And also a “web application for data analysis using powerful Bayesian statistical methods.”


I took a look. The first book is Think Bayes: Bayesian Statistics Made Simple, by Allen B. Downey. It’s super readable and, amazingly, has approximately zero overlap with Bayesian Data Analysis. Downey discusses lots of little problems in a conversational way. In some ways it’s like an old-style math stat textbook (although with a programming rather than mathematical flavor) in that the examples are designed for simplicity rather than realism. I like it! Our book already exists; it’s good to have something else for people to read, coming from an entirely different perspective.
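
To give a flavor of Downey's approach, here is a toy discrete Bayesian update in Python, written in the spirit of his examples (the setup and numbers are my illustration, not code from the book): two bowls of cookies with different vanilla/chocolate mixes, a vanilla cookie is drawn, and we update our belief about which bowl it came from.

```python
# A toy discrete Bayesian update in the spirit of Think Bayes
# (illustrative numbers, not taken from the book).
# Bowl 1: 30 vanilla, 10 chocolate. Bowl 2: 20 vanilla, 20 chocolate.
# We draw one vanilla cookie; which bowl did it come from?
priors = {"bowl 1": 0.5, "bowl 2": 0.5}
likelihood_vanilla = {"bowl 1": 30 / 40, "bowl 2": 20 / 40}

# Bayes' rule: posterior is proportional to prior * likelihood, then normalize.
unnorm = {h: priors[h] * likelihood_vanilla[h] for h in priors}
total = sum(unnorm.values())
posterior = {h: p / total for h, p in unnorm.items()}
print(posterior)  # {'bowl 1': 0.6, 'bowl 2': 0.4}
```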

The second book is Probabilistic Programming and Bayesian Methods for Hackers, by Cameron Davidson-Pilon with contributions from many others. This book is a bit more polished and less conversational (which is good in some ways and bad in others; overall it’s a plus for book #2 to differ in some ways from book #1, so you can get something out of reading each of them), with more of a focus on business-analytics sorts of problems. I didn’t read this book in detail either, but from what I saw, I like it too!

My main criticism of both books is that they talk a lot about inference but not so much about model building or model checking (recall the three steps of Bayesian data analysis). I think it’s ok for an introductory book to focus on inference, which of course is central to the data-analytic process—but I’d like them to at least mention that Bayesian ideas arise in model building and model checking as well.
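
For concreteness, here is the kind of model-checking step I have in mind: a minimal posterior predictive check in Python (a toy beta-binomial setup of my own, not an example from either book). We draw replicated datasets from the posterior predictive distribution and see whether the observed data look typical of them.

```python
# A minimal posterior predictive check for a beta-binomial model
# (toy setup for illustration). Prior: theta ~ Beta(1, 1).
import numpy as np

rng = np.random.default_rng(1)
y, n = 7, 50  # observed successes out of n trials

# Conjugate posterior: theta | y ~ Beta(1 + y, 1 + n - y).
theta = rng.beta(1 + y, 1 + n - y, size=4000)

# Replicated data from the posterior predictive distribution.
y_rep = rng.binomial(n, theta)

# Posterior predictive p-value for the statistic T(y) = y itself;
# values near 0 or 1 would signal model misfit.
print("posterior predictive p-value:", np.mean(y_rep >= y))
```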

Finally I looked at BayesHive, which describes itself as “a web application for data analysis using powerful Bayesian statistical methods. We help you build statistical models for your entire dataset such that final decisions and measurements are based on all the available information.” This sounds great, but unfortunately the website has zero detail on what they are doing.

I guess that’s the for-profit, closed-source world for you. For Stan, everything is open and we tell you what we are trying to do and what algorithms we’re using. This thing is all a big mystery. Their documentation has things like, “BayesHive is what you get when you ask functional programmers to re-invent statistics and data analysis software.” Here are some details: “Using our Angular.UIRouter module, we can define a hierarchical set of user interface “states”, each of which has associated Hamlet files defining the user interface appearance and Julius files defining its behaviour. . . . The layout and behaviour definitions for all the states are collected together using Template Haskell code that traverses the directory tree of state definitions and constructs the state dispatch structure needed to initialise the Angular ui-router system. . . .” As Bob would say, for all we know, they’re running Stan under the hood. Once Stan becomes more of a standard, this should be a good thing, as developers of this sort of software will be able to focus on specific applications rather than having to write their own code for statistical modeling and inference.

8 thoughts on “Bayes related”

  1. For me the biggest distinguishing factor in favor of those two books was that they are freely available online. Very convenient. :)

  2. BayesHive is the interface and Baysig is the probabilistic programming language. The reference manual’s still a work in progress, so I couldn’t tell how they are doing inference.

    From their FAQ ( http://bayeshive.com/helppage/Frequently%20Asked%20Questions ):

    Q: What did you use to build BayesHive?

    A: BayesHive is built using Yesod, Stan, Haskell, node.js, AngularJS and numerous other packages.

    Q: How is Baysig Different from WinBugs/JAGS/Stan?

    A: Unlike WinBUGS, JAGS, and Stan, Baysig is a general-purpose programming language. This means that:

    * complex models can be built from components

    * new elementary probability distributions can be defined within the model

    * Post-inference calculations such as decisions can be expressed

    * There is support for plotting the model estimation results and/or the raw data

    * In addition, Baysig supports first-class ordinary and stochastic differential equations, which means you get inference in continuous dynamical systems.

    Given its basis in Haskell, it’s likely to be more like Church in flavor than like Stan. I’m very curious what they’re doing for ODEs.

    • By the way, Stan’s aiming in this direction.

      Stan’s a general-purpose language in the sense that it’s Turing complete:

      * We plan to add subroutines to Stan to allow more modular development.

      * Stan allows new probability distributions to be defined in the model.

      * Stan supports posterior inference in the generated quantities block, with functions of parameters and random quantities.

      * Stan itself doesn’t support plotting, but RStan does (and PyStan and MATStan will, too).

      * We are already working on the design for adding ODE solvers to Stan.

    • Thanks for your comments, Bob. Baysig should indeed feel like some intermediate between Church and WinBUGS/Stan. You should be able to state models in a language that is very close to Stan, and then keep on using that language for manipulating probability distributions.

      We do support ODEs but for Bayesian inference we find that stochastic differential equations often work best. We simply use the approximate likelihood based on Euler-Maruyama.
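
      Concretely, the Euler-Maruyama approximation treats each increment of the SDE dX = f(X) dt + g(X) dW as Gaussian, so the path log-likelihood is a sum of normal transition densities. Here is a minimal NumPy sketch of that idea (my illustration, not BayesHive’s code), using an Ornstein-Uhlenbeck process as the example:

      ```python
      # Euler-Maruyama approximate log-likelihood for an observed path of
      # dX = drift(X) dt + diffusion(X) dW (illustrative sketch only).
      import numpy as np

      def euler_maruyama_loglik(x, dt, drift, diffusion):
          x0, x1 = x[:-1], x[1:]
          mean = x0 + drift(x0) * dt       # Euler step for the drift
          var = diffusion(x0) ** 2 * dt    # variance of each increment
          resid = x1 - mean
          return np.sum(-0.5 * (np.log(2 * np.pi * var) + resid ** 2 / var))

      # Simulate an Ornstein-Uhlenbeck path: dX = -theta*X dt + sigma dW.
      theta, sigma, dt = 0.5, 0.3, 0.01
      rng = np.random.default_rng(0)
      x = np.empty(1000)
      x[0] = 1.0
      for i in range(1, len(x)):
          x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.normal()

      print(euler_maruyama_loglik(x, dt, lambda s: -theta * s, lambda s: sigma))
      ```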

      The best place to read more about BayesHive and Baysig is probably the Help pages: http://bayeshive.com/help

  3. This is perhaps picking nits, but I just re-read Jaynes and can’t help myself. On the first page of “Think Bayes”:

    A probability is a number between 0 and 1 (including both) that represents a degree of belief in a fact or prediction. (…) The value 0.5, often written as 50%, means that a predicted outcome is as likely to happen as not

    It seems this is a classic instance of the mind-projection fallacy. For instance, I might have a belief of 0.5 that my wife will give birth to a boy when she is due next month, but the outcomes are certainly not equally likely to happen!

    Also related to the mind-projection fallacy, there is the superfluous use of the word “random” throughout the first chapter. On the first page:

    The U.S. population is about 311 million, so the probability that a randomly chosen American will have a heart attack in the next year is roughly 0.3%

    I think this could (and should) simply say:

    The U.S. population is about 311 million, so the probability that any single American will have a heart attack in the next year is roughly 0.3%

    which does not require one to speculate about which mechanism the author used to choose “randomly”.

    • 1) You seem to be interpreting “likely to happen” in a frequentist way – which is itself an example of the mind projection fallacy (likely from whose perspective?). In the Bayesian framework, if your belief is 0.5 then _from your perspective_ it is 50% likely to happen.

      2) I’ve found that statistics texts read much more sensibly if you consistently remind yourself that “random” means “unpredictable” rather than “non-deterministic”. In the context you quoted, “randomly chosen” means “arbitrarily chosen” (which, if you think about it, is the same as “unpredictably chosen”).

    • Thanks for these comments on Think Bayes. The manuscript is in production now, so I have time to make some edits. About your first point, the definition of probability is a topic of ongoing debate; I tried to provide a minimal definition that sidesteps the thorniness as much as possible, but it’s not easy (and I didn’t want to get bogged down).

      About my overuse of the word “random”, I did a search and you are right — there are a lot! I will see if I can edit a few out.

