Smoothing analysis of variance and extending the definition of degrees of freedom

There’s some cool and (possibly) important stuff in the summary of Yue Cui’s dissertation (written under the supervision of Jim Hodges and Brad Carlin in biostatistics at the University of Minnesota). The short story is that, for reasons of substantive modeling as well as prediction, we’re pushing to fit more and more complicated models to data. (See here and here for my thoughts on “Occam’s Razor.” I’m with Radford on this one.)

Anyway, having fit these models, we need to figure out how to understand them. Cui’s dissertation is pretty technical so I won’t try to summarize it here, but it has some interesting stuff for those of you out there who like counting degrees of freedom. Jim writes,

Highlights: Chapter 2 summarizes our Technometrics paper on smoothed ANOVA, due to appear very soon. Chapter 3 redefines degrees of freedom in a way that’s consistent with the Hodges & Sargent definition but which allows a tidy decomposition of df by effects for arbitrary linear hierarchical models with normal errors (we think — no counterexamples yet). This gives new options for putting priors on smoothing parameters, among other things. Chapter 4 extends the version of SANOVA in our Technometrics paper to arbitrary designs (again, we can’t see how there *could* be a counterexample, but my imagination has failed me before). In particular, it allows more than one error term and doesn’t require balance.
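To give a sense of what’s being counted, here’s a toy sketch of the familiar trace-of-the-smoother-matrix notion of effective degrees of freedom, for the one-way normal model with known variances and known overall mean. This is just the textbook special case, not Cui’s definition (which handles much more general linear hierarchical models); the function name and the numbers are mine, for illustration only.

```python
import numpy as np

# One-way hierarchical normal model: y_ij = theta_j + e_ij,
# theta_j ~ N(mu, tau^2), e_ij ~ N(0, sigma^2), with mu and the
# variances treated as known. The fitted values are then a linear
# map of y, and the effective df is the trace of that smoother matrix.

def effective_df(n_per_group, sigma2, tau2):
    """Effective degrees of freedom for the group effects.
    Group j's estimate shrinks its sample mean toward mu with weight
    w_j = (n_j/sigma2) / (n_j/sigma2 + 1/tau2); each w_j is that
    group's contribution to the trace of the smoother matrix."""
    n = np.asarray(n_per_group, dtype=float)
    w = (n / sigma2) / (n / sigma2 + 1.0 / tau2)
    return w.sum()

# Limiting cases: tau^2 -> 0 pools completely (df -> 0 beyond mu),
# tau^2 -> infinity fits each group separately (df -> number of groups).
print(effective_df([5, 5, 5, 5], sigma2=1.0, tau2=1e-8))  # ~ 0
print(effective_df([5, 5, 5, 5], sigma2=1.0, tau2=1e8))   # ~ 4
print(effective_df([5, 5, 5, 5], sigma2=1.0, tau2=1.0))   # in between
```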

I’ll also put in a plug for my paper with Pardoe on R-squared and pooling factors for multilevel models. In all these papers, the game is to come up with something that looks reasonable and gives the right answer in key special cases.
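And to give a feel for what a pooling factor measures, here’s another toy sketch. It uses the classical shrinkage weight for the one-way normal model with known variances as a stand-in; the definition in our paper is more general, but the key special cases any reasonable definition should get right are the same: no pooling when the group-level variance is huge, complete pooling when it’s zero.

```python
# Classical pooling (shrinkage) factor for group j in the one-way normal model:
# lambda_j = (sigma^2/n_j) / (sigma^2/n_j + tau^2).
# 0 means no pooling (trust the group mean); 1 means complete pooling
# (trust the grand mean).

def pooling_factor(n_j, sigma2, tau2):
    v_j = sigma2 / n_j  # sampling variance of the group's sample mean
    return v_j / (v_j + tau2)

for tau2 in [1e-8, 1.0, 1e8]:
    print(tau2, pooling_factor(n_j=10, sigma2=4.0, tau2=tau2))
# tau^2 ~ 0   -> factor ~ 1 (complete pooling)
# tau^2 huge  -> factor ~ 0 (no pooling)
```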

1 thought on “Smoothing analysis of variance and extending the definition of degrees of freedom”

  1. I've done nonlinear least-squares fits with 32 adjustable parameters, and linear least-squares fits with just under 500. And I'm a believer in Occam's Razor.

    The number of adjustable parameters is not the number of "entities" in Occam's Razor. In both problems mentioned above, the theory was well-developed, and the adjustable parameters were those required by the theory. Adopting the theory was really one "entity," regardless of how many parameters the theory required.

    After all, Occam's Razor is "Entities should not be multiplied beyond necessity." Up to necessity is permitted. One of the many great things a good theory does is tell you what's necessary.
