“Surely our first response to the disproof of a shocking-but-surprising claim should be to be un-shocked and un-surprised, not to try to explain away the refutation”

I came across the above quote the other day in an old post of mine, when searching for a Schrödinger’s cat image.

The quote came up in the context of a statistical claim made by a political activist, a claim that was widely promoted and discussed but turned out to be false. As I wrote at the time, I was disappointed that the activist’s response to the disproof of his claim was not to recalibrate his understanding but rather to explain away the refutation and attack the people who had gone to the trouble of figuring out where he’d gone wrong. Later on in the comments I continued along the same lines:

If you think being extremely numerate is protection against making a statistical mistake, you are naive about the process of scientific discovery. Extremely numerate people make mistakes all the time. Everybody makes mistakes all the time. Being open to learning from your mistakes, that’s how to move forward. Denying your mistakes and fighting, that’s not a way to move forward in your understanding.

Also this:

As they say in AA (or someplace like that), it’s only after you admit you’re a sinner that you can be redeemed. I know that I’m a sinner. I make statistical mistakes all the time. It’s unavoidable.

As you can see, it’s my general position that if something’s worth saying, it’s worth saying over and over and over.

The issue of accepting error in a shocking-but-surprising claim has connections to two statistical issues I’ve been thinking about recently, as I’ll discuss.

The paradoxical nature of anecdotal evidence (and of evidence more generally)

Thomas Basbøll and I recently published a couple of articles on the role of stories in social science (see here and here). Our key point is that stories should be anomalous and immutable: anomalous because the role of a story is to change our view of the world, to represent a solid piece of information that contradicts, in some way, our current understanding; and immutable because the value of this contradiction comes from the story having sharp edges that do not fit into conventional structures.

To the extent that a story becomes pliable, so that its details can be altered to fit a point of view, it loses its ability to inform us, as social scientists (or as humans, acting in the role of amateur scientists in our goal of learning about the social world).

That’s (one reason) why it’s important, when your surprising story is shot down, to accept that you might be wrong. Your story is surprising—that is, it contains information—but this surprise is conditional on the information being true. When it turns out the information is false, it’s a horrible mistake to hold on to the surprise and discard the truth. Then you’re in the position of this guy:

[image: cliff]

Your belief has no foundation, and you’re supporting yourself on nothing but a cloud of ignorance.

Time to turn around before you end up here:

[image: cliff2]

“Psychological Science”-style papers

The other thing the above quote reminds me of is the controversy over noise-mining research articles that have appeared in journals such as Psychological Science. My fullest discussion of these issues appears in this recent paper, but, for here, let me reiterate Jeremy Freese’s point that research about the unknown is, well, full of unknowns, and there should be no shame in accepting that a once-promising idea didn’t work out.

Surprising, newsworthy, statistically significant, and wrong: it happens all the time.
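To make the noise-mining point concrete, here is a minimal simulation sketch (my own toy example, not from the paper cited above): if a study runs many comparisons on data that are pure noise, the chance that at least one comparison comes out “statistically significant” is high, even though every true effect is zero. The study sizes and number of tests below are made-up illustration parameters.

```python
# Toy illustration: pure-noise "studies" routinely yield at least one
# "statistically significant" result when many comparisons are run.
import math
import random

random.seed(1)

def t_stat(xs):
    """One-sample t statistic for testing a true mean of zero."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

n_studies = 1000        # simulated papers
tests_per_study = 20    # e.g., outcomes/subgroups examined per paper
n_per_test = 30         # observations per comparison
crit = 2.045            # two-sided 5% critical value for t with 29 df

studies_with_a_hit = 0
for _ in range(n_studies):
    # every comparison is pure noise: true effect is exactly zero
    any_hit = any(
        abs(t_stat([random.gauss(0, 1) for _ in range(n_per_test)])) > crit
        for _ in range(tests_per_study)
    )
    if any_hit:
        studies_with_a_hit += 1

# With 20 independent 5%-level tests, the chance of at least one
# false positive is about 1 - 0.95**20, i.e., roughly 0.64.
print(studies_with_a_hit / n_studies)
```

So a majority of these all-noise “studies” can report a significant finding; the finding is surprising and newsworthy precisely because it is wrong.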

7 Comments

  1. “When it turns out the information is false, it’s a horrible mistake to hold on to the surprise and discard the truth.”

    I’d buy a t-shirt that said that!

  2. jonathan says:

    Sorry but maybe you should tag this as “naive”.

  3. Fernando says:

Somewhere (perhaps in this blog) I read that if you are an engineer and you don’t know what you are doing, you are in trouble, and if you are a scientist and you know what you are doing, you are also in trouble. Or something like that.

    The implication is engineers might get defensive about their work but not scientists.

    PS liked the illustrations!
