I came across the above quote the other day in an old post of mine, when searching for a Schrödinger’s cat image.
The quote came up in the context of a statistical claim made by a political activist which was widely promoted and discussed but which turned out to be false. As I wrote at the time, I was disappointed that the activist’s response to the disproof of his claim was not to recalibrate his understanding but rather to try to explain away the refutation and to attack the people who went to the trouble of figuring out where he’d gone wrong. Later on in the comments I continued along the same lines:
If you think being extremely numerate is protection against making a statistical mistake, you are naive about the process of scientific discovery. Extremely numerate people make mistakes all the time. Everybody makes mistakes all the time. Being open to learning from your mistakes, that’s how to move forward. Denying your mistakes and fighting, that’s not a way to move forward in your understanding.
Also this:
As they say in AA (or someplace like that), it’s only after you admit you’re a sinner that you can be redeemed. I know that I’m a sinner. I make statistical mistakes all the time. It’s unavoidable.
As you can see, it’s my general position that if something’s worth saying, it’s worth saying over and over and over.
The issue of accepting error in a surprising-but-wrong claim has connections to two statistical issues I’ve been thinking about recently, as I’ll discuss.
The paradoxical nature of anecdotal evidence (and of evidence more generally)
Thomas Basbøll and I recently published a couple of articles on the role of stories in social science (see here and here). Our key point is that stories should be anomalous and immutable: anomalous because the role of a story is to change our view of the world, to represent a solid piece of information that contradicts, in some way, our current understanding; and immutable because the value of this contradiction comes from the story having sharp edges that do not fit into conventional structures.
To the extent that a story becomes pliable, so that its details can be altered to fit a point of view, it loses its ability to inform us, as social scientists (or as humans, acting in the role of amateur scientists in our goal of learning about the social world).
That’s (one reason) why it’s important, when your surprising story is shot down, to accept that you might be wrong. Your story is surprising—that is, it contains information—but this surprise is conditional on the information being true. When it turns out the information is false, it’s a horrible mistake to hold on to the surprise and discard the truth. Then you’re in the position of Wile E. Coyote.
Your belief has no foundation, and you’re supporting yourself on nothing but a cloud of ignorance.
Time to turn around before you end up falling.
“Psychological Science”-style papers
The other thing the above quote reminds me of, is all the controversy about noise-mining research articles that have appeared in journals such as Psychological Science. My fullest discussion of such issues appears in this recent paper, but, for here, let me reiterate Jeremy Freese’s point that research about the unknown is, well, it’s full of unknowns, and there should be no shame in accepting that a once-promising idea didn’t work out.
Surprising, newsworthy, statistically significant, and wrong: it happens all the time.
“When it turns out the information is false, it’s a horrible mistake to hold on to the surprise and discard the truth.”
I’d buy a t-shirt that said that!
Sorry but maybe you should tag this as “naive”.
Somewhere (perhaps in this blog) I read that if you are an engineer and you don’t know what you are doing, you are in trouble, and if you are a scientist and you know what you are doing, you are also in trouble. Or something like that.
The implication is engineers might get defensive about their work but not scientists.
PS liked the illustrations!
Maybe this one http://statmodeling.stat.columbia.edu/?s=bridges+fall ?
Reminds me of a story from an engineer friend: His father was an engineer in Poland. The custom there (at least at that time) was that when a bridge was built, the engineer and construction manager would stand under the bridge while it was fully loaded. Definitely an incentive to do it right!
(Also like the minesweepers who walk arm in arm across a field after they have swept it for mines.)
In a statistical context – From http://www.stat.harvard.edu/Faculty_Content/meng/COPSS_50.pdf
“it is a useful exercise to imagine ourselves in a situation where our statistical analysis would actually be used to decide the best treatment for a serious disease for a loved one or even for ourselves. Such a “personalized situation” emphasizes that it is my interest/life at stake,”
I’m not sure if this is where you got it but I first came across that saying in Hamming’s fantastic “Art of Doing Science and Engineering”. I don’t think he was the original source of the saying though.