Some things are just really hard to believe: more on choosing your facts.

Republicans are much more likely than Democrats to think that Barack Obama is a Muslim and was born in Kenya. But why? People choose to be Republicans or Democrats because they prefer the policies or ideology of one party or the other, and it’s not obvious that there should be any connection whatsoever between those preferences and their judgment of a factual matter such as Obama’s religion or country of birth.

In fact, people on opposite sides of many issues, such as gay marriage, immigration policy, global warming, and continued U.S. presence in Iraq, tend to disagree, often by a huge amount, on factual matters such as whether the children of gay couples have more psychological problems than the children of straight couples, what the economic impacts of illegal immigration are, what the effect of doubling carbon dioxide in the atmosphere would be, and so on.

Of course, it makes sense that people with different judgments of the facts would have different views on policies: if you think carbon dioxide doesn’t cause substantial global warming, you’ll be on the opposite side of the global warming debate from someone who thinks it does. But often the causality runs the other way: instead of choosing a policy that matches the facts, people choose to believe the facts that back up their values-driven policies. The issue of Obama’s birth country is an extreme example: it’s clear that people did not first decide whether Obama was born in the U.S. and then decide whether to vote Republican or Democratic. They are choosing their facts based on their values, not the other way around. Perhaps it is helpful to think of people as having an inappropriate prior distribution that makes them more likely to believe things that are aligned with their desires.
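
To make that last idea concrete, here is a minimal sketch with made-up numbers (mine, purely for illustration): the same evidence, run through Bayes’ rule, leaves a disinterested observer fairly convinced, while a motivated observer with a desire-skewed prior stays comfortable with the answer he wanted all along.

```python
# Illustrative only: how a desire-skewed prior keeps the posterior where you want it.
# All numbers are invented for the example.

def posterior(prior, likelihood_if_true, likelihood_if_false):
    """Bayes' rule for a single binary proposition."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Proposition F: "we are overfishing the cod stock."
# Both observers see the same evidence (several bad catch years in a row),
# which is 5 times more likely if F is true than if it is false.
lik_true, lik_false = 0.5, 0.1

neutral_prior = 0.50    # an observer with no stake in the answer
motivated_prior = 0.05  # a fisherman whose income depends on F being false

print(posterior(neutral_prior, lik_true, lik_false))    # ~0.83: fairly convinced
print(posterior(motivated_prior, lik_true, lik_false))  # ~0.21: still thinks "no problem"
```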

The interaction between a person’s values and their judgment about factual matters has long been noted. For instance, Upton Sinclair said “It is difficult to get a man to understand something, when his salary depends upon his not understanding it!” To give an example: North Atlantic cod fishermen did not understand that they were overfishing their stocks, right up until the cod population collapsed.

I want to be very clear that I’m talking about how people judge facts, not values. Some people might want to restrict fishing because they like preserving a more natural ecosystem that includes fish and orcas and sea lions, while others might want less restricted fishing because they want to make more money or because they want cheaper fish. These groups might disagree about fisheries policy because they have different goals. That’s very different from disagreeing about a fact, like “How many North Atlantic cod will there be next year if we catch N of them this year?”

People often seem to reason “backwards,” making their judgment about facts based on the implications of those facts, rather than the other way around: “If we are overfishing, then we will not be able to catch as many fish next year. That will be an economic disaster for me. Therefore we are not overfishing.”

I used to think that when people’s judgment about facts seemed very wrong, in a direction that obviously matched their personal ideology or desires, they were lying. Surely, I figured, cigarette companies know cigarettes are addictive, fishermen know they are overfishing, Senator Inhofe and Richard Lindzen know carbon dioxide causes global warming, and so on. But I was wrong about that. There is a very strong tendency for people to believe what they want to believe, when their livelihood is at risk, as Sinclair noted, but also when their culture or ideology is threatened.

I don’t know what to do with the knowledge that people, including (I presume) me, are biased in our judgments about facts. I think that to some extent forewarned is forearmed — recognizing one’s own biases can help to overcome them.

I also think that recognizing the interaction between desires and factual judgments can help in figuring out how to influence or persuade other people. If you think people who espouse wacky beliefs are lying, you will treat them very differently than if you realize they are fooling themselves.

====
Related reading on this blog includes the following posts, most of them focused on beliefs about climate change, and some of them with interesting or entertaining comments:
Who’s your favorite expert, by Andrew Gelman

No problem, we’ll adjust the data to fit the model, by Phil Price

How do I form my attitudes about scientific questions?, by Andrew Gelman

ClimateGate: How do YOU choose what to believe?, by Phil Price

17 thoughts on “Some things are just really hard to believe: more on choosing your facts.”

  1. I think you should look at this paper:
    http://www.springerlink.com/content/064786861r21m

    The title of the paper:
    When Corrections Fail: The Persistence of Political Misperceptions

    Here is the abstract:
    An extensive literature addresses citizen ignorance, but very little research focuses on misperceptions. Can these false or unsubstantiated beliefs about politics be corrected? Previous studies have not tested the efficacy of corrections in a realistic format. We conducted four experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction. Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a “backfire effect” in which corrections actually increase misperceptions among the group in question.

  2. I think we have to wholeheartedly abandon a very appealing lie. People are not rational. Every single belief every person has is a rationalisation of their emotional state – in particular it's usually hiding from fear. Even the smartest people with the correctest beliefs tell themselves lies that are qualitatively identical to those of the craziest loon. Even you, even me. We are wild animals with lies on top. Nothing more, nothing less.

    And that's fine.

  3. Nyhan's work (linked by Manoel) is very relevant. He cites some of the dissonance research; I've also discussed the connection to Festinger here if you're interested.
    I'd add that there's a lot of research in the decision-making literature that suggests that often people make quick, "gut" judgments and only afterwards search for rational justifications. Jonathan Haidt's social-intuitionist theory addresses this in the domain of moral reasoning. Haidt's argument is that the primary function of reasoning is not purely cognitive but rather social: good-sounding reasons allow us to defend a position and perhaps persuade others.
    It's also important to note that although the example at the start of the post is one where more conservatives seem to have the fact wrong, this phenomenon doesn't seem to be limited by ideology.

  4. "In fact, people on opposite sides of many issues,… immigration policy, global warming, … tend to disagree, often by a huge amount, on factual matters such as … what are the economic impacts of illegal immigration, what is the effect of doubling carbon dioxide in the atmosphere, and so on."

    What I'm fascinated by is how, at the prestige end of the intellectual scale, nobody notices obvious connections, such as how mass immigration to the U.S. increases global carbon emissions:

    The Pew Research Center reported in 2008:

    “If current trends continue, the population of the United States will rise to 438 million in 2050, from 296 million in 2005, and 82% of the increase will be due to immigrants arriving from 2005 to 2050 and their U.S.-born descendants, according to new projections developed by the Pew Research Center.”[Immigration to Play Lead Role In Future U.S. Growth, by Jeffrey Passel and D'Vera Cohn, February 11, 2008]

    According to UN data, the average American emits about four times as much carbon as the average Mexican and ten times as much as the average Central American. Either immigrants will assimilate to American economic levels or they won’t. In the first case, mass immigration is a global carbon emissions disaster, in the second, it’s a social and economic disaster for the U.S.

    One or the other.

  5. I'm also fascinated by how the New York Times declares, as a matter of uncontested fact, that Barack Obama is a Christian. I've read Obama's 1995 memoir several times and I saw very little evidence for that assertion in Obama's long description of how he came to join Rev. Wright's church.

    The distinguished British man of letters Jonathan Raban has read Obama's book, too, and visited Wright's church, as described in his essay for The Guardian. He came to the conclusion that Obama is an agnostic and that Wright's church is largely racialist rather than religious:

    http://www.thestranger.com/seattle/Content?oid=47

  6. The way people interpret facts is more sophisticated than "people judge (or choose) facts based on their personal values."

    Most evidence of claimed facts, for instance, is not observed firsthand but offered to us by people we can reasonably expect to have an interest in influencing our beliefs. What we are really judging, then, is not merely our degree of belief in some proposed fact F (in light of our prior knowledge) but our degree of belief in F given that person P claimed F. Thus our prior beliefs about both F and P come into play. When P is someone we strongly distrust, the evidence "P claimed F" may rationally be interpreted as evidence against F.

    E. T. Jaynes offers this more-sophisticated model in <a href="http://books.google.com/books?id=tTN4HuUNXjgC&pg=PA128&lpg=PA128&dq=jaynes+divergence+polarizing&source=bl&ots=H3QspwMsY4&sig=yzKc81pijEhJAhVgOjD6FbDfn0Q&hl=en&ei=VhBvTKChJMGC8gbKqKmmDQ&sa=X&oi=book_result&ct=result&resnum=3&ved=0CCMQ6AEwAg#v=onepage&q&f=false" rel="nofollow">Probability theory: the logic of science</a> and shows how it predicts commonly observed political behavior, polarization in particular.

    One interesting take-away from Jaynes's model is that political polarization is not necessarily irrational. Given sufficiently different prior beliefs about F or P, rational updates will result in divergence.

    Cheers,
    Tom

  7. seems like this is a version of a classic cognitive bias from the cognitive psychology literature: the confirmation bias. psychologists have shown that people misremember facts (and evaluate them differently) in order for their memories to be more in line with their theories of how the world works. so, if you are a republican and are thinking about how gay marriage impacts the psychological well-being of children, you might "conveniently" forget about the study showing that children of lesbian parents are psychologically better off than those raised by straight parents.

    =joe

  8. I think it is not so much an issue of cognitive bias as it is the tendency to use heuristics to form an opinion. Rather than consider the facts that underlie the opinion, people generally consider the sources and whether they find them reasonable and sympathetic. Given the results of that evaluation, which is generally emotive and simpler to complete than a rigorous analysis of the facts, a person would then decide whether to accept the facts as stated or not.

  9. Hm.

    I don't disagree that cognitive dissonance exists. But I think you're seeing "facts" where there aren't any. Take the economic effects of illegal immigration and the effects of extending unemployment benefits: two cases where each political side in the US is in conflict with the published literature. I think if there were a deeper dialogue, you'd find it's not a flat-out denial of facts. There's a spectrum of "what are the chances this is true" as well as "what is the importance of this effect" and "are there other fundamental issues at play".

    Since you tend liberal, Phil, let's take the unemployment example so you can make a better defense of the "fact"-denying: Extending unemployment benefits would increase, or delay the decrease of, the unemployment rate, according to the literature. Doesn't that mean Democrats are idiots? No, because there's a chance that the literature doesn't hold in this case. There's a chance that the literature is flat out wrong. There's also the issue that punishing those who choose to remain on unemployment is not as important as helping the people who truly cannot find work.

    But short questionnaires won't show the range of objections; they will simply show that Democrats are fact-denying nimrods. The talking points in articles and on television will sound more like fact-denying than honest evaluation.

    I'd recommend you find examples where you have "denied facts" yourself rather than playing anthropologist with political opponents.

  10. The examples you chose imply that only Republicans are selective about their facts. Talk to a typical Democrat about racial differences in IQ scores. He'll probably begin by denying it, then claim that any difference is due to bias in the tests or income differences. When shown evidence that the gap still exists when adjusted for all of these factors, he'll try to shift the terms of the argument and claim that IQ scores are a poor measure of intelligence anyway. As a last resort he'll pull out a couple of anecdotes about low-IQ individuals who were very successful in life.

    The parallels to the way Republicans approach conversations about global warming are uncanny. Right down to the anecdotes. Say, wasn't that a cold winter we had last year?

  11. I don't know what to do with the knowledge that people, including (I presume) me, are biased in our judgments about facts. I think that to some extent forewarned is forearmed — recognizing one's own biases can help to overcome them.

    I think you answered your own question. To be clear, you are exploring this phenomenon:

    There is a very strong tendency for people to believe what they want to believe, when their livelihood is at risk, as Sinclair noted, but also when their culture or ideology is threatened.

    Which, as one commenter pointed out, is strongly related to the concept "confirmation bias".

    A useful way to deal with this problem (both as we discuss it here, and all the way down to implementing solutions in society) is to recognize this as the problem:

    Some people know that the phenomenon of confirmation bias exists, and some people do not.

    Two solutions to this problem that I can think of off the top of my head:

    Teach about the phenomenon of confirmation bias in grade school or secondary school — how to recognize it in oneself, and how to deal with it.

    Teach journalists in j-school to always consider the possible confirmation-bias dimension to any story, and to put effort into researching and illuminating the possible biases of the "two sides".

    Both of these are tricky, largely because the right way to implement them will be influenced by a lot of confirmation bias. An eternal biased braid…

  12. Thanks for the comments, everyone. Here are some comments on the comments:

    Galdino links to a paper by Nyhan and Reifler that I had never read before, but I swear I’ve seen another write-up of this research or something very much like it: they look at what happens when people are given factual information that contradicts a previously held belief (specifically, the beliefs considered were: that WMD were found in Iraq, that tax cuts increase government revenue, and that George W Bush eliminated all stem cell research). Basically, when given factual information that contradicted these beliefs, some people’s beliefs were strengthened, not weakened as they should have been. The paper also discusses other research that found essentially the same thing. (Depressing, no?) Miles links to a very readable newspaper article that says essentially the same thing.

    Moertel suggests reading a quite interesting treatise that shows, among other things, how the effect of information on a person’s beliefs should (via Bayes’ Theorem) depend strongly on the perceived credibility of the person providing the information. In the present context, if you give Rush Limbaugh a lot of credence, then of course you’re likely to believe Obama is a Muslim born in Kenya. To me, this just pushes the puzzle back a level, to why so many Republicans find Rush Limbaugh credible. But it is an interesting article that suggests a rational explanation for widely differing interpretations of the same information. bbis also suggests that the perceived credibility of the information is a key factor.
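
    To make the mechanism concrete, here is a minimal numerical sketch (the numbers are illustrative, mine rather than Jaynes’s): what gets updated is not belief in the fact F alone but belief in F given that a particular source asserted it, so two listeners who differ only in how much they trust the source can rationally move in opposite directions on the same report.

    ```python
    # Illustrative sketch of the source-credibility point above (my numbers, not Jaynes's).
    # F is the proposed fact; the evidence is "source S asserted F".

    def update_on_claim(prior_f, p_assert_if_true, p_assert_if_false):
        """P(F | S asserted F), by Bayes' rule."""
        num = prior_f * p_assert_if_true
        return num / (num + (1 - prior_f) * p_assert_if_false)

    prior_f = 0.5  # both listeners start undecided about the fact itself

    # Listener A trusts the source: it asserts F far more often when F is true.
    print(update_on_claim(prior_f, p_assert_if_true=0.9, p_assert_if_false=0.1))  # -> 0.90

    # Listener B thinks the source pushes F for its own reasons, true or not,
    # and is, if anything, a bit more likely to push it when it's false.
    print(update_on_claim(prior_f, p_assert_if_true=0.3, p_assert_if_false=0.6))  # -> ~0.33

    # Same report, two Bayes-consistent updates, and the listeners move apart.
    ```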

    Fraac takes the opposite view: there’s no point looking for a rational explanation because people aren’t rational “and that’s fine.” Fraac, everyone except a few neoclassical economists knows people aren’t rational; it’s not a revelation! But it’s not fine at all, not when it leads to colossal blunders that cause suffering and death, which it does. And just because people are irrational in general doesn’t mean they (we) are completely irrational all the time; it can be useful to figure out what is and isn’t rational, and whether anything predictive can be said about the irrationality. But basically, I agree with you that we are seeing some irrationality here; I still think it’s worth thinking about.

    Webster and Srivastava both mention the “cognitive dissonance” work of Festinger, which I remember learning something about in a psychology class a long time ago, though all I could give now is a one-paragraph summary. I remember that when people hold inconsistent beliefs, or beliefs inconsistent with the facts, it makes them uncomfortable and sometimes leads them to do some remarkable rationalization, but I think that is a slightly different phenomenon from what I’m talking about here. I have to plead ignorance, though, about the details of Festinger’s work.

    Srivastava links to his own quite interesting blog entry, which in turn links to some other relevant work. I won’t try to summarize it here, but I encourage you to read it.

    Bachir agrees with my point that it’s good to know that effects like “confirmation bias” exist — the tendency to believe evidence that backs up what you already believe, while disbelieving contradictory evidence — but Corey points to an amusing story that says that when people know about this, all it does is convince them that their ideological opponents (but not they themselves) are subject to confirmation bias! I liked the post Corey pointed to, right up to where I got to a (non-tongue-in-cheek) mention of “g-factor.” I think “g-factor” is a ludicrous attempt to put a gloss of scientific precision on a useful but necessarily imprecise concept, like defining an “athleticism factor” based on a bunch of measurements of athletic performance, or quantifying the “smut factor” of a film by counting the duration of various sexual acts that are depicted. If you want to say some people are smarter or more athletic than others, and that some films are smuttier than others, I think only an idiot would disagree; but if you think you can boil intelligence, athleticism, or smuttiness down to a single “scientific” number I think you are kidding yourself. But that’s a post for another time, perhaps; at any rate, Corey’s link is worth a look.

    Martin says my examples (of people believing things that are obviously wrong) only use Republicans, but that’s not quite true. I used (1) the Obama “birther” example, which I think is very clear and is right-wing; (2) global warming, where the facts about warming are less clear but where I think there is very clear bias in the right-wing position; (3) fisheries policy, where the facts are very clear (they really did overfish the hell out of North Atlantic cod) but I don’t think that’s a partisan issue; and (4) I mentioned cigarette addiction in passing, which I think is also not partisan. So I’m going to claim this is 2-2-0 (right-wing – nonpartisan – left-wing) as far as whose ox is being gored. But, sure, I could have given some left-wing examples. A good one would have been the 9/11 conspiracy theorists — people who think the U.S. government was behind 9/11, blew up the towers with explosives, yada yada. I know some people who believe that! So, now let’s call it 2-2-1. But it’s definitely true that the examples that leap to my mind are ones where the right wing has the wacky beliefs, even though there are plenty on both (or all) sides. The Nyhan and Reifler paper mentioned by Galdino says “It would also be helpful to test additional corrections of liberal misperceptions. Currently, all of our backfire results come from conservatives—a finding that may provide support for the hypothesis that conservatives are especially dogmatic (Greenberg and Jonas 2003; Jost et al. 2003a, b). However, there is a great deal of evidence that liberals (e.g. the stem cell experiment above) and Democrats (e.g., Bartels 2002, pp. 133–137; Bullock 2007; Gerber and Huber 2010) also interpret factual information in ways that are consistent with their political predispositions. Without conducting more studies, it is impossible to determine if liberals and conservatives react to corrections differently.”

    Sean suggests that liberals (like me, he assumes) have an inappropriate bias towards the Keynesian notion that providing unemployment benefits in a recession will decrease unemployment. But he allows that just because the bias exists doesn’t necessarily mean the liberals are wrong. He suggests that there is a lot more going on than people wanting to believe something and thus tending to believe it; rather, there’s an interplay between evaluating facts, credibility of facts, credibility of theories, and so on. Sean, I’m delighted that you brought it up, because it’s something I had originally discussed in my own post but cut for length and to avoid diluting my main point. I agree, it’s actually rare to have black-and-white situations like “Birther” beliefs, 9/11 conspiracy theorists, and so on — cases in which people believe something that is obviously not true, seemingly purely because of ideological bias. As the “facts” become less clear, there is indeed an interplay between facts and worldview as people try to generate a sort of holistic understanding of the world. Or at least, that’s one way to think of it. The unemployment example is a good one in this regard: to, say, Krugman (a liberal economist, quite a rare bird), people who dispute the effectiveness of a stimulus like unemployment benefits are the ones with the inappropriate bias! I don’t really know what to say here, except that I agree with most of your points.

    Sailer says “nobody notices obvious connections, such as how mass immigration to the U.S. increases global carbon emissions”; I’m not exactly sure how this is related to this particular blog post. The fact that people consume more (and thus have more environmental impact) as they become wealthier is widely recognized, and projections of both population by country (including immigration) and GDP by country are routinely used for IPCC predictions of greenhouse gas emissions and for many other purposes. Perhaps my perception that this linkage is widely acknowledged is colored by the fact that I’ve done a small amount of work on making such projections, so I’ve come into contact with lots of other people, and publications, that discuss this issue. But also, Steve, I think you wildly exaggerate when you say that immigration leads to one catastrophe or the other. The roughly 1.5% of the world population that is projected to be a U.S. immigrant or the child of an immigrant in 2050 will generate something like 6% of the world environmental impact. But even if none of those people were in the U.S., they would be somewhere else, so the effect of immigration to the U.S. is perhaps more like a 4% net increase in worldwide environmental impact, compared to no immigrants coming to the U.S. Of course this assumes that immigration doesn’t depress the U.S. GDP per capita; if, instead, immigration to the U.S. is economically good for the immigrants (and their children) but economically bad for people who are already here, then the immigration effect is even smaller. So I just don’t see where you get a “global carbon emissions disaster,” even with full assimilation. As for the “either one or the other” claim, that is a good example of another subject that was just discussed on this blog: the either/or fallacy.
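
    For the record, the back-of-envelope arithmetic behind those percentages looks like this (the per-capita multipliers are rough assumptions of mine, chosen only to be consistent with the Pew share and the UN-based ratios quoted earlier in the thread):

    ```python
    # Rough back-of-envelope estimate; every number here is an approximation.

    immigrant_share = 0.015   # post-2005 immigrants + descendants as a share of 2050 world population
    us_multiplier = 4.0       # per-capita impact of a U.S. resident, relative to the world average
    origin_multiplier = 1.0   # rough per-capita impact had the same people not migrated

    gross_share = immigrant_share * us_multiplier         # ~0.06: share of world impact if fully assimilated
    counterfactual = immigrant_share * origin_multiplier  # ~0.015: their share had they never migrated
    net_increase = gross_share - counterfactual           # ~0.045: net addition to world impact

    print(gross_share, counterfactual, net_increase)  # approximately: 0.06 0.015 0.045
    ```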

    I’d like to thank all of the commenters — I think I mentioned everyone; if not, I apologize — for an unusually informative set of comments; I learned a lot from them and from the links people included.
