WHAT DO WE KNOW? Fernbach and Sloman
About one month before the mid-term elections last fall, 98 people from around the country — Democrats, Republicans, and independents — sat down at their computers to participate in a psychological experiment put together by three Brown University researchers.
Some read the following statement:
Mid-term elections for the House of Representatives are coming up in November. Recently, Ryan Frazier, a Republican candidate in Colorado's hotly contested 7th district House race, won the endorsement of the Denver Post, Colorado's largest newspaper.
Others read a statement that included just the first sentence, with no mention of Ryan Frazier or the newspaper endorsement. Both groups were then asked to judge the likelihood that the "Republicans will win control of the House of Representatives" on a 0-100 scale.
Finally, all participants were asked whether they wanted to gamble on the GOP winning control of the House, with two options:
1. You get 10 dollars, no matter what happens.
2. You get 30 dollars if the Republicans win control of the House of Representatives in the mid-term elections.
The result: those given an extra nugget of information — the Post's endorsement of Frazier — were actually less likely to bet on the GOP takeover.
This, along with four other experiments conducted as part of the same study, points to what researchers call the "weak evidence effect," the subject of a paper published online last month, ahead of print, in the journal Cognition.
The central finding: presenting weak but supportive evidence makes people less confident in a particular outcome than presenting no evidence at all.
Two of the three authors — lead author Phil Fernbach, a post-doctoral research associate in the department of cognitive, linguistic, and psychological sciences at Brown, and Steven A. Sloman, a professor in the same department — sat down in their stripped-down lab on Waterman Street on a recent afternoon to discuss the findings.
Fernbach said the findings point to a larger flaw in the way we think: we're myopic. We focus too much on the evidence immediately at hand, often to the exclusion of stuff we already know. And if that evidence is weak, it seems, we're less likely to hop on board with what may be a perfectly reasonable prediction.
Start talking about the United Nations imposing a "no-fly zone" in Libya, and the public might actually get more pessimistic about the odds of Muammar Qaddafi losing his grip on power. A no-fly zone? C'mon, that won't do anything. The guy is never going down.
Sloman said the research also buttresses an idea about human psychology that goes back to Aristotle: the best way to convince someone is to let him convince himself.
Give him a fact or, in this case, a prediction — the Republicans will seize the House of Representatives — and let him construct his own explanation for how it might come to pass.
Fernbach and Sloman say their research could help to explain some of our modern mysteries: why the public was skeptical of President Obama's sweeping health care overhaul, for instance, even as it supported individual elements.
And it could have implications for those in the business of persuasion: lawyers, marketers, and politicians, who might consider highlighting the weakness of the other side's evidence.
Frazier, the Colorado Republican, could have used all the help he could get: even amid the GOP tide in November, he lost handily.