THE truth is complicated, as British journalist and economist Tim Harford reminds us in his new book, “The Data Detective.” But for true believers — ideologues, partisans, religious fundamentalists and the like — the truth is self-evident, like science that is “settled” and beyond argument.
There is, however, “nothing more anti-scientific than the very idea that science is settled, static, impervious to challenge.” That’s from the late, great Charles Krauthammer, global warming believer, Harvard Medical School graduate, chief resident in psychiatry at Massachusetts General Hospital, author of a scientific paper published in the Archives of General Psychiatry, and co-author of “a path-finding study on the epidemiology of mania.”
Just imagine if Einstein believed that Newton “had written the final word on gravity” — that it was already “settled science.” Today, astrophysicists are relentlessly poking holes in Einstein’s theory of gravity. They are, after all, scientists, not zealots.
Science is all about data, which include statistics. Often, however, what we see with our own eyes contradicts what the statistics tell us. To quote Chico Marx (impersonating Groucho) in the movie “Duck Soup”: “Who ya gonna believe, me or your own eyes?”
Sometimes, says Tim Harford, the statistics give us a vastly better way to understand the world, but sometimes they mislead us. “We need to be wise enough to figure out when the statistics are in conflict with everyday experience — and in those cases, which to believe.”
He tells us that sometimes, when personal experience tells us one thing and the statistics tell us something quite different, both may be true — but that’s not always the case, he adds.
(President Truman supposedly once said about economists, “Whenever I ask their opinion, they say on the one hand, so-and-so; but on the other hand, so-and-so. On the one hand — but on the other hand. I would like to meet an economist with one hand!”)
Back to Tim Harford. Everyone knows or should know that heavy cigarette smoking increases the risk of lung cancer (by a factor of 16). But what if “your chain-smoking nonagenarian grandma is as fit as a fiddle [and] the only person you know who died from lung cancer is your next-door neighbor’s uncle [who] never smoked a cigarette in his life”?
Whenever personal experience and statistics seem to be in conflict, Harford says, “a closer look at the situation may reveal particular reasons why personal experience is likely to be an unreliable guide.”
According to Harford, “Psychologists have a name for our tendency to confuse our own perspective with something more universal: it’s called ‘naïve realism,’ the sense that we are seeing reality as it truly is, without filters or errors. Naive realism can lead us badly astray when we confuse our personal perspective on the world with some universal truth. We are surprised when an election goes against us: Everyone in our social circle agreed with us, so why did the nation vote otherwise? Opinion polls don’t always get it right, but I can assure you they have a better track record of predicting elections than simply talking to your friends.”
But naïve realism is a powerful illusion, Harford adds. Maybe because we get our information from the media which usually highlight terrible, gruesome, outrageous, tragic news. “None of these stories reflect everyday life; all of them are viscerally memorable and seem to take place in our living rooms. We form our impressions accordingly.”
Now add social media to the mix: rumors, gossip, misinformation, slander, etc. are now presented as “news” or “commentaries,” and are taken as such by many educated people.
Harford cites another great book, “Thinking, Fast and Slow,” by psychologist Daniel Kahneman, who noted that:
“When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.” Rather than asking, “Are terrorists likely to kill me?” we ask ourselves, “Have I recently seen a news report about terrorism?” Instead of saying, “Out of all the teenage girls I know, how many are already mothers?” we say, “Can I think of a recent example of a news story about teenage pregnancy?”
Harford says news reports are data, in a way, but they’re usually not representative data. Worse, they can influence our views of the world. “To adapt Kahneman’s terminology, they’re ‘fast statistics’ — immediate, intuitive, visceral, and powerful. ‘Slow statistics,’ those based on a thoughtful gathering of unbiased information, aren’t the ones that tend to leap into our minds.”
However, Harford says, there are certain things we cannot learn from a spreadsheet brimming with statistics and other data. Most of the time, these require a close-up view. And statistics, he warns, can also be “misused.”
Harford says “if a group of doctors collect and analyze data on clinical outcomes, they are likely to learn something together that helps them to do their jobs. But if the doctors’ bosses then decide to tie bonuses or professional advancement to improving these numbers, unintended consequences will predictably occur. For example, several studies have found evidence of cardiac surgeons refusing to operate on the sickest patients for fear of lowering their reported success rates.”
In Britain, he says, the government collected data on how many days people had to wait for an appointment when they called their doctor. This is, of course, a useful thing to know, especially in a country with a single-payer (which is the government) healthcare system. “But then the government set a target to reduce the average waiting time. Doctors logically responded by refusing to take any advance bookings at all; patients had to phone up every morning and hope they happened to be among the first to get through. Waiting times became, by definition, always less than a day.”
Citing social scientists, Harford reminds us that “statistical metrics are at their most pernicious when they are being used to control the world, rather than try to understand it.” A statistical metric may be a pretty decent proxy for something that really matters, Harford says, but it is almost always a proxy rather than the real thing. Take a higher hourly wage rate: rates were raised, but because the economy was down, some or many employees ended up working fewer hours or losing their jobs.
“Once you start using that proxy as a target to be improved, or a metric to control others at a distance, it will be distorted, faked, or undermined. The value of the measure will evaporate.”
My favorite “statistical metric” story was told many years ago by President Reagan:
In the good ol’ Socialist Paradise that was the USSR, a party commissar visited a collective farm to see how the potato harvest was doing, and if it met the goals set by the government’s rigorously scientific economic plan. The commissar asked a farmer who said, “Oh comrade commissar! If we took all the potatoes, they would reach the foot of God.” The commissar frowned. “Comrade farmer,” he said, “this is the Soviet Union. There is no God.”
“That’s okay,” the farmer said, “there are no potatoes.”