The last time Republicans won a presidential election without a Nixon or a Bush on the ticket was 85 years ago, in 1928. With nearly a full century of electoral data available, the answer to the Republicans’ presidential ambitions is obvious, and it isn’t Christie or Ryan or Paul or Cruz.
Charles Wheelan’s “Naked Statistics: Stripping the Dread From the Data” will help you think about that assertion like a statistician and will illustrate why you don’t have to be Chris Christie, Paul D. Ryan, Rand Paul or Ted Cruz to sense an error in the conclusion. Bushes and Nixons may interpret the electoral data differently, of course.
The difference in interpretation is one of Wheelan’s central points. “Statistics alone cannot prove anything; instead, we use statistical inference to accept or reject explanations on the basis of their relative likelihood,” he writes.
The Bushes and Nixons find statistical evidence for their case in 85 years of presidential elections; everybody else sees a small sample and remembers from statistics kindergarten that correlation isn’t causation.
Chances are that the press releases smart people around Washington churn out by the thousands use some form of statistical analysis to support their case, while other smart people churn out their own press releases using statistics to refute the first bunch.
Does high government debt cause slow economic growth? Can more guns reduce deaths and injuries caused by guns? Do tests reliably measure student achievement and teacher performance? Do tax cuts increase tax revenue? Does early preventive care mean life-long improvements to health? Are women the victims of discrimination? Is a DNA match sufficient to justify a conviction in a criminal case?
Partisans in big debates have more and more statistical data at their disposal. There must be a statistics equivalent to John Maynard Keynes’ comment that practical men, believing themselves exempt from any intellectual influence, are usually the slaves of some defunct economist. Few arguments today aren’t slaves to some statistician.
“Our ability to analyze data has grown far more sophisticated than our thinking about what we ought to do with the results,” Wheelan writes. “The use of statistics to describe complex phenomena is not exact. That leaves plenty of room for shading the truth.”
If even honest people can disagree about how to interpret the data, what does one make of an argument when one side, or both sides, isn’t even trying to be honest?
Wheelan’s book is a primer for those who think a good grasp of this material is essential, a computer-age necessity to go along with familiarity with the separation of powers or the concept of inflation. He starts slowly, with the difference between the mean and the median, and builds gradually.
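That starting point, mean versus median, is easy to see for yourself. The numbers below are my own illustration, not Wheelan’s; the point is simply that one very large value drags the mean far from the typical case, while the median stays put:

```python
import statistics

# Nine modest incomes plus one billionaire: a classic case where
# "average" income and "typical" income tell very different stories.
incomes = [30_000] * 9 + [1_000_000_000]

mean = statistics.mean(incomes)      # pulled way up by the outlier
median = statistics.median(incomes)  # unmoved: half earn less, half more

print(f"mean:   {mean:,.0f}")    # mean:   100,027,000
print(f"median: {median:,.0f}")  # median: 30,000
```

Which of the two a press release quotes is often a choice about the story its authors want to tell, which is exactly the kind of shading Wheelan warns about.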
Follow him through random sampling, correlation and coefficients, probability and margins of error, bias, dependent and explanatory variables, standard deviation, standard error, the central limit theorem, false negatives and false positives, p-values, regression analysis, least squares and more.
The terminology may sound intimidating, but Wheelan handles it well and is a patient teacher. If you’re the kind of reader whose flagging interest can be revived by cracks about the Kardashians or the author’s faux self-deprecation, you’ll enjoy Wheelan’s style.