In the weeks before Election Day, we were besieged by polling data, breathlessly conveyed as breaking news by unimaginative journalists. This might seem rather benign, a mild diversion for political obsessives, but I’m not sure that polls are quite so innocent. We either need to train a more critical eye on opinion polls and become informed consumers of their data, or start ignoring them altogether.
The problem is that journalists, pundits, politicians, interest groups, and citizens usually take polls at face value, sometimes because it is in their interests to do so. If a poll says 55% of the population supports Jones, by this reasoning, Jones must have 55% of the vote.
However, polling really is not so simple. For one thing, it is usually a bad idea to draw firm conclusions from a single poll, since any one poll can go grievously wrong. A Rasmussen poll predicted a 13-point victory for Sen. Daniel Inouye (D-HI) a few weeks before the election; Inouye won on Tuesday by over 50 points. Then there’s the September poll conducted for PJTV, a right-wing Internet TV channel, which found that a third of African Americans likely to vote would support a Tea Party candidate. A single poll mixed with wishful thinking and self-interest can, for some people, outweigh the well-known realities of American politics.
The reason polls have this power is that they have a scientific ring to them. But what makes the best, most accurate polls scientific often does not apply to run-of-the-mill political polls. I am not referring only to the baldly partisan polling that the two parties churn out in order to drive their preferred narratives about certain races. (Rasmussen, for its part, is a technically nonpartisan but Republican-friendly pollster. Nate Silver, of the FiveThirtyEight blog, estimates that its polls had a three- or four-point Republican tilt this year.) There are factors besides a poll’s provenance that should make us suspicious of seemingly straightforward results.
One major problem in the average political survey is forced choice: Pollsters will only offer options such as agree or disagree, support or oppose, but not “I don’t know” or “I haven’t thought much about it.” Another recent PJTV poll asked likely voters whether they supported or opposed the Tea Party; there were no other options. The not-so-surprising result was that more than half the country supports the Tea Party. But this finding starts to look a little shaky when you consider that a recent Newsweek poll found that more than a quarter of registered voters have not read, heard, or seen anything about the Tea Party. Apparently a lot of the people PJTV tallied either as supporters or opponents were just hearing about this “Tea Party” thing for the first time.
Americans just are not as opinionated as opinion polls assume and require them to be. When polls explicitly offer an option such as “I don’t know” or “I haven’t thought much about it,” people often take it. In 2002, the National Election Study found that about a third of Americans admitted not having thought much about the Bush tax cuts, the central domestic policy initiative of the past year. Or consider a CBS News poll from late August, during the “Ground Zero Mosque” nonsense, which asked about Americans’ impressions of Islam. Thirty-seven percent reported not having heard enough about the religion to say.
It is easy for those who are afflicted with the political bug to forget that sometimes their fellow citizens just don’t pay that much attention to politics and don’t have strong opinions on every issue. But opinion polling has often implied that Americans have well-formed views on everything under the sun.
Then there’s the issue of question wording, which has a tremendous ability to introduce bias into poll results. Compare a couple of polls taken in June, during the Gulf oil spill, measuring attitudes towards offshore drilling: CBS News asked whether respondents favored increased drilling off the coast, or thought that “the costs and risks are too great.” Just 40% favored drilling, and 51% said the costs are too great. But Ipsos presented two options: either offshore drilling is “necessary so that America can produce its own energy,” or it is a bad idea “because of the risks to the environment.” Sixty-two percent said drilling is necessary; just 32% said it is a bad idea.
CBS News, of course, reported that a “majority now opposes more offshore drilling.” Ipsos, meanwhile, concluded that support for offshore drilling was not budging “despite increased coverage and environmental fallout from the spill.” Which conclusion you believe depends on which question wording you prefer — or, more realistically, which conclusion you want to sell. The inevitable variation between polls with different question wordings enables almost all interested parties to claim the public’s support for their own positions.
Now, while I have focused on issue polling, it is worth noting that polling averages are generally pretty good at predicting electoral outcomes. However, a few major caveats are in order. First, the media often do not report polling averages. They report lone poll results, like Gallup’s outlandish prediction that Republicans would best Democrats by 15 points in the overall congressional ballot (it looks to be closer to seven points).
Second, even polling averages can fail systematically. They predicted a three-point victory for Sharron Angle over Sen. Harry Reid (D-NV), but Reid pulled another rabbit out of his hat and won by five points. One possible explanation, says Silver, is that pollsters did not pick up a lot of unenthusiastic Reid voters (is there any other kind?) because such people were unlikely to complete the polls. The flipside of this problem can be seen in Colorado, where polls suggested radical-right third-party candidate Tom Tancredo would be closer to victory than he was, possibly because his voters were very enthusiastic and likely to respond to pollsters.
I should clarify, in closing, that this is not an argument against polling per se. Nonpartisan, transparent, methodologically sound polling is absolutely useful. It can tell politicians what people think, and tell citizens what their neighbors think. It’s fun for the political class to follow this stuff, and, yes, I confess that it’s fun for me too, but the current cacophony of opinion polls is distracting, and the indiscriminate way in which the media report on them is misleading. We must either learn how to read polls or learn how to ignore them.
Sam Barr ’11 (sbarr17@fas) would have known that Truman was going to beat Dewey and would have notified the Chicago Daily Tribune accordingly.