Tuesday, December 27, 2016

The Polls Weren't the Problem—We Were

This article originally appeared on VICE UK.

"This article is so fucking idiotic and irresponsible." That's how polling guru Nate Silver began 14 tweets of math-based rage directed at Huffington Post journalist Ryan Grim in early November, on the weekend before That Election.

Hillary Clinton was leading by about 3 to 4 percentage points in the polls, and Grim had written a piece attacking Silver for his claim that Donald Trump still had a shot at victory. "It's not easy to sit here and tell you that Clinton has a 98 percent chance of winning," Grim wrote. "Everything inside us screams out that life is too full of uncertainty, that being so sure is just a fantasy. But that's what the numbers say."

Oh, Grim.

Of course Hillary Clinton didn't win the election, and while Silver didn't call it for Trump, he came closer than Grim and most others. More important, though, are the questions raised by this argument. Why did two people see the same numbers so differently? Were the figures wrong? Are polls just useless now?

The only way to know for sure how people will vote in an election is to hold an election. Short of that, you can go out and ask people how they intend to vote. If you ask enough people, and if those people are representative of society—the right proportions of men and women, old people and young people and so on—then you should get an answer that's within 2 to 3 percentage points of the final result.
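
For the curious, that "2 to 3 points" isn't plucked from the air: it's roughly the textbook margin of error for a simple random sample. Here's a minimal Python sketch of the standard calculation (real pollsters weight and adjust their samples, so treat this as the idealized best case, not how any actual polling house works):

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """95% margin of error for a simple random sample of n people,
        where p is one candidate's observed share of the vote."""
        return z * math.sqrt(p * (1 - p) / n)

    # A typical national poll asks about 1,000 people.
    print(f"{margin_of_error(1000):.1%}")  # ~3.1%, the familiar "plus or minus 3"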

Here's where Nate Silver and Ryan Grim disagreed. During the election campaign, hundreds of polls were carried out. In Grim's mind, any small errors in them would be pretty random, and would tend to cancel out. "For the polls to be wrong," he wrote, "there wouldn't need to be one single three-point error. All of the polls—all of them—would have to be off by three points in the same direction."

Nate Silver's argument was that the errors might not be random. He could imagine all kinds of situations in which all of the polls might be skewed in one direction. After all, most of the big polling companies use pretty similar techniques. They have similar problems to deal with. Certain groups of voters can be harder to reach than others, for example, especially if they're the kind of anti-establishment Trump supporters or Brexit voters who distrust institutions and experts.
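
The disagreement is easy to see in a toy simulation. In the sketch below (my own illustration, not anyone's actual model; the "true" lead, the poll count and the error sizes are all invented), averaging a hundred polls all but erases independent noise, while a shared two-point bias survives the averaging almost untouched:

    import random

    random.seed(0)
    TRUE_LEAD = 2.0           # invented "true" Clinton lead, in points
    POLLS, ELECTIONS = 100, 10_000

    def avg_error(shared_bias_sd):
        """Mean absolute error of the polling average across simulated elections."""
        total = 0.0
        for _ in range(ELECTIONS):
            bias = random.gauss(0, shared_bias_sd)           # error common to every poll
            polls = [TRUE_LEAD + bias + random.gauss(0, 3)   # plus each poll's own noise
                     for _ in range(POLLS)]
            total += abs(sum(polls) / POLLS - TRUE_LEAD)
        return total / ELECTIONS

    print(f"independent errors only: {avg_error(0):.2f} points off")  # roughly 0.24
    print(f"plus a shared 2pt bias:  {avg_error(2):.2f} points off")  # roughly 1.6

In Grim's world the first line is the whole story; Silver's point was that nothing guarantees the second term is zero.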

More to the point, it's exactly what happened last time. Obama did about 2 to 3 points better than the polls predicted in 2012. Grim even admitted that in his piece, which made his "98 percent certainty" theory even more ridiculous. Combined with the fact that Trump didn't need to win the popular vote to grab the Electoral College in his little fist, it shouldn't have been that surprising that he won.


Immediately after Trump's victory, people started talking about "another" failure of the polls. It's total bullshit. Hillary Clinton won the popular vote by 2 points. The polls predicted that she'd win by 3 or 4 points. That's an error of 1 or 2 points, which is about what you'd expect from a decent poll. The problem wasn't that the polls were wrong; it's that pundits—myself included—deluded ourselves about Trump's chances and read far too much into the numbers.

The same goes for most polling in the UK. It's fashionable now to claim polls are meaningless—especially if you're, say, a Corbyn supporter staring at a 10-point Tory lead. The reality is that pollsters correctly called the 1997, 2001, 2005 and 2010 elections, and coped reasonably well with tricky one-off referendums on Scottish independence and voting reform. YouGov were able to predict both of Jeremy Corbyn's leadership election wins and Cameron's win in the last Tory leadership election.

Even the big misses haven't been as big as people claim. The polls were still only out by a few points in the EU Referendum, but those points were critical in such a close race. Polls were off by about seven points when they failed to predict the Tory win in the 2015 General Election, but they correctly predicted the collapse of the Lib Dems, the rise of the SNP in Scotland and the rise of UKIP—three developments with little precedent in modern British electoral history.

Polls are seductive. We don't want to wait for results, especially when things like Brexit or the future of American democracy are at stake. But our demand to know, combined with pundits eager to please an audience, leads to people making claims about polls that just don't stand up. Tiny random movements in individual polls are reported as if they show real changes in the public mood.

The best thing to do? Just stop reading them. Stop feeding the monster. Ignore the day-to-day reporting, and if you really want to know how things are looking, check in on sites that track lots of polls over time: @BritainElects on Twitter, the UK Polling Report website for British polling, or Nate Silver's FiveThirtyEight for American polls.

Question whether things are really certain, or whether you just want them to be. And in a close race, be prepared to accept the most terrifying answer of them all, the one no self-regarding pundit wants to give you: "We really don't know."

Lead image: Gage Skidmore via Flickr

Follow Martin Robbins on Twitter.


