Why the polls got it so wrong in the British election
- Written by Leighton Vaughan Williams, The Conversation
If the opinion polls had proved accurate, we would have woken up on the morning of May 8 to a House of Commons in which the Labour Party had a chance to form a government. By the end of the day, the country would have had a new prime minister called Ed Miliband.
This didn’t happen. Instead the Conservative Party was returned with almost 100 more seats than Labour and a narrow majority. So what went wrong? Why were the polls so far off? And why has the British Polling Council announced an inquiry?
We have been here before. The polls were woefully inaccurate in the 1992 election, predicting a Labour victory, only for John Major’s Conservatives to win by a clear seven percentage points. While the polls had performed a bit better since, history repeated itself this year.
Facing realities
A big issue is the methodology used. On the whole, pollsters make little effort to replicate the real experience of voting. Even as election day approaches, they very rarely name the actual candidates in their survey questions, instead prompting simply with party labels, which tends to miss a lot of late tactical vote switching. The filter they use to separate those who will actually vote from those who merely say they will vote is clearly faulty, as can be seen by comparing actual turnout figures with those implied by most of the polling numbers.
Almost invariably, they over-estimate how many of those who say they will vote do actually vote. Finally, the raw polls make no allowance for what past experience tells us about the gap between stated intention and the cross actually made on the ballot paper.
We know that there tends to be a late swing to the incumbents in the privacy of the polling booth. For this reason, it is wise to adjust raw polls for this late swing.
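To make the mechanics concrete, here is a minimal sketch in Python of the two corrections just described: weighting respondents by a likelihood-to-vote score rather than counting everyone equally, and nudging the resulting shares by a late swing to the incumbent. Every figure in it is invented for illustration; real pollsters use far more elaborate turnout models.

```python
# A minimal sketch of two adjustments to raw polling: down-weighting
# respondents by a (hypothetical) likelihood-to-vote score, and shifting
# a small "late swing" towards the incumbent party. All figures are
# invented for illustration.

RAW_RESPONSES = [
    # (declared party, self-reported likelihood to vote on a 0-1 scale)
    ("Conservative", 0.9), ("Labour", 0.8), ("Labour", 0.5),
    ("Conservative", 0.95), ("Labour", 0.7), ("Other", 0.6),
]

LATE_SWING_TO_INCUMBENT = 0.015  # assumed 1.5-point booth swing; hypothetical
INCUMBENT = "Conservative"
CHALLENGER = "Labour"

def weighted_shares(responses):
    """Turnout-weighted vote shares: each respondent counts in
    proportion to their stated likelihood of voting."""
    totals = {}
    for party, likelihood in responses:
        totals[party] = totals.get(party, 0.0) + likelihood
    grand = sum(totals.values())
    return {party: weight / grand for party, weight in totals.items()}

shares = weighted_shares(RAW_RESPONSES)

# Crude late-swing correction: move a fixed slice of vote share from the
# main challenger to the incumbent, reflecting the historical booth effect.
shares[INCUMBENT] += LATE_SWING_TO_INCUMBENT
shares[CHALLENGER] -= LATE_SWING_TO_INCUMBENT

for party, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{party}: {share:.1%}")
```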
Of all these factors, which was the main cause of the polling meltdown? For the answer, I think we need only look to the exit poll, which was conducted at polling stations with people who had actually voted.
Exit, pursued by a pollster
This exit poll, as in 2010, was pretty accurate. By contrast, similar exit-style polls conducted during polling day over the telephone or online, among those who declared they had voted or were going to vote, failed almost as spectacularly as the other final polls. The explanation for this difference can, I believe, be traced to the significant gap between the number of people who declare that they have voted or will vote and the number who actually do.
If this difference works particularly to the detriment of one party compared to another, then that party will under-perform in the actual vote tally relative to the voting intentions declared on the telephone or online.
In this case, it seems a very reasonable hypothesis that rather more of those who declared they were voting Labour failed to actually turn up at the polling station than was the case with declared Conservatives. Add to that late tactical switching and the well-established late swing in the polling booth to incumbents and we have, I believe, a large part of the answer.
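A toy calculation shows how powerful the differential-turnout effect alone can be. Suppose a poll has the two parties dead level, but declared Labour supporters are somewhat less likely to actually reach the polling station; the numbers below are purely hypothetical.

```python
# A toy illustration of the differential-turnout hypothesis: a poll that
# shows a dead heat, combined with a (hypothetical) gap in how many
# declared supporters of each party actually turn up to vote.

declared = {"Conservative": 500, "Labour": 500}         # a dead heat in the poll
turnout_rate = {"Conservative": 0.90, "Labour": 0.80}   # hypothetical gap

actual_votes = {p: declared[p] * turnout_rate[p] for p in declared}
total = sum(actual_votes.values())

for party, votes in actual_votes.items():
    print(f"{party}: {votes / total:.1%} of votes actually cast")
# Conservative: 52.9%, Labour: 47.1% -- a tied poll becomes a clear win.
```

On these invented numbers, a ten-point gap in turnout converts a dead heat into a near six-point Conservative lead, comparable to the sort of gap seen between the final polls and the result.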
Skin in the game
Interestingly, those who invested their own money in forecasting the outcome performed a lot better than the pollsters. The betting markets had the Conservatives well ahead in the number of seats they would win and were unmoved in that belief throughout the campaign. Polls went up, polls went down, but the betting markets had made up their mind. The Tories, they were convinced, were going to win significantly more seats than Labour.
I have interrogated huge data sets of polls and betting markets over many, many elections stretching back years and this is part of a well-established pattern. Basically, when the polls tell you one thing, and the betting markets tell you another, follow the money. Even if the markets do not get it spot on every time, they will usually get it a lot closer than the polls.
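For readers unfamiliar with how a betting price becomes a forecast, the sketch below converts decimal odds on which party wins the most seats into implied probabilities, after stripping out the bookmaker's margin (the overround). The odds shown are invented for illustration; real markets also price seat totals, majorities and much else.

```python
# "Follow the money" in practice: converting hypothetical decimal betting
# odds into an implied probability for each outcome.

decimal_odds = {
    "Conservative most seats": 1.25,
    "Labour most seats": 4.00,
}

# Raw implied probability is the inverse of the decimal odds...
raw = {outcome: 1 / odds for outcome, odds in decimal_odds.items()}

# ...but bookmakers build in a margin (the "overround"), so the raw
# figures sum to more than 1. Normalising strips that margin out.
overround = sum(raw.values())
implied = {outcome: p / overround for outcome, p in raw.items()}

for outcome, prob in implied.items():
    print(f"{outcome}: {prob:.1%}")
# Conservative most seats: ~76%, Labour most seats: ~24%
```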
So what can we learn going forward? If we want to predict the outcome of the next election, the first thing we need to do is to accept the weaknesses in the current methodologies of the pollsters, and seek to correct them, even if it proves a bit more costly. With a limited budget, it is better to produce fewer polls of higher quality than a plethora of polls of lower quality. Then adjust for known biases. Or else, just look at what the betting is saying. It’s been getting it right since 1868, before polls were even invented, and continues to do a pretty good job.
Leighton Vaughan Williams does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.
Read more http://theconversation.com/why-the-polls-got-it-so-wrong-in-the-british-election-41530