The Extra Word on … the UK election polling & results

Updated: May 11, 2015

Welcome to this regular TIW column, highlighting some of the stories and results that have caught the punting eye of our editor-in-chief Sean Callander.

This business of predictions is tricky stuff. A football tipster has a lousy week, and no-one really cares. A financial advisor has a lousy week, and you’re postponing retirement for a few years. The weather bureau has a lousy week, and you either find yourself without an umbrella in a rain shower, or walking through knee-deep floodwaters on your way home from work. Which brings us to the UK, where the entire business and future of election polling has been called into question after the surprise victory of the Conservatives in last week’s election for the House of Commons.



With votes counted in all 650 constituencies, the Conservatives had won an overall majority of 331 seats in the House of Commons with Labour doomed to languish on the opposition benches with 232 seats. The result came as a complete shock following months of polls that showed the two big parties running neck-and-neck with neither close to winning an overall majority. Echoing the phrase that suddenly didn’t sound so clichéd, Prime Minister David Cameron (pictured with wife Samantha) said, “there’s only one opinion poll that counts and that’s the one on election day, and I’m not sure that’s ever been truer than it is today.”

The polls had converged to suggest the Conservatives and Labour were tied or within a point or two of each other on about 32 or 33 per cent of the vote share apiece. In fact, the Conservatives won about 37 per cent to around 31 for Labour. Such was the disbelief when the exit poll of people who had actually voted came out on Thursday night that Paddy Ashdown, a former leader of the Liberal Democrats, vowed to “publicly eat my hat” on live television if it turned out to be right. “Paddy Ashdown’s hat” has since acquired its own Twitter account and hashtag. John Curtice, a prominent elections expert and president of the British Polling Council, said the council would launch an inquiry into what had gone wrong, led by an independent statistician.
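To put a number on the scale of that miss, here is a minimal sketch comparing the polled and actual Conservative–Labour margins. The 33/32 and 37/31 figures are the approximate shares quoted above; pollsters usually judge accuracy on the gap between the two parties rather than on either share alone.

```python
# Compare the final polling average with the actual result, measured
# on the Conservative-Labour margin (approximate figures from the text).
polled = {"Conservative": 33.0, "Labour": 32.0}   # final polls, per cent
actual = {"Conservative": 37.0, "Labour": 31.0}   # election result, per cent

polled_margin = polled["Conservative"] - polled["Labour"]
actual_margin = actual["Conservative"] - actual["Labour"]
miss = actual_margin - polled_margin

print(f"Polled margin: Con +{polled_margin:.0f}")   # Con +1
print(f"Actual margin: Con +{actual_margin:.0f}")   # Con +6
print(f"Error on the margin: {miss:.0f} points")    # 5 points
```

A five-point error on the margin is enormous by the standards of a mature polling industry, which is why the result triggered a formal inquiry rather than the usual shrug.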



Curtice said there were two broad possible explanations: a strong last-minute swing to the Conservatives, or polling problems. Some pollsters admitted something had gone badly wrong, and they did not yet understand what. “Election results raise serious issues for all pollsters,” Populus, one of the main polling firms, said on Twitter. Others, such as Survation, ComRes and Ipsos MORI, defended themselves, saying they had been right about the Scottish National Party’s surge in popularity, the collapse of the Liberal Democrats and a sharp increase in vote share for anti-EU party UKIP.

“Polls, in the UK and in other places around the world, appear to be getting worse as it becomes more challenging to contact a representative sample of voters” – Nate Silver

Rob Ford, a political scientist at the University of Manchester, drew comparisons with the polling disaster of 1992, when Labour were expected to win but the Conservatives beat their polling average by 9 points to romp home. “Pollsters always mutter the words 1992 in a kind of horrified tone. They can now add 2015 to the recollections of horror,” he told the Reuters news agency. The problem in 1992 was identified as “shy Tories” who were reluctant to admit that they would vote Conservative at a time when the party was perceived to be unpopular.

But Ford said that while there may have been some shy Tories this time, that was at best a partial explanation, as pollsters had all been adjusting their data to reflect that known phenomenon. Another possibility was that pollsters had not captured representative samples of the electorate, and only experimentation with methods would address that in future. “Polls, in the UK and in other places around the world, appear to be getting worse as it becomes more challenging to contact a representative sample of voters,” said noted election forecaster Nate Silver on his FiveThirtyEight website.



Silver (pictured above) fared terribly in the UK election – in his pre-election forecast, he gave 278 seats to the Conservatives and 267 to Labour. Shortly after midnight, he was forecasting 272 seats for the Conservatives and 271 for Labour. But when the sun rose in London on Friday, the Conservatives had an expected 329 seats, against Labour’s 233. He also overestimated the Liberal Democrats’ result by roughly 20 seats. But the problem went beyond the UK. Silver went on to cite four examples where the polls had failed to provide an accurate forecast of the election outcome: the Scottish independence referendum, the 2014 U.S. midterms, the Israeli legislative elections, and even the 2012 U.S. presidential election, where “Obama beat the final polling averages by about three points nationwide”.
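The seat numbers above make the size of the forecasting error easy to tally. This quick sketch uses the pre-election forecast and the Friday-morning figures quoted in the text (the final official counts differed slightly):

```python
# Seat-forecast error: pre-election forecast vs the Friday-morning
# figures quoted above. A positive error means the party was
# under-forecast; negative means over-forecast.
forecast = {"Conservative": 278, "Labour": 267}
result   = {"Conservative": 329, "Labour": 233}

errors = {party: result[party] - forecast[party] for party in forecast}
print(errors)  # Conservatives under-forecast by 51 seats, Labour over by 34
```

An 85-seat combined swing-and-miss on the two major parties is the kind of error that no amount of clever poll aggregation can paper over.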

“It’s a very, very big miss and at this stage we just don’t know why … we’re shocked” – Damian Lyons Lowe

The former New York Times statistician gained national fame for correctly anticipating the outcome of the 2008, 2010 and 2012 U.S. elections. He did this largely by understanding how to read the polls, and by knowing which polls were worth reading. He wasn’t the only one: in 2012, Daily Kos blogger Markos Moulitsas was more accurate than Silver in predicting the electoral college outcome. Needless to say, Moulitsas was not offered a high-paying job at ESPN. But the UK result highlighted a flaw in the foundations of the thinking upon which Silver has built a predictive empire. His ability to forecast elections is largely dependent on the accuracy of polling. What if the polling is flawed?

It was also possible that voters who were undecided until the last minute had overwhelmingly voted Conservative, making their minds up too late for the polls, said Ford, who like Curtice was also involved in the exit poll. Questions were also being raised about why different polls had converged towards similar findings that were all wrong. “These were different pollsters with different methodologies but everyone missed this,” Ford said. “It’s a very, very big miss and at this stage we just don’t know why. We’re shocked.” Damian Lyons Lowe, chief executive of Survation, said he had gone as far as to “chicken out” of publishing an eve-of-election poll that gave results much closer to what came to pass because it seemed such an outlier, an admission that suggests some inconvenient data may have been set aside.



Further complicating the picture, under Britain’s first-past-the-post electoral system, what matters is coming first in individual seats rather than national share of the vote. That meant Labour lost more than 20 seats despite increasing its share of the vote by about two percentage points from 2010, while the Conservatives gained more than 20 seats despite increasing their vote share by less than one percentage point. The main problem for Labour was that it was wiped out in Scotland, losing 40 seats there to the Scottish National Party (leader Nicola Sturgeon is pictured above). And although Labour’s support increased in regions such as Yorkshire, that merely padded the majorities in seats it already held.
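The first-past-the-post effect described above can be shown with a toy example: a party's national vote share can rise while its seat count falls, because extra votes piled up in seats it already holds win nothing new. All the numbers below are invented for illustration.

```python
# Toy illustration of first-past-the-post: three hypothetical
# constituencies, vote shares in per cent. All figures are invented.
def winner(seat):
    """Return the party with the most votes in a single constituency."""
    return max(seat, key=seat.get)

# Election 1: Labour holds two of the three seats.
before = [
    {"Lab": 45, "Con": 40},   # safe Labour seat
    {"Lab": 41, "Con": 39},   # marginal seat, Labour holds
    {"Lab": 30, "Con": 50},   # safe Conservative seat
]
# Election 2: Labour's overall share rises, but the gains land in the
# seat it already held safely, and it narrowly loses the marginal.
after = [
    {"Lab": 55, "Con": 30},   # bigger majority, still one seat
    {"Lab": 39, "Con": 41},   # narrow loss
    {"Lab": 30, "Con": 50},   # unchanged
]

for label, seats in [("before", before), ("after", after)]:
    lab_share = sum(s["Lab"] for s in seats) / len(seats)
    lab_seats = sum(1 for s in seats if winner(s) == "Lab")
    print(f"{label}: Labour share {lab_share:.1f}%, seats won: {lab_seats}")
```

Here Labour's average share climbs from 38.7 to 41.3 per cent while its seat count drops from two to one, which is exactly the Yorkshire-versus-Scotland pattern the paragraph describes.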

After the 1992 disaster, some pollsters tried to deny there was any problem at all, blaming external forces rather than any fault of the polling companies. Others, though, recognised there was a problem and set about fixing it: the industry conducted a comprehensive investigation of the methodologies used in 1992, and many of the current ingredients of political polling were introduced as a result of that serious study. This general election performance follows that of the polls in the 2014 Scottish referendum, where the choice was simply binary and yet all the polls were out by between four and six points. This industry has a problem.
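A four-to-six-point miss across every pollster cannot be explained by random sampling noise alone, which is why it points to a systematic industry problem. For a simple random sample, the 95 per cent margin of error on a share p is roughly 1.96 × √(p(1−p)/n); the sketch below assumes a typical poll size of n = 1,000 (individual polls vary).

```python
import math

# 95% margin of error for a simple random sample: sampling noise alone
# for n = 1,000 (a typical, assumed UK poll size) tops out near
# +/- 3 points, so a uniform 4-6 point miss implies systematic bias.
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(0.5, 1000)  # p = 0.5 is the worst case
print(f"95% margin of error, n=1000: +/-{moe * 100:.1f} points")
```

Random error should also scatter in both directions across pollsters; when every firm misses the same way, as in both 1992 and 2015, the fault lies in shared methodology, not in the dice.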
