Brexit Reflections - How the polls got it wrong again

John Curtice looks at the challenge for opinion polls when estimating the likely outcome of #EURef.
Estimating the likely outcome of the referendum on the UK’s membership of the EU was always going to be a challenge for the opinion polls. In a general election they have years of experience on which to draw as to what does and does not work when estimating the level of support for the various political parties. They still make mistakes, as was evident last year, but at least they can learn from them. In a one-off referendum they have no previous experience on which to draw — and there is certainly no guarantee that what has worked in a general election will prove effective in what is a very different kind of contest.
Meanwhile the subject matter of this referendum raised a particular challenge. General elections in the UK are primarily about left and right. The question is whether the government should be doing a little more or a little less. The social division underlying this debate tends to be between the middle class and the working class.
But this referendum was about something different. With immigration featuring as one of the central issues, it was a division between “social liberals” and “social conservatives”. The former tend to be comfortable with the diversity that comes with immigration, while the latter prefer a society in which people share the same customs and culture. Social liberals were inclined to vote in favour of remaining in the EU, while social conservatives were more inclined to vote to leave.
The principal social division behind this debate is not social class but education. Graduates tend to be social liberals, while those with few, if any, educational qualifications are inclined to be social conservatives. Age matters too, with younger people tending to be more socially liberal.
Pollsters in the UK have less experience of measuring this dimension of politics. They do not, for example, necessarily collect information on the educational background of their respondents as a matter of routine. Yet any poll that contained too many or too few graduates was certainly at risk of over- or under-estimating the level of support for staying in the EU.
Meanwhile, we do not know whether there is any reason to anticipate availability bias — that is, whether Remain or Leave supporters were easier for pollsters to find than those of the opposite view. Equally, the pollsters had little idea what those who said they did not know how they were going to vote would eventually do.
Online or on the phone?
The pollsters’ difficulties in estimating referendum vote intentions were all too obvious during the referendum campaign. In particular, polls conducted by phone systematically diverged from those done via the internet in their estimate of the relative strength of the two sides. For much of the campaign, phone polls reckoned that Remain was on 55% and Leave on 45%. The internet polls were scoring the contest at 50% each — a fact that often seemed to be ignored by those who were confident that the Remain side would win. This divergence alone was clear evidence of the potential difficulty of estimating referendum vote intention correctly.
In the event, that difficulty was all too evident when the ballot boxes were eventually opened. Eight polling companies published “final” estimates of referendum voting intention based on interviewing that concluded no more than four days before polling day.
Although two companies did anticipate that Leave would win, and one reckoned the outcome would be a draw, the remaining five companies all put Remain ahead. No company even managed to estimate Leave’s share exactly, let alone overestimate it. In short, the polls (and especially those conducted by phone) collectively underestimated the strength of Leave support.
There is little doubt that the companies are disappointed with this outcome. Some have already issued statements that they will be investigating what went wrong. The British Polling Council has indicated that it will be asking its members to undertake such investigations and may have the findings externally reviewed. It will inevitably take a while before we get to the bottom of what went wrong. However, it is already clear that there is one issue that will be worthy of investigation.
As the pollsters worked out their final estimates of the eventual outcome, many of them made different decisions from those that they had made previously about how to deal with the possible impact of turnout and the eventual choice made by the “don’t knows”. In the event those decisions did not improve their polls’ accuracy.
On average the eight polls between them anticipated that Remain would win with 52%, and Leave would end up with 48%. If all the pollsters had stuck to what they had been doing earlier in the campaign (and Populus had not adjusted its figures in the way that it did), the average score of the polls would have been Remain 50%, Leave 50%. In short, at least half of the error in the polls may be a consequence of the decisions that the pollsters made about how to adjust their final figures. Polling a referendum truly is a tough business.
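As a minimal sketch of the averaging described above — using made-up poll figures, not the actual 2016 final estimates — the calculation amounts to taking the mean of each company’s two-way Remain share and treating the remainder as Leave:

```python
# Hypothetical final-poll estimates of the Remain share of the
# two-way (Remain vs Leave) vote, in percent. Illustrative only;
# these are NOT the real figures from the eight 2016 polls.
final_polls = [52, 51, 53, 52, 50, 55, 48, 51]

# Average Remain share across the polls.
remain_avg = sum(final_polls) / len(final_polls)

# In a two-way split, Leave's share is simply the remainder.
leave_avg = 100 - remain_avg

print(f"Remain {remain_avg:.1f}%, Leave {leave_avg:.1f}%")
```

With these hypothetical inputs the average is Remain 51.5%, Leave 48.5% — the same kind of headline figure the article quotes, and a reminder that a shift of only a point or two in the companies’ adjustment decisions is enough to move the average from a Remain lead to a dead heat.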
John Curtice is a Senior Research Fellow at NatCen Social Research, Professor of Politics at the University of Strathclyde and Research Consultant to the Scottish Centre for Social Research. He is particularly interested in electoral behaviour, electoral systems, and political and social attitudes.



Post by John Curtice
University of Strathclyde
28th June 2016
