An investigation into the differences between phone and online polling for the UK’s EU membership referendum

Download the full Polls apart report



It is difficult to conceive of a tougher test for political pollsters, after a year they would rather forget, than the referendum on the UK’s membership of the European Union: a binary decision that cuts across traditional party lines with no recent equivalent vote to turn to for reference. To make matters worse, there is currently a huge gulf between what polls conducted online and most of those administered over the phone are saying. At this rate, the political polling industry risks emerging from one post-mortem only to be plunged into the need for another.

That is why Populus commissioned Matt Singh from Number Cruncher Politics, one of the very few analysts to predict correctly the outcome of the last General Election, to analyse the differences between the online and phone results of EU referendum polling and to proffer an explanation. He and Populus data analyst James Kanagasooriam have examined a year’s worth of EU polling data along with individual-level data from the large-scale British Election Study face-to-face and online surveys. Their task has been to explain why samples that look similar demographically and by voting history can give such different overall answers about people’s preferred referendum outcome depending on whether they are asked to give their views online or over the phone.

Together, Singh and Kanagasooriam work their way through the impact on polls – both phone and online – of offering “Don’t know” as a prompted rather than an unprompted alternative to “Remain” or “Leave”. They then look for markers which correlate strongly with EU referendum voting intention but which are not readily apparent from people’s demographic or political background. In the process they find a third dimension: people’s social attitudes to things like equality and national identity, which are at least as significant as their voting history or personal backgrounds in determining how they approach the issue of the UK’s EU membership. Using these two insights they begin to reconcile the discrepancy between what the polls conducted online and over the phone are now saying about what people will do on June 23rd.

This work is designed to prompt further debate about the way political pollsters go about drawing their samples and weighting them. It is part of Populus’s continuing commitment to get to the heart of where we went wrong with our political polling at the 2015 General Election, before we re-enter the fray of publishing political polls again. It should, though, be read not in the spirit of re-fighting the last war but of trying to win the next one.

Rick Nye

Managing Director, Populus

March 2016

Executive Summary

As the polling industry prepares to publish the final report of its inquiry into what went wrong at the 2015 General Election, new research from Populus & Number Cruncher Politics finds that pollsters risk further humiliation, this time at the hands of the EU referendum result. Why? Because their methodological changes concentrate on getting the last election right rather than on preparing for the very different kind of challenge posed by the EU referendum.

A feature of opinion polling for Britain’s forthcoming referendum on EU membership has been the clear, consistent and systematic difference between the results obtained by computer-assisted telephone interviewing (CATI) and those conducted on the internet. Various theories have been advanced to explain the discrepancy, but no one has yet published a detailed analysis using individual-level polling data. This paper sets out to do that.

We conclude that there are two principal causes for the discrepancy between phone and online polls on the question of the UK’s membership of the European Union.

The first is how the question is presented. Online polls almost always provide an explicit “don’t know” option when asking about the referendum, allowing undecided voters to complete the survey without giving a voting intention, whereas telephone polls generally do not, recording only volunteered “don’t know” responses. As a result, the latter method has tended to produce fewer “don’t knows”, as undecided voters are compelled to choose, most often to the benefit of the status quo option. This explains about one third of the gap between the phone and online polls.
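The arithmetic of this forced-choice effect can be sketched as follows. The 60/40 split of undecided voters towards the status quo used here is a hypothetical figure chosen for illustration, not one taken from the report’s data tables.

```python
# Illustrative arithmetic only: the poll shares and the status-quo lean
# below are hypothetical, not figures from the Polls apart report.

def forced_choice(remain, leave, dont_know, status_quo_lean=0.6):
    """Reallocate prompted "don't know" respondents as a phone-style
    forced choice, with a lean towards the status quo (Remain)."""
    return (remain + dont_know * status_quo_lean,
            leave + dont_know * (1 - status_quo_lean))

# Online-style poll with an explicit "don't know" option (shares sum to 1).
remain, leave, dk = 0.40, 0.40, 0.20

phone_remain, phone_leave = forced_choice(remain, leave, dk)
print(f"Forced choice: Remain {phone_remain:.0%}, Leave {phone_leave:.0%}")
# Undecideds breaking 60/40 for the status quo turn a tie into a 4-point lead.
```

The point of the sketch is simply that two polls with identical underlying samples can report different headline margins purely because of whether “don’t know” is offered as an option.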

The second, more significant cause is that the question of the UK’s EU membership has highlighted the differences in the make-up of the samples of the general public accessed by internet and telephone polls respectively. Each has lately been found to be less-than-perfect for political purposes and sometimes these separate deficiencies end up affecting online and phone polls similarly (as they did at the 2015 General Election). At other times these differences skew the outcome of online and phone polls in very different ways even after controlling for the differences in demographic and partisan make-up between the two samples. Simply put, people who seem otherwise alike in terms of who they are, where they live or how they’ve voted previously, can turn out to be different on the question of the EU referendum depending on whether they are part of a phone or an online sample.

In uncovering this we believe we have discovered a third, significant dimension to the challenge of drawing up properly balanced political samples: people’s broader social attitudes. These are not simply functions of someone’s demographic and partisan characteristics. Social attitudes need to be addressed in their own right because they affect political behaviour differently even among people who seem similar in many other ways. It is this that helps to explain why phone and online answers to the EU referendum question are so difficult to reconcile.

The most important question for pollsters, of course, is whether the phone or online polls lie nearer to the current truth on this issue. Taking all the evidence into account, our view is that the true state of public opinion on the UK’s continued membership of the European Union is likely to be rather closer to what telephone polls are showing – a clear Remain lead – than the narrower contest suggested by online polling. What follows is an explanation as to why.


Download the data tables for the 26th-28th February telephone poll

Download the data tables for the 26th-28th February online poll

Download the data tables for the 4th-6th March telephone poll

Download the data tables for the 2nd-10th March online poll

The Polls apart study interviewed 1,002 GB adults by phone and 2,071 GB adults online between 26th and 28th February, 1,004 GB adults by phone between 4th and 6th March, and 4,047 GB adults online between 2nd and 10th March. Data were weighted by 2015 general election vote, age, gender, region, socioeconomic grade, housing tenure, employment status, car ownership and whether the respondent had taken a foreign holiday in the last three years. Where indicated, data were additionally weighted by attitudes to gender equality, racial equality and (for respondents in England) national identity. Populus is a member of the British Polling Council and abides by its rules.
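For readers unfamiliar with how weighting to several targets at once works, the sketch below shows a minimal rim-weighting (raking) routine of the general kind pollsters use. The variables, sample and target margins are invented for illustration; the report’s actual weighting targets and procedure differ.

```python
# A minimal rim-weighting (raking) sketch: unit weights are scaled
# iteratively so the weighted sample matches each target margin in turn.
# All data and targets here are hypothetical, for illustration only.

def rake(sample, targets, iterations=50):
    """sample:  list of dicts, one per respondent, e.g. {"age": "18-34", ...}
    targets: {variable: {category: target_share}} (shares sum to 1 per variable)
    Returns one weight per respondent."""
    weights = [1.0] * len(sample)
    for _ in range(iterations):
        for var, margins in targets.items():
            total = sum(weights)  # grand total before adjusting this variable
            for cat, share in margins.items():
                idx = [i for i, r in enumerate(sample) if r[var] == cat]
                current = sum(weights[i] for i in idx) / total
                if current > 0:
                    factor = share / current
                    for i in idx:
                        weights[i] *= factor
    return weights

# Tiny hypothetical sample crossing age with 2015 vote.
sample = [
    {"age": "18-34", "vote2015": "Con"},
    {"age": "18-34", "vote2015": "Lab"},
    {"age": "35+",   "vote2015": "Con"},
    {"age": "35+",   "vote2015": "Lab"},
]
targets = {
    "age":      {"18-34": 0.30, "35+": 0.70},
    "vote2015": {"Con": 0.45, "Lab": 0.55},
}
w = rake(sample, targets)
```

The same mechanism extends to attitudinal targets (such as views on gender equality or national identity) simply by adding them as further variables, which is the sense in which the report’s “third dimension” can be weighted for alongside demographics and past vote.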