The 2015 general election brought a similar reputational challenge to political pollsters, and our clients and prospective clients are entitled to know how we have responded to it.
Polling how many people will vote and who they will vote for is a uniquely difficult research challenge. In 2015, methods that had been highly accurate at the previous four general elections abruptly failed.
All the pre-election voting polls were wrong, not just Populus polls; indeed, many were much further from the result than Populus. But we take no comfort at all from that.
This is how we responded to our own trust crisis, when it became clear that the pre-election polls had got the 2015 election wrong:
We led the call to set up a full independent inquiry into why the polls were wrong. This was established the day after the election. It is public, exhaustive and composed entirely of independent experts and world-leading academics, and it is due to report its initial conclusions in January.
We launched our own review of our voting polls, re-examining all the data, the underlying samples and the specifics of question wording, question order and data weighting.
We took an immediate decision not to publish any voting intention polls until we were confident that we had fixed the problem (and that if we could not fix the problem, we would stop conducting voting intention polls altogether). We have declined business as a result of this decision and will continue to do so.
Our work to understand where our voting polls went wrong in 2015 is almost complete. It has already yielded some fascinating insights and, in turn, some compelling new ways to understand not just what consumers think, but how and why.