Failure of election polls exposes research industry’s dirty secret

Technological and behavioural change has rendered existing approaches to opinion polling a sham. The people who take part aren’t representative, and there’s no way to check whether they are telling the truth.

From the moment the BBC/ITV/Sky News exit poll was published last night, political pundits began to question how the opinion polls had got the likely outcome of the election so wrong.

As the results were announced it became clear that the exit polls were largely accurate, while the pre-election opinion polls – and the miles of column inches given over to them – had been way off the mark.

But anyone who works in market research will not have been surprised. The inaccuracy of political opinion polls is just a particularly high-profile example of the industry’s dirty secret.

It all comes down to the sample and its recruitment – literally how you find people to answer the survey questions.

Online surveys necessarily have a self-selecting sample: those willing to register on a pollster’s website and log in to take part.

Telephone surveys, meanwhile, use random-digit dialling, but rely on people still having landlines, picking up their phones and being willing to take part.

And whichever methodology is used, there is still inherent bias in the sample, with those most motivated to participate likely to be those more swayed by a little extra pocket money, who have relatively more time and perhaps less excitement in their lives.

You might even say that survey participation has become an industry unto itself. A simple online search will reveal tips for people who want to participate in more surveys so they can earn more money. Frequently they will be advised to ‘pretend’ to be from a demographic that is less well represented in research – generally men and those who are older, better-educated and wealthier.

The pollsters’ challenge is that those who are and are not willing to take part seem to be growing ever further apart. Weight the findings all you like – but if the people willing to complete a poll are fundamentally different from the population at large, the results will never be an accurate barometer of opinion, and you can’t ‘weight’ that problem away.
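To see why, consider a toy simulation – a minimal sketch with entirely illustrative numbers, not drawn from any real poll. Suppose willingness to answer surveys differs by gender, and that willing people also hold different opinions from unwilling people within every demographic group. Re-weighting the sample so its demographics match the population does nothing to correct the bias:

```python
import random

random.seed(0)

# Hypothetical population: 50% men / 50% women. Men are assumed less
# willing to take surveys than women, and – crucially – willing people
# support the proposition at a different rate (60%) than unwilling
# people (40%), regardless of gender. All figures are illustrative.
population = []
for _ in range(100_000):
    gender = random.choice(["m", "f"])
    willing = random.random() < (0.1 if gender == "m" else 0.3)
    supports = random.random() < (0.6 if willing else 0.4)
    population.append((gender, willing, supports))

true_support = sum(p[2] for p in population) / len(population)

# The poll only ever reaches willing people. It then re-weights by
# gender so the sample "matches" the 50/50 population split.
sample = [p for p in population if p[1]]
share = {g: sum(1 for p in sample if p[0] == g) / len(sample)
         for g in ("m", "f")}
weights = {g: 0.5 / share[g] for g in ("m", "f")}

weighted_support = (sum(weights[p[0]] * p[2] for p in sample)
                    / sum(weights[p[0]] for p in sample))

print(f"true support:  {true_support:.1%}")   # about 44%
print(f"weighted poll: {weighted_support:.1%}")  # about 60% - still wrong
```

The demographics now look perfect on paper, yet the weighted estimate stays near 60% while true support is around 44%, because the weighting only corrects who is *in* the sample by group, not the fact that responders within each group think differently from non-responders.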

It is possible, for example, that the polls radically under-estimated Conservative support by relying too heavily on the availability of internet connections – biasing the sample away from rural households towards more urban, technologically engaged audiences. Alternatively, the small financial incentives may have had disproportionate traction among certain groups of voters more likely to vote Labour.

Add all this to other issues – the limited ways to verify who is completing online surveys, and a total inability to check what they are saying is true – and you have a serious problem.

So what’s the solution? The exit polls hold a clue. Yes, they too will have used sophisticated weighting techniques, but their real strength is simply their ability to engage people who would never ‘voluntarily’ complete a survey.

That’s the real challenge facing pollsters in the future: adapting their methods to access a range of people who are truly representative of the population at large.

To achieve it, they need to do more than just mix up online, telephone and face-to-face approaches. They need to fundamentally re-think how they reach out to and incentivise people to take part in their surveys.

If there is a positive to have come out of pollsters’ collective failure, it is that the industry’s endemic problems of recruitment and sampling have been exposed, and will no longer be tolerated.

Instead, the onus now falls on the research industry to improve its methods or find alternatives – because the flaws of opinion polls are no secret anymore.

Revealing Reality is a multi-award-winning research agency committed to revealing what people really do, as opposed to what they say they do. We pride ourselves on our recruitment and research techniques – working hard to engage the widest possible range of different kinds of people across the UK. Much of our research is face-to-face and we flex our methods to ensure samples aren’t biased towards those keenest to participate.