Nature | News: Q&A

Why the polls got the UK election wrong

Political scientist Michael Bruter says the psychology of voters who make up their minds at the last minute can yield unexpected results.


Pollsters are still rubbing their eyes at the unexpected result of the 7 May UK election. The Conservative party romped to an outright majority, meaning that incumbent Prime Minister David Cameron forms a new government. Yet polls had predicted a knife-edge race between opposition party Labour and the Conservatives. Labour's share of the nation's vote, at some 30.4%, was not only below the 35% prediction that UK polling company Ipsos MORI released on 7 May (on the basis of data collected in the two days before); it was also outside the typical 3–4% margin of error. The British Polling Council, an association of polling organizations, announced on 8 May that it was setting up an independent inquiry into the failure.

Why did the polls get British voters’ intentions so wrong? Nature asked Michael Bruter, a political scientist who studies electoral psychology at the London School of Economics.

Are you surprised that the results were so different?

There was an obvious gap between what the polls predicted and the results, but I was not surprised. In our research we find in election after election that up to 30% of voters make up their minds within one week of the election, and up to 15% on the day itself. Some people either don’t know ahead of time or change their mind when they’re in the booth.

Usually what happens is that some of these people cancel each other out. Some who thought they would vote Conservative ended up voting Labour, and vice versa. What seems to have happened yesterday is that more people changed their mind in one direction than in the other.

Why did that happen in this election?

One of the main sources of information for voters is precisely what the pollsters tell them — and I think many pollsters do not take that into consideration.

In this case, the pollsters were predicting that there would be no overall majority: that Labour would be the second party but that it would still be able to form a coalition government.

Britain has had such ‘hung parliaments’ before, but never in its history had pollsters predicted one ahead of the election.

So the question that voters were asking themselves was no longer ‘Which of the two parties do I want to win?’ but ‘Which coalition do I want?’ And that’s not something that the polls were equipped to deal with.

There is another important factor. We have found that when you ask people who they are going to vote for, they very often think about what is best for them. But when we go back to the same people after the election and ask them whom they voted for, we find that they voted much more in terms of what they think is best for the country.

Perhaps in this election, even some people who have been left out by the [Conservative and Liberal Democrat] coalition’s policies in the past five years still voted Conservative because they decided — rightly or wrongly — that that was the best choice for the country.

Don’t polling companies take these factors into account?

Not really, and it would be unfair to blame polling companies for the way things are done. They always state quite openly that they are only taking a snapshot of public opinion at a particular point in time. My research uses very long questionnaires that can take 15–25 minutes to answer. This takes a lot of time and money. Election polls normally take one minute to answer.

Polls have margins of error that are supposed to factor in their limitations, but they didn’t cover Labour’s fall in this case — why?

Margins of error are a complicated thing to measure. The traditional way of calculating them assumes a random sample. But pollsters use quota samples, in which you try to create a representative mini-population based on a number of criteria: gender, age group, region and social class. In that case it’s problematic to talk about margins of error.

When you do quota samples, you are making assumptions about what actually matters. The psychological behaviour of voters isn’t correlated very much with gender, region and so on, so quota samples can be biased from the point of view of psychological behaviour.
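The textbook margin of error that this exchange alludes to applies only to simple random samples. A short sketch makes the arithmetic concrete (the 1,000-respondent sample size and 95% confidence level here are illustrative assumptions, not figures from the article):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for a proportion p from a simple random
    sample of size n, at 95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative: a random sample of 1,000 respondents, 35% backing Labour.
moe = margin_of_error(0.35, 1000)
print(f"±{moe * 100:.1f} percentage points")  # roughly ±3 points

# Labour's actual 30.4% share falls outside a 35% ± 3 band, so the miss
# cannot be attributed to random sampling error alone.
print(0.35 - moe <= 0.304 <= 0.35 + moe)  # False
```

As Bruter notes, this formula presupposes random sampling; for the quota samples that pollsters actually use, quoting a figure like ±3% is already an approximation.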

Why don’t polls use a random sample?

Random samples are much more expensive. The other thing that companies use to drive the cost down is the mode of polling. Face-to-face polling would be much more expensive [than calling or doing Internet surveys], but you’d be more likely to have a random sample and to have a real representation of the electorate. Many people on the phone will refuse to answer a survey.

So random, face-to-face polls would be more accurate, but would cost more money and take more time. And it would still not change the fact that many people act differently when they are in the polling station.

Journal name:
Nature
DOI:
10.1038/nature.2015.17511


Comments

3 comments

  1. Neylan:
    No one is ready to see the REAL reasons the polls are wrong (forget theoretical arguments about percentages etc.) All these journalists and experts, but no one sees the real reasons because no one is being practical.. 1. Left-inclined supporters are more likely to be at home to answer the phone, they are more likely to be surfing the internet joining discussions and doing polls online. Conservative voters, if not retired, are out working (a higher % are at least), and in the evening they are not interested in answering the phone or responding to surveys. 2. No one will think badly of someone who votes to the left wing, but no one will admit voting to the right. The right inclined voters are more private and keep opinions to themselves; they are the silent majority. No one is lying, people do not change their mind at the last minute, the Poll companies are always trying to point fingers at the voters, but their methodology is WRONG, millions of dollars and no one questions who gets 'polled'. Exit polls are more accurate because the Polling company has access to people it would not normally have access to. I am not AT ALL surprised by the UK Conservative majority, the same thing happened in Canada a few years ago. I vote conservative but no one has ever asked me how I vote, yet a left-wing relative of mine who is always getting the government to pay for this and that and has more free time than me is always there when asked, so his votes in the poll have always counted, but not mine. Polling companies need to factor all this in and find a way to reach a cross-section of society accurately. Thank you.
  2. Roger Hammond:
    The article confirms the pre-election polls as manipulative of the voters' intents - how else would a voter make the "good for me/good for the nation" choice without polling information? So, the best objective is to totally discredit the whole poll system, and just wait for the actual election results! This is best achieved by lying to the pollsters, as strategically as possible. I know I do! Just think what advances we could make if the pollsters went off to do something useful...
  3. Alfredo Louro:
    So basically the margin of error (3% 19 times out of 20, whatever that means) is rubbish, polls influence the way people vote, but do not forecast how people really vote. Or, to put it another way, election polls are very similar to weather forecasts, which also influence people's behaviour, although they are often wrong.
