Bad polling

US election polls: what went wrong?

Posted in Bad polling, Politics, Polling Matters, U.S. on November 9th, 2016 by Leo – Comments Off

Keiran and I recorded a Polling Matters podcast at 5am on election morning, responding to the results and debating what went wrong with the polls.

You can listen here:

 

The pollsters have to show they take this disaster seriously

Posted in Bad polling, Politics on May 8th, 2015 by Leo – Comments Off

The pollsters have had a shocker. A calamitous, humiliating, sector-threatening humdinger of an epic fail.

An uncanny consensus that Labour and the Tories would be within one point of each other – a closely hung parliament, with Ed Miliband in Number 10 – was proved to be utterly wrong (only one pollster put a wider gap, and that had Labour 2 points ahead).

Ahead of this election, some commentators pointed to the errors pollsters made in predicting the ’92 election, suggesting that the polls could be wrong again. I didn’t take it too seriously, for a few reasons: the ’92 error was caused in part by old census data, which wasn’t a problem this time; the pollsters had since taken account of the ‘shy Tory’ effect behind the ’92 mistakes; and there are more pollsters around now to check one another’s results.

I was wrong, and so were the pollsters.

It’s important they realise how damaging this might be for the polling industry. As it stands, I don’t see why we should treat future election polls as more than a rough guide.

If that’s the case, why should journalists continue to pay for so many political polls?

Some pollsters seem to recognise this, like Stephan Shakespeare at YouGov:

But others, like Ipsos MORI, don’t appear to do so. In a statement, they’ve focused on what they got right (including their exit poll, which, to be fair, was excellent) as if that will divert us from the fact they called the election completely wrong.

I suggest the following approach from pollsters would be more productive:

  1. Acknowledge they got things completely wrong and that they’re disappointed in their performance.
  2. Set it in the context of how much pollsters usually get right, eg every major UK election after ’92 (broadly right, anyway).
  3. Show what they’re doing to fix it. The British Polling Council has announced an inquiry into the results: this is good news as long as it’s done well and agencies support it.

I’ve seen various possible explanations for the pollshambles, including lower-than-expected Labour turnout (though I don’t see why that couldn’t have been picked up by polls), and a fresh ‘shy Tory’ effect.

The inquiry should also look at the converging of the final polls. If the polls had finished a week earlier, two of them (Ipsos MORI on 28/4 and Ashcroft on 26/4) would have got the Labour-Tory gap pretty much right. Instead, they converged on the same answer. The fact this answer proved to be completely wrong makes me even more suspicious about the process behind this convergence.

Intriguingly, Damian Lyons Lowe at Survation has broken cover to say they suppressed a poll on the eve of the election that had nearly got the result right, as they didn’t want to be an outlier. I wonder whether any other agencies did the same – or tweaked results to fit with the pack.

Unless the pollsters show they’re on top of this, they may struggle to persuade people to take them seriously and commission polls from them in future.

 

Update 1: Andrew Hawkins at ComRes has joined Ipsos MORI in proclaiming how well his agency did. Not a good look, I suggest.

 

Update 2: Andrew Cooper of Populus has written in the FT about pollsters’ failure and the need to understand and explain what went wrong.

 

Update 3: This is, roughly speaking, how some of the pollsters are trying to put it:

[Image: flesh-wound]

And this is how everyone else sees it:

Update 4: Opinium and ICM have joined YouGov and Populus in apologising for the wrong prediction, while the view that polls in general can’t be trusted is becoming established:

 

and perhaps it will strengthen Lord Foulkes’ efforts to regulate the polling industry:

The strange case of the converging election polls

Posted in Bad polling, Politics on May 7th, 2015 by Leo – 1 Comment

The pollsters have submitted their final judgements of public opinion before the election.

They’ve disagreed for months about how people say they will vote: less than a month ago two polls on the same day put the Tories on 39 and 33 respectively and Ukip on 7 and 15.

But now the final polls are in, the results are strikingly similar.

A quick analysis shows how the variance has collapsed between previous polls and this week’s. Variance in pollsters’ scores for Ukip fell from 7.6 in mid-April to 3.4 now, while the variance for Labour fell from 4.7 to a tiny 0.8 (all but one of the final polls put Labour on 33 or 34) (* methodology below).
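For anyone who wants to reproduce that kind of check, here’s a minimal sketch of the calculation. The figures in it are illustrative placeholders rather than the actual published poll scores, and it assumes population (rather than sample) variance, which may not match the methodology note in the full post.

```python
from statistics import pvariance

# Illustrative placeholder figures, not the actual published polls:
# substitute each pollster's real Labour (or Ukip) score for the
# two periods being compared.
mid_april_labour = [31, 33, 35, 34, 36, 32, 34, 35]
final_labour = [33, 34, 33, 34, 34, 33, 34, 35]

print("Variance, mid-April polls:", round(pvariance(mid_april_labour), 1))
print("Variance, final polls:", round(pvariance(final_labour), 1))
```

The exact numbers will differ from the ones quoted above, but the pattern to look for is the same: a much smaller spread across pollsters in the final round than in the earlier ones.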

 

They’re so similar, in fact, that it’s tempting to be sceptical. After months of polls that no-one could test, the polls converge on the day when they’ll be assessed against a real ballot of public opinion.

A pollster that got it completely wrong, when no-one else did, would look very silly. But one who gets it wrong when everyone else does? There’s nothing to single them out. The incentive for following the herd is clear.

I can think of several ways of rigging a poll to get the answers you want, though none seem easy or safe.

You could fiddle with the weights (including those based on respondents’ 2010 vote), though that could be detected by poll nerds; you could change the criteria for selecting whom to question, though that would be a fairly crude tool for a single poll; you could even manually change some of the results after fieldwork to give the answers you want, though that’s so obviously fraudulent it would be a disaster for any pollster that got caught (if anyone wants to whistleblow, drop me a line!).
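To make the first of those concrete, here’s a toy illustration of how far a headline share can move just by shifting the past-vote weighting targets, without touching a single response. Every figure in it is hypothetical, and real polls weight on several variables at once, so this only shows the mechanism, not any pollster’s actual scheme.

```python
# All figures hypothetical. Real polls weight on several variables at
# once (age, region, readership, past vote and so on); this isolates
# the past-vote targets to show the mechanism.

# Current Conservative share (%) within each 2010-vote group of the raw sample:
con_now = {"Con 2010": 85, "Lab 2010": 5, "Other/none": 25}

def headline(targets):
    """Weighted headline Con share: sum of (target group share * group's Con %)."""
    return sum(targets[g] * con_now[g] for g in con_now)

# Two sets of weighting targets that both look defensible on paper:
targets_a = {"Con 2010": 0.32, "Lab 2010": 0.26, "Other/none": 0.42}
targets_b = {"Con 2010": 0.36, "Lab 2010": 0.25, "Other/none": 0.39}

print(f"Headline Con share with targets A: {headline(targets_a):.1f}%")
print(f"Headline Con share with targets B: {headline(targets_b):.1f}%")
```

A couple of points of movement from the targets alone is exactly the sort of thing that would be hard to spot from the published tables unless you’re one of those poll nerds.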

There are other possible, legitimate, explanations.

read more »

5 years of this blog: my favourite 5 charts

Posted in Attitudes, Bad polling, Climate Sock, Energy sources on November 23rd, 2014 by Leo – Comments Off

I’ve been writing this blog for five years.  Most grateful to anyone who’s bothered to read it and to everyone who’s re-posted it or used my findings elsewhere.

In the spirit of these things, here are my five favourite charts that I’ve produced over the years:

5. Most people don’t understand the word ‘progressive’ 

Words are useful when they help people understand things. The word ‘progressive’ has become code among politics people for left-wing, or perhaps centre-left, or perhaps liberal in general.

It seems more common in the US and perhaps there people understand it as meaning ‘left-wing’. They don’t here though.

Here, for most people it has no political meaning at all: it just means “someone I like”:

Read the post

 

4. Wind farms are really popular, even when they’re built nearby

On one level I sort of understand the Tory Party’s opposition to wind farms. I’m sure there are some people that viscerally hate them, maybe even majorities in some communities, and perhaps Tory policy wonks think they’re a bad investment.

But the way some senior Tories talk, it’s as if wind farms are as popular neighbours as paedophile collectives – particularly compared with how they talk about fracking. They seem to assume that wind farms are hated, and everyone knows they’re hated.

Which is odd, because this is what people think about potential local power sources:

Read the post

3. People no longer think the monarchy make Britain better

read more »

This fracking poll finding is one of the least convincing I’ve ever seen

Posted in Bad polling, Energy sources on August 11th, 2014 by Leo – 5 Comments

A new poll has found over 3 times as many people support fracking as oppose it. That’s a reversal of previous polls, in which most people generally opposed fracking. So has there been a change in the public mood?

No.

Instead, Populus and UK Onshore Oil and Gas have published one of the most misleading poll findings I’ve ever seen.

Short of faking results or fiddling the weights or sample (which this poll doesn’t do), there are two ways to get a poll to give the answers you want. You can ask a series of leading questions that get respondents thinking the way you want them to, then ask the question you’re really interested in. Or you can word the questions so respondents only see half the argument.

This poll does both.

The opening three questions are statements that form the basis of the argument for fracking. They’re phrased without any costs (free ponies for all), counter-arguments or alternatives:

  • The UK needs to invest more in a whole range of new infrastructure, including housing, roads and railways, airport capacity and new energy sources
  • The UK needs to use a range of energy sources to meet the country’s energy needs
  • Britain needs to be able to produce its own energy so it isn’t reliant on gas from other countries

Then comes the clincher: a question on fracking that is 146 words long, describes the process with reassuring terms like “tiny fractures” and “approved non-hazardous chemicals”, and tells us that it could meet the UK’s natural gas demand for 50 years. No challenge to the data, no costs or consequences, no alternative energy sources.

This isn’t an attempt to find out what the public think about fracking. It’s message testing.

That’s what political candidates or businesses do before launching a campaign. They fire a load of messages at respondents to see how much support they could gain in a theoretical world where only their view is heard, and which arguments are most effective.

It’s a useful technique for finding out how people might respond to your arguments. But it’s not supposed to represent what people actually think now.

Except not only was this poll press released as if it shows what people currently think, it was reported as such by the BBC, Press Association and the Telegraph.

This is the kind of thing that destroys trust in polling. I can see why UKOOG wanted it, and I get that the journalists wanted a counter-intuitive story (though it’s a shame they didn’t question what they were given). But I’m surprised that a reputable pollster went for it.

TL;DR:

If I coated it in honey, would you be more likely to eat a live cockroach?

Posted in Bad polling on February 6th, 2013 by Leo – Comments Off

Think of something you would never do. For the sake of an example, let’s call it eating a live cockroach. Now suppose I tell you I’ve done something to make it slightly less unappealing, perhaps coated it in honey. Would you be more or less likely to eat that cockroach?

This is a question type repeatedly used by pollsters. I’m going to show why they should stop using it, and why its results should generally be ignored.

Here’s an example. Last week, Survation did a poll for the Mirror on illegal drugs.* In that poll was a question on whether or not respondents had ever taken drugs, and another on what they would do if drugs were sold guaranteed to be free of contaminants.

The results demonstrate why questions of the format “what would you do if x happened” shouldn’t be taken at face value:

So 32% of people who’ve never taken drugs say they would be less likely to take drugs that were guaranteed not to be contaminated.

Read that again. It suggests that a third of people who’ve never taken drugs are currently a bit tempted to give them a try by the thought that the drugs they aren’t buying might contain a bit of rat poison.

Obviously this is complete rubbish. Almost all of that 32% are making a different point: they would never take drugs, and nothing the pollster can say would make them change their mind.

Logic might dictate that they should be more likely to take drugs if they were a bit safer. But they so strongly don’t want to take drugs, they will give the most negative answer they can regardless of what inducements they’re offered (this is similar to the Twitter response to Tom Chivers’ suggestion that liberals should be more likely to vote Tory because of the gay marriage legislation).

That was an easy one to spot, but sometimes the silliness of the result isn’t so obvious.

read more »

Is Euroscepticism collapsing, or is it just bad polling?

Posted in Bad polling, Europe on January 20th, 2013 by Leo – 7 Comments

Today’s YouGov poll shows a startling change in attitudes to the EU. The results suggest more people would now vote to stay in the EU than to leave it: 40% staying in against 34% wanting to leave.

That’s a big swing from two months ago, when 49% said they would vote to leave: 17pts ahead of those wanting to stay:

Shifts like these don’t just happen by themselves. But is it real, or is something going on with the polling?

Option 1: a change in opinion

There are grounds for thinking a real shift has happened. The last time ‘vote to stay in’ was this high was December 2011: just after Cameron’s walkout from the EU summit.

At that time, the suggestion that the UK would leave the EU moved from remote to seeming more possible. Perhaps people started responding to the polling question differently: saying “I’d vote to leave the EU” became less of an empty threat.

Maybe that’s what happened this time as well. Over the last couple of weeks, discussions about the UK’s future in the EU have dominated the news again. People have started thinking about their own view, and they’ve responded to YouGov with a more considered opinion, which has taken some people away from the ‘out’ camp.

So we have a plausible explanation – but it’s not the only possible answer.

Option 2: bad polling

Some polling is designed to find out what people would do if they’re exposed to certain information or arguments. If Tesco promised to make its beefburgers with only British ingredients, would you be more likely to shop there? If you’re told that 60% of people affected by the benefit cap are in work, would you be more likely to oppose it?

But other polling is supposed to be a pure measure of what people currently think. Questions like voting intention and the EU referendum question should be in this category.

So for the EU referendum question to show accurately what people think, respondents shouldn’t be shown anything that might influence their response. In an ideal world, they’d only be asked about the EU, and then the poll would finish. But that would be expensive, so we have to accept that the EU question will go in a poll with other questions.

In that case, the other questions respondents see need to be consistent between polls. So if respondents are being influenced by the other questions, at least it’s happening in a comparable way.

But that’s not how YouGov have done it.

read more »

Why political polling is dying

Posted in Bad polling on October 17th, 2012 by Leo – Comments Off

Each week, YouGov conduct six political polls for News International. ComRes poll for the Independent about once a fortnight; ICM, MORI and Populus do monthly political polls; Opinium seem to be polling on a weekly basis; and Survation and TNS have irregular but frequent polls.

Compare this with 10 years ago. According to Mark Pack’s list of past political polls, this week in October 2002 had just two polls. Ten years before that, there was one poll in the same week. So political polling now appears in unprecedented health.

But perversely, political polling contains the seeds of its own destruction. Here’s why.

Polls are commissioned for one of two reasons. Either the commissioner wants to know something, or they want someone else to know something.

Most polls are never made public. They’re commissioned by companies that want to know how they’re viewed, or to test ideas, or to see what people think about a question that’s important to them. These are often of little interest to anyone who’s not directly involved, though sometimes they contain some fascinating insights.

However lots of polls are made public. Some of these are by organisations trying to create a story that helps their cause. A recent example was this Populus poll, commissioned by the Tories and credulously reported by the Guardian despite a question sequence carefully designed to give Ed Miliband a bad score.

But most public political polling isn’t done by campaign groups: it’s done by newspapers, and they do it because they want a good story that sells papers. This is where the problem is.

read more »

How DECC is wasting money on its new opinion poll

Posted in Bad polling, Climate Sock on September 9th, 2012 by Leo – 2 Comments

The Department of Energy and Climate Change has started a new tracking poll on public attitudes to a few of their issues. The first wave was out earlier this year (details and results here), and wave 2 should be out soon.

It won’t be much of a surprise that I’m generally in favour of polling. It’s important that people in government (and others with power) know what the public think about the policies they’re making decisions on, and well-conducted opinion polls are a way of finding this out. They equalise the volume of everyone’s voice so that each opinion counts the same, media mogul or not (though of course polls don’t deal with how those opinions are formed).

DECC’s poll is on an important topic, conducted not to create headlines but so the government can better understand what the public think, so I should be in favour. But I’m increasingly of the view that it’s been badly put together and is costing too much public money.

Confusing questions

The first issue is the quality of the questions. For brevity I’m just going to focus on the two climate change questions, though there are also others I could make the same argument about.

One of the questions asks: “How concerned, if at all, are you about current climate change, sometimes referred to as ‘global warming’?”.

The problem is that word ‘current’. I think it’s intended to distinguish 20th/21st Century climate change from historical climate change: the ice ages and so on.

But when I first read the question, I understood it to mean the climate change we’re experiencing in 2012, as opposed to what we’re going to experience in 20 years’ time. Since I’m only a little concerned about the climate change we’re experiencing in 2012, I would answer the question accordingly.

I don’t know whether others would understand the question as I did, or whether they would think ‘current’ is referring to 20th/21st Century climate change. Given that, I have no idea how to interpret the results of the question, and no-one else can know either.

The other climate change question asks: “Thinking about the causes of climate change, which, if any, of the following best describes your opinion?”, which all seems fine to me.  But then the answer choices are:

  1. Climate change is entirely caused by natural processes
  2. Climate change is mainly caused by natural processes
  3. Climate change is partly caused by natural processes and partly caused by human activity
  4. Climate change is mainly caused by human activity
  5. Climate change is entirely caused by human activity
  6. I don’t think there is such a thing as climate change.
  7. Don’t know
  8. No opinion

What on earth is choice 3 supposed to be doing? If I think that climate change is mainly human but could also be a bit natural, I could pick either choice 3 or choice 4. Someone who thought it was mostly natural but a bit human could pick choice 2 or choice 3. Given these different interpretations, it’s hard to know what the data mean.

read more »

Does the Evening Standard understand its own Boris vs Ken poll?

Posted in Bad polling, London, Media on April 10th, 2012 by Leo – Comments Off

I try not to write much about polling methodology. I doubt it’s of interest to many people, and besides, Anthony Wells does it much better than I do.

But there’s been some truly awful reporting today of the latest London mayoral poll, and it’s time to look at weighting and so on.

According to today’s Evening Standard, their new ComRes poll shows “a dramatic slide in Mr Livingstone’s support after his argument with his Tory rival over tax in a radio station lift”.

They go on to say that those “interviewed before ‘liftgate’ last Tuesday morning were split 50/50 between the two candidates. But those surveyed afterwards divided 60/40 in favour of Mr Johnson.” ITV also reported it with the same angle.

This all sounds very plausible and interesting, but it’s in fact a bad misrepresentation of the poll.

The issue is, the poll was never designed to show how opinion changed after shoutyBorisgate. Of course it wasn’t: the poll was set up without anyone knowing there would be any event to compare ‘before’ and ‘after’.

If you do know that an event is coming, say a leaders’ debate, you can run two separate polls, with comparable samples (or even, with the same people), and see how the results compare.

But this ComRes poll doesn’t do that. Instead, a little over three quarters of the poll was conducted before the interview, and the remainder after. Nothing looks to have been done to make sure the samples before and after were comparable.

So we’ve got two groups of people. In terms of how they voted in the last general election (nothing to do with Ken and Boris), the first group has 29% Labour voters and 27% Tory voters. The second group has 26% Labour voters and 32% Tory voters. A Labour 2pt lead vs a Tory 5pt lead.

We then ask them how they’d vote in the London election, and are supposed to be surprised when the group with more Tories say they’re more likely to vote for the Tory candidate!
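To see how much of an apparent swing the composition of the two groups can produce on its own, here’s a toy calculation. The 2010-vote shares are the ones quoted above; the loyalty assumptions (2010 Labour voters back Ken, 2010 Tories back Boris, everyone else splits evenly) are deliberately crude simplifications of my own, not anything taken from the poll’s tables.

```python
def boris_vs_ken(lab_2010, con_2010):
    """Assume 2010 Labour voters back Ken, 2010 Tories back Boris,
    and everyone else splits evenly; return (Boris %, Ken %)."""
    other = 1 - lab_2010 - con_2010
    ken = lab_2010 + other / 2
    boris = con_2010 + other / 2
    total = ken + boris
    return 100 * boris / total, 100 * ken / total

# Interviewed before the lift argument: 29% Lab 2010, 27% Con 2010
print("Before: Boris %.0f%% / Ken %.0f%%" % boris_vs_ken(0.29, 0.27))
# Interviewed after: 26% Lab 2010, 32% Con 2010
print("After:  Boris %.0f%% / Ken %.0f%%" % boris_vs_ken(0.26, 0.32))
```

Even with nobody changing their mind, the ‘after’ group comes out several points more Boris-leaning than the ‘before’ group, which is roughly the shape of the story the Standard ran.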

read more »