Bad polling

Are the public turning against Brexit and what do they really think of Communism? Polling Matters

Posted in Bad polling, Politics, Polling Matters on August 2nd, 2018 by Leo – Comments Off

This week’s Polling Matters podcast is split into three parts.

In part one: Keiran and I discuss this week’s Sky Data poll and look at the evidence for whether the public really are turning against Brexit and what this might mean for the debate in Westminster.

In part two: we look at some exclusive Opinium polling on different political systems and ideologies. What do the public think about socialism and capitalism? Is communism really being rehabilitated? And what do the public really understand about the different ideological terms that are often bandied about in the press?

In part three: I ask who pays for polling, how much we should know, and how much pollsters should be made to publish about who pays for their work (see my previous articles about this here).


ComRes and We, The People back with another loaded poll

Posted in Bad polling on July 16th, 2018 by Leo – 1 Comment

ComRes and We, The People, a new secretly-funded right-wing lobby group, are back with another loaded poll – a few weeks after they got the Mail to cover their pro-Brexit poll.

This poll got the Mail’s front page today, with the claim that most people think the police have lost control of the streets.

As with the pro-Brexit poll, the design of this survey is likely to have helped We, The People get the answers they wanted. Here are a few of the problems with the poll:

  • An early question asks whether respondents have seen a police officer on their street in the last year – a ridiculously high bar (how much time do you actually spend looking at your street? A couple of minutes a day?). Not surprisingly, most people said no, preparing them to think policing is insufficient when they come to the next questions.
  • When the idea of political correctness is introduced, respondents are led to see it as something that limits the police’s effectiveness. They are forced to choose between “The police feel like they’re on my side with my priorities and interests at heart” and “The police increasingly feel as if they have their own politically correct agenda which does not match my interests”. This framing presupposes that political correctness is opposed to respondents’ interests: the only options offered are being politically correct or acting in respondents’ interests. While it’s normal for these questions (known as polarities) to force respondents to choose between extreme positions, this one conflates two different debates (are the police politically correct? and is that good or bad?) and in doing so leads respondents to think negatively about political correctness in policing.
  • In a series of statements that respondents can agree or disagree with, six of the seven are phrased so that agreement gives the answer We, The People presumably wanted. This is bad polling practice. If you really want to measure public opinion, you ask a question that presents both sides of an argument equally, then allow respondents to choose which they are closer to. Or if you really have to ask agree/disagree questions, the set should be balanced overall, so you’re not pushing a particular argument and can compare the skewed questions against each other.

In fairness, the poll is less bad than the previous one: the wording of each individual question is less skewed this time. But the order and overall balance of the questions still add up to a loaded poll.

A good test of a fair poll is that you shouldn’t be able to guess the view of the organisation commissioning it from the questions alone. This poll clearly fails that test.

It is also arguable that the poll breaches the Market Research Society’s Code of Conduct, which says (33d) that “Members must take reasonable steps to ensure … that participants are not led towards a particular point of view”.

Amusingly, despite the leading questions, the poll still produced some results that the Mail chose to ignore, including:

  • 48% agreed “The police need to act with political correctness as it encourages acceptance and decency in society”, with 32% disagreeing (strangely, this was missed off We, The People’s press release).
  • 59% agreed “Hate crime is a blight on our society and the police are right to try to tackle it” compared with 24% saying “The concept of hate crime is well intentioned but threatens Britain’s heritage of free speech and open expression”.


Pro-Brexit survey is a long list of loaded questions

Posted in Bad polling, Europe on May 23rd, 2018 by Leo – 2 Comments

A poll on the House of Lords and Brexit, doing the rounds today, apparently shows the upper house is seen as out of tune with the public, would be wrong to try to stop Brexit and so on.

A glance at ComRes’s data tables is enough to raise doubts about the results (the tables were published promptly after the Mail ran the story, so credit for that).

The fundamental problem is that the questions were nearly all one-sided agree/disagree questions, with each one loaded against the Lords and Remainers. A couple of examples:

  • It would be wrong for the House of Lords to try and thwart Brexit [“thwart”!]
  • It is wrong that the House of Lords has already voted against the government on Brexit 14 times
  • There are currently 780 members of the Lords compared to 650 MPs in the Commons. This is too many

If you really want to measure public opinion, you ask a question that presents both sides of an argument equally, then allow respondents to choose which they are closer to. Or if you really have to ask agree/disagree questions, the set of questions should be balanced, so you’re not pushing a particular argument and can compare the skewed questions against each other.

A good test of a fair poll is that you shouldn’t be able to guess the view of the organisation commissioning it from the questions alone. This poll clearly fails that test.

The poll was done for a new pro-Brexit campaign called “We, the People”. Their website gives few clues about who they are, other than that the Fitzrovia-based outfit is a “grassroots campaigning group” that wants to “remind the liberal metropolitan elite of the ‘other Britain'”.

After the barrage of anti-Lords and pro-Brexit messages, respondents are given the opportunity to describe the Lords in terms like “out of tune with the will of the British people” and “an outdated throwback”. Unsurprisingly, these descriptions do well.

The poll hasn’t broken any rules, but surveys with such skewed questions hardly help rebuild trust in the industry.

US election polls: what went wrong?

Posted in Bad polling, Politics, Polling Matters, U.S. on November 9th, 2016 by Leo – Comments Off

Keiran and I recorded a Polling Matters podcast at 5am on election morning, responding to the results and debating what went wrong with the polls.

You can listen here:


The pollsters have to show they take this disaster seriously

Posted in Bad polling, Politics on May 8th, 2015 by Leo – Comments Off

The pollsters have had a shocker. A calamitous, humiliating, sector-threatening humdinger of an epic fail.

An uncanny consensus that Labour and the Tories would finish within one point of each other – a closely hung parliament, with Ed Miliband in Number 10 – proved utterly wrong (only one pollster put the gap wider, and that poll had Labour 2 points ahead).

Ahead of this election, some commentators pointed to the errors pollsters made in predicting the ’92 election, suggesting that the polls could be wrong again. I didn’t take it too seriously, for a few reasons: the ’92 failure was caused in part by old census data, which wasn’t a problem now; this time the pollsters had taken into account the ‘shy Tory’ effect that caused the ’92 mistakes; and there were more pollsters around this time to check one another’s results.

I was wrong, and so were the pollsters.

It’s important they realise how damaging this might be for the polling industry. As it stands, I don’t see why we should treat future election polls as more than a rough guide.

If that’s the case, why should journalists continue to pay for so many political polls?

Some pollsters seem to recognise this, like Stephan Shakespeare at YouGov:

But others, like Ipsos MORI, don’t appear to do so. In a statement, they’ve focused on what they got right (including their exit poll, which, to be fair, was excellent) as if that will divert us from the fact they called the election completely wrong.

I suggest the following approach from pollsters would be more productive:

  1. Acknowledge they got things completely wrong and that they’re disappointed in their performance.
  2. Set it in the context of how much pollsters usually get right, eg every major UK election after ’92 (broadly right, anyway).
  3. Show what they’re doing to fix it. The British Polling Council has announced an inquiry into the results: this is good news as long as it’s done well and agencies support it.

I’ve seen various possible explanations for the pollshambles, including lower-than-expected Labour turnout (though I don’t see why that couldn’t have been picked up by polls), and a fresh ‘shy Tory’ effect.

The inquiry should also look at the converging of the final polls. If the polls had finished a week earlier, two of them (Ipsos MORI on 28/4 and Ashcroft on 26/4) would have got the Labour-Tory gap pretty much right. Instead, they converged on the same answer. The fact this answer proved to be completely wrong makes me even more suspicious about the process behind this convergence.

Intriguingly, Damian Lyons Lowe at Survation has broken cover to say they suppressed a poll on the eve of the election that had nearly got the result right, as they didn’t want to be an outlier. I wonder whether any other agencies did the same – or tweaked results to fit with the pack.

Unless the pollsters show they’re on top of this, they may struggle to persuade people to take them seriously and commission polls from them in future.


Update 1: Andrew Hawkins at ComRes has joined Ipsos MORI in proclaiming how well his agency did. Not a good look, I suggest.


Update 2: Andrew Cooper of Populus has written in the FT about pollsters’ failure and the need to understand and explain what went wrong.


Update 3: This is, roughly speaking, how some of the pollsters are trying to put it:

[Image: Monty Python’s Black Knight – “just a flesh wound”]

And this is how everyone else sees it:

Update 4: Opinium and ICM have joined YouGov and Populus in apologising for the wrong prediction, while the view that polls in general can’t be trusted is becoming established:


and perhaps it will strengthen Lord Foulkes’ efforts to regulate the polling industry:

The strange case of the converging election polls

Posted in Bad polling, Politics on May 7th, 2015 by Leo – 1 Comment

The pollsters have submitted their final judgements of public opinion before the election.

They’ve disagreed for months about how people say they will vote: less than a month ago two polls on the same day put the Tories on 39 and 33 respectively and Ukip on 7 and 15.

But now the final polls are in, the results are strikingly similar.

A quick analysis shows how the variance has collapsed between the previous polls and this week’s. Variance in pollsters’ scores for Ukip fell from 7.6 in mid-April to 3.4 now, while the variance for Labour fell from 4.7 to a tiny 0.8 (all but one of the final polls put Labour on 33 or 34) (* methodology below).
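For anyone who wants to reproduce this kind of check, here is a minimal sketch of the calculation in Python. I’ve assumed population variance (mean squared deviation from the mean), and the poll figures below are illustrative placeholders rather than the actual mid-April or final-week numbers:

```python
# Population variance of pollsters' headline figures for one party:
# the mean squared deviation of each pollster's score from the mean.

def variance(scores):
    mean = sum(scores) / len(scores)
    return sum((s - mean) ** 2 for s in scores) / len(scores)

# Placeholder figures (not the real polls): a wide mid-April spread
# versus a tightly clustered set of final polls.
mid_april_ukip = [7, 10, 12, 13, 15]
final_ukip = [11, 12, 13, 15, 16]

print(f"mid-April variance: {variance(mid_april_ukip):.1f}")   # 7.4
print(f"final-poll variance: {variance(final_ukip):.1f}")      # 3.4
```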


They’re so similar, in fact, that it’s tempting to be sceptical. After months of polls that no-one could test, the polls converge on the day when they’ll be assessed against a real ballot of public opinion.

A pollster that gets it completely wrong when no-one else does would look very silly. But one that gets it wrong when everyone else does? There’s nothing to single it out. The incentive to follow the herd is clear.

I can think of several ways of rigging a poll to get the answers you want, though none seem easy or safe.

You could fiddle with the weights (including those based on respondents’ 2010 vote), though that could be detected by poll nerds; you could change the criteria for selecting who you question, though that would be a fairly crude tool for a single poll; you could even manually change some of the results after fieldwork to give the answers you want, though that’s so obviously fraudulent it would be a disaster for any pollster that got caught (if anyone wants to whistleblow, drop me a line!).
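To make the first of those concrete, here is a minimal sketch – with entirely made-up respondents and weights, not any pollster’s actual scheme – of how re-weighting the same raw data can move a headline figure:

```python
# Illustrative sketch: the same raw sample produces different headline
# vote shares depending on the weight applied to each group.
from collections import defaultdict

# Made-up respondents: (weighting group, stated vote) pairs.
sample = ([("2010_con", "Con")] * 30 + [("2010_lab", "Lab")] * 40 +
          [("2010_none", "Con")] * 10 + [("2010_none", "Lab")] * 20)

def headline(sample, weights):
    """Weighted vote shares, rounded to whole percentages."""
    totals = defaultdict(float)
    for group, vote in sample:
        totals[vote] += weights[group]
    grand_total = sum(totals.values())
    return {vote: round(100 * w / grand_total) for vote, w in totals.items()}

neutral = {"2010_con": 1.0, "2010_lab": 1.0, "2010_none": 1.0}
nudged = {"2010_con": 1.3, "2010_lab": 0.8, "2010_none": 1.0}

print(headline(sample, neutral))  # {'Con': 40, 'Lab': 60}
print(headline(sample, nudged))   # {'Con': 49, 'Lab': 51} - same data, new headline
```

In a real poll the weights are supposed to be fixed targets drawn from the census or past election results, so a tilt like this ought to be visible to anyone comparing the published tables against those targets – which is why the poll nerds could catch it.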

There are other possible, legitimate, explanations.

read more »

5 years of this blog: my favourite 5 charts

Posted in Attitudes, Bad polling, Climate Sock, Energy sources on November 23rd, 2014 by Leo – Comments Off

I’ve been writing this blog for five years. I’m most grateful to anyone who’s bothered to read it and to everyone who’s re-posted it or used my findings elsewhere.

In the spirit of these things, here are my five favourite charts that I’ve produced over the years:

5. Most people don’t understand the word ‘progressive’ 

Words are useful when they help people understand things. The word ‘progressive’ has become code among politics people for left-wing, or perhaps centre-left, or perhaps liberal in general.

It seems more common in the US, and perhaps people there understand it as meaning ‘left-wing’. They don’t here, though.

Here, for most people it has no political meaning at all: it just means “someone I like”:

Read the post


4. Wind farms are really popular, even when they’re built nearby

On one level I sort of understand the Tory Party’s opposition to wind farms. I’m sure there are some people who viscerally hate them, maybe even majorities in some communities, and perhaps Tory policy wonks think they’re a bad investment.

But the way some senior Tories talk, it’s as if wind farms are as popular neighbours as paedophile collectives – particularly compared with how they talk about fracking. They seem to assume that wind farms are hated, and that everyone knows they’re hated.

Which is odd, because this is what people think about potential local power sources:

Read the post

3. People no longer think the monarchy make Britain better

read more »

This fracking poll finding is one of the least convincing I’ve ever seen

Posted in Bad polling, Energy sources on August 11th, 2014 by Leo – 5 Comments

A new poll has found that over three times as many people support fracking as oppose it. That’s a reversal of previous polls, in which most people opposed fracking. So has there been a change in the public mood?

No.

Instead, Populus and UK Onshore Oil and Gas have published one of the most misleading poll findings I’ve ever seen.

Short of faking results or fiddling the weights or sample (which this poll doesn’t do), there are two ways to get a poll to give the answers you want. You can ask a series of leading questions that get respondents thinking the way you want them to, then ask the question you’re really interested in. Or you can word the questions so respondents only see half the argument.

This poll does both.

The opening three questions are statements that form the basis of the argument for fracking. They’re phrased without any costs (free ponies for all), counter-arguments or alternatives:

  • The UK needs to invest more in a whole range of new infrastructure, including housing, roads and railways, airport capacity and new energy sources
  • The UK needs to use a range of energy sources to meet the country’s energy needs
  • Britain needs to be able to produce its own energy so it isn’t reliant on gas from other countries

Then comes the clincher: a question on fracking that is 146 words long, describes the process with reassuring terms like “tiny fractures” and “approved non-hazardous chemicals”, and tells us that it could meet the UK’s natural gas demand for 50 years. No challenge to the data, no costs or consequences, no alternative energy sources.

This isn’t an attempt to find out what the public think about fracking. It’s message testing.

That’s what political candidates or businesses do before launching a campaign. They fire a load of messages at respondents to see how much support they could gain in a theoretical world where only their view is heard, and which arguments are most effective.

It’s a useful technique for finding out how people might respond to your arguments. But it’s not supposed to represent what people actually think now.

Except not only was this poll press-released as if it showed what people currently think, it was reported as such by the BBC, the Press Association and the Telegraph.

This is the kind of thing that destroys trust in polling. I can see why UKOOG wanted it, and I get that the journalists wanted a counter-intuitive story (though it’s a shame they didn’t question what they were given). But I’m surprised that a reputable pollster went for it.


If I coated it in honey, would you be more likely to eat a live cockroach?

Posted in Bad polling on February 6th, 2013 by Leo – Comments Off

Think of something you would never do. For the sake of an example, let’s call it eating a live cockroach. Now suppose I tell you I’ve done something to make it slightly less unappealing, perhaps coated it in honey. Would you be more or less likely to eat that cockroach?

This is a question type repeatedly used by pollsters. I’m going to show why they should stop using it, and why its results should generally be ignored.

Here’s an example. Last week, Survation did a poll for the Mirror on illegal drugs.* In that poll was a question on whether or not respondents had ever taken drugs, and another on what they would do if drugs were sold guaranteed to be free of contaminants.

The results demonstrate why questions of the format “what would you do if x happened” shouldn’t be taken at face value:

So 32% of people who’ve never taken drugs say they would be less likely to take drugs that were guaranteed not to be contaminated.

Read that again. It suggests that a third of people who’ve never taken drugs are currently a bit tempted to give them a try by the thought that the drugs they aren’t buying might contain a bit of rat poison.

Obviously this is complete rubbish. Almost all of that 32% are making a different point: they would never take drugs, and nothing the pollster can say would make them change their mind.

Logic might dictate that they should be more likely to take drugs if the drugs were a bit safer. But they so strongly don’t want to take drugs that they will give the most negative answer they can, regardless of what inducements they’re offered (this is similar to the Twitter response to Tom Chivers’ suggestion that liberals should be more likely to vote Tory because of the gay marriage legislation).

That was an easy one to spot, but sometimes the silliness of the result isn’t so obvious.

read more »

Is Euroscepticism collapsing, or is it just bad polling?

Posted in Bad polling, Europe on January 20th, 2013 by Leo – 7 Comments

Today’s YouGov poll shows a startling change in attitudes to the EU. The results suggest more people would now vote to stay in the EU than to leave it: 40% staying in against 34% wanting to leave.

That’s a big swing from two months ago, when 49% said they would vote to leave – 17 points ahead of those wanting to stay:

Shifts like these don’t just happen by themselves. But is it real, or is something going on with the polling?

Option 1: a change in opinion

There are grounds for thinking a real shift has happened. The last time ‘vote to stay in’ was this high was December 2011: just after Cameron’s walkout at the EU summit.

At that time, the suggestion that the UK would leave the EU moved from remote to seeming more possible. Perhaps people started responding to the polling question differently: saying “I’d vote to leave the EU” became less of an empty threat.

Maybe that’s what happened this time as well. Over the last couple of weeks, discussions about the UK’s future in the EU have dominated the news again. People have started thinking about their own view, and they’ve responded to YouGov with a more considered opinion, which has taken some people away from the ‘out’ camp.

So we have a plausible explanation – but it’s not the only possible answer.

Option 2: bad polling

Some polling is designed to find out what people would do if they’re exposed to certain information or arguments. If Tesco promised to make its beefburgers with only British ingredients, would you be more likely to shop there? If you’re told that 60% of people affected by the benefit cap are in work, would you be more likely to oppose it?

But other polling is supposed to be a pure measure of what people currently think. Questions like voting intention and the EU referendum question should be in this category.

So for the EU referendum question to show accurately what people think, respondents shouldn’t be shown anything that might influence their response. In an ideal world, they’d only be asked about the EU, and then the poll would finish. But that would be expensive, so we have to accept that the EU question will go in a poll with other questions.

In that case, the other questions respondents see need to be consistent between polls. So if respondents are being influenced by the other questions, at least it’s happening in a comparable way.

But that’s not how YouGov have done it.

read more »