Bad polling

Why political polling is dying

Posted in Bad polling on October 17th, 2012 by Leo – Comments Off

Each week, YouGov conduct six political polls for News International. ComRes poll for the Independent about once a fortnight; ICM, MORI and Populus do monthly political polls; Opinium seem to be polling on a weekly basis; and Survation and TNS have irregular but frequent polls.

Compare this with 10 years ago. According to Mark Pack’s list of past political polls, this week in October 2002 had just two polls. Ten years before that, there was one poll in the same week. So political polling now appears to be in unprecedented health.

But perversely, political polling contains the seeds of its own destruction. Here’s why.

Polls are commissioned for one of two reasons. Either the commissioner wants to know something, or they want someone else to know something.

Most polls are never made public. They’re commissioned by companies that want to know how they’re viewed, or to test ideas, or to see what people think about a question that’s important to them. These are often of little interest to anyone who’s not directly involved, though sometimes they contain some fascinating insights.

However, lots of polls are made public. Some of these are by organisations trying to create a story that helps their cause. A recent example was this Populus poll, commissioned by the Tories and credulously reported by the Guardian despite a question sequence carefully designed to give Ed Miliband a bad score.

But most public political polling isn’t done by campaign groups: it’s done by newspapers, and they do it because they want a good story that sells papers. This is where the problem is.

read more »

How DECC is wasting money on its new opinion poll

Posted in Bad polling, Climate Sock on September 9th, 2012 by Leo – 2 Comments

The Department of Energy and Climate Change has started a new tracking poll on public attitudes to a few of their issues. The first wave was out earlier this year (details and results here), and wave 2 should be out soon.

It won’t be much of a surprise that I’m generally in favour of polling. It’s important that people in government (and others with power) should know what the public think about the policies they’re making decisions about, and well-conducted opinion polls are a way of finding this out. They equalise the volume of everyone’s voices so that each opinion counts the same, media mogul or not (though of course polling doesn’t deal with how those opinions are formed).

DECC’s poll is on an important topic, conducted not to create headlines but so the government can better understand what the public think, so I should be in favour. But I’m increasingly of the view that it’s been badly put together and is costing too much public money.

Confusing questions

The first issue is the quality of the questions. For brevity I’m just going to focus on the two climate change questions, though there are others I could make the same argument about.

One of the questions asks: “How concerned, if at all, are you about current climate change, sometimes referred to as ‘global warming’?”.

The problem is that word ‘current’. I think it’s intended to distinguish 20th/21st Century climate change from historical climate change: the ice ages and so on.

But when I first read the question, I understood it to mean the climate change we’re experiencing in 2012, as opposed to what we’re going to experience in 20 years’ time. Since I’m only a little concerned about the climate change we’re experiencing in 2012, I would answer the question accordingly.

I don’t know whether others would understand the question as I did, or whether they would think ‘current’ is referring to 20th/21st Century climate change. Given that, I have no idea how to interpret the results of the question, and no-one else can know either.

The other climate change question asks: “Thinking about the causes of climate change, which, if any, of the following best describes your opinion?”, which all seems fine to me.  But then the answer choices are:

  1. Climate change is entirely caused by natural processes
  2. Climate change is mainly caused by natural processes
  3. Climate change is partly caused by natural processes and partly caused by human activity
  4. Climate change is mainly caused by human activity
  5. Climate change is entirely caused by human activity
  6. I don’t think there is such a thing as climate change.
  7. Don’t know
  8. No opinion

What on earth is choice 3 supposed to be doing? If I think that climate change is mainly human but could also be a bit natural, I could pick either choice 3 or choice 4. Someone who thought it was mostly natural but a bit human could pick choice 2 or choice 3. Given these different interpretations, it’s hard to know what the data mean.

read more »

Does the Evening Standard understand its own Boris vs Ken poll?

Posted in Bad polling, London, Media on April 10th, 2012 by Leo – Comments Off

I try not to write much about polling methodology. I doubt it’s of interest to many people, and besides, Anthony Wells does it much better than I do.

But there’s been some truly awful reporting today of the latest London mayoral poll, and it’s time to look at weighting and so on.

According to today’s Evening Standard, their new ComRes poll shows “a dramatic slide in Mr Livingstone’s support after his argument with his Tory rival over tax in a radio station lift”.

They go on to say that those “interviewed before ‘liftgate’ last Tuesday morning were split 50/50 between the two candidates. But those surveyed afterwards divided 60/40 in favour of Mr Johnson.” ITV also reported it with the same angle.

This all sounds very plausible and interesting, but it’s in fact a bad misrepresentation of the poll.

The issue is, the poll was never designed to show how opinion changed after shoutyBorisgate. Of course it wasn’t: the poll was set up without anyone knowing there would be any event to compare ‘before’ and ‘after’.

If you do know that an event is coming, say a leaders’ debate, you can run two separate polls, with comparable samples (or even, with the same people), and see how the results compare.

But this ComRes poll doesn’t do that. Instead, a little over three quarters of the poll was conducted before the interview, and the remainder after. Nothing looks to have been done to make sure the samples before and after were comparable.

So we’ve got two groups of people. In terms of how they voted in the last general election (nothing to do with Ken and Boris), the first group has 29% Labour voters and 27% Tory voters. The second group has 26% Labour voters and 32% Tory voters. A Labour 2pt lead vs a Tory 6pt lead.

We then ask them how they’d vote in the London election, and are supposed to be surprised when the group with more Tories say they’re more likely to vote for the Tory candidate!
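To see how far sample composition alone can move the headline figures, here’s a purely illustrative Python sketch. The Labour and Tory past-vote shares are the ones quoted above, with everyone else lumped together as “Other”; the loyalty rates (90% of past Tory voters backing Boris, 90% of past Labour voters backing Ken, everyone else splitting evenly) are invented for the example, not taken from the ComRes tables.

```python
# Purely illustrative: the loyalty rates below are assumptions, not ComRes data.
# The point is that two subsamples with different past-vote make-up give
# different headline splits even if every past-vote group behaves identically.

# Assumed share of each past-vote group backing Boris (identical before and after).
boris_share = {"Con": 0.90, "Lab": 0.10, "Other": 0.50}

# Past-vote composition of the two subsamples (Labour/Tory figures as reported;
# everyone else lumped together as "Other").
before = {"Con": 0.27, "Lab": 0.29, "Other": 0.44}
after = {"Con": 0.32, "Lab": 0.26, "Other": 0.42}

def headline(composition):
    """Weight the fixed group-level preferences by a sample's composition."""
    return sum(share * boris_share[group] for group, share in composition.items())

print(f"'Before' subsample: Boris {headline(before):.0%}")  # about 49%
print(f"'After' subsample:  Boris {headline(after):.0%}")   # about 52%
```

Nothing about any individual’s preferences changes between the two groups, yet the headline gap shifts by around three points simply because the second group happens to contain more Tories.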

read more »

Reasons to be wary of media coverage of climate change polls

Posted in Bad polling, Climate Sock, Media on April 4th, 2012 by Leo – Comments Off

This post was written for the Green Alliance blog, to coincide with the launch of a paper on public opinion and the environment.

Coverage of public opinion on climate change is never just about reporting numbers. Without appreciating the need for journalists to tell a story, we can never really understand why climate change polls are reported as they are.

Over the last decade, two distinct narratives have been told about what the public think of climate change. Each of these narratives has been so dominant for a time that it has been difficult for alternative views of public opinion to get much attention.

The first, which dominated for most of the noughties, was that climate change was increasingly settled in the public’s minds as a great concern. Polls on climate change were rare for much of the decade, but when they did appear in the media, the coverage tended to acknowledge that the public was worried, although perhaps unsure about the risks or about possible solutions, as in this Observer article.

As a result, there was little prominent dissent from the view that climate change was becoming a more important issue for most people, along with a belief that the world needed to take decisive action.

The rise of scepticism

But by the end of 2009, this prevailing narrative about public views on climate change had given way to a very different account.

It happened quite suddenly, around the time of COP15 in Copenhagen. Now, the dominant frame was that growing numbers of people doubted the existence of serious man-made climate change, and that there was increased resistance to measures to tackle it.

Opinion polls were important to the development of this new account. For about a year, from late 2009, polls were repeatedly used to show the same narrative: that fewer people were now worried about climate change.

The sheer weight of polls, reported across the media, gave the overwhelming impression of an ongoing change in opinion. But this was misleading: in fact, there appears to have been a one-off fall in concern about climate change, which happened between November ’09 and January ’10.

The difficulty in understanding opinion lies in the fact that media outlets want to report their own polls as exclusive stories. They’re much less interested in repeating polls that another newspaper or broadcaster has commissioned.

So over a period of several months, we saw different polls in outlets from the Daily Mail to the BBC and the Guardian, which essentially restated the same phenomenon as if it were a new finding. The result was a powerful new narrative: that concern about climate change was in ongoing decline.

There are two reasons why it’s useful to see this as a new dominant narrative about public opinion, rather than as straightforward reporting of opinion.

read more »

On rigging and reporting polls

Posted in Bad polling, Climate Sock, Media on April 4th, 2011 by leo – 3 Comments

Consider this plausible scenario. An airline’s new poll finds that most people want airport capacity to be increased. Two weeks later an environmental NGO announces that their own poll has found two-thirds oppose airport expansion.

Both polls are conducted by reputable agencies, and both interviewed representative samples of over 1,000 people.

How can we reconcile these two polls, and how should journalists report them?

It’s not a problem with polling

The problem is not that polling is inherently untrustworthy. Conducting a poll of 1,000 randomly chosen people means speaking to about 0.002% of the UK adult population. Yet the results are so reliable that, 19 times out of 20, the result you get will be within 3 percentage points of the result you would get if you asked every single person in the country. UK Polling Report offer a good explanation for why this is the case.

Alternatively, if wading through probabilities isn’t your thing, just consider YouGov’s five most recent political polls. For each, they interviewed over 2,000 different people; the proportions who said they would vote Labour were, respectively, 44%, 42%, 45%, 42% and 42%.
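For anyone who does want the arithmetic, here’s a minimal sketch of the standard margin-of-error calculation behind the “within 3 points, 19 times out of 20” claim. It assumes a simple random sample, which real quota and online polls only approximate.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a proportion p from a sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"Sample of 1,000: +/- {margin_of_error(1000):.1%}")  # roughly +/- 3.1 points
print(f"Sample of 2,000: +/- {margin_of_error(2000):.1%}")  # roughly +/- 2.2 points
```

On samples of 2,000, then, a true Labour score in the low 40s would be expected to wobble by a couple of points from poll to poll – which is exactly the sort of spread we see in those five results.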

If polling itself were untrustworthy, the consistency in these results would require either quite a coincidence or a grand conspiracy. And for anyone tempted to call it a fix, just remember the outraged reaction when, after the second leaders’ debate last year, YouGov’s instant poll found that Cameron ‘won’. It would be a twisted conspiracy indeed if YouGov had rigged polls for the Tories last year and were now doing so in favour of Labour.

So the problem is not that polling is inherently untrustworthy. The problem is this:

read more »

More bad poll reporting… even when it’s in the name of the forests

Posted in Bad polling, Climate Sock, Media on January 24th, 2011 by leo – 1 Comment

I like:

  • Trees. Particularly when they’re part of forests.
  • People being able to get into forests with as few restrictions as possible.
  • People’s views being taken into account when government policy is formed.

Because of that, I’m a bit sad about what I’m about to write.

If you’re in the UK, there’s a good chance you’ve seen or heard coverage of 38 Degrees’ poll, which apparently showed that 75% of the public are against the government’s plans to privatise some forests and change the way it manages the rest. It’s had coverage pretty much everywhere, from the bleeding hearts at the Guardian and BBC to those bastions of anti-green activism at the Sun and Telegraph.

So being a nerd, the first thing I did when I heard the news was to look for the data. And this was when I started getting sad.

1. The data weren’t published when the articles were written

To my knowledge, all the coverage was put together on the basis of what 38 Degrees gave to the media (the data were put up on the YouGov site today, Monday, with the coverage posted on Saturday or Sunday).

We’ve seen several times before why this matters. If journalists cover a poll without seeing the data, they’re often reliant entirely on the word of people who are trying to promote their own interest.

In November, we saw an EDF poll that won coverage of apparent strong support for a new nuclear power station, on the basis of a question that came after respondents had been reminded of the jobs a power station could create.

And we’ve seen other polls reported with absolutely no data ever published, like the claim made in an Easyjet press release last year that a YouGov poll showed that 80% of UK consumers wanted a rethink of Air Passenger Duty. Without the data being available, there’s no way of knowing whether it was true.

Now, this isn’t particular to 38 Degrees: everyone does it. After all, when you’ve got a shiny new poll fresh from the pollsters, why not get coverage for it straight away?  And of course if you’re a journalist and you know that competitors have also got the same story, you’ve got to cover it straight away.

But here’s another reason why that’s a bad idea:

read more »

Don’t just believe what you’re told about polls

Posted in Bad polling, Climate Sock, Energy sources, Media on November 14th, 2010 by leo – 11 Comments

From time to time a news story comes out citing a poll that isn’t in the public domain. These articles are written on the basis of a press release – apparently all the information the journalist has about the poll.

Given that journalists are supposed to be a cynical bunch, this always strikes me as surprising. By writing up the data from the press release without checking the poll themselves, they’re taking a leap of faith that they’ve been given a fair representation of the truth. Since these press releases (of course) show results that are helpful to the organisation that commissioned the poll, you would expect a journalist’s due diligence to include checking the data.

A recent poll by EDF Energy, carried out by ICM, shows why this matters.

The research was conducted among 1,002 adults living near the Hinkley Point Power Station, and asked about their attitudes to nuclear power and the possible construction of a new plant.

On the strength of the poll, EDF put out this press release, in which they said that “Nearly four times as many local people support plans for a new power station at Hinkley Point than oppose it”, and that “63% support the development of Hinkley Point C”. The press release was picked up quite widely by local media, including the BBC. Nice job by their PR people in winning positive local coverage.

Fortunately, ICM is a member of the British Polling Council (BPC) and abides by its rules. These rules are strongly weighted towards transparency, and include the stipulation that where research findings have entered the public domain – as in this poll – the full data and complete wording of the questionnaire must be made available.

As ever, ICM have done this, and we can look at the data here to test out EDF’s claim.

Firstly, there’s no dispute about the figures they’ve issued. As they say, 63% are “strongly in favour” or “slightly in favour” of the potential development of Hinkley Point C, and only 17% are slightly or strongly opposed.

However, being able to see the complete data also allows us to see the wording of the whole questionnaire.  The sequence of questions runs:

read more »

Is Caroline Lucas on course to be elected?

Posted in Bad polling, Climate Sock, Media, Politics on February 21st, 2010 by leo – 1 Comment

Much of the environmental blogosphere is getting het up about a new poll in the Brighton Argus, which claims to show that the Greens’ lead in Brighton Pavilion has been overhauled. According to the poll, Labour now lead, 16 points ahead of the Tories, with the Greens in third on 19% – 16 points lower than they were in a December ’09 poll, which had put them in the lead. That’s a massive change for two months, and something that would really need explaining.

As Anthony Wells has argued on UK Polling Report, there are several reasons why we should be pretty wary about taking the new poll too seriously. The question is whether the differences between the two polls reflect a genuine change in attitudes, or are something to do with the methodology.

read more »

Flying and taxes

Posted in Bad polling, Climate Sock, Transport on January 24th, 2010 by leo – Comments Off

A few months ago, the pro-aviation campaign group Flying Matters released results from their poll of voters in marginal seats, showing strong opposition to the then-forthcoming increase in Air Passenger Duty.

An industry poll showing that people don’t like taxes imposed on their industry isn’t particularly interesting. It’s not unusual either: aviation is an area where almost all the polling seems to be pretty unconvincing, with questionnaires structured to lead respondents to answer a particular way. In fact, I’ve yet to see more than one interesting and credible finding in the various reported polls.

read more »