Don’t just believe what you’re told about polls

From time to time a news story comes out citing a poll that isn’t in the public domain. These articles are written on the basis of a press release – apparently all the information the journalist has about the poll.

Given that journalists are supposed to be a cynical bunch, this always strikes me as surprising. By writing up the data from the press release without checking the poll themselves, they’re taking a leap of faith that they’ve been given a fair representation of the truth. Since these press releases (of course) show results that are helpful to the organisation that commissioned the poll, you would expect a journalist’s due diligence to include checking the data.

A recent poll by EDF Energy, carried out by ICM, shows why this matters.

The research was conducted among 1002 adults living near the Hinkley Point Power Station, and asked about their attitudes to nuclear power and the possible construction of a new plant.

On the strength of the poll, EDF put out this press release, in which they said that “Nearly four times as many local people support plans for a new power station at Hinkley Point than oppose it”, and that “63% support the development of Hinkley Point C”. The press release was picked up quite widely by local media, including the BBC. Nice job by their PR people in winning positive local coverage.

Fortunately, ICM is a member of the British Polling Council (BPC) and abides by its rules. These rules are strongly weighted towards transparency, and include the stipulation that where research findings have entered the public domain – as in this poll – the full data and complete wording of the questionnaire must be made available.

As ever, ICM have done this, and we can look at the data here to test out EDF’s claim.

Firstly, there’s no dispute about the figures they’ve issued. As they say, 63% are “strongly in favour” or “slightly in favour” of the potential development of Hinkley Point C, and only 17% are slightly or strongly opposed.
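(The arithmetic of the headline claim also holds up: 63 ÷ 17 ≈ 3.7, which is indeed “nearly four times as many” supporters as opponents.)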

However, being able to see the complete data also allows us to see the wording of the whole questionnaire. The sequence of questions runs:

Q1. How favourable or unfavourable is your opinion of the nuclear energy industry? Is that very or quite favourable/unfavourable?

[So immediately respondents know that this is an interview about the nuclear industry. But for me that’s ok – we’re not looking at the industry in comparison with others.]

Q2. To what extent do you agree or disagree with the following statement?

Nuclear energy has disadvantages but the country needs nuclear power as part of the energy balance with coal, gas and wind power.

[This statement is structured in a way that makes it harder to disagree. It appears reasoned: taking on board the downsides of nuclear before drawing a measured conclusion that it’s a necessary evil in the service of a greater good. The result? Only 13% disagree with it, and the whole audience is nudged towards thinking that nuclear power is necessary.]

Q3. As you may be aware, the power plant at Hinkley Point B is due to close in 2016. One option for future energy generation would be to build new nuclear capacity at Hinkley Point C. Overall, do you think a new power station will have a positive or a negative impact on the local area?

Q4a. Why do you think it will have a positive impact on the local area?

Q5. Which is MOST important?

Q6b. You stated that you think a new power station will have a POSITIVE impact on the local area. However, do you think there will be any NEGATIVE things?

Q4b. Why do you think it will have a negative impact on the local area?

Q6a. You stated that you think a new power station will have a NEGATIVE impact on the local area. However, do you think there will be any POSITIVE things?

Q7. How important, if at all, do you consider a new power station at Hinkley to each of the following?

– To the creation of local jobs

– To the future of local businesses

Q8. Why do you say that?

[By this point, respondents have been forced to think about both the negative and the positive aspects of a new nuclear power station. However, they’ve had one more question about the positive reasons (Q5) than the negative reasons, and the section finishes with two questions about job creation and the future of local businesses. This is then immediately followed by the question that EDF used in their press release:]

Q9. Overall, thinking about the potential development of Hinkley Point C, would you say that you would be? (strongly in favour / slightly in favour / neither in favour nor opposed / slightly opposed / strongly opposed)

If I were writing a poll and wanted an accurate read on people’s attitudes to an issue like the construction of a new power station at Hinkley Point, I would put the key question as early in the poll as possible. I would not put it after several questions that encourage people to think more deeply about the issue than they normally would. And I would certainly not put it after a series of questions that encourages people to dwell on aspects of the issue that push them towards a particular view.

Fortunately, thanks to the rules of the BPC (and to EDF and ICM for abiding by these rules) we have been able to scrutinise the results for ourselves. I have my personal opinion on the value of the research, and everyone else can look at the poll and form their own opinions as well. It doesn’t take long to go through a poll, and it doesn’t require much specialist knowledge. So in principle we can be reassured that any journalist who writes about the story has taken the 15 minutes needed to go to ICM’s website and read through the poll themselves.

This is why it’s so unsatisfying for a poll to get news coverage when it’s based on unpublished data. If no-one can check that the numbers are actually correct and that the question wording isn’t set up to produce certain answers, who’s to say that we should believe what it says in the press release?

Take, for example, the claim made in an Easyjet press release last year that a YouGov poll showed that 80% of UK consumers wanted a rethink of Air Passenger Duty. The claim was carried mostly by travel publications, but it was also picked up by the Observer. Yet, as far as I can tell, the data were never put on either company’s website.

As a journalist, I would want to know more about a poll I was writing about than I can find from any press release I’ve come across. And as a reader, I would want to know that the journalist who wrote the story I’m reading has checked their facts – and that I can double-check them myself if I want.

  1. Rob says:

    Hello

    This isn’t about this particular issue but polls in general.

    I understand your point about not asking the important question after people had been steered in a certain direction, but I wondered why you wouldn’t want an answer after people had thought about the issue.

    It might not be as accurate a snapshot of the current attitude amongst people, but isn’t it better to know what their opinion is once they’re better informed? I’m thinking of things like views on immigration, given the continual negative coverage it gets in much of the press.

    Perhaps the same question before and after going into detail would be best?

  2. Janice Haigh says:

    http://www.youtube.com/watch?v=2yhN1IDLQjo&feature=youtube_gdata_player

    Should be required viewing for anyone looking at market research results

  3. Leo says:

    Hi all,

    Thanks. Funnily enough I think the Yes Minister clip is pretty well spot on for the point I’m making.

    Rob/Ian, as you say, there’s nothing at all wrong in finding out what people think after you’ve asked them to think about an issue for a while. In fact, it’s very common practice for pollsters to show people a lot of messages/arguments about a particular issue, and then ask their opinion about that issue after those messages. That’s a perfectly reasonable thing to do – if you compare the results post-messaging with the results pre-messaging, you get a sense of how far people’s opinions could be shifted if a particular argument is made to them.

    However, what’s dubious – and mocked in the Yes Minister clip – is when the post-messaging data is taken out of context, stripped of the information respondents were shown in the poll, and presented as if it’s people’s opinion now. If I wanted to know what people thought about an issue now, I would ask them about it early in the poll, before I’d presented any arguments. The scores that you see after people have been shown an argument are interesting – but they’re not representative of what people actually think now.

    So as you say, Rob, the most interesting approach would probably be to see the scores both pre- and post-messaging.

    Leo

  4. David says:

    Polls are a snapshot, and should never be trusted.

  5. Dez Futak says:

    I agree with Leo – and what’s more, it does seem that ICM manipulated people’s thinking to get the result they had already been briefed to provide by the company that employed their services…?

    If that’s true, then maybe all of us need to become students of Goldstein, Martin & Cialdini’s book “Yes! 50 Secrets from the Science of Persuasion”, or the more weighty “Influence: The Psychology of Persuasion”.

    Dez.
