On rigging and reporting polls

Consider this plausible scenario. An airline’s new poll finds that most people want airport capacity to be increased. Two weeks later an environmental NGO announces that their own poll has found two-thirds oppose airport expansion.

Both polls are conducted by reputable agencies, and both interviewed representative samples of over 1,000 people.

How can we reconcile these two polls, and how should journalists report them?

It’s not a problem with polling

The problem is not that polling is inherently untrustworthy. Conducting a poll of 1,000 randomly chosen people means speaking to about 0.002% of the UK adult population. Yet the results are so reliable that, 19 times out of 20, the result you get will be within 3 percentage points of the result you would get if you asked every single person in the country. UK Polling Report offer a good explanation for why this is the case.
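
If you want to see where that "3 percentage points, 19 times out of 20" figure comes from, here is a minimal sketch in Python. The 1.96 multiplier and the worst-case assumption of a 50/50 split are standard sampling theory, not anything taken from UK Polling Report's explanation:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a sample proportion.

    p = 0.5 is the worst case, and z = 1.96 covers the central 95%
    of a normal distribution: the "19 times out of 20".
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 people: roughly +/- 3 percentage points.
print(f"{margin_of_error(1000):.1%}")  # 3.1%
```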

Alternatively, if wading through probabilities isn’t your thing, just consider YouGov’s five most recent political polls. For each, they interviewed over 2,000 different people; the proportions who said they would vote Labour were, respectively, 44%, 42%, 45%, 42% and 42%.
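
That spread is just what sampling theory predicts. As a rough check (assuming a flat sample size of 2,000 per poll, where the figures above only say "over 2,000"), each reading should sit within about 2.2 points of the underlying figure, and all five sit within that band of their own average:

```python
import math

readings = [0.44, 0.42, 0.45, 0.42, 0.42]  # the five Labour shares
n = 2000                                   # assumed sample size per poll

mean = sum(readings) / len(readings)               # 43.0%
moe = 1.96 * math.sqrt(mean * (1 - mean) / n)      # ~2.2 points

print(f"mean {mean:.1%}, margin of error +/- {moe:.1%}")
assert all(abs(r - mean) <= moe for r in readings)  # all within the band
```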

If polling itself were untrustworthy, the consistency of these results would require either quite a coincidence or a grand conspiracy. And for anyone tempted to call it a fix, just remember the outraged reaction when, after the second leaders’ debate last year, YouGov’s instant poll found that Cameron ‘won’. It would be a twisted conspiracy indeed if YouGov rigged polls for the Tories last year, and are now doing so in favour of Labour.

So the problem is not that polling is inherently untrustworthy. The problem is this:

Let’s return to the above example. You work for a reputable polling agency. A PR company calls you and says that their client, a well-known airline, is campaigning for the expansion of the UK’s airport capacity. Would you conduct a poll for them? Of course you would. Most of what polling agencies are known for is political work, but most of their money comes from private companies.

So you write a short poll:

  • When did you last travel abroad?
  • What would make you travel abroad more often?
  • How important is it that the UK is an attractive place for businesses to expand and create jobs?
  • How important is it that the UK has good international transport links?
  • In general, do you think that the UK’s transport infrastructure is better or worse than that in other major European countries, like Germany and the Netherlands?
  • Would you support the expansion of the UK’s airports to keep pace with European competitors?

And when the results come in, the PR agency issues a press release: “Majority want airport expansion”. The following day, the story is across the papers, on the BBC website, and the transport minister is talking about expansion on the Today Programme. Nice job.

How do we make it stop?

This only happens when journalists go along with it, and the moral high ground is easy to see. We expect journalists to be suspicious of the unverified claims of sources with an obvious agenda. In this case, perhaps they should demand the full data before they file a word.

If they did so, the rules of the game would be on their side. Most of the agencies anyone’s heard of are members of the British Polling Council (the BPC), whose rules state that, for any poll that enters the public domain, the agency has to publish the full data on their website.

This means our scrupulous and suspicious journalist can have a proper nose around the numbers before they publish anything. This would allow them to see:

When were the key questions asked?

A good principle is that the questions you’re most interested in should be as early as possible in the poll.

If you want to know whether people support or oppose the sale of the forestry estate, why not put that question at the very start? If you ask it after testing a statement that only public ownership would protect the trees, respondents will have been prompted to think a certain way.

How were the key questions asked?

Give people a statement and, all else being equal, they’ll be slightly more likely to agree than to disagree. This is particularly true if the interview is face-to-face or on the phone.

So our journalist should be more impressed by questions that present two opposing viewpoints and ask which one I agree with than by questions that just measure how far I agree with a particular statement.

Equally, it matters what else you’re comparing with. A recent BBC poll found that 49% of British schoolchildren put climate change among the top three issues facing the world right now, which is very high. But this is less impressive when you realise the list didn’t include the economy, unemployment, or health services.

What’s changed since last time?

A lot of the time, the numbers themselves aren’t as interesting as how they’ve changed since a previous poll. There’s nothing wrong with comparing results from different polls, provided the questions use the same wording, sit in the same place in each poll, and the methodologies match. That means looking at both sets of data: double the fun.
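
For anyone who wants to run that comparison themselves, here is a hedged sketch of the standard check for whether a movement between two independent polls is bigger than sampling noise. The function name and the sample figures are illustrative, not from any poll discussed above:

```python
import math

def change_is_significant(p1, n1, p2, n2, z=1.96):
    """Is the movement between two independent polls outside sampling noise?

    Uses the standard error of a difference of two proportions:
    se = sqrt(p1*(1-p1)/n1 + p2*(1-p2)/n2).
    """
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return abs(p2 - p1) > z * se

# Illustrative: a 2-point move between two polls of 1,000 people
# is well inside the noise...
print(change_is_significant(0.42, 1000, 0.44, 1000))  # False
# ...while a 6-point move is not.
print(change_is_significant(0.42, 1000, 0.48, 1000))  # True
```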

Back in the real world

Unfortunately there’s a complicating factor in the rules of the British Polling Council, and it is this: the agency doesn’t have to release the full results until two working days after the original press release.

So in the real world, a press release comes in on a busy day. It’s from a known source, it talks about companies you’ve heard of, the numbers sound vaguely plausible, it’s got a big sample size and a small margin of error, it’s already lunchtime, and you’re supposed to be filing two other articles today. What’s more, you know the press release has been sent to all the other major outlets, so if you don’t file now the story will be everywhere else within a few hours.

What else can a journalist be expected to do but go with the story? They won’t even be able to link to the data, unless they come back to it a couple of days later, by which time most readers will have moved on.

Let’s not pretend it’s easy working under the time pressures of modern media, but here are my suggestions for our hero journalists when faced with their next press-released poll:

1) Ask to see the data. The agency will already have it as a data file, even if it’s not online. There’s no reason they shouldn’t be able to send it to you.

2) If they won’t send it to you immediately, and won’t publish it, be prepared to push for it. Why won’t they send it? They may not even be aware of the BPC’s rule.

3) If you can’t get the data before you have to file, make a note to go back and check in a couple of days. It won’t take long. If there’s something truly smelly in the data, remember it next time that contact comes to you with a story.

And for the rest of us non-journalist types:

Next time you see a poll you want to know more about, find out more. If the polling agency is a member of the BPC, the data should be on their website. If it’s not, ask them for it.

But ask nicely: they’re very busy too.
