
Political polls' accuracy vs the actual 2011 election result

Gavin White of UMR Research
Sat, 05 Apr 2014

It's election year so we're going to hear a lot about polls. While it's only a matter of time before politicians come out with that hoary old chestnut about the only poll that counts being the one on election day, by and large polls in New Zealand have been pretty good at picking election results. Well, I would say that, wouldn't I?

In this blog I'm going to take a look at the final results from the mainstream polls from the 2011 election, and publicly reveal for the first time the results of UMR's own last poll from that campaign.  I'll also look back at some of the past elections to see how the trends stand up over time.

In New Zealand, there are five major media polls, plus a few others (such as ours) that are done privately.  The five major media polls now are:

  • One News Colmar Brunton
  • NZ Herald Digipoll
  • TV3 Reid Research
  • Fairfax Ipsos
  • Roy Morgan

The first four of those, and UMR (along with one of the other private polls), are all members of the New Zealand Association of Market Research Organisations and recently signed up to an agreed set of guidelines on methodologies and reporting. In theory at least they're all much of a muchness, but there will inevitably be differences in the exact questions asked and in how each company ensures its survey sample is as representative as possible. All of those surveys have margins of error of between +/- 3.1% and +/- 3.6%. I'm not privy to exactly how the other companies ensure their samples are representative, and I'm not going to share our exact methods with you – we all jealously safeguard those because they can be points of competitive advantage.
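
For context, those quoted margins of error line up with sample sizes of roughly 750 to 1,000, under the standard textbook assumptions of a simple random sample, 95% confidence and the worst-case 50/50 split. A quick sketch (the sample sizes here are my illustration, not any company's disclosure):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (750, 1000):
    print(f"n = {n}: +/- {margin_of_error(n):.1%}")
# n = 750: +/- 3.6%
# n = 1000: +/- 3.1%
```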

Four of those five polls were around at the 2011 election, the exception being Fairfax (then conducted by Research International). Although some left-wing blogs have been critical of the Fairfax poll on the grounds that it was a long way out in 2011, I think that's manifestly unfair, as Ipsos wasn't doing it then. That's like criticising Cadbury for the taste of a Peanut Slab. The most we can say about the Fairfax Ipsos poll in 2014 is that we don't know how it stacks up historically.

In terms of the polls above, and indeed ours, it's fair to say that they were all reasonably close. Every one of them showed National close to governing alone, Labour in the 20s and the Greens over 10%. While some were clearly closer than others, by and large they produced results that were a reasonable indication of what actually happened.

Let's start by looking at National's vote.  In every case, I've taken the company's final published poll:

  • Actual result: 47.3%
  • UMR: 48.6%
  • One News Colmar Brunton: 50.0%
  • Herald Digipoll: 50.9%
  • Roy Morgan: 49.5%
  • TV3 / Reid Research: 50.8%
  • Fairfax / Research International: 54.0%

Now Labour:

  • Actual result: 27.5%
  • UMR: 28.2%
  • One News Colmar Brunton: 28.0%
  • Herald Digipoll: 28.0%
  • Roy Morgan: 23.5%
  • TV3 / Reid Research: 26.0%
  • Fairfax / Research International: 26.0%

And the Greens:

  • Actual result: 11.1%
  • UMR: 12.4%
  • One News Colmar Brunton: 10.0%
  • Herald Digipoll: 11.8%
  • Roy Morgan: 14.5%
  • TV3 / Reid Research: 13.4%
  • Fairfax / Research International: 12.0%

Lastly, the only other party to pass or come close to the 5% threshold, New Zealand First:

  • Actual result: 6.6%
  • UMR: 6.0%
  • One News Colmar Brunton: 4.2%
  • Herald Digipoll: 5.2%
  • Roy Morgan: 6.5%
  • TV3 / Reid Research: 3.1%
  • Fairfax / Research International: 4.0%

I won't go through the final results for the smaller parliamentary parties but for each of them it's a pretty mixed picture with some polls picking too high and some too low (for example, the range for ACT was 0.7% to 1.8%, versus an actual result of 1.1%).

So what can we learn from all of that? First and foremost, while some polls are closer than others, by and large they provided a reasonable picture of what actually happened. The two big differences for me, however, are:

  • National didn't get enough votes to govern alone, despite all five public polls suggesting that they would (49.5% would almost certainly have been enough for them to govern alone, because of 'wasted' votes cast for parties that didn't get seats in Parliament). The seat allocation sketch after this list shows why.
  • Only half the polls picked NZ First getting over the threshold, and the three that didn't were all out by more than the margin of error.
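
New Zealand allocates its 120 seats by the Sainte-Laguë method, applied only to parties that clear the threshold or win an electorate seat. Here's a stylised sketch of why 49.5% of the vote converts into a majority once the wasted vote is stripped out. The party shares are my illustration, loosely based on the 2011 final polls, and I've ignored overhang and other electorate-seat complications:

```python
# Stylised Sainte-Lague allocation of New Zealand's 120 seats.
# Illustrative shares only: National on 49.5%, with roughly 3% of the
# vote 'wasted' on parties that won no seats and so drop out entirely.
votes = {
    "National": 49.5,
    "Labour": 27.0,
    "Greens": 11.0,
    "NZ First": 6.5,
    "Other parties in Parliament": 2.6,
}
# The remaining 3.4 points of 'wasted' vote simply aren't counted.

def sainte_lague(votes, seats=120):
    """Award each seat to the party with the highest quotient v / (2s + 1)."""
    won = {party: 0 for party in votes}
    for _ in range(seats):
        best = max(votes, key=lambda p: votes[p] / (2 * won[p] + 1))
        won[best] += 1
    return won

print(sainte_lague(votes)["National"], "of 120 seats")  # 61: a bare majority
```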

It's particularly interesting to note that all six polls listed, including our own, had National too high, and three of them were out by more than the margin of error. That's not what we'd expect from probability theory, although we do need to recognise that all of these polls closed at least a few days (but less than a week) before the election, and votes can shift in the final days. That leaves only two real explanations: either there's a systematic skew towards National, or National shed votes in the last few days of the campaign.
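
To put a rough number on "not what we'd expect": if each poll were independently as likely to land above the true figure as below it, all six overshooting would happen by chance only about 1.6% of the time. A back-of-the-envelope sketch (it assumes the polls are independent, which they aren't entirely, since methods and fieldwork periods overlap):

```python
# Simple sign test: probability that all six polls land on the high side
# of the true result, if each were an unbiased coin flip.
p_all_high = 0.5 ** 6
print(f"P(all six too high, given no skew) = {p_all_high:.1%}")  # 1.6%
```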

You might think that's just a one-off result, but I went back and looked at poll results from every election since 1999. That gives us a total of 19 final polls from 1999 to 2011, conducted by companies that are still polling. So how did they do?

  • Sixteen had National too high, while three had them too low. The most any company had underestimated National's vote by was 2%, while the most a company had overestimated National's vote by was 9%. One poll has had National's vote above their actual vote by more than the margin of error at three of the last five elections.
  • Five had Labour too high, while five had them too low.
  • Nine had the Greens too high, while three had them too low. That overstates the case a little, because the most any poll has been out for the Greens is 3.4%.
  • One had NZ First too high, and nine had them too low. The biggest difference was in 2002, when one poll had them 6% too low; mostly the differences are within 2%.

I think it's fair to conclude that New Zealand polls tend to overstate the vote for National and, to a lesser extent, the Greens, and to at least slightly understate the vote for NZ First. When it comes to interpreting current polls, it doesn't really matter whether that's because of inherent biases in the polls or because National's and the Greens' votes tend to drop in the last few days of the campaign while NZ First's picks up – the impact on our interpretation should be the same.

One way of looking at this further is to take the average (mean) error for these four parties across the 19 final polls included in this dataset.  That shows us that the average error is:

  • National: 2.7% too high
  • Labour: 0.7% too high
  • Greens: 1.0% too high
  • NZ First: 1.5% too low.

Counting all mainstream media polls since 2005 (excluding UMR but including TV3 and Fairfax / Research International polls in 2008 and 2011) leaves 14 polls and an average error of:

  • National: 2.4% too high
  • Labour: 0.5% too low
  • Greens: 1.5% too high
  • NZ First: 1.1% too low.
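
The arithmetic behind both sets of averages is just the mean of the signed errors (final poll minus actual result) for each party. A quick sketch using the 2011 National figures from the tables above; this shows the method only, since reproducing the multi-election averages would need the full 1999-2011 dataset:

```python
# Mean signed error (poll minus actual), illustrated with the 2011
# National figures listed earlier in this post.
actual = 47.3
finals = {
    "UMR": 48.6,
    "One News Colmar Brunton": 50.0,
    "Herald Digipoll": 50.9,
    "Roy Morgan": 49.5,
    "TV3 / Reid Research": 50.8,
    "Fairfax / Research International": 54.0,
}
errors = {poll: result - actual for poll, result in finals.items()}
mean_error = sum(errors.values()) / len(errors)
print(f"Mean error for National in 2011: {mean_error:+.1f} points")  # +3.3
```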

These differences didn't really matter at the 2011 election, because the overall result was never really in doubt. I guess you could argue that there would have been more emphasis on National's potential coalition partners had it been known that it probably wasn't going to be able to govern alone, but John Key spent plenty of time on cups of tea anyway, which suggests that National wasn't counting its chickens on that score.

It surely does matter in 2014, when, at least until recently, most of the public polls have shown Labour plus the Greens within touching distance of National plus its current allies. I think history suggests that:

  • If the total for Labour plus Greens is within about 2% of the total for National and its allies (whichever of ACT, United Future and the Conservatives makes it into Parliament), then it's actually pretty much a dead heat. 
  • If NZ First gets 4% in most of the mainstream polls, then it will probably pass the 5% threshold on election day.
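
One way to put those two rules of thumb into practice is to subtract the historical average errors from a poll's headline numbers before comparing the blocs. A rough sketch using the post-2005 averages above and purely hypothetical 2014 poll numbers (a crude uniform adjustment, which glosses over poll-to-poll variation):

```python
# Historical average errors from the 14 mainstream polls since 2005
# (positive = polls have run too high for that party).
avg_error = {"National": 2.4, "Labour": -0.5, "Greens": 1.5, "NZ First": -1.1}

# Hypothetical headline numbers from a single 2014 poll - illustration only.
poll = {"National": 46.0, "Labour": 31.0, "Greens": 12.0, "NZ First": 4.5}

adjusted = {party: share - avg_error.get(party, 0.0)
            for party, share in poll.items()}

left_bloc = adjusted["Labour"] + adjusted["Greens"]
print(f"Adjusted National: {adjusted['National']:.1f}")    # 43.6
print(f"Adjusted Labour + Greens: {left_bloc:.1f}")        # 42.0 - a dead heat
print(f"Adjusted NZ First: {adjusted['NZ First']:.1f}")    # 5.6 - over 5%
```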

Gavin White is research director for UMR Research. This blog was originally posted on sayit.co.nz, a site run by UMR for members of its online research panel.
