It's election year so we're going to hear a lot about polls. While it's a matter of time before politicians come out with that hoary old chestnut about the only poll that counts being election day, by and large polls in New Zealand have been pretty good when it comes to picking the election results. Well, I would say that, wouldn't I?
In this blog I'm going to take a look at the final results from the mainstream polls from the 2011 election, and publicly reveal for the first time the results of UMR's own last poll from that campaign. I'll also look back at some of the past elections to see how the trends stand up over time.
In New Zealand, there are five major media polls, plus a few others (such as ours) that are done privately. The five major media polls now are:
The first four of those, and UMR (along with one of the private polls), are all members of the New Zealand Association of Market Research Organisations and recently signed up to an agreed set of guidelines on methodology and reporting. In theory at least, they're all much of a muchness, but there will inevitably be differences in the exact questions asked and in how they ensure the survey sample is as representative as possible. All those surveys have margins of error of between +/- 3.1% and +/- 3.6%. I'm not privy to exactly how the other companies ensure that their samples are representative, and I'm not going to share our exact methods with you – we all jealously safeguard those because they can be points of competitive advantage.
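For the statistically minded, those margins follow straight from the standard textbook sampling formula, and they're consistent with sample sizes of somewhere around 750 to 1,000 – that's my back-of-the-envelope assumption, not the companies' published numbers. A quick sketch:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Maximum margin of error at 95% confidence for a simple random sample.

    Standard formula z * sqrt(p * (1 - p) / n); p = 0.5 is the worst case,
    which is what pollsters usually quote.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative sample sizes only, not the companies' actual ones.
for n in (750, 1000):
    print(f"n = {n}: +/- {margin_of_error(n):.1%}")
# n = 750: +/- 3.6%
# n = 1000: +/- 3.1%
```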
Four of those five polls were around at the 2011 election, the exception being Fairfax (then conducted by Research International). Although some left-wing blogs have been critical of the Fairfax poll on the grounds that it was a long way out in 2011, I think that's manifestly unfair, as Ipsos wasn't doing it then. That's like criticising Cadbury for the taste of a Peanut Slab. The most we can say about the Fairfax Ipsos poll in 2014 is that we don't know how it stacks up historically.
In terms of the polls above, and indeed ours, it's fair to say that they were all reasonably close. Every one of them showed National close to governing alone, Labour in the 20s and the Greens over 10%. While some were clearly closer than others, by and large they produced results that were a reasonable indication of what actually happened.
Let's start by looking at National's vote. In every case, I've taken the company's final published poll:
Now Labour:
And the Greens:
Lastly, the only other party to pass or come close to the threshold, New Zealand First:
I won't go through the final results for the smaller parliamentary parties but for each of them it's a pretty mixed picture with some polls picking too high and some too low (for example, the range for ACT was 0.7% to 1.8%, versus an actual result of 1.1%).
So what can we learn from all of that? First and foremost, while some polls are closer than others, by and large they provided a reasonable picture of what actually happened. The two big differences for me, however, are:
It's particularly interesting to note that all six polls listed, including our own, picked National too high. Three of them were out by more than the margin of error. That's not what we'd expect from probability theory, although we do need to recognise that all these polls closed at least a few days before the election (but less than a week) and votes can shift in the last few days. There are only two explanations for that: either there's a systematic skew towards National or National shed votes in the last few days.
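To put a rough number on "not what we'd expect": if each poll were unbiased, it should be about as likely to land above the true figure as below it, so six out of six overshooting is roughly a one-in-64 event. A quick sketch – this treats the six polls as independent coin flips, which they aren't quite, since shared timing and similar methods will correlate their errors:

```python
# If each unbiased poll is equally likely to land high or low, the chance
# that all six independently come in above National's actual result is:
p_all_high = 0.5 ** 6
print(f"{p_all_high:.1%}")  # 1.6% -- roughly a 1-in-64 event
```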
You might think that's just a one-off result but I went back and looked at poll results from every election since 1999. That gives us a total of 19 final polls from 1999 to 2011 conducted by companies that are still polling. So how did they do:
I think it's fair to say that New Zealand polls tend to overstate the votes for National and, to a lesser extent, the Greens, and to at least slightly understate the vote for NZ First. When it comes to interpreting current polls, it doesn't really matter whether that's because of inherent biases in the polls or because National's and the Greens' vote tends to drop in the last few days of the campaign while NZ First's picks up – the impact on our interpretation should be the same.
One way of looking at this further is to take the average (mean) error for these four parties across the 19 final polls included in this dataset. That shows us that the average error is:
Counting all mainstream media polls since 2005 (excluding UMR but including TV3 and Fairfax / Research International polls in 2008 and 2011) leaves 14 polls and an average error of:
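For anyone who wants to replicate that kind of calculation, "average error" here is the signed mean of (final poll minus actual result) for a party, so overstatements and understatements can offset each other. A minimal sketch with placeholder figures – the poll numbers below are invented for illustration, though the 47.3% actual result for National in 2011 is real:

```python
from statistics import mean

# Placeholder poll figures, invented for illustration; 47.3 is National's
# actual 2011 party vote. Each entry: (final poll %, actual result %).
national = [(50.1, 47.3), (53.3, 47.3), (50.8, 47.3)]

def mean_signed_error(pairs):
    """Average of (poll - actual); positive means the polls overstated."""
    return mean(poll - actual for poll, actual in pairs)

print(f"{mean_signed_error(national):+.1f} points")  # +4.1 points
```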
These differences didn't really matter at the 2011 election, because the overall result was never really in doubt. I guess you could argue that there would have been more emphasis on National's potential coalition partners had it been known that it probably wasn't going to be able to govern alone, but John Key spent plenty of time on cups of tea anyway, which suggests that National wasn't counting its chickens on that score.
It surely does matter in 2014, when at least until recently most of the public polls have shown Labour plus Greens within touching distance of National plus its current allies. I think history suggests that:
Gavin White is research director for UMR Research. This blog was originally posted on sayit.co.nz, a site run by UMR for members of its online research panel.