Thursday, November 21, 2019

Commissioned uComms Tasmanian State Poll

uComms (commissioned by Australia Institute): Liberal 39.0 Labor 29.4 Green 16.8 Ind 11.7 Other 3.1. 
Tasmanian state polling overstates votes for Greens and this poll is likely to overstate "independent" vote
After adjusting for likely skews, poll would be borderline in majority/minority terms for government in an election "held now" (seats c. 13-9-3 or 12-9-4)
Poll is difficult to interpret because of high "independent" vote, inadequate transparency and lack of uComms track record
Poll taken on October 22


“But look, you found the notice, didn’t you?”
“Yes,” said Arthur, “yes I did. It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying ‘Beware of the Leopard’.”
- Douglas Adams

I often feel like Arthur Dent when it comes to finding the most basic details of Australian commissioned polling.  The Australia Institute Tasmania's uComms poll from nearly a month ago first surfaced in the form of a uselessly skewed result about support for a Tarkine National Park.  Voting intention results were withheld from publication at the time, apparently because releasing them would have diverted media attention away from the supposed Tarkine findings, and it is only this week, 27 days after the poll was conducted, that they have finally been released.  Not prominently though - after seeing more uncritical media reporting of another issues question (regarding the proposed repeal of medevac legislation), I was finally able to find the voting intention results lurking unheralded in a PDF linked off a release of the medevac findings on the TAI website.

TAI claim to be in the business of research, but depriving the audience of the data needed to analyse polls closer to the time they are released suggests they are more interested in using polling to make political points than in allowing their data to be critically examined while it is current.  For an organisation that claims transparency as one of its interest areas, and makes over 100 references to transparency on its website, this is, to put it mildly, hardly a consistent way to operate.

One might ask why look at Tasmanian polling at all in the wake of the national polling failure.  But the national polling failure occurred in a marketplace with a history of exceptional performance, and the error was one that involved having one major party a few points too high and the other a few too low.  In Tasmania, errors of that kind have always been common, and the nature of Hare-Clark is such that they're not the difference between one side winning outright and the other doing so.

Voting intention poll

The poll is a uComms robopoll with a sample size of 1136 taken on the night of 22 October.  While I would expect the poll to have sampled evenly statewide, there is at this stage no information in the poll reports confirming this.  uComms has in the past used the ReachTEL framework for its polls despite being a distinct pollster; I would expect this is still the case but again I have no explicit confirmation.  uComms is a controversial pollster because of its union links, which caused a number of clients to drop it in the lead-up to the 2019 election (perhaps unfortunately, because more pollster diversity might have helped).  The poll data are weighted by "gender and age" only, a primitive and cheap approach in light of the national 2019 polling failure, for which failure to weight by education level is one of the suspected causes.

There is another curious aspect - the breakdowns of the 18-34 year old age group in this poll show, both on voting intention and issue stances, that these voters are conservative compared to the older voters.  This aspect has been seen frequently in ReachTEL-platform polls, whether advertised as by uComms or not, since at least June 2018.  A possible explanation is that the response rate in these age groups is now extremely low and the young voters being captured are not representative.

uComms polls of Bass and Braddon for the Australian Forests Products Association were among the more accurate seat polls taken at the 2019 election, correctly predicting the Liberal Party would win both with relatively minor errors on primary votes (largely caused by omitting a key independent in Braddon).  However, just because a pollster's commissioned polls for one client are accurate does not mean the same is necessarily true for another.  In particular, there is not enough information in the public domain about drop-out rates in this style of polling.  If a poll is laden with questionable political claims in its preambles, are voters who disagree more likely to conclude it is not a genuine poll and therefore hang up?

Anyway, after reassigning voters who are "leaning" to a party (treated as "undecided" in ReachTEL-platform polls but included as party voters by most other polls) the voting intention results on a basis comparable with other polls look like this:

Liberal 39.0 Labor 29.4 Green 16.8 Independent 11.7 Other 3.1

Long experience of Tasmanian polling is that numbers like this shouldn't be taken at face value.  Not only does the state's most regular pollster, EMRS, habitually overstate the Greens vote compared to actual results at its final polls, but this has also happened in ReachTEL platform statewide polling in 2014 and 2018 (state) and 2013 and 2016 (federal) (there wasn't any in 2019).  The average overestimation of the Greens vote in these cases was 2.6%, so the 16.8% for the Greens could be better treated as something around the low 14s.  Still not bad compared to low 10s at the state election, and a part of the explanation could be recovery of votes lost to Labor at the state election when Labor trespassed on Greens territory with its now-abandoned anti-pokies policy.  A caveat here is that those were ReachTEL polls whereas this is uComms (same platform, but not necessarily same scaling etc) and information about the differences (if any) when it comes to scaling between the two has been hard to come by.
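The skew adjustment described above is simple arithmetic, and can be sketched as follows.  This is a rough illustration only: the 2.6-point figure is the post's average historical overestimation of the Green vote, not a precise house-effect estimate.

```python
# Sketch of the Green-vote skew adjustment discussed above.
# Assumptions: the 2.6-point average overestimation from past ReachTEL-platform
# final polls is applied directly to this poll's Green primary.
green_polled = 16.8          # uComms primary vote for the Greens (%)
avg_overestimate = 2.6       # average historical overestimation (points)

green_adjusted = green_polled - avg_overestimate
print(f"Skew-adjusted Green vote: {green_adjusted:.1f}%")
```

This lands in the low 14s, consistent with the reading in the text; a real adjustment would also need to consider where the removed vote share goes.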

The poll could also be underestimating Labor.  ReachTEL platform polls did this at the 2013 and 2016 federal elections and the 2014 state election, but not at the 2018 state election.

Polling that allows a stand-alone "independent" option tends to lead to inflated results for that option in general, but this is especially an issue in Tasmania.  Half the division of Clark voted for an independent at the last federal election, and nine of the fifteen Legislative Council divisions are represented by independents, so voters might well think that come the state election they could find an independent to vote for.  However - perhaps because the most prominent independents are so likely to run for and win Legislative Council seats - the independents who run at lower house elections are usually obscure and don't get many votes.  In EMRS polling in the previous parliament, combined independent/others votes ran between 12 and 17% for much of the second half of the previous term; the eventual vote for such forces at the 2018 election was only 6.8%.

In this case it may be that the presence of Madeleine Ogilvie as an independent is boosting the independent vote, but I wouldn't take that too seriously just yet.  As the article on the recount that elected Ogilvie shows, party defectors who have stood as independents have normally lost and often lost heavily, the exceptions being two very high profile names in Reg Turnbull and Doug Lowe.  So far my perception is that Ogilvie's presence as an independent has attracted at best a niche response - I'm not seeing much voter adulation a la Wilkie, but nor am I seeing the level of anti-ratting anger that might have been expected.  More likely most of what we're seeing with this "Independent" vote is wishful thinking, perhaps including from voters hoping to find a state-level Wilkie or wishing Sue Hickey would quit the Liberals.

There's a little more insight from the preference figures.  These ReachTEL-platform polls ask respondents which of the ALP and Liberals they would preference first, although this is redundant in Tasmania where they need not preference anyone (beyond their first five candidates) at all.  Anyway the claimed preference break is a remarkably strong 78.8% to Labor, and even if Green voters are assumed to be breaking at say 90% to ALP, that still leaves 60% or so breaking to Labor on this meaningless two-party choice.  So a fair chunk of the Ind/Others voters here are probably not just Liberal vote-parkers.
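The back-of-envelope preference arithmetic above can be sketched as follows.  Note one assumption not stated in the poll report: that the 78.8% break covers all non-major-party voters (Green plus Independent plus Others).

```python
# Implied Labor preference flow from non-Green minor/independent voters,
# given the poll's claimed 78.8% overall break to Labor.
# Assumption: the preference question was asked of all non-major-party voters.
green, ind, other = 16.8, 11.7, 3.1     # primary votes (%)
overall_to_labor = 0.788                # claimed overall break to Labor

non_major = green + ind + other         # total non-major vote (31.6%)
labor_prefs = overall_to_labor * non_major

for green_to_labor in (0.90, 0.95):
    # Labor preferences not explained by the Greens, as a share of Ind/Others
    residual = labor_prefs - green_to_labor * green
    share = residual / (ind + other)
    print(f"Green flow {green_to_labor:.0%}: Ind/Others break {share:.0%} to Labor")
```

Depending on the assumed Green flow (90-95%), this puts the implied Ind/Others break to Labor somewhere in the 60-66% range, in line with the "60% or so" reading above.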

Possible seat distributions

It is not worth getting too carried away with seat modelling for commissioned polls with so many uncertainties attached.  Before even attempting to do so I would adjust the Green vote downwards, but how much to take off "Independent" and where to put it is a much less certain exercise.  However a resurgence in the Green vote to around the 14% level would be expected to recapture Bass at Labor's expense.  In theory, the Greens might also win Lyons from the Liberals, but in practice the lack of an incumbent MP, unfriendly preferences, and perhaps the spreading of the Liberals' vote between three incumbents could get in the way there.  So as with a lot of the polling in the last term, it points to the Liberals being iffy from a majority government standpoint, with seat results like 13-9-3 or 12-9-4 the most likely.  What this particular poll does not point to, on even the most generous reading, is any real resurgence post-election in the Labor vote.  On a non-generous reading, the Opposition has gone backwards, and that was also the finding of the most recent EMRS.

EMRS, too, are not good at reliably releasing data in a timely fashion, and although they were in the field at the end of October I do not know when that might see the light of day.  However, if it is released, another poor result for Labor might trigger speculation about the party's direction and/or leadership.

Medevac Question

The medevac question result (62.8% for keeping the existing law to 26.7% for abolishing it) has been greeted with some joy by supporters of the contentious legislation passed by Labor and the crossbench during the previous parliament, but I don't think things are so straightforward there either.  The question design seems harmless in comparison with the skewed Tarkine National Park question, but I'm not sure that's entirely true.  For instance, this section:

"The Medivac law came into force in the last Parliament and the current Parliament is considering whether to keep or abolish that law" is capable of implying a range of leading reasons as to why the law should be kept.  The respondent might think that since it's only a recent law we should keep it for longer and see how it goes.  They might mistakenly assume it was passed by the previous Coalition government (a sensible assumption, since the passage of a law by an Opposition is extremely rare).  And so on.

Even ignoring this, there's the question of whether even a factual and balanced description of medevac, which I understand the question to generally be, is capable of skewing the result.  The reason for this is that many voters' views about the legislation may be influenced by false beliefs about it.  A poll that presents the respondents' views conditional on them understanding the issue may not represent the range of views on an issue actually out there, and may simulate something (a fully informed electorate) that will actually never exist.  TAI's national polling showed that a lot of voters don't even understand something as simple as the reason for Australia Day, so it's quite likely there would be voters who think medevac applies to non-urgent requests, who would think the transfer was permanent or who would be unaware that the Minister could refuse requests for security reasons.

As a general rule, polling question design should be kept as short and simple as possible.  Something else a national audience should bear in mind is that Tasmania is a pretty refugee-friendly place, so even if this result is accurate it might not be repeatable nationwide.

Added 12 Dec: Forestry Question

A further question on Tasmanian forestry has now surfaced.  The text of the question is:

"The logging industry has announced that, from next year, it wants parts of the 357,000 hectares of forest that were scheduled to be protected under the forest peace deal to be reopened for logging. This includes areas on the Tasman Peninsula and around national parks such as Ben Lomond and Douglas Apsley among others.

Of the following, what do you think the logging industry should do?"

Response options were: "stick to the forest peace deal" (61.1%), "break the forest peace deal" (24.5%) and "unsure/don't know" (14.4%).

This question relates to areas of forest that were initially agreed to be reserved under an agreement between various forest industry and conservation figures in 2012.  I wrote about various aspects of that deal and its reception in the Legislative Council here.  The bill to implement the deal was heavily amended by the Legislative Council before passing it in 2013, and then the new Liberal Government managed to get it repealed, though they could do nothing about those areas that had already acquired World Heritage status.

The question design has a few issues.  It uses the potentially pejorative description "logging industry" (which makes it sound like the industry only cuts things down without growing anything back), and it highlights areas "around national parks", both factors that may prime the respondent to answer a certain way.  But the answer design is much worse, because to describe an industry response as either "sticking to" or "breaking" the peace deal is extremely contestable.  Firstly, the deal was heavily dependent on legislative outcomes that have since been repealed, so it's debatable whether there is any agreement left to break.  Secondly, the agreement was to support the reservation of particular areas - there was nothing explicit about not logging those areas in the event that the agreement was not implemented by government or was repealed.

The TAI state director Leanne Minshull bemoans that forestry related polling is especially prone to accusations of skew and flawed question design.  Political desires to discredit such polls are doubtless part of this, but overwhelmingly the polls discredit themselves.  A very high proportion of forestry-related polls are commissioned by activist groups or industry groups, and for whatever reason neither seems interested in asking straight questions using robust methods.

Minshull's article also refers to very negative results for native forest logging in "the forest industry's own polling" in the form of "research with the University of Canberra in 2018", but actually it wasn't polling (and the data were from 2016).  It was in fact a section in a large opt-in survey called Regional Wellbeing.  Regional Wellbeing does employ scaling by occupation, gender, age and location, but its opt-in nature means it is not really the same as a phone poll or panel poll.  Also, one report of the survey reproduces tables which suggest that the question about support for native forest logging immediately followed a question about converting "good agricultural land" to forestry, an emotive issue for some farmers especially.  The results are striking nonetheless, but I'd be interested to see a test by a method that isn't primarily opt-in.
