Thursday, June 29, 2023

Voice Referendum Polling: The Last Days Of The Yes Ascendancy?

TWO-ANSWER POLLING AVERAGE TREND: 50.1 TO YES (-3.9 in about a month)

(estimate updated 11 July, will be edited if more polls before next article)

(UPDATES ADDED: Notes rejecting the ACM reader survey, the Paterson Tele-Town Hall robopoll and the Australia Institute, plus Essential added 11 July)

----------------------------------------

It's only three weeks since the last one, but this week's Newspoll (among other developments) merits another chapter in the story of the referendum Yes vote's decline.  Sure, maybe I should do a federal roundup sometime, but on the other hand there's still not much to see there.  I can do it in a paragraph: Labor is currently at about 55.8% 2PP as a cross-poll average. While Anthony Albanese himself is being quickly cut down to merely mild popularity, there is no end yet to honeymoon vote shares for his party.  Even this week's 54-46 Newspoll came off primaries that would normally be good for 55.  There are some signs of improvement for Peter Dutton, whose own ratings have gone up just a little and whose deficit on Newspoll Better PM (20 points) is now not much larger than that indicator historically skews by.  So there are a few signs that at least leadership polling has the potential to get more interesting, but for now at least the Voice is where it's at.


The Newspoll with Yes trailing 43-47 with 10% undecided has obviously been the big news so far this week - the first poll of any significance with No unequivocally in front (Resolve was a split result, with No ahead on forced choice but Yes ahead with undecided allowed).  The state-based model, taking results from the previous poll into account, had more bad news for Yes, which was behind in four states (40-54 in Queensland, a horrendous 39-52 in Western Australia, 43-48 in Tasmania and 45-46 in SA) and ahead only in NSW (46-41) and Victoria (48-41).  

The other significant poll to surface this week was a JWS poll taken 2-6 June.  On the surface the 46-43 lead to Yes is identical to the Newspoll published at the same time, but the poll also released the information that the result among voters giving a Yes or No response was 51-49.  On this basis it is a slightly worse poll for Yes (the two-answer preferred vote might be inferred to be, say, 51.3 rather than 51.7).  The JWS poll received a 17% rating on Rotten Tomatoes as a result of some very eccentric state breakdowns, doubtless based on small sample sizes. (Among other things Western Australia was 13 points ahead of the national result whereas every other poll for months bar one has had Western Australia behind.)
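
For anyone wanting to check these conversions: the two-answer figure is simply the Yes share of the combined Yes and No responses, with undecideds set aside.  A quick sketch (illustrative only; the poll numbers are the rounded headline figures quoted above):

```python
def two_answer(yes: float, no: float) -> float:
    """Yes share of the combined Yes and No responses, excluding undecided."""
    return 100 * yes / (yes + no)

print(round(two_answer(46, 43), 1))   # JWS headline 46-43 -> 51.7
print(round(two_answer(43, 47), 1))   # Newspoll headline 43-47 -> 47.8
# JWS's released decided-only split of 51-49 suggests the unrounded value sits
# a little lower than 51.7, hence my "say, 51.3" above.
```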

More Fish John West Reject

I mention two other polls that I am not including.  A well-known WA pollster, Painted Dog, released a forced-choice result of 57-43 for its home state, the mirror image of the 43-57 that Newspoll recorded for WA on a two-answer basis.  However, at this stage Painted Dog have not passed my kennel licence, and I'm not even sure that they can.  There is simply insufficient information about the way the poll works (the previous report largely regurgitated media coverage).  There's nothing so far to allay possible concerns about the size of the source panel, the open recruiting of respondents alongside poll results, and the lack of information on weighting and respondent selection.  Painting dogs, it seems, doesn't make them any more transparent.

Furthermore, Painted Dog are conspicuous Yes supporters, writing in an earlier report that "we love to see monumental changes like these and are honoured to play even a small part in making a big difference", and in another that Western Australians "look to be on the right side".  I note that Essential are not neutral either, being described by Peter Lewis as "a proud Yes23 campaign partner", but at least we know a lot more about what Essential actually does and its past form, and they don't trumpet their results alongside calls to sign up to their panel.

I have also seen results from the Finder website added to the very helpful Wikipedia page but this is another one that I am not including because of grossly insufficient methods details.  The results sure look like a poll - perhaps a good one - but there's nothing about respondent selection methods, panel size, or weighting (if any); it's just "a survey" and apparently that's all we need to know.  Sorry Finder, but if you can't find it in you to help me find the way you're finding out your findings then my aggregate can only tell you to get lost.

Anyway, here's my latest colour-coded graph (I am now showing 2023 polls only given the volume of data):


Key to colours: Red - Newspoll, Magenta - Resolve, Yellow - Essential, Dark blue - JWS, Light blue - Freshwater, Black - Morgan.  

What is happening here is that the most recent Essential, a massive outlier (and clearly not an entirely random one, as Essential has had a house effect and a different trajectory compared to other pollsters for whatever reason), is doing a lot of upward dragging of a trendline that, without it, would have already gone below 50.  But here I'm not excluding outliers even if I think they're very likely to be wrong - I'm saying what the polls, on average, say.  

I currently estimate Yes as of today at 50.6% across the range of polls, noting that Simon Jackman's more statistically advanced model has 50.1.  Whatever it is, it's now very close and No may in reality be in front already (or Yes could have a slightly larger lead).  I think the main cause of the minor difference in our estimates is that Simon weights by sample size, whereas I don't apart from downweighting polls with under 950 respondents; weighting by sample size downweights Essential.  Neither of us weight for the reliability of different polls at elections or at the marriage law postal survey, and if we did then that would upweight Newspoll and put No ahead.  So there's an argument that No's already winning, and even if Yes has a fragile polling lead that's no guarantee (for error margin reasons alone, never mind the history of No overperforming final polling) that Yes would win even in a vote "held right now".  
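
For readers interested in the mechanics, here is a heavily simplified sketch of the kind of averaging being compared.  The poll values, sample sizes and the 0.5 downweighting factor are placeholders I have made up for illustration (they are not my actual inputs or settings), and real aggregates involve time-weighting and other adjustments not shown here:

```python
# Illustrative comparison of two aggregation choices: equal weights with small
# samples downweighted (roughly my approach) versus weighting by sample size
# (one respect in which Jackman's model differs).  All numbers are placeholders.

polls = [
    # (two-answer Yes %, sample size)
    (47.8, 1500),
    (52.2, 1100),
    (50.5, 600),
]

def my_style_average(polls, small_cutoff=950, small_weight=0.5):
    # equal weights, except polls under the cutoff get a reduced (assumed) weight
    weights = [small_weight if n < small_cutoff else 1.0 for _, n in polls]
    return sum(w * y for (y, _), w in zip(polls, weights)) / sum(weights)

def size_weighted_average(polls):
    # weight each poll by its sample size, which downweights smaller polls more
    return sum(y * n for y, n in polls) / sum(n for _, n in polls)

print(round(my_style_average(polls), 1))
print(round(size_weighted_average(polls), 1))
```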

Absent any further polls in that time, I will have No ahead this time next week.  

(NB I also ran a version with stable house effects assumed for each poll but at the moment the outcome was not significantly different.  The issue here is that Essential may have a variable divergence from others.)

State of the states

With the Yes vote at best trivially ahead according to polls, the state picture is also important because the referendum needs to win at least four states to pass.  For this component I am using an average of the ten most recent polls that have had state breakdowns, in each case finding the difference between the state two-answer result and the overall two-answer result, and then aggregating those differences.  But in this case I am weighting by sample size, because some of the pollsters are releasing embarrassingly small state samples that are extremely volatile.  Specifically, I am weighting each poll's state breakdowns by the square root of its national sample size.
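
As a sketch of that calculation (placeholder poll figures; the real aggregate uses the ten most recent polls with state breakdowns, and the poll selection and handling differ in detail):

```python
from math import sqrt

# Illustrative only: estimate each state's "lean" relative to the national
# two-answer vote, weighting each poll by the square root of its national
# sample size, then anchor the leans to the current national estimate.

NATIONAL_ESTIMATE = 50.6  # current national two-answer Yes estimate

polls = [
    # placeholder data: national sample size, national two-answer Yes,
    # and state two-answer Yes breakdowns
    {"n": 1500, "national": 47.8, "states": {"Qld": 42.5, "Tas": 47.3}},
    {"n": 1100, "national": 52.2, "states": {"Qld": 45.0, "Tas": 54.0}},
]

def state_estimates(polls, national_estimate):
    totals = {}  # state -> (sum of weights, sum of weighted leans)
    for poll in polls:
        weight = sqrt(poll["n"])
        for state, value in poll["states"].items():
            lean = value - poll["national"]
            w_sum, lean_sum = totals.get(state, (0.0, 0.0))
            totals[state] = (w_sum + weight, lean_sum + weight * lean)
    return {state: round(national_estimate + lean_sum / w_sum, 1)
            for state, (w_sum, lean_sum) in totals.items()}

print(state_estimates(polls, NATIONAL_ESTIMATE))
```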

For a national value of 50.6 to Yes this is what I get (and the result will surprise some observers, but I explain why it is so below).  In each case I give the estimated state vote followed by the estimated state lean compared to the national vote in brackets:

NSW 52.2 (+1.6) 

Vic 53.8 (+3.2)

Qld 44.5 (-6.1)

WA 48.8 (-1.8)

SA 50.7 (+0.1)

Tas 51.3 (+0.7)

Note that the WA result is distorted by the JWS outlier and another from Essential.

My aggregation still has Yes just ahead in four states, which may surprise given that Newspoll, Resolve and JWS all had it not winning enough states.  But Newspoll had Yes trailing in SA by only 1%, and Newspoll has the worst result for Yes nationwide so far, so the Newspoll result is consistent with SA being ahead of the national average.

Concerning Tasmania, Newspoll had Yes trailing by 5%, but on a two-answer basis that's only half a point worse than its national polling.  While data on Tasmania is scarce, both Resolve and Morgan recently had Tasmania well ahead of the national average.  So if Yes is still ahead nationwide, as the polling average appears to just barely have it (for the next few days, barring further polls, anyway ...), the existing polls on average imply Yes should still be ahead in Tasmania.  Virtually no confidence should be attached to this - Morgan breakdowns of voting intention for the state have often been all over the place, and Tasmanian panel pools for online polling will be small and probably difficult to make representative.
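
Working through the Newspoll breakdowns from earlier in the post on a two-answer basis shows where those leans come from (nothing new here, just restating the published numbers):

```python
def two_answer(yes, no):
    return 100 * yes / (yes + no)

national = two_answer(43, 47)   # 47.8
sa = two_answer(45, 46)         # 49.5
tas = two_answer(43, 48)        # 47.3

print(round(sa - national, 1))   # +1.7: SA ahead of Newspoll's own national figure
print(round(tas - national, 1))  # -0.5: Tasmania only half a point behind it
```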

I should note though that because the state samples for SA and Tasmania are so unreliable, there's a high probability that Yes isn't really ahead in one or the other, making the Yes "lead" even more fragile.  Indeed, a 538-style probability model would probably find a greater than 50% chance that No is "winning" based on state data already.  
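
To illustrate what such a 538-style calculation might look like, here's a crude Monte Carlo sketch using the state estimates above.  The uncertainty values are my assumptions for illustration (wider for the states with tiny sub-samples), the simulation treats the states and the national figure as independent (which in reality they aren't), and the output should not be read as an actual forecast:

```python
import random

# Crude sketch of the double majority requirement: Yes must win the national
# vote AND at least four of the six states.  Central estimates are the state
# figures above; the standard deviations are assumed purely for illustration.

STATES = {
    "NSW": (52.2, 3.0), "Vic": (53.8, 3.0), "Qld": (44.5, 3.0),
    "WA":  (48.8, 4.0), "SA":  (50.7, 5.0), "Tas": (51.3, 5.0),
}
NATIONAL = (50.6, 2.5)

def simulate(n_sims=100_000):
    carried = 0
    for _ in range(n_sims):
        national_yes = random.gauss(*NATIONAL)
        states_won = sum(random.gauss(mu, sd) > 50 for mu, sd in STATES.values())
        if national_yes > 50 and states_won >= 4:
            carried += 1
    return carried / n_sims

random.seed(2023)
print(f"Share of simulations where Yes carries the referendum: {simulate():.0%}")
```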

Online Denial Continues

The Newspoll result with the series' first No lead has resulted in not just outrage from the usual Labor-stan spear-carriers on Twitter, but complete insanity.  Many users have again responded by making completely false claims about the record of not just Newspoll but also polling generally.  

In particular, I'm seeing many false claims that polls (not only Newspoll) predicted Coalition victories in the most recent federal, NSW and even Victorian elections, when none of this is true, especially not for Newspoll.  (Indeed, only in the NSW lead-up was there, at some stages, real doubt about who was best placed, and only because of a skewed pendulum, not on two-party preferred.)  Some Twitter users are even calling people racists simply for tweeting impressions of the polling commentary and for debunking viral lies about the track record of pollsters, even when the person they are levelling the claim at has no form for racist commentary on the Voice and may even be an intending Yes voter.  In what universe this (which I myself have copped twice so far) helps their cause, I truly have no idea.

(In truth, the media deserve some blame for this for poor general coverage of polling issues, but the major culprits are the tweeps themselves, who swallow the most sensational or absurd polling reports by outlets they would normally distrust, and also mix up polls and publications, and if all else fails, just make stuff up.)

While I noticed early on that the Yes vote was declining, I've been surprised by the sheer rate of what looks a lot like an accelerating decline in the last few weeks especially.  Again, projecting a series outside the range of values that have occurred in it and declaring that that is what will happen isn't recommended, but there is at least the possibility that Yes could lose in every state and even finish up below 40%.  A heavy loss would have very different consequences than a technical loss based on not winning enough states.

Update 5 July: Late-Breaking PSOs

There has been no further sound polling yet since this article was released but there have been a couple of poll-shaped objects (PSOs) worth mentioning.  The most prominent has been the Australian Community Media reader survey, which has reported a headline finding of 38-55 (No way ahead; this would be 59% to No on a yes-no basis) together with regional, metro and Canberra breakdowns.  ACM publishes the Canberra Times and numerous regional and rural publications.

The data have been reweighted, but only in a primitive fashion, reportedly by age, gender and metro/regional location.  But this is not enough to eliminate any skews caused by the reader base of the publications in question, or by whether or not their circulation within regional areas is politically representative.  (There was also a survey panel used, but it accounted for only about 10% of respondents.)  It's also prone to the same risk as robo-polling: relatively few young voters would be captured, and those who were might be unrepresentative of their cohorts in a way reweighting would not repair (especially not if insufficiently fine-scale).  The difference between male and female voters, at about 20 points, is also suspiciously high.  The problem is that no amount of weighting will do a very good job of fixing a fundamentally unscientific way of finding respondents.

The other one was a Tele-Town Hall robopoll of Kooyong commissioned by Liberal Senator James Paterson, supposed to show that a plurality (which the Herald Sun calls a majority) of Kooyong voters would vote No (43.6% No, 42.5% Yes).  However, no details of question wording, script or weighting methods have been released, and this is a pollster with no public form record or Polling Council disclosure requirements.  Tele-Town Hall is best known in Australia for robocall conferences used by the Liberal Party in the 2019 campaign.  This poll would require caution on account of being commissioned by a politician alone, but the lack of public form or remotely adequate disclosure means it can be entirely ignored.  

Updated July 9: And Another One Goes, And Another One's Gone ...

Another poll not up to my standard for inclusion has been issued, this time by the Australia Institute.  TAI and the Institute of Public Affairs are both such frequent sources of skewed or flawed polls (from the left and right respectively) that I am tempted to issue a global 538-style ban on both of them such that I won't use their data even if one of their polls appears to have nothing wrong with it.  

The wording and answer options for the current TAI poll (which had a 52-33 yes-no response) are:

This year, a referendum will be held on the question of whether to enshrine an Aboriginal and Torres Strait Islander Voice in the Constitution. The Voice would be able to make representations to the Parliament and the Executive Government of the Commonwealth on matters relating to Aboriginal and Torres Strait Islander peoples.

In the referendum, will you vote “Yes” or “No” to alter the Constitution to recognise the First Peoples of Australia by establishing an Aboriginal and Torres Strait Islander Voice?

“Yes” to establish a Voice in the Constitution
“No” to not establish a Voice in the Constitution
Don’t know / Not sure

Contrast this with the actual referendum question and directions:

Referendum on proposed Constitution alteration

DIRECTIONS TO VOTER
Write "YES" or "NO" in the space provided opposite the question set out below.

A Proposed Law: To alter the Constitution to recognise the First Peoples of Australia by establishing an Aboriginal and Torres Strait Islander Voice. Do you approve this proposed alteration?

WRITE "YES" OR "NO"

The actual referendum ballot will state three times that the Constitution is to be altered.  The TAI question does include the word "alter" once, but it is downplayed by being in the third sentence, after extra detail about what the Voice does.  Then it is further downplayed by the use of answer options including "to establish a Voice in the Constitution", so it is quite easy for a respondent who is in a hurry to skim over the "alter" bit or to not think about it.  I really cannot see any reason for including "establish a Voice in the Constitution" as an answer option other than to drive up the Yes vote.  A further minor objection here is that some voters may in principle support adding a Voice but not this particular model.  Finally Dynata does not have a public track record of accurate election forecasts. 

For my comments rejecting earlier TAI polls with different wording, see the April edition.

July 11 Essential Update

This week's Essential has come in with a 47-43 lead to Yes (=52.2% on a two-answer basis), after Essential switched its methods from forced choice to allowing undecided responses.  (This is more consistent with Essential's general approach, though it is odd that Essential was apparently finding undecideds as likely to vote Yes under forced choice.)  

My revised graph has Yes precariously in front 50.1-49.9, ignoring house effects.  The issue here is that applying a global house effect correction to Essential, Resolve and Newspoll especially will put No in front, but because Essential has changed its methods it is possible that its house effect has also changed, and we will need to see more polls with the new methods to determine what impact this might have had.  Anyway, it's very close.  There was nothing in the Essential state breakdowns to give Yes further cause for concern; if anything there was an improvement, with a strong result in SA but a weak one in NSW (no data for Tasmania).

Essential has published information explaining the methods change, including results from a one-week A/B test that had a forced choice result of 56-44 but an undecideds-allowed response of 46-42-12 (note that these are small samples and so it cannot be assumed the difference caused by different methods was that large).  
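
Converting both arms of that A/B test to a two-answer basis (a quick illustrative calculation; as noted, the samples are small, so the apparent gap should not be over-read):

```python
# Essential's A/B test restated on a two-answer basis (illustration only).
forced_choice = 56.0                                  # 56-44, no undecided option
undecided_allowed = round(100 * 46 / (46 + 42), 1)    # 46-42-12 -> 52.3

print(undecided_allowed)                              # 52.3
print(round(forced_choice - undecided_allowed, 1))    # apparent method gap: 3.7 points
```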


July 13: Also a note re SEC Newgate - I have previously excluded this poll for its use of the term "update" among other wording issues.  Another objection to Newgate is that it is still using yeah-nah options like "somewhat support" that will not appear as options on the ballot instructions.  In theory one could manage polls like Newgate and the Australia Institute by including them and letting house effects do the rest (as Simon Jackman has done with TAI); the problem with this is that if there are more polls with design issues favouring one side than the other, they will distort the average through the sum-to-zero constraint.  Anyway, Newgate's late June result was 43-34 to Yes, down from 52-27, showing that even polls worded favourably to the Yes cause are recording sharp drop-offs in support over time.
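
To see why that sum-to-zero constraint matters, here is a toy demonstration with made-up numbers, just to show the mechanism:

```python
# Toy example: suppose the "true" value is 50, three sound polls read it
# correctly, and two polls with Yes-friendly designs read 6 points high.
# With house effects constrained to sum to zero, the adjusted estimate of the
# underlying level is simply the mean of the readings, so the skewed polls
# drag it upwards rather than being corrected for.

readings = [50, 50, 50, 56, 56]
estimate = sum(readings) / len(readings)
print(estimate)   # 52.4 -- pulled 2.4 points towards the skewed polls
```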


