
April 24, 2025 • 23 mins

This week we are doing a special podcast about our complicated relationship with political polls. As journalists, we like them because, maybe, they can tell us something about what voters are really thinking. But we are a bit wary of them too, especially after the federal election in 2019, where the polls were wrong. That caused a massive rethink in how polling is done, and how we in the media rely on it. Jacqueline Maley is joined by chief political correspondent David Crowe and special guest Jim Reed, who conducts the Resolve Political Monitor poll for our papers.

Subscribe to The Age & SMH: https://subscribe.smh.com.au/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
S1 (00:00):
From the newsrooms of the Sydney Morning Herald and The Age.
This is Inside Politics. I'm Jacqueline Maley. It's Friday, April 25th.
This week we're doing a special podcast about our complicated
relationship with political polls. As journalists, we like them because maybe,
just maybe, they can tell us something about what voters
are really thinking. But we are a bit wary of them, too,

(00:24):
especially after the federal election in 2019 where the polls
were wrong. That caused a massive rethink in how polling
is done and how we in the media rely on it.
So let's get into it. How are polls actually conducted?
How are participants recruited? How are the questions formulated? And
are they right this time around? Today, as always, we

(00:46):
are joined by our chief political correspondent David Crowe. And
today our special guest is Jim Reed, who conducts the
Resolve Political Monitor poll for our papers. Welcome, both.

S2 (00:57):
Thanks. Great to be back on.

S3 (00:58):
And it's great to be here. And it's also really good to be face to face in the studio, isn't it?

S1 (01:02):
I know, it's so old school, I love it. And we're being filmed, so everybody's on their best behaviour. Jim, I want to start with you first. We're going to
do a deep dive into polling and how it works,
what it's looking like for this election. But can you
just talk to us a little bit about how the
sausage is made? What exactly is a poll? How do
you find the people and how do you make sure
that the sample is representative?

S2 (01:23):
Sure, there's the how the poll is done, but also the why. I guess the why we do polling, which dictates the how, is that we really want to inform readers and listeners about what's going on and hopefully inform the commentary. So it's not just one person's opinion; there's a sort of safety in numbers, if you like, by assessing public opinion. So it's important

(01:44):
that we get it right. And the way you do that in research is by asking the right questions of the right people in the right way. And that's the same with any market research, regardless of whether it's political polling. So the first thing we do is make sure we're asking the right people, and increasingly we're doing that with online research panels rather than calling people up. That gives us a pretty cost-effective way of

(02:06):
getting to a representative bunch of people nationally. And we
set minimum quotas by state, by area, by age, by gender.
So we make sure we've got a good bunch of
people and we can, you know, fiddle about with the
back end a little bit with what we call weighting factors.
So if we're slightly down on females, for example, we
may upweight them a little bit in our sample. So

(02:27):
it's representative. And then we ask them the right questions in the right way. Very obviously that's vote, but we also go a bit deeper into some of those diagnostic measures: what issues are important, how committed are you in your vote, how likely are you to change, what do you think of the leaders, what do you think of this policy, that policy, etc. And I think one of the great strengths of our approach is

(02:48):
that we go deeper and talk about policy and issues and events, and we've got very particular ways, for example, of asking vote. After the 2019 polling episode, we came in and thought, how do we change this? And one of the major things we did was to take out the undecided option. Most polls give an undecided option. We took that

(03:09):
out, simply because it doesn't appear on the ballot paper. So we thought, let's emulate the real decision as it is in the election box.
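To make the weighting idea Jim mentions concrete, here is a minimal Python sketch of post-stratification weighting in general. It is our illustration, not Resolve's actual methodology, and the population targets and sample counts are invented; real polls weight on several variables at once.

```python
# Minimal sketch of post-stratification weighting: if the sample is slightly
# down on one group, upweight that group so the weighted sample matches known
# population proportions. Numbers are invented for illustration only.

population_share = {"female": 0.51, "male": 0.49}  # assumed census-style targets

sample_counts = {"female": 760, "male": 840}       # a 1,600-person sample, light on females
n = sum(sample_counts.values())

# Weight for each group = population share / sample share.
weights = {g: population_share[g] / (c / n) for g, c in sample_counts.items()}
print(weights)  # {'female': ~1.07, 'male': ~0.93}

# Toy vote question: (group, intends_to_vote_labor) for each respondent.
respondents = ([("female", True)] * 400 + [("female", False)] * 360
               + [("male", True)] * 380 + [("male", False)] * 460)

labor = sum(weights[g] for g, v in respondents if v)
total = sum(weights[g] for g, _ in respondents)
print(f"Weighted Labor primary: {labor / total:.1%}")  # ~49.0% vs 48.8% unweighted
```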

S1 (03:18):
That certainly makes sense. David, how do you use polls?
I mean, you report the polls that we publish, the Resolve Political Monitor polls, but how do you use them more broadly in your work?

S3 (03:27):
Really as a way to find out the mood of
the electorate across a whole range of fronts. And so
we ask the personal questions because we're interested in how
people view the performance of the Prime minister and the
opposition leader. Sometimes we've asked questions about the likeability of
different leaders. We're always conscious of asking a question as
pointed as possible about an issue that's running in the community,

(03:51):
whether it's Donald Trump or whether it's health policy or
education policy and so forth. So we're looking for ways
to really explain to our readers what the mood of
the electorate is, because each of us can be in our own bubble in politics in a way, because sometimes there's a self-reinforcing thing that happens in communities where

(04:13):
you're more inclined to read the stories that reinforce your own opinion. So sometimes we can use the poll to reflect a better sense of what the wider community feels. And sometimes this leads to polling stories that people disagree with. You know, if Peter Dutton is making inroads into Labor support by seizing

(04:36):
on complaints about the cost of living, our polling reflects that. And that's an important sort of statement about where the community is at. And it counters some of the spin that might be coming out of Canberra. Actually, it serves quite a useful function in that way.

S1 (04:50):
Yeah. It's a corrective to political spin. Jim, how do
you find the people that you poll and do you
pay them?

S2 (04:56):
We do. When they're recruited by online polls, they're usually offered an incentive for their time. We sometimes also mix a bit of telephone polling in with that, simply because we want to be as inclusive as possible. There are certain people, for example people aged over 80,

(05:18):
younger people with better things to do, or more affluent groups, who simply don't go on online panels, or at least not in the numbers that you'd like. So targeting them with mobile phone calls and other things helps to balance that. But certainly with online panels, it's pretty normal that you provide an incentive for their time. The great thing about online panels

(05:38):
is they're very easy to reach, but there are online panels and online panels. The most important thing is that they're recruited properly, because if they're just a sort of opt-in consumer panel, you tend to get skews and biases in there. And I think that was part of the 2019 problem with the polls that were around then. The best polls are the ones where you

(05:59):
recruit them randomly off the back of telephone surveys and
other things like that, and you get a random sample
of people.

S1 (06:05):
And you make sure that they're sort of serious people who are going to take the exercise seriously. I mean, how do you do that? So that you don't have some young kid doing, you know, a donkey vote or...

S2 (06:13):
You know, it's one of the things that, as a pollster, keeps you awake at night, I guess. The best polls, and I'm sure lots of polls do this, not just mine, have very rigorous quality control and security checks. So you make sure things aren't bots or AI by doing those little checks that we're all used to, and you do internal consistency checks. So you might ask

(06:35):
the same question twice, right, at different points throughout the survey. And if they answer differently between those two times, you start to get a bit suspicious.
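As a rough illustration of that internal consistency check, here is a sketch with hypothetical field names; real panels combine many such tests with bot and attention checks.

```python
# Flag respondents whose answers to a repeated question disagree.
surveys = [
    {"id": 101, "vote_early": "Labor",     "vote_late": "Labor"},
    {"id": 102, "vote_early": "Coalition", "vote_late": "Greens"},  # inconsistent
    {"id": 103, "vote_early": "Greens",    "vote_late": "Greens"},
]

suspicious = [s["id"] for s in surveys if s["vote_early"] != s["vote_late"]]
print(suspicious)  # [102] -> review or exclude before weighting
```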

S3 (06:42):
Yeah, right. The Voice referendum was an interesting example of, I think, the use of polling and the rigorous way in which it was done. Jim's work showed by the middle of 2023 that the majority were against the Indigenous Voice. And this was a very uncomfortable finding for people in the Indigenous community and in Labor. And I often

(07:04):
think back to that moment, because the polling became very consistent all the way through to the actual referendum, and wonder why people didn't act on the signal. It would have been a hard thing for people to do, to say, okay, this is clearly headed for defeat, so we're going to have to come up with a different proposal to take to the referendum.

(07:25):
For instance, taking out putting the Voice into the Constitution and maybe just focusing on recognition instead. But I often think back to the polling at that time and wonder whether a different pathway could have been found, given what the polls were showing.

S1 (07:41):
I mean, that was one example where the polling was spot on. Previously, we've had examples where the polling has been very misleading. I'm talking about the 2019 election, where the polls showed that Labor under Bill Shorten was going to win government, and they most certainly did not, to the extent that Scott Morrison called it, you know, a miracle.

S4 (07:59):
I have always believed in miracles.

S1 (08:02):
So what happened? Why didn't the polling pick up the miracle?

S2 (08:06):
Well, I think there's really no excuse for it, because we had polls published on the Friday and even the Saturday that were saying the opposite result. The results were only a few percent out, but it became very important because they called the outcome incorrectly. It's not unusual for polls to be a few percent out. The most important thing is that it's within your margin of error,

(08:28):
which is usually a couple of percent, and therefore we usually get it right. I think the polling industry did a lot of navel-gazing and soul-searching after that. Resolve was brought in partly because of that 2019 result, and we've had a pretty good record since then. We've done, I think, four elections and a referendum for The Age and the SMH, which have proven to be accurate within our

(08:50):
margins of error. To be fair to other pollsters, we
didn't have those legacy issues of, you know, using a
certain panel, using a certain question format and having those trends.
So we were able to start afresh. But I
think the other thing we made a conscious effort to
do was to look afresh at the vote question. Because the polls were saying a

(09:12):
certain thing right up until election day, one of the uncertainties was where the 5 or 10% undecided actually went on the day. They're usually excluded from polls in the voting result, so we decided to get rid of that. We force people, much like when they go into the voting booth and they're forced to tick an option or, you know, scribble a message

(09:35):
on the vote paper and spoil it, or just do a donkey vote; we allow them to do that same thing in our vote choice. And even though we usually do a poll about a week out from election day, it's still proven to be pretty accurate, I think really because of that.

S1 (09:52):
Well, we're going to find out in nine days, aren't we? Jim, in the interest of transparency, can you tell us more about the relationship between your company, Resolve Strategic, and whoever commissions the polling, like our newspapers? I mean, what are the expectations of you from The Age and The Sydney Morning Herald? And how do you make sure that the final set of questions doesn't accidentally or

(10:15):
inadvertently reflect somebody's bias along the way?

S2 (10:17):
Well, again, I think there's safety in numbers. We're usually interviewing 1600 people, sometimes more than that; our last poll will be 2000 people. So as long as you pick that sample and analyse it carefully, I don't think it should be biased. I think also there's a, how should I put it, an expectation of fidelity. So yes,

(10:39):
we want interesting results, but they've also got to be balanced and fair and true. And I think one of the problems that polling has in terms of fidelity and balance, or at least the perception of it, is that any polling company is usually working for other entities as well, you know, industry bodies, companies, etc. And there's nothing wrong with that.

(11:03):
There's nothing illegal in that, but I think transparency is a big thing as well. So if you're a pollster and you have an interest in working for a company or an industry, as long as you declare that, it allows people to decide what they make of the poll.

S3 (11:19):
Jim's being very polite about the way it's all done, I think. We're very happy with using Resolve. We did learn from the experience of 2019. Back then the consensus across the polls, and there was a report done about this by an independent agency after the 2019 election, seemed to be Labor ahead, 51-49. The

(11:41):
actual result was the mirror image of that. For a lot of the polls, the difference was within the margin of error, but a lot of the focus on the two-party-preferred number, boiling polls down to two numbers only, created this misleading impression of a Labor certainty. And one of the lessons for us out of that was, let's not emphasise only those two numbers. So we've

(12:05):
broadened the polling and we've tried to emphasise things like the primary vote for the major parties, which is so important, especially in this fragmented electorate where about a third of the voters are not choosing Labor or Liberal, they're choosing all these other options. And so Labor and Liberal rely on the preferences. So this was really important. Jim is not working for any of the political parties.

(12:26):
That's important to us as well. If there's any change to that, we would declare it so that everybody knows where they stand. And people have to bear in mind the margin of error, which for us, for Jim, is 2.4%. So if it's 51-49, remember the margin of error. It's real. But we're very happy with the polling because

(12:46):
every month Jim is polling 1600 people. A lot of
others are doing 1100 people or 1200 people, and their
margin of error is greater because of that. You know,
these are just the technical things where we've got to
be as good as we can, but we acknowledge that
it is not as precise as perfectionists might like, and
it's not a predictor. It's a poll of a particular

(13:08):
view across the electorate at a point in time, like
last weekend, not May 3rd.
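For readers who want the arithmetic behind those margins, the textbook 95% confidence formula for a simple random sample reproduces the figures David quotes. This is the standard calculation, not necessarily Resolve's published method.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (1100, 1200, 1600, 2000):
    print(f"n = {n}: +/- {margin_of_error(n):.2%}")
# n = 1100: +/- 2.95%   <- why smaller polls have a wider margin
# n = 1600: +/- 2.45%   <- matches the ~2.4% quoted in the episode
```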

S1 (13:13):
Yeah. How do we make sure that the polls serve a democratic purpose or a journalistic purpose, and that they're not just a horse race, you know, almost for their entertainment value?

S3 (13:22):
It's really hard because I get feedback from people saying, why aren't you writing more about policy? But when
I write about polls, there's huge engagement. And just to
be really frank about this, when I write about polls,
I get more feedback from politicians than I do when
I write about policy.

S1 (13:38):
The same politicians who say that they never look at
the polls, they don't read the polls. The only poll
that counts is the one on Election day.

S3 (13:44):
Breaking news.

S1 (13:45):
Yep.

S3 (13:45):
I mean, this is the thing. They're deeply engaged in this stuff because they're professional politicians and they've worked with numbers all their professional lives. And it's all about the numbers. So they're deeply invested. When we say something one side doesn't like, they let us know; they think the poll's wrong. But over time, you know, we're providing a consistent picture of where people are at.

(14:09):
And the most important thing is the trend. We found
the trend against the government when the cost of living
was biting. And certainly this year we've found a trend
toward the government as election day gets closer. This is the trend to watch.

S1 (14:24):
Yeah.

S2 (14:25):
And I think, just to pick up on your point there about the two-party-preferred vote, we have tried to avoid that horse race. We have a particular way, I think, of looking at vote where we concentrate on primary vote, because there are a lot of minor party and independent candidates that make up around a third of the primary vote now. So the way we think about vote is that it's

(14:47):
a measure, and it's a measure that doesn't move around a lot. So we like to concentrate on policy and other things. But we do report it, and we have to report it. Unless there's a leadership change, it doesn't move around too much, but it is a very handy measure over time of the health of the parties, including the Greens and independents and One Nation and others. Two-party preferred, the binary measure, kind of masks that.

(15:10):
So we've treated it as an opinion measure for the vast majority of the term. When we get closer to the election, we start to talk more about two-party preferred, to say, you know, this is where the position is. Because otherwise you have people working out what seats would change hands from a two-

(15:31):
party-preferred result, you know, two years out from an election.

S1 (15:34):
Yeah. Too far out, which is...

S2 (15:35):
Which is silly.

S1 (15:37):
Yeah.
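A quick illustration of the point Jim is making about the binary number masking a fragmented primary vote. The primary shares and preference-flow rates below are invented; real pollsters use flows measured at previous elections or respondents' stated preferences.

```python
# Toy two-party-preferred (2PP) calculation from primary votes.
primary = {"Labor": 0.31, "Coalition": 0.36,
           "Greens": 0.13, "One Nation": 0.07, "Other": 0.13}

# Assumed share of each minor party's preferences flowing to Labor.
flow_to_labor = {"Greens": 0.85, "One Nation": 0.35, "Other": 0.50}

labor_2pp = primary["Labor"] + sum(primary[p] * f for p, f in flow_to_labor.items())
print(f"Labor 2PP: {labor_2pp:.1%}, Coalition 2PP: {1 - labor_2pp:.1%}")
# Labor 2PP: 51.0%, Coalition 2PP: 49.0% -- the roughly one third of voters
# choosing Greens, One Nation or others disappears into the binary split.
```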

S3 (15:38):
There's a great jazz standard, It Don't Mean a Thing (If It Ain't Got That Swing), and I always think of that tune when I write in my stories that there's no such thing as a uniform swing. I kind of hum it to myself, because you cannot apply the national 52-48 or 51-49 to the seat of Chisholm in Melbourne, or the seat of Bennelong in Sydney.

(16:00):
It does not work that way.

S1 (16:01):
I know, and I think sometimes when we're reporting federal politics, particularly from Canberra, we forget that local politics is everything. And, you know, seat by seat is the only game in town. I just want to ask you both about how polling can affect the democratic process itself, or can affect opinion. It's supposed to be an objective measure, but sometimes polls, if

(16:23):
they go a certain way, will actually affect the way
that people vote. And sometimes the parties will take pride
in their underdog status or claim their underdog status in
order to get ahead. I mean, what do we sort
of do with that? Is that a risk?

S3 (16:37):
Isn't there a scientific principle where the process of the experiment can affect the outcome?

S1 (16:42):
Yeah, that's what I'm talking about. Exactly.

S2 (16:43):
The quantum observer.

S3 (16:44):
Yeah. But okay, so I was talking to somebody in the union movement recently who made the point that Labor is doing better, not just based on what we're finding with Resolve, but also based on the union movement's own polling. Does that then lead some people to think, oh, Labor's going to be in, I'll vote Green? Yes. That's

(17:05):
a factor for some people. So there is an impact. But the alternative is to not ask. What is that, don't ask, don't tell? Let's not ask people because it may influence people's judgment? Well, influencing people's view of politics is kind of what journalism is often about.

S1 (17:22):
We can't help it sometimes.

S3 (17:23):
Just reporting.

S1 (17:24):
What happens. Yeah, it is an interesting one, I think. What do you think on that point, Jim? That sort of idea of a feedback loop? Because it must also affect the parties psychologically; even though they don't want to claim the win, if they know that the polls are going their way and their internal polling is going their way, they get a pep in their step. And you can see that happening in the Labor campaign.

S2 (17:43):
Yeah, certainly. Anthony Albanese has seemed a lot more confident. Partly that's his position in the polls, I guess, but it's also just running a very decent, disciplined campaign. So they're in the ascendancy, if you like.
But I think, you know, just reiterating really what David said,

(18:05):
an informed electorate is an electorate that's ready to vote, and we're just part of the informing process. Reading through some of the reader comments on David's stories, which I occasionally do, there are the most wonderful comments: oh, I didn't know people thought this way; oh, I'll have to go and research this, etc. So informing people about what's going on is important and, you know,

(18:28):
you can't get around that. There are feedback loops everywhere. You speak to your family and your friends about what's going on in the election; that's exactly the same thing as a poll. So I have no issue with that. The issue is if polling were to get it wrong. There's only one thing worse than having no polling as a political party, and that's having the wrong polling. That is,

(18:50):
it's getting it wrong, and you're getting a wrong steer rather than no steer. So I think as long as we maintain our accuracy and our fairness and our balance in the way we present things, then it adds to the democratic process rather than detracting from it.

S3 (19:04):
We're not going to put a poll on our front
page on Election Day. You know, I don't see it
as my role to try and predict the outcome on
the day of the election. I'll comment on the outcome
on election night. But in the final days of the
election campaign, I want to report to readers what the
leaders are saying on either side, what the policies are

(19:26):
on either side, and any developments that are newsworthy in the actual campaign, rather than do polling on the Friday or Saturday that says we expect the result to be this way or that way. Because
I think at that point, voters, you know, might want
us to tell them where things are at in the
competing propositions and they'll make up their own mind.

S1 (19:49):
We do, however, have a new poll coming out next week, don't we, which is going to be the last one before the final day. I just want to ask you both very quickly what the polls are showing now, like what our last poll showed.

S2 (19:59):
It's essentially very close to where we were in 2022, after lots of ups and downs. Albanese started with a very strong honeymoon, which lasted for about a year. Then the long-term trend was Labor went down, down, down, down, down. Those rate rises started to bite, and the cost of living was biting more generally,

(20:20):
and the feedback we had in focus groups and respondent comments was that Labor looked, you know, distracted by the Voice and other things and not acting on the cost of living. So they really suffered from that; they lost a great deal of their vote. And at the start of this year, it looked like the Coalition were in with a chance of actually forcing Labor to

(20:41):
be a first-term government. But we've had this huge bounce back. Labor have gone from a two-party-preferred deficit to, I think in our last poll, a 53.5% lead over the Coalition, a very slight swing to them, which would seem to indicate a majority Labor government at that snapshot. But, you know, within our margin of error, and as David said, we should always look at

(21:03):
the margin of error, there's also the chance of a minority Labor government in there. We're essentially middle of the pack at the moment, and the polls are basically telling us that that's the state of play: majority or minority Labor government. But as we've seen in recent times, a week is a very long time in politics. So let's see where we go over the next week and

(21:23):
a half.

S3 (21:24):
I talked to a Labor MP the other day who said, I'm worried that we're getting complacent because the polls are telling us we're ahead, and we shouldn't take that for granted, because he was seeing people going into the early voting booths still grumpy about the cost of living. And I think that's worth mentioning. We don't just ask about leaders and, you know, primary vote. When we ask who's better to keep the cost of living low, Labor

(21:46):
were way behind on that 2 or 3 months ago. Now 30% say Labor and Albanese are best to keep the cost of living low, and 30% say the Coalition and Peter Dutton. So it's now neck and neck on that key question. But MPs on the government side are still getting this blowback on the cost of living. So I think we've got to emphasise it's still close.

S1 (22:09):
It's still anybody's game. Yeah. We will have to check the price of the sausages at the sausage sizzles on the day and see what's happening with inflation of the democracy sausage. Guys, thanks so much for joining us. That was really interesting. Thank you for coming in, Jim.

S2 (22:24):
Thanks for having me.

S1 (22:25):
And good to see you in person, David.

S3 (22:26):
Wasn't it good? Thank you. Cheers.

S1 (22:31):
Today's episode was produced by Julia Katzel with technical assistance
from Josh Towers and Taylor Dent. Our executive producer is
Tami Mills, and Tom McKendrick is our head of audio.
To listen to our episodes as soon as they drop,
follow Inside Politics on Apple, Spotify or anywhere you listen
to your podcasts. And to stay up to date with
all our election coverage and exclusives, visit The Age and

(22:53):
The Sydney Morning Herald websites. To support our journalism, subscribe by visiting The Age or SMH. I'm Jacqueline Maley,
thank you for listening.