Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Today on Data Nation, Brendan Lind and Professor Andrew Gelman are here
to discuss polling and digital marketing's effects on elections,
and how these patterns could affect the upcoming presidential election of 2024.
I'm Liberty Vittert, Professor at Washington University in St. Louis,
and my co-host is Munther Dahleh,
(00:20):
William A. Coolidge Professor in Electrical Engineering and Computer Science at MIT.
The United States presidential election season this year is a time of tension and
polarization in our country. Besides the election itself, poll results in
the campaign season stir a lot of emotion in the American public. In fact, according
to a study conducted by the National Opinion Research Center at the University of Chicago,
(00:44):
six in ten Democrats felt fearful or angry when contemplating a Trump victory in 2024.
On the other hand, four in ten Republicans felt the same way about a Harris victory.
With emotions high, pre-election polling can induce a lot of stress,
ultimately placing pressure on pollsters. But on top of these rising pressures,
(01:05):
it has been even harder for pollsters to gather data.
Nate Cohn wrote in The New York Times that it took two hours of dialing random phone numbers,
once the traditional method of polling, just to collect one interview. These changes
have left pollsters still trying to figure out how to develop new methods to collect data.
Academic pollster Andrew Smith stated that pollsters are still researching the best
(01:28):
ways to collect data through web-based surveys. With social media on the rise,
this has also affected polling and data collection.
A Pew Research Center study found that 60 percent of
American adults often get their political information from a smartphone, computer,
or tablet. With this dependency on digital media for news, it can be very easy for Americans
(01:50):
to be exposed to fraudulent or inaccurate information. Thus, this country faces a new
challenge during the election season, dodging a sea of inaccurate or biased digital marketing.
Ultimately, with such a controversial and high-stakes election, the events leading
up to the big vote will shape the future of our country. Today, we are speaking
(02:10):
with Professor Andrew Gelman, statistician and political scientist, and Brendan Lind,
founder of The Human Agency, a company focused on the digitization of political campaigns,
to better understand polling and digitization's role in the 2024 presidential election.
Munther (02:26):
Okay, maybe
I'll kick off by asking a question to Andrew. For
our audience, at least, let's talk a little bit about polling. What happens with polling and what
are the main ideas behind it? Why is it that many people believe that polls are often very,
very wrong, you know, or at least off
(02:48):
by enough to make us distrust them? And that has actually happened
in both the 2016 and 2020 elections, where the polls weren't that accurate. So
maybe you can give us an idea of how we should think about this and how we interpret them.
Andrew (03:01):
Well, the polls were off by about two
and a half percentage points in 2020. That's
pretty good, actually, considering the response rate of polls is less than 5 percent. They try
to get a representative sample of voters, then they adjust for known differences
(03:22):
between the sample and the population. So, if your survey has too many women or too many old
people or too many people in some state or another, compared to their estimate of the
general population of voters, that gets adjusted for, and they'll adjust for education. Sometimes
(03:42):
polls will adjust for party identification and who you voted for in the previous election,
which is challenging because there are new voters, but you can still do it.
So after all those adjustments, the polls can still be off because of who responds
and who doesn't. For most purposes, being within 2 percent is pretty good. If you
(04:08):
have a very close election, then that's not super precise. But in any case,
there can be changes in opinion after a poll is taken before people vote. So there'd be
no benefit in having a perfect poll anyway, given that you're measuring a moving target.
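To make the adjustment Andrew describes concrete, here is a minimal sketch of post-stratification weighting, one common way pollsters reweight a sample to match the population. The demographic cells, population shares, and survey responses are all invented for illustration:

```python
# Minimal sketch of survey weighting (post-stratification).
# All cells, shares, and responses below are invented for illustration.
import pandas as pd

# A toy survey: each respondent falls in a demographic cell and states a vote intention.
survey = pd.DataFrame({
    "cell":     ["young_F", "young_F", "old_F", "old_F", "old_F", "old_M", "young_M"],
    "vote_dem": [1,         1,         0,       1,       0,       0,       1],
})

# Assumed population share of each cell (e.g., from census or voter-file estimates).
pop_share = {"young_F": 0.25, "young_M": 0.25, "old_F": 0.25, "old_M": 0.25}

# Weight each respondent so the weighted sample matches the population shares.
sample_share = survey["cell"].value_counts(normalize=True)
survey["weight"] = survey["cell"].map(lambda c: pop_share[c] / sample_share[c])

raw      = survey["vote_dem"].mean()
weighted = (survey["vote_dem"] * survey["weight"]).sum() / survey["weight"].sum()
print(f"raw estimate: {raw:.2f}  post-stratified estimate: {weighted:.2f}")
```

Real pollsters weight on many more variables (education, party ID, past vote) and use methods such as raking when full cross-tabulations aren't available.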
Liberty (04:25):
I mean polls, you know,
obviously sort of - election
outcome predictions and poll data can influence voter behavior. You have the bandwagon or
the underdog effect. When people vote for someone just because they're projected to win,
that's called the bandwagon effect, and the opposite is the underdog effect. So,
(04:48):
is it better to be the underdog? I get those emails, those political emails that say ‘I'm
going to lose if you don't give me money right now!’ Or is it better to have people hopping
on your bandwagon and think that it's, like, almost an inevitability that you're going to win?
Andrew (05:03):
In the general election for president,
I haven't seen evidence for either of those.
In the presidential election, you may already have a strong preference. So, imagine you're
considering voting on some minor issue and there are two candidates, or a local election and you
have to weigh the issues. Do you think that your vote would be much swayed by whether you
(05:25):
thought someone was a little ahead or a little behind? I don't think so. These will have effects
in primary elections. So, primary elections are different because there are multiple candidates.
There's a need for strategic voting. You don't want to waste your vote. So, a lot depends on
who you think might be first, second, or third. In a primary, the candidates are typically very
(05:50):
similar in policy, meaning that you have less of a strong reason to support one or another. Also in
primaries, the lineup keeps changing week after week, so you don't have that much time to make
your decision and get used to your decision. There are polling biases. One well-known bias is what
we call differential non-response, which is that if your candidate is doing well, empirically it
(06:13):
seems that you'll be more likely to respond to a poll. And when your candidate isn't doing as well,
you're less likely to respond. So surveys that don't adjust for partisanship of the respondents
will tend to overestimate swings in the polls. Swings will get exaggerated. A small swing
in one direction will be coupled with a swing in response as well. We try to adjust for these things,
(06:37):
too. But I'm not so concerned about what you're talking about, this bandwagon underdog thing.
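A quick simulation shows the mechanism Andrew describes: hold true preferences fixed at 50/50, let response rates move with the news, and an unadjusted poll will still show a swing. The response rates and "news shock" values here are invented for illustration:

```python
# Toy simulation of differential non-response: preferences never change,
# but supporters of the side with good news answer the poll more often.
# All rates below are invented for illustration.
import random

random.seed(0)

def unadjusted_poll(dem_news, n=10000):
    """One poll with no partisanship adjustment; dem_news shifts response rates only."""
    responses = []
    for _ in range(n):
        is_dem = random.random() < 0.5                        # true 50/50 electorate
        respond_p = 0.05 * (1 + (dem_news if is_dem else -dem_news))
        if random.random() < respond_p:
            responses.append(is_dem)
    return sum(responses) / len(responses)

for shock in (-0.2, 0.0, 0.2):                                # bad, neutral, good week
    print(f"Dem news shock {shock:+.1f} -> measured Dem share {unadjusted_poll(shock):.2f}")
```

With a +0.2 enthusiasm shock, the measured share moves toward roughly 0.06 / (0.06 + 0.04) = 60 percent even though no voter changed their mind, which is exactly the exaggerated swing Andrew warns about.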
Liberty (06:43):
Brendan, do you see that sort of
strategically when you're planning campaigns?
Brendan (06:47):
I think that anything can fit into
a narrative. So if an underdog narrative is
helpful for you and you think that it helps you tell your story better, maybe that is helpful
in a marketing persuasion way. If making yourself seem like you're just clearly the best choice,
(07:08):
so you're the dominant candidate, that can be helpful from a marketing perspective. If you're
like the only choice for a sane voter, maybe that can be helpful. I think a lot of it comes down to
how you can weave it into messaging and fit it within your narrative. In general, outside of that, it's
not inherently better or worse. It's all about what story you're telling and whether that resonates.
Munther (07:29):
So, you know, just maybe a slight
exception to this is what's happening right
now in the swing states and in particular with regard to Arab and Muslim voters,
because the issue there now is sort of putting pressure on the Democratic Party to make a change
in terms of policy with regard to Israel. And, you know, there's a narrative, I think, Brendan,
(07:52):
to your point, that we should vote for Jill Stein because that puts pressure on the Democratic Party
to actually make some concessions or exert some sort of pressure on Israel.
That narrative, and the polls now showing more and more Arab and
Muslim voters supporting Jill Stein, are actually helping people make some decisions about what
(08:13):
they want to do, because people in general are not single-issue voters but
rather vote broadly on the president. So there is a situation where I see, potentially,
polls showing support for a third-party candidate actually swaying opinions about collective behavior.
Brendan (08:32):
I think it's an interesting point
and I think it fits within that narrative
of - if people feel like ‘okay, no one's paying attention to the plight of the Palestinians,
and so I'm going to go cast an anti-vote for Stein’, and they think that Kamala's
going to win anyway, maybe they feel comfortable with that. If they think Kamala potentially is
(08:56):
going to lose - so really they're voting for Trump, essentially, and they see it
like that - great. A lot of times people don't make those rational steps as well,
and so you kind of get into this - to Andrew's point - maybe they weren't really going to vote
anyway, but there are some people who are going to make an anti-vote because of the narrative,
and when that happens, it comes down to how you frame it. If I was Kamala and I was looking at ‘how am I
(09:19):
going to influence people who are going to be heavily influenced by the situation in Gaza?’
I would be thinking about how a vote for Jill Stein, et cetera, is a vote for Trump.
Andrew (09:31):
So I kind of disagree with your framing
in the sense that I think that people who might
vote for a third party candidate like that, if that's not an option, it's likely they won't vote,
but if they do vote, they might vote for either of the other two. So the idea that someone is voting
for a particular third party rather than the Democrat is already, like, only part of the story,
(09:58):
because you also have people voting for the third party candidate instead of the Republican.
Brendan (10:04):
Completely.
Andrew (10:05):
And certainly the Republican
policies on these issues are also up for
grabs. So I wouldn't want to frame it as only the Democrats have agency somehow in that way.
Munther (10:17):
But I do want to follow
up a little bit because Brendan,
probably part of what you do in digital advertising and so forth is also to deal
with this massive sort of pressure that is coming in on social media, framing a particular
narrative. You hear that narrative all the time. It's not about the Republican Party.
It is about putting pressure on the Democratic Party. And so - and I think that that's the
(10:42):
narrative you see with respect to this particular conflict. None of these people would potentially
have voted for Stein otherwise. And I think I'm trying to understand whether
this narrative is putting on pressure, and if this pressure is happening, what is the percentage
of people? Because we're not looking for a large number of people to flip the votes in small swing
(11:04):
states. Maybe it's on the order of 100,000 people; that's not a large number of people.
Liberty (11:09):
Just to make the point, I
mean, you saw it to some degree in
the Republican primaries with Ukraine. There was pressure to pick candidates
that were sort of on the Trump side of ‘don't give any more funding to Ukraine’.
And the pressure there was that people wanted to vote for the more Trumpian
candidate that was against Ukraine funding rather than the
(11:32):
non-Trump Republican candidate that was for Ukraine funding. So you sort of saw it in the same way. Or do you
think it works the same way, in the Republicans and the Democrats, with these two conflicts?
Andrew (11:42):
I don't think they're the same.
I just, I think that these issues affect
both parties. Both parties have complicated sets of views and they involve coalitions
and the ultimate policies that get done will involve both parties also.
Munther (11:57):
And so how responsive do you
see candidates for the presidential
race? How responsive are they to polls?
Andrew (12:06):
I don't know what candidates are
doing. But let me say that
there are different levels of response. One is allocation of campaign effort,
and another is position taking. And you don't need any polls ahead of time to know that moderate
positions will be more popular, that they will tend to get more votes than extreme positions,
(12:33):
and you don't need polls to know which states will probably be the swing states. So at the presidential
level, I don't know that they're learning that much from polls, although, of course,
they're getting information anyway. When you're talking about congressional races,
then sure, then there are going to be some surprises,
and that could affect their resources, not to mention state legislative races and so forth.
Munther (12:58):
Brendan, what do you think?
Brendan (12:59):
Yeah. I don't think the campaigns necessarily pay
enough attention to the polls, or
are necessarily shaping their platform around the polls, if their goal is purely to win. But
it becomes kind of a challenge of - are you trying to listen to the polls to become more
like the average voter? Are you trying to be newsworthy? Are you trying to be someone who
(13:22):
feels like you're actually a human being and you have your own identity and you're not just
a product of popular opinion? Those things also shape people's willingness to vote for
a candidate. And you can see that with candidates across the board. Trump is someone who has in no
way really moderated his views to any of these polls. But part of the appeal of Trump is - one,
(13:47):
people think he is his own person. And then this other super important piece,
which is getting airtime, getting exposure. Trump had so much of this in 2016, and he's
generally benefited from this throughout: that free exposure, people hearing about you,
influences behavior. One part of advertising is how resonant your message is, and another part is
(14:11):
how often people are seeing it. And so, if you get a massive amount of exposure or you get a
fair amount of exposure with the right amount of resonance, you can create the right combination.
Liberty (14:22):
Just to sort of change the framing
of this, we've all been talking sort of about
what we think, statistically speaking, of what a poll is or how we make outcome predictions,
but there are other sorts of methods for predicting election outcomes. So, really to
(14:42):
both of you, for example, we have Alan Lichtman, who's predicted the outcome. I think he's-
Andrew (14:48):
Let's not - let's move on
from that. That's a joke. He has not
accurately predicted - I don't even want to go there as a serious thing.
I will say, economists and political scientists have been looking at this for a long time. Steven
Rosenstone wrote an excellent book in 1983 on forecasting elections. There are a zillion polls
(15:11):
now, but before the '90s there weren't that many publicly available polls, and polls jumped around
a lot during the campaign. They've been very stable since the year 2000, but before that,
in earlier campaigns there were huge swings; a candidate could be up by a lot, then down by
a lot. And predicting the election without using polls at all, but just using
(15:37):
the economy and incumbency, historically did better than polls. Now we have so many polls,
like people complain that they can be off by two percentage points. That's not really so much,
but if the election is super close, you can't get a deterministic prediction. If you have somebody
saying, "I have a method that correctly predicts the winner in 1960, 1976, 2000, 2016, and 2020,"
(16:08):
they were all basically tied elections. That would be like someone telling you they could predict
the outcome of a coin flip. The very fact that they're making that claim discredits their method.
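For readers who want to see what a fundamentals model of the Rosenstone flavor looks like mechanically, here is a minimal sketch: regress the incumbent party's vote share on economic growth and incumbency, then forecast. The historical rows and resulting coefficients are invented placeholders, not real estimates:

```python
# Sketch of a "fundamentals" election forecast: vote share from the economy
# and incumbency, no polls. The data below are invented placeholders.
import numpy as np

# Toy history: columns are [GDP growth %, incumbent on ballot (1/0)];
# y is the incumbent party's two-party vote share in percent.
X = np.array([[3.0, 1], [0.5, 1], [2.0, 0], [-1.0, 1], [1.5, 0], [4.0, 1]])
y = np.array([55.0, 49.0, 51.0, 46.0, 50.0, 57.0])

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
intercept, b_growth, b_incumbent = np.linalg.lstsq(A, y, rcond=None)[0]

# Hypothetical election year: modest growth, no incumbent running.
forecast = intercept + b_growth * 1.8 + b_incumbent * 0
print(f"forecast incumbent-party share: {forecast:.1f}%")
```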
Liberty (16:19):
So in the past, let's say
fifty years, there have been methods
where people have been using the state of the economy or the state - but isn't that kind of
what polls are doing now? I mean, they're not really using the state of the economy,
they're using what people think is the state of the economy. Is that fair?
Andrew (16:35):
No, I mean, no, it's not fair. (Laugh) I
don't think, I mean, I think that polls are asking
people who they would vote for if the election were held today. We found in recent years,
if you ask people who they plan to vote for if the election were held today - we looked
at state polls for president, Senate, governor - they're pretty good. The errors are about twice
(17:00):
the stated margin of error. So roughly speaking, there's a margin of error from sampling, which is
what's reported when the poll is reported with a margin of error, and then there's another margin
of error because there's potential biases. You're not actually getting a completely representative
sample of people. People are changing their opinion. So, really you should kind of double
(17:21):
the margin of error. But there's no economy in there. You're just asking people who they plan to
vote for. If you want to forecast the election based on the economy you could do that too,
and that's pretty good at the aggregate level. You're not forecasting individual people's votes.
You're just saying, in aggregate, this is the vote share that a candidate should receive. But yeah,
(17:42):
it is very predictive. And in this election, for example, the economy is, by
historic post-war standards, okay but not great. And the incumbent is unpopular,
so there's a kind of negative incumbency effect a little bit. And when you put that together,
(18:07):
it ends up corresponding to a forecast that the election could be very close. And the polls also
are very close. So in this case, the polls and the forecast are similar, and so it's really not that
hard to combine the information. There are times when the polls aren't quite where the forecast is,
(18:29):
and then people have lots of discussion about what's going on and why that is.
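Andrew's "double the margin of error" rule of thumb is easy to put in numbers; the reported margin of error covers sampling variation only. A rough sketch, with a hypothetical 1,000-person poll:

```python
# The reported margin of error covers sampling error only; Andrew's rule of
# thumb is to roughly double it to allow for non-sampling error (non-response
# bias, opinion change, etc.). The poll size here is hypothetical.
import math

n, p = 1000, 0.5                                     # sample size, proportion
sampling_moe = 1.96 * math.sqrt(p * (1 - p) / n)     # what the poll reports
total_moe = 2 * sampling_moe                         # rule-of-thumb total error

print(f"reported MOE:  ±{100 * sampling_moe:.1f} pts")
print(f"realistic MOE: ±{100 * total_moe:.1f} pts")
```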
Brendan (18:35):
I think one place
that's kind of fun as an alternative spot to go
and look at where people think this election may be going is actually the betting
markets. You've got a bunch of people who look at the data, look at the information, go with
their guts, but are willing to put the money out there. Is it that scientific? I mean, clearly it's
not that scientific, but I think it does kind of bring a new approach to kind of looking at
(19:01):
what likely outcomes are. And you can go today and look at it, and it's pretty much a coin flip right
now in terms of where people think it's going. It rotates a little bit back and forth. And I think
that probably is actually pretty reflective of the polling and the state of the race.
So I think if you're looking for something fun, to get another perspective or to even say, where's
(19:22):
this going? You can try to aggregate the polls, you can go to your Nate Silvers or whoever
you want and you can also just go to a little betting market and have a little fun.
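If you do look at a betting market the way Brendan suggests, the quoted odds need one small transformation before they read as probabilities, since the bookmaker's margin makes the raw implied probabilities sum to more than one. A minimal sketch with invented odds:

```python
# Convert quoted decimal odds into implied win probabilities, removing the
# bookmaker's margin (the "vig"). The odds below are invented for illustration.
decimal_odds = {"Candidate A": 1.95, "Candidate B": 1.95}

raw = {k: 1 / v for k, v in decimal_odds.items()}       # naive implied probabilities
overround = sum(raw.values())                           # > 1 because of the margin
implied = {k: p / overround for k, p in raw.items()}    # normalized to sum to 1

for name, p in implied.items():
    print(f"{name}: {p:.1%} implied chance")            # ~50/50, a coin flip
```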
Liberty (19:31):
I just wanna jump in, I have
a follow up question here. Who are the
polls then really for? If all a poll does is state who people would vote for if the
election were held today, which it isn't? It's not really a representation of
the economy. There's no real bandwagon or underdog effect. What's a poll for?
Andrew (19:51):
My impression is that the polls are
loss leaders. Presidential preference polling
is a loss leader for commercial pollsters. So if you're a pollster, you're going to have a
bunch of questions like, what refrigerator do you want to buy? And if you could buy a car online,
would you do it? And all sorts of things like that, and then they also ask, who do you plan to
(20:15):
vote for president? Then it gets on the news, ‘the so-and-so poll reports, blah, blah, blah’. It's
a form of advertising. Now, who's it for? It's for news consumers. Most people who are going to vote
already know who they're going to vote for - that might even describe you. You may already
have decided who you're going to vote for. I'm not going to ask you, but you might well already
(20:36):
know. Most consumers of news who care about the election already know who they want to vote for,
so they're not that interested in learning about the candidates' positions, but they're still
interested in the horse race. And I think there's way too much polling. I think it's ridiculous.
You have these extremely online people, even my blog commenters, and they'll write things like,
(20:57):
‘we really need another high-quality poll in Pennsylvania right now’. It's like, why do you
need a high-quality poll in Pennsylvania? You know it's going to be close. Both candidates
are going to be campaigning there. Things can change between now and the election. Like,
what do you want? Like, people want a level of certainty that's just not possible.
Munther (21:14):
Brendan?
Brendan (21:15):
I actually have a question to throw
back to you, Andrew, there, is - you're saying,
Hey, a lot of these pollsters do this as a loss leader to then try and build their name,
end up getting other people to do polls with them. But you're also saying
this polling isn't really helpful at a national level. Then why is anybody
going to the pollster? Is somebody just basically saying, ‘Oh, well, they did a poll for Kamala,
they did a poll for Trump. I guess I should do a poll, so I'm going to go ask these people’?
Andrew (21:38):
No, that's not what I mean,
like - if you've ever been surveyed,
they'll ask a lot of questions and most of them aren't about politics.
Brendan (21:46):
Yeah.
Andrew (21:46):
So they ask you a bunch of questions about
what's your favorite movie star and what brand of
gas do you buy and things like that, and they ask who you're going to vote for. The way they make
money is selling poll questions, survey questions to companies who want to do market research. And
so, if you're the XYZ poll and you've been in the news a lot and now you're doing market research,
(22:13):
which poll should I use? Well, I've heard of the XYZ poll. They're pretty good. So,
I think it's a form of advertising. It goes the other way too. Maybe it was 2012; there was one of
these years when Gallup was embarrassingly way off, and I think they stopped doing horse race
polls because they already had a strong reputation and the risk was the other way for them.
Brendan (22:35):
I think to go to why people do polls
or even my take on these polls at different
levels of a campaign is - I somewhat think that these polls end up being something that
makes a campaign manager's job easier and their life better. They get a poll, it says,
‘Oh, here's four different topics that seemed resonant with people.’ In theory,
(22:57):
they could have known that without doing the poll. They went and paid their buddy a bunch
of money to go do it. So it kind of is this cabal of consultants. They then get these four messages,
they tell the campaign, here's the four things that we think you should spend money on,
and it provides confidence to the campaign and the candidate to think ‘well, this must be the
right thing’. It doesn't mean that there's a lot of substance to it, and I think a lot in politics,
(23:21):
you actually have campaigns that may not listen to the moderating influences
that may make it more likely for them to win, and may not actually be thinking strategically
about ‘how do I spend my money as effectively as possible to produce the best outcomes to
win this campaign?’ That's an issue at the top. It's an issue at the bottom.
And the more local you go, the more you run polls, the more money you're wasting. Often.
Munther (23:44):
So, it seems to me that, you know,
I'm a control theorist and we believe in
information in order to make decisions, right? And somehow this conversation is
slightly depressing to think that this information is actually not useful.
Brendan (23:57):
(Laughs)
Andrew (23:57):
Wait, stop for a second. The
point is not that - they already have
a lot of information. So already, historically,
they have an idea of where the swing states are and so forth. I think they maybe don't
have tons of information on what will be competitive local races and things like that.
Munther (24:15):
But I think that, you know, it seems
to me that, and maybe this is where I'm making
a big mistake in my thinking. It's in the close races that such information should become more
important than in other cases, right? So here's a situation where you're learning more
and more about the swing states, you're kind of shredding the population into smaller pieces.
(24:39):
Now it's like a Stackelberg game where they are going to invest resources to actually sway these
subgroups, okay, and then the subgroups will make a decision afterwards, right? And so continuous,
accurate polling gives you an idea of where these people are swaying, and of the small number of
people that are going to make the difference. At the end, it's still almost
(25:04):
like a coin toss, but somebody wins, you know? And I may want to get that coin toss to go my way.
Andrew (25:10):
Yeah, I mean, well, you know,
most elections aren't coin tosses,
but certain presidential elections have been recently, in the past few decades. It's kind of also
complicated because of the electoral college. So if there were a popular vote, then the Democrats
would have won all these elections. Maybe then the Republicans would have slightly different
(25:30):
policies or different candidates to get that magic two percent. I don't know, I think the strategic
aspect of campaigning is somewhat overrated. Sure, the candidates can go in person to certain
states and do rallies, but again, they kind of know ahead of time which states are going to be
close. And the magic people they're trying to reach are the people who support them,
(25:54):
but only have a 50 percent chance of turning out. They're trying to persuade them to go. But I think
the idea of it being a game theoretic thing, like, we're going to Ohio, so now they're going
to Missouri in order to psych them out. I think that's like mostly BS. It's not like a poker game
(26:14):
or these like Operations Research problems from World War Two where you do a feint and you say
you're going to send your resources in one place and then it fakes out the other people. Reporters
like to talk about that, but I think it's more like brute force. You want your positions
to be nationally popular, you want to be perceived as moderate and you want your partisans to be
(26:38):
motivated to show up, which usually means you want your partisans to really hate the other side.
Brendan (26:43):
I think you can also kind of separate
out a lot of what we're talking about,
which is this moderate kind of universally appealing platform and say, okay, you can
know that you don't necessarily need a poll for that. And that is probably a strategy around how
you approach the high level issues or the things that you share with everyone. Back in the day,
(27:07):
if you ran ads, you ran TV ads. They were blunt force instruments that everybody
saw. And so there was no targeting. You didn't really have to match the message to the voter.
You just had to make sure the message was likely to move people in your direction and turn people off
of the opposition at large; it's like a club. Now, once you move beyond that blunt force
(27:29):
product and you really want to become effective, you can take targeted messages and actually send
them to a specific voter so it works to influence that voter. But that's only effective,
not if you've done polling at large, but if you really know who the voters are.
You've got your enriched voter file, your info on who is in this precinct,
who this person is, what motivates them. And if you can get targeting granular enough,
(27:51):
then it's really not about knowing the electorate at large, it's about knowing the person and what
moves them. That is a really complex challenge because most of the time, the data quality's
not that high. So you try to be precise, you try to hit someone with the right message,
you end up sending the right message to the wrong person, essentially. And so you have to
figure this out. If you can do it really well, you can actually sway and influence people.
Munther (28:15):
I heard a little bit of a maybe
contradiction in that conversation,
right? And maybe Brendan, you can elaborate,
because what you were saying, I heard as influencing that particular voter.
Brendan (28:30):
Yep.
Munther (28:31):
Not sensing what that voter is
going to do - you want to go to them and
affect them in a particular way, which to me is not a question of polling, it's a question
of advertising, actually - trying to convince you that I have the right set of policies.
Brendan (28:44):
Yeah, it's data collection. Can you
at a granular human level know what's going to
influence that person? Maybe you've captured that through that person's behavior. Maybe you've done
some different modeling based on things you know about them and then some extrapolation.
So there's that. And yeah, and then it's about can you actually reach that person? Polling’s
generally not that useful for that unless it's tied to extrapolation and saying, ‘okay,
(29:09):
based on our polling, we're now tying these specific likely characteristics
to people who we believe have the right makeup for having those views.’
Munther (29:18):
So then do we have data or statistics
on how effective such approaches are? I mean,
do they make a difference in the end?
Brendan (29:28):
Yeah, so I think the best example
of these approaches, in many ways you can
see from the 2016 Trump campaign, right? You have the whole Cambridge Analytica scandal,
and why that became relevant is that people said, ‘Hey,
we actually think that this was effective.’ The Trump campaign went out.
(29:49):
They got a ton of information on voters, they figured out what could persuade them
and then they sent insane numbers of unique ad variants to different targeted voters.
The best way to build up that file or that kind of database on the voter isn't through polling.
It's actually through running ads, through running texts, through running e-mails,
seeing what gets people to engage, tying that back to your database, and continually enriching your
(30:14):
understanding of the electorate. And you've seen that both parties have gone and tried
to create essentially shared data cooperatives. There's kind of competing ones sometimes within
the parties, but they'll create shared data cooperatives where they're not just relying on
‘here's a voter file’ or ‘here's something that this campaign got’, but how are we taking the
information that all these campaigns are getting, aggregating it together to really know the voter
(30:37):
as well as possible? And then, based on the information people have, it changes
how they approach targeting, advertising, and communicating - if they're running smart campaigns.
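The feedback loop Brendan describes - send variants, see who engages, fold that back into the voter file - can be sketched in a few lines. Everything here (the record layout, topic tags, and IDs) is hypothetical:

```python
# Sketch of engagement-driven voter-file enrichment: log which message topics
# each voter engages with, then target the topic that has worked best so far.
# All records, IDs, and topic tags are hypothetical.
from collections import defaultdict

voter_file = {
    "v001": {"precinct": "12A", "engaged_topics": defaultdict(int)},
    "v002": {"precinct": "12A", "engaged_topics": defaultdict(int)},
}

def record_engagement(voter_id, topic):
    """Tie an ad/text/email engagement back to the voter's record."""
    voter_file[voter_id]["engaged_topics"][topic] += 1

def next_message_topic(voter_id, topics):
    """Prefer whatever topic this voter has engaged with most so far."""
    counts = voter_file[voter_id]["engaged_topics"]
    return max(topics, key=lambda t: counts[t])          # ties: first in list

record_engagement("v001", "economy")
record_engagement("v001", "economy")
record_engagement("v001", "healthcare")
print(next_message_topic("v001", ["economy", "healthcare", "education"]))  # economy
```

In practice this greedy rule would likely be replaced by something like a multi-armed bandit, and the hard part is the data-quality problem Brendan mentions: matching engagements back to the right person.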
Liberty (30:46):
How does this sort of change the future?
I mean, you've done this for a bunch of different
campaigns, Bloomberg or Warren or whoever you've done this for. It feels like campaigns
right now are really grappling with how to reach voters in this sense. Is it through TV,
which, you know, is a blunt instrument? I don't know if that really works anymore. And
(31:09):
how much money do you spend on TV versus how much money do you spend on Facebook or Hulu
or YouTube or whatever? I mean, where do you see that moving in the future?
Are we at the point where people really have to decide one way or the other?
Brendan (31:22):
Yeah. So, you still see that people are
spending way more money on TV and traditional
forms of reaching voters than they spend on digital, but it's definitely evolving and
changing. And I think that what people are finding is that, in 2016, part of what
held back the Clinton campaign was that the consultants there were all TV people.
(31:42):
Those consultants had spent their entire life, they had their relationships, they made their
money, they had higher margins. It's also easier, less sophistication. So they kind of went that
way. Trump went a little more avant-garde with his campaign team in 2016, changed things up. In 2020,
you know, Biden started to move to catch up. And now with Harris,
it's something where you can go and look online. You can go see, okay, how many ads is
(32:06):
each candidate running and where? And if you go do that, like today, Harris is running 4,200 active
ads on Facebook today. Trump has 360. But if you go and you look at Google, so today on Google,
Trump's running about 30,000 different unique ad variants, while Harris is running maybe like a
tenth of that. And so you've moved from 10 years ago, when this was the land
(32:31):
that Trump dominated - these paid media ads and the variant strategy - to now, when they're both going
at variant strategies on different platforms. And then if you go back 20 years to Obama,
it was kind of before the era of these targeted digital ads. But he was really, you know, if you think of
texts, you think of e-mails, you think of even your website, Obama was going and using the
(32:53):
tools that were more active at the time to reach people in places no one else was. And
that was highly effective for increasing the touch points that you have with voters across platforms.
Munther (33:03):
So I'm trying to kind of
fuse all of these ideas together - this is a really
fantastic discussion - because on one hand, I'm getting a sense that the majority of the voters
have already made up their minds. They know what they're going to do. And polling is about really
sampling a percentage of those voters and there's a certain error that we can incur and so forth. At
(33:26):
the same time, there's so much fundraising and so much money put into campaigning in order to
sway the opinion of certain people. It seems like the campaigns believe there are enough people out
there to flip and vote the other way. And how much can happen in the next two months? Is that really
(33:46):
a possibility that, you know, with more money and more campaigns and more sort of targeted…
Andrew (33:51):
Well, I think the answer is in the
past, in previous decades, there have been big
swings in opinion between Labor Day and Election Day. If you judge from 2000 onward,
things have been very stable. You can't say like because things have been stable,
there's no point in campaigning, because that also has to do with both sides having roughly
(34:15):
equal resources. And so it might be you're pushing and the other side's pushing and nothing's moving,
but if you don't push and the other side does, things will happen.
Munther (34:25):
So then, Andrew, from your perspective,
and Brendan, you mentioned this about Cambridge
Analytica and so forth, but do we really know that actually Cambridge Analytica
changed the outcome of the election? I mean, that's a counterfactual, that's difficult.
Andrew (34:40):
Cambridge Analytica is very good
at promoting Cambridge Analytica. That's a
storyline that made them look good. It was also a storyline that various people in the media liked.
I'm not saying it's wrong, it's just that... Well, when an election is very close... I mean,
it's kind of meaningless at some point to say, because if an election is close enough,
(35:00):
almost anything can make a difference. My guess would be that if a campaign did no campaigning
at all, it would really hurt them. If they did a crappy job of campaigning,
maybe it would hurt them a little bit, a couple fractions of a percentage point of the vote. But
in a close enough election, a couple fractions of a percentage point could be important.
Brendan (35:23):
I do think you see a lot
of - to Andrew's point about the kind of
post-Labor-Day-to-election period - things that happen kind of organically that influence it.
So it's like, oh, here's what James Comey said, and that influences the outcome of an election.
But you also see that kind of even through paid media, right? So if we take it and we go back,
(35:45):
it's like, oh, 2004, you have the Swift Boat ads that came out and helped contribute to Bush
winning. You have the ways in which Obama was able to tie McCain to the, kind of, struggles of
the moment. In that instance, it ties paid media to the general national narrative at the time,
whereas something like Swift Boat was completely fresh and just threw a wrench
(36:10):
in the narrative otherwise. And I think you see throughout each election that there
are ways the ads can influence behavior, and to Andrew's point, the really tricky part
of this is you can play multiple sides. One part is getting people to vote for you, but the other
part is convincing people they don't want to vote for the other person. And that is in part driven
by campaigns, but you'll also see that the influence of PACs and outside money does a lot on the voter
(36:35):
suppression side, kind of doing that negativity of saying things the campaign doesn't want to say,
which really ends up making a difference - maybe keeping people at home.
Andrew (36:45):
There are actual laws and policies that are
used to suppress the vote also, so they can make
it harder for people to vote or make it easier for people to vote. And so that's part of campaigning
too. I mean, I think there's a lot of question about where the line is drawn of what's legitimate
or illegitimate campaigning. I think that most people would say that a campaign that actively
(37:10):
involves cheating or stealing votes or alleging fraud when there is no fraud, that seems like
it's going too far. Changing the laws to make it harder for some people to vote - does that violate
certain constitutional principles or not? It's not completely clear. And then in actual campaigns, like,
(37:32):
to what extent is it okay to lie? And candidates, you know, do that all the time. Campaigns notoriously
lie, and will try to do it in a way that if they get caught it doesn't matter. It's unfortunately
very hard to draw the line, but it's not new, obviously, in the history of campaigning.
Liberty (37:51):
Well, I think if the next two months
are anything like the last two months have been,
we're in for a very exciting and ever-changing election. Thank you all
so much for joining us today. It really has been an eye-opening experience, so thank you both so much.
Munther (38:10):
Yeah, thank you both.
Andrew (38:11):
Okay.
Brendan (38:11):
Thank you.
Liberty (38:16):
Thank you for listening to this month's
episode of Data Nation from the MIT Institute
for Data Systems and Society. You can learn more about IDSS and listen to previous episodes at our
website, idss.mit.edu, or wherever you get your podcasts. Don't forget to leave us a review and
(38:37):
follow us on Twitter at @mitidss to stay informed. Thank you for listening to MIT's Data Nation.