
October 20, 2024 64 mins
Mis/Dis: Exploring Misinformation and Disinformation takes a look at how fake news, misinformation, disinformation, and deepfakes are being handled by media companies and journalists. In Part 1, we speak with Dr. Michael Spikes from the Medill School at Northwestern University. He specializes in Media Literacy with an emphasis on News Literacy. Spikes tells us how to become better consumers of news and how to be aware of bad information. You can learn more about Michael Spikes here

 Also, Jonathan Forsythe, Managing Editor of Verify. Forsythe is a journalist with many years' experience working with the Washington Post. He heads up a new team of journalists, editors and producers who work around the clock to verify incoming content before it airs on Tegna stations and digital platforms. See how you can verify your own inquiries here.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to KFI on Demand. I'm Steve Gregory. Thank you for downloading the KFI News special Mis/Dis: Exploring Misinformation and Disinformation. This is a composite of the four-hour radio show heard on KFI AM six forty.

Speaker 2 (00:16):
As misinformation and so-called fake news continues to be rapidly distributed on the Internet, our reality has become increasingly shaped by false information.

Speaker 3 (00:23):
The Internet was supposed to be the democratizing force in our elections, in our dialogue, in our country and around the world, and what we're seeing now is the opposite.

Speaker 2 (00:32):
Time after time after time that we have consumed or
been exposed to inaccurate information. You've got a massive company
like Facebook that is out there allowing misinformation to be
displayed on their platform.

Speaker 1 (00:47):
It's causing a.

Speaker 3 (00:48):
Lot of confusion. People don't know if the videos that they're watching are real, if the voices in the audio that they're listening to have been doctored. The Internet was supposed to make us, you know, more savvy, right? How do we get to this point?

Speaker 2 (01:00):
And what we see is people don't know the difference between something real and something created to deceive.

Speaker 3 (01:05):
What do we do about it? Like, we can't just let the world function this way.

Speaker 1 (01:12):
Officials in local, state, and federal governments say misinformation and
disinformation are serious concerns for democracy in our society. The
Pew Research Center says about sixty-four percent of Americans
believe fabricated news stories cause a great deal of confusion
about the basic facts of current events. A study by
MIT found that false information is seventy percent more likely

(01:33):
to be forwarded and shared than true information. A report
in twenty twenty estimated misinformation in financial markets could lead
to losses in the hundreds of billions of dollars due
to misguided investor decisions. A Gallup poll found that nearly
seventy percent of Americans express concern about the prevalence of
misinformation and disinformation. I'm Steve Gregory. For the next two hours,

(01:58):
we talk to media professionals, subject-matter experts, journalists, and scientists about the dangers of fake news and the weaponization of digital media, and how people can spot it, vet it, and make a more informed decision. This is part one of the KFI News special Mis/Dis: Exploring Misinformation and Disinformation.

(02:22):
Thank you for joining us. Everyone has a favorite place
to get their news, be it radio, television, newspaper, social media,
or simply word of mouth. But how's your media literacy
and how can you be a better consumer of news?
Doctor Michael Spikes is with the Medill School of Journalism
at Northwestern University. He studies and teaches media literacy.

Speaker 4 (02:42):
My particular area of focus and my research area is in what's known as news media literacy, or also known as news literacy. And what that does is it uses the practices and knowledge of journalists in the work that they do to help inform news consumers of the skill sets

(03:04):
that journalists take on to be able to create credible, reliable sources of information, so that those consumers can take up some of those skill sets. So, as an example, we would talk about the processes that journalists go through to verify information and to verify claims

(03:25):
that they make in the stories that they produce and distribute to audiences, to make sure that that information is credible, how to evaluate sources, and so on, so that the consumers of news can take that up. But we also use news content as a platform for practicing those skills.

(03:46):
So one of the things we say to the students in news literacy classes is, in order for you to really know these skills and take them up regularly: if something comes to you self-identifying as news, there should be a number of characteristics that you've learned journalists do to do their job, which you can use to judge whether or

(04:08):
not, and to judge the level of credibility of that content that's calling itself news.

Speaker 1 (04:13):
You know, I want to go back because when we
say media literacy, and in your specific focus, news literacy,
when did this become a concern? I mean when I
was going through high school, obviously we didn't have smartphones.

Speaker 5 (04:29):
You know, we didn't. I mean, we had pagers. That's the closest we got.

Speaker 1 (04:33):
But when we talk about this, when did media literacy
become an issue?

Speaker 4 (04:40):
Well, I mean, if you look at the history of
media literacy, news literacy and all these various sort of
approaches to it, you hear information literacy, digital literacy, all
these things. It's actually not that new, I would say. But it just depends on the context and the time

(05:00):
period of.

Speaker 6 (05:01):
The media that you're talking about.

Speaker 4 (05:04):
So you could be talking about, like, back in the eighties, a lot of conversations around media literacy were around, like, television and the impact of it. Also during that time
people were talking about the impact of video games. If
you go into the nineties, people would talk about like
the impact of violent video games on young people.

Speaker 6 (05:26):
So you had discussions of, you

Speaker 4 (05:29):
Know, building, I think in that time it probably was more so about, like, safety and putting in, you know, labeling on certain content. If you talk about, like, movie ratings, same kind of thing. It was very sort of protectionist. And that's actually a sort of approach to media literacy that colleagues and I talk about. We talk about a

(05:51):
protectionist stance, which is all about: media produces these kinds of harms, so we need to protect particularly young people from the harms that media can do, because media can persuade, media influences, all those kinds of things. But we also talk about it as an empowerment sort of approach too, because we also want to give people

Speaker 6 (06:15):
What I would call like we.

Speaker 4 (06:17):
Sometimes refer to it as almost like an inoculation against
the ways that media can influence. So again, it's really
dependent upon the time the mediums You can even talk
about media literacy. At the time when any new medium
of mass communication was born, you had people talking about

(06:39):
these issues, whether that be radio, TV, again, video games, and now it just seems to be all the more prescient for us because of the advent of social media.

Speaker 1 (06:50):
And do you think that when you look back at
news literacy, specifically the era of the Walter Cronkite evening
news when there was back when there was only one
you know, I mean three, I think three networks on
the air at the time. When you look at news
literacy for someone back then in that era, do you
think they were having the same concerns because that was
back during the Vietnam War when there was a lot

(07:13):
of propaganda. I think propaganda is one of the big
things that's been around since people were able to communicate.
There's always been propaganda. But when you're looking at something like that, with the Vietnam War, is that still akin to what we're seeing today, like with the Ukraine-Russia war, the war in the Middle East? Are those sort of the same issues when news literacy is being discussed?

Speaker 4 (07:34):
Well, I think, you know, generally we could say no, because we have, you know, technology. I think one of the main distinctions we have today versus before is that we largely live in a media ecosystem that provides us information from what I like to describe as disintermediated platforms. And what I mean by

(07:59):
that is that it's a loss of the middlemen that try to, like, direct the traffic of information that's being created by different types of creators. You know, I mentioned journalists; we can mention authors; we might even mention, like, academics who are writing books

(08:19):
or research reports. And then just the general, you know, sort of populace, who now has access to the tools that allow them to create and distribute media.

Speaker 6 (08:31):
Then we have the consumers.

Speaker 4 (08:34):
And in these disintermediated platforms, the connection between the consumers
and the producers of content is direct, So producers of
content can directly target their messages to people and get
them out very widely.

Speaker 6 (08:51):
That is, you have to compare that to the more mediated

Speaker 4 (08:58):
Platforms like television, radio, newspapers, and so on. In those kinds of mediums you have these intermediaries in the middle. So, in particular, in a journalism outlet, you have journalists, yes, that go out and report and they shape stories, but before those stories get broadcast, they go

(09:18):
to editors; they go through this process, and those editors make choices about what should be in that information, what should be included in those stories, how those stories are packaged, and also how relevant and how credible that information is before getting to audiences. Now, in these disintermediated spaces, human

(09:41):
intermediaries, I should be clear in saying, are sort of lost.

Speaker 6 (09:45):
Instead of them.

Speaker 4 (09:46):
Now we have these algorithms that make choices about what people will see on platforms like social media. And with that, those algorithms are tuned not for, like, informing purposes, but more so to increase engagement, and that engagement sort of translates into more time spent on the site, more likes

(10:10):
being clicked, more comments, without making any distinctions between, like, the types of comments or things like that. It might lightly do that, but if it sees that more people are responding to that content, it says: oh, I need to give you more of that so you will spend more time.
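The engagement-tuned ranking Spikes describes can be sketched in a few lines of code. This is purely an illustration: the field names, weights, and scoring formula here are invented for the example, not any platform's actual algorithm. The structural point it shows is that accuracy is simply never an input to the score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    time_spent_secs: float  # aggregate time viewers spent on this post

def engagement_score(post: Post) -> float:
    """Toy score built only from engagement signals (weights are
    invented for illustration). Whether the post is true or false
    never enters the calculation."""
    return 1.0 * post.likes + 2.0 * post.comments + 0.1 * post.time_spent_secs

def rank_feed(posts: list[Post]) -> list[Post]:
    # Most-engaged content first, regardless of whether it informs.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under a ranking like this, a post that provokes many reactions, even angry or credulous ones, will outrank an accurate but quiet one, which is exactly the dynamic described above: the system responds to engagement by serving more of the same.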

Speaker 5 (10:27):
More with doctor Michael Spikes.

Speaker 1 (10:28):
But first, this is the KFI News special Mis/Dis: Exploring Misinformation and Disinformation. Welcome back. We're talking with Michael Spikes. He's with the Medill School at Northwestern University and has a PhD in Learning Sciences. We've been discussing media literacy, actually focusing in on news literacy, and before the break, Michael,

(10:48):
we were talking about how I was kind of giving the comparison between the days of Walter Cronkite, you know, decades ago, when television news was just in its infancy
and people were still trying to figure out how it
was working, how it was impacting people, and you were
talking about, which is really fascinating, how algorithms have sort
of taken over how we consume news, and algorithms get

(11:10):
to decide what's put in front of our faces now,
and that humans are kind of out of the equation.

Speaker 5 (11:16):
What about this.

Speaker 1 (11:17):
Notion of misinformation and disinformation, and what can people do to make themselves more literate when it comes to consuming news, so that they're not being victimized by misinformation and disinformation?

Speaker 4 (11:31):
So, you know, taking a step back, I would definitely again emphasize the characteristic of the outlets that we are now getting information from, those being disintermediated versus the mediated platforms. Now, if you take these disintermediated platforms, again, like I said, those human intermediaries are out of that

(11:53):
process, and we have these more open, democratic platforms for the creation and sharing of information. Now, on the good side of that, it's given us access to a lot more voices that we didn't always hear from before, because in the past you had to have access to the capital, the equipment, and the means to be able to broadcast

(12:16):
messages broadly. Now granted, with that came certain responsibilities and certain standards. Now, in these more open, democratic platforms, we have many more voices that we can hear from, which is good. But at the same time, it's also opened the door to lots of different people who

(12:36):
might share information with intentions that don't always line up with those of the audience members. So, in particular, this is when we start to talk about misinformation and disinformation.

Speaker 6 (12:49):
We talked about disinformation.

Speaker 4 (12:51):
We talk about false information that has the intent, when shared, to mislead people. And then you have misinformation, which is false information shared without that intent, because usually the people didn't know that information was false. Now, if you take a situation like what's going on right now,

(13:13):
I just was talking with some folks this morning about mis- and disinformation that was being shared and was affecting people who are trying to recover from those huge floods in western North Carolina. And we're starting to hear that they are now getting messages, or pieces of disinformation, that might be misleading them about the resources they need to get.

(13:35):
And the reasons why or why not they haven't gotten it, or certain people have gotten it and certain others have not: what you have had is players who are sharing information on these disintermediated platforms, sometimes with the express intent to mislead people. And unfortunately, on these platforms, what we're finding, because again those human intermediaries are missing,

(13:59):
is that they can share these pieces of mis- and disinformation, and I should really harp on disinformation, without consequence. The consequence might be people taking disinformation up and sharing it with others, creating misinformation, misleading people on finding the resources that they need. But because, again,

(14:20):
these intermediaries are not there in the middle, they're not saying to these people, you can't do that, or you can't share that. I should also add that, yes, in some cases, there will be automated systems that may try to get in the way of those messages, but

(14:40):
those players, most times, will just find ways

Speaker 6 (14:45):
To get around them.

Speaker 1 (14:46):
When we talk about social media platforms, and I mean,
let's face it, I mean that's really kind of the
hotbed for all of this that we're talking about. It's social media, and then, you know, cable television I
think has also kind of helped to perpetuate some of
that because it's not regulated by the FCC. But when
we talk about these platforms, when you look back historically, now, Michael,

(15:09):
what do you think they did or how did they
screw up? And I'm talking about social media platforms. What
should they have done so that we would not be
in this situation today?

Speaker 6 (15:20):
That's a really good question.

Speaker 4 (15:24):
I mean, I would like to say that they should
have kept in mind what may happen when you just
throw open the doors to everyone to communicate. Like, when you hear the leaders of these platforms, you take, like, a Mark Zuckerberg at Meta, which owns Facebook and Instagram, or you take, like, Elon Musk

(15:46):
as the owner of X. You know, they like to
talk about we just wanted to connect people, right, We
wanted to allow people to share the things.

Speaker 6 (15:55):
That were important to them.

Speaker 4 (15:57):
And while that's well and good, I think you also have to keep in mind the consequences of allowing people to just put up whatever they want, whenever they want. Now, one would say maybe they should have had more people in between who could have checked the information that was going up before it got shared with other people. Now, granted,

(16:21):
if you did have that, what that would do is introduce a lot of friction into the process of creating and sharing information, which would make fewer people want to use it. And some other player probably would have come into that space and said, you know what, I'm just going to open things up completely, more so than these other platforms. So they know they are walking

(16:45):
a very fine line. But I do wish that these
technologists in particular would have had more like social scientists
as part of the process, who could have talked to
them about this is what happened, this is what could
happen when you open these things up. And I think

(17:05):
what we're seeing now, we're experiencing the consequence of not having those people involved in the development and production of these platforms. We're seeing that more and more people are susceptible to, and are being affected by, the spread

Speaker 6 (17:25):
Of mis- and disinformation.

Speaker 4 (17:27):
So while for somebody like myself it makes me a
popular person, it makes people like you come and talk
to me about this issue because this has been something I've.

Speaker 6 (17:35):
Been involved with for probably close to.

Speaker 4 (17:39):
I would say seventeen, eighteen years now, and it's something that really bothers me to no end. But at the same time, we're seeing, like, the larger populace just being affected in so many different ways by it.

Speaker 1 (17:54):
More with doctor Michael Spikes. But first, this is the KFI News special Mis/Dis: Exploring Misinformation and Disinformation. Welcome back. We're talking with Michael Spikes. He's with the Medill School at Northwestern University, has a PhD in Learning Sciences, and he's also been studying misinformation, disinformation,

(18:17):
news literacy, and media literacy for the better part of eighteen years. And before the break, you know, you were talking about the length of time you've been working on this, and it's almost two decades now that you've been studying news and media literacy. It seems probably a very pedestrian question, but how has it changed? What got you interested in it first, and how has it

(18:40):
dramatically changed over that last almost two decades.

Speaker 4 (18:44):
Well, I should say, like, I actually started off in this as what I would call a practitioner. So the first part of my career was really focused on the creation of media. Even when I was in high school, probably one of my first sort of, like, career paths was being, like, a radio journalist. I went through training programs in the city of Cleveland

(19:07):
where I grew up, where I learned about like how
to produce radio and how to go out and do
interviews and how to cut pieces together and how to
create stories for.

Speaker 6 (19:16):
This particular medium.

Speaker 4 (19:17):
And then I sort of branched out and started teaching these skills to young people. So it was in that first round that I really thought about, like, storytelling. And then I got a chance to sort of branch out and do it in video, I got a chance to do it in some writing, and I was really starting to think about how you shape stories in all of these different mediums. But I also was seeing the

(19:38):
ways that media influences people too, you know; it directs the things they talk about, it can sometimes direct their own lives in certain kinds of ways, it gets them to buy things, you know, all these kinds of things.
So then I started to get more interested in the
field of media literacy and news literacy, and that

(20:00):
work I was doing with the Center for News Literacy at Stony Brook University. At that time, and this was two thousand and ten when I first was exposed to this sort of curricula, I really started to think about and talk about the ways that one can take those practices that journalists do, and the sort of responsibilities that

(20:24):
they put upon themselves because they know that they are
arbiters of I want to just say truth, but arbiters
of the sort of messiness that goes on in current
events and information.

Speaker 6 (20:38):
Right.

Speaker 4 (20:39):
Like, I started to learn, like, journalists understand that when they publish stories, they're publishing the best version of the truth as they know it right now, right? And they understand the idea of truth, and I will also give a caveat that truth is really complicated.

Speaker 6 (20:58):
I would almost say, like, credible information. The word is just coming

Speaker 4 (21:04):
To my tongue, it is provisional, meaning that what we
understand now could change as we collect more and more
information and evidence, So it kind of helped me to
like keep that open mind. And as I started to
involve myself more and more in this work, and this

(21:26):
happening at the same time that social media was starting to be adopted, I started to find opportunities and more examples of seeing how media influences and what impacts it can have on people. So one of the primary examples, when I was just starting with Stony Brook, was the Arab Spring. Like, we saw the Arab Spring start to go on,

(21:49):
and we had lots of people who were protesting, and we saw this as this just completely new way of bringing people together for what we were thinking were, you know, really, like, high-minded ends, right. But then we saw that influence start to sort

Speaker 6 (22:06):
Of change a bit when we moved into.

Speaker 4 (22:10):
Like the era when the Boston Marathon bombing happened in twenty twelve. And at that time, we had people sharing information on these open platforms that, yes, could allow people to do really great things, to gather and fight for rights, but then we found people making, you know, unfounded assertions about what they thought may have happened, and people taking

(22:33):
that up as truth. And we even saw news outlets taking up that information as true. So we saw almost, like, a mob mentality start to be created in a situation where people were really unsure

Speaker 6 (22:50):
About what the future was.

Speaker 4 (22:51):
This was in the time before the two young men were captured and convicted. During this time, we saw, like, this void of information; people just fill it with whatever they think will sort of make sense. And that
brings us all the way up till today where you
see this be all the more amplified when people have

(23:13):
this void in information. And I would almost go back to that initial example that I brought up earlier, about this idea that the truth, or the idea of truth, is provisional, and staying open to saying, like, what I understand now could change over time.

Speaker 6 (23:30):
And that could happen when we have that void of information.

Speaker 4 (23:33):
More and more people are just running towards people who
would try to give them these very certain, discrete answers
to questions that just tend to fill these voids in
information with stuff that doesn't need any verification, doesn't need
any sort of semblance of credibility. People just take it
because it seems like it's credible, or it seems.

Speaker 6 (23:56):
Like it's.

Speaker 4 (23:59):
It's enticing, or those kinds of characteristics, rather than going towards evidence and credibility to be able to make sense out of it.

Speaker 1 (24:07):
So, as we wrap up here, two quick questions I have for you. One: given the state of society now with social media, and social media is not going away anytime soon, but with what's going on, do you think social media should be regulated? If so, how? And two: do you think media literacy, and even news literacy, should be something that's

(24:30):
taught at the high school level, like sex education is. I mean, it seems to be just as important now as ever before, as any other life science.

Speaker 4 (24:42):
Yeah, I mean, I'm definitely one of the people that
is on board with greater regulation of social media companies.
I mean, I don't know about you, but I'm tired
of seeing these leaders be brought up in front of
Congress and then just continually apologize for the things that their platforms have enabled people to do, and then say

(25:02):
we'll do better.

Speaker 6 (25:03):
We're going to do so much better.

Speaker 4 (25:05):
We're going to add all these people, blah blah blah, and we can, you know, self-regulate, but then find themselves just back in the

Speaker 6 (25:14):
Same situation a couple of months later, Like.

Speaker 4 (25:16):
I know, I'm tired of seeing that, and I wish
we had better rules around it. But I will acknowledge
that it's a super complicated thing to do, especially in
a country like the United States. But I would emphasize
that when people bring out the free speech argument, that
we do have to understand that social media platforms are
private organizations. They are not the government, and First Amendment

(25:39):
issues do not apply.

Speaker 1 (25:42):
You know, listen, Michael, I have to stop you there. We're running up on a break. Why don't we just bring you back for another segment, if you don't mind? Do you have the time? Yeah, let's do one more. So when we come back, we'll wrap up with Michael Spikes. This is the KFI News special Mis/Dis: Exploring Misinformation and Disinformation.
Welcome back. We're talking with Michael Spikes. He's with the

(26:03):
Medill School at Northwestern University, PhD in Learning Sciences, his specialty media literacy, with a very special focus on news literacy.
And we've been talking about a number of different issues,
how media literacy can really be the foundation for people
being able to almost survive because you talked about in

(26:25):
an earlier segment a lot of this disinformation going on when there are natural disasters happening, and how that could drastically shift people's views on where to get supplies, whether they're safe, and whether they feel like they're safe or unsafe. In some of these cases, this disinformation can get deadly.
And before the break, I had asked you two questions

(26:45):
as we were trying to wrap up, but this is just too much to talk about in the limited time we have. One of them was: should we regulate social media platforms? And you were in the middle of that conversation; I'll let you finish that. Then, finally, talking about media literacy in schools being as important as sex education.

Speaker 4 (27:06):
Yeah, so trying to wrap all those things together. One
of the things that I was talking about was that
the platforms themselves, I think, have really proven themselves to
be not very responsible stewards of the information that's shared
on them. So I think that, you know, we need
regulation and we need mindful regulation to do that. On
the media literacy front, it is definitely something that we

(27:29):
need to teach more to a larger, I would say, group of people, not even just young people. So here in the state of Illinois, yes, we became the first state, in twenty twenty-one, to put in place a requirement for a unit of media literacy education. That's been put into public policy, and it's super broad,

(27:51):
so teachers can approach it in lots of different ways. And I have been involved with a colleague of mine, whose name is Yonty Friesem, in trying to talk about implementation of that in multiple subject areas, not just during, like, Media Literacy Week or only in an English class; we want to see it, you know, throughout the curriculum.

Speaker 6 (28:09):
But some of the.

Speaker 4 (28:09):
Things that, you know, we'll teach in, like, media literacy classes, to bring it back to this issue of, like, the susceptibility that people have, especially when they're in situations in which they are fearful, they are uncertain about the future, or they have doubts about the people around them or the situations around them, are to make

(28:33):
them more aware of the fact that they've become more susceptible to pieces of mis- and disinformation at that time. Because, you know, the faculties that we usually use to put, like, logic around situations that are happening to us, to come to conclusions or to make decisions,

(28:55):
Lots of those things sort.

Speaker 6 (28:56):
Of break down.

Speaker 4 (28:57):
So being aware of that can help you, in those moments, to say: I need to be super aware and be mindful of the sources that I'm going to, to get information that will help me to survive and to thrive. And in those cases, we make distinctions

(29:17):
on the different sources or outlets of information. So, you know, for those folks, I would definitely say they should be looking at journalists as one of their main sources of information, ones that they know check their information, bring them evidence for that information, and bring them credible information with the intent to inform.

Speaker 1 (29:37):
You know, it's kind of sad that trust in media is at an all-time low. I can't ever remember the general sentiment against media being as low as it is now. I mean, I think we're lower than Congress. And it's sad, because when I first got into this business some forty-something years ago, you know,

(29:58):
the journalist was sort of held up on a pedestal, and it's like any word I spoke, everyone assumed it was

Speaker 5 (30:06):
The God's truth.

Speaker 1 (30:08):
I liked how you defined truth, because that's really where we're at now. Truth is sort of, like you say, provisional. It really is provisional because things change, things are fluid. You know, you've given a little bit there, but can you be more specific about how people can become better consumers of news?

Speaker 4 (30:32):
Well, I mean some of the things I would say
is just that they have to be aware of.

Speaker 6 (30:40):
I talk about four.

Speaker 4 (30:41):
Major challenges that are in front of news consumers. Like, whenever I start a lecture on news literacy or I do, like, a workshop, I talk about these four challenges. One being speed versus accuracy: just because you get it really, really quickly, or just because it's at the top, doesn't mean it's the most accurate. We need to look for accurate things and know it takes time

(31:01):
to get accurate.

Speaker 1 (31:02):
Information. Especially, and I'm sorry to interrupt, but especially in active shooters, especially in mass casualty events, things like that, there is so much bad info out there. As someone who goes through those, I didn't mean to interrupt you, but I wanted to make that example.

Speaker 4 (31:16):
Yeah. So, yeah, you can have a lot of speculation in those situations. Two: there is just way too much information, and because we have access to so much information, again, that's one of the things that makes our own faculties to make sense out of the information sort of break down, because when you're overwhelmed, it's very hard for you to make decisions. Third is that we

(31:37):
have to be aware of that disintermediation of the media that.

Speaker 6 (31:40):
I talked about earlier.

Speaker 4 (31:42):
And then fourth, we need to be aware of and counter
our own personal biases and dispositions and what they can
do to our own sort of taking up of that information.
Because one of the things that I say to people is,
when we come toward new information, we bring all our
baggage with us. We bring our personal beliefs, our identities,

(32:03):
all of those things, and we have to keep those
things in mind when we're getting information. And we have
to stay open to hearing multiple points of view, and to
countering ourselves when we get into situations where we're hearing
opinions and shaping of news that just aligns with our
own points of view. When that happens,
we have to say to ourselves, something must be missing.

Speaker 1 (32:27):
Excellent, Michael Spikes, this has been very eye opening. I
really appreciate your insights on all of this. I can't
believe you've spent the better part of eighteen years studying this.
It would drive me insane; just in my world, I mean,
just everyday news, I'm overwhelmed. And for you to be
able to turn it into a career, I applaud you.
So again, thank you very much. You're at the Medill

(32:49):
School at Northwestern University. How might someone be able to
get more information about you and what your research has shown?

Speaker 4 (32:56):
Yeah, they can look me up either on the Medill
page or through the initiatives I have going on. One of
the other ones is the Illinois Media Literacy Coalition, which
they can find out more at ilmlced dot org, or
you can look me up personally on my own personal
website at Michael Spikes dot com.

Speaker 1 (33:15):
Excellent, Michael Spikes, you have a wonderful day, and thanks
again for joining us.

Speaker 6 (33:18):
Thank you.

Speaker 1 (33:18):
Coming up, a media company has created a unit dedicated
to vetting and verifying information before it goes on the
air or online. But first, this is the KFI News
special Mis/Dis: Exploring Misinformation and Disinformation. Fake news, misinformation, disinformation,
It's all a problem for journalism. Trust in the media,

(33:39):
is at an all time low, and it's become clear
that reporters, editors, and anchors have got to figure out
a way to gain back that trust. Tegna is a multimedia
broadcast and digital company with stations throughout the United States.
It saw a need to become more transparent, so it
created Verify, one of the very first media companies to
bring together journalists and resources to confirm content. We welcome Jonathan

(34:03):
Forsythe, the managing editor.

Speaker 7 (34:05):
That's right, Steve. We're a team that's out there. Our
mission is to help the audience understand what's real and
what's fake, what's true and what's false. And a lot
of what we focus on is answering the questions that
come in from our audience about everything. So that's one
of the ways that we decide on what stories to cover.

Speaker 1 (34:25):
So let's go back to the beginning. Let's learn a
little bit about you, because you are a journalist. You
spent a lot of years at the Washington Post. You're
an award winning journalist, So talk a little bit about
your career, and then lead us up to how Verify
came to be.

Speaker 7 (34:39):
I spent my first fourteen years at the Washington Post.
As you mentioned, I started out on the internet side,
the website back in the early days of the Internet.
I'm going to date myself a little bit, but back
when the Washington Post actually had separate newsrooms, the web
operation and the print operation were separate, so I was
there long enough to be there when we merged in twenty ten.

(35:03):
I ended up managing the video department at The Washington
Post and really learned a lot about news and coverage
and just the talent in that newsroom really helped shape
me as a journalist. From there, I moved on to
a corporate opportunity at McClatchy, another newspaper company that owns
about thirty local newspapers around the country, including the

(35:26):
Miami Herald, the Sacramento Bee, the Kansas City Star, a
couple of the big ones. And from there I helped launch a
national video operation and really helped the stations, sorry,
not the stations, the newspapers, to focus more on their
audio and video production and their capacity in the digital
audio and video production and their capacity in the digital
space locally. And I managed a team across that company

(35:50):
for about seven years, and then shortly after the pandemic
and right after the January sixth insurrection, Tegna was launching
this team, sort of nationalizing this brand that was born
back in twenty sixteen, actually before fake news was

(36:12):
a common term anyway, and I signed on to build
that team with Tegna. Right now we work with the
sixty-three stations that are on the broadcast side with Tegna
across the country, similar to my corporate role
at McClatchy. And we not only produce news

(36:33):
packages based on the Verify format that the stations can
use in their newscasts every day, but we also have
a website. We also have a newsletter. We're also across
all the social media channels because that's where a lot
of the disinformation really lives these days and is born
and shared out of context. And so we're focused on

(36:55):
that space as much as we are the broadcast space.

Speaker 1 (36:58):
And why did Tegna see a need for this? What
was the impetus for them saying Okay, we've got to
fund this, we've got to pull people in and we
have actually have to create a unit that did not exist.

Speaker 7 (37:10):
Yeah, great question, Steve. So Tegna does
this innovation summit every year, and they bring people from
every station together to brainstorm creatively. And the challenge back
in twenty sixteen, actually right ahead of the twenty
sixteen election, was how do we build trust with our
audience and what can we do? Because, as you know,

(37:32):
trust has been eroding across the board, not just in
broadcast but across you know, traditional news media, and there's
a lot of reasons for that. So the idea for Verify was born of
a different way to present a story that would really
focus on keeping it simple: a true-false, yes-no,
is this real or is this fake sort of a concept,

(37:55):
and then we'd also take the sources, which often are
buried in a story if you're reading something. They might
come up while you're reading a story as far as
somebody being quoted, but we'd put that at the top,
and we'd list the sources first, so that anybody consuming
the story or you know, seeing the question or the
claim that was presented first would know where the information

(38:19):
that we were reporting was coming from. And obviously they'd
be reputable sources, not just some random blog,
that sort of thing. And that really worked
in the market research that Tegna did; there was a
lot of positive reinforcement for those early pilots,
so they decided to build on that and then eventually

(38:41):
grew into this national team that I'm now leading.

Speaker 1 (38:44):
We're talking with Jonathan Forsythe. He's the managing editor of Verify.
It is sort of a filter: it takes information,
filters it, checks it, vets it, and then lets people
know whether or not they're getting accurate information. Now you, personally, Jonathan,
as a reporter, you spent fourteen years at the Washington Post,

(39:04):
in your career, did you ever see a
shift in the trust with the media? Did you see
it personally, or was it only after big events started happening?
Did you start to see an erosion of that trust
somewhere in your career?

Speaker 7 (39:17):
That's a great question. I wouldn't have a specific date
to point to, but there are good reasons
for people to want to question authority, right? You want
to hold the powerful accountable. That's why a lot of
us are in this business, and a place like the
Washington Post obviously has a rich history of that. And

(39:37):
so while I think there's inherently bias across any
media organization, and anybody who says they're not biased is fooling themselves,
it's so important to be objective, and it's harder to
be objective these days in this polarizing climate that we're in.
But I think the people who understand and who

(40:00):
actually consume and realize that you're doing stories from both
sides and holding all, you know, sorts of leaders accountable
are the ones that really appreciate it. That's how you build
that trust. But no, Steve, I don't have a
specific date. I would just say that I've seen it,
I've certainly seen it start to erode over my career,
but I don't have a specific date to point to.

Speaker 1 (40:22):
Well, okay, with that in mind, you and your
colleagues in this industry, and I'm doing
this to lay some framework as to why we've
gotten to the place now where we have to create
units to verify stuff.

Speaker 5 (40:35):
So why do you think that.

Speaker 1 (40:40):
What was the pivotal time for you, when you said journalism's
in trouble or something's got to give, even
before Verify came to be? Was there a moment
for you, not a specific date but an epiphany, if
you will, where you thought to yourself that journalism's in
trouble, or it's in peril?

Speaker 6 (40:57):
Man.

Speaker 7 (40:57):
You know, I hope journalism is not in trouble, you.

Speaker 1 (41:00):
Know what, given the circumstances that's been going on to
your point early. In fact, we're up against a break.
So when we come back, I want to pick up
that conversation.

Speaker 5 (41:08):
But I really do.

Speaker 1 (41:08):
Want to get your thoughts on if you think that
journalism's in a good spot right now, is it bad?
And if so, how does it get better? We'll talk
more about that, but first let's take a break. This
is the KFI News special Mis/Dis: Exploring Misinformation and Disinformation.
Welcome back. We're talking with Jonathan Forsythe. He's the managing
editor of Verify. And before the break, Jonathan, I was

(41:30):
putting you on the spot a little bit. But it's like,
you know, how is journalism perceived now
that the trust factor is at an all-time low? In fact,
I think our rating among the general population is under Congress.
I mean, we're below Congress right now. But do you
remember, and I kind of tried to pin you
down in the last break, on not so much of

(41:52):
a date, but something that might have occurred that made you tell yourself, huh,
there's something going on here with journalism.

Speaker 5 (41:58):
We've got to fix it. Or am I just off
base here?

Speaker 7 (42:02):
No, I mean, it's not any breaking news that
I'm revealing here that the business model of journalism has been
in peril for a long time, right? And you know,
everybody's used to getting everything free. Obviously, print media is dying,
even though there are still some vestiges out there,
and broadcast is not far behind it as well. Right,

(42:23):
there's this transition to figure out how to
reach this audience. It's one of the things we're doing
at Verify, as I mentioned. When we talk
about journalism being in peril, you know, I'd like to
believe that there are a lot of smart folks out there
who are focused on the spaces where more and more people,
especially younger people, spend their time, and presenting the journalism

(42:46):
there rather than on sort of a homepage or the traditional,
you know, lead-in in your A block on the
six PM news, which are decreasing in viewership and audience.
As you well know. I'll point back to, and it's probably not
an original take here, what prompted Tegna to
decide to build the national team that I'm now leading,

(43:08):
which is really the insurrection on January sixth. But
even predating that, if you think back to
Pizzagate, you know, at Comet Pizza in DC, and
what happened with that misinformation that was sort of
part of the deep state conspiracy about child trafficking and whatnot.

(43:28):
And that's when I think a lot of people started
to realize there's real harm and danger here,
and we need to have dedicated resources to cut
through this stuff. And since then, technology has improved
so much that you have generative AI, which is just

(43:48):
complicating it even more and making it all the
more hard to understand what's real and what's fake. Just
look at the recent hurricane coverage and a lot of
what's been circulating online about that. So there are lots of
reasons to have a unit that is dedicated to
cutting through that. But at the same time, I would

(44:09):
point back to the insurrection and to Comet Pizza.

Speaker 1 (44:11):
So interestingly, you know, being at the Washington Post, you're right
there in the hornet's nest, or, you know, that's about
the nicest way I can put it.

Speaker 5 (44:18):
In DC. Do you feel like you are
sort of overexposed

Speaker 1 (44:24):
to all this potential misinformation and disinformation, just because of
the geography, just because of where you're at?

Speaker 7 (44:31):
You know, it's a really good point, Steve. I'm
right there in the swamp, right at the Post.
But sometimes we do have to step back to
realize: do people really wonder
about this thing, or is this more us, because we're
sort of living in this space? What's really

(44:54):
going on? It's sort of like polling, right, with
electors: what do people really
care about, versus what journalists and sort of the media
elite, quote unquote, think that
they care about? And that's why we rely
so much on the questions that we get in from
our text line, from our emails, from our form on

(45:14):
our website, and once we see several of the same
type questions, we're like, Okay, we're tapping into something here.
There's a new attack ad that's circulating, or one of
the candidates said this on the trail the other day,
and it's starting to pick up some social trends.
So that's what helps us determine what to go after,
in addition to, you know, using our editorial judgment,

(45:36):
and we're working with our stations as well on requests
that they have. But we do have to sort of remind
ourselves that we're kind of right in the mix,
and not as many people care as much about the
process, the process of voting, for example, Steve. So
many people that are going to vote are just starting
many people that are going to vote are just starting

(45:58):
to pay attention to politics now, right, and that's one
of the things we often don't think about in this
you know, two-year, four-year coverage cycle. And so
thinking about ways that we can be
transparent and help them understand the voting process, which of
course has come under some scrutiny and suspicion of late

(46:18):
in the last few cycles, is part of what we're
trying to incorporate into our coverage as well.

Speaker 1 (46:26):
So now you've got this idea at Tegna, and you
talked about this innovation summit. It gave you the impetus
to put this together, you got the funding to put
this together. What's the first step? I mean, how on
earth do you create something like this out of nothing?

Speaker 5 (46:43):
And where do you start?

Speaker 1 (46:44):
How do you build an infrastructure for something like this?

Speaker 7 (46:46):
Yeah, so going back to that twenty
sixteen idea, what happened out of that was a pilot.
A pilot was produced of a segment that incorporated
some strong graphics, you know, an X for something that was false,
a green check for something that's true. That was really well
received by the market research that was done.

(47:07):
There have since been additional tests and market
research conducted about people's appetite for fact checking and truth,
and this sort of sources-upfront approach.

Speaker 1 (47:21):
I don't mean to interrupt there, but what did the research tell
you about that? What is people's appetite for that?

Speaker 7 (47:29):
So the pilot was originally one of
the highest rated pilots ever rolled out by Tegna,
and it has only increased in the years since. There
is so much confusion out there and so many people
that don't know what's real and what's fake. There are memes
that circulate, and some of it sounds like it might

(47:50):
be true, and some of it sometimes is true. But
a lot of it isn't, and so there
are a lot of people who just want to
cut through the noise: give me the facts. And that's
what we're here for. What came
of the response to that original pilot was a small
national team that was basically four or five people, a

(48:12):
couple of researchers, an on camera reporter, and an editor
and they would do, you know, a couple of stories
a week and share them with stations. And it was
sort of a small national team. They didn't have a
big digital footprint. They didn't have a YouTube channel, they
didn't have a website that was dedicated to that. And

(48:34):
that's where, you know, after a couple of years you get
to the insurrection, and it's like, okay, there's a
need for more of this type of content, so
let's create a team that actually is dedicated to that,
doing a lot more of this and increasing the volume
of stories across the digital platforms, in addition to broadcast.

Speaker 1 (48:50):
More with Verify's Jonathan Forsythe. But first, this is the
KFI News special Mis/Dis: Exploring Misinformation and Disinformation.

Speaker 5 (48:58):
Welcome back. We're talking with.

Speaker 1 (48:59):
Jonathan Forsythe. He's the managing editor of Verify. It is
both a website, verifythis dot com, correct, and then
you can also text. You can register through text, and
the information is on the website; I did it as well.
What's really cool about it? And you can also subscribe
to a newsletter. And then what I really enjoy, Jonathan,

(49:21):
about what the model is here is that you're being proactive.
You're sending stuff out already based on what I assume
are the top questions of the day or the top
stories of the day, and you're already verifying this. So
when I get up in the morning, like with
the hurricane that happened a while back, I'm
looking at it like, wow, you've already gone through all these videos

(49:42):
and told me what's real, what's fake, and whatnot. So
I kind of like that proactive approach where I don't
have to go out and search for it, You're already
sending it to me. I think that's a great model.

Speaker 7 (49:51):
I appreciate that. Part of our audience strategy is
to be across a lot of different platforms, because a
lot of different people have different media consumption habits. You
can find us on the local TV station if you're
in a Tegna market. You could also find us in
a newsletter; some people prefer to read sort of a
morning digest kind of a thing. And honestly, those folks

(50:12):
that are signed up as subscribers to our text feature
and newsletter are some of our most engaged audience. They
send us a lot more questions. We prioritize those. We
make a point to respond to as many as we can,
even when we don't have the answer, and that has
helped really grow the sort of loyalty and transparency with

(50:33):
our audience.

Speaker 1 (50:34):
I noticed that you also partner with Gigafact. Do you
want to talk about that a little bit?

Speaker 7 (50:38):
Sure. Gigafact is this great network. We have very
similar missions: to increase the volume of fact-check
content out there. And they reached out to us
about a year ago. I ran into their editorial lead,
Robin, at a conference, and we talked about how they're

(50:59):
building up a network of news organizations who are
starting to create these fact briefs, which are basically brief
fact checks, very similar to what we do at Verify,
you know, labeling something as yes, no, or true, false.
And we saw a lot of symmetry there, so we
decided to create a partnership where we're able to leverage
each other's work.

Speaker 1 (51:19):
So I'm looking at your website now, verifythis dot com,
and I'm seeing that you've got roughly, if I got
this right, I counted really quickly, about eighteen people on
staff, counting you.

Speaker 7 (51:28):
Right, that's about right?

Speaker 1 (51:29):
Yeah. So how are these folks selected? And the reason
I ask is because you made mention
earlier that there's no such thing as a one hundred
percent objective individual.

Speaker 5 (51:40):
No, there's no journalist out there that's one hundred percent objective.

Speaker 1 (51:44):
And even with AI and robots and all that other stuff,
you're never going to find one hundred percent objectivity. How
did you bring the people on board here? And, you
know, how do you vet somebody
to bring them in, to make sure that they're looking
at this thing as objectively as possible?

Speaker 7 (52:02):
Yeah, it's a great question, Steve. So we're representative. I'll
start first with our geographic diversity: we're across
every time zone in the US. The team
is largely hybrid; while some of us are based in the DC area,
we are not all there, and that was important, to
make sure that geographically we're just as diverse as a

(52:24):
team as we are in other ways. The team is
made up of reporters, video editors, researchers,
motion graphics artists, and we also have a
couple of on-air folks as well, of course. So
there's diversity in skill sets, and there's diversity in geography.

(52:49):
And one of my big roles as the managing editor
is story selection, to make sure that we're presenting
balanced coverage, and that's important at the end of the day.
And we're proud to be on some of
those outlets that actually rate fact checking units on whether
or not they

(53:11):
lean left or lean right. We're proudly in the middle
on AllSides, as objective and central.
And there's another one, I forget the name
of it off the top of my head, it
might be media bias dot org, but we're
proudly rated that way.

Speaker 1 (53:27):
There's AllSides, and in media, yeah.

Speaker 7 (53:30):
Right, yes. And so we're proud of
that, and that speaks to the balance of our
coverage, not having an agenda, and sticking
to the facts in the way that we report. The
format helps lend itself to that: there's no opinion, there's
no, you know, columnist who's going to write something that's
going to confuse somebody into wondering whether that's an actual news
story from Verify versus an opinion. So that is our approach,
and when we were building the team, we had a
mix of folks from outside news organizations and also folks
from inside Tegna who are familiar with the Verify brand
and sort of the corporate communication structure. So we wanted
to have a diversity of viewpoints and perspectives, where people

(54:16):
live, all of the above. As we put this team together,
it was important to be representative of the country
as a whole.

Speaker 1 (54:22):
So on a daily basis, when you're looking at stuff
to verify, how do you decide what items to go
ahead and put your effort into?

Speaker 7 (54:31):
Yeah, so a few things. We use our editorial judgment;
we want to be as timely and focused on current
events as possible, and so we're always looking at what
the news cycle is bringing as it relates to confusion
and misinformation questions about current events. Secondly, and just as
important, is the questions that we receive from our audience

(54:54):
through the text line like you mentioned, through email, through
the claim submission form on our website. A lot of
times we will realize that there's a trend when
we get the same question over and over and over again.
And that's when we discuss it in our editorial meetings: okay,
this is a story we've got to cover, there's clearly
a lot of interest, let's figure out a way in there.

(55:15):
So those are the main ones. And then of course we also
work with our stations, and stations sometimes have local requests.
There are some local misinformation or confusion or disinformation campaigns running,
and we'll dedicate resources to help with that as well.
So it kind of runs the gamut. We're using our
editorial judgment and current events, but also very much relying
on what kind of questions we're receiving from our audience.

Speaker 1 (55:36):
And, very much like a medical examiner, have you ever
hit a situation where you're like, this is undetermined?

Speaker 5 (55:42):
We've just hit a brick wall.

Speaker 1 (55:43):
You know, we've looked everywhere, we've checked everywhere, and we're sorry,
but we just can't tell you if this is accurate
or not.

Speaker 7 (55:50):
We do, yeah, we do. I'll give you an example
of one where we didn't do the story because we
couldn't get multiple sources. And this was, gosh, I'm gonna
forget exactly, it was something that was written on a
Coke can.

Speaker 1 (56:08):
You know what I'm gonna have. I'm gonna pause there
and let you think about that because we've got to
take a quick break and when we come back, you'll
have it all ready to go.

Speaker 5 (56:15):
Right.

Speaker 1 (56:16):
Okay, here we go. So we'll be back with Jonathan
Forsythe right after this. This is the KFI
News special Mis/Dis: Exploring Misinformation and Disinformation. Welcome back.
We're talking with Jonathan Forsythe. He's the managing editor of Verify.
Before the break, Jonathan, I was asking you if you
had an example of a situation where you know, you

(56:38):
just kind of hit an impasse or something like that,
and you said you were trying to remember it so
if you had time to remember it.

Speaker 7 (56:44):
Yeah. So this goes back a year or two, and
I remember there were pictures of a Coke can circulating online,
and this was part of the sort of
backlash to DEI initiatives and anti-wokeness across the country.
And the Coke can, you know, sometimes Coke cans had

(57:07):
slogans or phrases that were unique to a season or whatever.
In this case, I think it was something to the
effect of, it's okay to be white.
And there was a big backlash. And of course we reached
out to Coca-Cola, and they denied that these cans
were ever created; they may have been photoshopped, something to that effect.
We suspected it was probably something that was artificially created

(57:32):
and circulated to actually incite, you know,
division, that sort of thing. But we couldn't find
a second source other than Coke, and we'd be relying on, you know,
the big mega organization, who would look good if they
were to say, yeah, this is fake. We didn't have
another source that could prove that it wasn't real. So

(57:54):
we just didn't do the story. And those
decisions and conversations happen every day in our
editorial calls. If we can find
the evidence to prove that something is true or false,
then yeah, we're going to do the story, but if not,
then we can't do it. And that was
one of the cases where we couldn't.

Speaker 1 (58:16):
And do you publicize that? This was the
story presented, but we just couldn't, it was undetermined.
Are you transparent about that, or do you just leave it out there?

Speaker 7 (58:23):
So we are with the people who are asking questions
about it: like, hey, you know, we're unable to verify
that at this time. We do get a lot
of questions recently about the Bob Woodward book that came
out with some pretty big accusations about, you know,
former President Trump and his interactions with Putin. And
we get questions, you know, when other news outlets

(58:45):
report something: hey, is this true? And, you know, we'll
respond like, look, this is based on exclusive reporting by
this outlet, and we can't verify the truthfulness
of the story, that kind of thing. So we
make a point to respond to our audience
as often as possible.

Speaker 1 (59:02):
In those cases, what was the craziest request you've had?
I mean, what was the one that you looked
at, and you look at each other and either laugh
or cry, like, really? Someone thought this
is something that's legitimate?

Speaker 7 (59:16):
Well, Steve, you know, what comes to mind is the
recent questions about the government controlling the weather. With
these hurricanes, and it being right on the eve of
an election, there are a lot of conspiracy theories out there,
and I would put that right up there near the top.
We have a lot of weather experts and meteorologists

(59:40):
at our disposal as sources, and it's an easy one
to tamp down, but you'd be surprised at how many
people actually believe some of the stuff that's being said
out there. And that's one of them.

Speaker 1 (59:50):
So walk us through how the process works, whether
it's online at verifythis dot com or someone has registered
through text. If, say, I submit something to you,
what happens next?

Speaker 7 (01:00:03):
Well, we have a dedicated audience team
that will read every email and every text and share
the ones that start to create a trend, right, when
we see multiple of them. And then we will bring
those to one of our two daily editorial calls and
we'll assign them accordingly. We have a staff of digital journalists.

(01:00:26):
We usually assign the research and reporting first, and then
we decide which of those makes sense to produce for broadcast,
and then we'll do a broadcast version that we'll share
with our stations.

Speaker 1 (01:00:38):
Now, as we wrap up here, I've asked you a
lot of questions about the future of journalism, what the
status of journalism is. And you know, our popularity and
our trust is at an all time low. But organizations
like yours, I believe, are really trying to get this

(01:00:58):
back up, saying, listen, you know, we are doing
the work.

Speaker 5 (01:01:01):
We are here, we are holding.

Speaker 1 (01:01:03):
The powerful accountable, we are preserving democracy, We're doing our
very best.

Speaker 5 (01:01:08):
But what does it.

Speaker 1 (01:01:09):
Really say about the state of affairs when you have
to have units like this now? Where before a journalist
was ultimately trusted with all the information, much like a
police officer, that there was never a question about trust.
Whatever words came out of my mouth or whatever words
I typed on page, you trusted it beyond a doubt.
And now we're at the point where no one trusts

(01:01:29):
anybody. Is that more of a journalism
thing, or is that more of a social construct?

Speaker 7 (01:01:36):
Well, I think it's both. I don't think there's a
simple answer, Steve, and I wish
I had one. If I did, I'd probably be in a
different position than I am now. I do think
it's a positive sign that more news organizations are creating
units to cut through this and dedicate resources to doing
it in a clear way that presents the facts first.

(01:01:58):
I also believe there's a segment of the audience that
doesn't care about that, and the reality is, you know,
the number who are going to read the actual fact check or the
verified version of the story, versus see the meme that
went by that had ninety-five percent false information, is
a lot less, right? It's hard to reach the

(01:02:18):
same amount of people that these viral, you know, disinformation
and misinformation campaigns create. And you add the new
generative AI tools that are out there, and there are
as many people just doing it for kicks, to see, hey,
look at how many people are going to be fooled
by this, right? That just makes it all the
(01:02:40):
more confusing. There's plenty of bad actors right who are
sowing disinformation. It's been well reported there are foreign countries
that are doing that ahead of the election here in
the US. But then there's just as many people just
doing it for giggles, right, just for hey, watch this picture.
I'm going to create a Biden and Trump sitting on
a bench together looking like they're eating an ice cream

(01:03:01):
cone. And then there are people who actually call in
to us and say, hey, is this real? And it's like, so,
there's that. And then there's satire, which is another one
you asked me about some of the crazier things that
we've got. There's plenty of satire articles that make the
rounds that people believe. Yeah, and it's sad
that we have to weigh in on a lot of those.

(01:03:23):
But I think if we're not, then it's allowed
to just run rampant and no one's going to trust anybody,
and it's a challenge.

Speaker 1 (01:03:32):
Well, I was going to say that I think it's
a combination. I think technology is a big, big part
of the blame here, or should shoulder a lot of
the blame. But then, you know, there are people taking
advantage of the vulnerable or the naive. I think it
really just boils down to human behavior.

Speaker 5 (01:03:50):
I think it really is.

Speaker 1 (01:03:51):
I don't think it's really a journalism issue. I
really do think it's a technology issue combined with people
who, like you said, set out to cause chaos or
agitate or whatever. But I think technology is really at
the root of all this evil, to be honest. So
that's the way I look at it. But you're using
technology to counter all that, and I

(01:04:12):
applaud you for it. So listen that that does it
for us here, Jonathan, thank you so much. It's Jonathan
Forsythe with Verify. Thank you so much for your time,
and I think this is absolutely fascinating.

Speaker 7 (01:04:22):
I appreciate the opportunity.

Speaker 1 (01:04:23):
Steve, take care. Thank you for listening to Part One
of Mis/Dis: Exploring Misinformation and Disinformation. Mis/Dis: Exploring
Misinformation and Disinformation is a production of the KFI News
Department for iHeartMedia.

Speaker 5 (01:04:38):
Los Angeles.

Speaker 1 (01:04:39):
The show is produced by Steve Gregory and Jacob Gonzalez.