Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Thanks for downloading Making a Killing. I'm Bethany McLean. We've
had some amazingly fun conversations so far in the series.
My favorite, I think, is my old friend Alex Gibney
talking about the line between a visionary and a fraudster.
It's a theme that we returned to again and again.
Back in nineteen ninety five, I got my start in
(00:23):
journalism when I joined Fortune Magazine as a fact checker.
Well, we had more august titles. They called us reporters, but we were fact checkers. We were responsible not just for the accuracy of individual facts (is the car blue? Is it pale blue or dark blue?) but for the defensibility of the point of view that the story put forth.
I'll always remember being given my first story to check.
(00:45):
It was something about 401(k) plans, and it
sounded very convincing, but it was all wrong. That was
the first time I realized that authoritative, seeming words in
black and white can be extremely misleading, if not outright false.
I know that sounds hopelessly naive in an era where, as Bloomberg just reported, fake news is such a threat
(01:05):
to US security that the Defense Department is launching a
project to repel quote large scale automated disinformation attacks. The
Defense Advanced Research Projects Agency, or DARPA, wants custom software
that can unearth fake stories, photos, video, and audio clips,
maybe even eventually detect malicious intent and prevent fake news
(01:26):
from going viral. Of course, the big social media companies
are also trying. I think. Sort of. In late twenty sixteen,
after the deluge of criticism about online disinformation during the
presidential election, Facebook announced its third party fact checking project.
Independent organizations would debunk false news stories, and Facebook would
(01:47):
make the findings obvious to users, even downranking the relevant post in its newsfeed. Google also has a program that's
supposed to help news organizations tag stories that debunk misinformation
so that Google News can more easily feature the correct information.
But is it a losing battle? We're not just fighting plain old mistakes anymore. We're fighting bias, infiltration by foreign
(02:09):
powers and maybe even our own government. President Trump's political
allies are trying to raise at least two million dollars
to investigate reporters and editors at The New York Times,
the Washington Post, and other outlets, according to Axios, in
order to make allegations of bias by social media platforms
a core part of their twenty twenty strategy. The consequences
(02:31):
of this are huge. By the time twenty twenty is over,
trust in all sources of information will be low and
perhaps unrecoverable, wrote Axios's Mike Allen, who also said this: a nation without shared truth will be hard to impossible
to govern. So I'm excited to talk to Kyle Pope
about this. Kyle is a longtime journalist who's been an
(02:52):
editor at Condé Nast and the Wall Street Journal, among
other places, and he's currently the editor in chief and
publisher of the Columbia Journalism Review, which says its mission
is to be the intellectual leader in the rapidly changing
world of journalism. CJR just published a piece about how
Facebook's fact checking program is, in fact, falling short, and
of course CJR is also avidly covering this larger issue
(03:15):
of fake news. So, Kyle, you've been a journalist for
a really long time. So take me back to when
you got started in your days as an editor in
the old world of print and what the fact checking
process was all about. I started my career in local
and regional newspapers and there wasn't really a fact checking
function then, even though it was all print. My first
real experience with fact checking came at the Wall Street Journal,
(03:39):
where I worked for ten years. And when I started there,
it was, like, I think nineteen ninety two. The first thing they had, which is instructive because it doesn't exist pretty much anywhere now, was what they called a spot news desk, and they put you through a boot camp, so you had to work there. Every new reporter, no matter what you were hired
(04:00):
to do (I was hired in Dallas), had to come to New York and spend a week going through this boot camp where they would
teach you the basics about how to write an earnings story, how to read an SEC document,
what databases you could access to find people, and it
was a real tutorial. I remember distinctly. The woman who
(04:20):
ran it was dead serious and really enmeshed in this, and every time you would do an earnings story as a young reporter, she would
look it over and she would come back to you
and say, Okay, well you did this wrong. There's a
difference between operating income and net income that you don't understand.
You need to figure that out. And anyway, there was
a whole process. I don't remember, frankly, whether there was
(04:42):
a separate fact checking operation for the whole newspaper. The Journal, though, was extremely heavily edited. I mean, there were two or three people, at least two who would read over every
story and really comb through it. I mean, I think
that was the case for a lot of newspapers. I
mean the fact checking function was most robust at magazines. Yes,
(05:02):
not so much in newspapers. And then things changed as we moved away from print newspapers. You know, in hindsight, we didn't realize it at the time, but we had the luxury of time; we had forever, right, to do these stories. And I'd get stories a week, a couple of weeks before they were due to run, and then they'd be turned over to me as the fact checker. Yeah,
but the magazines had this like crazy robust fact checking
operation, and all of that has sort of gone away. I know.
(05:25):
It's actually astounding, because there was a woman named Roz Berlin who was the chief of reporters at Fortune, and when I got there, she taught me how you checked facts. You know, if something had already been reported, it had to have been reported in three places. If it hadn't been reported in three places, you needed to call. You had to use a red pen to check things and a blue pen to do other things. It was this whole elaborate procedure around fact checking.
(05:47):
You know what's sad about that? And when you say that,
I mean, I imagine the response we would get from
a lot of quarters is how over the top that was,
and how like unnecessary in a way? And did it
really make a difference? And I think you know you
talked in your intro about trust and media. I think
it did make a difference. I think readers, I mean, the most damaging thing you can do
(06:09):
in a story is get a simple fact that the reader knows wrong. So if you say Smith Street runs between eighth and ninth,
and they know that it's between ninth and tenth, it
discredits the entire piece absolutely, even though it's a little
tiny thing that may not even be that relevant to
(06:29):
the argument that you're making. And readers, I think are
seeing more and more of those kind of mistakes because
of what's happening in the journalism business. So do you
think before we even get to the social media sites
like Facebook, do you think we're sowing the seeds of
our own demise in some ways? Yeah? I mean, we've gotten incredibly sloppy. And can we be anything
(06:50):
but sloppy given the deadline pressures? We can, I think. I think so too. I recently had a
job opening for an editor, and because of the market, there are a lot of people looking. I can't tell you how many editors I talked to whose job it was to read twenty or thirty pieces a day. Wow.
(07:11):
These are content farms, right, and they're
just these mills. I mean a couple of them are
names that you would recognize, and I would feel, really, what a terrible thing for a really talented twenty-something-year-old journalist to be doing, spending their day trying to hash through twenty or thirty posts. That is a business model that we
set up to get as many clicks as we can
(07:33):
a lot of times to feed the private equity owners
or whoever it was that are owning these news outlets,
and our job is to sort of fight back and say, no, no, no, we're doing journalism here. We can't
do twenty or thirty posts a day and do real journalism.
You know, if that doesn't work, we've got to do
something else. I was actually thinking, when you were talking, about how much the world has changed in whatever
(07:53):
it is, fifteen to twenty years, from the days where
you'd have three editors on a story, on one story
that might take them two weeks for all of them
to look through it and check everything. To one editor
being responsible for twenty to thirty pieces a day. That's
just a shocking transformation. But did we have a
choice given the existential threat to journalism? Is there a
(08:15):
turning point or was there a turning point? Well, our
decision on this didn't turn out so well. Right. We
made a decision to go this way to try to
sort of salvage the business, and it didn't work. Right.
This all reminds me of my ill fated and painful
year and a half I spent working for Jared Kushner. Yeah, who owned the New York Observer. He had bought
(08:36):
it right, and I came in as his first editor,
and he was one of these people: he's not a journalist, and he didn't care about journalism, and his whole thing was that his role model was Business Insider. Yep.
And he would say to me, like, they have, whatever the numbers were, they've only got twenty percent more people than you, but they're doing two percent more stories,
(08:57):
Like why can't you keep up with that? And I
was like, well, they don't do the kind of stories
that we do right and take more time. And he
just could not see it. He was like, that makes
no sense to me. You guys are idiots. And then
of course it all blew up and I got fired,
which probably in some ways was not bad. Well,
in hindsight, it's one of the great badges of honor
that I have. But that was the mentality at the time,
(09:20):
and there weren't enough editors who were willing to stand up to people like him and say, you know, this is frankly bullshit, and you know it's
not journalism that you're talking about, it's something else, right.
It does show the myriad ways in which journalism can't
be slotted into the things that are important in the
modern world, by which I mean productivity, right, number of clicks.
(09:41):
All of those business measurements don't really apply, or shouldn't really apply, to journalism, and yet we embraced them. Yeah.
There was just sort of profound insecurity at the heart
of it, you know, like we just we didn't know
how we were going to keep the lights on at
any of these places, and we didn't know what journalism
was going to look like. So we tried all this
(10:02):
experimentation with volume and with speed and with brevity. And I think, you know, even though it's still an incredibly difficult time financially, I almost think that we're in a better place now, only because I think our mission and purpose have been clarified. Yeah, right. And
I think that we have a better sense
(10:22):
that we can try all this stuff and it may work briefly, sort of like the pivot to video. Or can I say the pivot to podcasting? The pivot to podcasting, exactly. But you know what, unless there's quality there, and unless it's good, and unless there's real value to readers in terms of telling them something
(10:43):
they know they can believe, then it's going to disappear.
And so I'm trying to think about how this change
in the way we all did journalism intersected with the
move towards social media, and it strikes me a sort
of a vicious spiral downward where we did what we
did in response to the rise of social media, and
(11:03):
then the dissemination by social media exacerbated the problems. Is that how you think about it? Yeah,
I think it's going to go down as one of the great mistakes in communication, this idea of turning
your content, your relationship with your audience over to somebody
else that doesn't share your values. It would be sort of
(11:25):
like having somebody say to me, you know, if I spend a lot of money to send my kids to private school, well, we can educate your kids for a lot less. Right? You spend thousands of dollars, we can do it for a few hundred bucks. And then me not taking any initiative to find out what's up with these people, or do they even know what they're doing, or do
(11:46):
they even care about the same stuff that I care about, and then being shocked that they did a lame job, or that they even did harm. It's almost like that.
I mean, the relationship we have with our audience. I
mean, that's built up over decades and
decades and decades. And if you think about you, Bethany McClain,
have a have a brand in the market right because
(12:07):
of work that you did, I suppose, although it pains
me to think that way, but yes, you do. And
it's because of work that you did that has a
certain amount of excellence. And there are people who follow business news especially, who know you and trust you
because of stuff that you did, sometimes decades ago, and
you didn't screw it up along the way, so you
(12:28):
still have that, right, that took so much effort to get.
And these media organizations had that, and then, out of this desperation for clicks, they turned it over to these Silicon Valley companies who had none of those values
and none of those interests, and it was all a
black box like they would say, hey, can you tell me,
like how you reach people or what kind of algorithm
(12:51):
you use, or do you track people? No, we can't tell you, that's a trade secret. Please? No, we can't. Okay, fine. It's going to end up being a massive mistake, and I think we're all now recognizing it; it's more obvious. It's interesting what a giant loss
of confidence, a collective giant loss of confidence, can cause
you to do right. You know, here we have Facebook
(13:13):
has access to all these people. We want access to
all these people. We sort of trusted them to take
our content. We didn't trust them, but we didn't even want to think about what the consequences were of not going that way. We chose what looked like the easier path, right. I think about this irony
all the time, because Google and Facebook couldn't exist without us.
(13:34):
In a sense, they're parasites, right, and yet the parasite
became more powerful than the host in many ways. And
if they destroy journalism, they destroy themselves. I think about
this every time I send out a tweet, like, I'm
basically producing content for these people, right, Yeah, I'm doing that.
I don't make a penny out of it. They're profiting
from it, and it's my content. I had actually never
(13:56):
thought of it that way, and that is so obviously true. Yeah,
why are we doing that? Because somehow we've bought into this. So let's move to Facebook. They're trying, right? Is that
a fair way to characterize it, to combat fake news?
I think they're trying to get good PR. I think
they want to look like they're doing something. I don't
really think they want to do something, and why because
(14:20):
it doesn't fulfill their business interests. I mean, their job
is to get as many clicks as they can and
to get as much engagement as they can, and trying
to incentivize people not to click on stuff is not what their business is about. And in fact, the more of our base instincts that they can appeal to
and the more that gets people to respond, the bigger
they get. Do you think about the tragedy of the
(14:42):
gap between that rhetoric from Silicon Valley about make the
world a better place and this very harsh business reality. Yeah,
you know, I don't see it as a tragedy. I find
it to be completely cynical, cynical and calculated. I mean,
since I came into the job at CJR, I've really turned a corner, as you can hear. Yeah. I am
(15:05):
fond of these companies and I've spent a lot
of time with them, because I've really been trying to
like get my head around all of this, and I've
been trying to get my head around what makes them tick.
First I was like, why can't they seem to get
their arms around this? Then I became convinced that they
have no interest in getting their arms around it. But that too was interesting. What convinced you of that? What
(15:27):
were the key moments? What did you hear and see?
I mean, it's sort of an open secret. There's been a lot of effort courting media journalists, in New York and I assume elsewhere, and they would have these dinners. You probably went to one or two of them. I don't think I've ever been invited, actually, which maybe, just like your getting fired from the New York Observer, I'll take as a badge of honor. Right. These were dinners where
(15:50):
they would invite journalists, and it was all very swanky, nice restaurants, but also, it was like, we just want to listen, and we really are interested in what you're up to. And because
this is right when they were launching their news initiatives
and they were sort of partnering with local people and
they would be talking about that with great enthusiasm. But
then I sort of watched it long enough to
(16:11):
see what actually came out of that, and it ended up being so inconsequential. Then CJR
hosted a mini conference in Silicon Valley trying to sort
of put journalists and the social networks together to talk
to each other. We called it Can This Marriage Be Saved?
And it was just so obvious to me that their
(16:32):
worldview and our worldview were entirely different. But when we
were talking about objectivity or facts or rigor, they honestly did not understand what we were talking about. They
just fundamentally don't get that language. They totally didn't get
the language. And it was all about, well, if people are engaging, how can it be wrong? And what
(16:53):
business of ours is it to tell them they can't do that if they're doing it? The mere fact that it's commercially successful and that it results in clicks is a defense of what it is, and you journalists are sort of old traditionalists who want to stifle speech. You're trapped in a world where you believe there's a truth. Trapped in a world in which
(17:14):
you're trying to defend your own reason for being. Yep.
Why do they want to engage with us? Why do
they care about participating in this? Because they realize they
can't exist without us, so they have to placate us. I think they're scared of regulation, yep.
And I think they view us as sort of part
of their lobbying effort. What's been really interesting is they're
(17:36):
trying to hug us, to make us one and the
same with them in terms of protecting against encroachments on
free speech. So they're trying to say, like, okay, we're in the same game. Yep. Like, if they go after us, they're going after you, so we're on the same team, right, right. Whereas
they want free speech, I think about this a lot.
They want freedom of speech without responsibility, and to me,
(17:58):
those two things have always gone hand in hand in
the old world of journalism. You get freedom of speech,
but with it comes enormous responsibility. Right. They're very clear
on the fact that they make no claim to the
veracity of what they run. I mean, they keep pushing
back against this idea that they're editors or that they're
a news organization. And it's because if you
(18:19):
are that, then, to your point, you get held responsible for what you say. Yep. I was
thinking about that when I was looking into this and reading what you guys wrote: the past six months of headlines about Facebook's partnership with all these fact checking organizations paint a pretty bleak picture. You've had Snopes come out and walk away from it, saying they're not serious about this and that it just
(18:41):
isn't working. Is that what you're seeing more broadly too? Yeah.
I mean, people are saying, one, they're getting no feedback from Facebook on what's happened as a result, so we don't know whether this has resulted in less of it. So they still won't provide transparency. It's the same answer as it was when we would go to
(19:02):
them early on and say, what's going to happen to this content? A lack of transparency is the issue. Yeah.
And the people who are in these partnerships are like,
we really believed this word partner, and we thought we were in this together, but you're not telling us anything about what the impact of this has been. So again, it's one way: partner organizations feed information to Facebook and then it goes into the black box. Right, and critically,
(19:23):
Facebook puts out a press release that the partner organizations are involved in this. But there's also this really interesting
thing happening with academics too, where they had been brought
in to work with Facebook on dealing with some of these problems, and were promised access to more of the back end of Facebook's data. Yep. And they're just
(19:45):
not getting it. And a lot of them, again, are saying, this is a pointless exercise, like we're being used as fig leaves, window dressing, to make it look like they care and are trying, and they're just not. And then there's this sort of, you know, bobbing and weaving on Facebook's part: well, you know, there was a misunderstanding of the kind of data that we were going to be giving you, or, yeah, it's coming, it's just taking us a while, legal is
(20:07):
reviewing it. And I just don't buy it. I don't
buy it. What about Google? Do you see any difference
between how Facebook handles this and how Google handles it
or is it less important for Google? Yeah? I think
it's somewhat less important. I mean, they also don't have these third party fact checking partnerships, right. I mean, the
issue with Google is it's a bit harder to pin
(20:30):
down, having to do with what they surface. But ultimately that really gets you into the black hole of nothingness in trying to understand how these places work. But to their credit, I guess, they haven't spent quite as much effort on the PR front, yep, as Facebook. Since they're not trying to portray themselves as a friend and a solver of these problems, they're also not facing quite
(20:51):
the same storm of controversy that Facebook is. Right. Yeah,
the sort of closer corollary to Facebook is Twitter, yep,
because they're right in the heart of a lot of
these issues. And what do you see Twitter doing? I
don't know. Me, I'm so fascinated by Jack Dorsey. Whenever he talks about this, it's like the world's head explodes, usually. Right. Yeah,
(21:11):
because it's hard to know, from what he says, what that actually translates to. Yeah. But you have seen this; it's
a similar issue, where you have seen them trying to figure out, what do you have to say to get banned? Yeah? Where does that line get drawn? And then that sort of forces them to lean into their role as arbiters of speech. But
(21:34):
it's like, Alex Jones is clearly bad, right? But there's a lot of people who say things like Alex Jones that are still on there. Has anybody that you know of taken a look at this and seen if there's any consistency in who gets banned, who gets silenced, who's allowed to continue speaking? It tends just to be very anecdotal and ad hoc. Yeah, non-systematic. Yeah,
(21:55):
it's not systematic. I mean, I just sort of laugh. I have to laugh. They just get tied up in knots over this stuff. And again, it goes back to what I was saying about the difference in worldview. Yeah, this is so not what they want to be doing. Yes, it's a fundamental conflict with the way they are.
It involves a judgment call, it involves something other than
(22:16):
data responding to demand, and they just are constitutionally really
ill equipped. I was struck by some quotes from a story about Facebook's lack of transparency, that they just don't provide clarity: you don't know how many users the fact checks reached, how many people clicked on the related links from a false story. Do these fact
checking projects slow or even halt the spread of any disinformation?
(22:37):
And all of that is still a black box. How
does this all come to a head in the election
that's coming? Do you think? I mean, I fear that
we're going to be entering a whole new zone of
this because of the move towards closed networks. Yeah, both on Facebook and, by the way, I think they're doing this in part to escape scrutiny. And what
(22:58):
do you mean by the move to closed networks? Well, they've now sort of announced that they're tweaking their focus to be about tighter groups that you're invited into. WhatsApp is a closed network. What you're going to
see is like targeted campaign messages. I was in Brazil
during their presidential election, and they did blast out TV ads,
(23:18):
but a lot of the campaign messaging was over WhatsApp
groups and it was location based. So if you lived
in this certain district, everybody who lived in that district
was getting this message, and everybody who lived in that
district was getting a different message, and it made it
really hard for journalists to like understand what these campaigns
(23:38):
were saying, because they were saying different things. And if you, as a journalist, weren't in
that area, you weren't seeing it. So that's going to
be happening in this election. That tactic has already been
previewed in other countries. And, you know, Steve Bannon was working with Bolsonaro in Brazil, so you're
going to see it happening here. So if you think
(23:58):
about that, you know, if you're in a contested race
in Wisconsin versus Michigan versus Florida, you're going to be seeing different messages from the Trump campaign, and it's going
to be really hard for journalists to sort of parse that.
That is actually incredibly frightening. It gets back to this idea that for a nation to function, you need at least some shared idea of what the truth
(24:18):
is, right? And if there's no truth even around what a political candidate is standing for, and no ability to look into that, because it's different depending on who the message is being given to.
It's like those private dinners where, you know, sometimes people record them and catch, say, a Hillary Clinton saying something terrible. But now that's taken
(24:39):
to a whole new level, right? Yeah, yeah. Do you think, given the active efforts to, let's just say, misuse these platforms, even if they wanted to do something, could they? And we'll just stick with Facebook for the sake of keeping it clean. Is it
such a game of whack-a-mole, as well funded interests
try to figure out how to use this to their
(25:00):
own advantage, that even if they were serious, could they
get out in front of it. There's a lot of
self reporting that goes on, especially on hate speech and
dangerous speech. I mean, you saw this with the Christchurch shooting, when the shooter was live streaming. It took a little while for Facebook, I mean minutes, but they quite quickly took it down. And that
(25:24):
was partly because people were like, hey, you have to take this down, this guy is doing this. There's
a lot of self reporting that goes on. They can mobilize,
but that's specific and egregious rather than subtle and a
judgment call. Right. And so isn't the latter where they get into trouble? Yeah, it is where they get into trouble.
Although let's remember that these companies have pretty much unlimited resources, right, right.
(25:49):
It's not like our newsrooms, where we're getting rid of the copy desk because we know we need to cut. But it is about the very ethos of these things, and sort of what they see themselves as. Again,
they don't view themselves as content players. I mean, they
view themselves purely as sort of connectors of people. Even
(26:10):
if they wanted to tackle this, it seems to me
that the threats are growing so quickly now. Mark Zuckerberg said that Facebook made an execution mistake when it didn't act fast enough to identify this doctored video of Nancy Pelosi where her speech was slurred and distorted. And there are the tools that can allow somebody to present a distorted image of reality that isn't even obviously false in
(26:31):
the way that, in the old days, if I called the CEO and asked, back to my intro, is your car blue? Is it pale blue? Right?
We can all agree on that color. But when images
of reality that we perceive are being altered, what do
you do in that case? I think ultimately they're going
to have to figure out what sort of brand they
want to have, and there's not a lot that media
(26:53):
organizations can do to affect that. I mean, I think
the big mistake that was made was again turning over
this audience. But what that did was
it put journalism in the same feed as bullshit and
cat videos and Russian disinformation and listicles. And if you're
(27:17):
a reader of your Facebook feed, the New York Times
story doesn't look any different from the Daily Caller story,
from the video of your kids first day of school.
It's all there. That's a really interesting point. It's not
just that we supplied the content to them; it's that it also stripped away any obvious signs of ownership or brand. Not just ownership, but sort of
(27:38):
authority. It stripped away any differentiation between you as a serious journalist and the very earnest citizen journalist and the very manipulative citizen journalist and the active disinformation agent. We all look
(28:00):
the same. And so, you know, we as media companies turned that over to Facebook and we said it's cool to do that, right? Desperation, right? Yeah.
And so the onus, I think, is going to be on media companies to say, you know, Facebook,
you can't put our shit on your platform, you don't have the right. You can do that? You can do that. And has anybody
(28:21):
done that? One of the biggest papers in Brazil has basically said, we want nothing to do with Facebook, we're off. Really? Yeah. What was Facebook's response to that? And is it too soon to see the business repercussions? Too soon, and I haven't seen what it all means. Yep. But we can
walk away if we want. And it doesn't even matter. At CJR, we have our own relationship with Facebook as
(28:44):
a platform, and we are like everybody else as they've disincentivized news. I mean, everybody's traffic on Facebook is dramatically less. So in a way it's that much easier to pull the plug. What else
do you think the press can do? Given the beleaguered state of our industry, which isn't getting any better despite
(29:04):
all the bad decisions we made to try to save ourselves, how much can we affect the discourse, and how much can we shape truth? How much power does the press still have to do that? That's a big goal. Yeah, shaping truth. Yeah, well, truth with a small t instead of with a capital T. Okay. I think this conversation
(29:25):
about social platforms and how they view the world is important. But I think we sort
of have to stick to our own knitting in a way. Yep.
I think we have to sort of control the things
that we can control, which is we've got to make
sure we ourselves get stuff right. So you know, going,
this is how we began this conversation, like moving away
from like just fire hose content that may or may
(29:48):
not be totally true, and repeating things that other people
have published, Yeah, repeating things that other people have published.
And also, like I mean, right now, the thing that
I fear is that we're in this kind of like
ridiculous spiral of nonsense. Like I mean, I've been following
this whole thing with Trump and the hurricane cone and how he used a Sharpie to broaden it. Did you follow this?
(30:10):
I actually did not, and I'm proud of myself that
I didn't. But that's the point. I think you made
the right call. So, if I can just give people the ten-second version of this: the National Weather Service, you know, they do these maps that show the likely direction of these hurricanes. Trump made some comment that Alabama was in peril.
(30:31):
The National Weather Service hadn't said that Alabama was in peril, and he kept at it, and people were calling him out, saying, like, you're the president, you're telling Alabamans to be worried when the National Weather Service says they shouldn't. That's not right. So Trump actually took a Sharpie and doctored the National Weather Service map and added his own cone, and then held it up and said, hey, look, see. And people are like,
(30:52):
actually, you added that. Do you think maybe this is a positive sign, given that Trump was one of the original proponents or users of fake news, if he's now getting called out on it? But to your point, there's been endless, endless coverage of this Sharpiegate and these cones and endless commentary
(31:16):
about this, and does that really contribute to the trustworthiness of news organizations? I think the outrage meter is sort of broken, and everything becomes a kind of five-alarm fire, and I think that ultimately costs you credibility. And part of what exacerbates that also
(31:37):
is the rise of a lot of, I'm going to call them publications, that make opinion pieces out of somebody else's news, right? So it causes this mass proliferation
of anything that people deem interesting, because it's not just
Facebook and Google that are sort of parasites on the
back of news organizations. It's many other news organizations that
(31:58):
are parasites on the back of things that might be reported by the New York Times and the Journal, and then are turned into opinion pieces by others, right. I think these social networks have become kind of distractions that keep us from looking closely enough at our own failings. You know, it's fun to
sort of beat up on them, and I like doing it,
but there's a lot that we need to do to
(32:20):
sort of police ourselves. Can we do it, given the state of our industry? Is that realistic? Yeah, I think it has to be. It may not be sufficient for our survival, but it's necessary. Oh, I don't know. I mean, I think journalism is going to survive, the practice of writing down stuff people say
and trying to figure out what is real and what's
(32:41):
not real. And now you're sounding like a believer in truth. Well, yeah, I think you ultimately have to be. What's the alternative? What are we in this business for? I mean, I do think that, you know, the world is confusing, and it goes back to sort of very J-school kind of things about, like, you know, telling the truth, holding
(33:05):
the powerful accountable, standing up for people who don't have a voice. I mean, all of these are kind of journalistic bromides that actually are real, that actually are true. Everything important you need to know you learned in kindergarten. Yeah, and I think, you know, we publish out of Columbia, and I have a kind of side view into journalism students, and those values are
(33:29):
really evident in the students coming in over the last couple of years. You see a return to idealism, really idealistic, not idealistic about the world, but idealistic about the potential for journalism to do something about it. What you can accomplish with honest words, fairly reported and meticulously checked.
(33:49):
Now these students are super convinced of that, which is great because, I mean, there was a time recently when there was a lot of cynicism in journalism. Do you see specific things happening in our industry that make you optimistic that we're turning the tide? There's been some amazing reporting around Trump and there's been some amazingly terrible reporting around Trump. There's been both. I mean,
(34:12):
I'm happy to see a little bit more sense of accountability among these organizations. Although, we recently launched a project that was a little bit cheeky: we installed public editors. We announced that we have public editors for CNN, the New York Times, the Washington Post, and MSNBC. Because they don't have them, there's no way for readers to have
(34:32):
any sort of meaningful feedback with these places, and so we hired people to do that job for them. And I gotta say, the Times especially has been pretty engaged with us on this, so I think that's positive. I mean, I wish they would bring back their own public editor, because I just think that feedback loop with the public matters. You want people to feel like you're standing up for them
(34:53):
and you're asking the questions that they would ask. But if you have no mechanism to know that, it's hard. Yeah. Would it make you optimistic about the media's interaction with social media if you saw, say, the New York Times or the Journal do what the paper in Brazil did and say, we're out? I think so. You know, I've been intrigued to see the number of individual journalists who have said, I'm off,
(35:15):
and they always come back. I know. Are you on or off? So I don't use Facebook, but I am on Twitter, and I come and go. I'm half-hearted about it, but it is such a source of information, particularly about financial news, and such a way to get in touch with people. I would argue there's a little bit more to the
(35:36):
Twitter universe than seeing your own name and how good you feel when you get likes on your tweet. It also is a very real tool. Yeah, so I think it's hard to disconnect. Yeah, maybe. And I think it would probably be hard to disconnect if you're a political reporter too, because so much breaks there. I mean, to your point, we've given them the power. Come back
(35:56):
to some way in which this is all going to
get better. Yeah, I mean, I think journalism is no
different than the rest of the country. I mean, I
think we're in a really unsettled, unsure, uncertain time. But
it always is striking to me when you leave the country and realize how deep in this soup we are here. Yeah, right, yes, and
(36:17):
being over there, I actually was reading the Murdoch-owned newspaper there, and it struck me that I was about three-fourths of the way through the paper and hadn't read Trump's name yet, and I was like, wow, that is amazing, and it's awesome. Yep. Right. And then of course they did have it at the end. But that's back to your point about how we're
(36:38):
all playing the game of clickbait every bit as much as Google and Facebook are. We have to look in the mirror ourselves. Yeah, and that's why I brought up this Trump hurricane thing. There was story after story after story about this, and I think it sort of feeds this narrative we have in our mind that he's either an idiot, or he's dangerous, or he doesn't care
(36:59):
about truth, and all of those things I think frankly are true. But hitting them over and over and over again for hours and hours in really big national publications, I'm not sure is helping any of us. Yeah. When you think about this broadly and you look at how quickly this is mushrooming, with the ability to doctor images,
(37:21):
do you think the government needs to get involved? You've got DARPA announcing this program where they're going to try to suss out fake news and halt it. Can you envision a world in which government regulation can be helpful? No. I hope we can fix our problems ourselves. We have to. I mean, if we thought Facebook and Google had a hard time defining the line, do you really want the US
(37:43):
government defining what the line is? You really don't. I mean, you see this. I worked in Britain for a while, and they have a pretty aggressive regulator there that gets involved in some of these issues, and inevitably it's not good for free speech. That's
actually interesting. If you have government defining what truth is,
then it's antithetical to the very notion of freedom of speech. Yeah,
(38:07):
I don't think that's the answer. I'm not sure we got to the right answer, but it was totally fascinating. Thank you so much for coming. Thanks for having me. Truth be told, that's not the way I expected the conversation to go. I thought Kyle
would be far more focused on Facebook, on social media,
(38:29):
on blaming these companies for all of the problems in journalism.
Of course, all of this is an issue, but I
hadn't thought about how much we, meaning traditional media, sowed
the seeds of our own problems, and how we continue
to do so. I'm not sure I see the way out,
but it's actually nice to think that we, that I, do have some degree of control, maybe just by remembering
(38:52):
those lessons I first learned as a fact checker all those years ago: double and triple check everything, details matter, be fair, try for truth. Making a Killing is a co-production of Pushkin Industries and Chalk and Blade. It's produced by Ruth Barnes and Laura Hyde. My executive producers are Alison McClain
(39:13):
no relation in making Casey. The executive producer at Pushkin
is Mia Loebell. Engineering by Jason Rostkowski. Our music is
by Jed Flood. Special thanks to Jacob Weisberg at Pushkin
and everyone on the show. I'm Bethany McLean. Thanks so much for listening. Find me on Twitter at Bethany Mac twelve and let me know which episodes you've most enjoyed.