Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hi, I'm Molly Jong-Fast and this is Fast Politics, where we discuss the top political headlines with some of today's best minds. We are on vacation, but that doesn't mean we don't have a great show for you today. Bloomberg's Matthew O'Neill and Perry Peltz stop by to talk about their new documentary Can't Look Away, about tech's stranglehold
(00:22):
on children. But first we have Substack's own Jacob Silverman on Trump's weird plan to make cities deregulated dystopias. Welcome to Fast Politics, Jacob.
Speaker 2 (00:36):
Thank you.
Speaker 1 (00:37):
I'm so excited to have you here, and also I
have so many things I want to talk to you about.
The last time you were on, I don't know if you remember, but at the end of the interview I was like, so, basically bitcoin is just kind of a scam, and you were like, yeah. Since that interview it's been probably about six months. I want you to talk
(00:58):
about what is happening right now with the American economy and bitcoin and the other, you know, fake money, synthetic currencies.
Speaker 2 (01:09):
Sure.
Speaker 3 (01:10):
Well, for those who don't know, the crypto industry was
the biggest donor by industry in the twenty twenty four election.
This is not a very big or economically productive industry, but they have a lot of money and punch way above their weight. And the result of that has been, of course, that they helped elect Trump and a lot of pro-crypto Republicans, and now they're starting to get
(01:32):
their way with policy and personnel across the board. I mean, it's all happened very quickly. We've seen a lot of actions by Trump to basically remove a lot of the guardrails around the financial system and to dismantle all these different task forces and groups within various government agencies that were focusing on either crypto regulation or crime related
(01:57):
to cryptocurrency. And simultaneous to this, Trump has emerged as basically the country's most prominent crypto entrepreneur, who's done things that would be considered illegal under any normal regime or presidency. And not only is he the most prominent crypto entrepreneur, he's really jumped in the
(02:17):
deep end with the industry's shadiest characters, a lot of whom have had pardons or are headed towards a pardon, or have had SEC investigations against them dropped. And so now where we're headed is that, with the GENIUS Act and some of the other regulation coming down the pike, the whole economy will kind of have to play by crypto's rules, or kind of lack of rules. And what the
(02:38):
crypto industry wants, from the people in bitcoin on down, is access to more mainstream financing, to Wall Street, to institutional money, and to be able to kind of create all these complex and definitely synthetic and artificial financial products on top of crypto, which itself arguably has no inherent value. With that comes a lot of instability, a lot
(03:02):
of risk, a lot of volatility, and I think if
we have sort of a crypto driven economic crash, it's
because the whole kind of economy will be forced to
play by crypto's very dangerous set of rules.
Speaker 1 (03:15):
I laugh to keep from crying, but I also laugh
because crypto doesn't want to be regulated, just like AI.
There are so many parallels in my mind between crypto regulation and AI regulation, right? These are both sort of lobbies that have sprung up like tobacco and oil, both of which are desperate not to be
(03:37):
regulated and are trying to circumvent any regulation, like by adding in that ridiculous provision that states cannot regulate AI for ten years, that ridiculous little thing they put in the Republican BBB. But what I think is so hilarious about all of this is that in fact it's Congress,
(03:58):
so they're not going to fucking regulate anything. They never regulated Facebook. You're really worried they're going to regulate you? All they do is not regulate. Discuss.
Speaker 2 (04:07):
Yeah.
Speaker 3 (04:07):
I think this has actually been sort of an issue across tech for years now: they kind of don't realize how good they have it. I mean, of course there's going to be lobbying and influence peddling and stuff like that, but this is an industry that even under Biden did quite well, and the failures, so to speak, of some things to take off, like crypto,
(04:28):
are kind of their own. And so that's actually one thing that I found puzzling, or try to explain, about the radicalization of a lot of tech leaders. You know, whenever Marc Andreessen throws a fit, it's like, well, you have it so good. You are as rich as ever, you own six mansions, your companies are flourishing. Maybe you haven't, you know, created God in a computer with artificial intelligence or whatever else, but
(04:51):
you pretty much have all you want. But they take any sort of criticism, and now anything that impedes them in any way, very personally, and I mean they're pretty soft in that way.
Speaker 1 (05:03):
I think tech bros are fucking babies. I mean, I think that's the net net, right? It's funny because I know one tech bro who had ten million dollars at Silicon Valley Bank when it went under, and he started texting me. He's like, you have to talk to Biden about bailing out Silicon Valley Bank. And I was like, first of all, it doesn't work that way with Democrats. It's not like if you write
(05:24):
about politics that somehow you have a main line to the president and he does favors for you. Like, that's not how any of this is supposed to work. And also I was like, bail out your own fucking bank. So you want to socialize the losses and privatize the gains? Like, I feel like there's such a disconnect here with tech bros. I mean, I think it's really important. They're fucking babies, right?
(05:45):
These people, Marc Andreessen, were so offended by Lina Khan, by the mere existence of Lina Khan. Lina Khan wasn't even able to do a lot of what she wanted to do, but the idea that there would be any kind of check on capitalism in any way was so offensive. And I wonder if you could
(06:08):
sort of talk through. We know about Andreessen and his funny-shaped head and his love of fascism, or of authoritarianism. We know about Elon because he tweeted before he got kicked out of the White House. We know that these people are just pretty excited. But the thing that I've
(06:29):
been struck by is, like, Curtis Yarvin, the one sort of intellectual ringleader of this crew, right? I think that article in The New Yorker really exposed him to be just a complete fraud. So I wonder if you could sort of talk to me about where this crew is now, knowing that they're basically sort of glorified Fox News.
Speaker 3 (06:51):
Well, it's interesting, because a few years ago, the new right, or the neo-reactionaries that Curtis Yarvin represents, tried to present themselves as somehow countercultural, like the new counterculture was these right-wingers who, you know, believe in eugenics and all these other bad things.
Speaker 1 (07:08):
Wow.
Speaker 3 (07:08):
But now, you know, the people they talk to, or some of their friends, are in power, and I think that changes things a little bit. And I think they've also made a lot of accommodations to sort of the cult of Trump. Or just, you know, we've seen this even in the last couple of weeks, with supposedly anti-war people on the right bending over backwards to justify Trump's attack on Iran. And you know,
(07:32):
and Yarvin has also done this thing where sometimes he claims he doesn't have much influence, which I think is debatable in some ways. He certainly gets written about a lot. But they can only be sort of victims or people on the outside for so long, because now their peers are in power and doing some of the things that they
(07:52):
want done, like hollowing out the administrative state and concentrating more power in the president. I think sometimes when you hear this stuff from someone like Yarvin, who speaks in these kind of goofy comic-book-villain terms, it's almost hard to take seriously. But then you look at what's actually happening in the presidency or in the White House, and it's hard not to see a connection there.
Speaker 1 (08:13):
Absolutely true, and there clearly is a connection there. So
Jesse really wants to talk about freedom cities. Oh, sure. You need to explain it to our listeners, though. Not everyone is as read in on freedom cities as Jesse is.
I made him explain it to me yesterday. So explain what freedom cities are,
(08:35):
and where they come from.
Speaker 3 (08:36):
And this is probably a good example of kind of right-wing Silicon Valley ideology finding common cause with just sort of MAGA Trumpism. Trump has talked about freedom cities, creating kind of cities out of whole cloth in parts of the United States, or maybe Greenland one day, that are supposed to be low regulation, or no regulation, and inherently somehow innovative.
(09:00):
You're going to have hordes of tech companies going there to try things or do experiments that they're not allowed to do in normal, law-abiding communities. I mean, Trump sort of talks about it, or did talk about it during his campaign, as kind of a return to the frontier, and the tech industry has been trying to do this in
(09:21):
the form of what they generally call charter cities for many years. Peter Thiel famously funded seasteading, which is sort of these people who are going to live on ships or on these floating communities.
Speaker 1 (09:33):
Did they ever happen?
Speaker 3 (09:35):
Well, there's one example of a charter city that I think really deserves attention, and that's the one in Honduras on the island of Roatan called Prospera. It's been pretty well covered in the Times and other places. Basically, this is a venture-capitalist-funded charter city, and they made a deal with the right-wing Honduran government at the time to
(09:58):
take over some land where the basic criminal code would still cover them, but they could kind of do anything else they wanted. And they started building a little bit and giving out citizenship to tech entrepreneurs who want to come there. Again, it's a lot of people just avoiding taxes, or claiming that they want to do medical experiments, biotech stuff that can't be done in the US.
(10:18):
There's not much real activity there. There are residents, I mean, I've gotten varying numbers, from a few dozen to a few hundred. They do have a sizeable chunk of land and real estate; they took over an old resort. And there's been a lot of protests from locals, and the national government of Honduras basically turned over, became more liberal
(10:40):
left, and said, we don't like this deal that was made with you guys to own this land. It's a long-term lease, like ninety-nine years or something like that. And they've now been fighting with them, and the people from Prospera are suing the Honduran government for an enormous number that's greater than the GDP of Honduras. It's something
(11:01):
like fifty billion dollars.
Speaker 4 (11:02):
So it's going well, is what you're saying.
Speaker 3 (11:04):
Right, and that's the successful one. And I've seen all kinds of presentations and lectures from people who basically want to escape the strictures of the law. I mean, the popular term on the libertarian right is exit, and Peter Thiel has used this many times over the years: exiting society, exiting the strictures of the law. So when Trump, and I don't know that Trump
(11:24):
understands this himself, starts talking about something called freedom cities or building new communities from scratch, that is sort of a dog whistle for tech people who, when they hear that, think, okay, no laws, no regulation, we can get land
Speaker 2 (11:40):
Cheap and do what we want.
Speaker 3 (11:42):
And frankly, also, some of the people behind this charter city stuff in right-wing tech circles are really excited about taking over Greenland, because they want to start a charter city there. I mean, all this stuff strikes me as very untenable. I think it's so they can have an authoritarian dictatorship, basically, or, you know, be totally
(12:03):
in charge. Because it's like what you were talking about a few minutes ago with Andreessen and others: they can't really stand to be bound by others' rules, right? The entitlement is overwhelming, and they do think that they are elites, both genetic elites and cognitive elites, who deserve to rule others.
Speaker 1 (12:20):
What I have been so struck by, you know, I'm married to an ed-tech VC and we've been married a long, long time, and so for a long time I thought that venture capitalists, because my father-in-law was a venture capitalist, I always thought they were very smart, because they built all these businesses, they built all this wealth. You know, so much of America over the last fifty years was built by
(12:43):
you know, investing in small companies and helping them grow, and creating these huge monopolies, which was probably not the point. But anyway, what I am struck by is, when you got Elon on Twitter, what was so clear to me was this guy's not so smart. I mean, maybe he's good at building rockets, but his thinking is cooked. And
(13:06):
Curtis is the same way. When you get Curtis going, he's just, he's John Podhoretz, right? He's not some genius. He's just a person who has a lot of discriminatory ideas about how the world works. So I just don't, you know, I just wonder
(13:26):
why they think they're so brilliant.
Speaker 4 (13:30):
It's just because they're rich. That's part of it.
Speaker 3 (13:32):
It's also, I think, because a lot of these guys don't do much reading. And not that I expect them to be tearing through books all the time, but they kind of think that they've reasoned their way to first principles on their own. And you know, there's a joke about how every couple of years some VC reinvents the bus by saying, like, you know, there's
(13:53):
gonna be an Uber van that stops at regular locations. Like, there's very low respect, sort of, for what comes before, whether it's businesses or institutions or ideas. And so I think you see that a lot with Musk and others, who kind of make up solutions as they go along, or propose things and you're like, well, there's
(14:15):
already something that does that, or, you know, this institution is supposed to do that, but they just don't seem to get it. You know, that becomes a real problem when the people acting like that, speaking like that, are billionaires with a huge amount of power, not just kind of idiots.
Speaker 1 (14:29):
On X, and who are running the government. And yeah, exactly. Elon is clearly out, right? He is no longer running the federal government. We don't really have all the details, but clearly he has just crossed Trump one too many times. So it seems like, and I know Big Balls has left the government, I saw that Wired piece yesterday.
Speaker 4 (14:50):
I mean, what is that?
Speaker 1 (14:51):
Who are these people now leaving, what are they leaving with, and how have they fucked things up in the federal government?
Speaker 3 (15:01):
Yeah. Unfortunately, some of that is sort of TBD, or a long-term thing that we're going to find out as we go along. But you know, I think Musk and xAI probably got a tremendous amount of data. These are things I think deserve more reporting and investigation. But when the DOGE people showed up at the SEC one day earlier this spring, I got a
(15:24):
message from a former SEC employee who said, you know, they're going to get their hands on so much sensitive market data. Because the government, and especially agencies like the SEC, has really important and kind of privileged and unique data about how Americans live, about the economy. And we're not even talking classified information,
(15:45):
but just kind of everything around us, how our society functions. That could be enormously valuable, whether to train an AI system like xAI's Grok, or to try to trade on, or, you know, do something more sinister with. And I think the answer is that we don't really know exactly what they're leaving with, but they probably did leave
(16:06):
with a lot of data. And then I think there was a pretty revealing comment, kind of confirming what a lot of us thought, from Joe Lonsdale on X a few days ago. He's part of kind of the broader Musk orbit, a venture capitalist and Peter Thiel disciple, and he said a lot of our friends are still in DOGE and in the government, pretty much paraphrasing here, and this
(16:29):
is something he said publicly. And we can chart this too: some of these people, at least a lot of the appointees who came from a16z or from Musk's crew at DOGE, are still there. And you know, we have the prominent people like Musk or nineteen-year-old Big Balls leaving. But the processes they started, some of the things they
(16:49):
put in place, I think are gonna be very significant,
especially as AI systems start getting rolled out to more
parts of government and start replacing more people, and I
think it's going to be an ongoing disaster of one
form or another.
Speaker 1 (17:04):
The thing I'm most worried about is the AI stuff, because I'm just not convinced that anyone has thought this through.
Speaker 5 (17:12):
Yeah.
Speaker 3 (17:12):
I mean, I've long argued, well, perhaps not that long, but I've argued that one of the problems with AI is not that it works, but that it's seen as something that works, or works well enough, so that, you know, Fortune 500 CEOs and government officials say, let's put this in everything, because it kind of works well enough and it saves on human labor,
(17:33):
and it's also trendy and makes them kind of look up to speed. And I think a version of that is kind of happening right now in the federal government. I mean, some very well-meaning Democrats have written letters to DOGE and other agencies asking, what are you doing with AI? But we don't really know. But we do know that these things don't work as well as Sam Altman and others say they
(17:55):
do, or as Musk says they do, and there are tremendous problems that we haven't even figured out yet.
Speaker 1 (18:01):
But that doesn't worry me as much as it working really well. If it doesn't work, like, we know what to do. If it doesn't work, right, it might cause a kind of Chernobyl moment, which is scary and sucks. But, like, we can pull back from something that terrible. The question is what happens if it works as well as
(18:22):
they say.
Speaker 3 (18:22):
It does. Right, well, then I think you see kind of AI become one of the tools or handmaidens of more overt fascism. You know, I mean, we haven't mentioned Palantir here, but, you know, the US government, with Palantir, which has been given contracts left and right, is assembling kind of unified databases of personal information about
(18:44):
Americans that, for reasons legal and practical and to preserve our freedoms, had been kept separate, even through kind of the terrible Bush years. And so when you see that kind of authority developing and that empowerment, I mean, that can only lead to some very dark places.
Speaker 1 (19:03):
Yeah, it's really, that's good. I wasn't worried enough, so this is good. This was so interesting. I hope you will come back.
Speaker 3 (19:11):
Oh, I would love to. Thank you.
Speaker 1 (19:16):
Matthew O'Neill and Perry Peltz are the directors of Can't Look Away, which you can watch now on Jolt.
Speaker 6 (19:24):
Welcome to Fast Politics, Matt, Perry.
Speaker 5 (19:26):
Thanks for having us. We're glad to be here.
Speaker 6 (19:30):
So I want to start talking about how you guys
decided to do this film. Whoever wants to start with that,
but just explain to us how you got here.
Speaker 5 (19:39):
Sure. We've long admired the investigations that happen out of the Bloomberg newsroom, and specifically out of Businessweek, and we were talking with Bloomberg about the possibility of doing investigative documentaries, because that's what we love to do as both filmmakers and journalists. And we read the reporting of Olivia Carville, who's a Bloomberg investigative journalist who had been looking
(20:03):
at these cases in the social media landscape, and we thought, wow,
we know that social media is sort of vaguely bad
or can be a time suck, or there's bullying, but
she was uncovering these cases that showed the real extremes
and depths of horror of what happens on TikTok and
(20:24):
Snapchat and Instagram for children, and she introduced us to
this incredible group of lawyers.
Speaker 4 (20:31):
Perry, tell us about the lawyers. So, Molly, the lawyers are extraordinary. And if we just back up a second: when we look at this issue, we are living through what feels like the beginning of what could be one of the great public health crises of our time. And these platforms are built in a way
(20:54):
that prioritizes engagement, and most specifically engagement by kids, and not safety. And that's not something that's incidental; it's actually part of the business plan. And that's where the lawyers come in, because for so long, and it continues to this day, Section two thirty protects all of these
(21:15):
platforms and makes them impervious to legal action. So what's
unique about these lawyers is they said, you know what,
if Section two thirty is going to be the armor
that protects these companies, we're going to figure out a
way to still get at them. And that is where
the lawyers come in. And they took this on as
(21:36):
a matter of product liability and said, these platforms are
actually faulty products and they're causing harm to our kids.
And that's really where the legal start of this story is.
Speaker 6 (21:48):
It's funny, because I really think of tech companies as sort of the same as oil companies, the same as tobacco companies. These are really dangerous companies that have sort of been able to skirt regulation. Is that the sort of ethos of the movie?
Speaker 5 (22:06):
The parallel between big tech and big tobacco is really, really clear. You know, you see in the film what we all saw last year, as the heads of Meta, TikTok, etc. were all called in front of Congress, and many of the families featured in our film were sitting in that audience holding pictures of their children who had died due
(22:28):
to their engagement with social media. It was reminiscent of the nineteen nineties, when the big tobacco executives were there. The biggest parallel is they know, and they knew. The things that are happening to children on these social media platforms were known to the social media platforms, and as whistleblower after whistleblower comes out with more documentation, you see
(22:52):
that children were being targeted, and the companies knew of the social harms and the real harms that were happening to children, and chose, time and again, profit over protection. And we want to be really specific here, because we're not talking protection for every single user. We're talking protection for children.
Speaker 6 (23:11):
So talk to me about section two thirty and how
that figures into this.
Speaker 4 (23:15):
Right now, Section two thirty, let's just back up a little bit, was written in nineteen ninety six. It shields tech companies from liability for what users post. And that made sense back in nineteen ninety six, because what existed then were essentially bulletin boards, the same as phone companies. Would phone companies be liable? And this
(23:36):
is the argument that you see in the film the Snapchat lawyer make: would AT&T be responsible if Matt called me and we made a drug deal over the use of their phone? No, they wouldn't be, because they are just, they would argue, sort of a pipeline. And what was typically said is that these platforms were bulletin boards; they can't be responsible for what people were posting.
(24:00):
That has evolved a lot since nineteen ninety six, and right now these platforms, and the algorithms they provide, have a real human touch. It's no longer a bulletin board or a pipeline. So Section two thirty has really failed to hold these companies accountable. And that's, you know, again, circling back to why the lawyers are looking at
(24:21):
this as a matter of product liability.
Speaker 5 (24:23):
And it's important to understand that these big tech companies are sort of using this federal statute from nineteen ninety six. I mean, remember, that was written when Mark Zuckerberg was eleven years old. That's when, for those of you listening, you remember the sounds of the high-pitched squeals and beeps
(24:45):
as you logged into your AOL account through a phone line. It was a totally different world. And the sort of depth and breadth and ambition of the internet, and what might be possible, but also the threats that would come from social media, were frankly unimaginable.
Speaker 7 (25:02):
So with that evolution, I know a lot of people seem to say, like, you know, this is one of the few issues that could be bipartisan. But I know in the UK they're just starting to have the biggest discussion, after Adolescence, about what we should actually do about regulating. I think it's because people have been scared about what could happen to kids.
Speaker 2 (25:20):
What do we see here? Do we see any roadmap for change, or anything?
Speaker 5 (25:24):
Well, there's a lot of things happening at the federal level. There's a bill called the Kids Online Safety Act that actually passed the Senate last year with ninety-three votes. So when you talk about bipartisan cooperation, it's one of the rare areas where you can get ninety-three senators to agree on anything. However, it got gummed
(25:45):
up by Speaker Johnson in the House, who did not bring it to a vote last year, and Senators Blumenthal and Blackburn have resubmitted that bill. But what is happening all around the country, whereas at the federal level it stalls, is that states are innovating. You have New York, which just passed a bell-to-bell phone ban for schools, which,
(26:06):
bell to bell means from the first bell of the day to the last bell of the day, you're not allowed to have smartphones in schools. You have legislation in Vermont, in Colorado, in Utah, all different types of state-based accountability programs for social media companies to protect children. And I think it's really important to clarify this, because
(26:28):
I use Instagram. I actually still consume a great deal of news on X. And social media can be a perfectly good, useful, additive thing in our world; it can connect people across boundaries that were otherwise insurmountable. What we're talking about here is the responsibility when it comes to the youngest and most vulnerable users.
Speaker 4 (26:50):
Just something that I want to add to that, Molly, because I think it really matters: a lot of people who are anti-legislation will be very quick to say this is a parents' problem, this is something that should be moderated and mediated in the family. Parents need help. But right now it's like sending them into a war zone and they have no armor. This is not a
(27:13):
fight that parents can take on. We don't ask parents to inspect playground equipment; we don't ask parents to inspect child safety seats. It would be akin to asking them to do the same thing here. So when people start to wag a finger and say, well, parents should be taking care of this, I just want to point out what we're talking about and just how difficult that would be. Any parent
(27:36):
right now who has a child who's dealing with this
knows that if a child wants to use their devices,
they're going to figure out a way to do it,
whether or not you agree or like it.
Speaker 2 (27:46):
Yeah, that makes a lot of sense.
Speaker 7 (27:48):
So my thing would be, you're talking about state-level stuff, but this is one of those issues where obviously we want to solve it across the entire country. It seems like Democratic senators are wary of this. Is there a reason? Like Schumer, are they not interested in it?
Speaker 2 (28:03):
Like, what does this look like? What I'm imagining in our audience right now is a lot
Speaker 7 (28:06):
of parents whose blood is boiling, who are not happy about this right now, and, like, they want to know what to do, exactly.
Speaker 5 (28:12):
Well, the what-to-do is a really good question, and these things change in complicated ways. If we go back to that parallel with cigarettes, it took a long time for legislation to be enacted, and it took a long time for the litigation to go through the courts. What did happen with cigarettes was
Speaker 2 (28:31):
A cultural shift.
Speaker 5 (28:33):
I think at its height, almost like forty-five percent of sixteen-year-olds were using tobacco products in the nineteen nineties, and now it's down to something like six or eight percent. And it happened precipitously. It was cool to smoke when I was sixteen. Ten years later, it was not cool to smoke. And you're seeing this change with kids.
(28:54):
So talk about it with your children, talk about it with the trusted adult in any younger person's life. Level with them. Ask what they're seeing on social media. Tell them that you understand and you're interested, because in the end, kids are smart and they don't like being taken advantage of. And when they learn about how the social media platforms work, and this idea that if you're not
(29:17):
paying for the product, you are the product, they recoil. And I think there's a cultural shift afoot.
Speaker 4 (29:22):
Also, I think it's really worthwhile to tell parents: you're not crazy, you're not alone. When you start to see mood changes and secrecy and all of these things, I think we all do this, you know, you sort of say, well, is this just normal teen stuff, or is this on the extreme of normal?
Speaker 5 (29:40):
Like what is this?
Speaker 4 (29:42):
You don't know what to do. And I think that one of the messages of the movie is that these platforms are designed, in many ways, as we said before, to prioritize engagement, to keep your kids online, and that is part of the plan. Part of the plan is to actually get your child's brain really, really focused on this material so that they can't look away from their devices.
(30:05):
Thus the name of the film.
Speaker 7 (30:06):
Yeah, you know, one of the more impactful things I've seen in recent times was that Social Studies documentary that showed what the children were reacting to on their phones and how their behavior changed. You mentioned smoking, though. The Truth campaign was largely credited for that shift, which then made the lobbyists very scared. Can you talk to me about what the lobbyist structure looks like? It just feels like
(30:28):
there's this barrier around the Senate and Congress. What does the lobbying structure look like? Which companies are influencing the most in this structure?
Speaker 5 (30:38):
Speaking broadly, when you think about the size of the tech companies versus the size of big tobacco: big tobacco at its height just doesn't even look that big compared to big tech, which has an army of lobbyists across both parties. Like many big American industries, be it energy
(31:02):
or tobacco or technology, this is a bipartisan feeding frenzy when it comes to both political donations and engaged lobbyists. So the backlash, or the counter-push, is being led by community-based organizations and activists and the parents you
(31:22):
see in this film who are transforming personal tragedy into action,
taking the most horrible things that can happen because their
children couldn't look away from social media and turning it
into a real push for change. And you have that with
the Mothers Against Media Addiction group, you have it with
Project Reset, you have it with the parents featured in
(31:46):
our film Can't Look Away. We're going there.
Speaker 2 (31:48):
Can you give us color on what those two groups
are pushing to do.
Speaker 5 (31:51):
Broadly, groups like Mothers Against Media Addiction and Project Reset
and other people who are going to the Hill but
also to state legislative groups to push change are advocating
for responsibility where the companies can be held liable, so
that's a reform of Section thirty. They are in many
cases pushing again and again so that the accountability lies
(32:14):
not with parents or with children, but with the companies themselves.
That's the critical difference, I think. That's where there's
a certain amount of education and understanding needed. Sometimes when
you hear about these tragic cases, you think that, oh,
it could be a bad kid or bad parenting. But
when you meet the parents in this film and you
meet the parents who are doing this work, you see
(32:36):
that these are good parents and good kids, and the
accountability lies with the product and the internet services, not
with the users.
Speaker 7 (32:45):
So of the social media services, I know people like
to pick and choose, like, oh, this is my good one,
this is my happy place, this is the bad one.
What do you guys see as the landscape? You've really
been in this. Do you have any feelings of like, okay,
I'm not that scared of this one, this one's a
little better, or is it really just that we should be
scared all around?
Speaker 4 (33:05):
It's a really good question, and I think rather than
just point fingers individually, in general, the rule of thumb
is the platforms that are using reward systems for young people.
So in other words, if you go on Snap, there's
the Quick Add, right, and there's Snapstreaks, and there's
all sorts of things that are built in so that
(33:26):
kids are getting instant hits of dopamine. The more that
you see that design feature in a system, in a platform,
typically the more addicting, the more compelling it is because
you start to introduce, like I said, the dopamine sort
of feedback system, but also on top of it, a
competition with young people, and we know that competition is
(33:49):
something that really drives usage as well. You go onto
something like YouTube, there's nothing like that, there's no addictive
power like that. That's really one of the things
to look at to understand which ones are more compelling,
more addicting than others.
Speaker 7 (34:07):
That's interesting. I guess my question to that, though, is
do we feel like the addiction and dopamine mechanism is
really just the main thing that we should fear for
attention spans?
Speaker 5 (34:17):
It's less about attention spans. And I think what Perry's talking
about is this gamification to sort of get the children
into the social media platforms, but it's also about checks
and balances on how the algorithms work. There are cases
in Can't Look Away that feature young people who go
onto TikTok, like a young man in Arkansas searching
(34:40):
for solace after a breakup and looking for inspirational videos.
But the algorithms don't respond to what these children are
looking for. They respond to what they linger on. And
over the course of two weeks, he went from searching
for inspirational videos to being fed a constant stream. And
this is what I think might shock some of the listeners.
(35:02):
It's not just sad content. It may start there, but
he was receiving videos from TikTok that he did not
search for, that were being fed and pushed to him,
that were explicitly telling him to harm himself and explicitly
encouraging suicide to the point where the background of one
(35:23):
of the videos says, you should blow your head off.
And you know, just to put a point on that,
they know what they're doing. The platforms know exactly what's happening.
They know that predators are using their apps to target kids.
They know that drug dealers are selling drugs, They know
that algorithms push harmful content. That is one of the
most, you know, galling pieces of all of this. They
(35:45):
do it anyway because it works and it keeps kids online,
and that is ultimately what needs to be changed and
what needs to be fixed, and it can be.
Speaker 2 (35:55):
Can you tell everybody where they can see this?
Speaker 5 (35:56):
Can't Look Away is on the streaming service Jolt,
Jolt dot film. You can access it there and you
can also find out more about the investigation on Bloomberg
dot com. It's important, we think also for any parents listening.
This is a film after you've watched it, that might
be worth watching with your children and it can start
(36:19):
a conversation. And there is a resource guide on Jolt
dot film that accompanies Can't Look Away that
you can use as a tool to have conversations with
your children or other young people in your life who
trust you about the real dangers of social media.
Speaker 2 (36:36):
Thank you both so much.
Speaker 4 (36:38):
Thank you, Molly. We really appreciate the time and the
opportunity to talk about Can't Look Away.
Speaker 1 (36:43):
That's it for this episode of Fast Politics. Tune in
every Monday, Wednesday, Thursday and Saturday to hear the best
minds in politics make sense of all this chaos. If
you enjoy this podcast, please send it to a friend
and keep the conversation going. Thanks for listening.