July 18, 2025 49 mins

On this episode of The Middle, we talk about just how safe social media is (or isn't) for children. Jeremy is joined by Julie Scelfo, founder and executive director of Mothers Against Media Addiction, and Matthew O'Neill, co-director of the new documentary Can't Look Away: The Case Against Social Media. DJ Tolliver joins as well, plus calls from around the country. #socialmedia #instagram #IG #TikTok #Twitter #Snapchat #Meta #onlinesafety

Transcript

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Welcome to the Middle. I'm Jeremy Hobson. Tolliver is off
this week and this week, if you've been following the
news about Congress rescinding funding for the Corporation for Public Broadcasting,
you know it has been a tough week for those
of us in public radio. We don't get any money
directly from the CPB here at the Middle, but all
of our public radio stations do, and if they are hurting,

(00:26):
we are hurting. So I just want to thank you
if you've made a donation to your public radio station
or to us here at the Middle through Listen to
the Middle dot com. It means a lot. We appreciate it,
and it's very helpful for our system right now. So
both Tolliver and I are part of the millennial generation.
We grew up before there was social media. I got

(00:47):
my Facebook account just as it was starting during my
senior year in college. But today's younger generation has grown
up with TikTok and Snapchat and Instagram and YouTube. And
according to the Pew Research Center, almost half of Americans
aged thirteen to seventeen say they're online almost constantly. I

(01:07):
think most Americans would say they spend too much time
doom scrolling on social media, but when we're talking about
kids and teenagers, the stakes can be a lot higher.
There are significant mental health impacts that can occur
with heavy usage, and children can be exposed to dangerous trends,
sensitive content, or even interaction with online predators. So that's
our question this hour, how dangerous is social media for

(01:30):
kids and teenagers? Our number is eight four four four
Middle. That is eight four four four six four three
three five three, And we're gonna get to your calls
in a moment. But first, last week on the show,
we talked about America's standing in the world and whether
or not you're worried about it. We got some very
impassioned voicemails that came in after the show, like this one.

Speaker 2 (01:51):
I'm Monses Bucai. I'm calling from San Antonio, Texas. I'm
originally from Mexico and I'm a naturalized citizen. I believe
in what this country represents, and I came here because
the respect of law and order and the balance of
power and the opportunity to grow has always been the

(02:14):
appeal of this country. This country's losing its footing. Countries
are losing respect, and more important, people are starting
to ignore the United States. They're looking away from it,
and we'll become an irrelevant country if we don't find our
way again.

Speaker 1 (02:35):
Well, thanks to everyone who called in. You can hear
that entire episode on our podcast in partnership with iHeart Podcasts,
on the iHeart app or wherever you listen to podcasts.
So now to our question this hour, how dangerous is
social media for kids and teens? Our number again is
eight four four four Middle. That's eight four four four
six four three three five three, or you can write
to us at Listen to the Middle dot com. You
can also comment on our live stream on YouTube. And

(02:58):
by the way, if you're a teen and you want
to tell us your thoughts about this, please give us
a call at eight four four four six four three
three five three and a warning, we may be discussing
some difficult topics, so I want to let you know
that ahead of time. Joining me this hour, Julie Scelfo,
founder and executive director of Mothers Against Media Addiction. Julie,
Welcome to the Middle.

Speaker 3 (03:18):
Thanks for having me. Jeremy a pleasure to be here.

Speaker 1 (03:21):
It's great to have you. And Matthew O'Neill, Matt O'Neill,
a filmmaker and co-director of the documentary Can't Look Away:
The Case Against Social Media, which is available to stream
right now on Jolt. Matt, welcome to you.

Speaker 4 (03:32):
Thanks for having me.

Speaker 1 (03:33):
So before we get to the phones, let's talk about
what we know in terms of the dangers. And I
mentioned a few of the things we've been hearing about.
But Matt, we talked about the thirteen year old age
limit for social media, which there are many ways around.
Your film looks at the dangers for users who are
over thirteen as well as some pretty tragic incidents that have happened.

Speaker 5 (03:54):
You know, I think that when Perry Peltz and I
started making this film, like many adults, we had a
vague sense that social media was bad, heard of online bullying,
media addiction, but we had no idea the extent of
bad actors on the platforms, of the access that predators,
that sextortionists had to teenagers, and frankly, the content that

(04:19):
promoted suicide and drug use that was being put into
children's feeds day in and day out.

Speaker 1 (04:27):
And you talk about suicide and drug use, in some cases,
the kids can buy drugs that may be laced with
fentanyl online, on Snapchat and platforms like that. And there
was a case in your film of a young person
who ended up taking his own life after being, basically,
sextorted by somebody that he was talking with online.

Speaker 4 (04:50):
In less than six hours.

Speaker 6 (04:52):
You know.

Speaker 5 (04:52):
Jordan DeMay was the captain of the football team,
homecoming king, an incredible young man, and he came home,
dropped off his girlfriend one night and was instant messaged
by what appeared to be a young woman. She flirts
with him, she solicits an explicit photograph from him, and

(05:12):
when he sends it, she immediately turns around and begins
demanding money, money that he doesn't have. He gives her
three hundred dollars. They demand one thousand dollars. He has
no idea that this isn't a girl two towns over.
It's actually a sextortionist, professional sextortionists in another country.

Speaker 4 (05:32):
And when he says, I don't have any money.

Speaker 5 (05:35):
If you do this, I will kill myself, she says
do it, and he does. And this is happening where
we all think our children are safe: in their bedrooms.
The truth is that that's

Speaker 4 (05:49):
the dangerous street corner

Speaker 5 (05:51):
now: the open iPhone, the open tablet in your
child's bedroom.

Speaker 1 (05:57):
And by the way, just because we are talking about
the topic of suicide, Matt, if anybody is having
any of those thoughts and wants to make the call
to people who can help, the number I believe is
nine eight eight.

Speaker 4 (06:09):
Nine eight eight is the national suicide hotline.

Speaker 5 (06:11):
We all know nine one one, but nine eight eight
is something good to have in our minds.

Speaker 1 (06:15):
Great. Julie, what are the main concerns that you have
about social media use by kids and teenagers and how
are they different from the concerns that we all have
about social media use and addiction.

Speaker 3 (06:28):
Well, you know, I'm a mom of three kids, and
when my oldest son was born, there were no iPhones
and Mark Zuckerberg worked at a small company called the Facebook,
and so, just like every other parent out there, as
these technologies became available, I was trying to figure out
what's right for my sons. I have three boys, what's safe,
what's appropriate. And by the time my second son was
in pre K, every single person in my neighborhood had

(06:50):
a phone. We were photographing our children, constantly sharing photos
of those kids, And what we didn't think about was
that we were actually training our kids that every single
moment of your life should be photographed and shared. And
now my youngest son, and again he's only five years
younger than my oldest, but he's growing up in the
world of TikTok And even though I don't allow him
to have tiktoks because I'm worried about what they do

(07:12):
to his attention span, I can't stop him now from
seeing tiktoks because his friends send them to him. Instagram has
gone to shorter and shorter reels. YouTube has gone to
shorter and shorter reels, And so in this space of
a very short amount of time, the entire media environment
has changed, and our kids are being exposed to a
volume of content and a style of content that is

(07:35):
really kind of fracturing their ability to concentrate and their
ability to acquire the basic skills that you need to
be socially and emotionally and also academically healthy.

Speaker 1 (07:45):
And who has the ability to do something about that?
Is it the teens or the children themselves, the parents,
the companies, the government.

Speaker 3 (07:53):
So there is responsibility all around. However, I don't think
the responsibility belongs with the children. We do not give
our kids access to alcohol and say well, just don't
drink too much or go to the casino, but make
sure you leave in thirty minutes. No, as a society
of reasonable adults, we have decided that there are certain substances,
there are certain activities, they're not appropriate for young people

(08:15):
whose brains are not fully formed. And that's why we
age gate gambling, That's why we age gate tobacco, alcohol,
all those things. So there is a role for parents
to play, and parents need to be more educated about
what's going on. But the reason I started MAMA, Mothers
Against Media Addiction, was because I began seeing the increase

(08:36):
in suicidal behavior and suicidality in children more than a
decade ago. I began seeing the role of social media
in promoting these unhealthy behaviors. And again, suicide is just
at the far end of the spectrum, in between healthy
child and wanting to die. You have an increase in
people with eating disorders, body dysmorphia, cutting, bullying, all of

(08:58):
these problems. But the real change needs to happen at
the product level, and unfortunately, we've seen that these companies
are not willing to fix the products themselves, and so
that's why lawmakers need to act.

Speaker 1 (09:11):
Well and Matt one of the big parts of that
is the algorithm that you know, as a user, we
don't really think much about or have much to do with,
but it has a lot to do with what we
see and what's being fed to us. How are the
algorithms making the problem worse, especially for young people who
may have mental health issues already.

Speaker 5 (09:29):
Well, we titled the film Can't Look Away, because really,
what the algorithms do is not feed you what you're
looking for, what you want.

Speaker 4 (09:39):
They feed you what you can't look away from. One of the
of the.

Speaker 5 (09:42):
most tragic cases featured in the film is that of
Mason Edens, a boy in Arkansas who died by suicide.
But the way that started was that he had a
bad breakup with a girlfriend, like lots of kids do,
and he went on to TikTok looking for inspirational videos.
But the algorithm doesn't respond to what you're looking for.

(10:04):
The algorithm responds to what you linger on. So what
does a sad, broken hearted teenage boy linger on sad content?
Within three weeks he was being fed content that literally
encouraged suicide, including the line "blow your head off."

Speaker 3 (10:23):
And that video, which is in the
film, is really hard to watch. And you know, I
just want to take a second to say how grateful
I am to Matt and his partner Perry Peltz, because
that film encapsulates in an hour and a half what
every single parent in America needs to understand is being

(10:45):
shown to our children. They're being shown what you can't
look away from. And for that reason, it's addictive, and
we as a whole society are addicted to these things.
I'm addicted too, you know, I'm not pointing any fingers here.
It's affecting me too. But we have to act to
keep our kids safe.

Speaker 1 (11:00):
Well, and speaking of acting, one of the apps we're
talking about here is TikTok, which Congress voted last year
to ban nationwide unless it was sold by its Chinese
parent company, and the TikTok CEO, Shou Chew, testified on
Capitol Hill in twenty twenty three, and he got an
earful from lawmakers. Listen to this interaction with Georgia Republican

(11:22):
Representative Buddy Carter.

Speaker 7 (11:24):
Why is it that TikTok consistently fails to identify and
moderate these kinds of harmful videos.

Speaker 1 (11:33):
Why is it?

Speaker 7 (11:33):
Why is it that you allowed this to go on?
We've already heard, God bless you, from parents who
are here with us, who have lost children. I submit
to you, everybody up here cares about the children of
this country. Tell me, tell me why.

Speaker 8 (11:48):
Why this is a real industry challenge.

Speaker 1 (11:51):
We're not talking about an industry.

Speaker 7 (11:52):
This is TikTok we're talking about, TikTok. We're talking
about why is it that you can't control this?

Speaker 1 (12:00):
And by the way, President Trump has ordered a delay
of the ban on TikTok multiple times,
which is why it is still operating in the United
States despite the ban being passed by Congress and upheld
by the Supreme Court. We're taking your calls coming up
on the Middle. This is the Middle. I'm Jeremy Hobson.

(12:21):
If you're just tuning in, the Middle is a national
call in show. We're focused on elevating voices from the
middle geographically, politically, and philosophically, or maybe you just want
to meet in the middle. And this hour we're asking
you how dangerous is social media for kids and teenagers?
The number to call is eight four four four Middle.
That's eight four four four six four three three five three.
You can also write to us at Listen to the
Middle dot com or on social media. Sorry, we're asking

(12:44):
people to go to social media while we're talking about
the dangers of it. But this is the world we
live in right now. I'm joined by Mothers Against Media
Addiction founder Julie Scelfo and Matt O'Neill, co-director of
the new film Can't Look Away: The Case Against Social Media.
Let's get to Jason, who's in Mechanicsburg, Pennsylvania. Jason, what
do you think about the dangers of social media for
kids and teens?

Speaker 9 (13:04):
Oh, it's a huge growing problem. And I was a
teacher for ten years before I resigned back in twenty
fifteen, and I saw this becoming a growing
trend then, as these platforms were becoming more popular at
the time, and we were trying to figure out a

(13:26):
way to control this in the schools at that time,
and there was pushback from parents and kids themselves,
to where we've even had to allow phones to be available
to them during school hours, whereas when I was
going through school that was unheard of because the technology

(13:49):
wasn't there. So I looked at it as a tool,
and how do we use this tool to our advantage?
And many times it's being used to our disadvantage because we
don't know how to properly use this tool. And that's
been a huge concern for many people. It doesn't matter

(14:09):
whether they're teens or adults. I see many adults just
as addicted as teens are.

Speaker 4 (14:16):
Well.

Speaker 1 (14:16):
And I wonder, Jason, when it comes to the pressure
to keep the phones in the schools, which is a
part of this conversation, do you think that it came
more from the kids or from their parents who want
to be able to reach them.

Speaker 9 (14:30):
Initially it was from the kids, and then it became
more from the parents who wanted to reach them. And
as the unfortunate situations of school shootings and stuff like that,
there was that immediate access of wanting to make sure
their kid was safe, and that was that assurance. If
they're able to answer the phone, then they must be safe.

(14:51):
But that can also be an unsafe situation because now
the phone's ringing and the person is wondering where that's
coming from.

Speaker 1 (15:00):
Yeah, thank you so much for that call. Julie, what
do you think about that. It's interesting to hear about
the sort of dilemma that people have between wanting to
use a phone for safety and also the dangers of
using the apps on that phone as a kid or
as a teenager.

Speaker 3 (15:14):
I'm so glad Jason called in because that concern that
he had as a teacher is actually one of the
reasons I started Mama, because over and over and over again,
we are hearing from teachers and principals about how the
presence of smartphones at school is disrupting education. It is
causing all kinds of distractions. We hear not only from

(15:34):
teachers but from social workers that they go into classrooms
and can see children openly watching Netflix, playing video games, gambling,
and even worse, sometimes sharing pornographic images. And now with AI,
you can even create a deep fake. And there was
a situation in Pennsylvania where students in a class took
a photograph of teachers, ran it through an AI program

(15:57):
and "nudified" the teachers, which led the teachers to resign.
I mean, how can they be expected to work in
that environment. So we do a lot of parent education
because one of the core parts of our mission is
getting smartphones away for the day at school, and that
means that students will have the opportunity to have seven
solid hours to be present and learn, to build real in-person

(16:18):
relationships with their teachers, real in-person socializing with their peers,
because we know those are so critical to being emotionally healthy.
And we also know from many studies that even the
mere presence of a smartphone, even if you're not looking
at it, leads to distraction and lower reading and math scores.
So right now, unfortunately, we are seeing the lowest performance

(16:39):
of reading and math by our thirteen year olds since
we started measuring back in nineteen seventy one. So I
think if we want to get our students back on track,
we know we have to get these phones out of
the class and let teachers do their jobs.

Speaker 1 (16:53):
And a lot of states that are led by both
Republicans and Democrats are starting to do that, which is interesting.
Let's go to Justice, who's in Baltimore, Maryland. Welcome to
the middle. Go ahead with your thoughts.

Speaker 10 (17:05):
Hi, I have a lot of thoughts. I am twenty two.
I am non-binary, and I grew up on
the Internet.

Speaker 1 (17:16):
It was.

Speaker 10 (17:17):
It's a double-edged sword in a sense, because the
Internet allowed me easy access to the information
to find who I really was. But
by the same token, that easier access to that sense
of community also led to me talking to a
lot of people I really shouldn't have been talking to

(17:38):
that took advantage of me and a lot of my friends.
One story that really really sticks out to me was
that one of my now closest friends was encouraged to
lie about their age, to say that they were four
years older than they currently were when they were like
nine and ten. And that's horrifying just to like meet

(18:00):
the expectations of adults or the other people in
that sort of community.

Speaker 1 (18:10):
So, Justice, when you hear us talking about this and
the dangers of social media for kids and teenagers, what
do you think about that? Because it sounds like you're
saying there have been some pros and some cons for you,
is there some, you know,
solution that you would say, this is what we ought
to be doing to make it better in all kinds
of ways?

Speaker 10 (18:30):
I don't have a slam dunk solution. My mind immediately
jumps to it would be great if there was some
level of supervision. But with that also comes like a
possibility for really controlling parents to control what their kids see,
which again could be used to limit access to figuring
out who you really are, to try and fit into
a sort of mold. And it's a catch

(18:53):
twenty two.

Speaker 11 (18:53):
I suppose.

Speaker 10 (18:55):
I'm not sure what else, honestly. I know a lot
of people are trying to propose, like, ID
verification, and there are some privacy concerns with that
that I'm not too knowledgeable on.

Speaker 1 (19:08):
Yeah, well, thank you so much for calling in. I
really appreciate it, Matt. You know, I'm glad we got
that call for a number of reasons, but one of
them is there are benefits to social media, and I
don't want to, you know, have people think, oh, we're
just attacking social media. We use it, a lot of
people get a lot out of it, but it does
have dangers as well.

Speaker 5 (19:28):
Jeremy, I think we promoted our appearance on the Middle
on social media, because social media can be a good thing.
It breaks my heart to hear about Justice's bad experiences,
and the bad experiences of their friends, but it is also
a place to connect and to learn and to make
important social connections. I think the big-picture point is

(19:51):
that there needs to be some sort of accountability. What
the mechanism for that is needs to be figured out.
But right now, because of a law from nineteen ninety six,
the Communications Decency Act, and one
section called Section two thirty that your listeners may have
heard of, the social media companies essentially can't be sued.
And I think that if someone on Instagram or someone

(20:15):
at TikTok can figure out that I want a lawnmower
before I know that I want to cut my grass,
or that I want to think about getting hair implants,
maybe you're getting the same ads that I do, Jeremy. Like,
they can also figure out how to protect our children
from content that encourages suicide, right Like, there's a logic
here that somehow the brightest minds that are being paid

(20:38):
the biggest dollars can have the tools to keep children
safe and still keep social media a place where people
can connect and interact and make meaningful progress in their lives.

Speaker 1 (20:51):
And that, by the way, goes back to
the algorithm. I was just going to say, in nineteen
ninety six, when that law was written,
if you had the Internet, you used a twenty eight
hundred baud modem to dial up, which probably our twenty two
year old listeners definitely are not aware of what that
was like. Elizabeth is calling from northern Indiana. Elizabeth, welcome

(21:15):
to the middle. Go ahead with your thoughts.

Speaker 10 (21:18):
Hi.

Speaker 12 (21:19):
As a teenager myself who spends a lot of time
on social media, I truly, truly, truly believe that social
media addiction is real and it should be treated as
other addictions. I was thinking about this the other day
and how there are rehabs for drug addicts and people
who drink a lot of alcohol. I feel

(21:45):
like there should be something available for teens, especially as
like a social media rehab where they can go and
interact in person without the worry of being on their
phone or missing out on anything.

Speaker 1 (22:00):
When did you start using social media and do you
think you're addicted to it right now? Elizabeth?

Speaker 12 (22:07):
I do think I am aware of my addiction
and I'm trying to fix it. I have screen time
on my phone that I asked my parents to set up.
I have friends that do spend a lot of time
on their phone, and I don't think they realize it.
But I started using social media when I was thirteen

(22:29):
or fourteen, and I'm seventeen now, so that's three, yeah,
that's five years, I think.

Speaker 1 (22:37):
And let me ask you one more thing. What
do you think is the like, the most dangerous thing
to you about your use of social media? Is it
just the amount of time that you're spending on it,
or is it are there things that you come across
that you think are actually dangerous to your health.

Speaker 12 (22:54):
I think a lot of it is I'm just spending a
lot of time away from family and friends and making
meaningful connections in my own personal life that's not through
a portable device. But I do see a lot of
influencers pushing, especially with body image, especially for

(23:16):
girls, like a skinny,
model-off-duty vibe, where you have to look
good all the time, and I feel like that's very unrealistic.

Speaker 1 (23:30):
Yeah, well, Elizabeth, I really appreciate your call. Thank you
so much for listening and for calling in. Julie, so
many interesting points there. I told you we were going
to get young callers. We always do. But it's so
nice.

Speaker 3 (23:42):
I mean, look, out of the mouths of babes, right?
She's recognizing at such a young age that this is
an addiction. At MAMA, our website is We Are MAMA
dot org. We have a series called the Expert Insights
Series where I speak with different experts about different facets
of this problem. And we had doctor Jim Winston on
who is an addiction expert, and he explained that the

(24:05):
way this works in the brain is exactly the same
as the way any other type of addictive substance is
in the brain. And so she's right, you go through
withdrawals when you don't have access to it. And one
study of young people that took them away from devices
where they sort of spent time in the woods away from screens,

(24:26):
found it took four or five days for young people
to be able to go back to normal recognition of
facial expressions and normal eye-to-eye human contact.
And so, you know, you asked her a great question,
what she thought is the most dangerous part of social media,
and I can't pick one part. I think the, you

(24:47):
know, examples Matt gave earlier about the extreme things like
fentanyl sales or sextortion. That's incredibly harmful. But another
incredibly harmful aspect of it is that social media blends everything.
It's giving you photos of your friends at a birthday party,
cat videos, and then you'll suddenly see a video in
which someone's hit by a car, or suddenly there's a

(25:08):
video of someone talking about a death, and so as
a human being, it takes you on this emotional roller
coaster that's so disturbing. And at least as an adult,
I had, you know, twenty five years, or more than
that, of growing up without that kind of what they
now call brain rot. But for young people, I mean,
my own kids, say hey, this is normal, and guess what,

(25:30):
it's not normal. It's not normal to be exposed to
somebody dying or some horrible accident and then in the
next second see a cat video.

Speaker 1 (25:37):
Or these days, an
AI-generated video of Judge Judy as a baby that's
showing up on mine for some reason. Let's check what's
coming in online at listened to the Middle dot com.
King's son in Los Angeles says, as someone who lived
through his teens in the social media boom, I would
argue that social media is harmful to people of all ages,
not just kids and teens. I'm not saying that social

(25:59):
media can't be a great learning tool for people,
but I feel it's mostly used as a feed for
people to compare themselves against others. And Shelby in Philadelphia
says social media is a commodity that has been made purposely
addictive for profit. Just as cigarette companies were found liable
for making cigarettes addictive, social media companies should be liable.
Let's get to another call from Larissa, who is in Aurora, Illinois. Larissa,

(26:21):
go ahead with your thoughts.

Speaker 13 (26:23):
I'm hearing everybody give such negative input on social media,
and I'm very much, I have a teenager who is
going to be sixteen next week, and I have an
eighteen year old, so as a parent, I'm right in
the thick of it, and I am so grateful that

(26:45):
they have had social media, particularly through the pandemic. They've
been able to keep in touch with their friends, which is
so big for me. My daughter is neurodiverse and
has a very difficult time making friends at school, but

(27:07):
she has this incredibly strong network of friends because you know,
the kids have formed this group and then they've moved
away from each other and they're able to stay in contact,
which is just so important, that we have, you know,

(27:28):
people that we trust and have history with.

Speaker 1 (27:33):
And yeah, do you have any concerns though about your
kid's use of social media?

Speaker 13 (27:39):
Not really. I have a really good relationship with both
of my children, and you know, it's the same
for everybody. If you're sitting on your phone or sitting
on a tablet and doom scrolling, that's not healthy. I
don't care what age you are, but you know, if

(28:00):
you're using it smartly, it is a tool. It's a
tool to connect with people. It's why it's called social media.
And the fact that there's bad content on there, there's
bad stuff in the grocery store. You need to be
aware of what you're consuming.

Speaker 1 (28:16):
Larissa, thank you very much for that. Matt, what do
you think about that a parent who says social media
has been pretty good for her kids.

Speaker 5 (28:24):
I think that social media can be a positive force
in children's lives. It comes back to a little bit
of what we were talking about before, which is how
do we emphasize the positive and eliminate the negative. And
when I say the negative, I don't mean again this
vague sense of it's bad or distracting or even leads

(28:45):
to bad educational outcomes. I'm talking about the negative that
leads to death, addiction, and real life harms. So social
media is not a binary, a simple on-off, good-bad.
It's the same thing like so many aspects of what
we're wrestling with today with technology. There is, to steal

(29:08):
the title of your show, Jeremy, a middle, right,
and there's a middle ground that we have to meet at.

Speaker 4 (29:13):
And when I say we, I mean.

Speaker 5 (29:15):
Parents, the tech companies, legislators like everyone has to come
around about what's safe for children.

Speaker 1 (29:23):
Yes, and we're going to get to that. And the
fact that in the case of the issue we're talking
about here, there actually is more bipartisan agreement on this
than you would maybe expect. But we're not the only
country that is dealing with these issues. One country
is trying to raise the legal age for using the platforms.
Australia passed a ban last year for anyone under age

(29:44):
sixteen, compared to thirteen here in the US, and it's supposed to
go into effect later this year. Here's the Prime Minister
Anthony Albanese.

Speaker 14 (29:51):
We want them to have real experiences with real people
because we know that social media is causing social harm,
it is a scourge. We know that there are mental health consequences
for what many of the young people have had to
deal with, the bullying that can occur online, the

(30:15):
access to material which causes social harm.

Speaker 1 (30:19):
And one of the things they're trying to figure out
in Australia is how best to do age verification. More
of your call is coming up on the Middle. This
is the Middle. I'm Jeremy Hobson. In this hour, we're
asking how dangerous social media is for kids and teenagers.
You can call us at eight four four four Middle.
That's eight four four four six four three three five three.
You can also reach out to us at Listen to

(30:40):
the Middle dot com. I'm joined by filmmaker Matt O'Neill,
co-director of the documentary Can't Look Away: The
Case Against Social Media, and Mothers Against Media Addiction founder
Julie Scelfo. And before we get back to the phones,
we heard about what's happening in Australia. What do you
think about raising the age Julie for using social media

(31:01):
to sixteen. Is that feasible?

Speaker 3 (31:04):
It's certainly feasible, and there are some experts who
say they think it would be better to raise it
to eighteen, including some people who work for these companies.
You know, when you talk to folks who work for
these tech companies, they're the ones who don't let their
kids get phones at all. They're the ones who send
their kids to Waldorf schools where there's no electronics because

(31:25):
they understand the risks. The last mom who called in
who talked about the benefits of social media for her
kids said something I think was really important, which was
that her child had made a group of friends and
they used social media to stay in touch, so they
had a pre existing in person relationship and then use
social media to connect. I think that's a very different

(31:46):
thing than when young people go online and meet people
online who they've never actually met in person, because you
can pretend to be somebody else on social media, and
too often we're finding out that kids make friends with
someone who turns out to be an adult or not
who they say they are.

Speaker 1 (32:02):
We've heard about that this hour. Yeah, exactly. Matt, we
talked about the fact that there actually is some bipartisan
agreement on Capitol Hill that social media is dangerous for
young people. Do you think that the lawmakers are
going to have the backbone necessary to do anything about it?
And I say that because we see that they're not
stopping President Trump from delaying this TikTok ban, even though

(32:23):
they all voted for it.

Speaker 5 (32:26):
It's incredible. The TikTok ban in particular, when you
think about it, was voted into law by both chambers
of Congress, bipartisan support, upheld in the Supreme Court, signed
into law, and now the Justice Department is not enforcing it.
But when we look at the Kids Online Safety Act,
which is the major piece of legislation federally that is

(32:48):
meant to protect children online. It passed the Senate last
year with ninety three votes. I mean, how many things
passed the Senate this divided Senate with it any near
that number of votes. And so I think we have
to question where that backbone actually exists, because there is
an enormous amount of lip service and rightfully incredible sympathy

(33:13):
for the parents who are courageously out there fighting to
bring more and more attention to what happened to their children.
So it doesn't happen to other children. But the job
is not getting done. President Biden last
year promised to sign COSA into law. Speaker Mike Johnson
did not bring it to the floor for a vote.
COSA is introduced again this year with bipartisan support in

(33:36):
the Senate and in the House. It may not be
a perfect law, but it is a law that is
moving things in a right direction to try to protect
our children. So it's a question I think that your audience,
we all need to ask our representatives, why can't this happen?
If you think this is a good idea, if you
think our kids need to be safe online, why isn't
this moving forward?

Speaker 1 (33:57):
Priscilla is calling us from Dallas, Texas. Priscilla, welcome to
the middle. What do you think?

Speaker 2 (34:03):
Hi?

Speaker 15 (34:04):
I just wanted to call in and weigh in with my
thoughts, because I believe, while there is an issue with
self harm on social media when it comes to children,
there's also an issue of radicalization that I see in
young boys specifically, and I feel as though it's equally
as dangerous to allow children on social media who

(34:26):
are able to see these, you know, very political, politicized
radical videos.

Speaker 9 (34:33):
And posts.

Speaker 1 (34:36):
What are you thinking of, Priscilla, when you talk about
radicalizing young boys.

Speaker 15 (34:42):
When I think about that, I think about
some social media influencers, such as a recent example,
Andrew Tate. I did not expect so many young men,
at least in my life. And when I say young men,
I mean children that are between the ages of ten
to fourteen that were, you know, repeating a lot of

(35:03):
the talking points that I would see in some of
his videos. And it's something that, even though I would
be able to avoid it myself on social media, you know,
I'm twenty three, I would be able
to hear it once again in real life in person
from them. They're repeating these same points, and it's getting
to a point where it's like, you know, sometimes I

(35:23):
believe they don't understand fully what they're saying, and yet
they're having all of these posts pushed into their timeline
from the algorithm. And I think that's, you know, a
big problem that we also have when it comes to
social media.

Speaker 1 (35:39):
Yeah, Priscilla, thank you very much for that call. Julie,
your thoughts on that.

Speaker 3 (35:43):
You know, if we took our children to see a
G rated movie in the movie theater and before the
movie started, there was a preview for an R or
an X rated film. No parent in America would tolerate that.
And yet if your child signs up for an account,
indicating that they're thirteen years old, the algorithm can
still promote content to them that has no place in

(36:06):
a child's world. And that's why these products have to
be changed. It's so easy to forget that these products
are made this way by design. There are human beings
at these companies who are designing algorithms to again promote
whatever you can't look away from. And what we can't
look away from is whatever's most extreme, what's most violent,

(36:29):
what's most prurient. You know, we're human beings. We can't
help but look at this stuff. And so unfortunately, you know,
the caller is exactly right. We're hearing over and
over again about a rise in participation in hate groups,
a rise in this kind of, you know, toxic masculinity. My
sons have come home and said things that my husband
was shocked at and was like, where did you hear that?

(36:50):
That's not a normal thing to say, Where did you
get that idea? And usually after we unpack it and
discuss it, we find out it's something that made its
way through the world through social media. So I think
that's another real harm that we have to think about.

Speaker 1 (37:05):
James is calling us from Fort Collins, Colorado. James, what
do you think about the dangers of social media for
kids and teenagers?

Speaker 8 (37:13):
Hi there. I think a point that might be being
lost here, kind of getting mixed in the weeds,

Speaker 4 (37:18):
is that

Speaker 8 (37:19):
a greater issue here is the commercialization of our data.
Surveillance capitalism preys on knowing everything about us, and that
is often used against us simply because that's how capitalism works,
that's how the market works. So our data is packaged
and sold to the highest bidder, and oftentimes there's not
any oversight or regulation in that. So, you know, we

(37:40):
can have a decent conversation and there's space to talk
about what kinds of content we want to restrict and how.
But I think the greater underlying issue is the commercialization
of data in general. Shoshana Zuboff, who's a professor emerita
at Harvard, wrote an excellent book on this called The
Age of Surveillance Capitalism. Highly

Speaker 1 (37:55):
recommended. Interesting, and a great point. Matt, you know, just
the amount of money involved here for these companies which
are making billions of dollars, and he's right, on our data.
It's an uphill battle trying to go
against them, and you can hear the CEOs when they testify,
you know, that this is an industry-wide problem. I

(38:17):
don't really think there's much we can do about this.
What do you think?

Speaker 5 (38:22):
It's interesting when I hear what you're talking about with surveillance,
capitalism and the data, I hear part of the problem.
I also hear part of the solution, because to go
back to that analogy that one of the callers made earlier,
to the cigarettes, it took not just whistleblowers, not just legislation,

(38:42):
not just hearings to change things. It took a change
in culture. And in nineteen ninety six, when I was
eighteen years old, almost thirty five percent of teenagers had
smoked in the past year, and now it's down to something
like six percent. That's not because they were told not
to smoke, because they don't want to smoke anymore. And
I think as young people begin to understand that they're

(39:05):
not using a product, right, when they're not paying
for these social media platforms, they are the product, right. Their
data is what is used and what has value.

Speaker 4 (39:16):
And kids get hip to that.

Speaker 5 (39:18):
And children don't like being taken advantage of. They don't
like being taken advantage of by big companies, especially. So
part of what I think these callers are talking about is
part of the education campaign that will change this culture,
because young people are not going to participate in a
system that isn't serving them when they have the appropriate

(39:41):
sort of literacy in that space.

Speaker 4 (39:44):
And at some point it won't be cool.

Speaker 5 (39:47):
Just like it's not cool for most kids to use Facebook,
it's not going to be cool to use TikTok or
Snapchat if they see this instead as an aspect of
surveillance capitalism, that's taking advantage of them and just using them.

Speaker 1 (40:03):
Some more email comments coming in at Listen to the
Middle dot com. Mike in Minneapolis says sixteen should be
the minimum age to be able to purchase a smartphone,
just like gambling, just like alcohol, just like getting a
tattoo or getting a piercing. There are age limits. And
Mark in San Antonio says, can't say it enough. Where
are the parents? Be involved with what your kids are doing.

(40:23):
Let's check in here with Brady in Arlington Heights, Illinois. Brady,
go ahead with your thoughts.

Speaker 11 (40:31):
I'm listening to the show. I'm very, very tuned into
this because I have a seven year old boy and
a three year old daughter, and I myself, as
a parent, was born in the seventies, so I'm not from
a generation of being so inundated and bombarded with

(40:55):
so much. I was hearing on the show how everyone is
trying to find the balance, and everybody's trying
to find the good and the bad and the harm
and the good. And I just keep going back to

(41:15):
in my mind, my development, my upbringing.

Speaker 1 (41:22):
I think we lost him there. But, you
know, Julie, one thing he was clearly getting at
there is that we're very different generations.
I said at the very beginning. I grew up in
the sort of generation where we started with no social
media and we ended our you know, our youth with
social media. But how are parents to know what to
do when they didn't actually have these products, you know,

(41:46):
at a young age themselves.

Speaker 3 (41:48):
I mean it's tough, and as a parent, I face
those same questions, and I really appreciate Mark, who emailed
in and said, like, parents, pay attention. And parents do
need to pay attention. And I encourage parents, if your
child has a device, they have a phone, sit down
next to them and see what they're looking at, because
your child can be right next to you and you
think you know what they're doing, but you don't unless

(42:09):
you're looking at it. At the same time, I as
a young mom, signed up all my kids for software
that was supposed to alert me if
they saw something dangerous. And I spent
a week, I hooked up all their accounts, and I
started getting one hundred email alerts a day, one
hundred different emails saying your child has seen something violent,

(42:31):
your child has seen alcohol, your child has heard profanity.
And I realized there's no way any parent can do
this on their own. And all of these apps, all
of these products that say they are parent controls, they
don't work. And you can set them all up and
you think you're doing right by your child, but they
don't always work. And that is why there have to

(42:52):
be safeguards. You know, we don't let people make baby
formula and put wood chips in it just because they
could make a little more profit. There are regulations in place,
there are standards for food, for our vehicles. You know, you
could make a cheaper car if you didn't put seat
belts in it, but we require the seat belts. So
for social media, I think Matt and I have the

(43:12):
same view that these products need to be safeguarded. According
to Meta, and this is Meta's data, you can look
it up on their website. In September, they put out
a press release touting all the things they're doing to
remove suicide and self harm content, and at the bottom
of the press release they acknowledged taking action on twelve
million pieces of suicide and self harm content last year

(43:34):
from Facebook and Instagram. And that was just between April
and June. So a modest estimate is there's forty eight
million separate pieces of content that's very unhealthy going around
these platforms annually. And that doesn't even include the other
kinds of unhealthy content.

Speaker 1 (43:50):
Yeah, let's get to Joshua in Saint Louis. Hi, Joshua,
go ahead with your thoughts.

Speaker 6 (43:56):
Hey, yeah, I had to comment on how earlier you
mentioned with the companies, how they can, you know,
know even before we think about it that it's something we need,
yet they for some reason don't have the capabilities to
protect our children. And one thing I remember reading was
how it's kind of like that with TikTok and everything over there.

(44:18):
You know, they have, of course a lot of censorship
on it, but a lot of it is positive content,
whereas you look in a lot of other places you
can find a lot of hateful content that you know,
of course leads to the clicks. And I remember growing up,
you know, back in two thousand and seven, two thousand
and eight, when the first smartphone came out, I was
growing up with the uh you know, with social media

(44:40):
and kind of seeing how it evolved and everything like that.
And I really do believe they do have, to an
extent, some protections on this content, but I still do
see a lot of promotion for it. Like Reddit, for example.
You know, back in the day, they had a lot
of really bad subreddits of course, with gore and very

(45:02):
not safe for work things. But you know, they quote unquote
took care of it and banned them. But for years
there were, you know, what could be considered hidden
subreddits still with these things like gore and all that,
but they still had hundreds of thousands of people subscribed
to them and following them and reporting them, of course,
and it wasn't until about a month ago that

(45:24):
one of these was banned. Yet I've known about it
for years, and nothing had been done about it. So
it's like, how does it get that far? Yet
you guys don't do anything about it when it's very
obvious that it's there.

Speaker 1 (45:39):
Well, and Joshua, it's a great point. I'm glad you
brought it up, because honestly, Matt, we could do an
entire show on the idea that these companies highlight the
things that make us the angriest, which is a big
part of what we're talking about here, right.

Speaker 5 (45:52):
That's a whole different set of problems with social media,
right in terms of how they amplify the most extreme emotions,
usually negative ones. What we see, in effect, with
children is this depressive content and this negative content that
can transform them emotionally. That's for the next show, Jeremy,

(46:14):
because there's more than one problem with social media.

Speaker 1 (46:17):
Well, and as we close out this hour, I'm just
going to briefly ask each of you, you know, let's
try to end on a high note. Here. I'll start
with you, Matt after putting together your film, do you
have hope that things are going to get better when
it comes to how young people interact with social media?

Speaker 5 (46:32):
One hundred percent yes, because I'm inspired by the parents
you see in the film who lost their children, who
are working to change policy or working to change the
way these social media companies work. These parents. Just two
weeks ago, with that big beautiful bill, they fought day
and night to make sure that an AI resolution that

(46:52):
was going to take away the ability of states to
regulate artificial intelligence was removed from that bill. And that
was because of citizen based activism, parents who are changing
the world coming out of that tragedy. So positive change
is possible. Take a stand and do something.

Speaker 1 (47:09):
And Julie, what about you? Do you think, are
you optimistic that people are taking this issue more seriously
and things will change.

Speaker 3 (47:16):
I am optimistic because when I started MAMA, I didn't know,
you know, whether it would work, whether people would want
to join me. We're not even two years old yet
and we are already up to thirty five chapters in
twenty two states. I'm optimistic because we see lawmakers all
over the country, red and blue states, taking action in Missouri, Nebraska, Arkansas.

(47:39):
They've enacted phone-free school bans. Nebraska and Minnesota passed
Kids Code legislation, which is legislation that will require social
media companies to make their products safe for kids. And
you know, a new Pew study that just came out
said seventy four percent of adults support banning phones in schools.

(48:00):
So yeah, I'm optimistic, but it is going to take
a lot of work because these tech companies have a
lot of money and they lobby hard to prevent any safeguards.

Speaker 1 (48:09):
Right. Julie Scelfo, founder and executive director of Mothers Against
Media Addiction, and Matt O'Neill, co-director of the new
documentary Can't Look Away: The Case Against Social Media, which
you can stream on Jolt. Thanks to both of you
for joining us.

Speaker 4 (48:20):
Thank you, Jeremy. Fun to be here.

Speaker 3 (48:23):
Thanks for having me.

Speaker 1 (48:24):
A reminder, the Middle is available as a podcast in
partnership with iHeart Podcasts, on the iHeart app or wherever
you listen to podcasts, and coming into your feed in
the next few days, we're going to be doing an
extra episode of One Thing Trump Did about the surging
number of measles cases in the United States, and we'll
be back here next week on the Middle. We'll be
talking about what can actually be done to reduce the

(48:44):
national debt. As always, you can call in at eight
four four four Middle. That's eight four four four six
four three three five three. The Middle is brought to
you by Longnook Media, distributed by Illinois Public Media in
Urbana, Illinois, and produced by Harrison Patino, Danny Alexander, Sam Burmas-Dawes,
John Barth, Anakwa Dwamena, and Brandon Condritz. Our technical director is
Steve Morck. I'm Jeremy Hobson, and I will talk to
you next week.