December 11, 2025 54 mins

Journalist and founder of MAMA (Mothers Against Media Addiction), Julie Scelfo, joins the Chuck ToddCast for a wide-ranging conversation on one of the most urgent issues facing American families: the collapse of trust in media, the rise of unregulated tech, and its profound impact on children’s mental health. Scelfo explains how the explosion of social media and smartphones—coupled with virtually nonexistent regulation—has left kids exposed to harmful algorithms, addictive design, and misinformation at formative ages. With both the left and right now pushing to get smartphones out of classrooms, she argues that we’re long past due for meaningful guardrails, especially as Big Tech’s lobbying power grows and federal leaders threaten to block states from regulating AI. Chuck and Julie compare today’s fight with Big Tech to the battle against Big Tobacco, explore why recommendation systems effectively make platforms publishers, and discuss the growing bipartisan appetite for banning iPads in schools and returning to books.

They also dig into MAMA’s mission to protect children in a digital world—emphasizing that kids under two should not be exposed to screens, that early childhood development depends on real-world interaction, and that “tech nannies” have emerged because childcare is unaffordable. Scelfo calls for mandatory data transparency from tech companies, stronger child-safety legislation in the states, and a culture shift that prioritizes healthy offline development over the race to dominate AI. With future jobs uncertain and social skills declining, she argues that society must resist sacrificing sensible regulation in the name of innovation—and recognize that too much time online isn’t just unhealthy, it’s shaping a generation.

Get your wardrobe sorted and your gift list handled with Quince. Don't wait! Go to https://Quince.com/CHUCK for free shipping on your order and 365-day returns. Now available in Canada, too!

Go to https://getsoul.com & enter code TODDCAST for 30% off your first order.

Got injured in an accident? You could be one click away from a claim worth millions. Just visit https://www.forthepeople.com/TODDCAST to start your claim now with Morgan & Morgan without leaving your couch. Remember, it's free unless you win!

Protect your family with life insurance from Ethos. Get up to $3 million in coverage in as little as 10 minutes at https://ethos.com/chuck. Application times may vary. Rates may vary.

Timeline: 

(Timestamps may vary based on advertisements)

00:00 Julie Scelfo joins the Chuck ToddCast

03:00 Erosion of trust in media & news is a massive problem

04:00 Mental health decline in youth correlated with rise of social media

05:00 Both left & right want to get smartphones out of classrooms

05:45 Lack of social media regulation leaves kids vulnerable 

07:00 Regulation is difficult when big tech has unlimited money to lobby

09:00 Threats from Congress & Trump to prevent states from regulating AI

10:30 Executive order from Trump may be last gasp to avoid AI regulation

12:30 AI has been positive for shareholders & owners, not for the public

13:45 What lessons can be learned from the fight against big tobacco?

16:15 Recommendation algorithms turn platforms into publishers

17:45 Advertiser-supported speech is different from First Amendment speech

18:30 Broadcast networks are liable for misleading ads, social platforms aren’t 

20:00 Momentum building to ban iPads in schools and make kids use books

21:30 MAMA’s mission and goals

23:00 Children under 2 shouldn’t be exposed to screens for entertainment

25:30 Kids know how to find information, but must be taught to filter it

26:30 Most educational building blocks are built during early childhood

27:15 We can’t sacrifice sensible regulation in order to win AI race

28:30 Tech leaders all have very inept and awkward social skills

31:00 Tech must be required to release data for researchers to study

32:30 How to prepare kids for future jobs that may be replaced by AI?

34:00 Real life interactions are critical for a healthy childhood

35:45 We will always need trades, skilled labor and care workers

37:30 What are some near-term activities MAMA is working on?

40:00 States are introducing quality child safety legislation on tech

40:30 There is bipartisan support in Congress for regulation

42:00 Studies worldwide consistently find tech at a young age harmful

43:30 We have “tech nannies” because childcare is unaffordable


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
It's sponsorship time. But you know what, it's really great
when you get a sponsor that you already use. And
guess what: Quince is something that in the Todd household
we already go to. Why do we go to Quince?
Because it's a place you go where you can get
some really nice clothes without the really expensive prices. And
one of the things I've been going through is I've
transitioned from being mister coat and tie guy to wanting

(00:22):
a little more casual but to look nice doing it.
I've become mister quarter zip guy. Well, guess what?
Guess who's got amazing amounts of quarter zips? It is Quince.
I have gotten quite a few already from there. The
stuff's really nice. They have Mongolian cashmere sweaters for fifty dollars.
I just know, hey, cashmere, that's pretty good. You don't

(00:43):
normally get that for fifty bucks or less. Italian wool
coats that look and feel like designer stuff. I'll
be honest, right, you look at it online, you think, okay,
is this really as nice as it looks? Well, when
I got it, I was like, oh, this is real quality.
So yeah, I'm going to end up making sure I
take it to my dry cleaner so I don't screw
it up when I clean it. But I've been quite impressed.
And hey, it's holiday season. It is impossible to shop

(01:06):
for us middle aged men. I know this well. Tell
your kids, tell your spouses, tell your partners. Try Quince.
Or if you're trying to figure out what to get
your adult child, what to get your mom or dad,
I'm telling you you're going to find something that is
going to be comfortable for them on Quince. So get
your wardrobe sorted and your gift list handled with Quince.

(01:26):
Don't wait. Go to quince dot com slash chuck for
free shipping on your order and three hundred and sixty
five day returns now available in Canada as well. That's
quince dot com slash chuck, free shipping and three
hundred and sixty five day returns. Quince dot com slash chuck.
Use that code. So joining me today, the founder of something

(01:54):
called MAMA. That's the acronym. It is Mothers Against Media Addiction.
The founder is Julia, excuse me, Julie Scelfo. She's a
former New York Times reporter has become a bit of
I think a media ecologist of sorts. And look, as

(02:15):
many of you know, many of my listeners here know
that trying to figure out how we rebuild trust in
the information ecosystem, how can we clean up the information ecosystem?
And one of the more remarkable things in this polarized
environment is that what's been interesting is that the only
time we can find areas of agreement is when it
is about protecting our kids online, on social media. And so

(02:37):
as we try to figure out how big tech is
going to build AI and will they do it safely
or not? Given the experience with social media, I figured
this would be a really good conversation because I want
to learn more about MAMA. So, Julie, welcome to the podcast.

Speaker 2 (02:56):
Hi Chuck, thank you so much for having me. It's
a pleasure to be here.

Speaker 1 (03:00):
Let's start with, you know, look, I
just got out of an hour long meeting
with a nonprofit group I'm involved with called Trust in Media,
where it's just constantly like, what can we do
to repair and rebuild trust in all sorts of institutions? Right,
sort of the information ecosystem that was sort of at

(03:22):
the heart of it. What's interesting about this
organization that I'm working with is that it includes folks
in the national security space and folks in the business
space and in health and in sports, as well as
news and politics. And you're tackling this from a from
a youth space, and as we transition from fearing social

(03:44):
media to fearing AI. This is I think a pretty
relevant conversation. So I want to start with what motivated
you to do this? I think it's obvious in some cases,
but are you surprised at how quickly we actually have come
together on the issue of, at least, phones in schools.

Speaker 2 (04:07):
Well, okay, so there were like three questions.

Speaker 1 (04:10):
And that's what I did. Maybe that's why it's a podcast.

Speaker 2 (04:13):
I'm gonna work backwards. So I was
moved to start Mama for a couple reasons.

Speaker 1 (04:22):
One is that when did you start it?

Speaker 2 (04:24):
By the way, so I incorporated it in twenty twenty three.
We only announced ourselves publicly early last year, so we're
not even two years old yet. But a few years
before that, I gave a TED talk because I was
so deeply concerned about erosion of trust and information and

(04:45):
what was happening in our media environments, and having spent
my career in professional newsrooms, I was at the New
York Times and before that, I was at Newsweek. I
sort of watched the rise of digital media and how
the legacy news organizations responded to it. And what was
very clear to me, what was very unfortunate is that

(05:06):
even though journalists are really good at understanding and vetting
information quality, they weren't trained in media literacy, and they
didn't understand how what they published and what they
communicated online and on television
was helping shape the information environment. And so you know,

(05:28):
the question about regaining trust is a big one, and
I hope we can talk about it more later in
the show. But specifically for Mama, I was reporting a
lot on youth mental health and suicide. And it was
about a decade ago that we saw a terrible increase
in suicide rates among American adolescents, and I reported on

(05:49):
that and it was manifestly obvious that social media was
at the heart of that problem. And then, as sad
as that story was, we saw suicide rates go up
not just in teens, but in tweens, which is children
as young as nine or ten. Now I'm a mom.
I have three sons, and reporting that story really shook

(06:10):
me to my core. When you have nine year olds,
ten year old children who want to die, something is
profoundly messed up. Because I run MAMA, I won't use
the crude language that's really in my head about
how messed up that is. But I realized something had
to be done. And what has to be done is
that this information environment has to be brought under control

(06:33):
so that it's safe for children.

Speaker 1 (06:35):
And that's you know, I look at this movement right
of what we've seen about trying to at least take
bones out of classrooms, and you know, when I see
how hard it is to get the left and the
right to agree on anything, and it is the one thing,
whether it's a super liberal state legislature, super conservative state legislature,

(06:56):
this stuff has made it through. This is the one
place it has made it through. I fear we're too
late on social media, but maybe this gets us there
on AI.

Speaker 2 (07:07):
So I don't think we're too late on social media
because every day a child is growing up and being
exposed to things through these platforms, and every day that
we allow a handful of companies to share whatever they
want with whoever they want, under any circumstances, with no regulation,

(07:30):
is a day that we are leaving our children vulnerable
to terrible harms. When I was reporting my stories, we
didn't have data about the amount of suicide and self
harm content. Meta released data on it just last year,
and according to Meta's own report, there were forty eight
million separate pieces of suicide and self harm content on

(07:53):
their platforms in the previous year. And that's just the
pieces that they'll acknowledge. There's probably a lot more than that. So,
you know, as long as we've had mass media, beginning
in the nineteen thirties with radio, our government has regulated it.
It has said there need to be limits, there need
to be standards. It's not censoring free speech. It's just

(08:13):
recognizing that not all content is appropriate in all places
for all audiences. I don't think we want a world,
you know, if we allowed every single coarse, vulgar, X
rated activity on our regular television channels and on the radio,
you know what kind of world would we have. So

(08:33):
I'm not.

Speaker 1 (08:33):
Except that world does exist. It's the Internet.
I mean, we've let this happen on the Internet, and
nobody wanted this, right, Like, we know we don't like this,
but we can't agree on how to stop this.

Speaker 2 (08:48):
Well, I think there is actually pretty wide consensus on
how to stop it. I think we also are just
facing an industry that has unlimited amounts of money and
they are spending ungodly amounts of money on lobbying.
Our friends at an organization called Issue One have been
tracking this and they have found that the big

(09:11):
tech industry has one lobbyist for every two lawmakers in Washington,
and Meta alone has one lobbyist for every seven lawmakers.
So you know what's happening is, even though there's wide
agreement among most lawmakers and most parents and most citizens,

(09:31):
you have big tech spending ungodly and unholy amounts
of money and getting just a few people who are
really messing up legislation. I mean, one example is the
Kids Online Safety Act, which passed last year in the
Senate by a vote of ninety one to three, which like,
when does that happen? And over in the House, Speaker
Johnson refused to bring it for a vote. Steve Scalise

(09:53):
says it wasn't good legislation. And nobody could understand why.
And then it was announced that Meta is building a
twenty eight billion dollar AI data processing plant in the
state which you know, I don't know. Does that have
something to.

Speaker 1 (10:06):
Do with it? Maybe. In the state of, I assume
you mean, the state of Louisiana.

Speaker 2 (10:11):
The State of Louisiana.

Speaker 1 (10:12):
Yeah, Johnson and Scalise, Yeah.

Speaker 2 (10:14):
No.

Speaker 1 (10:14):
And we're seeing the same thing with AI. I mean,
take the issue of this moratorium on states
being able to regulate AI, which I do think has
become too toxic to support. I
think we'll find out.

Speaker 2 (10:33):
Right, I am, well, I hope you're right.

Speaker 1 (10:35):
Yeah, I mean we're going to find out. And maybe,
unfortunately, this gets... I can't tell you how many pieces of
really harmful legislation get snuck in in the month of December,
historically, because everybody's in holiday mode. Things just
can sometimes get through. So I think we are in the
month of December when we're taping this and when this

(10:55):
is when people are listening to this, So there's always
a chance this is something that gets snuck in. But
it does seem as if there's enough opposition. But I
know you're working on this.

Speaker 2 (11:07):
I mean, our members have been sending thousands of letters.
And for those of your listeners who aren't familiar with this,
the tech industry is trying to get this sweetheart deal
passed where they would pass a federal law saying there
can be no state regulation of AI and this is
such a profound violation of states' rights and their fundamental

(11:29):
ability to keep their citizens safe that we had forty
four I think attorneys general send a letter to the
Senate when this was being considered earlier this year in
the Big Beautiful Bill, saying this is outrageous. You can't
do this. Hundreds of lawmakers of both parties said you
can't do this. And it came out of the Big
Beautiful Bill. Now they're trying to sneak it back into
the Defense Reauthorization Act. And we're also hearing there's the

(11:53):
possibility of an executive order. So you know, the way
the draft was worded, it's so vague. It would not
only forbid states from regulating AI, it would prohibit them from
regulating social media. So nobody wants this, and it would
just be a real boon to a handful of billionaires
who own these companies.

Speaker 1 (12:10):
Well, I'm glad you brought up the executive order, because
it does seem as if that was that was going
to be the tech community's last resort, and that does
look like that's going to happen, doesn't it.

Speaker 2 (12:21):
That's right, you know, I hope it doesn't. I do
think even if it does happen, it's so problematic that
it's unlikely to be enforced right away.

Speaker 1 (12:30):
But I mean, by the way,
I don't think it's enforceable, and I think there'll be
some states that basically say, let's go, we're going
to do some regulating. Go ahead and
try to stop us.

Speaker 2 (12:40):
Yeah, yeah, I mean, how like, in what other industry
do we say, Okay, you can just do whatever you want.
You know, we regulate our food to make sure it's safe.
We regulate our vehicles to make sure they're safe. We
don't want to live in a world where companies are
not required to test cars and make sure the brakes
work before you get them.

Speaker 1 (12:58):
Well, this gets at something that I think is one
of those, you know. One of the ways the
advocates of low regulation right now in AI, one of
the arguments they make is, hey, we did this with
the Internet. We had very little regulation, and my retort
always is, and how did that go? Right? Like, we

(13:19):
decided to have a hands off approach on social media
and that turned out to be a colossal mistake. Now
I happen to believe social media. I mean, here's the problem.
I think this has been a disaster. I think it
something that destroys the information ecosystem, that destroys trust, that
breaks up families. This is not a successful business, and

(13:41):
yet we want to create the same regulatory environment to
allow AI to thrive. And we think that's going to
be good because of the experience with social media. And
I do think this is why there's more public and
there's more bipartisan pushback on this because I think if
you frame the question of do you want the tech

(14:02):
companies to use this same lack of
rules of the game to build AI as they built
social media? Do you trust the same people that built
social media to build AI? I think the answer is no.

Speaker 2 (14:15):
Well, Chuck, it's not a successful business for families, for society,
for democracy, for children, but it is for shareholders and
for a handful of people who own these companies. And
what we're seeing, is this really unprecedented alliance between a
few people at the highest levels of government and the
folks who own these companies. Now, everybody who has children says,

(14:39):
that's not what's most important. You know, I have no
problem with people making money and building a successful company,
but you should not be able to do it on
the backs of children. So you know, that is why
Mama is working. We started, you know, hoping that we
could have six chapters by the end of the first year.
We were sort of inspired by Mothers Against Drunk Driving
and the way they had chapters around the country. Instead,

(15:01):
we're up to nearly forty chapters in twenty two states.
Our members are working in their homes and their communities,
in their schools, and with policymakers to advance changes, just
like we had to make big changes once we learned
about the harms of big tobacco, saying Okay, it shouldn't
be normative that we give this to young children. It

(15:23):
shouldn't be okay that we load it full of chemicals
that can cause cancer. And we have to ensure that
we're both using it in a safe way and that
the manufacturers of these products are held to basic safety standards.

Speaker 1 (15:38):
Glad you brought up tobacco. That was actually
the next thing I was going to ask, which is
what lessons are there to take away from the essentially
what was a thirty year fight, but it was the
public one, right, tobacco. You know, it's not gone, but
it is now where it should be: available

(15:59):
to those who want it, for adults and adults only.

Speaker 2 (16:04):
It's a great example. And I think this fight is
a lot like big tobacco in some ways, and there
are a few differences. And what's similar is that big
tech and social media products and technology in general is
so embedded in our society that in order to shift
how we use it, to shift how we regulate it,

(16:24):
it's going to take change at multiple levels. So, just
like with big tobacco, we need culture change. We need
a lot of education so that people think about it differently.
We had to go back to Hollywood, you know, whereas
tobacco companies had paid them to write scripts where the
sexy leading men and women were smokers. We had to
convince them to make sure that the sexy men and

(16:47):
women leading you know, stars were not smoking. We had
to get rid of Joe Camel. So there had to
be a lot of public education. And then there were
also lawsuits, and we're seeing with organizations like the Social
Media Victims Law Center and other lawyers where they are
suing these companies because the companies are claiming it's not
their fault that children are harmed. But now the courts

(17:08):
are hearing cases where, for example, you know, and apologies
for talking about a difficult subject, listeners, just like
a kind of a trigger warning. But you know, I've
met families whose children were affirmatively sent by Instagram
videos of someone hanging themselves and then their child replicated it.
We had a little girl in Pennsylvania who tried the

(17:30):
choking challenge after the platform sent it to her, and
so courts are now hearing that, and that's just like
what happened with Big Tobacco.

Speaker 1 (17:36):
Well, this whole thing is Section two thirty.
You know, I've had this argument. I'm not a lawyer,
I've played one on television, is
how I joke. And basically every reporter over time sort
of starts to think like a lawyer at times, right,
And I don't understand how Section two thirty even applies
once a tech company creates an algorithm; now it's a publisher.

(18:01):
If they choose not to be a publisher, then they
don't have any liability. But the minute they created an algorithm,
as you point out, they sent the video.
It's no different than our old newspaper editors deciding what
goes above the fold that people
can see in the news box.

(18:21):
Sorry, for those of you that don't
remember this, we used to go and actually put a
quarter in and get your newspaper out of a box.
And what goes quote unquote below the fold, it's a choice;
an algorithm is a choice that the tech company makes.
I don't see how Section two thirty applies at all.

Speaker 2 (18:38):
Well, I think instead of calling it social media, we
should be calling it mass media publishing, because that's what
it is. And so when you hear the tech lobby saying, well,
this is you know, First Amendment. The First Amendment guarantees
us the right to free speech. It doesn't guarantee you
the right to publish speech. It doesn't guarantee you the
right to broadcast speech, and it doesn't guarantee you the

(18:59):
right to mass mediated speech. So for the last century
we've had all kinds of laws and restrictions and accountability
for publishers, for broadcasters, and we allow social media to
come in and turn anybody and everybody into a publisher
without any accountability. So, you know, in terms of your
opening question about how do we get the information ecosystem

(19:22):
back on track, I think that it's pretty simple. We
have to pass some regulation that holds people accountable for
what they publish. And again, advertiser supported speech is very
different than First Amendment speech, and social media is supported
by advertising, so that's another legal aspect of it that
I think needs to be looked at.

Speaker 1 (19:45):
Having good life insurance is incredibly important. I know from
personal experience. I was sixteen when my father passed away.
We didn't have any money. He didn't leave us in
the best shape. My mother, single mother, now widow, myself
sixteen trying to figure out how am I going to
pay for college and lo and behold, my dad had
one life insurance policy that we found wasn't a lot,

(20:07):
but it was important at the time, and it's why
I was able to go to college. Little did he
know how important that would be. In that moment. Well,
guess what. That's why I am here to tell you
about Etho's life. They can provide you with peace of
mind knowing your family is protected even if the worst
comes to pass. Ethos is an online platform that makes

(20:29):
getting life insurance fast and easy, all designed to protect
your family's future in minutes, not months. There's no complicated process,
and it's one hundred percent online. There's no medical exam
required. You just answer a few health questions online. You
can get a quote in as little as ten minutes,
and you can get same day coverage without ever leaving
your home. You can get up to three million dollars

(20:51):
in coverage, and some policies start as low as two
dollars a day, billed monthly. As of
March twenty twenty five, Business Insider named Ethos the number
one no medical exam instant life insurance provider. So protect
your family with life insurance from Ethos. Get your free
quote at ethos dot com slash chuck. So again, that's Ethos

(21:12):
dot com slash chuck. Application times may vary and the
rates themselves may vary as well, but trust me, life
insurance is something you should really think about, especially if
you've got a growing family. Oh, it used to drive
me crazy back when we were

(21:33):
talking more about when Facebook was trying to deny
any culpability during the Russian influence operation, and
it was sort of like, you know, putting up
advertisements that were totally misleading. When I was at
Meet the Press, if we aired an ad that we
knew was misleading, we as the broadcaster, shared liability with

(21:55):
the advertiser. In fact, in some ways, we'd
be held more liable than the advertiser themselves. That is
not how it works on these platforms. Meta is you know,
if somebody advertises on Instagram
with a misleading ad, you don't get to sue Meta
for that. You only get to sue I guess the
company itself if you want to do it. So that

(22:17):
has to change. And it's one of
those where I don't understand. As you said,
the laws that are already on the books should already
be applied. This idea that, you know,
they carved out this separate thing, I think that's the
mistake that was made.

Speaker 2 (22:32):
I think that was a mistake that was made. I think
that you know, when some of these technologies were introduced,
there were folks in Congress who didn't quite understand it.
You know. As you know, we haven't passed a single
federal regulation on social media since nineteen ninety eight,
which is before social.

Speaker 1 (22:46):
Media. There was no social media.

Speaker 2 (22:48):
So you know, we are definitely behind in making sure
these products are safe, and we're seeing other countries take
the lead in protecting childhood, right, yep, So.

Speaker 1 (22:59):
Let's talk about the core goal of your group, right
it's media addiction. And you know, I started out by
sort of trying to be hopeful and noting that, hey, look,
the one place where politicians seem to find agreement is
on this issue, like how do we protect our children?
And I've been heartened by all these laws passed in
the states on cell phone bans in schools. But do you know

(23:22):
what a lot of schools proactively do? They proactively give
an iPad to every student that comes in the classroom.
In fact, there is a movement in my neighborhood. It's
a very small movement, but that's how these movements start,
right? Neighbor to neighbor going, hey, tell Arlington schools to stop,
you know, teach out of books, not iPads.

Speaker 2 (23:45):
It's not just in your district, Chuck. We're seeing this nationwide,
you know. MAMA chapter leaders across the country have been
among the most vocal in their community to say to teachers,
say to schools, this is not what we want. Teachers
and schools have sort of too quickly, I think, bought

(24:05):
into this idea that if it's technological, it must be better.
And we are now seeing the lowest reading and math
scores among our lowest performing children that we've seen since
the United States started measuring that back in the seventies.
So there's just been a terrible decline in reading and
math performance that matches up exactly to the introduction of

(24:30):
tablets and laptops into classrooms. At MAMA, we're not anti tech.
We think tech can be fun and tech can be helpful.
We just don't think it should replace real life experiences
and interaction. And there's actually centuries of evidence that show
embodied learning experiences are critical for memory consolidation in the mind.
That's how children learn, by doing. So we know that

(24:52):
physically holding a book is affecting the mind in a
much different way than looking at it on a screen.
So we have a three part mission at MAMA.
It's parent education, it's getting phones out of school so children
can learn, and it's demanding safeguards, that our lawmakers act
to ensure these products, if they're going to be out there,
that they're safe for kids.

Speaker 1 (25:13):
You know. So, do you think there's a you know,
you think there's an age where you don't even,
you know, you say, okay, look because now you have
these like smart boards instead of chalkboards right where you're
able to do and that to me makes sense. You

(25:35):
want to have these smart boards. And you know, the
upside about having a textbook online is that it's always
up to date, it's always current, right, it's never out
of it's never now of date. You can quickly without
having to buy new textbooks. So there are reasons school
districts want to save money, right, which is to have
some of this stuff available. What's the Is there a

(25:56):
regulatory line we can create in legislation that says, okay,
you know, we're not introducing any tech into classrooms until
sixth grade? Are there enough studies to support
a hard and fast line like that.

Speaker 2 (26:13):
Well, so, there's a lot of studies that support the
delaying of technology and media for children as long as possible.
The American Academy of Pediatrics has for many, many years
recommended the most minimal amount of screens for your infant
and toddler as possible. If your child's under two, they

(26:34):
should not be exposed to a screen. If every once
in a while you let your toddler talk to their
grandparents on FaceTime or Skype, I don't think it's such
a big deal, but nobody should be parking their infant
in front of a screen for entertainment because we know
it affects their brain development. And there is a researcher
who's been taking scans of preschoolers and finding that the

(26:57):
ones who had screens have actually less white matter in
their brains. In terms of whether we could make a rule,
you know, we live in a country where education is
decided state by state, and so every state does it differently.
My recommendation, again would be to delay these things as
long as possible. Now, there's a lot of parents out
there saying we have to have our kids ready to

(27:19):
compete in a technological world. And I agree with that.
But the question I always ask is, you know, if
you want your child to be a safe driver, for example,
And we all agree that we want our children, once
they learn to drive, to be safe on the roads.
But none of us think they should start learning to
drive at age seven, or age eight or age nine,

(27:40):
because their brains and their bodies are not ready for that.
And so for children, what we think is important is
that the foundational skills they need to be competent learners
are there before we introduce technology. And then we introduce
technology in a limited way so that they can learn
the skills they need, but that it doesn't overtake all

(28:00):
of their other learning.

Speaker 1 (28:02):
You know, it's interesting. I was at
the Texas Tribune Festival a couple of weeks ago and
saw a conversation with the new president of SMU, who
used to be the president of UT Austin, and they
were talking about, you know, over the last ten years,
there had been this shift away from kids majoring in
sort of the liberal arts right, majoring in English. In fact,

(28:24):
you have universities dropping that as a major, right? There was
more specificity that students were leaning towards. And his thesis,
his theory and what he was going to bet on
is that in the next ten years that we were
going to see a turn back to the liberal arts.
And in fact, he thinks his job's going to be

(28:46):
how to help students when they come to college. That
college may turn into how to learn because, in some ways,
because our kids are being raised on screens,
they know where to find information,
but they don't know how to create

(29:08):
the information, right? They don't know how to find the
information if the power goes out, type of issue,
and that, in some ways, may be a
role that undergraduate universities are going to be playing again.
So that was an interesting thesis he had. What do
you make of that?

Speaker 2 (29:25):
I mean, that is a very interesting thesis and it's
something I think about all the time. You know, I worry,
for example, when you see doctors using their phones to
look up everything. You know, I want my surgeon if
I'm going in for brain surgery, I want my surgeon
to know how to do the surgery even if the
power goes out. You know, you want our

(29:49):
experts to have the skills and the training that they need.
You know, it's an interesting hypothesis. I think that a
lot of learning, though, happens in early childhood, and there
are certain foundational building blocks. We know, if by a
certain age you're not reading at a certain level, it's
very unlikely that you'll get there in your twenties. Right.
That is why in this country we long ago established

(30:11):
Head Start and zero to three, because there was so
much evidence that what happens during early childhood really affects
your lifelong success in learning and even other things like
your economic, you know, status, your job, you know, availability,
all those kinds of things. So you know, we we're

(30:32):
pushing to make sure. You know, Look, I'm an American.
I want us to win the AI race, but we
need to make sure that these technologies are developed in
ways that are also safe and responsible. And I
have every confidence that our technology leaders can do both
of those things.

Speaker 1 (30:50):
Right.

Speaker 2 (30:51):
There's no reason we need to just say, Okay, let's
have no legislation and let's just let them do whatever
they want. You know, we've already seen I don't know
if you caught this in the news, but there was
a Teddy Bear with AI in it that just had
to be recalled because if a child asked the Teddy
Bear would tell them where they could find knives matches.
If you asked the teddy Bear a sex question, suddenly

(31:13):
they lead you, and you know it would lead you
into a conversation about like fringe. You know, sex spetishes
like these are not products that are safe for kids,
and we should not make them available to kids until
we know they're safe.

Speaker 1 (31:27):
I'm curious what you thought of mister Altman admitting
that he was surprised at how many people chose to
use these, in this case ChatGPT, as a sort
of therapist, and that he didn't see that coming. And

(31:49):
I'm thinking, you know, we've put you in charge of
building this.

Speaker 2 (31:55):
You know, I have the same feeling about him that
I did about Mark Zuckerberg. I mean, I think that
these guys didn't read enough poetry in college.

Speaker 1 (32:05):
I think I have a real cynical view of these guys.
These guys were the people that never knew how to
be friends in real life. They were constantly looking. And
you know, I joke that you know that Zuckerberg had
a hard time meeting girls in college, so he thought
he could hack his way into finding a computer algorithm

(32:25):
to match him up with people that might be more
interested in him.

Speaker 2 (32:29):
Well, that's how Facebook started, right. It was like a
rate a girl if she's.

Speaker 1 (32:33):
Hot or not. That's what he was looking for. He
was trying to meet girls. I mean, not an unusual
thing for a nineteen year old. I'm not going to
you know, a lot of nineteen year olds, boys and
girls don't know how to do this. So I empathize
in theory, but I do think the entire leadership
of Silicon Valley are some people that didn't grow up

(32:57):
the way a lot of other people grew up.

Speaker 2 (32:59):
I'll give you one more example. Noam Shazeer, who founded
Character AI, was interviewed in a podcast a couple of
years ago and they asked him about Character AI and
he said, well, I'll give you my humorous VC pitch.
He said, you know, when child's out walking with their parents,
they're holding hands, they're asking questions, and the parents giving

(33:22):
the child a lot of information. But they're not just
giving them facts. They're also, you know, their friend,
they're giving them emotional support. He said, that's what we
want Character AI to do. We don't want to replace Google.
We want to replace your mom. And he said

Speaker 1 (33:37):
this. Oh my god.

Speaker 2 (33:39):
And then we've seen all of these lawsuits
now because character AI, when people have turned to it
for emotional support or because they are depressed or they
are struggling, rather than directing them to get help, rather
than stopping, the AI program is providing detailed instructions and encouragement,

(34:02):
in some cases for someone to end their own life.
So I think to that company's credit, they have now
announced that it should not be used by anyone under eighteen.
But again the question remains, why didn't they consider this
consequence when they created the product? And so again that
is why, at MAMA, we're so focused right now on ensuring

(34:24):
that our lawmakers take action to require these companies to
be accountable for the products that they make, that there's
transparency so we know what's happening at these companies, because
unless you require them to provide this data, you know
researchers can't even study it, and that there's responsibility and
that when they design them in such a way that

(34:44):
it's inflicting harm on our children at scale, that they
are held responsible for that. And we are seeing multiple
lawsuits now from school districts throughout the country who are saying,
you know what, you have to reimburse us for the
tens of millions of dollars it's costing us for therapists,
for emergency psychiatric beds, for all of these costs that

(35:05):
have been borne by Americans and our communities, that have
really been caused by these tech company products.

Speaker 1 (35:14):
I want to go back to the school issue, you know,
one of the fears, you know, I think about the
following right with the advent of AI over the last
twenty years, the focus among many parents. My kids
are now eighteen and twenty one, so I remember, and
there was always, you know, every conversation with every parent

(35:35):
of you know, in my cohort was always you know, hey,
they got to learn coding, and oh but they better
learn coding, and that's what's you know, and STEM and
all of this stuff. And then all of a sudden
we realize, oh, no, coding is not going to be
a life skill. This isn't this isn't going to be
something you need, and it isn't going to help you

(35:55):
get another job. In fact, that is going to be
replaced by a robot. So that's
no longer necessary. That's like teaching somebody how to dig
out an ice block. We don't. We don't refrigerate with
ice blocks anymore. That is no longer an industry that
is necessary. The biggest fear I have now moving from

(36:18):
parent to grandparent. Right, I'm not there yet, but I have.
I have nieces and nephews who are having kids now,
and that is, you know, I thought, I
think we had an idea of the world our kids
are going to be living in. I think that's harder
and harder to visualize, and I think it's paralyzing parents

(36:41):
and trying to figure out what it is that they
should be asking the schools to be teaching their kids
to prepare them for this next generation
of jobs or this next generation of society, because I
don't think any of us have the first clue of
what it's going to look like. And I don't know
how that factors into what you're working on, but it's

(37:03):
to me part of the of the fear factor that
has allowed so much technology into the school systems.

Speaker 2 (37:10):
So the way we describe our work at MAMA is
that we're a grassroots movement of parents and allies fighting
back against media addiction and creating a world where real
life experiences and interactions remain at the heart of a
healthy childhood. And the reason for that is because tech

(37:31):
is here. It's not going anywhere. We are going to
have a lot of technology available to us, and it's
going to help us solve a lot of problems. But
there are many things about being human that we don't
want to change, and it should always be part of
being human. I don't want a robot to hold my
toddler or my grand baby. I want to hold that child.

(37:53):
And we don't think that we should just blindly say, okay,
tech is going to replace everything. We know that, and
there's abundant evidence about the role of parents and adults
in children's lives and about the role of embodied experiences
in children's learning. And we also know that there are many, many,
many jobs that technology is not good for. I don't

(38:15):
know if you saw that video that was making its
way across social media last week, but there was a
new AI robot. It was presented at a conference and
as it got across the stage, you know,
it looked really cool until it fell down because they
haven't even figured out balance yet. So, you know, over
and over again, for more than a century, we've heard

(38:36):
tech companies promise that their product is going to revolutionize everything,
and some products do and they you know, a smartphone
has changed a lot of things. Has it changed the
way we eat food? Has it changed the way we
nurture our babies, has it changed the way you know,
our economy is run? Not exactly. There are some changes
in our payments, right, but absolutely everything is not going

(39:00):
to change. So when it comes to thinking about the future,
you know, what I want for my children is to
have the same thing that my grandparents wanted for their kids.
It's the same thing I want for my future grandchildren.
I want them to grow up healthy. I want them
to grow up confident and capable. I want them to
have a set of skills so that they are resilient

(39:22):
and can adapt. Because nobody knows exactly what the workplace
is going to need, right, but we're always going to
need trades, We're always going to need skilled workers, We're
always going to need caretakers, and all of these professions
that hold up what it means to have a society.
The tech workers and the folks and the machines are

(39:42):
a small part of that. And I think we also
just don't want to give it all over to the machines.
You know, the Luddites happened because those folks like didn't
want to lose their jobs. And now we're at a
point where many, many more types of work can be replaced,
and we have to decide is that what we want?
Do we want actors to not have jobs anymore?
Do we want everything to be robots? And I don't

(40:05):
think we do. I don't think anybody wants to take
their kindergartener to class and have them taught by a
robot teacher.

Speaker 1 (40:13):
Do you hate hangovers? Well, say goodbye to hangovers. Out
of Office gives you the social buzz without the next
day regret. They're best selling. Out of Office gummies were
designed to provide a mild, relaxing buzz, boost your mood,
and enhance creativity and relaxation. With five different strengths, you
can tailor the dose to fit your vibe. From a
gentle one point five milligram microdose to their newest

(40:33):
fifteen milligram gummy for a more elevated experience. Their THC
beverages and gummies are a modern, mindful alternative to a
glass of wine or a cocktail. And I'll tell you this,
I've given up booze. I don't like the hangovers. I
prefer the gummy experience. Soul is a wellness brand that
believes feeling good should be fun and easy. Soul specializes

(40:54):
in delicious hemp derived THC and CBD products, all designed
to boost your mood and simply help you unwind.
And if you struggle to switch off at night, Soul
also has a variety of products specifically designed to just
simply help you get a better night's sleep, including their
top selling sleepy gummies. It's a fan favorite for deep
restorative sleep. So bring on the good vibes and treat

(41:14):
yourself to Soul today. Right now, Soul is offering my
audience thirty percent off your entire order, So go to
getsoul dot com, use the promo code toddcast. Don't
forget that code. That's getsoul dot com, promo code
toddcast for thirty percent off. This is where I'm weirdly optimistic.

(41:37):
You know, the human species is pretty adaptable and has
survived quite a few challenges over the last few million years.
I have a feeling we're not going to let ourselves
be replaced by robots. I just wonder if we'll know
we need it, when we're ready to start fighting back.

Speaker 2 (41:53):
Well, it's time. I mean, the time is here, right.

Speaker 1 (41:55):
And that's where I want to get to. So okay,
So you are trying to essentially become a political force,
not left or right, just a force. You know, you're
not exactly advocating, more an advocacy group. What are you working on
in the next six months? I know in the federal
level we talked a little bit about it. It's the

(42:18):
stopping this, uh, AI moratorium, AI regulatory moratorium. What are
some near term activities in the States that you're working
on as well? And are you going to, you know,
try to do candidate questionnaires or try to do things
like that or are you not in that that space

(42:40):
just yet.

Speaker 2 (42:41):
So we're not in that space just yet. We're a
five oh one C three, we're not a C four.
You know. Working on demanding safeguards is one small part
of what we do. We also work really hard in
our communities to educate parents on why they don't have
to rush to give their kids a phone. Our wonderful
chapter leader in Pittsburgh, you know, talked with the members

(43:05):
of that chapter and they really wanted their children to
be able to go to school without phones. But some
parents were worried, what about an emergency And they went
and they talked to local shop owners and they said,
you know, in an emergency, can my child come in
and use the phone? And the store owners were like,
of course, and they put little stickers in the window
so the kids would even know they were welcome there. Right.

(43:25):
So Mama works in communities, and then we also work
with schools because this is a huge issue now, not
just the phones, but what people are calling ed tech,
so that is the giving of tablets to kindergarteners, the
giving of laptops to first graders, and all over the
country we're hearing from parents that they're at war with

(43:46):
their schools because they don't want their children to come
home and have these devices. They don't want their children
to have access to YouTube where they're going to be
watching videos over and over, and the platform is designed
to keep them on there as long as possible. And
then in terms of legislation, we are very excited about
all of the lawmakers that introduce bills to try to
require products to be safer. In the last legislative session,

(44:09):
we saw close to four hundred bills introduced. There were
a lot, but we've seen success in everything from bell
to bell phone bans to social media warning labels pass.
And we're most excited about legislation like something called the
Age Appropriate Design Code or the Kid's Code. And this
is law that requires companies that make digital products to

(44:32):
be used by children to show what's called a duty
of care to children in making them safe by design.
So if a kid's going to go on there,
then it should be designed in a safe way. And
that means doing things like putting privacy settings at the
highest by default, not the lowest, so that every American
parent doesn't have to go in there and figure out
how to make sure a stranger can't contact their child.

(44:54):
But it's set that way from the beginning. So Nebraska
and Vermont, for example, this year passed the Age Appropriate
Design Code.

Speaker 1 (45:00):
By the way, that just shows you the ideological breadth
if you will, that's there, right, you know Vermont's Vermont,
the home of Bernie Sanders. Well, guess what Nebraska is. They
root for a team they call Big Red. And it
isn't just because Nebraska's colors are red.

Speaker 2 (45:18):
Yep. And you know, we're also seeing some interesting legislation
now being introduced about AI. So in October, Senators Mark
Warner from Virginia, Josh Hawley from Missouri, Dick Blumenthal from Connecticut,
Chris Murphy from Connecticut, and Katie Britt from Alabama introduced
this bipartisan bill that would ban minors from using AI

(45:41):
chatbot companions. And it would have an incredible impact on
the safety of our children. So, you know, that's the
kind of thing. I mean, who wouldn't support that? I
mean you have to ask who wouldn't support keeping kids safe?

Speaker 1 (45:54):
Do you? You know, do you plan on having sort of
a MAMA seal of approval? You know, meaning, like, if you
want to see, you know,
could I, you know, maybe this is a little early, but
next holiday season, will I go on MAMA's website and
be able to see these are products that you should
feel comfortable with, that have some tech in them,

(46:17):
that are safe, Chuck.

Speaker 2 (46:19):
I don't want to promise that to your listeners, it's
definitely on our to do list. We're moving as quickly
as we can. As I mentioned, you know, we had
this ambitious goal of having six chapters, and the demand
is just completely overwhelming. Right after we launched, we got
requests from all fifty states and all over the globe,
like every continent except Antarctica. So you know, we are

(46:42):
very grateful that we received some funding this year from
the Rockefeller Foundation, and we are primarily funded by
individual donors and family foundations. There's a lot of folks
who've come together to make this work happen, and we're
growing as quickly as we can. But yes, I do
think eventually that is something we would like to provide.

Speaker 1 (47:02):
You know, it's interesting you just talked about the world.
I heard a stat and I wonder if you guys
are now a clearinghouse for some of these studies that
there has not been a study around the world on
social media usage or early phone usage where it has
mattered whether it's a rich country or a poor country. It

(47:22):
hasn't mattered what ethnicity, or if it's a, you know,
homogeneous society or a multi ethnic society. It
is so clear that this tech at a
young age has been harmful, hard stop.

Speaker 2 (47:38):
You know, social media doesn't discriminate, and we have definitely
seen that. You know, with the national emergency in youth
Mental Health, it cuts across race, class, gender, geography, you
name it. There are additional concerns for vulnerable communities. We
have seen, unfortunately, the increase in suicide rates among Hispanic

(48:03):
adolescents is even higher than in their white counterparts. The
rate of increase in Black youth is high. So I
think we're beginning to see a different kind of digital
divide where low income families. And in part it might
be because childcare is so expensive.

Speaker 1 (48:21):
And I'm just going to say, is that childcare is,
basically, unfortunately... we have tech nannies, right. You know, I
was a latch key kid. First thing I did when
I had to get home was I had to call
my mom at work to let her know.

Speaker 2 (48:31):
I got in the house, right, And you probably watched
TV and that was fine though, you know, I did.

Speaker 1 (48:36):
Turn on TV. But what would happen now? Right? You
whip on? You probably you know, put the device on, right.

Speaker 2 (48:42):
I mean, I watched more Love Boat than any ten
year old should have. But there was a limit to
what kinds of things.

Speaker 1 (48:48):
My parents weren't crazy about me watching Love Boat
or Three's Company. I remember exactly.

Speaker 2 (48:54):
We didn't even know.

Speaker 1 (48:55):
I'll see an old rerun and go, oh my god, like,
what is that? I can't believe they let that on television.

Speaker 2 (49:02):
Right well now, the average age of first exposure to
pornography is twelve. Okay, Chuck, twelve, like you know, and
look like I think as a parent, you know, I
don't have crazy conservative ideas. I know that at some
point my sons will probably be exposed to pornography in
high school, but I don't think it should be mainlined

(49:23):
to them by a couple of companies that are making
gobs of profit if they can get people to stay
on their platform longer.

Speaker 1 (49:31):
There's certain things that should be hard to come by,
and they have to be. Frankly, in some ways you should
feel a little bit of shame, because that's actually a
good thing to have through life.

Speaker 2 (49:43):
Yeah, well, you know, look, I don't think anybody would
support allowing pornography at the checkout register at the supermarket,
but we don't have a law that forces
people to you know, put it high up behind the
counter or wrap it in paper bags, you know. So
custom is part of this and part of what's challenging

(50:04):
in the digital era is that we're, you know, these companies, we're.

Speaker 1 (50:08):
Still making the customs. We don't really have them, making
the customs, right. And I think that that's been, that's
what makes this feel so challenging. And because I think
we all agree we've got to figure out how to
slow down the adaptation of tech in kids' lives.
But my god, is the toothpaste already out of the tube.

Speaker 2 (50:30):
Well, we have Mama's House Rules. If you go to
we are Mama dot org and you click on learn,
you can scroll down and Mama's House Rules give you
some ideas of how as a family you can manage it,
limit it, put it in a container. Again, we're not
saying there's going to be no tech in your life,
but we're just saying, how do you have a life

(50:52):
that's based on real life experiences and interaction, because once
you or your child are spending too much time online,
it just becomes so unhealthy. And you know we're all
media addicted, right, it's not just our children, and what
we model has a huge huge impact on them.

Speaker 1 (51:11):
Well, I just went on your site, we are
Mama dot org, so that's a fairly quick and easy place.
You mentioned one other thing about tobacco about how those
that were advocating against it went to Hollywood
and said, hey, we need to redo this. Are you
trying to do the same thing with Hollywood with tech?

(51:32):
You know, you say, hey, you know, make it
so that you know, everything that happens online is bad,
the cool stuff is in person, or something like that.
How are you trying to influence culture in that front?

Speaker 2 (51:45):
So, you know, we've had some conversations with some people
in the space. I think, you know, Joseph Gordon
Levitt has been a tremendous voice for healthy human relationships.
I know he's a dad and he again he's not
anti tech. He just thinks like tech should be safe
and that we shouldn't. He has a wonderful TED talk

(52:08):
that you could check out about his own experience with
social media and what he learned is unhealthy about it.
I think that more and more people in Hollywood are
waking up to this, especially after the actors', the SAG
AFTRA strike last year. You know, so many studios had
sort of suggested that they just pay actors once and

(52:29):
don't hire them again, and then they can just make
an AI likeness. And I don't think anybody likes the
idea of never working again, because you know, pixels can
just replace them. And then I think Pixar in the
Last Toy Story movie, I think they actually had a
digital device kind of be one of the negative characters.
So maybe somebody over there is getting the message.

Speaker 1 (52:50):
Interesting, Julie, I am, you know, cheering you on. I'm
glad to give you a platform. I want to stay
in touch, want to continue to help spread the word.
I mean, I think I think generally everybody gets it.
I think there's a lot of paralysis out there, and
I think what you've shown is that hey, you're not alone.

Speaker 2 (53:13):
You're not alone. And it's just been so moving and
inspiring to see all these parents come forward. We're looking
for more chapter leaders in a couple of specific states.
So if you have any listeners in Wisconsin or Tennessee
or Kentucky.

Speaker 1 (53:31):
Okay, here you go, and challenge accepted. Here we go.
Come on, Kentucky and Tennessee and Wisconsin. I got listeners
in all those states. We know this. Let's go, let's
get this done. And I'd like to see, you know
who the best advocates for this could be? Today's college students.
You know, I'd love to see you get college chapters.

(53:51):
That was, look, that was a big part of MADD.
They got some college chapters and that helped, absolutely.

Speaker 2 (53:58):
We work with some youth led groups now that are
just truly wonderful, and you know, I am both grateful
for their advocacy and it breaks my heart that we
have put kids in a situation where children have to
go to Congress and testify about what happened to them
and how they were groomed online or received eating disorder encouragement.

(54:20):
You know, we're the parents. We have to make it
safe for our kids.

Speaker 1 (54:25):
Julie, thank you so much, so grateful to talk with you.

Speaker 2 (54:27):
Thank you, you too. Pleasure. Thanks.