Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
S1 (00:00):
The views and opinions expressed in this program are those
of the speakers and do not necessarily reflect the views
or positions of any entities they represent, including OLAS Media.
S2 (00:09):
OLAS Media. OLAS Media presents Nation State of Play.
(00:37):
Welcome to the Nation State of Play podcast. I'm your host, Brian Miller. Each episode, we explore the political stories that are driving public policy in California. We explore these stories with political insiders, business leaders, journalists, and policymakers in order to get below the surface of the headlines and show you the true forces shaping our nation-state. Well, thanks so much for listening. We have a great guest,
(00:59):
Larissa May with the Half the Story Project. We're talking about social media addiction, and specifically legislation that's pending in California to try to do something about it. We've talked about this topic on the show before; this bill actually came pretty close to passing last year. And I think there are really important arguments that people need to hear about why it should pass this year. It's a topic
(01:20):
that's important to every American, particularly parents. So I hope you can listen to this short but really important conversation. Stay with us. Larissa May with the Half the Story Project, coming right up.
S3 (01:32):
American democracy is good, but we can make it better. The National Association of Nonpartisan Reformers includes organizations across the country that are working right now to build a better democracy by opening primaries, implementing safe, secure voting systems, reducing corruption, and increasing transparency. Listen to our weekly podcast, How to Win Friends and Save the Republic, to hear the
(01:54):
latest updates from the democracy reform space. Subscribe and learn more about us at nonpartisanreformers.org.
S2 (02:03):
Welcome back to the Nation State of Play podcast. Larissa,
thanks so much for being on the show today. It's
a pleasure to have you.
S4 (02:09):
Thank you for hosting me for this very critical topic.
S2 (02:13):
It really is. You know, I wanted to start by actually asking you where the organization's name comes from, because I think it's provocative. So how did you come up with the name?
S4 (02:22):
So Half the Story comes from the idea that social media is only half of the story, and that what we put out on social media divides us more than it connects us. It really came out of my own experience as a young digital native trying to understand my role in the digital world, and really thinking about and better understanding how these digital devices and platforms
(02:45):
actually impacted my mental health, because I was only seeing one side of the story. But it goes beyond that. It also speaks to the algorithms and the subcultures, the worlds that social media creates for us, and how important it is to break through the echo chambers that the algorithms create, which I think also leads into why we are here today,
(03:07):
which is to talk about a bill that ultimately will create accountability so that we can help break some of these cycles.
S2 (03:16):
Yeah, let's do that. If you don't mind, you've been really brave talking about your personal story, including in legislative testimony, and I'd love to ask if you'd be comfortable sharing some of that, because I think it says a lot about the depth of this topic and why it's so important.
S4 (03:30):
Yeah. So I've actually been a digital wellbeing activist for seven years, and I'm really happy that we're having this conversation, because not very long ago people didn't really even believe that technology had emotional implications; they were so mesmerized by the innovation
(03:51):
piece of it. But over the last couple of years, and really starting with my own story, this is something that young people, and humanity, have really started to question: what is the opportunity cost of these devices, of the speed of open data, of AI, of Web3? I faced that when I was 21 years old.
So I got Instagram for the first time when I
(04:12):
was 18, which actually was pretty late in the game compared to most kids in the modern world. What's interesting about my experience is that I remember a life without social media, but I also grew up in it. And I downloaded it during one of the most pivotal points in my life, when I was in college. By the time I was a sophomore in college, I was using social
(04:35):
media 10 to 12 hours a day. I was struggling with an eating disorder, depression, and anxiety. And during that time, I remember, I was like a moth to a flame. Ever since I downloaded social media, my brain was never the same. I was feeling more sensitive, and I was never really able to regulate my emotions the same way I had before. And social media played a
(04:58):
role in that. It became so bad that I remember, when I was 21, sitting in my dorm room on my bed and opening up my phone. There were really two sides to my life. One was my love for fashion, and I had a fashion blog; the other was the fact that I was suicidal. I was looking at depression content, and I was looking at photos of women who were thin and thinner,
(05:20):
women that I would never look like.
And the content my algorithm kept serving me was showing me photos of women who were either in recovery from eating disorders or who were fitness influencers that I would never be like. And so I wound up really hitting rock bottom. The list of things I was really facing goes on and on. But at its core,
(05:43):
I was dealing with an eating disorder and I was dealing with chronic depression, and neither of them was treated until that point. And social media's role in that story was really a magnifying glass. It made things harder. It made my emotions more intense. And quite honestly, it also was a shield between me, my emotions, and the world.
(06:04):
And people didn't really understand what was going on, because I was only sharing half of my story. And when I hit rock bottom, my RA at my college came into my room, and she was like, "The girl that we see online is not the girl that's sitting in this room, and now I know that you're experiencing suicidal ideation." Which I was. And she took me (she was in
(06:24):
pre-med) to the psychological care center on campus. And I
was very lucky. And I always start by saying I'm doing this work because I'm alive today. Unfortunately, the second leading cause of death for young people in the United States is suicide. And when you look at the rates of anxiety and depression and suicide attempts among
(06:45):
young women next to the growth from zero to one billion Instagram users, the correlation in the growth looks almost the same.
And I don't think that a life can be looked at as statistics, necessarily, because you can't really put a value on a life. But what I realized in that period was what it means to be a young person navigating digital technologies alongside a mental illness. So again, I
(07:08):
was the worst-case scenario. I was sitting in the doctor's office, and they gave me an evaluation and asked me about drugs and sex and all of the things that we would indulge in as college students. And I wasn't really one for that; I never was into drugs. But the drug that I was into was the one that was in my pocket, the one that I was spending 10 to 12 hours a day on. So
(07:31):
I really took matters into my own hands, because even though I left and started taking my antidepressants, and my baseline mood was better because my chemistry had been off, I was still struggling with comparison. My algorithm was still built for the girl that was broken and sick, and it was really preying on that. And so I started Half the Story when I was like, this is pretty screwed up, and I need to either
(07:53):
be a part of the solution or I need to get off this. And so I printed a bunch of stickers that said Half the Story and gave them to students on campus, and really started by building a grassroots movement of young people around the world. And now we've become one of the leading youth-led nonprofits at the intersection of education, research, and advocacy, giving the
(08:16):
next generation the tools that they need to be empowered
to take back control of technology and their own emotional regulation.
S2 (08:25):
Amazing story, and I'm really glad you've been able to share this up here in Sacramento and in other capitols. So I want to ask you about the legislation. Let me start by asking you: how do you define addiction? Because that's what the legislation focuses on, and we'll get into it. But how do you actually define that term?
S4 (08:42):
Well, I mean, I think that's a really interesting question, and it's a hard one. There are a lot of different ways to think about what addiction can mean depending on what type of individual you are. There are different drugs, from alcohol to prescriptions to heroin to opioids.
(09:05):
But ultimately, you know, the way that I think about addiction is that we engage in a behavior and our brain receives a reward that encourages us to keep repeating that activity, even though it is actually detrimental to us. And the part
(09:26):
of our brain that gives us that reward and reinforcement runs on the neurotransmitter dopamine, and that is the dopamine hit, the dopamine cycle, that the algorithms are using to prey on the limbic systems of young minds.
So that's how I define addiction. But, you know, when
(09:46):
you think about social media, and I'm just going to be honest because I've heard all sides of the story, there are arguments from people both that social media addiction does exist and that it doesn't. But the data and the science tell us that when you look at young people's brains, the way that these devices have been created and the way that young brains
(10:08):
react to them, especially because the frontal lobes have not been fully developed, pretty much mimic the same type of addiction that one would have to a drug or to smoking; the same areas of the brain light up. So, you know, I just would like to say that, yes, I do believe in social media addiction, because I lived that experience.
(10:29):
And I definitely have an addictive personality, and I know that about myself. But, you know, there are a lot of varying thoughts around this. And one of the hardest parts about mental health, which I would also bring up in this argument, is that even though one in four Americans struggles with a mental illness, the brain
(10:50):
is the organ that is least looked at from an imaging perspective in modern medicine. And so a lot of mental illnesses and conditions tend to seem more subjective than objective. But I really believe that mental illness and mental health are objective, and until we start looking at our brains like our bodies, it's
(11:11):
going to be impossible to make change. And so with social media addiction, every time you get a like, every time you open your phone, every time you keep up a Snap streak, that part of the young individual's mind is ignited and is telling them that this feels good. But the truth is it's only in the moment. It's
(11:31):
not the long term. And that is what's creating these
social media addictions, because the platforms are designed to prey
on these especially vulnerable minds to keep them coming back.
S2 (11:43):
Yeah. I mean, I want to tease out the Twitter example here, because a lot of the listeners spend a lot of time on political Twitter, and I think the dopamine point you raise is really at the heart of all this. So, you know, I read this interesting study that basically said: somebody goes and tweets something outrageous on Twitter, and they get a certain amount of cheers,
(12:05):
likes, interactions, and they get a certain amount of dopamine. But what happens over time is that your baseline dopamine level actually adjusts to be higher. And so in order to get what you perceive as a hit, an increase above your baseline, you need more and more likes,
(12:27):
more and more interactions. And the problem with that on Twitter, and there are probably different versions of this on different platforms, is that it just means you have to be more outrageous to get more interactions. And I think that feels like one of the reasons Twitter is just in the sewer in terms of what goes on: to get that dopamine, you've got to be
(12:49):
crazier and crazier and more provocative.
S4 (12:52):
Yeah, you have to. And that's one of the big issues right now with misinformation and how information spreads, because we have moved from a culture that prioritizes truth and accuracy into a culture that is incentivized by attention and by engagement. And the way that truth
(13:12):
is engaged with, versus comedy or provocative content, is very often not equal, because truth just isn't as interesting to people. And now we don't know what's true from what's not. And young people are going on social media to self-diagnose themselves. You know,
(13:34):
there are all these terms being thrown around and used, which is also harmful in its own right. But I think your example about Twitter is really similar to TikTok. With young people now, we call it shock culture: social media rewards shock. It rewards the Tide Pod challenge. It
(13:57):
rewards the choking challenge. When I was at the Capitol working on KOSA, there was a group of parents sitting at the table, and one of them had a ten-year-old daughter who died from the choking challenge on TikTok, because she went into another room to do the challenge. That is the reality that young people are facing, and that is the reality that we need to change.
S2 (14:16):
All right. Well, let's talk about the bill. So how does the bill try to change this?
S4 (14:21):
So first, there are a couple of things that I want to share before we go into the Social Media Duty to Children Act that really paint the picture of why this is so important. First and foremost, the average American teen, according to a study done by Common Sense Media last year, is spending seven and a half hours a day behind their device. Averaged over the span
(14:43):
of their life, that's a third of their life, which is 30 years. So think about this: your kids and your grandchildren are basically going to spend 30 years of their lives behind a device that has gone virtually unregulated since it was created. The other piece of
this is that kids that are struggling with mental illness
(15:04):
are more likely, I believe, to experience casualties from social media, just because of the way that the algorithms are built. There was a recent report by the Center for Countering Digital Hate that came out in the last week, and it's been really going viral on social media. What it did is look at algorithms among 13-year-olds in the US, UK,
(15:25):
Australia, and Canada. And what they found is that harmful content is served every 39 seconds, suicide content is served within 2.6 minutes, and eating disorder content is served within 8 minutes. So in studying
(15:46):
13-year-olds globally, that is the amount of time it's taking for them to be exposed to the type of content that can start as curiosity and end in them taking their own life. So this year we are reintroducing the Social Media Duty to Children Act, and this will ultimately be the
(16:07):
first bill that creates accountability from an infrastructure and design perspective.
Right now, social media platforms are designed to hook, to
addict and to continue to serve that content to underdeveloped minds.
So what does that actually look like? Well, first and foremost,
(16:29):
if there's a casualty involving a young person and the social media platforms have not followed the law once this bill passes, it will actually give the district attorney the power to review a number of cases within the district and then pursue a case against the social platform. So it's really allowing us to put our trust in our city
(16:50):
officials and to really take on the cases that are going to have the most impact for the majority of youth, because there are hundreds of millions of youth online who are harmed probably every single second. So what does accountability look like? That's one of the biggest questions that young people and parents, and even people on the other side, are asking. So
(17:12):
right now, there is essentially no financial incentive to actually make sure that you're building platforms that support the development and emotional health of youth. We are facing the greatest youth mental health crisis in history. When you look at the fact that kids are spending more time online than anywhere else in
(17:34):
the world, it basically means we're putting kids into a bunch of cars without seatbelts that have never been tested on the road. That is the equivalent right now in technology. Kids are going to sleep with the bully. They're addicted to the place that's killing them. We have more fentanyl overdoses happening because kids are getting drugs on social media. The list goes on and on. So what
(17:57):
we're basically saying is: hey, social media platforms, and Facebook specifically, you have a team of data scientists inside who are actually aware of what you're doing, because there was a report published on it. So what we are saying is not that we want to hinder innovation; we encourage you to keep innovating. But when you innovate, ensure
(18:21):
that when you launch something in the market, using your team, it is not going to be harmful for youth. If you don't do that and it is harmful, you're going to be charged and you're going to be held accountable. And if you do develop something, last year the window was 30 days (this year we're still editing the bill, so there could be changes), and if within 30 days you find
(18:42):
out that it's harmful and you pull it, you're not
going to be charged. And so what this really does
is put the onus on the platform, and it actually puts a financial incentive at the forefront for them to develop and continue innovating, but to innovate with the emotional wellbeing of their users at the forefront. So that's what we're really looking at. And when we talk about the
(19:03):
types of features that we're trying to remove for minors, it's the ones that are the most addictive: the Snap streaks, the endless scrolling, all of the things that are keeping our kids hooked to these devices and preventing them from sleeping, undermining their academic achievement, their social intelligence, and even just their engagement with the world.
S2 (19:24):
So what is the operative language of the bill? I thought it was whether they designed something addictive. Is that the key trigger?
S4 (19:34):
Yeah, if it's addictive. And Kim, feel free to jump in here, but yes, it's if it's addictive. Basically, right now what they're doing is designing, and actually most of the time doing research, to make sure that it is addictive before putting it out, because that's what's creating
(19:55):
their bottom line. But what this bill does is ensure, for minors, who are the most vulnerable as I mentioned, that when the platforms are innovating, that's not happening.
S2 (20:08):
Okay, so let me dwell for a minute on the
enforcement provisions of the bill and how this would work.
So the bill, as I understand it, would create a
public right of action for prosecutors to bring claims. Is
that correct?
S4 (20:25):
Yes.
S2 (20:26):
Okay. So not criminal liability. We're talking about civil liability
by public prosecutors here.
S4 (20:32):
Yes, it's civil liability enforced by public prosecutors. And it actually places a duty on the social media platforms to prevent child users from becoming addicted.
S2 (20:44):
Got it. And there's no private right of action in this, so this is not like a boon for trial lawyers or anything like that; this is public prosecutors only. Gotcha. Okay, I think that's really helpful context. So if you could, tell us what happened last year with the bill, because I know you made a lot of progress but didn't quite get it over the finish line. Where did things
(21:06):
get left off last year?
S4 (21:07):
Oh, wow. So first of all, we had zero "no" votes, and the bill was silently killed in the Appropriations Committee. Actually, in many ways it was a big win, because it created a global conversation. There have been a number of bills introduced since, and I believe that it was really the catalyst for a
(21:29):
lot of the legislation and conversation that continues to be at the forefront of digital wellbeing around the world. But now, this year, we want to come back, we want to reintroduce this, and we want to get it over the line. California has been the leader in virtually every single cause, and we believe that as the state that is home to many of the social networks,
(21:50):
it should be the one taking the steps necessary to create accountability. Kids have systems in place for cars, for toys, for candy, for virtually anything they touch. But this is the place where they're spending all their time, and there's nothing.
S2 (22:07):
Yeah, thanks for making that point about California. I've said this before on the show: these are by and large California companies. I think we have a special duty to tackle this in California, and if we don't, it'll be tackled for us in Washington against certainly one of our biggest industries, if not our biggest industry. And, you know, one of the things that's strange about tech
(22:29):
is, and maybe it's not strange but I think it sort of forgets history, that they've essentially argued against any regulation whatsoever, and fairly successfully to their credit, so far. I've seen this come up in other sectors where eventually the industry says: you know what, we're going to be regulated eventually, so let's be a part of the process, do it on our own terms,
(22:51):
come to the table and suggest something we can live with.
Have you seen any evidence of that sort of mentality from tech, or is their approach hands-off, we're just not agreeing to anything?
S4 (23:01):
Well, there is an organization called TechNet, which really represents all of the big tech companies. And I think there are some tech players that want to come to the table and build more empathetic designs and experiences. And granted, this bill does not apply to every single company; it's if you have
(23:22):
over $100 million in revenue, this applies to you. So, look, I think there are always advocates within tech platforms who want to create more sustainable practices. A lot of the people you talk to who work in tech don't actually let their own children on these platforms. But the entities as a whole, in a capitalist society,
(23:44):
are really here to serve one group of individuals, and that's their shareholders. And I think a lot of times in these conversations we get caught between the ideal and the real. No one's going to be excited to lose hundreds of millions of dollars in revenue, no CEO, whether it's tech or not. And
(24:05):
that's what this really comes down to: humanity versus profit. But I think we need to start looking
at this crisis the way we're looking at the environmental crisis, where companies are having to invest more in sustainable practices and products for their businesses, sometimes at a loss to them, because it's better for the world and the environment. I think the same thing
(24:26):
goes here. We're going to have to start taking cuts to our profits to ensure that fewer kids are becoming addicted and dying, and that I'm not sitting at a table with moms holding photos of their kids who have all died under the age of 15.
S2 (24:40):
Is tech's argument that they cannot do what you're asking them to do as a practical matter, or that they shouldn't be required to do it?
S4 (24:52):
Well, I think those are both really important questions. When it comes down to algorithm design and experiences, as I like to say, technology is a reflection of humanity. I think in a lot of ways they can 100% control certain features and things that they create. So it's not really about capability. It really
(25:14):
comes down to profit. That's what it's about, especially now. We just saw hundreds of thousands of tech layoffs last year, so I think there's probably going to be even more resistance than ever, just given the economic climate and what tech companies are up against. And that's just the reality of where we're at. And I think we can't
(25:35):
ignore the economic climate that we're living in. But also, this bill isn't really about preserving wealth; it's about preserving humanity.
S2 (25:46):
Do you know what's going on in other states on this topic?
Has anybody passed something similar to what you're asking for
in California?
S4 (25:55):
No. So right now, of the bills that are sort of on the table, I mean, California passed the Privacy Act. But I think the big thing we need to share with the world is that privacy and algorithm design are two completely different things. Of course, privacy is non-negotiable. Kids
(26:15):
should be protected, period. But protecting kids and their privacy
is not going to change the fact that these algorithms
are manipulating their young minds from the start. And we
need to actually make infrastructure changes, not just privacy changes,
in order to see a real shift in the emotional
well-being of the next generation. So that's sort of my,
(26:39):
you know, what I have to say about that. I actually just spent a bit of time at the Capitol working on KOSA, the Kids Online Safety Act, which is actually pretty similar to this, except it doesn't use addiction as the key term; it's more broadly about technology that's harmful to kids. And I
(27:01):
think this is just... I mean, it's crazy that...
S2 (27:04):
Federal legislation, correct?
S4 (27:05):
Federal regulation, yeah, federal legislation. And quite honestly, on mental health and social media, the US Surgeon General came out last year and said that social media is playing a significant negative role in the mental health of young people and that we need to do something about it. So, you know, the best way to do it is to go to the top and go to the bottom. The top is federal legislation, because protecting kids
(27:27):
in California is not going to save the world; it's a step in the right direction. And that's why it's so important for us to pass this: if we can pass this, then many other states will pass it. You know, we have Senator Blumenthal and we have Marsha Blackburn in Tennessee trying to work on this at a federal level. And I think if California can make it through, then we have an
(27:48):
even better case for a bill at a federal level to do this. And it's really wild, because there are really only a few key players contributing to this who could make one policy change, or a few policy changes, that could really protect youth, or at the very least ensure that they're putting their
(28:09):
best foot forward and putting their resources where their heart is. And I think that's what this is about. Of course, it's not going to be perfect out of the gate, because perfect is just not a reality. But it's going to make sure that people are at least diverting their resources into the right places and not slashing teams that are working on wellness and mental health, especially given what we're facing right now.
S2 (28:32):
If you're a parent and you're listening to this, what's
your advice to parents in navigating the digital world right now?
And where can they find resources about this?
S4 (28:42):
So our organization is all about empowering parents, or I'd say caregivers, students, and educators. We have an educational program called Social Media U that we've actually been bringing into a number of Bay Area schools, to really give kids the power they need to better understand how these tech platforms work and what they do, and how they can break through those barriers to take care of their
(29:04):
mental health. But I think as a parent, there are really three key things you need to know. One: don't create shame and negativity around social media. It can be a really powerful place and could even save your child's life if they're using it the right way. The more shame you create, the less likely they're going to be to talk to you about it. The second thing is to start the conversation about social media with your own personal
(29:27):
experience and acknowledgment. A lot of times, as parents, you're probably inquiring about this because you don't have a great relationship with tech either, and it's important to be vulnerable with your kids if you expect them to open up. And then the third piece is to make digital wellness fun and gamify the experience. You know, try to get them to calculate how much money they think the
(29:47):
platforms are making off of their time, and what that could add up to over a year or, you know, a lifetime of digital tech. And also make it fun: try to have screen-free holiday challenges, or weekend challenges where the first person to pull out their phone at dinner is stuck doing the dishes. And parents will also be responsible for that, too. So I
(30:08):
think we just have to take the stigma away and make it more fun and interesting and engaging, so that young people can ultimately just connect more with this topic in a way that doesn't feel scary and doesn't feel negative.
S2 (30:22):
If people want to find out more about these topics, get involved with your organization, or read more about your work, where should they go?
S4 (30:29):
They can go to www.halfthestoryproject.com, or you can find us online at @halfthestory anywhere. And we have tips pretty much anywhere that you need them.
S2 (30:41):
Thanks so much, Larissa. So great to have you. A hugely important topic, and I look forward to hearing more about the progress you will hopefully make this legislative session. So thanks for what you're doing.
S4 (30:49):
Thank you.
S2 (30:51):
We invite you to share story ideas, comments, and questions. Find us at Neptune Hopscotch or on Twitter at @NationStateofP1. Again, that's "nation state of P" and then the number one. Follow us and subscribe to listen to all of our episodes as we continue to explore the inside stories driving California policy. Thank you
(31:12):
for listening to the Nation State of Play podcast. Powered by Neptune and OLAS Media.