
November 17, 2022 53 mins

It’s been about 15 years since Facebook went mainstream and the iPhone was released. In this short time social media and our pocket computers have become indispensable, even addicting. And now we have a new frontier on the horizon, the Metaverse. You’ve probably heard this trendy buzzword, but do you know what it is? Dr. Ibrahim “Abe” Baggili, a cybersecurity and forensics expert, shares his opinion, some content and security risks his team has found, what his biggest fear is, and more. Then our second guest, Jim Steyer, a civil rights attorney and the founder/CEO of Common Sense Media, offers a path forward. His global child advocacy and parental resource non-profit is devoted to child privacy, fighting for big tech regulations, and helping parents navigate the ever-evolving digital wild west.

If you have questions or guest suggestions, Ali would love to hear from you. Call or text her at (323) 364-6356. Or email goaskalipodcast@gmail.com.

**Go Ask Ali has been nominated for a Webby Award for Best Interview/Talk Show Episode! Please vote for her and the whole team at https://bit.ly/415e8uN by April 20, 2023!

Links of Interest:

Dr. Abe Baggili: Twitter

Connecticut Institute of Technology

Hacking the Metaverse, LSU Media Center (11/08/22)

Baggili Recent Research: Rise of the Metaverse’s Immersive Virtual Reality Malware and the Man-in-the-Room Attack & Defenses

Information Commissioner’s Office (UK)

Jim Steyer: Twitter

Common Sense Media

Book: Which Side of History: How Technology Is Reshaping Democracy and Our Lives (2020)

Book: Talking Back to Facebook (2012)

Child Mind Institute

Metaverse in the News:

Oculus Founder Claims To Make VR Headset That Will Actually Kill You If You Die In A Game (Forbes, 11/8/22)

An Exploration of 12 Metaverse Use Cases (Ericsson, 6/30/22) 

Murder In The Metaverse: Crime or Creativity? (Medium, 5/28/22)

CREDITS: 

Executive Producers: Sandie Bailey, Lauren Hohman, Tyler Klang & Gabrielle Collins

Producer & Editor: Brooke Peterson-Bell

Associate Producer: Akiya McKnight

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to Go Ask Ali, a production of Shondaland
Audio in partnership with iHeartRadio. When I have
been with friends and that happened and I peed my pants,
I did lose the room, they did leave. I saw
her light up and I was like, I'm just going
to work, but we are here until one of our
last grips. I was just the one that was meant

(00:22):
to take care of mamma. It's for me to remember
every single day is that I always have a choice.
Everyone always has a choice. Whenever somebody says no, you can't,
or there's no rules for you, or you have to
look like this, I go. I'll show you. I'll show you.
Welcome to Go Ask Ali. I'm Ali Wentworth. Do you

(00:45):
know what the metaverse is? Because I do not, and
I'm already terrified of it. I don't want to live
in the metaverse. I don't want to make new friends
in the metaverse. I like where I am right now,
in three dimensional real life. It is such a big
topic that we have two guests to cover it. Ibrahim

(01:06):
Baggili is a digital forensics expert and professor who will
tell us what the hell the metaverse is, and
then Jim Steyer, founder of Common Sense Media, will fill
us in on big tech and how his organization is
advocating for kids' safety and privacy in this wild West
of digital media. Dr. Ibrahim Baggili is a professor of

(01:30):
computer science and cybersecurity at Louisiana State University. Prior to that,
he was the founding director of the Connecticut Institute of
Technology at the University of New Haven, specializing in cyber
security and forensics. The National Security Agency
gave UNH the prestigious designation as a Center

(01:51):
of Academic Excellence in Cyber Operations, and later did
the same for LSU, making it one of just twenty-two
schools on the select list. He also has co-authored
over seventy publications, and his work has been featured in
news and media worldwide and in twenty languages. Hello, Ibrahim,

(02:11):
I'm going to call you Abe for the rest of
this podcast, after I've established your very elegant name. Um.
The reason I'm having you on, besides you're a genius
of all this, uh, virtual techno stuff, is that I feel
I still scream at my iPhone to you know, turn
off the lights. You know, I look at Instagram, social media.

(02:33):
The metaverse is kind of like the wild West. And
I know, as we're trying to figure it all out,
as parents we're trying to figure it all out for
our kids. But let's just start with the big, broad,
simple question, which is, what the hell is the metaverse?
I mean, you know, you've hit on something that's close
to my heart, right, what is the metaverse? And you know,

(02:55):
people will give you many different answers, but the real
answer is nobody knows what the metaverse is because, in my
personal opinion, it doesn't really exist at this point in time.
It's what people are trying to build, right. Um So,
so the metaverse is supposed to be essentially an alternative
virtual world where you know, you could do things in

(03:17):
a virtual environment with your let's say VR headset or
mixed reality headset, and you can socialize with people. You could,
um hang out with people, you can go to class,
you could do all the things that you do in
the real world, but in the virtual world. That's basically
what it's supposed to be. Um it wasn't invented by Facebook,
and I just wanted to make sure I say that

(03:38):
because Facebook renamed itself to Meta, and you can't
really take that credit for virtual reality. And that sort
of thing has existed for many years, so it's
not like something that's, uh, that we're just talking
about now. Well I wasn't even gonna mention Facebook, So
how about that? Um, I think of it as sort
of a virtual reality based world that's separate from our

(04:03):
physical one. Is that aptly put? Yes and no? Because
it could be actually part of your physical world if
you're talking about mixed reality or augmented reality, right, So
like the HoloLens glasses, um, you see the
physical world, but you augment it with information. Right. So
for example, if I'm wearing my glasses and my glasses

(04:24):
are telling me to turn left, and there's a signal in
my glasses telling me to turn left, I could still see
the physical world, but now the virtual world is also
interacting with me and overlaid on top of the physical world.
And virtual reality is when you are immersed into a
virtual environment completely. So if you wear a headset, it's
going to cover your eyes completely. You might have headphones

(04:46):
where you're listening, so you can't see anything except the
virtual world. So when people use the word metaverse loosely there,
I think they're including this idea of augmented reality and mixed
reality into that equation. But it is a
reality that if you pull off your glasses or your
helmet or your goggles, you are no longer in that world.

(05:09):
It's a three D form, yes, uh, definitely, it could
be a three D form. It could be a two
D form also. I mean, a good example of that
would be Ready Player One, the movie, and I'm
sure a lot of people have watched it. You know,
you're in this alternative world that people are
basically stuck in and they live in, and, you know, um,

(05:29):
maybe we'll get to the matrix one day where you
plug in. You know, it feels like we're getting closer
to that. Are you for or against the metaverse? Like?
Have you bought beach side property there? Uh? I mean no,
and I don't intend to do so. UM, I understand

(05:50):
the potential market value, right. UM. I was at a conference,
a really big conference in California, UM, talking about you know,
virtual reality, security and all these problems, and I basically
told people the metaverse, as you know, maybe Facebook or
other organizations are trying to build it is nothing but
an app store, and they looked at me, it's like,

(06:11):
what do you mean. I'm like, it's nothing but an
app store. Like, it's an app store for all of
these different virtual worlds that other people are going to create.
You know, you go on your phone and you download
an app, you know, for video conferencing, you download games,
and you download all of these various things and then
you immerse yourself in them. And if you think about
when the dot-com era started kind of emerging, Yahoo

(06:35):
and all of these companies, what they wanted to do
was to become the front page of the Internet, right like,
and Google of course was a big winner in that
regard towards the end, which is, you know, through a
search engine. But if you think of the metaverse, I
mean the metaverse, the winners are the ones that are
going to be that that front page for all of
the other things that people are gonna build in those

(06:56):
virtual worlds. Um and and I think that's really what's
happening right So um that that's really where my mind
is is we're we're still at the stage of creating
some environments that people can be in, but not necessarily
the environment that's the access to all other environments. Well,
why don't we just stay in reality? Why do

(07:17):
we have to go to the virtual reality? I mean,
that's a great question. Why why do we have Instagram accounts?
Like why do we want to show what we just ate?
I mean, humans, um, humans I believe have a tendency
to want to share at least good parts of their

(07:37):
life on the internet. And there's a whole you know,
theory of deindividuation when you look at psychology
and and you know how when you're online you feel
more anonymous. Therefore sometimes you can be more of yourself.
So there's all of these various reasons why you know,
online environments can actually be quite awesome, and especially virtual environments.

(07:57):
Like, you know, if your grandma is, you know, older,
and she came from another country and she wanted to
see this church that was in her hometown, but she can't
travel anymore, and all of a sudden, uh, you know,
you can get her to see that, uh, And those
sort of things have happened right where it really gives
them a sense of nostalgia and a sense of something

(08:19):
that that's awesome to look forward to. Um. But you know,
shouldn't be everything, and that's really the question that I
don't know. I mean, I think just we're as humans.
We get addicted to certain technologies and certain things, and
you know, the technology goes out there and and there's
a lot of issues that we weren't anticipating, right. But also,

(08:42):
you know, when I think about when you were talking
about Instagram and now virtual reality, it's sort of all
boils down to connection. People want connection. Um. Certainly we
saw in the pandemic that particularly teenagers needed connection
more than anything, and virtual reality would benefit that, just
like Instagram and you know, so so much of social

(09:04):
media did benefit them from being lonely. And I
have read that there are benefits, um, you know, in
virtual reality in the metaverse for autism therapy, learning retention, socializing.
So yes, I think there are with most Internet stuff,
there are good stimulating factors to it. On the other side,

(09:26):
I worry about a word that you just said a
minute ago, which is addiction, because we seem to have
an addiction problem in our culture when it comes to technology,
you know, and it started with Facebook and then Instagram,
and you know, how do you ever get your kids
to disconnect from the metaverse? Why not just live there

(09:47):
full time. Um. I mean, it is a problem, and
we're starting to see parts of it. Um. But then
the question when it comes to kids, right, is at
what age is it appropriate for you to have that?
And and different companies have different ages, Different companies have
different age limits, and there aren't really any sort of
standards for us to understand that a little better. And

(10:09):
and that's kind of a problem. You know, you don't
just give something to everybody without any rules, like there's
gonna there has to be some rules of engagement, right.
There was also a study, um, I think the Information
Commissioner's Office, uh, did a study where they went into
VR, I think it was VRChat if I remember correctly,
and, you know, the researchers found that users, including minors,

(10:33):
were exposed to abusive behaviors about every seven minutes. They said, um,
things like, you know, minors being exposed to graphic sexual content.
Like should that be the case? Of course not, that
should not be the case. And here's the problem. When
your kid is is you know, has that VR headset on.

(10:54):
You can't most of the time see what they're seeing
unless they cast it onto another device. So let's go
through a few of the warnings, particularly for children, because
let's say they're on Gun Raiders or one of,
like, the gaming devices. Everything I've read says that

(11:14):
kids are exposed to bad words, racist words, and sexual content.
Now I'm not saying that's every time they log their
brains into this, but you know that's that's enough to
be scared of as a parent, just those three things. Actually,
I think it's even worse than that. There's bullying, there's

(11:35):
sexual harassment. Um, grooming is a big thing. Obviously, racism, violence, content mocking. Certainly,
I have two daughters. You know, I'm worried about how
you know safe that world is for girls because on Instagram,
you know, they have a lot of issues with appearance

(11:57):
and self esteem and here in the metaverse they can
make themselves look and be anything they want, which, on
the one hand, is not realistic, and in how they
see themselves, it does set them up. As you said, grooming, uh,
explain what virtual reality grooming looks like? How do you

(12:18):
do that in a VR setting? You know, you would
have essentially an adult that would be in virtual reality
and they would talk to children and start slowly talking
to them about, you know, things that are completely inappropriate. Um,
and eventually get them to do things that they shouldn't

(12:40):
be doing. And when you say get them to do things,
are you talking about getting their avatar to do things? Well?
Their avatar. But here's the thing. If the avatar is
doing it, um, it's a sort of representation of a person in
the real world, right, it's not just a theoretical avatar
that's just doing things. So I think that's

(13:00):
the problem there. Can you block avatars? If I felt
uncomfortable with somebody or if my child was complaining to
me about an avatar, could I block them? Yeah? I
think it just depends on on the application you're using.
Some applications might enable you to block certain characters and
avatars and others won't. And and a lot of systems

(13:22):
don't have the parental controls that are typically available on
other devices, those sort of headsets. I mean, I don't
know how old your children are, but you know, my daughter,
she's younger, and she's on what I call a gateway drug,
which is Roblox. Right, everywhere I go people are
talking about Roblox because Roblox is a game that's three

(13:45):
D and you interact with other kids and things like that,
and it's just blown up, you know, at least that's
on a tablet, which has parental controls. And so are there parental
controls for the metaverse specifically? Well, again, the metaverse doesn't exist,
right. It's just something that they're

(14:07):
trying to sell us on that hasn't happened yet that
you know. But there are parental controls in some of
the applications that that you can use, uh, you know,
on your tablets and on your VR headsets and things
like that. That might not be part of the entire system, right,
but it could be parental controls for a specific app

(14:29):
that would be installed on your on your device. So
is it perfect no? Is it an added level of security? Absolutely,
But we also have to take into account something else, right,
Like we are operating under the assumption that all the
kids we're talking about that are using these VR systems
are in great home environments, you know, we're talking about

(14:51):
this like you know, all parents and all kids lives
are equal, and that's not the case because a lot
of the kids that might get groomed, or a lot
of the kids that might have challenges are basically using
this as an escape mechanism potentially from their real world,
which is not a great world to be in in
the first place. And that's when you know, a lot

(15:11):
of the taking advantage of might actually take place, as
it does in the physical world. I know you say
this is also the physical world, but I mean, you know, absolutely,
I mean, I think legally it's a challenging question, but
psychologically it's a very important question for us to pursue.
I mean, think about you know, sexual assault, if that

(15:32):
happens in VR, which people have claimed that's happened to them.
You know, what are you going to argue that this
is just the virtual world and and that's it because
of immersion, they feel like they're really in this place. Um,
could be really damaging and you can't just be like,
that's not real. There was no real you know, physical

(15:52):
sexual assault, right because there is the there is the
psychological trauma, but there may not be you know, bruises
or pregnancy or scratches. You know, they're like forensic evidence
of it, right absolutely, um. And actually that's a lot
of the work that we initially did. My area of
expertise is mostly in digital forensics, so like you know,

(16:15):
investigations that involve computer based systems and extracting the evidence
in a legal and scientific manner. And one of the first
things we did with work that was funded by the
National Science Foundation is look at what evidence can we
recover from these systems so we can answer the questions
of the who, the what, the when, and the where um.
And we were able to find some good stuff at
least with the systems we looked at. If you'd like

(16:37):
me to talk about it, yeah, share it with me.
Unless it's secret. It's public, people can read it. Um. So
there's uh security research. We have this thing called a
man in the middle attack, and it's the idea that
you know, right now, I'm communicating with you, and if
there was somebody that was intercepting the communication between me
and you, they would be injecting themselves in the middle.

(17:00):
And that's what's called the man in the middle attack.
So imagine you're in a virtual room, maybe having dinner
with your significant other, right, or doing something very personal,
and there's this person right there in the virtual environment.
You don't know that they're there, you can't hear them,
you can't see them, but they're they're right. I call
this like the invisible peeping tom right. And then the

(17:23):
human joystick attack. So essentially, if you're playing
virtual reality, you're in a virtual world, but that world
is reflected in the real world, right, So if I
move my hand up, or I move my hand down,
or if I walk around, you're still walking in a
real room, right, um, And that real room has boundaries
like a wall, and typically what you do in VR is
you detect the real wall so you don't hit

(17:45):
it in the virtual world, so that when you get
close to the real wall, it kind of gives you
a sign that this is a real wall, don't walk
into it. So we were just playing with a file,
and it was like a bug in the code
that moved the center of the room, and all of
a sudden, the girl that was in the virtual environment
just moved to the location that he specified by mistake.

(18:07):
And we're like, whoa, you can control people. You could
move them, right, So basically you can just start pushing
people to any location you want them to to move to.
And we thought that was really interesting and we called
it the human joystick attack. Oh my gosh. And there's
also I've read about cyber sickness. People get motion sickness
from in this VR world. Yeah, if you're in VR

(18:31):
for a long time, it doesn't make you feel great.
Number one. Number two, we also ran that as
an experiment where the environment started flipping around and moving
around and made you feel really sick. Um. We also
did a new one. We called it an overlay attack. So
imagine you're in virtual reality playing your game and then
people take over your headset by just putting images in

(18:52):
front of it, and now you can't see anything. Worse,
what if they start putting pictures of your own kids
in front of you? Like that going to really psychologically
impact the person that's uh, that's in that environment as well. UM,
I mean, what kind of parental guidance are you going
to put in place in your home for this? The

(19:13):
main thing for me is, you know, we can't control
everything that our kids do, but what you can do
is at least have some parental controls. One of the
the simplest things that parents can do. UM. You know,
if you're using a computer or laptop or a VR
system or an AR system or something like that,
you should probably be using it in some public space,

(19:34):
like in your TV room where there's other people around you,
Because I think is as soon as they're putting an
environment where they're completely on their own, other things could happen.
If you're using Roblox, you know, make sure that
they can't chat with other people. Make sure that you
know you're choosing the right age limits for the applications
or the games that they have access to through Roblox

(19:56):
and other systems like that. And I assume that you
as a parent have the same conversations about virtual
reality that you would with any other app or social media,
which is, you know, if an avatar you don't recognize,
you know, comes into play, you know. It's
like there's a whole education for parents every time something

(20:21):
in this cyber world gets created. I mean, I say
to my kids, let us know if anything makes you
feel uncomfortable. We just want to make sure that there's
an open line of communication is I think the most
critical thing. And I think I think by by ensuring
that you constantly tell them you can tell us anything,
we're not going to be mad at you. Um now

(20:44):
you might get mad at them, but that's you know,
that's another story. But in that scenario, like you can
tell us anything and you know we're here to support you.
I think goes a long way in terms of ensuring
that you know they can come to you with with
with with the things they they should be telling you, right.
I mean the truth is, I think that we just

(21:07):
can't keep up anymore. UM. I mean, if you really
want to be honest and we want to talk openly, yes,
I do want to be honest. We can't keep up,
you know. And the amount of information that's thrown at
us constantly is parents, is I think beyond our ability
to comprehend what's going on. And I'm the person that

(21:28):
does this for a living in terms of research, and
I'm saying I can't keep up anymore. UM. I can
only imagine, you know what other parents that are not
in the technology field, they're probably feeling like, oh my god. Yeah,
I am one of those parents, And I feel like
there's a mental health crisis with these younger generations right now.

(21:49):
And I can't tell if virtual reality, the metaverse, social media,
if it's helping them, if it's not helping them, because certainly,
I know, like I said before, the during the pandemic,
that they were able to connect. I know that there
are good, positive socializing aspects of this. UM. But do

(22:10):
you think that there's a benefit for younger children
and teens when it comes to anxiety? Well, I do definitely
believe there's good stuff and there's bad stuff with every
single thing that we do. There's definitely some very positive
use cases for you know, using virtual reality, uh, you know,
for people that might not have great social skills, for

(22:32):
folks that might have autism and um. But also there's
some very negative things about it, like all the data
that they might be collecting with your eyes and your
gaze, and whether that data, you know, is being used
positively or negatively, and are they just, you know,
leveraging that data for making more money or not? Um
And and these are the sort of questions that that

(22:53):
we need to think about very carefully, other than the
safety issues and of course a lot of the challenges
that we talked about. But definitely there are positive use cases, um.
But it doesn't mean that they're going to overpower the
negative use cases. And that's that's kind of the the
crux of the problem. Right As a security researcher, you

(23:15):
realize that when companies are releasing new technology, they're trying
to get it to market as quickly as possible. Right.
So security is a complete afterthought. Um and and and
that's one of the issues that really happens. When people
get excited about technology, they just want to launch it,
they want to put it in people's hands, and the
rest is history. Right, And then all of these problems

(23:37):
start appearing, and we're not thinking about the problems. So
you know, sorry, I'm kind of drawing a dark cloud here. No,
I'm happy you're drawing a dark cloud because I have
a lot of fears, as I think a lot of
people do about this. And I'm even gonna make you
go darker by asking you what are your biggest fears

(23:58):
about the future? And the metaverse? Um? I mean, so,
so there's there's the saying, right, money is the is
the root of all evil? Right, that whole metaverse thing.
It's not really driven by you know, it's not it's

(24:18):
not pure. People are not trying to move into that
direction because they're thinking that it's really going to help people.
The hard push from all these big companies, it's
not driven by we're gonna make your life better, right,
It's really driven by we're gonna make more money. Right. So,
so that's really what scares me is is there's no
real foresight and thinking about the potential impact of moving

(24:44):
in that direction, like in terms of how do we
do this right? And I think that's honestly one of
my biggest problems with not just the metaverse, with any
new technology that's coming out. You know, I think that
parents should know that at this point the metaverse is
not there, but that extended reality, virtual reality, augmented reality

(25:10):
is truly there, UM. And and you know, we need
to be very cognizant and as parents, we need to
maybe unite in some fashion to make sure that you know,
things are being done in a way that that will
benefit our kids in our society. And I think that
that's that's an important message that we need to all

(25:31):
consider as we're moving towards the so-called metaverse. Yes, yes,
thank you. UM. Before I end this, I want to
just ask you, Abe, how did you get into this? So?
I mean, I'm from Jordan in the Middle East, um,
and I grew up in the UAE, um, Dubai, Abu

(25:52):
Dhabi area and where I grew up, the Internet was
fully censored, um. And in order for me to see
things that I wasn't supposed to see as a kid,
I had to learn how to you know, bypass the
Internet proxy at that point in time. So you were
a baby hacker. So that's how I got into security, right, okay,

(26:15):
baby hacker. Thank you so much for doing this, Abe,
I really appreciate it, and thank you for all the
work you do to you know, help keep it safe
for all of us. Thank you. And you know, my
students did the hard work, you know, I just was
there to support them. And it's time for a quick break.

(26:35):
But wait, when we come back, the founder of Common
Sense Media tells us what his organization is doing to
help us protect our children. Welcome back to Go Ask Ali.
My next guest is Jim Steyer, the founder and CEO

(26:56):
of Common Sense Media, a nonprofit that provides ratings and
recommendations of safe media for kids. Steyer describes the group
as nutritional labeling for media. It also focuses on the
effects media and technology have on young users and advocates
for kids' privacy protections online. On top of this, Steyer
is an award winning professor at Stanford University, where he

(27:19):
also attended college and law school. And I met him
when I was twelve. So, Jim Steyer, besides being the
founder and head of Common Sense Media, we should probably
disclose that, in fact, you were my camp counselor at
a very young age. You were very very cool camp counselor.

(27:40):
You wore like puka beads and you were always in bare feet,
and your hair was longer. And look at you now,
look at you now. I love that. I love that.
You know that I still think of you as Dabber.
That was your nickname when you were twelve years old. Yes,
it was. And uh, you are one of my all
time favorite campers. That means a lot to me.

(28:02):
Thank you. That is totally true. So, first of all,
you are the founder of Common Sense Media, and as
a mom who has always been concerned about content, I'd
like to thank you. Um, how would you best describe
Common Sense Media? I would describe us as the biggest
child advocacy group in the United States and also the

(28:24):
biggest media and tech advocacy group in the world. And
I think most of your listeners in our audience know
us because we rate and review every movie, TV show, video
game, and website. We have about a hundred and fifty million
unique users, and also we have a curriculum on digital
literacy and citizenship that's in a hundred and ten thousand
schools in the United States and around the world. So we

(28:46):
created that whole field of the safe, ethical, responsible use
of cell phones and the Internet and Instagram, Snapchat, that
kind of stuff. So it's a big nonprofit. How did
you jump into this? This is obviously something you're very passionate about.
I graduated from Exeter, the fancy private school up in
New Hampshire, and I spent a year teaching in Harlem

(29:07):
and the South Bronx with my mom before I went
to college. And that was right before I became a
camp counselor. And so when you knew me when I
was eighteen years old, I already knew that I wanted
to work with kids, and I knew that I didn't
just want to work with Upper East Side kids, but
but I wanted to work with kids who lived in Harlem,
the South Bronx, East Oakland, the toughest areas in the

(29:28):
United States. And that's really been my career. So I
went to Stanford, I taught again in the worst schools
in New York City for a couple of years, and
I went back to law school and became a civil
rights lawyer. And the reason I started Common Sense Media
was there was no major child advocacy group in the
United States that had a constituency, which is really parents
who have to advocate for children because kids don't vote

(29:50):
and they don't have political power. And the reason I
started Common Sense was to get people to join the
organization and understand that that both media and tech media.
At that point it was mostly movies, TV, video games,
We're having an incredible impact on kids lives, but also
that you needed to be part of an organization that
advocated for children. I did not know in two thousand

(30:13):
and three that that rating and review platform would be
so successful so that moms like you would use it. Yeah,
it's a it's a big need. We can't we can't
as parents. I knew that I have four kids. Yeah,
you have four kids. You can't be aware of what
movie or what game or what you know new app.
You can't control it let alone you know know if

(30:34):
there's going to be nude scenes or swearing or violence.
And so it was a huge need in our culture.
I really appreciate it. And what happened was you could tell
right from the beginning once we had those movie and
TV ratings. There was nothing like it at the time,
and people just flocked to it. And then what happened
was a few years in this happened the iPhone and

(30:58):
Facebook and Instagram and Snapchat, and the truth was
kids went from watching movies and TV shows to being
glued to their screens. And we were there and we
were by far the biggest organization in the country, and
we realized, oh my gosh, what we really have to
do is look at the impact of technology and cell
phones and social media on kids and the rest is history.

(31:22):
And by the way, and I want to do a
deep this is the area I want to do a
deep dive into because um, I mean, of course, we
didn't grow up with social media or cell phones, so
there's no This is like the wild West for us
parents that didn't grow up with it. And even though
I have older girls, I feel like the conversations with

(31:42):
parents all the time is when do I give my
kid a phone? When do I allow them to do this?
Because we have no idea. And you know, I do
this work with the Child Mind Institute because I have
very, very strong feelings about technology and how it's sexualized
this whole generation of girls. And I feel like what
people don't talk about is there's a whole new component

(32:05):
to parenting, you know. I feel like I've spent a
majority of my lecture series with my children talking about tech.
I totally get that. And I went to Silicon Valley
with the Child Mind Institute, and there were a
lot of people from Google and Facebook and everything there.
We had this big conversation, and then afterwards many of

(32:27):
them came up to me and whispered to me, Oh, yeah, listen,
it's far worse than you think. I won't let my
kids go. And I'm like, well, wait a minute, come on,
you hypocrites. You mean you're making billions of dollars working
for these companies and yet you know how bad it
is for our children. You're absolutely right, Ali, and every
parent deals with that today and they need basic, simple advice.

(32:48):
Um and I will tell you we also have big
political leverage over the tech companies. So, for example,
you mentioned Google. Sundar, he's the
CEO of Google. He's actually a good guy, way better
than the predecessors, I would tell you on these issues.
He cares about it a lot, and Susan Wojcicki, who
runs YouTube. We work with them too now. By the way,
we criticize them when they don't do good stuff for kids,

(33:11):
but we also work with them and I agree they
now all understand in the big Silicon Valley companies as
parents, A, the incredible impact that their platforms are having
on kids, and B, most of them, with the exception of
Mark Zuckerberg, love Common Sense Media. And the reason Mark
doesn't is because we're so critical of Facebook and Instagram
and have been for so long. Um, but this is

(33:34):
the twenty-four-seven reality for every parent, and we try to
make it simple easy, you know, and and you know,
we have several hundred staff here who all they do
is try to educate parents in the broader public about
the impact of media and technology on kids, what we
can do about it, and then we try to pass
big laws around privacy and holding the social media platforms

(33:56):
accountable for the misinformation and disinformation. We're doing a study
on porn and sexting. As a mom, I guarantee you
you will be interested because most parents out there don't know
their kids are exposed to stuff at ridiculously young ages. Now, yes,
so I mean, you know, a little seven year old
boy can just google titties and he can go to

(34:17):
the dark web in a matter of minutes. Correct, And
they're not supposed to be on YouTube or TikTok, but
they are. I mean that's where kids live today, right,
It's primarily on YouTube, TikTok, Instagram, Snapchat, probably in that order.
And the problem is these are all driven by algorithms,
and what happens is they also maximize the most sensational stuff.

(34:37):
That's how you engage people. Violence, sex, and so little
kids can go down the rabbit hole very quickly on
these platforms. It must be such a battle for you
because there are so many billions of dollars made in
this arena, like you know, sex and violent cells. But

(34:57):
we're pretty powerful. I mean, you'd be surprised, a honestly.
I mean this we've been a almost all the executives
use our stuff. You said you went out there for
the Child Mind Institute, and by the way, we work
with them on kids and mental health issues a lot um.
But the guys who run most of the companies and
the people at the levels below them are almost all
Common Sense Media consumers, and so we go to them.

(35:19):
But we also regulate them. We passed all the big
privacy laws here in the United States. We passed this year
in California, over the opposition of certain tech companies,
the Age-Appropriate Design Code, which means that if you're
building a new platform like a Snapchat or an Instagram,
you have to think about the kids who are in
your audience and build it for that. And you know,
Europe and the rest of the world are in

(35:41):
certain cases farther ahead than the U S. The U
S has done a terrible job of regulating the tech
companies, in my opinion. And only over the last few years,
because the broader public totally agrees with the common sense
perspective on this, has everyone come to understand that the
wild West, which is how you correctly described it, has got
to change. So we're entering an era now where young

(36:04):
people themselves, including yours, have really come to understand how
much their lives are being shaped by these devices in
these platforms. So give me an example of something that
you fought and won. I'll give you two good ones.
Early, early years of Common Sense Media. Early years, we
went after the video game industry because they made really disgusting,
ultra violent and sexually violent video games like the Grand

(36:27):
Theft Auto series or Postal, and so they were coining money.
Video games, by the way, are a way bigger business
than movies. I don't know if the public knows that,
but those companies are way more valuable than movie companies are.
And you know, boys in the US and around the
world are addicted to video games in many cases. But
before we came along, there were a ton of ultra violent,

(36:49):
sexually violent video games that kids could buy at age ten.
So we went after them and passed laws in Michigan,
in Illinois, California, and elsewhere that stopped the marketing and
sales to minors of sexually violent games, which also had a lot
of racist stereotypes. I mean, black and brown people were

(37:09):
always either prostitutes or drug dealers in the games. So
we went after them and won, and we filed all
these laws. By the way, the case went up to
the Supreme Court on First Amendment grounds, and it's like
six years later we lost, but it didn't matter because
by then the video game industry cleaned up its act
on a lot of this stuff and stopped marketing and

(37:30):
selling them to kids. So that's a big example where
we took on a huge industry and won. I would say
the other biggest one is, look, I wrote a book
called Talking Back to Facebook, which really pissed off
Mark Zuckerberg and Sheryl Sandberg, and they even wrote
me threatening letters saying we're going to block the publication
of it. And I said, how you know, I'm a

(37:50):
First Amendment law professor at Stanford. You're not gonna block
the publication of my book. But, um, it exposed
them for what they did around girls' body image and
all the performative stuff that kids do on these platforms.
And in those days, Facebook was a platform that kids used.
But what happened there was I wrote a whole chapter
about privacy. Privacy is a fundamental right. But if you

(38:13):
think back a decade ago, Ali, my own children told
me Dad, no one cares about privacy. Privacy is passe,
and people like Mark Zuckerberg and Eric Schmidt, who was head
of Google in those days, they would go around telling
the world no one cares about privacy, privacy is old
fashioned stuff. Well that was ridiculous. It's also a fundamental
right under the US Constitution, although with this Supreme Court,

(38:35):
who knows. But having said that, what happened was we
passed in the California Consumer Privacy Act, which is the
law of the land in the United States, and it
gave everybody in your audience in the United States rights
as consumers to protect their own personal information, data and
their kids. So we had a huge victory in that one.

(38:56):
And what happened is we split the industry. So Microsoft
and Apple came with us. Remember, their business model is
they make money by selling devices, not by hoovering up
your personal information or your kids personal information. Whereas with Facebook,
you are the product or your kids are the product.
They're selling your private information to advertisers. That's their business model.

(39:19):
It's also Google's business model, by the way, and it's
why the leadership of those companies are so important in
terms of where they're responsible or not. How much have
you learned from your own kids about this? I mean
really a ton, right, totally. I mean I'll say this
when we first when I first started Common Sense Media,
Let's see, Lily, our oldest, was about nine, right,

(39:40):
So we have two boys and two girls. So I
saw the body image issues really big with my daughters, right,
and I could see them photoshopping their images. I
could see body image and eating disorder issues with them
and their classmates. You have plenty of young people, even tweens,
let alone teens, who are experiencing that. And it's

(40:01):
because they're constantly exposed to social comparison. You know, they're
growing up on these platforms where you're constantly trying to
show you're perfect, which only you are, Ali, only
you and I are, we're the only two perfect
people. And so, when I wrote Talking Back
to Facebook, I could
not believe how much kids' self-esteem, girls and boys,
(40:23):
not believe how much kids self esteem, girls and boys
was being shaped by these platforms, and how irresponsible the
platforms were about it. Even though some of the people
who random were parents. Part of the problem was the
people who built all these platforms were twenty five years
old and they didn't even think about what they're doing.
They're just making money, right, and they're building products that

(40:44):
were addictive, and they built all these features that would
addict you to the platform. No, I mean I I've
seen it, and I've talked a lot about, you know,
particularly with because I have girls, how it affected girls
eating disorders, but also how many likes you got was
you know, affected your self esteem or I remember one
boy on one of my daughter's um instagrams, which by

(41:05):
the way, are private, and I said, they will be
private and your till your twenty one. UM. One of
the boys was like, you'd be hot if you had teds.
You know, things like that that I'm like, that get
in their heads and I'm like, who is this kid?
I'm calling the mother and they say, mom, no please.
But then they would see, you know, girls just a
couple of years older than them, you know, in bikinis

(41:27):
with their thumb in their mouth and all this stuff.
And I really I spent hours saying to my kids,
this is a girl that has a hole she's trying
to fill. This is not something to emulate. Please don't
think that this is your currency. And it's exhausting and
it's you know, right now, it's completely out of control.
I can't. I mean, I look at Instagram and I
just go, oh my god. I agree. Yes, and it's

(41:53):
time for a short break. Welcome back to Go Ask Ali. Okay,
but here's what I want to talk to you about, too.
What are you doing about the metaverse? A lot. Okay, good,
tell me, because that terrifies me. Okay, so we're

(42:13):
coming out with a report, so it should be terrifying,
partly because the leading company in the space is Facebook, and
they're the least responsible of all the companies. I would
never trust my kids to that company, right and they're
the biggest player in the metaverse. So we're about to
come out with the report. It's going to show you
that it's a completely wild wild West scenario and that
they're hoovering up your kids data. So if you think

(42:35):
online, data on Instagram or YouTube, you're giving them
data, right? In the metaverse, they get all your facial characteristics,
they get all of the data about the way you
behave And there's not a single company in that space
right now that's doing it in a way that's sensitive
to kids and teens. I would not, as a parent
right now, let my kids roam in the in the

(42:57):
metaverse. Nowhere. Tell me why, what to you is the scary
stuff. Because the companies don't have protections, they don't have
privacy protections. And also you can get into porn, You
can get into a lot of stuff you shouldn't be
in as an eighteen year old, let alone
as an eleven or twelve year old, which is who
we're buying some of these headsets for, and it's a completely
unregulated environment. The metaverse should scare you, but what are

(43:18):
you scared about? Tell me, like porn, it's porn, but
three D porn that they're participating in exactly. And the
thing is this, if you actually look at new platforms.
Steve Case, the founder of AOL, would
tell you this: when new platforms emerge, the way
you really grow an audience is usually with bad stuff
at the beginning, right, before it becomes universally popular. So

(43:38):
AOL in the early days had a huge porn audience.
So if you really look at how how those platforms,
social media platforms work, it's the sensational stuff that creates engagement.
So it's violence, sex, and anger. So that's a really
the last one is a really important issue. Hate and
anger appeal to people. Right, they stay on there and

(44:01):
then they get in big, you know, virtual arguments with
each other, and kids are susceptible to that. They're younger, right,
they get drawn in. And also there's a massive amount
of misinformation and disinformation on these platforms, which we've been
very critical of them for. We created, with the Anti-Defamation
League and the NAACP, a campaign called
Stop Hate for Profit a couple of years ago that led

(44:23):
to an advertising boycott of Facebook. But there are
a lot of ways as a parent that you really
need to be aware of what your kids can be
exposed to. Is there any worry about like spending too
much time in the metaverse and not in one's real life.
I think so, But I think that's true with any
screen honestly. I mean, if you're in the metaverse for

(44:44):
three hours, that means you're three hours you're not out
in Central Park, or you're not out playing outside, or
you're not you know, out with your friends. You're just
zoning out in a virtual world. To me, that was
one of the things about the pandemic is it was
so hard on kids and some of the mental health
issues that that we're already there, but they were exacerbated
by the pandemic because kids sat in front of screens.

(45:06):
Our youngest kid was in high school, and you know,
he would sit in his bedroom right and go to school,
and I was worried about him being depressed because you're
in front of the screen all day and the big
ramifications by the way for lower income kids, because the
other part of what happened in the pandemic is a
lot of kids just didn't go didn't go to school.
If you're in a single parent home where your parent

(45:27):
is working full time and you have to leave your
kids at home and unsupervised, you know, to go to school,
a lot of kids just sort of dropped out and
the schools weren't able to follow through. So we're going
to be living with the consequences of the pandemic on
young people for a while. Have you found with tech
um the suicide rates have gone higher with with teens

(45:49):
in terms of isolation, And so here's what I would
tell you. First of all, it's not a pure causal relationship.
I mean one, we're pretty careful. I mean again wearing
my Stanford professor hat um. You have to be careful
about just like blaming it all on social media, right
or tech And it's simple to do, but it's complex.
The data is clear that kids who are already anxious,

(46:12):
or have other minor issues of depression or anxiety or
the normal insecurities that all adolescents have, they can
be exacerbated by their online experiences and their social media experiences.
So there are clear major mental health implications of that.
And this was really exacerbated during the pandemic because kids

(46:33):
are just living their lives online. All the screen time
rules went out the window during the pandemic, partly because
you were going to school for a while
online. I mean, weren't there aspects of it
that helped kids um that they were able to talk
to friends and not feel so isolated, that there was
some form of socialization. Yes, so platforms can do good

(46:55):
stuff too, right. It can actually connect you to mental
health resources, online counseling, a lot of stuff I don't
think people realized was there until the pandemic. So it's
not a simplistic that it's all caused by being you know, online.
But I will tell you that about half of all
American teenagers experienced some form of depression during the pandemic.
That's unbelievable, right. And we have a mental health

(47:17):
epidemic right now. So Common Sense Media and Child Mind Institute,
who we partner with a fair amount, are probably the
leading organizations around this. We're about to do a big
campaign with the Surgeon General of the United States, Vivek
Murthy, who's fantastic on these issues and who wrote
a book about isolation and loneliness. See, that's another issue
for all adults, but kids, is that social

(47:41):
platforms and Internet platforms can allow you to isolate and
just sort of retreat into your own little world. And
and that's really dangerous for young people. Yeah, I mean
I can feel it myself. I you know, I I
can monitor how I feel when I go on Instagram
and stuff, and I know how much time I spend
on screens that I could be, you know, out

(48:01):
walking the dog or doing other things. So and it's
it's you know, ten times that for the kids or more.
And it's also compulsive and addictive, and you constantly looking
at your phone. You're constantly checking your messages. But if
you talk to kids about it, they know that more
than we realized, and they're they're concerned about it. But
they also have issues that are very interesting that I've

(48:22):
learned because kids view it differently than we do. So
for example, um, one of the reasons why they so
constantly check their phones is they want to be there
for their friends and they feel they need to respond
to their friends, and I'm like, why,
who cares? But they actually feel pressured to do that. Um.
They also feel pressured to perform sometimes politically, as as

(48:42):
young people have become more politically aware during the insanity
years of the Trump era, they have also now been, you know,
called to activism online. Right, teenagers feel that they should
respond to issues like racism and and other social problems online.
So there's a lot of pressures they feel
that we don't always understand. So I was my concern

(49:04):
was I was worried that these younger generations were losing
their sense of empathy. But based on what you just said,
that's maybe not true. So maybe there is empathy more
than I thought. And it's it's a really important point, Ali,
because empathy is so basic to human relationships and to
friendships and to family and everything. And the truth is

(49:24):
it's not black and white. Meaning the platforms do enable
you if you really read what kids say. It's really important.
And by the way, this is what my kids tell
me all the time, Dad, you don't get who we
really are. And there's some truth to that because we
did not grow up with this, right, and they are
natives, they only know this experience. They have no other
childhood or adolescence without these platforms, and it's

(49:48):
more nuanced. And empathy, they can actually communicate online
in ways we do not, right. We don't use the
platforms the way they use the platforms. Um before we go,
tell me what you're hoping for with common sense media
in the next let's say five years, what are what
are the sort of the dream the dream ideas. I

(50:09):
would say, number one, you permanently close the digital divide
so that everybody in the United States has access to broadband,
because whether you like it or not, you need it
for school, you need it for work,
you need it to do your homework. So permanently close the
digital divide. The money's there. We're doing a huge campaign
on that for the next couple of years. I would
say another big dream is that we really address the

(50:30):
youth mental health crisis in the United States and adult
mental health crisis. But that we really do that, and
that we make the resources available to everybody who needs
it in the suicide prevention area, but also for people
who have significant issues, not necessarily rising to the level
of suicide or, you know, massive self-harm, but that
we really really take this mental health crisis seriously. I

(50:54):
think third, that we work with the companies and regulate them,
the big tech platforms and social media platforms, so that
they're way more responsible, both in terms of how they
shaped the lives of kids and families, but also about
how they've screwed up our elections. I mean, I think
our democracy is at its most fragile state in my lifetime,

(51:16):
actually since the Civil War. And I will tell you,
I think that the media platforms are a big part
of the problem because they've spread misinformation, disinformation. It's where
all the January sixth stuff was formed online, and
people are manipulating these platforms in really really devious ways.
That's true here in the US, but it's really true

(51:37):
globally as well. Okay, see you at the Camp reunion.
That would be so fun. I know, I can't thank
you enough for everything you do with Common Sense Media.
Really, go to, I mean, I want your whole audience
to become members of Common Sense, because guys, this is
your kid's life. But it's also the way to advocate

(51:57):
for children. It's also the way to advocate for democracy.
It's like thirty bucks a year and the truth is
this is the reality our kids are growing up in
and that we're living in. So the more we do,
the better. That's right. Thank you, Jim. Thank you
for listening to Go Ask Ali. As always, please check

(52:17):
out our show notes for other great info and links.
Be sure to subscribe, rate and review Go Ask Ali
and follow me on Instagram at the real Ali Wentworth. Now.
If you'd like to ask me a question or suggest
a guest or a topic to dig into, I'd love
to hear from you, and there's a bunch of ways
you can do it. You can call or text me
at three two three, three six four, six three five six, or you

(52:38):
can email a voice memo right from your phone to
Go Ask Ali podcast at gmail dot com. And if
you leave a question, you just might hear it on
Go Ask Ali. Go Ask Ali is a production of

(52:59):
Shondaland Audio in partnership with iHeartRadio.
For more podcasts from Shondaland Audio, visit the iHeartRadio
app, Apple Podcasts, or wherever you listen to your
favorite shows.