Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
Welcome back to Intentionally Disturbing. This week, I'm having
a chat with Sarah Adams, known online as Mom Uncharted.
Sarah was recently named to the Time 100 Creators list
for twenty twenty-five, and she talks about the dangers
of posting images of and oversharing your children online. As
(00:33):
someone who has worked on very disturbing child sexual abuse cases,
I can tell you that there are very real dangers
to putting your children online. One thing I love about
Sarah is her unwavering eye on this issue. She's dedicated
and has a sole focus on exposing bad behavior by parents
(00:53):
who exploit their children online, and on digital exploitation in general.
I hope you enjoy this conversation and it makes you
think twice about sharing your children online with strangers because
you don't know who is looking. Welcome back to Intentionally Disturbing.
(01:15):
I'm Doctor Leslie and today I get the honor of
speaking with Sarah Adams, mom Uncharted. Now I need to
know what is this name about? Where did it come from?
Speaker 2 (01:26):
Yeah?
Speaker 3 (01:26):
That's fair.
Speaker 2 (01:27):
Honestly, when I started my social media journey, I didn't
really know what I intended to talk about. I knew
these were interesting to me, but I'd never really put
myself out there on social media, and I felt
a little uncharted in what I was doing and where
I was going and where I'd take this platform, if
I was capable enough to build one. So I just
thought that it was kind of all-encompassing, right? I'm
(01:50):
a mom and I don't know what I'm doing.
Speaker 1 (01:53):
You're a mom, and now everyone knows you are a
Canadian mom.
Speaker 3 (01:58):
Yes, I am a Canadian mom.
Speaker 1 (02:00):
I am too, but I have been here long
enough that now I just say "mom," not "mum." It's
turned very American.
Speaker 2 (02:09):
Yeah, when I started, everyone was commenting on my little accent,
but a little note: my mom is actually from Minnesota,
so I spent many summers there. Half my family is American.
I'm kind of fifty-fifty, but born and
raised here.
Speaker 1 (02:30):
The fact that you even questioned how you were going
to do, you know, uncharted, and now you're on
the Time 100 Creators list. I mean, when I
saw that, I was like, I know her! I know!
Speaker 2 (02:43):
That was really surprising, and like, I just feel really honored.
Like, Time is Time. That's legacy media, and to
be acknowledged, especially when you're discussing issues that are regularly suppressed.
I don't get a lot of traction on these platforms.
(03:03):
My following does not grow. I have been stagnant for years,
and so to have Time honor me in this way
and say, hey, what you're doing matters. It really, you know,
lit a little more fire within me because I'm sure
you can relate.
Speaker 3 (03:18):
It can be a.
Speaker 2 (03:19):
Little discouraging sometimes when you're providing informative and educational material
that you want parents to see because you're just trying
to make their lives better in this digital world and
they can't see you and they can't find you, and
the platforms don't want you to be heard.
Speaker 1 (03:39):
Definitely. I mean, I know in America everyone's wondering what's
going to happen with TikTok. Yeah, but the fact that,
you know, Canada does censor a lot of stuff and
people aren't really aware of that. I mean, around the world, people aren't aware, are they?
Speaker 3 (03:52):
No, I don't think so.
Speaker 2 (03:55):
I can't recall the exact bill name at this moment.
It escapes me, but I can't read American news on
Meta platforms. So, like, if I wanted to share an
article from you know, the Washington Post, the New York Times,
probably Time Magazine. I can't do it because I can't
have access to that Instagram post to share on my feed.
(04:17):
So there is definitely censorship up here as well.
Speaker 1 (04:21):
It's amazing. I've published articles, and what I end up
doing is downloading the PDF and I just DM the
PDF to people in Canada.
Speaker 3 (04:29):
Yeah.
Speaker 2 (04:29):
When I want to share, like, a link,
because it won't let me share links from news sites
on Meta platforms, I have to go and, like, TinyURL
it and then share it there. And it's very
frustrating and it's very discouraging.
Speaker 1 (04:44):
I don't fully understand it, but I love
that you are fighting against it
in so many ways. And even if
you aren't growing, your message is clear: yes, get kids the
fuck off the internet, and stop monetizing off children.
Speaker 3 (05:02):
Yeah.
Speaker 2 (05:03):
It's just reached a point where, honestly, I think we're
starting to see a shift. You know, there's been a
lot of documentaries coming out, More of these conversations are happening,
More people are pulling their kids offline.
Speaker 3 (05:14):
You go, families!
Speaker 1 (05:15):
Have to do a plug for you. What was the
documentary Your Distant.
Speaker 2 (05:18):
Oh yeah, I was in Bad Influence: The Dark Side
of Kidfluencing on Netflix, as well as Born to Be Viral,
which is streaming on Hulu and Disney Plus. I have
some cameos and commentary in those, which is great. A
must watch. Yes, definitely a hard watch, but a must watch.
Speaker 1 (05:40):
I know it is. It's not light. I would watch
it maybe at like four o'clock in the afternoon, maybe
not right before bed.
Speaker 2 (05:46):
Well, and I think we're a little, like, you know, well,
especially you, Leslie, we're a little desensitized, right? We're
in this world, so we see and we hear so much.
So then when you're telling other you know, parents and
people like that, you're like, oh and then they start
watching and they're like, oh my gosh, Like I can't
stomach this.
Speaker 3 (06:04):
I can't get through this. It's too much. Like how
do you do it? Right?
Speaker 2 (06:09):
And I think we do it because it's important, you know,
like we want to spread awareness for so many different reasons.
Speaker 1 (06:18):
It is so important. How did you get into it?
How did this become? Because, you know, my accounts
are a little schizophrenic. I'm all over the place.
Like, I went after that Wren account like crazy
and then I got distracted. But you're really good at
staying focused on your mission.
Speaker 2 (06:38):
I've really niched myself down. I generally don't go outside
of that. Are there more things, after four years of
talking about these things, that I would like to talk about? Yes,
I do plan on incorporating a few more conversations, but
they all kind of work into what.
Speaker 3 (06:52):
I'm already talking about.
Speaker 2 (06:54):
Right, I'd like to talk about the materialism and the
consumerism of family vlogging and these influencers and more.
But how I initially got into it was when I
first had a child in late twenty seventeen. We were
all online. I followed, like, some mommy influencers. I didn't
see a problem with it. I was a
new mom, I could learn some tips, tricks, see some
(07:15):
cute outfits and whatnot. Life moves on, I have a
pandemic baby. We all go online more, and the more
we're all there, the more I start to feel uncomfortable
with what everyone is sharing, from friends and family to
these mom influencers. And I just started talking to people
(07:36):
around me and they're like, oh, yeah, you know, it
is weird. It's a little uncomfortable. And then there was
a mom I followed for a long time and her
son had a medical emergency and everything was documented, the
whole thing, and it was like a light bulb went
off, Leslie, and I just thought, I shouldn't know this.
I shouldn't see this. I'm just a mom up in Canada.
(07:59):
Why do I have such intimate details of this child's
medical journey: pictures, information, doctors' names, appointments. I just
felt so uncomfortable, And I think that really lit something
in me that like, other people out there have to
feel this way too. So maybe if I just talk
(08:19):
about it, I'll find a community that agrees. And maybe
I'm not as you know out there as social media
is making me feel.
Speaker 1 (08:28):
One hundred percent, and you have the community. Yeah,
it's so true. I mean, people are interested
in children's lives, but when it takes that turn towards
the parents monetizing off of it and then the parents
oversexualizing the children, people are uncomfortable, and maybe they
(08:52):
won't follow it, but there's no voice saying hey stop it.
Speaker 2 (08:56):
Yes, hey, stop it. Because, like, big tech isn't
saying anything, because they're making money off this, right? It's
a lot of their content, and it's a lot of
content that trends and gets a lot of engagement.
Speaker 3 (09:09):
For all the wrong reasons.
Speaker 2 (09:12):
And when I first started talking, I was mainly coming
from, like, a privacy and consent standpoint. I knew, like,
predators exist on the internet; that's one of the reasons I
wasn't going to share my kids.
Speaker 3 (09:22):
But it wasn't like I didn't.
Speaker 2 (09:24):
Know the magnitude of the problem until I found some
of these other communities. That was truly shocking, you know,
especially the Instagram moms who were exploiting their young girls.
Like, that led me down a path I was not
prepared for. I think that led me to meeting you online,
(09:46):
because, right, that was around the time that
Speaker 1 (09:49):
That New York Times article came out or the Wall
Street Journal did one too, and yep, basically, how like
these moms were selling the used tights of their little
girls to followers. Yeah, I mean, the disgustingness. But
when those articles came out, I think a lot
of eyes opened.
Speaker 2 (10:10):
It was really nice, and I had the pleasure of
talking to those journalists for many months and providing information
on what I had seen and things like that. So
to see it finally get mainstream attention was really rewarding.
Speaker 1 (10:29):
Right.
Speaker 2 (10:29):
It wasn't my article, but I do think that I
was one of the first voices on the topic. I
think there have been some changes made, but you know,
if you start digging, you'll still find a ton of
moms doing this stuff.
Speaker 1 (10:44):
Yeah. Well, I mean, at least we got that Wren
account down. Yeah, that was a huge account.
Speaker 2 (10:53):
I think at the height it had like seventeen
million followers, and like, no toddler should have seventeen million
strangers following them online.
Speaker 3 (11:06):
Period, end of story.
Speaker 2 (11:08):
I think it is crazy that parents are allowing strangers
into the lives of their children, and we also don't
know the long term repercussions of this down the line
for their mental health and their well being. And with
the advancements in AI, I'm sure you think about it
(11:29):
often because I do. Like I think about these young
girls who are being exploited by their parents, and now
there's the technology to turn these images into worse things.
So what happens when they get online when they're thirteen and some
stranger online has a whole bunch of edited and
deepfaked images and videos of them, and then threatens them,
(11:50):
like sextorts them? Oh yes, I think that parents need
to think really long and hard and long term. Too
many parents are posting things in an instant versus
thinking about long-term repercussions.
Speaker 1 (12:06):
Yeah, the Wren account. So the mom, Jacquelyn,
was posting her gorgeous daughter, a beautiful child, from, I
want to say, birth; it was young. It
went on for years. I'm not sure how old
she was when it concluded. But every post was the
(12:28):
little girl with something that was a phallic image, and
it was the little girl alone.
Speaker 3 (12:34):
It was usually her alone.
Speaker 1 (12:35):
Yeah. Yeah. And so images were like her eating a corn dog,
or some of the videos were pretending to put
a tampon up to her vagina.
Speaker 3 (12:47):
Yeah, that one really bothered me.
Speaker 1 (12:49):
Yeah, showing her underwear, images like those taste tests
where you're testing drinks and you're sucking on straws. So
everything related to male oral sex, and sex in general,
even intercourse, was in these posts. And those
were the ones that were getting millions and millions of views.
(13:09):
They were saved millions of times. And when you go
look at the comments and then you look at the accounts,
they were of men who only follow little girls.
Speaker 3 (13:19):
Yeah, there's a lot of accounts like that online.
Speaker 1 (13:22):
Yeah, so I say, if anyone's interested, there are articles
about this, or go to other people's platforms to have
them explain it. Don't go directly to the platforms we
talk about on the show.
Speaker 3 (13:35):
Yeah.
Speaker 2 (13:35):
Yeah. And I think that platform is privated. I don't
think there's any content. But here's the thing:
nothing is ever truly gone from the internet.
Speaker 3 (13:44):
Even though the.
Speaker 2 (13:45):
Mother I don't think deleted it. I think privated. They
count whatever, all of those videos and all those commentary videos,
and they're all still circulating. There were people stealing those videos,
making Facebook pages like. It's still out there, right, and
nothing is truly gone from the Internet. So as much
(14:07):
as the original videos are gone, they're all still everywhere,
which is really hard, heartbreaking.
Speaker 1 (14:14):
It is heartbreaking. Yeah. I think CPS was involved with
that case. But in so many cases, it is interesting
that, what do we call them, internet sleuths? I don't know
what we call them, but people are actually calling the
police on the parents of these accounts.
Speaker 2 (14:29):
There are so many like YouTube family vlogger people that
like have clickbait videos like CPS was called on us.
There are many examples of TikTok families CPS being called
on them. Also, I will say some for like very
I think not appropriate reasons in the sense some people
(14:51):
follow moms online and just hate them. Right? They're not
necessarily as bad as other people who are doing some really
harsh pranks and really exploitative, you know, child
Speaker 3 (15:04):
Abusive type behavior.
Speaker 2 (15:05):
But there are people who put their children online in
quote unquote innocent ways and people still hate them and
still call CPS on them, right, And I think that's
something that you need to be mindful of as well. Like,
when you put your child and your family online, a
lot of things can happen.
Speaker 1 (15:25):
Yes, it is so vulnerable. You are so vulnerable.
Speaker 2 (15:30):
Yes. And they're already so vulnerable, right? They're just kids
and they didn't ask for this. And parents are
curating digital footprints for their kids that are going to
follow them for the rest of their life.
Speaker 3 (15:41):
Like as you said.
Speaker 2 (15:43):
In regard to Wren, all those articles, all those videos,
all that commentary that will forever be part of her
digital footprint, and that's I think going to be really.
Speaker 3 (15:54):
Hard for a young girl to rectify.
Speaker 1 (15:58):
Yes, because you have spent so much time in this area,
I'm really curious what stands out to you as one
of the most disturbing accounts or moments that you've had
in this realm.
Speaker 3 (16:14):
Oh my gosh.
Speaker 2 (16:14):
It's so difficult to pick because there's been like a
lot of what I would deem abusive material shared by
parents online. But I think in regard to the little
girls being exploited, like I've had to make police reports before,
(16:35):
there were accounts that were exploiting their young girls on
the Meta platforms, offering, like, subscriptions. And then what happens
is the followers of these accounts, these people start having
relationships with the parent who is exploiting the child, right,
They talk about it in DMs and stuff like that,
and then they ask, hey, let's go to an encrypted
(16:56):
site like Telegram, right? And then on Telegram they start
pushing for more content. And I was sent an email
with very inappropriate content of one of these girls and
I had to call local authorities. Actually, for separate girls,
Speaker 3 (17:15):
I think I made three.
Speaker 2 (17:16):
Police reports in regard to different cases, because they will
go on encrypted sites and further ask for more pictures
and sometimes the parents oblige. And I would say, that's
the hardest, the hardest stuff I've seen.
Speaker 1 (17:32):
Now, I don't want to be, I don't want,
this isn't for clickbait, but I want to explain to
the audience what these pictures are like. In legal cases, I'm
constantly reviewing evidence. So I'm looking at these children and
(17:52):
I'm just going to be really graphic. Yep, so trigger warning,
but I'm looking at little girls who have their vaginas
ripped all the way to their anuses. Their bodies are
torn apart because of the size of the male penis,
but also because objects have been used on them, and
I'm often looking at pictures where they're not alive. Oh Leslie,
(18:15):
So this is, I don't want to make you say
it all, but this is the same stuff that you're seeing.
Speaker 3 (18:21):
Well I am.
Speaker 2 (18:24):
I am hoping to stop that before it gets to
that point. I have not seen that type of imagery,
fortunately for me. But the images I have seen are
generally young girls in very limited clothing: bralettes, underwear, bathing suits,
(18:45):
leotards, dance wear, posed exceptionally provocatively. One of the reports
I made was of a little girl in, like, kind
of like a Rainbow Brite costume on a bed in
extremely provocative poses with lollipops. Like, someone was paying for
(19:05):
these images, right, it was like a sexualized photoshoot of
an eleven.
Speaker 3 (19:11):
Year old girl.
Speaker 2 (19:14):
And you just think, if a parent's willing to do that,
what else are they willing to do?
Speaker 1 (19:20):
Right?
Speaker 2 (19:20):
And I do know there is a situation in Texas
where a mom and a photographer.
Speaker 3 (19:26):
were charged with taking these types of photos, and
Speaker 2 (19:32):
She's in jail and she was one of these like
moms in this mom circuit.
Speaker 1 (19:38):
Good, right? Yeah, yeah, it's horrific. And I think
where you and I link together is that if it's
on the internet, people use it. Men are gratifying themselves with
these images, and they lose stimulation when it's just an image,
(19:59):
and then they want a real child. Yeah, it's
directly connected.
Speaker 2 (20:06):
Yeah, it's horrific to think about. It's just horrific to
think about. I have been sent screenshots of chats on
encrypted sites where they talk about kidfluencers and these young
girls in graphic detail and what they would do to them,
also with, like, AI-generated imagery attached.
(20:32):
Someone sent me fan fiction, which is a really big
thing in the world, fan fiction, and often it doesn't
take this turn, but some of this fan fiction was
written about kid influencers from family vloggers, and it got
extremely graphic, and these are all different things that are
happening right now. They take these images from social media
(20:57):
and they take them to other places, and they often
alter them, and if they don't alter them, they just
talk about them in really graphic detail, about what they
would do if they had their hands on this child.
And it's so dark and it's so warped, and like,
in comparison to you, I'm kind of like on the
surface level, right, So with your knowledge and expertise, like,
(21:22):
it gets worse. Parents, it gets a lot worse.
Speaker 1 (21:29):
And if it's not happening to your kid, you may
be encouraging a man to do it to another kid.
And that's where I think we have to reflect on
that narcissism of, you know, I have the ability to
keep my child safe, other people may not. And by
putting it out there, you're also encouraging this community. We're
(21:51):
going to take a quick break and we'll be right back.
It lends well into the post you put up yesterday
on the manosphere.
Speaker 2 (21:59):
Oh yes, the manosphere, which I'm continuing to learn about.
I think a lot of people think that we know
everything in all the areas online.
Speaker 3 (22:11):
You can't possibly know, it's too vast, but.
Speaker 2 (22:15):
The manisphere where they are promoting very misogynistic ideals, and
sometimes they put it as like dating or fitness, but
the underlying message of this are like women are insubordinate
(22:35):
to men and great gender inequality. And a lot of
people are listening to this stuff, and a lot of
young boys are listening to this stuff, and it often
isn't as noticeable to parents because sometimes they come off
as like live streamers or prank channels or podcasts and
(22:58):
things like that. But it's something for parents to be
mindful of because there's a lot of messaging in the
manosphere that you don't want your children to adhere to
and agree with and move forward in life thinking like
this is normal.
Speaker 1 (23:17):
I found the television show Adolescence. Yes I found it incredible,
and I know it was fiction, right, but around the
exact same time a case played out in the UK
exactly the same way in real life. So to explain
Adolescence is this: it is these young boys who are
(23:41):
vulnerable and they are looking for
role models, and they are very enticed by these misogynistic
role models like the Tate brothers who are saying, you know,
women are worthless or they are lower than us. We
need to use sex, power, money to control women, and
(24:02):
we need to basically brutalize them. But it's getting so
convoluted in an adolescent's head that the show details basically
them killing a little girl.
Speaker 3 (24:16):
Yep.
Speaker 2 (24:16):
And again, to your point, like, it's fiction. But is
this seeping into young boys in society? Yes, it definitely is.
That miniseries took what is happening
and magnified it and turned it into a really eye-opening
miniseries about something a lot of parents weren't familiar with.
Speaker 3 (24:40):
I think this really opened a lot of eyes.
Speaker 2 (24:44):
It was the first time kind of, you know, a
mainstream show touched on the subject. And I think the
young man, the young boy, because I think he's twelve
or thirteen, I think he was recently nominated for an
Emmy because his acting was just outstanding.
Speaker 1 (24:59):
The moment he acted out his anger and aggression towards
the psychologist, Yes, it gave me chills because he wanted
so much validation from her that she did not give
and immediately he turned to violence and anger and just
unrelenting emotions. Yeah, And it shows you how much we
(25:23):
need to offer adolescents, how much we need to parent
them and be there and protect them from all this
fucking shit on the internet and these people with these
platforms that are not good people. Yeah.
Speaker 2 (25:37):
And I think one of the lines that really sat
with a lot of parents is when the parents said,
you know, I thought he was safe. He was at home,
he was in his room, like I thought he was safe, right,
But they were unaware of what he was consuming. And
I think it's a note to parents that we need
(25:58):
to be involved. We need to be having these conversations.
We need to be asking questions, you know, oh, have
you heard of this live streamer? Or did you see
this situation? Or what podcasts are you listening to lately?
I think a lot of parents, you know, we are
on our screens a lot, and I think sometimes we're
getting kids screens so we can do our emails and
(26:22):
do the things we need to do. But we need
to make sure that we understand what they're doing on
their screens and what they're consuming on their screens, because
just because they're at home doesn't mean they're safe anymore.
Speaker 1 (26:36):
I can't even, oh my gosh, I think it happened
yesterday. My daughter is eight, and she is way too
smart for an eight-year-old. And I keep setting
all these securities on her iPad, and I let her
have her iPad for just a little bit of time,
like maybe when we're out at a fancy dinner or something like that.
On her iPad, she again downloaded YouTube Kids, so she's
(27:00):
like working around these regulations. I'm constantly stopping and editing it.
It is set to four years or younger, for four-year-olds.
I walk in and we're like getting ready for dinner.
We're in the bathroom, and I can hear this voice
of, "Daddy, do you like this outfit?" And I thought,
what the fuck is on your iPad right now? So
(27:22):
I go over. It's set for four year olds and
the title of the show is "Little Girl Trying On
Outfits for Her Daddy."
Speaker 3 (27:31):
YouTube Kids. Because it's labeled YouTube.
Speaker 2 (27:34):
Kids, parents think it is safe, and as you just articulated,
it is not safe.
Speaker 3 (27:40):
I hear story after story.
Speaker 2 (27:41):
From parents saying that they thought their kid was watching
Bluey, and the next thing they know, there's like cartoons
about like undressing or things like that. And another thing
parents have to worry about is the scrolling, these Shorts,
these attention spans, right? Kids aren't sitting and watching a full
(28:03):
seven-minute YouTube video anymore. Their attention spans are getting so
short that they're scrolling and scrolling, and we're not helping them.
We're not helping them and their brain develop when we're
allowing that type of content. So I tell every parent, like,
no YouTube Kids, shut it down. And if your child
wants to watch YouTube, put it on the big TV
(28:25):
and sit there with them, right, sit there with them
and treat it like a TV show. Because I don't
think all YouTube is bad. There's some great educational channels
on there, about adults teaching science, and National Geographic, and
things like that, but I think there's something to be
said for kids keeping most of their media on a
(28:46):
big screen versus iPads. I'm not anti-iPad, my kids
have them, but you just got to be really mindful
of how you use them.
Speaker 3 (28:55):
Because I don't know about.
Speaker 2 (28:56):
Your daughter, but my children, they like, I can see
it in their eyes, you know. I'm like, okay, five
minutes done, and they're pretty good with giving it up,
but then they want it. They want it, They're talking
about it: "When can I have it again? What am I
going to do?" I'm putting them to bed and they're like, well,
tomorrow I'm going to play this, and I'm like, oh, no,
(29:16):
you're not. Right?
Speaker 3 (29:19):
So not all kids are like that, but mine are.
Speaker 1 (29:22):
It changes. I notice, like, when we have weeks without
iPad versus a little iPad, their behavior and their defiance
completely changes. And I tie it directly to
that kind of impulsivity and the speed.
Speaker 2 (29:37):
I just completely agree like my children, if they're having
too much iPad time, their emotional regulation is not present anymore,
and we.
Speaker 3 (29:47):
Go weeks and weeks without it.
Speaker 2 (29:49):
But you know, currently it's the summer and like mommy
needs a little help sometimes, so.
Speaker 3 (29:54):
Yeah, get on your Duolingo. They've been doing Duolingo.
Speaker 1 (29:59):
What are your recommendations for age appropriateness for the internet
just everything we've talked about, How do you gauge what
to do?
Speaker 3 (30:09):
You know what?
Speaker 2 (30:10):
I don't think there is necessarily like a blanket approach,
because all kids are different and all families are different.
I am big on delaying smartphones and delaying social media.
Speaker 3 (30:20):
I think delay.
Speaker 2 (30:22):
Smartphone as long as you can. I think there's some
great like smart watches out there. I think delaying social
media is really big for myself and you with children,
you know, under the age of eight. I think we
can all rally together to normalize not having social media
(30:44):
until you're fifteen or sixteen.
Speaker 3 (30:45):
I don't think that is some.
Speaker 2 (30:47):
Crazy expectation, right, you know, you start to drive a car,
you start to you know, get social media. But that
doesn't mean that we can neglect teaching them digital
Speaker 3 (30:57):
Literacy and media literacy.
Speaker 2 (30:59):
Like we need to see these things in school, especially
with the rise in AI and misinformation. They need to
learn how to see what is real and what is
not, right? I don't know if you saw
this trending on TikTok, the bunnies jumping on the trampoline
the other day. No, okay, so there was a viral
(31:20):
video of bunnies jumping on a trampoline and a lot
of people thought it was real, Like a lot of
adults thought it was real and it wasn't right. Like,
if we are falling for this and we aren't looking
at videos anymore and really taking.
Speaker 3 (31:38):
Note, think about what our kids.
Speaker 2 (31:39):
They don't have a chance, right, So we need to
get digital and media literacy in schools. And I do
think we need to be mindful of the amount of
time we are letting our children on devices, because I
do think it is affecting their attention span.
Speaker 3 (31:57):
And there are different ways to play games.
Speaker 2 (32:00):
I grew up playing like Nintendo sixty four, but we
were in one person's living room playing together on a
big screen. Now, the kids in school, they go
home to their own little desks and they get online
and play together. Right So, I think it's just being
a lot more mindful and saying, like, you know what,
(32:23):
this isn't age appropriate right now. And I am
a parent, and I have house rules, and I've done
a lot of research on brain development, and it's just
not good for your brain right now, and therefore we
can have this conversation later down the line.
Speaker 3 (32:36):
Hold your boundaries, you're the parent.
Speaker 1 (32:39):
I think that's amazing advice. Hold your boundaries, because God,
they are going to push back, and especially related to
what their friends are allowed to do.
Speaker 3 (32:49):
One hundred percent.
Speaker 2 (32:50):
You know, there's a game my child has been asking
for for like three years, and I just hold my
boundaries and I just say, you know, you're too young
right now for all these different reasons. But if you're interested,
we can read the books, we can get the t shirts,
we can get the toys, and so when you're ready
for that game, you'll already know so much about it
(33:11):
and you'll be ahead of the curve.
Speaker 3 (33:13):
But right now you're not ready. And that approach has
worked out really well.
Speaker 2 (33:18):
It's like when people, I get a lot of questions
from parents whose kids want to start YouTube channels. That's
probably one of the biggest questions I have. You know,
my eight year old wants to start a YouTube, and
I just say, you know, you don't have to crush
their dreams and say no forever. But you say, hey,
it's not appropriate right now. But if you're interested in videography, photography,
(33:39):
story writing, acting, we can do aftercare camps, we can
put you in programs, we can do an online course,
so when you're ready for that, you'll be ahead of
the curve. So there's other ways to approach things without
just saying no. You have to give them more reasons
so they better understand why it's a no.
Speaker 1 (34:01):
I completely agree. My daughter loves art, and she loves
the videos where they show you there's no real person
in it, it's just a hand, but they show you
how to draw step by step, and she wants to
make videos like that to teach other kids how to draw,
and to her she doesn't understand the difference between having
(34:23):
a YouTube channel and me sending this video of her
drawing to her best friend's mom. Yeah, and she is
just as happy with just one kid, her bestie seeing
her doing a drawing video.
Speaker 3 (34:36):
That's another. That's another thing.
Speaker 2 (34:38):
Do it privately, have it private, and send it to
grandma and grandpa and a small group of people. And
you know, that's how I always suggest parents share online.
Speaker 3 (34:49):
Go private?
Speaker 2 (34:50):
Right, you know, there's no need to be sharing all
of these photos and videos of your family publicly with
you know, billions of people online.
Speaker 3 (35:00):
What we know about the internet.
Speaker 1 (35:01):
Right, yeah, exactly, And it's time for a break. I
have some lightning round questions for you.
Speaker 3 (35:12):
Oh dear gosh, Okay, get ready. Okay.
Speaker 1 (35:16):
These are my favorite because everyone answers them so differently.
Speaker 3 (35:19):
Oh my gosh, I'm nervous.
Speaker 1 (35:21):
Okay, okay, if you could commit a crime and get
away with it? What would you do.
Speaker 2 (35:29):
If I could commit a crime and get
away with it?
Speaker 3 (35:32):
Money laundering? It's expensive, life's expensive these days. I need money.
Speaker 1 (35:42):
I can help you. Okay, I'll launder too.
Speaker 3 (35:45):
Hey, a little white collar crime, no big
Speaker 1 (35:47):
Deal exactly, and then white collar prisons are much nicer
as well.
Speaker 3 (35:52):
I figured, right, little Martha Stewart out there.
Speaker 1 (35:55):
Yeah, it's like a camp.
Speaker 3 (35:57):
I like a camp.
Speaker 1 (35:58):
Yeah, exactly. Okay, if you had to die by the death penalty,
how would you want to go? Now, our options in
America are more brutal than yours.
Speaker 3 (36:11):
But we don't have those. You guys still have firing squad?
Speaker 1 (36:18):
Oh, we just restarted it in some states. That's why
I was really hoping Bryan Kohberger would get the firing squad.
Speaker 2 (36:27):
Wow. I know of the injection; I thought that
Speaker 3 (36:32):
Was the only one.
Speaker 1 (36:34):
Electrocution.
Speaker 3 (36:36):
Electrocution. We don't have those options in the Great White North.
Speaker 1 (36:42):
One more reason to come from Canada to America right now.
Speaker 2 (36:48):
Honestly, I think our life in prison, I think our
life in prison is actually twenty-five years. Oh my gosh,
I will have to, don't quote me on that, but
maybe something to look into. Twenty-five years, unless
you're, like, a dangerous offender, and then you, like, stay in.
But I think that is technically still what life is here.
That doesn't mean we have a bunch of like people
(37:08):
doing twenty-five years and then released. If you're categorized as
a dangerous offender, you stay in. But okay, so I
guess I'm not doing a firing squad or electrocution.
Speaker 3 (37:19):
I guess inject me. That's my option.
Speaker 1 (37:24):
Yes. Yeah, for the most part, it's
been successful.
Speaker 2 (37:29):
Well yeah, and I think you know, like we have
medically assisted death up here, and so that's,
I wouldn't say the same, but like an injection.
Speaker 1 (37:41):
So yeah, you can get a little MAID done.
Speaker 3 (37:44):
Yeah, there we go.
Speaker 2 (37:46):
That's the one. I can say, I've been on a
lot of podcasts and I've never had a lightning round
like this, Leslie.
Speaker 1 (37:57):
We have to bring humor to the darkness.
Speaker 3 (38:01):
I know, right, it's nice to end on a light note.
Speaker 1 (38:06):
Okay, If you could change a law, what would it
be and why?
Speaker 3 (38:12):
Oh my gosh, If I.
Speaker 2 (38:13):
Could change a law, what would it be and why.
Speaker 3 (38:18):
I think I would?
Speaker 2 (38:19):
I think I would have child sex offenders stay in
prison for longer terms. I think it's pretty wild how
many get released early. And I think their reoffending rate
is also really high.
Speaker 3 (38:33):
You can let me know if that's true.
Speaker 2 (38:36):
But I would have people who hurt children stay
the length of their sentence, or extend the sentence. Yeah,
have harder punishments for them, for sure.
Speaker 1 (38:48):
It's a weird thing in America because it's
really different in every state. But in California, if you're
a violent sex offender, you can be
categorized as a patient, so not an inmate, and
then you're civilly committed to a hospital. Okay, and it's
usually the rest of their entire lives.
Speaker 2 (39:09):
Okay, Okay, well that's good, yes, but it's not every state.
Speaker 1 (39:14):
Yeah, staying there. It's called Coalinga State Hospital in California.
Anyone can look that up, and it is horrifying. It's
a pretty high bar to get there. But it is full.
Speaker 3 (39:24):
Yeah, it is. Sadly, it is full. But
I think, yeah, I think so.
Speaker 2 (39:30):
If you gave me more time to think on it,
I'd probably think of something else.
Speaker 1 (39:37):
We could change a law that either California becomes
Canadian or Canada becomes the fifty-first state.
Speaker 2 (39:44):
Absolutely not, girl. No, no. But California can come up here.
Speaker 3 (39:50):
We'll take you.
Speaker 1 (39:53):
Yeah, I would choose that one.
Speaker 3 (39:55):
Yeah, where are you originally from?
Speaker 1 (39:57):
It?
Speaker 3 (39:57):
Okay? And our final.
Speaker 1 (40:00):
Oh Toronto.
Speaker 3 (40:01):
Oh, okay, I was also an Ontario girl. Yeah.
Speaker 1 (40:06):
My middle name is Victoria because that's where I
would go do my tennis camps.
Speaker 3 (40:12):
Oh, very cool. Victoria's a beautiful city. Yeah.
Speaker 1 (40:15):
I love Vancouver and Victoria is one of the most
beautiful places.
Speaker 3 (40:20):
Honestly, truly, it's the best. I love the West Coast.
Speaker 1 (40:25):
Okay, final question. Okay, hit me. So the last
question we have is: can you tell me a secret?
Speaker 2 (40:34):
A secret is that I probably have, like,
one of the worst sweet tooths in the world. I
have no chill when it comes to ice cream and
doughnuts specifically, and I can easily down a pint of
ice cream sitting on my couch watching a terrible Netflix
reality show easy.
Speaker 1 (40:54):
I love it.
Speaker 2 (40:55):
Don't even feel and I don't even feel shame after it.
I feel impressed with myself.
Speaker 1 (41:02):
Okay, so we're just gonna hope for no diabetes.
Speaker 2 (41:04):
Yes, I know, I know, Leslie, I know, it's only
like, it's not every day, that'd be crazy.
Speaker 1 (41:13):
Well, I want to thank you for coming on Intentionally
Disturbing. And, you know, where can we send people to
follow you?
Speaker 3 (41:22):
Yeah?
Speaker 2 (41:22):
So I'm on all socials, mainly YouTube, TikTok, and Instagram,
under mom dot uncharted, so find and follow me there.
As mentioned, I was recently on two documentaries, one Bad
Influence on Netflix, and the other is Born to Be
Viral on Hulu and Disney Plus. And in the fall,
I plan on launching my own podcast because I need
(41:46):
a different outlet in which to have more conversations. So
let's try something new, and we're going to talk about
more things, parents, uncharted, and all the things that we
don't really have a guidebook for as we navigate these
like kind of terrifying times.
Speaker 1 (42:02):
Perfect I will happily be a guest.
Speaker 3 (42:04):
Oh, you will definitely, definitely be a guest.
Speaker 1 (42:07):
Right.
Speaker 3 (42:07):
And there's a lot to talk about.
Speaker 1 (42:09):
Yeah, we covered a lot today. I'm really grateful and
I'm really happy people will listen to this and learn.
Speaker 3 (42:14):
I hope so too. Thanks so much for having me.
Speaker 1 (42:18):
I'm sorry it's time for commercial. Thank you for listening
to another episode of Intentionally Disturbing with Sarah Adams,
Mom Uncharted. And as I always say, get your kids
off the fucking Internet. Intentionally Disturbing is a podcast from
me, Doctor Leslie. It's distributed by iHeartMedia. Liam Billiam
(42:45):
is the senior producer and he also edits the show
and puts up with my shit. Katie Cobbs does the
social media and she attempts to keep me in my lane,
not always successfully. The executive producers are Paul Anderson and
Scott McCarthy for Workhouse Media, who have told me not
(43:07):
to text them twenty-four seven. But you know what,
I'm still the boss. Thanks again for listening. We'll see
you next week for more Intentionally Disturbing.