
December 31, 2024 • 52 mins


Can sharing your child's life online do more harm than good? In this thought-provoking episode, we tackle the world of momfluencers and child influencers, diving into the concept of "sharenting" and its ripple effects on children's lives. Sharing family moments on social media may seem harmless, but we explore real-life cases where it has led to bullying and embarrassment. Our conversation tracks the shift from early social media platforms like Facebook and YouTube to the newer realms of Instagram and TikTok, shedding light on the growing phenomenon of parental oversharing and the urgent need for more mindful practices.

As we navigate this digital labyrinth, we raise alarm bells about child privacy and the looming threat of online exploitation. Parents often face a tricky balancing act between sharing and safeguarding, yet the stakes involve more than just social media fame. By sharing stories of families grappling with the consequences of exposing their children's lives online, we highlight the vulnerabilities these children face, from potential exploitation to the ethical quagmire their parents tread. The discussion doesn't shy away from examining the uncomfortable responsibilities parents shoulder in this brave new digital world.

The narrative takes a poignant turn as we confront the unsettling trend of young influencers posting provocative content. We scrutinize the moral implications and the roles of both parents and tech companies in perpetuating this cycle. Frustrations bubble over the lack of actionable measures against inappropriate content, but there's a glimmer of hope as we spotlight recent legislative efforts to protect these young talents and their earnings. Through it all, we issue a clarion call for awareness and proactive measures, urging listeners to rethink how we navigate the digital landscape with our youth at the forefront.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Hi and welcome to the Toxic Cooking Show, where we
break down toxic people into their simplest ingredients.
I'm your host for this week, Lindsay McLean, and with me is
my fantastic co-host.

Speaker 2 (00:26):
Christopher Patchett, LCSW.

Speaker 1 (00:29):
And our EP.

Speaker 2 (00:32):
Little Miss Molly.

Speaker 1 (00:34):
She's snoozing, hopefully.

Speaker 2 (00:38):
After being dragged away from the bed.

Speaker 1 (00:40):
It's hard.
It's hard when you're a little dog.
So have you noticed we always start our episodes like that?
I've been trying not to, but I couldn't think of a better way
for this one.

Speaker 2 (00:53):
I did the same exact thing for last episode.
Yeah, I was like so.

Speaker 1 (01:03):
I tried to think of something else.
I was like, no, it's another episode, I'll do something different.
So we've talked about the history of the mamasphere and
momfluencers (you learned about them) and family vloggers.
I don't know if I fully explained that.
Family vloggers are kind of included in that; they're just not
necessarily women, but they're parents who are showing

(01:26):
off their family. And to be a family vlogger, there are
two-legged children involved.
I don't think I've ever seen anyone call themselves a family
vlogger and be like, hi, it's me and my six dogs.
You could, theoretically; there's nothing stopping you.
I think you're just maybe going for the wrong crowd there.
So it's the two-legged variety.

(01:54):
Then we talked about the super whack jobs, like the real crazy
of the crazy, the ones who have been arrested for whatever
nefarious deeds they were doing.
Like, you know, we went...
We went all the way there.
But now I want to kind of back up a little bit, not too much,
just a little, and talk to you about some of the more normal
things and normal people that you're going to see within this

(02:15):
sphere and just how terrible they also are.
These are the ones who are momfluencers, and they're managing
their child influencers.
I think that one's still child influencer, but it's momfluencer.
Fucking English language.
And most of these people that I'll talk about here have a
decent following.

(02:36):
Certainly they've ended up on my personal radar, which is why
I decided to do this, because I saw one too many posts where I
was like, oh no. So they're big enough that, like, they know
what they're doing, they know the risks.
And what are some of those risks?
I hear you asking yourself, in your melodious voice over there.

(02:56):
We can start off with, like, just plain old oversharing, which
apparently has a name at this point: it's called sharenting.
It's when parents share everything, okay?

Speaker 2 (03:08):
So I know from past episodes we kind of talked
about this where, like, especially with little kids,
oh, Bobby pooped in the bathtub today, and it's like maybe let's
not put that out there.

Speaker 1 (03:25):
Yeah, we talked about that in terms of, like, an almost
hypothetical.
It's like, this is being put out there.
This is going to affect kids.
It already is.
You and I just had this conversation about feeling old,
because there were kids born in, like, the mid-2000s who are like,
I'm a teenager now.
There are people whose parents are posting this stuff
and it has caught up to them in school.

(03:46):
I mean, we're talking about, you know, imagine knowing the date
of your first period because your mom posted it on Facebook
for the world to see, and you are now an adult, you are in your
20s, and this information is out there because mommy dearest
felt the need to share that and now everybody knows, including

(04:10):
you.

Speaker 2 (04:11):
I'm so happy that my mom never posted about my first
period.

Speaker 1 (04:15):
That was very nice of her.
I'm glad she didn't either.
This same person from the article also spoke about how she
had a staph infection, and apparently mom posted about that
too, and this got back to her high school classmates, who,
of course, made fun of her for it.

(04:36):
Like this is already happening.
The parents who overshare, their kids are not small babies.
Their kids are in high school, their kids are in college,
their kids are legal adults.
And this is happening already, that we are oversharing to the
point that, like, people are being bullied, people are facing

(04:57):
that. And this was years ago, because she's 25 now, so
you imagine, even 10 years ago we were at the point where, like,
parents were oversharing and that was getting spread around
and, you know, your high school teacher knows that you
had a staph infection. Okay, I'm trying to think back, kind of

(05:19):
doing the timeline of Instagram.
So Instagram, well, I guess these would be posted on YouTube
more, these would be on Facebook for some of this. So it's
not even like these are parents who are, like, super big
influencers, but this was before Instagram and TikTok.

(05:39):
This would have been like the Facebook, kind of YouTube,
blog era, when people were sharing.
I know this specific person spoke about the fact that their
mother did get, like, concert tickets and some other stuff, and
she wasn't just posting for funsies.
This was a bit of a thing that she did to get stuff, that she
just shared everything about their lives, including this type

(06:03):
of, like, very, very personal information. And, again, based
off of the age, I would probably say that this was like Facebook
to YouTube.

Speaker 2 (06:15):
Who knows, it could have been. Because,
yeah, if she's 25, then this would have been like
maybe 2010 or so.
Yeah.

Speaker 1 (06:23):
Yeah, yeah, and that would have been.

Speaker 2 (06:25):
That was before Instagram. Yeah, Instagram was,
yeah, I think it started in 2011, and I associate it with...

Speaker 1 (06:35):
Like 2011, 2012, '13, '14, that was like the big era of posting
photos of things, but people weren't influencers yet on there.
It was still like when Facebook used to be where it's just like
we're all friends, but Facebook had already started turning to,
like, sharing stuff, just post it all.

(06:56):
So, yeah, that's happening, that kids are getting
blackmailed, they're having racy photos of themselves sent
to their schools, either photos that they took or photos that we
may or may not get into in this episode, photos their parents
may have taken of them in tight or revealing outfits, and these

(07:20):
have been stolen or acquired and they're being sent to the
school.
I mean, like, there's a photo of this kid, here you go, and like
then the kid gets in trouble. Because why is this photo of you,
like, making the rounds on the internet? Like it's the child's
fault?

Speaker 2 (07:35):
Yeah, that's...
I'm so happy that this wasn't a thing when I was growing up.
I know.

Speaker 1 (07:43):
I mean, my parents wouldn't have.
You know, my parents, they understand that boundary, but
I'm just really glad that it's not.
All of my friends, the ones who have kids, are very careful
about what goes online about their children, and I appreciate
that.
I appreciate that I'm not getting poop stories from them

(08:06):
and I appreciate that they're taking care of their child's
privacy. Like, we just love this for your kid.
I want this for your kid and for your family.
Yes, yes, yes.
So you might ask, like, why do people keep doing this?
Like, what are they getting out of it?

Speaker 2 (08:24):
Money.

Speaker 1 (08:29):
Bingo.
I know that was a hard question, but you get a ton of money,
and the way you get all this money is the almighty,
all-seeing algorithm, and that's problem number two.
So, to clarify what exactly an algorithm is, because we just
kind of talk about, oh, the algorithm, the algorithm.
What is it, though?

Speaker 2 (08:50):
Because I think some people don't actually know.
So, like, I think, having done this podcast and everything like
that, yeah, trying to understand what the algorithm is.
It's funny, because it seems like the more you learn about
the algorithm, the less you know about the algorithm. Yeah,

(09:13):
because then you know technically how it works, but
still, like, what?

Speaker 1 (09:18):
So, in broadest terms, this is the set of
instructions for what content you get to see.
So each site has its own algorithm.
Facebook has one, Instagram has one.
Yes, they are owned by the same company.
They may still have different ones, we don't know.
They don't publish all of this information, which is part of

(09:39):
the problem; they don't want it out there.
Their algorithm is their own.
This is why TikTok got insanely popular, because the TikTok
algorithm was very good at picking up what people might
want to see.
So, in general, the algorithm is looking at user history,
location, your profile, the popularity of the content, your

(10:01):
current activity, your friends' activity.
So, like, all of this kind of goes in together and gets turned
into a little equation and it comes out the end.
It's like, you want to see this cat video, we think, because,
you know, your friend watched it or you've watched it in the past.
And it's also looking at what people are interacting with, and

(10:24):
this is very key.
Interacting means you watch it, comment, send it to a friend,
you save it.
All of that counts as interacting.
You can try this.
You can go on Instagram right now, go save a couple of cooking
videos. Even though it's private and, like, I won't ever
see that you have saved those,
now you're going to see more cooking videos, and as you watch

(10:48):
more, I am more likely to see cooking videos, because you are
interacting with them and you and I are friends, and so the
algorithm is going to pick up on that and I may start to get a
little bit of that. Now,
obviously it's very nebulous as to how much, but that does
actually have an effect: the more you watch of it, and
because we are friends on there,

(11:08):
the more it shows, the more you interact with it.
And people forget this, that when you're commenting on videos,
that video is still playing in the background, racking up
views.
That tells Instagram that this is an exciting video.
Instagram, Facebook, whatever, they're all the same in this
case. That tells whichever site it is that this is a really
great video and that more people want to see it, even if you're

(11:31):
there in the comments picking a fight.
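[Editor's note: to make that ranking idea concrete, here is a minimal, purely illustrative sketch of an engagement-weighted feed. The real Instagram, TikTok, and Facebook systems are proprietary and far more complex; every field name and weight below is an assumption for illustration, not their actual algorithm.]

```python
# Purely illustrative sketch of engagement-weighted feed ranking.
# Real platform algorithms are proprietary; all fields and weights here are hypothetical.

def engagement_score(post: dict, viewer: dict) -> float:
    """Toy score combining the signals described above: overall popularity,
    the viewer's own watch history, and their friends' activity."""
    # Every interaction counts toward the score: views, comments, shares, saves.
    score = (post["views"]
             + 5.0 * post["comments"]
             + 3.0 * post["shares"]
             + 4.0 * post["saves"])
    # Content similar to what the viewer already consumes gets boosted.
    if post["topic"] in viewer["recent_topics"]:
        score *= 1.5
    # Content that the viewer's friends interacted with gets boosted too.
    if any(friend in post["interacted_users"] for friend in viewer["friends"]):
        score *= 1.3
    return score

def rank_feed(posts: list[dict], viewer: dict) -> list[dict]:
    """Highest-scoring posts come first; that is what lands in the feed."""
    return sorted(posts, key=lambda p: engagement_score(p, viewer), reverse=True)
```

In a sketch like this, a hostile comment raises the score exactly as much as a friendly one, which is the "any interaction is good interaction" dynamic the hosts describe next.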

Speaker 2 (11:33):
All publicity is good publicity. And this is the thing,
like, and I think I've talked about this in a
previous episode where, you know, somebody had said something,
like, outrageously harsh about, you know,
the elections and everything like that, and I

(11:55):
wanted to comment.
And as soon as I brought up the comment on it, I saw, like,
you know, somebody had said the same thing that I was going to say,
and then underneath it, the person who posted said something
along the lines of, yeah, that's great, you just posted on my
thing and you're just driving my video to be seen more.

(12:16):
Yeah, no, thank you.
So, like, you know, like that.

Speaker 1 (12:21):
And then, so, yeah, it's picking up on exactly that:
just any comment is a good comment.
So even if people are just leaving emojis or stuff like
that, like, that pushes that to a bigger audience.
It tells the algorithm, yes, keep sending.
And that also means that you don't necessarily have control

(12:42):
over where your content goes.
You can try to, obviously. If you have, like, ads and stuff
running, you can try and target certain groups, like people who
like dance, people who are, you know, into volunteering.
You can try and target that specifically.
But even that, like, it's not looking at what people have

(13:06):
reported.
When you sign up for Instagram, you don't say, here are my
interests, and that's it.
It is basing it off of what you're actually consuming.
So if you're consuming a lot of dance videos, it's going to put
you into that category of "is interested in dance," which means
that then, when somebody runs an ad for dancewear, shoes, tutus,

(13:30):
leotards, whatever, you are more likely to see that ad, no matter
who you are. Which is why little girls' clothing and
jewelry pages are very popular with adult men, and that is a
proven fact.
Unfortunately, the New York Times, I hope you're ready for
this, the New York Times did a test.

(13:51):
Because somebody had contacted them about it, and they
saw the ads that she had run.
This was jewelry for, like, five-year-olds. This is not
adult jewelry, this is child jewelry, for children. And so in
the ads, some of the ads had, like, the actual child shown
wearing the bracelet or whatever it was.
Others were just, you know, the bracelet, like, oh, buy this now,

(14:12):
type thing. And I quote: the ads "got direct responses from
dozens of Instagram users, including phone calls from two
accused sex offenders, offers to pay the child for sexual acts
and professions of love." Yeah. And then Meta,

(14:32):
when the New York Times contacted them about this, Meta was like, I
don't think that was a very good study.
I think you, like, rigged it.
I mean, two accused sex offenders tried to call this
child, or the person who is managing the account for the
child.

Speaker 2 (14:46):
Yeah.

Speaker 1 (14:47):
People were offering to pay a five-year-old. Even
the professions of love,
that's gross.
You're an adult man.
You're an adult, anyone, I mean, you know.

Speaker 2 (14:59):
And then, oh God, like, even to say what's up to a
five-year-old? Yeah, absolutely not. I, you know, I
couldn't even imagine, like, even Hannah, like, no, you just,
you don't, you recognize.

Speaker 1 (15:17):
Even I, as a woman,
I'm not going to... Even before I did the research for this, if I
saw a page of, like, a young girl who was posting sports or
anything like that,
I am very careful about what I like or comment on.
I don't comment on basically anything, because I find that

(15:41):
mildly inappropriate, that I, as an adult, am watching you do
gymnastics that your parent has clearly put you in.
It's clearly pushing, and you may be wearing a slightly
revealing outfit. Gymnastics, a little bit more,
you're like, okay, you're wearing the outfit because this
is what you wear. Little kid in a bathing suit,
you're like, nope, nope, nope, I'm not touching that.

(16:01):
I'm not touching that with a 10-foot pole.
Okay, how cute you are.
Like, I don't want to be anywhere near this photo.

Speaker 2 (16:09):
Like, I'm literally trying to think in my mind
if there would be an appropriate time, like, even like a parent
taking a video and saying, like, you know, something in the
background.
But no, I can't even think of, like, you know... she really isn't.

Speaker 1 (16:30):
Yeah, you just don't need to be
commenting on this stuff, especially, like, kids that you
have no idea who they are and they just, like, breeze through
your life in this video. And imagine being like, hey, cutie,
like, oh, that's so sweet.
Look at your life and look at your choices.

Speaker 2 (16:41):
You were an adult commenting on the video of a kid.
I think the only possible way I would ever even like that
would be if it's my friend's kid, exactly.

Speaker 1 (16:54):
Yeah, if it's my friend's kid, if it's something
like Facebook, you know, something like that, then yes, I
may like the photo and be like, oh my god, they're so cute.
But again, like, most of my friends do not post photos of
their children online.
I have friends who have actually said, like, I don't send
photos to people of my kids, like, you just won't ever
see.
Like, I have one who has never sent me a photo of her son.

(17:16):
When I see her, she has tons of photos on her phone, and so
she'll go, oh look, we were doing this, we were doing that.
She just doesn't post them because she doesn't want them to
potentially get out. And they're not incriminating.
It's not like these are photos of him,
you know, as a little four-year-old running around
butt naked.

Speaker 2 (17:34):
It's just a child. And, you know, one of the
things I did hear, and I don't know if you're gonna bring this
up, but is that sex predators are going to do things like, oh,
here's Samantha in her cheerleading outfit, and the
cheerleading outfit has, like, you know, the Shamney Indians

(17:58):
logo, and now that person's able to say, oh, she
goes to the Shamney high school.

Speaker 1 (18:06):
Way to jump ahead of me.
Excuse you, I was going to talk about that.
Fine, we can talk about it now.
No, it's absolutely true,
is that these kids,
even if they are just promoting something as simple as
dancewear,
or here's me in my latest cheerleading outfit,
you're now putting this on the internet.
People see it for that.

(18:26):
Who knows who these people are?
They now start to follow your account.
You can trace back, with shocking ease, more than a lot of people
realize.
You can figure out where somebody lives.
With shocking ease you can find out a lot of information
about people. And there is one couple in particular that

(18:47):
they've been called out by their fans.
They are a lesbian couple in the UK, Caitlin and Leah.
They have two children, and they have said the kids' names on
the internet.
They don't show the kids' faces, but they'll post photos, like
real-time photos, of them at a pumpkin patch, and the kid from
behind, the one who's mobile. And I was like, you could... and
people know where you live,
like the town. Someone who was really gross could absolutely

(19:12):
just be in town, could be watching your stories, sees this
story of you and your wife and your kids at the pumpkin patch,
goes there and then, you know, finds your kid and is like, hey,
little so-and-so, you know, mama told me, because they know

(19:33):
that you use these names like mama and mommy or whatever it is,
mama told me to blah, blah, blah.
And they know all about the kid, because every bit of
information has been shared.
And then how does the kid know that this is not appropriate?
You know my mom, you know my other mom, you know who I am and
you know this stuff about me.

(19:55):
You must be okay, right?
How would you know all this?
And even if you don't even go that far, what's to say that
somebody doesn't just go there and take a picture of your kid?
And now they have a photo of the kid's face.

Speaker 2 (20:10):
Oh God.

Speaker 1 (20:11):
Yep, yep, yep, yep, yep.
So keep in mind, too, kids who do sports like dance, gymnastics,
swimming, where they're in revealing outfits, kids who just
are taking photos in bathing suits, let alone the ones where

(20:33):
the kids have been adultified and are wearing makeup, grown-up
clothing, that type of thing.
These parents know what they're doing at this point in time.
These parents know. And I read some really horrifying articles
about parents talking about this, and they knew.
They fully admitted.

(20:54):
They're like, yeah, every day I wake up and I delete followers
off of my six-year-old daughter's Instagram account,
and every evening before bed, I delete followers off of her
account, because they're gross pedophiles. But I keep the
account.
The account is still there and I keep posting photos on it
despite this.

Speaker 2 (21:15):
I saw something along those lines as well.
I forget exactly who, but this was maybe about a year or two
ago.
But I saw something, and I actually was going to comment on
her page, like, hey, it's probably not a good idea, but

(21:36):
other people had commented the same thing.

Speaker 1 (21:39):
Yeah, they don't care.
Yeah, again, Caitlin and Leah have been called out.
You can scroll through their comments and people are like,
this is not safe, what you're doing.
There's another couple, Julie and Camilla, again a lesbian
couple.
I think these two tend to get a lot of call-outs because they've
both been... not because they're lesbians, but because

(22:01):
they're very vocal about privacy and caring about others and
empathy and this type of thing and social justice.
And so then to have that side of stuff where you're like, yes,
this is all good, and then they turn around like, I'm exploiting
my child for money, like, oh, okay. I mean, Julie and Camilla,

(22:22):
they have a son, and they have not shown his face online and
they have not mentioned his real name.
They use a nickname for him.
They were at one point alerted to the fact that he was
potentially, like...
photos of their kids were making their way around the dark
web, and their only response to that was like, oh, we're looking

(22:43):
into it, thanks.
And then they just started blocking everyone who mentioned
it.
Julie and Camilla, look up Camilla's lore; Julie
had a whole bunch more, because
Julie is the one who actually was, like, pregnant with the kid,
and she had a hissy fit and hid all of her stuff on Instagram

(23:05):
for six months.
They're problematic for so many reasons, but this is just one of
them.
They don't always do a great job of covering his face when
they take these photos, and maybe a year ago, somebody had
alerted them to this.
And people were like, ooh, you need to be really careful.

(23:26):
And they were like, thanks for looking into it.
And they've continued to post photos, and people are like, yo, your
kid's feet are just, like, hanging out, and, like, you see so much
of him, please be careful.
And they just... they just delete comments.
Yeah, exactly, it's like, you see a lot of his face.

Speaker 2 (23:47):
And great.
Now he's going to be on my fucking algorithm too.

Speaker 1 (23:49):
I know, sorry.

Speaker 2 (23:51):
Son of a bitch.

Speaker 1 (23:54):
Well, and that, you know, combined with, like, videos
and stuff like that.
Again, like we just said, people know where they live.
Ooh.

Speaker 2 (24:02):
Ugh.

Speaker 1 (24:10):
But yeah, look, you've got to make that money,
and so that's why these people are willing to sit there and
they're like, yeah, I have to delete followers constantly
because they're nasty, nasty men. But, like, the money. Because,
keep in mind, at the beginning of this year, per the
statistics I found, you could be getting around $18 per 1,000
views on YouTube, and that's if you have a bigger channel, it

(24:32):
could be more.
Plus, then you've got sponsorships.
You can run ads.
You could be making way more than that if you have any sort
of decent following.
That's YouTube.
Instagram doesn't pay, but, like, Instagram, you know, you can
have ads, you can have sponsorships, all of that that
will be running too, that you can be making so much money off
of.
Which is why you don't necessarily want to be deleting these
people.

(24:53):
You want these people interacting,
because all those comments of the heart, hey, sexy, I'd fuck
that,
you know, why don't you
give us a little tease and lift up your skirt.
These are real comments that I saw while

(25:13):
researching this, that were left by men, and nobody had deleted by
the time the screenshots were taken.
Like, they had managed to stay there for long enough that
it was just building up there, because that is interaction, and
that interaction drives the channel, and when your channel is
moving along like that and you're getting all this
interaction, it's going to be pushed to more people and you

(25:35):
can make more money off of your child, who you were exploiting.
Gross, gross.
Yep, that, I mean, that's literally the only thing I can
say, is just gross. Yeah, back in 2020, Meta found that there were
500,000 child Instagram accounts that had, quote,

(25:57):
inappropriate interactions each day.
Half a million kid Instagram accounts, not just Instagram accounts
in general. Because, look, if adults want to do that, that's fine.

Speaker 2 (26:14):
Yeah, no.

Speaker 1 (26:15):
Yeah, half a million, every single day, in 2020.
And yet people are still out there posting this stuff, because
you can get so much money from it.
You can be making hundreds of thousands of dollars by running
these types of accounts, and even if you're not making upfront
money, a lot of these are getting things, especially the

(26:36):
moms who are running accounts for their daughters who do stuff
like cheer or beauty pageants or dance. Because when you go to
these competitions,
it's quite expensive to pay to travel to the competition, to
stay in the hotel. You have to have all these different outfits
that you're wearing, and just the outfits alone can get really,
really expensive.
And so if you can find a way to be gifted or sponsored by a

(27:00):
company, that can go a long way towards helping your kid do this,
this whatever thing it is that they hopefully want to do,
and it's not you projecting onto them. You're able to make it
work because the clothing is paid for, the hotel is paid for,
because you got a sponsorship from somebody, and now you're
able to go there.

(27:20):
You wouldn't have been able to otherwise. And so I think that's
why they just keep sitting there, because you can get so
much more, again, more than just, like, dollars, you get the crap,
you know. So.

Speaker 2 (27:33):
So this is something that kind of pops into
my head.
It's like, I mean, now you're placing the
child's worth on views.
That's going to cause a lot of difficulties in the
future.
I'm curious to see what Honey Boo Boo is up to today.

(27:56):
I'm curious if she's ever spoken out against, or for, whatever,
her upbringing.

Speaker 1 (28:05):
I know that she has sort of stayed in the media.
I don't know where she is exactly right now.
She stayed for quite a while and she would pop up
periodically.
Part of that was because that family is a hot mess, and so
sometimes it was, you know, mom or dad that was in the media
for the wrong reasons. But I think that, because she grew up
in it... And there was actually,

(28:27):
there's a famous child influencer that I found as part
of this.
Her name is Jackie Dejo.
She's Dutch originally, and, you know, kind of arguably, she was
groomed by her parents, like, she was brought up in this
environment, kind of like Honey Boo Boo.
But there are always cameras, you're always performing, and so
I think you lose that kind of, like, oh, this is

(28:49):
actually not okay, that my entire life is just out there.
I mean, she went rated R, this one.
I don't know about Honey Boo Boo, I don't think she has, but
Jackie, since turning 18, has an OnlyFans.
Even before that she was posting risque photos, and her
parents allowed her to.
They're like, oh, if she really wants to, it's okay.

(29:10):
Like, yeah, we delete, like, dick pics off of her account
for her. When she was younger... oh god, the fact that
we have to talk about that, the fact that we're talking about
your 13-year-old. Because they started posting photos of her
when she was about six, photos and videos of her doing sports.
She likes to, um, I think she does snowboarding and, like, one
or two other things that she was quite good at, and so they were

(29:32):
posting that for her.
And then, when she was 13, she got her first swimwear brand
deal, and that's where things just, like... dick pics, man,
everywhere. And then, like, nude photos of her were published.
And then she was like, you know what, you can go ahead and
publish it, I'll publish my own.
And her parents were like, okay, and just let her.

(29:54):
And so now you've got this, like, barely legal child running
around posting, because she was brought up to think that that
was okay, and because she has made so much money off of this,
and her family has made so much money off of this,
no one can convince her
that this is not okay, because she's like, yeah, but I make
bank. Capitalism at its finest. Yeah, in case you needed another

(30:19):
reason to hate on it.
Also, another fun statistic, for if you're thinking, like,
oh, this is only for, like, really big accounts, that, like, smaller
accounts are not going to have this issue: that is partially
true.
Smaller accounts tended to have far higher numbers of women
followers, based on studies that were done, again, by the New York

(30:42):
Times looking at it.
Once you got to around 100,000 followers, often those accounts
had over 75% of their followers who were male. Over 75%.
And this is an account that is showing a young girl.
There are very few accounts of boys in this situation.

(31:05):
Obviously they do exist, but not anywhere near
the number of, like, young girls who you see being posted
on Instagram by their parents, because at that age you're not
posting yourself, it's your mom.

Speaker 2 (31:17):
She's the one who is posting these photos and videos.
Yeah, that's, you know... and so, okay, so the thing
that kind of comes into my mind is, like, rather than
deleting these comments and shit like that, I don't know,
call the fucking police. Yeah, I mean, you can report this

(31:40):
type of thing, and you can and should.

Speaker 1 (31:43):
I know that a lot of people are, and there has been
some discussion about the feelings of frustration, about, I
feel like I'm reporting stuff and nothing is happening, like, I
keep reporting this and nothing is happening.
We'll get into a little bit of that when we talk about where do
we go from here.
I do feel that part of the responsibility is just not
posting this stuff. Like, we actually do have control.

(32:05):
The internet does not necessarily need all of these
photos and videos.
I've been hating on Instagram for all this, because Instagram
is the main social media that I use, and so that's where I see it,
but it's everywhere.
It's Facebook, it's YouTube, it's TikTok, it's Snapchat.
All of them are guilty of this.
All of them have algorithms that push any content that has

(32:26):
interaction.
All of them have known problems with, like, actually following up
with, like,
hey, this person is making, like, threats and really disgusting
comments on stuff, like, they need to be blocked.
All of them have that issue of actually following through.
Yeah, yeah, yeah. And I know that 100,000 followers

(32:50):
sounds like a lot, but if you want to be making money, you are
going to be needing that.
Not necessarily that level, of course, it depends.
But you're not going to be getting sponsorship deals with
100 followers.
You're not going to be getting sponsorship deals with 1,000
followers.
Like,
you do actually need to have that base to build up.

(33:12):
So having 100,000 for somebody like this isn't strange, or
like, oh, there's only, you know, 10 kids out there who have more
than 100,000 followers.
No, there are a lot.
So, I mean, we could continue to talk about, like, all the
terrible things that are happening out there.
I think that is enough child predators to give us a

(33:38):
good base to say, where do we go from here?

Speaker 2 (33:43):
I would say, yeah, don't fucking do it.
And then, on top of that, I think that, you know, if somebody...
because that's a hard thing, because, like, if
a 40-year-old guy is commenting, you know, like,

(34:05):
hey, hot stuff,
um, you know, it's not against the law.
And, you know, unless they actually post something along the
lines that is actually sexual, that no way in hell can
be taken any other way but sexual, yeah, no, and that's

(34:30):
that's fucking gross.
Um, but it's moments like this,
I believe in public shaming.

Speaker 1 (34:45):
I would agree with you on that.
I know you and I have had some difference in opinions on, like,
the call-out videos and that type of public shaming.
But absolutely, for this type of thing, I fully support tracking
down these men.
We don't dox them, don't publish, like, that information.
But, you know, no shame in figuring out who they are,

(35:06):
making sure it's the right person.
And then, I don't know, whoopsies,
I sent a screenshot of this to your workplace.
Were you aware that your employee talks like this?
Were you aware that your employee is making these
comments on videos and stuff like that? Or to their wife:
are you aware that your husband's running around on

(35:27):
Instagram and TikTok saying this type of thing?
Now, I fully name and shame for this type of thing.
I agree with that, because unfortunately we can't count on...
It'd be nice to say, hey, you know, Instagram and TikTok,
they're going to get serious.
That would be ideal, is if the companies would police

(35:48):
themselves and say, hey, this is a problem.
We know this is a problem.
We've known it for years, that this is, like, a really big
problem, and so we're going to cut down on, you know, who can
see stuff, and you'll be able to say, like, oh, I don't want men
to see my content, like, to add that feature in.
So if you're working with a child's account, that you could

(36:11):
actually separate that and say they're not going to. Because as
long as you are on their site and watching and commenting on
stuff, the more you do that, the more ads you see.
And the more ads you see, the more money you make for them.
It's capitalism,
again, back to bite you in the ass, and so they're not going to.
We are starting to see laws get passed, though. Never thought I

(36:33):
would have something good to say
in the "where do we go from here" part.
Now I'm going to tamp the enthusiasm down, because most of
this is around payment for child influencers, which is a
whole 'nother deal: that, like, there are these kids acting and
working, right, and there's nothing to protect them when

(36:54):
they turn 18 from their parents being like, bye bitch, and kicking
them out, and they don't get anything.
They may not actually benefit from any of this stuff that is
happening for them.
It's just mommy who makes all the money.
Mommy and/or daddy; some of these are the family vloggers.
We are at least starting to see laws get passed that say that
you have to compensate child influencers.

(37:15):
50% of earnings for a piece of content must be placed in a
blocked trust fund; that is actually a law that Illinois passed.

Speaker 2 (37:21):
Wow, Illinois.

Speaker 1 (37:23):
I know. Now, that depends on how much of the video
the kid is in.
So it may be less if the kid is only in, like, a tiny little
second of the video, but if they're in, and I forget what
percentage of the video, you are required to put this bit aside
for them, again in a blocked trust fund.
And it was actually a teenager who helped get this passed.

(37:46):
A 16-year-old saw what was happening and was like, damn, I
wouldn't want my most intimate private moments as a child to be
published online and then to have everyone know this and see
it, and I not be able to benefit at all or be able to delete this.
So she helped get this passed, which is good for her.

Speaker 2 (38:09):
Go, Gen Alpha.

Speaker 1 (38:10):
Right.
I was like, oh, putting the rest of us to shame.
California's in the process of passing something similar.
So they already have the Coogan Law for child actors, and
they're rolling child influencers into that.
France passed a law in 2020 regarding how under-16s can act

(38:32):
on the internet... not act on the internet, but content about them,
what happens to their earnings.
Going back to the US,
Washington's working on a similar bill.
We've kind of acknowledged that, like, something should be done.
Now, we're not talking about all of the data that's out there at
this point, like, no one's really doing anything about that.
Now, Europe, of course, has the right to be forgotten.

(38:55):
Do you know the GDPR?

Speaker 2 (38:58):
No.

Speaker 1 (38:59):
Okay, so it's a law passed in Europe five years-ish
ago, five, six years ago, maybe a little bit earlier.
It includes a whole bunch of regulations, and one of them is
that, for instance, if you're ever here in Europe and you're
trying to get on the internet, every different site you go to,
you're going to have to either say yes or no to give them

(39:20):
access to your data, and, like, which ones.
This is essential, not-essential cookies, that type of
thing.
That was part of it.
One of the things that's also included in GDPR is the right to
be forgotten.
You can legally, like, you have legal backing to go into Google,
for instance, and be like, delete everything, delete this
article about me, it's gone.
It's gone from the internet. And this has extended in France,

(39:46):
and I think in Europe, that they're working to get kids the
ability to kind of semi-retroactively, like, when
the kids become adults, they can be like, I want this gone. Now,
unfortunately, like, it can still exist other places, because, you
know, the internet is forever and people may have
saved copies or screenshots or something like that.
So it may not be truly, truly gone, but it at least gives them
a better ability to be like, I was forced to do these videos or

(40:07):
take these photos when I was really young.
I hate it.
It ruined my life.
I don't like any of that, and, now that I'm 18, delete it.
Delete that account. I like that. I like that too.
I'd love to see more of that, and I think it's gonna have to
be the government that steps in, because the companies aren't
going to, which is a sad day, when you're like, please,

(40:28):
government, come save me. Oh god, yeah, we're fucked.

Speaker 2 (40:37):
I know, like...
So, like, I do know, like, California did pass something
along the lines of, um,
you know, please don't sell my data. Yeah, GDPR,
the California law is based off of GDPR, I believe. And so, oh
god, well, you know, and we're still at the childhood of the

(40:59):
Internet.

Speaker 1 (41:02):
And look how bad it is now.

Speaker 2 (41:04):
Yeah, well, I think that's going to be true with
anything.
Yeah, you know, I think of, like, even cars.
When they first came about, you know, there were no speed limits.
There was no...

Speaker 1 (41:16):
I mean, seatbelts.
Required seatbelts is a relatively new thing. Yeah, so.

Speaker 2 (41:25):
So yeah, you know, like, um, even turn signals
and brake lights. And so, yeah, I mean, you know, at one point cars
were... you know, as time progressed, you know, they got
safer and safer, but you know how many people got killed, you

(41:50):
know, prior to that.

Speaker 1 (41:52):
No, exactly. And unfortunately the Internet is
moving so much faster than regulation can, because
regulation is just... it's slow.
It takes a long time. You realize the problem,
you start talking about the problem, you make a big deal
about the problem, you finally get it, you know, to the right
people.
The right people talk about it, make a big deal about it.
You know, eventually it takes years before we finally go, oh,

(42:15):
hey, hey, that's... it's illegal to do that. It's not a speedy
process, but the internet moves so, so quickly that I need the
government to speed up, do something, because by the time
they finally react to this, we'll have moved on to other
things, other, far worse things. And the sooner laws are set

(42:37):
down...
Because, again, the companies know. If I'm
finding this information, like, doing my searches, like, it's not
like I was doing some sneaky shit, like looking around within
their, like, hidden internal files, and I was like, 500,000
accounts a day, what? They published that information
themselves almost five years ago, and we just haven't

(42:59):
gotten there. But, like, something has to be done to start saying
this is illegal,
that's illegal, you know. And it may not be top government, but,
like, within countries even. But, like, the Norwegian couple,
Julie and Camilla,
you know, that's the type of thing where I look at them like,
Norway would do well to...
They know who Julie and Camilla are.
They're famous influencers within the country.

(43:20):
They've been known for years.
I am truly shocked that somebody hasn't kind of been
like, yo, child protective services, like, we may be a
little bit worried about, like, is this child being exploited,
is this child okay, that type of thing.
I'm a little bit shocked that there hasn't been more of a
grassroots movement towards, like, reporting

(43:41):
these people, or towards pushing for even just local laws that
say you can't do this, you cannot post your child at this
age, you know; or making it illegal if you were caught commenting,
not racist, but really super sexual stuff.
I'm shocked that we haven't acknowledged that that's a no-no.

Speaker 2 (44:10):
Oh God.

Speaker 1 (44:13):
So on our scale of toxicity, where would you rate
momfluencers?
Because that's kind of the overarching theme for all,
like, three of these episodes; momfluencers are like the main
issue here.
Would you say that they are a green potato, makes you kind of
sick if you eat it, but just scrape off the green part and
it's okay?

(44:33):
Are they a death cap mushroom, 50-50 chance of death or coma,
even when cooked?
Or are they a delicious but deadly last snack of antifreeze?

Speaker 2 (44:47):
So I would put this at death cap plus. Uh-huh.
And the only reason why I don't say antifreeze is because,
unfortunately, influencers are how we learn to live
nowadays.
Um, yeah, sad but true.
And, you know, like, being able to find the best way of having a

(45:11):
play date, and being able to, you know, introduce your kids to,
you know, other kids, and, you know, things like that.
Yeah, that's great and valuable and things like that.
But I mean, when 75% of your viewers are fucking old guys...
Yeah, like, that was a horrific figure.

Speaker 1 (45:33):
There were so many horrific figures that I ran
across here.

Speaker 2 (45:38):
And even, you know, because as I was saying that,
like, even giving benefit of the doubt to the men, maybe half of
them are single dads.
Yeah, are single dads who are trying to figure out, like, you

(45:59):
know, a way to become, you know, a father slash mother to the child.
Yeah, or maybe, you know, their daughter is also in whatever
sport and so they're following along.

Speaker 1 (46:07):
Absolutely.
You do, of course, have some of these men who are legitimately
following.
I was like, even two percent, man, that's awful.
But when you look at 75 percent of followers on a child's
account, it's like, even if half of you are legit, which I don't
think is true, that's a lot of dudes. Yeah, you don't need to be

(46:31):
following this.

Speaker 2 (46:32):
Yeah, yeah.

Speaker 1 (46:37):
So I would actually give this antifreeze, because I
think it has such a potential to ruin people's lives for way
longer than we think.
Again, we are just now kind of starting to see how it can ruin...
we know how it can ruin our lives as adults.
People have learned that lesson, and they're still learning it.

(46:59):
But I think that as time goes on, in the next, like, 10-ish
years, we're going to see that wave of kids who grew up on the
internet, who are now becoming adults, and some of them may be
like Jackie, and they're like, hell yeah, I'll take what I can
get. But I think a lot of them are going to be
like some of these other people that I saw, who were like, no,

(47:21):
this is horrific, like, all this information exists about me and
I don't want it to, and people know intimate details about my
life and my childhood that I can never erase, and people will
always know that, and so I'm always at risk, because some
crazy stalker followed me as a kid and now here I am, I'm 25,

(47:42):
I'm 30, and they're still obsessed with me and they're
still following me through other people, or they know where I
live.
They found out about me and now, no matter what I do, they keep
following.
I give the benefit of the doubt to the first momfluencers way

(48:03):
back in the day, the mom bloggers.
I don't hate on them for what they did, because we didn't know.
I mean, think about the stuff we used to post on Facebook or
MySpace.
Nobody was talking about this type of thing back then.
Yes, of course it existed.
Of course child pornography was on the internet since day one.
It's the internet.
But this was not a conversation that was happening, about, like,

(48:25):
hey, your photos could be taken.
We didn't have AI that could animate a photo, a still photo
of you, and have you say or do really inappropriate things
that never actually happened.
We have that technology now.
That didn't exist.
Back then we weren't thinking about that, and so I don't blame

(48:48):
them for starting it and saying, yeah, I'm going to have this
blog and, like, I'm going to post my kids on there and, like,
you can follow along and see what we do.
I don't blame the people who, you know, want to post some photos
of their kids on Facebook.
I love seeing my friends' kids.
I am always excited to see them, whatever it is they're doing,
but there comes a certain limit where I'm like, you are posting a
lot of photos of your kid.
How is your kid going to feel when they're older?

(49:09):
Like, put yourself in their shoes.
And that's even, you know... and I hope you have your privacy
settings in order for that.
If you're out here posting a lot of photos of your kid, you
absolutely should be checking.
You can post them, though. I'm not going to say, like, no one
should ever post photos of their kids, because that's just... no.
But the people who are willingly choosing to create
accounts for their children in this day and age, you know what

(49:33):
is out there.
You don't have the benefit of the doubt to say, oh well, I
really want my seven-year-old to have these opportunities and to
get these scholarships... and it's not even scholarships, like,
sponsorships for dance, because it's her passion.
I'm sure it is her passion, but you know full damn well that

(49:55):
there are going to be child predators looking at photos of
her and pedophiles looking at photos of her.
Is that worth the money?
And I think that when you say yes, it is,
I think that makes you a really bad person.
Yeah, yeah, hence antifreeze.
I don't think you can be a momfluencer in 2024 or 2025 and

(50:20):
be a good person.
If you are posting photos and videos of your kids, if you are
sharing details about them to the wider internet to make money,
you're a bad person.
You're a bad person.

Speaker 2 (50:34):
I think there is a proper way to do it, which, sure,
you're not going to get as many views and things like that.
But, yeah, you know, if you do want to, if you do legit want to
help somebody out, and you found this new way of doing

(50:54):
things.
You don't have to show videos, you don't have to post pictures,
you don't have to go over every single part of their lives.
You can still say, hey, I discovered, like, you know, like,
a really good way to have my child (gender neutral, "my child,"
you're not giving any information out) meet friends

(51:21):
online or whatever.

Speaker 1 (51:23):
Yeah, they like to go on this website and they can
play these games and it's super educational.
Yeah, I would agree that I think there are some very, very
narrow ways that you can still be an influencer about kids and
motherhood and all of that without involving your kids.
It just takes so much more work to do that.

(51:46):
I think people aren't going to do it.
They're going to have to put in a whole lot more work to get
there, and then it will always be a fight, because the people who
show their kids are going to get the views and the people who
are careful are not going to get the views.
Oh God. On that happy ending:
if you also hate momfluencers, or if you have been traumatized

(52:09):
by a parent who overshared, who got a little into sharenting and
shared horrific things about you online, you can write to us
at toxic at awesomelifeskills.com.
You can also write to us or follow us on social media.
We've got all of them, except for Snapchat, 'cause we don't got
time for that.
Follow us there, please.

(52:30):
We like to see you, and that's it until next week.

Speaker 2 (52:40):
This has been the Toxic Cooking Show. Bye.