
January 26, 2025 52 mins

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:11):
This podcast discusses true crime, which may entail violence and
other material intended for mature audiences. Listener discretion is advised. Hey,
it's Kayla and it's Lexi. And I forgot to mention in
our last episode that we have a different release schedule
for episodes. I forgot to mention that. I feel like

(00:32):
it was important. Oops, my bad.

Speaker 2 (00:35):
It's not Sundays anymore.

Speaker 1 (00:37):
It's not every Sunday, it's every other Sunday.

Speaker 2 (00:39):
Oh yeah, okay, yes.

Speaker 1 (00:42):
Yeah, because we're just kind of slowly getting back into
instead of full force, and every week it's kind of nice.

Speaker 2 (00:50):
We're doing our best. I feel like that comic of
that that like creature emerging from the cave, that's like, hey, guys,
I'm finally done this semester, Mom Dad Friends. Yeah, probably
because blessedly we've been out of school for a while.

Speaker 1 (01:10):
Yeah. And also what I'm going to start doing. I'm
working with other podcasts, Like we have like a group
chat on X I've been in there for like a
year or two. I've just never talked in it, but
there's uh, we're gonna start swapping ads. So at the
end of each episode, I want to play another
podcast ad, so you guys can hang around and listen
to it and check them out because they're also indie podcasters.

(01:33):
Nice. Today it should be... it's Murder Up North.

Speaker 2 (01:38):
Murder Up North. Oh okay, I should probably make like
a burner X account so that I can be involved
in this stuff. It's like TikTok. Like, technically
we co-own the TikTok and I continually forget that
that app exists.

Speaker 1 (01:53):
It did die there for about twelve hours.

Speaker 2 (01:55):
Oh yeah, I forgot that happened. I'm a lot more
offline than I used to be, so like I'm I'm
out of the loop. I'm like watching I'm like watching
Instagram reels. I'm like finding out about memes like three
months after they drop. It's it's embarrassing. I've become my mom. Actually, no,
my mom is more hip than me. Like my mom
knows more memes than I do. Now.

Speaker 1 (02:16):
I saw this video where somebody got a tattoo of a
gravestone with TikTok on it, and then as soon
as it was done, they're like, hey, TikTok's back up.
You kind of jumped the gun there, bud.

Speaker 2 (02:29):
Like, that is so funny. I love people getting, like,
completely stupid tattoos that, you know, they're gonna regret.
Like, I don't know if you remember, in like twenty sixteen,
when people were getting, like, Bernie Sanders tattoos, and I
was like, regardless of the outcome of this election, this
is a tattoo with a time limit, a very, very

(02:52):
short time. Yes.

Speaker 1 (02:54):
I saw some guy at the gas station during, during
twenty... when was the last election? Twenty twenty. This guy
has on the side of his skull Trump twenty twenty, Make

Speaker 2 (03:05):
America Great Again.

Speaker 1 (03:06):
I was like, oh no. Oh, like, even if
it was the other candidate or whatever, like, that's a
tattoo that's on your head for life. Oh my god.

Speaker 2 (03:16):
More tattoo artists just gotta start saying no, like I
get it, get your bag, but oh that's that's foul.

Speaker 1 (03:26):
I would probably do it as the tattoo artist, because I
would just giggle. You know what, this is your money,
free money, thank you.

Speaker 2 (03:34):
You're like, free money. Don't tag me in the Instagram
post. Or TikTok, or, what the hell, RedNote? What the
hell are people posting on these days? Since, like, apparently Instagram's
kind of dying, Facebook is like definitely dead, like, what
are y'all posting pictures on?

Speaker 1 (03:49):
I just joined Threads like last night, and it's like
there's a lot of people on there.

Speaker 2 (03:54):
Oh, I'm on Blue Sky. I tried Threads and I
can't remember why I stopped using Threads. I think it's
just kind of dead. Like I tried to like reuse
it and I was like, man, this is nothing. And
I mean I'm on Blue Sky, which I'm using for
the same purpose that I used to use Twitter and
probably Threads, which is to like put stupid thoughts out
so they don't cause brain damage. What's this episode about? Okay, well,

(04:29):
I'm not medicated today. I'm so sorry.

Speaker 1 (04:32):
Oh my god, that was me. Yes, today I laid
down in bed. I'm like, Eric, I forgot to take
my antipsychotics, and he's like, good Lord.

Speaker 2 (04:40):
Eric has the opportunity to do the funniest thing. Does
he have a hat man costume? God. Well, the thing
is, normally you and I, like, kiki before the episode.
I forgot, we hit record.

Speaker 1 (04:54):
Yeah, I didn't say... Oh, I think I sent a
Snapchat about this topic a while ago: Smartschoolboy9.

Speaker 2 (05:03):
Oh, I've been dying to get into this one because
like I literally have the YouTube like video essay on
it saved and I have yet to watch it. And
I've been I've heard it's like the most insane rabbit
hole of twenty twenty four. I am so excited to
get into this one. I'm going into it. I know
nothing about it.

Speaker 1 (05:22):
Is it the Nick Crowley YouTube video?

Speaker 2 (05:24):
I think?

Speaker 1 (05:25):
I think so. That sounds right. That's what I have,
and it's gonna be in the show notes, because that's
the one I watched. So we're gonna talk about
Smartschoolboy9 and Piper Rocks, which are, like,
kind of in the same ballpark.
Oh okay. Have you heard of Piper Rocks?

Speaker 2 (05:39):
That one I have not heard of.

Speaker 1 (05:41):
Okay, alrighty. So if you guys don't know what Smartschoolboy9
is, or Piper Rocks, this episode is
going to deal with... it's got, like, not child porn,
but it's gonna mention it. Obviously we don't
talk about stuff like that detail-wise, but this is
(06:02):
like it mentions it and online safety for your children.

Speaker 2 (06:08):
Okay, So it's got a good message.

Speaker 1 (06:10):
Yeah, so we need to watch the kids online, basically.
So today we'll be discussing a strange topic that may
or may not be a conspiracy: online accounts that are
children-oriented but may or may not have ulterior motives.
So this was found as a Reddit post by a
user in the forum

(06:31):
Internet Mysteries. This post was asking about strange Instagram accounts
that they happened upon and wanted to know if anyone
else had heard of these accounts. We will start with
the accounts truth Sticks eleven and Girl Chloe twelve. These
are both in that I mentioned their Instagram accounts and
these two accounts were started as accounts that seemed to

(06:52):
be ran by mothers who were posting about their children.
Truth Sticks would post normal photos of a child learning,
earning rewards in school, playing, et cetera. Just normal, like,
mother-child stuff, and the account was supposed
to be of a twelve year old boy. The posts
started in twenty twenty one, and they would even make

(07:12):
posts warning about predatory accounts and to not communicate with
them in case they mean harm. The Reddit post went
on to say that the Instagram posts were normal children's posts,
but as time went on, the posts seemed to get
more and more disturbing. The child started to look more
AI generated instead of like a real child, and it
had giant red lips and a bright white face. So

(07:35):
I say, I really hate that I had these pictures
on my phone. I'm gonna try to put them in
the discord chat. Pull it up. It just makes me... oh.
Oh yeah, if you guys look it up,
be warned. It's creepy as hell.

Speaker 2 (07:54):
Oh that's oh, I mean, that's just oh, it's so
like Uncanny Valley.

Speaker 1 (08:05):
Yeah.

Speaker 2 (08:06):
And like if they are AI generated, the idea of
giving like AI creation tools feeding them photos of children
makes my stomach turn because of like the implications of
the potential outcomes and abuse of that. Like, I hate
I hate this for like more than just it looks disturbing.

(08:27):
Like I really oh, I'm uneasy about this already, I know,
I said to Meg, And I'm like, oh, I'm so
excited to get into this.

Speaker 1 (08:33):
I... ooh. Yeah, I didn't know what to expect.
So these pictures that I showed are
on the... there's a lot more in the YouTube documentary,
and I wasn't prepared, and I'm in this plant room
by myself, in the dark, and then all of a sudden,
like, these creepy-ass pictures overtake my screen. Nope. Made

(08:57):
me a little nauseous. So the best way I can
describe these photos to you guys is it's just like
a pale, like, a child, very uncanny valley, doesn't really
look like a person. The lips and mouth are very

Speaker 3 (09:12):
Wide, very big, dark red lips, bright, big square teeth,
no eyebrows, very creepy, very creepy.

Speaker 1 (09:26):
So the photos that these accounts would post cause some
disturbance on Instagram as the photos started looking more and
more AI generated, and eventually one post showed a grown
man dressed as a child with a white painted face
and giant red lips. Let me... I'm sending another picture. Oh,

(09:50):
and he has videos of him, like, talking, and
like his tongue like kind of sticks out of his
mouth as he's trying to talk. I don't know if
that's just how he is or if he's trying to
like be childlike.

Speaker 2 (10:02):
I don't know, Yeah, but I mean that's that's a
like that is so clearly a grown person. That is
a that is an adult, that is a taxpayer. Yes.

Speaker 1 (10:14):
After the postings of the man, it went back to
the AI generated child and it was getting to the
point that some of the posts were starting to become
suggestive of child porn, where the AI child would be
put into suggestive positions and their face would be edited
over adult bodies.

Speaker 2 (10:29):
Oh my god.

Speaker 1 (10:31):
Yeah, the video obviously didn't show those like it showed
like a face and that was about it. Like, so,
if you guys do watch it, it doesn't show anything
like suggestive Okay. It was obviously starting to look less
like a page ran by a mother about their child
and more like a predatory page looking for who knows what.
Girl Chloe twelve was a similar account that was supposedly

(10:54):
ran by a thirteen year old girl and quote monitored
by a parent. This account would also post about
the dangers of fake profiles and was aimed towards children. What
was also weird about these two accounts is that they
started posting photos of the child wearing black leather mini
heeled boots, Like so every post like had like the
pictures of these children and they would be like cartoon

(11:17):
bodies wearing these boots or the boots would like be separate.

Speaker 2 (11:21):
What on earth?

Speaker 1 (11:23):
I don't know if it's, like, a fetish thing,
or... I have no idea. Some
speculate it's some type of fetish content. The only
difference between these accounts is that the Girl Chloe account
would post photos of real children, not seemingly AI generated
like the Truth Sticks account. So where those photos of
those children came from is unknown, whether they were from

(11:45):
the Internet, other posts on Instagram, or if the poster
had actual access to these children and would post their photos.
Don't know where these came from, which is terrifying.

Speaker 2 (11:56):
Yeah, I don't like that.

Speaker 1 (12:00):
People started to suspect that maybe these accounts were ran
by the same person, but for what reason is unknown.
One of the posts on the account is a video
of a falsetto childlike voice talking about the dangers of
fake accounts and to be on the lookout for them,
which was also very creepy, just because the sound of
it sounded like a grown man trying to sound

Speaker 2 (12:19):
Like a little girl.

Speaker 1 (12:20):
Oh yes. Now let's get to the account Smartschoolboy9.
This account is similar to these accounts in
that it is also posting about the dangers of fake profiles,
and is also obsessed with the mini leather heeled boots
and school uniforms. However, instead of posting photos of children,

(12:41):
real or AI generated, Smartschoolboy9 posts photos
of the grown man dressed in a white face, red lips,
and a too-small school uniform with the black boots.
This is the last picture I'm sending.

Speaker 2 (12:53):
Oh, it's a it's really disturbing.

Speaker 1 (13:01):
Yeah, he like wears like boots, a skirt, blonde wig.
I needed to leave the photos off my phone before
I forget and have a heart attack later.

Speaker 2 (13:14):
It's so disturbing. It's almost giving me, like, early YouTube
uncanny valley content, like the robot singing I Feel Fantastic,
I don't know if you've seen it, or like the
Max Headroom incident, where you're almost like, is this just
someone being really weird, or is this, like, you know,
fetish content? Like, what is it? But it's... it's really creepy.

Speaker 1 (13:40):
Yeah. He's also seen wearing short skirts and long wigs.
This is supposedly the actual user of all these accounts,
and he would post about going to school wearing his backpack,
posts about getting braces, and other childish things and it's
just very off putting to see.

Speaker 2 (13:57):
Yeah, nothing about that gives me a good feeling. Like, even
if you're doing it, like, as a bit, like, it's
just... nope. Yeah, no, that's a weird bit, yeah, exactly,
to keep doing. It's

Speaker 1 (14:10):
A weird niche.

Speaker 2 (14:12):
Yeah.

Speaker 1 (14:12):
The Smartschoolboy9 account is mainly these videos,
and they are honestly just very creepy to watch. So
be warned if you watch the Nick Crowley video that
is posted in the sources on this episode. I may
post some photos on the Instagram. Honestly... I don't think
I put that in my script. I don't think I'm
gonna do it. I don't. I don't need that juju
on me.

Speaker 2 (14:31):
You're like, I'm changing my mind.

Speaker 1 (14:32):
We're not doing that. No, because next thing you know,
Smartschoolboy9's account is gonna come up
as a suggested follow. I tried looking up these accounts,
but there's so many, like, copycat accounts, I don't know
which one was which. They all have the same pictures.

Speaker 2 (14:53):
That's so weird. What is the motivation for copycat accounts
of something like this?

Speaker 1 (14:58):
I don't know, no idea, Like I feel like this
should be studied. Yeah. So, not only does he dress
and act like a child, but he has videos of
talking and the voice sounds like the high child like
voice from the other accounts mentioned before. He also posts
videos of himself walking outside and posing outside and on stairs.

(15:18):
which was very weird. Like, there's videos of
him walking on the street, and you can
hear people saying hi to him like it's a normal thing. Okay,
I don't know if I would say hi to somebody
dressed like this, and that is being very judgy. I'm sorry.

Speaker 2 (15:36):
I don't say hi to normal looking people on the street.
Then again, don't take that personally.
That's just... I'm usually busy
and I can't stop to talk, or I've just had
one too many street conversations with strangers go horribly awry.
I just look forward and keep going, and I especially

(15:57):
don't engage in conversation with whatever is happening here.

Speaker 1 (16:00):
Mm hmmm.

Speaker 2 (16:01):
Like, seriously, last time I was in Raleigh,
I was there for a veterinary conference.
I had a homeless man ask for some, you know,
leftover brunch that me and my wife had, and we
were like, yeah, sure, like, you know,
go right ahead, here you go, and here's
like five bucks too, go ham. And he was just
like, are you all gay? We were like, yeah, and
he was like, you're gonna go to hell, and then

(16:23):
just stood there for like five minutes telling us we're
gonna go to Hell. And I'm just like, okay, buddy,
and where are you going? Not home? It's great. Like,
I don't talk to... I don't talk to strangers anymore, like.

Speaker 1 (16:34):
The audacity of, like, can I have your food before
you go to Hell for an eternity? Can I have
your sandwich?

Speaker 2 (16:42):
Literally sitting there holding my muffin like you're gonna go
to hell? And I'm like all right, and you go
enjoy that muffin. I guess like okay, yeah, I don't.
I don't talk to I don't talk to people anymore.

Speaker 1 (16:54):
I don't think I ever have. No, I never did. I
try not to. Like, I do the smile-and-nod thing,
hope it wasn't a question, and just go on with
my day.

Speaker 2 (17:02):
Same, same here. Aha, and they keep going, and
you hope they didn't ask, like, you know, where's First Street?
And you're like, ah, yeah, so that's all you're getting
from me.

Speaker 1 (17:12):
Sorry, Oh my gosh. So internet detectives on Reddit and
Instagram have found at least a dozen or so other
accounts just like these ones, and they're all supposedly ran
by the same man.

Speaker 2 (17:27):
Can the FBI raid his house already? Like, this...
something fishy is afoot.

Speaker 1 (17:33):
No... we'll get into that. I mention it later.

Speaker 2 (17:37):
Oh boy.

Speaker 1 (17:39):
Now, if the posts about him dressing as a child
and making AI children pose suggestively wasn't bad or creepy enough,
it gets worse. It gets weirder. This is the second
episode in a row I've had to say that: it gets weirder.

Speaker 2 (17:53):
Oh boy, all right, I'm buckled in.

Speaker 1 (17:56):
He also has posted videos of real children playing on
playgrounds that it is assumed he has recorded himself, but
obviously it cannot be proven that he did. Like, he's recording
children and just putting them online like he has access

Speaker 2 (18:08):
to them. I don't... I don't like that. I hate that.

Speaker 1 (18:14):
Hm. There is also another account that is supposedly his.
It's under the name of Stephanie, and this one differs
in that there's no creepy photos, but only strange poems
and that account dates back to twenty eighteen. So now
how does all of this seem dangerous and not just creepy?
This person is writing comments to real children on real accounts,

(18:35):
trying to befriend them and warn them of fake accounts. Ooh. Yeah,
that's the bad thing. Like they're messaging children. They're also
posting photos of real children, and the origin of these
photos cannot be found, so it must be assumed that
these photos are either being sent to this person privately

(18:57):
or being taken. So it's also not known who would
be sending those photos, children or predators.

Speaker 2 (19:03):
Yikes.

Speaker 1 (19:04):
So people have tried contacting the police about this user,
but not much can be done as no crime has
officially been committed, and I don't really think they know
who or where this person is. Like, yeah, he's creepy
and like unsettling, but like officially like no crime, he's
not doing anything illegal, yeah.

Speaker 2 (19:22):
And I mean, like I guess I get that, like,
like technically there's no overt crime happening in this, but
it's just it's just kind of like, if you're doing
that publicly on the internet, what are you doing privately
in your home, and that's where I feel like the
crime is maybe perhaps happening.

Speaker 1 (19:37):
Yeah, but I guess, like I don't know if they
just don't have like enough.

Speaker 2 (19:41):
Like just enough evidence to get a warrant or yeah,
I mean yeah, it's just unsettling to think about, like,
you know, what's what's this going to be? What are
they going to potentially find out or what's this going
to escalate to, like ten years from now mm.

Speaker 1 (19:56):
Hmm, because like if they like got a warrant for
every single creep on the internet, like that would take forever.
Like, the Internet's a scary place. Some of his
posts show buildings in common areas people could recognize, along
with mentioning places in his poems, so it's assumed that
he's in the London area, so it's not like in

(20:17):
the US area, which I guess kind of made me
feel like he wasn't creeping around my neighborhood.

Speaker 2 (20:24):
Yeah, I mean, I guess there's that, you know, like
he's he's not secretly my across the street neighbor.

Speaker 1 (20:30):
Yeah. In his poems, he mentions a person named David
multiple times, and it is thought that this man's name
was actually David. Some Internet sleuths think that they found
the name David Alter as this person's identity, and that
he's fifty nine years old. He has no past criminal convictions,
and he was being looked into by the authorities through

(20:52):
the posts on Reddit and Discord about him. But
as far as I know, nothing really came from it.
They probably just saw it and were like, well, like
I said, like, we can't, there's nothing illegal going on.
He's just a creepy dude.

Speaker 2 (21:04):
Yeah. Yeah.

Speaker 1 (21:06):
Compared to his children accounts, his personal accounts are more
well written, and some think that he is generally just mentally
unwell. Yeah. Which, I mean, yeah, like, I agree with,
just because I personally don't think a mentally well person
would impersonate a child and try to befriend them online
like he is, and just post the way he posts.
Oh yeah, that's just not a healthy thing to do.

Speaker 2 (21:29):
Mm hmmmm.

Speaker 1 (21:32):
So people have messaged him with accusatory messages about him
being a pedophile and being inappropriate, and he responds with
just, like, angry messages back. I can't remember exactly what
he said, but it shows it in the video. But, like,
it's more eloquent writing, like an adult
instead of a child, and angry.

Speaker 2 (21:53):
Yeah.

Speaker 1 (21:54):
He goes on to still post about other accounts being predatory,
and even accuses some of wanting to kill children with
cannibalism and ritualistic sacrifice.

Speaker 2 (22:02):
What.

Speaker 1 (22:04):
Yeah, he accuses like other accounts like, oh, hey, they're
bad for kids because they want to kill kids?

Speaker 2 (22:09):
What on earth is going on?

Speaker 1 (22:11):
But, like, there's obviously... there's no evidence of this, and
it's suspected he's just posting random things to maybe
get the attention off of his account. I don't know,
I guess.

Speaker 2 (22:21):
I mean, you know, if you're like, oh, like look
over there, you know there's worse things happening over there.
I mean, I guess that makes sense. But it also
seems like this person is posting for attention. But maybe
they're just mentally unwell to the point where they're like, no,
I want the positive attention of this type of posting,
but I don't want the negative attention of being accused

(22:42):
of anything. Yeah.

Speaker 1 (22:45):
Maybe. So, as I was watching the YouTube documentary that
is linked in the show notes, again, it showed a
concerning clip of him chasing a small boy around a
yard and playground, which made me feel like very uneasy. Yeah,
it was unknown if he knew this child, if they
were actually playing around, or if this child was in

(23:07):
true danger. Ooh, it's just like a quick clip of
him, like, chasing a small boy, and the boy's, like,
booking it.

Speaker 2 (23:14):
Oh my god.

Speaker 1 (23:15):
So nothing else comes of this video to show what
happens next, or if it's even a real video; it
could have been edited. The subreddit for Smartschoolboy9
has been banned due to harassment. So I guess, like,
that subreddit that I mentioned earlier is, like, done.

Speaker 2 (23:32):
Okay, and I'm assuming that was probably the subreddit. So like,
that's basically a bunch of people that are like, what
the heck is going on with this account? Yeah, you know, like.

Speaker 1 (23:40):
Yeah, yeah. So as I was looking into this topic,
another one came up that caught my attention: Piper Rocks. Okay.
So Piper Rocks is another suspicious account, except this
one's on YouTube, and it was supposedly ran by a young
girl that just wanted to show her editing skills by
making videos. Mm hmm. So it sounds innocent enough, but

(24:02):
the videos were remarked as being very violent and some
were perverted. So the... yeah, so the documentary showed quick
clips of this Piper Rocks account, and, like, it would
be just like a weird like maybe fairy tale background
with like a random girl dancing in the corner and
then like a monster comes and eats her head. What

(24:25):
on earth? Just like random random things. The videos were
a lot of children edited into videos depicting violence, monsters,
and sometimes the children, mainly girls, being posed in provocative stances.
The videos also didn't really make sense. Like I mentioned,
it's just random music, dancing kids, and, like, maybe, like,

(24:48):
I think one of them had, like, Johnny Depp as
Jack Sparrow and, like, a monster. It was very randomly
put-together crap.

Speaker 2 (24:56):
What on Earth?

Speaker 1 (24:58):
It's like a fever dream. Yeah, like creepy. There was
a channel connected to it called Isabelle Piper, which had
similar material. The pages seem to have been abandoned around
twenty nineteen. One of the most concerning things about these
videos would be the grown men commenting on them about the children.

Speaker 2 (25:18):
Oh yeah.

Speaker 1 (25:19):
Yeah. And what was worse is the account
would comment back saying she was eleven years old.
But this didn't really deter the commenters, which is obviously
concerning in and of itself.

Speaker 2 (25:31):
And that's actually something that even with like you know,
family blogger accounts or accounts that are genuinely just like
moms posting their kids, that stuff is still such a
major problem of you know, grown men making those awful
suggestive comments you know, towards yeah, children, I mean, to

(25:52):
the point of babies and toddlers, And it's awful. And
there's so many parents, you know, that keep their kids
completely off social media. I mean, like, the only
photos that I have, like of my daughter on social
media are all on like you know, private accounts with
only like friends and family, like any account that's public.
I don't have any pictures of her just because I'm like,
I don't know who could see it, save it or

(26:13):
download it and use it for whatever purpose. And this
episode is just kind of validating that. I'm like, oh, yeah,
I'm making the right decision by like, you know, not
ever having her on like TikTok or like public Instagram
posts or anything like that. Like it's just it's just
it's so creepy, like what's out there.

Speaker 1 (26:30):
Yeah, I don't, like I have an issue with like
mommy bloggers that use their kids for likes and views.
Oh agreed, especially like on TikTok, like you see a
little girl, like, in a swimsuit or something, and how many
saves that video has.

Speaker 2 (26:44):
Yep. I've actually... I follow this account
on Instagram, I think she has a TikTok too, and
it's called Mom Uncharted, and basically she talks about that,
and she talks about, like, why as a parent you
should keep your children off social media and how important
it is to be safe, and does, like, exposés on
(27:05):
like those quote unquote mommy blogger accounts that are being
irresponsible and putting their kids in danger, and also having
an influencer kid. One hundred percent. That's... it's
got to be a violation of a child labor law.
Like, I do think they should expand child labor laws
to include family bloggers. If that's how you make your money,
if you're monetizing your account, it's technically a form of child labor.
But also, like, I saw one video that was

(27:26):
like help me make a period basket for my daughter
or whatever, and I was like, can you imagine being
some poor thirteen year old girl and the fact that
you had your period is like blasted to the entire internet.
I actually can't imagine. Sorry, Sorry guys, this is an
anti mommy blogger podcast.

Speaker 1 (27:45):
Yeah it really is. So if this offends you, I
don't know what to tell you, right, sorry.

Speaker 2 (27:51):
Right. So, like, I am a parent, and I'm
not saying sorry. Use a photo album?

Speaker 1 (27:57):
Like, were you on TikTok whenever the whole,
like, conflict or conspiracy about the Wren account happened? The
little girl named Wren.

Speaker 2 (28:06):
She was actually the first person that came to mind
when we were talking about like exploitative mommy bloggers. Yep,
that that like haunts me. My friend. Uh, actually I
think she's in the discord Alma. She introduced me to
that whole situation, and I legitimately could not learn more
about it because it just it disturbed me so much.

Speaker 1 (28:25):
Yeah, Like, I haven't seen any of her, Like, I
don't know if she's still posts. I'm assuming she still does.
I just haven't seen it in like months. I don't
know if like she changed her settings or what. But like,
I haven't seen people complaining about it, posting about it,
or my algorithm just changed. It's probably just that. Anywho,
back to where I was. We just keep, we keep

(28:47):
finding more content.

Speaker 2 (28:48):
We're like, oh yeah, and then this content which is
similar to this, which is similar to this account, which
is this account crazy?

Speaker 1 (28:55):
Yeah, so the videos normally didn't have a voice,
but the one video that was posted with a voice
sounded very unnatural, like someone was impersonating a child's voice.

Speaker 2 (29:06):
Damn.

Speaker 1 (29:07):
In twenty twenty one, six to eight more channels emerged
that were similar to Piper Rocks, and eventually 4chan
users got a hold of the accounts and started an online investigation.
I knew you.

Speaker 2 (29:21):
Not 4chan. I mean, you said investigation. At least
maybe they're using their 4chan powers for good. They did? Okay, yay.

Speaker 1 (29:30):
They started an online investigation and managed to get a
hold of the email used for the accounts, and they
did a reverse search and found the owner of the accounts,
a man named William Glenn Whittaker. Does that sound familiar?

Speaker 2 (29:46):
No, it's just like, we have a name. Like they
just straight up were like, yeah, we found this guy.
Yeah damn.

Speaker 1 (29:51):
He had thirty other email accounts that were connected
with other YouTube accounts of children. So we're gonna
talk about him. Yeah, we're talking about him.

Speaker 2 (30:00):
A little bit. I'm sorry, I don't care how like
mentally unwell you are, Like if you are a grown
ass man and you have thirty accounts about or impersonating children,
like they need to put you under the jail.

Speaker 1 (30:15):
Yeah, you need to not see daylight again. Ever. Like,
that's a whole conversation that I don't think we should
have here.

Speaker 2 (30:27):
But, like... I'll be in the discord.

Speaker 1 (30:32):
We'll talk about that in the discord. He was a
sixty five year old man who was a felon that
was arrested in two thousand and two for being caught
with child pornography and is a registered sex offender who
lived in the Tampa area of Florida.

Speaker 2 (30:48):
All Right, you guys didn't see it, Kayla watched my
little mini mental breakdown where I threw my microphone across
the room. I knew it. Yeah, I knew it. I
knew it. It's it's it's I know this is it's
it's reminding me of that again. That that's to bring
it back to like classic sort of YouTube conspiracies. The

(31:09):
video of the guy singing the really creepy old dude
singing like I Feel pretty or something like that, and
it turned out that he was like this felon on
parole and he was like violating parole by posting. And
that was like the video that when it went viral,
it like put him back in jail or something like that.
Like that's what it's reminding me of. I know that
was really vague. Maybe I can like find it for

(31:30):
you sometime, but that.

Speaker 1 (31:31):
Yeah. Somehow this man opened up and released his own agency for children called Lizard Productions, a modeling agency specializing in child stars and giving hope to families for a music career.

Speaker 2 (31:46):
No, oh my god. I remember when I was younger,
getting like modeling brochures and stuff like in the mail
and whatnot, and my mom always being like this is
so fishy. And it's like I feel like every single
one of them has been exposed as something like that,
you know, at best, a scam, at worst like predatory
attempts at child trafficking. Like please do not ever, just

(32:10):
if you have kids, just don't ever sign them up
for modeling because apparently you can be a registered felon
and just do this shit.

Speaker 1 (32:17):
Yeah, but the business was unlicensed, and no one just thought to look into it to see if it was a legit business or not. He was just, like, advertising it, and people were like, oh yeah, old man that has, you know, a criminal history, I'll throw my child at you. Google? Google was, you know, a thing back

(32:42):
then as well. He could have just done a quick Google search.

Speaker 2 (32:45):
That's crazy. That was absolutely crazy.

Speaker 1 (32:48):
The really unfortunate part of this is that people would bring their children to his house, it's unknown if accompanied, and he would record them singing and dancing for him, and he would claim to send the videos to higher ups.

Speaker 2 (32:59):
For you what, like you're telling me that, like, okay,
you don't google the guy or his business license. At
least I can kind of understand that if you're not
a business savvy. But to just take him to a
house like to arrive and you're like, this is not
a place of business that looks legit. This is just
a house. This is a residence, and you're like, all right, peace,
I'm I mean at some point the common sense has

(33:23):
got to kick in.

Speaker 1 (33:24):
Yeah, I don't know. If it was, like, like you said, like a business building, okay, you probably have to have a license for, like, a business space, a building space. But, like, you've just shown up to some guy's house and dropped your kid off for them to dance in

(33:45):
front of.

Speaker 2 (33:45):
A camera. Exactly, exactly. Something that I learned in becoming a parent is, like, not every parent has, like, protective instincts, and not every parent really acts like a parent. So some people just, like, have kids, and then they're like, this is just a guy that lives in my house.

(34:05):
I don't really care about them. I'll leave them with anyone. And it's like, it's more than you think. It is crazy. And then there's also a lot of parents out there who are well meaning and are just incredibly ignorant to the dangers their kids are in. And I think there's so many people who unintentionally put their kids in danger because they simply think, oh, that could never be me. And then it's like, well, you drove your kid to some guy's house to sing and dance on camera. So.

Speaker 1 (34:28):
What blows my mind sometimes... So I'm in my local town's, like, Facebook groups, which I think I mentioned last episode. I'm just saying it because, like, the posts make me laugh, because it's just a bunch of old people complaining about Taco Bell.

Speaker 2 (34:41):
Oh my god. And that's why I can't, I can't be in my town's Facebook. Well, I haven't been on Facebook for a very long time. Like, that's why I can't be in Facebook groups. They're always just like, the McDonald's messed up again. I'm like, then stop going. What's with the flea market? That's what I want to know.

Speaker 1 (34:58):
Every so often, somebody posts in there, Hey does anybody
do babysitting? I need my kids watched this weekend.

Speaker 2 (35:05):
I've seen that. I've seen that.

Speaker 1 (35:07):
You've asked a random stranger on the internet to watch your kids, right? Like, I don't have kids, so some people think I don't have room to judge. But I'm going to judge anyway, because that's just asinine.

Speaker 2 (35:18):
Again, as a parent, I'm giving you the card. I don't agree with the paradigm that you can't judge until you have kids. I think lots of people without kids make very correct judgments on when parents are messing up.

Speaker 1 (35:34):
The town I live in, if you look it up on, like, you know, a registered sex offender website, it blows up.

Speaker 2 (35:41):
Yep.

Speaker 1 (35:42):
There's sketchy people all around Wiley.

Speaker 2 (35:44):
Yep, looking like the Not Like Us album cover.

Speaker 1 (35:47):
Yeah. Ugh. So no one really knows what truly happened with those videos. He was eventually caught by police in two thousand and five, but was not given any jail time. They just shut down the agency.

Speaker 2 (36:04):
They're just what on earth?

Speaker 1 (36:06):
They slapped him on the wrist, like, just don't do it again.

Speaker 2 (36:10):
It's so wild to me that there's people, you know,
who wait like years and years in detainment facilities awaiting
trial for like non violent offenses, and then there's people
who are like child predators that they're just like oopsie, poopsie,
don't try and start another business. Yeah, can't wrap my

(36:33):
head around it.

Speaker 1 (36:34):
So not much has been heard about or from him, as he possibly died in twenty twenty one, and nothing has been posted connected to him. Like, I googled him, and just, like, a registered sex offender page came up.

Speaker 2 (36:48):
That is crazy. That's where this episode leaves off? The last update we have on him is maybe he's dead.

Speaker 1 (36:55):
He might be dead. He doesn't have, as far as
I know, a modeling agency anymore.

Speaker 2 (37:00):
But, well, thank god for that. Okay. I never thought I would say this: good job, 4chan. Good on you.

Speaker 1 (37:13):
I vaguely remember you, like, always complaining about 4chan, and then as I was writing this, I was like, Lexi's gonna have a little meltdown.

Speaker 2 (37:20):
It's just, it's just, like, it's so... And maybe I'm biased, because I'm sure 4chan is just like Reddit. There's probably good circles and bad circles. It's just, you know, the kind of OG anonymous posting platform. Well, not really. I mean, it's based off 2chan, but that's a whole... that was, I think, a Japanese forum. Anyway, it sort of set the stage for anonymous posting, and unfortunately, when you give

(37:42):
people a mask and you give them anonymity, they become the worst versions of themselves online. So 4chan has a really, really negative reputation. However, there's boards on there, you know, just like Reddit, where some of it is just people talking about book reviews or paranormal discussions, but, you know, there's also people, like, posting manifestos and planning
(38:03):
terroristic attacks. Like, it's insane. And a lot of people have compared what X has become to OG 4chan. And I guess, I guess in this instance, the anonymity allowed people to band together to, like, to catch a predator, basically. Yeah, so hey, so

(38:26):
broken clock, silver lining and all that.

Speaker 1 (38:28):
So when it comes to Smartschoolboy9, it comes down to whether it is a witch hunt or a true disturbance. I say this because it has come out that David may or may not be his real name, or his age or location. And this information has led to address doxxing, bricks thrown through windows, and calls for this David to be quote unquote taken out.

Speaker 2 (38:47):
Oh so they don't even know if they have like
the right guy that.

Speaker 1 (38:51):
Yeah, they don't know. They just kind of, like, think they have it. Like, this guy is obviously, like, real, and what he's doing is disturbing, but they're leaking addresses that probably aren't his, and that's how innocent people get hurt. That's where it's, like, considered a witch hunt.

Speaker 2 (39:08):
Yeah, yeah. And that's always where I take issue with, you know, internet witch hunts. It's like, if the dude's guilty, go ham, go crazy, go stupid, get silly. But it's kind of like when there was that... I'm, again, this is going to be extremely vague because it's been many years since this happened, but there was, like, a master doc

(39:28):
leak that went out that was supposedly KKK members and their families, and it turned out to be incredibly flawed and just had a bunch of random people who were in no way affiliated with the KKK on there. And they were, you know, oftentimes innocent families with children that were getting doxxed or attacked because people thought that they had KKK affiliations. And I'm like, that's not good.

(39:50):
You know, obviously knowing who might be a covert KKK member, I guess, would be very helpful to find out. Like, oh crap, that person's serving on the school board, that person's a lawyer, a teacher, a cop, like they absolutely shouldn't be. But then the problem is half the list was just completely incorrect, and then it's like, yeah, that's a problem.

Speaker 1 (40:10):
Like, there was... I can't think of his name. He's kind of irrelevant because he's a piece of trash, but he's a YouTuber, and he was.

Speaker 2 (40:16):
Oh, Shane Dawson?

Speaker 1 (40:19):
No, not just him, but the guy that says, your
body my choice?

Speaker 2 (40:24):
Oh, Nick? Whatever his name is. Yeah, Nick.

Speaker 1 (40:31):
Trash. He got doxxed. I think his house got set on fire, last I heard.

Speaker 2 (40:37):
I'm sorry. See, like, when it's... you see what I mean? Like, when it's him specifically, it's like, I don't feel bad. Like, play stupid games, win stupid prizes. I would feel bad if, like, his neighbor got doxxed, or, like, you know, someone mistakenly attacked his neighbor. But because it's just him, I'm like, I don't care. Go ham, go crazy.

Speaker 1 (40:53):
Yeah, he's hiding in his mommy's house anyway.

Speaker 2 (40:56):
So, like, for legal purposes, don't commit any crimes. This podcast does not advocate for stalking, harassment, or anything. All jokes, ha ha. But also, I don't feel bad for him.

Speaker 1 (41:12):
Yeah, don't set fire to houses, but also don't say your body, my choice, because then your house might get set on fire.

Speaker 2 (41:20):
There's that seems like a fair trade off.

Speaker 1 (41:23):
Fair trade off. It can be agreed that the creator has very disturbing behaviors, but technically no crime has been committed. However, what makes this a conspiracy is the question of whether there is an ulterior motive to these accounts, or if they're just creepy videos and posts, or if it's just the accounts of a mentally ill man, and whether this should be online sleuthed and treated like a witch hunt.

(41:48):
So personally, I think it should definitely be looked into more, as he is getting photos of children that don't belong to him and posting them. So it's like, where are they coming from? And is he sharing them with others who do mean harm? Yeah?

Speaker 2 (42:03):
Exactly, yeah, yeah. And I know that there are entire agencies basically dedicated to, like, tracking down and investigating potential, like, child trafficking and child sexual abuse material, you know, online and things like that. And unfortunately there's just, there's so much, and, like, that's just so horrible to realize that and to say that and

(42:25):
to recognize it. But there's just so much that it could be that, like, we do have, like, the FBI on his trail, it's just they're maybe not moving very swiftly because they're dealing with such a high caseload of tracking this kind of thing down.

Speaker 1 (42:37):
Yeah. And as for Piper Rocks, was it also just a front to share child porn, or was it the work of another sick individual who just wanted to share disturbing content? I feel like it's either or.

Speaker 2 (42:53):
The one thing I will say, I feel like if
you run into content like that, don't like, don't interact,
don't share it, just like report it, you know, just
send it to the authorities that need to deal with it,
because, you know, interacting with it even to comment, like, you know, this is messed up, like, it feeds
the algorithm, and these people seem like they want attention,

(43:13):
So I mean the best thing you can do is
just kind of starve the beast. But yeah, oh man, that.

Speaker 1 (43:19):
Is just it just crazy like with that, like that
account like that, with the people commenting on the videos,
like the met commenting and being creepy. That's just the
way for the account to messagees people and be like,
hey what do you have? Mm hmm, hey where do
you went too? Like I feel like it's just the
way for it's like flies to honey.

Speaker 2 (43:40):
Mm hmm, yep, yep, yep.

Speaker 1 (43:43):
But yeah, that is what I had for you guys today. I feel like that was a lot that we just went through together.

Speaker 2 (43:51):
I said, I was excited to get into this, and
I don't know why I was excited to get into this. Honestly,
I assumed it started and stopped as like a creepy
AI project. I'm disappointed, yet unsurprised in the direction that
it ultimately ended up taking.

Speaker 1 (44:05):
Yeah, just another warning: if you guys look up the pictures, they're very creepy. If you watch the documentary... I highly recommend you watch the documentary. There are some people who disagree with it because of the witch hunt stuff, but, like, I feel like it has a little bit of backbone, which is why I used it for this episode.

Speaker 2 (44:25):
Yeah, yeah. I mean, and I like it too. I feel like, if you're gonna engage with content like this, definitely do it via a third party so that you're not giving, you know, the original poster any clicks and likes feeding their algorithm. You're just watching it through a third party. I think that's kind of the best way to do it, the best way to interact with content like this and get informed. But that was

(44:46):
a doozy. That was a doozy.

Speaker 1 (44:47):
That was a weird one.

Speaker 2 (44:50):
I feel a responsibility to bring you guys something completely unrelated to anything we have discussed today to make up for it. I feel like.

Speaker 1 (45:04):
That's what always happens. Kayla brings, you know, the doom and gloom, and Lexi's like, you know what, we're gonna talk about clowns!

Speaker 2 (45:12):
I do. You bring the really disturbing stuff, and I'm like, top ten Japanese sea monsters! Like I'm a BuzzFeed article.

Speaker 1 (45:21):
Lexi is the palate cleanser. I am the bitter wasabi.

Speaker 2 (45:26):
We need to sell merch that has that. It's like
the wasabi and then the little slices of ginger? Which
one are you and your bestie? Every best friendship in
every relationship is a black cat and a golden retriever.

Speaker 1 (45:44):
Yeah, yeah, Kayla is the black cat. Yeah. Even in my marriage, my husband's the golden retriever.

Speaker 2 (45:56):
That is true, I have met him.

Speaker 1 (45:59):
Just gotta befriend everybody, be nice to everybody, make plans with everybody, add everybody on Facebook, Snapchat. And then he's like, Kayla, why don't you talk to people?

Speaker 2 (46:09):
Because my wife is the one that will text me during social gatherings to say, can we leave? Because, like, she won't orchestrate the leaving process. I will. So I'll, like, check my phone and see the text, and I'll be like, guys, we actually really have to go pick up a prescription. Like, you know, it's been real fun. Like, I know that that means I have to get her out, like, posthaste, and it's a great dynamic.

(46:29):
And I actually found out that one of our couple friends, they're like, oh, we do the same thing. They're like, we have a secret handshake that means, like, I need to go.

Speaker 1 (46:38):
Yeah. No, I can do that to Eric, and he'll be like, why'd you text me? Like, you're not supposed to say that!

Speaker 2 (46:45):
Don't do that, guys. If your partner's the one texting you, can we leave, don't announce that they texted you. Just be like, oh, I left my oven on, oh, I gotta go walk the dog. I don't mind. I don't mind it at all, someone telling me when to stop socializing, or I will keep going forever. I'm very thankful that she texts me and she's like, it's time to go, and I was like,

(47:05):
you're right, queen, it is time to go. It's time to go scroll TikTok in bed together, time to recharge.

Speaker 1 (47:13):
So I really want to hear what you guys have
to say about this. So you can always join our
discord on our link tree. Just click the button that
says join the discord. You can send us an email
at a Little Wicked Podcast at gmail dot com. Go
to any of our socials and message us. I always
post about our episodes, leave a comment, do something we

(47:35):
love hearing from you guys. I recently put a poll up on Instagram about, like, what people like to hear, and I think cults and conspiracies were the favorites.

Speaker 2 (47:46):
The big three Cs: cults, conspiracies, cryptids, crime... four Cs. I can count.

Speaker 1 (47:52):
Yeah. So I think my next one is a cult, a couple of cults, because the one cult I picked is very short of an episode, so I have to find other ones that are related.

Speaker 2 (48:05):
But I mean, I haven't covered any nuclear accidents in a while. I might show up next week with, like, Three Mile Island or something. And I'll tell you what, I was just at the Mütter Museum not long ago. If you don't know, that is a museum in Philadelphia associated with the Philadelphia College of Physicians. You can support them online. They are actually an incredible organization,

(48:28):
and it's just a museum displaying medical oddities. And I can't recommend it enough. If you're ever in Philly, make reservations ahead of time, because they do sell out very quickly. It's a nice, small, cute little museum. And I found a poster that was, like, an old timey poster advertising radium,
and I was like, we need that poster in our

(48:51):
house because I am constantly covering the nuclear disasters and radiation episodes, and I myself have been radiated more than once, on purpose. It was for medical purposes. I have been radiated, and so I'm like, I need it in my house. I grabbed one of the posters from the bin, and it was the wrong poster that had been put in the radiation bin. And so while

(49:12):
I'm not upset that I now have an excuse to
go back to Philadelphia and get the correct poster, I'm
so bummed that I don't have my radiation poster in
my house.

Speaker 1 (49:21):
They don't have, like, an online shop?

Speaker 2 (49:24):
I simply haven't looked. They probably do, like I have
not done the research. I don't look things up. I
do not look before I leap.

Speaker 1 (49:37):
And I'm thinking... I don't know if we should start the YouTube back up. I don't know if I'm feeling froggy enough for it yet, or if we just want to take some time.

Speaker 2 (49:47):
It would give me an excuse to invest in, like, maybe some good headphones, a good backdrop.

Speaker 1 (49:54):
But she like has unicorn horns on right now from
her headset.

Speaker 2 (49:59):
I do, I do. I borrowed them from my daughter for this episode for a little bit of whimsy.
I also have a Christmas tree behind me. And it's
almost February, so like, if we do the if we
do the YouTube, I'm gonna have to clean up my act.
I cannot be making a creepy YouTube with an adorable
Christmas tree behind me.

Speaker 1 (50:18):
Well, we'll work on stuff, we'll figure it out.

Speaker 2 (50:21):
But yeah, you.

Speaker 1 (50:22):
Guys can message us on any of those let us
know what you will, what episode you'd like to hear.
We always love taking you know, recommendations of what you
guys want us to look into. Something catches your eye.
But yeah, and don't forget after this episode there is
an ad for It's murder up North frecko then checking

(50:42):
them out and following all their socials and stuff. But yeah, I think that's all I have for today.

Speaker 2 (50:49):
Well, that was disturbing, and whew, that puts the responsibility on me to come in with something lighthearted. If anyone has any suggestions, feel free to send them my way. I'm on socials. I feel like normally I do this at the end of the episode, I'm like, I'm on TikTok, I'm on X, blah blah blah. And I'm like, I think I'm on Instagram, like, in theory.

(51:12):
I'm on Bluesky, and I'm in the Discord. I do have a massive list. If you guys don't have any suggestions between now and then, I will find something. I will deliver, just like I always do. Call me Papa John.

Speaker 1 (51:22):
Alrighty. Well, our Discord just disconnected, so you're not going to get an adjective from the both of us. So that was just kind of creepily wicked, and sorry you're not hearing your outro from Lexi. Goodbye, everybody.

Speaker 2 (51:56):
Hello, I'm Jenny, the host of It's Murder Up North, wishing to introduce you to my new series focusing on the inmates of Wakefield Prison, known as Monster Mansion, which houses some of Britain's most dangerous offenders, individuals whose actions have pushed changes in the law, while others have resulted in the system itself being questioned, yet all of them

(52:19):
have left a wave of devastation.

Speaker 1 (52:22):
Join me as I delve into the life and crimes of the residents of Wakefield Prison in my new series, Infamous Inmates of Monster Mansion. It's Murder Up North is available now on your favorite podcast provider.