Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_03 (00:16):
Hi, and welcome to
the Toxic Cooking Show, where we
break down toxic people into their simplest ingredients.
I'm your host for this week, Lindsay McLean, and with me is my fantastic co-host, Christopher Patchett, LCSW.
I, on my own, had decided to finally take a stance and come
(00:37):
down and come out publicly in my hatred for reaction videos.
Because I think we should all hate them.
SPEAKER_01 (00:44):
So okay, so I gotta
how the fuck did you uh did you
look up uh Sniper uh Sniper Wolf?
SPEAKER_03 (00:51):
Yes, I did.
SPEAKER_01 (00:52):
How how did you not
hear about her during this whole thing, no?
SPEAKER_03 (00:57):
I am actually
surprised that I didn't.
Like I was, I was, I read the whole thing and I was like, oh yeah, like this sounds like something that should have made, you know, its rounds on stuff that I watch.
And somehow it didn't.
I feel like I would remember having heard about some YouTuber chick who showed up livestreaming at her enemy's house.
(01:21):
Like I just, I feel like that would stick in my brain.
Alas, it didn't, if I did hear about it.
But you know, reaction videos, on that topic.
So I, I think there are a couple different types of videos that all fall under the broad title of, this is a reaction video.
You have call-outs, you have response videos, you have
(01:44):
stitches, you have the direct ones that, like, are just straight up reaction videos, and you have seen these, everyone has seen these.
It ranges from me making a video and saying, I'm gonna tell you a story about how Christopher Patchett did me dirty, to me, I find a video of you dancing at a concert and you look really
(02:07):
fucking stupid while you're dancing, and I post that, like, and maybe I do like a side-by-side where I'm, like, mocking you, or I periodically pop up and, like, make fun of how you're dancing and how stupid you look.
SPEAKER_01 (02:22):
Oh my god, if it
came down to me dancing, I would
mock myself on how stupid I look.
SPEAKER_03 (02:28):
See, you're safe.
Yeah, these are... they're ever... it's a really popular format of video.
People really like them, and again, sometimes it is useful, because sometimes the person is providing additional information or context, or clarifying something.
A lot of them don't actually add anything.
(02:50):
Like sometimes it's just the person, like, there's no sound on their end, and they're just, you see their faces, they're like, oh my god, like, reacting to whatever is happening in the video, but they haven't actually added anything to the conversation.
There is no conversation, in fact, they're just there.
(03:13):
Wow, oh my god, even better than the ones where, like, there isn't even any sound, and they'll, like, fall out of the frame laughing.
I'm like, I can't even see you anymore.
And the types of people who do these, so again, there's this wide range of things I would say kind of fall under the reaction,
(03:34):
and the type of people who do them, also there's a big variety, because you do have people who it's the occasional thing, like they may be doing it in response to a rude comment or in response to another video that came up in their field, and so they stitch that, they react to that to, again, you know, maybe provide
(03:54):
some more context or information, to call out somebody for providing false information.
You also have people who this is their whole spiel.
They make no original content, they simply react to other people's videos.
That's it.
That's it.
There are a lot of these people on there who, again, no original
(04:18):
content.
They find videos on their own, or people tag them in videos that are kind of within their niche.
So, like, you know, I'll talk about him a little bit later, Red Flag Guy.
He'll get tagged in potential red flag videos.
Other people, what is it, Freedune TV, I think is his username.
It's just people being stupid.
(04:38):
That's what he likes to react to.
SPEAKER_01 (04:40):
Do you know what I'm
talking about?
I I'm I'm curious where this is gonna go, because uh two of my favorite, like, YouTubes is reaction videos.
SPEAKER_03 (04:51):
You may uh if I do
this right, I hope your opinion
will change.
SPEAKER_01 (04:56):
Either that or I'm
gonna dox you.
SPEAKER_03 (05:01):
Yeah, if I don't
manage to, that'll be the thing.
You're like, no, I'm not convinced, and to punish you for not convincing me, her address is... do you even know my address?
Actually, yes, you do.
SPEAKER_01 (05:13):
Yeah, I was gonna
say.
SPEAKER_02 (05:14):
Mm-hmm.
Mm-hmm.
SPEAKER_03 (05:16):
So yeah, these
they're they're really popular,
they're everywhere.
You even sometimes have people who, it's just like viral random internet videos, and people jump on it again to kind of, like, laugh, react, or something, and, like, somehow that gets pushed to a wider audience.
They're everywhere, they're fucking everywhere.
And I firmly believe that they are making our lives worse.
(05:40):
Here's why.
SPEAKER_01 (05:43):
You have a lot of convincing to do.
Let's, let us hear it.
SPEAKER_03 (05:46):
Okay, okay.
The first issue is that these push media to people who might not have seen it otherwise.
So a lot of the time when I see these reaction videos, I, I never would have seen the original.
That was not in my sphere of things that I like to watch.
Like people being really bizarre, or super niche content, or
(06:11):
videos of people being shit to their partners.
Like, that's not what I like to watch.
Just putting that out there.
And these reaction videos put that out there to a wider audience, because the person who, this is their whole thing, is either reaction videos.
They have really big platforms, a lot of these people, because
(06:31):
this is a really, again, popular video format.
And so that video that may actually have only had like a couple hundred views, it now has 250,000.
And tied to that, you know, there's also the deal of, like, different platforms and stuff right there.
And so you have stuff that's, like, coming up on TikTok that I
(06:54):
don't use TikTok, so I don't have any way of seeing that.
But people make the reaction videos, put that on Instagram.
Again, that original content does not exist on Instagram.
People who use that would never have seen it, would never have been exposed to it.
And for the most part, it's just stupid shit, right?
(07:15):
Like it's people being weird, it's people doing dumb things, it's people, you know, fucking around and finding out, and they've compiled it and put it there, and I was like, yeah, but I, I wouldn't have seen it otherwise.
And when it's just somebody being dumb, that's not a big issue, but when it's stuff like, you know, political, or, you know,
(07:38):
stuff like Red Flag Guy, so his real name is Dustin Pointer.
Like, that is, and that's not doxing him, by the way.
That, that is his actual, like, name on Instagram, but people know him as Red Flag Guy.
He's a classic example of this.
I have looked back at some of the videos that he has stitched or, you know, reacted to, and they only have like a couple
(07:59):
hundred views.
They're from really small creators, and he has given them this massive platform, that now people are aware they exist.
And it's like, yeah, more people are now going to find this person and potentially follow them.
There's another guy who, I think he's Australian, Will Hitchens.
(08:19):
I forgot to write down his name.
The blonde guy, the Australian.
SPEAKER_01 (08:26):
I I like how you're
you're you're you're coming to
me like I actually know.
Oh, the the uh the Australian.
Okay, yes, yes.
SPEAKER_03 (08:35):
Yes, yes, that one,
yes.
SPEAKER_01 (08:37):
Yes.
SPEAKER_03 (08:38):
He's also a great
example of he's stitching videos
that I wouldn't have been exposed to before, and so it gives you this idea that, like, maybe this is a really big problem in the world, because, like, I'm seeing this, when it might not be.
SPEAKER_01 (08:55):
Okay, I think I know who you're talking about now.
The, the Australian guy who, uh, he'll sit on his bed and react to the video of, like, toxic masculinity or something like that.
SPEAKER_03 (09:06):
Yeah, yeah, yeah.
That guy, that guy.
He's got like blonde hair that's usually pulled back like up in
the ponytail or man bun.
Yeah, that one.
See, you you know him.
Again, he has this like massive platform.
SPEAKER_01 (09:19):
I I think the uh the
man bun is probably the uh toxic
trait there.
SPEAKER_03 (09:26):
Come on, let the man
bun live.
You're just jealous as a baldy.
SPEAKER_01 (09:34):
I'm proud of my
baldness.
SPEAKER_03 (09:36):
Uh-huh.
Uh-huh.
You say that now.
But speaking of Red Flag Guy, in connection to that, some of these videos, and by some, I mean a lot of them, have the potential to spread misinformation.
Because what you need to be asking yourself when you see
(09:56):
that original video is, like, do you know where this video comes from?
Do you know who the people are?
Like, is this staged?
Are they actually actors?
Because some of them are, some of these things are faked for views.
Some of them are faked for views, and they are actually a parody.
And if you go, if you, like, trace all the way back to the original video, on the creator's main page, they will mention
(10:20):
that this is a parody.
But people don't be reading, and they will just see the video.
And I've been caught by this too.
Like, full disclosure, I've definitely seen this, like, type of thing and been like, What?
And then realized that this was not true.
You, you don't know.
And there have definitely been instances too where I've been watching a video and you look at it and you're like, Wow, did the
(10:43):
other person not realize they were being recorded?
Like the camera angle is just stunning, and the lighting is just perfect, and, like, the camera is not wiggling too much, and it stays on their face, and it's held, it's being held up.
How do they not stop and be like, yo, why are you holding your camera like that?
(11:04):
before they go on and say the most heinous and racist thing known to mankind.
SPEAKER_01 (11:10):
The first thing that
that comes to mind is so one of
the things, or one of the types, because actually, as, as you're talking, I'm realizing more and more how many reaction YouTubes I actually do watch.
SPEAKER_02 (11:25):
Uh-huh.
unknown (11:27):
Everywhere.
SPEAKER_01 (11:29):
But like one of my
favorite ones is is this guy who
debunks ghost videos.
And that's part of it, is, like, where the, you'll, you'll see the the cut at a perfect time where the guy is, like, swinging, like, the camera around real fast.
He'll show, like, how somebody who is perfectly getting, like, you
(11:53):
know, like, the ghost in the background, and, and how they just happen to get the camera in that, that perfect spot at that perfect time.
SPEAKER_03 (12:01):
Imagine that.
Yeah, this is like that.
Some of these videos are very clearly, like, it, it was just a bad situation.
Like I've seen, you know, some that people have reacted to, where, you know, the classic one is, it's at a wedding, and you see the woman say something, or the bride say something like, you
(12:24):
know, no, don't put it in my face, or whatever, and the, the new husband is like, no, no, no, I won't.
And then it's like, smash, the cake in her face.
And you're like, all right, I, I think we can understand what was going on here.
I think that was, that was a legit video.
But the other thing too is that even when you're like, nah, this is a legit video, you don't always have the information
(12:45):
about what led up to this moment, what was happening around the people that you can't see, like, what happened before or after.
For instance, recently I saw this video that was going mildly viral.
There was this woman in Bulgaria who was denied access to a flight, and you can see her, like, crying dramatically in front of the gate, and the problem was because her baggage was too big,
(13:10):
and people were, were mocking her for this, and then there were people who were like, No, you don't understand, like, this is a really big issue, because, you know, the airline people... other passengers offered to pay for her luggage, and the airline said no, so it's super unfair.
It's like, is that true?
We don't know, like, we actually don't know if they did.
(13:32):
There's no way to know, unless you were there.
We don't know what was involved with all of this, and so people are just making up stuff that goes along with these videos, because, you know, they go viral, and so people want to hop on that, ride the train to, you know, viral land, and become famous and get some money off of this.
And another classic example of that is everyone's favorite
(14:18):
Coldplay Cheater video.
I love that one, and I will, I will address the, the, the issue of being like, no reaction videos, but also, oh my god, I love watching that video and all the reactions to it.
(14:18):
But there were a lot of, like, things that popped up right around that.
So there was, like, a fake apology letter that he wrote, there was somebody who I saw claiming, like, oh yeah, I was the person who actually, like, bought the tickets to the concert for them, and I got fired because of it.
There's all of this misinformation just swirling
(14:39):
around, that, you know, if you saw it and you believed it... you know, when I saw the apology video where he, like, tried to quote Coldplay at the end, I was like, man, fuck this guy.
Like, already fuck him, but, like, what?
And then supposedly it comes out, like, oh, that's not true, and you're like, oh damn, I was so ready to hate him even more.
But that is something that happens too when you have this
(15:01):
reaction and then people pile on, is that you don't know what is true and what is not true anymore.
SPEAKER_01 (15:08):
Well, so like I'm
I'm curious with with the
apology letter, was it, was it him reading this apology letter?
SPEAKER_03 (15:17):
There was an apology letter that appeared on the internet that he wrote, supposedly, okay, saying, Oh, I'm so sorry, I realized I fucked up.
But, you know, actually, I just, I think it was really unfair that, you know, we got filmed like this, but anyway, as you know, one of my friends once sang, and
(15:39):
then he quoted Coldplay, and people were obviously, like, ripping him to shreds.
The company then came out and said, No, he didn't write it.
I have yet to see an actual apology letter from him.
So who knows?
Supposedly, though, that was not his.
The company said, like, that's a fake.
SPEAKER_01 (15:58):
Yeah, I I did see
something along the lines of
like some girl who was coming out, I'm his wife, and blah blah blah.
And it was, it was, it was a hundred percent a parody.
And if you just saw, like, maybe the, the first, like, I don't know, maybe five, ten seconds, you would hate the guy even more.
And as she's kind of going into it, then it's like, is it... yeah,
(16:21):
this gotta be a joke.
SPEAKER_03 (16:23):
So the person who
said that they worked for
Astronomer and, you know, bought the tickets from them, that was one of the ones I looked at and I was like, what?
SPEAKER_02 (16:34):
Really?
SPEAKER_03 (16:36):
You know, like that
type of thing.
It's like there's no way, unless you want to dox the person, which is what we're gonna get into next, that you can figure out, like, is this actually true or not?
And so again, misinformation gets spread, because you make a video, even if it is a parody, even if it is a joke, and people may not realize, they may just see a tiny bit of it and scroll
(16:58):
away, but that fake information is now out there and linked to this, and it just, it muddies the whole narrative.
And certainly, you know, with Coldplay it doesn't matter, because we knew they were fuckers to begin with.
But for other people, you know, you might look and be like, wow, that's actually a really bad situation.
You know, something happened and people completely misconstrued
(17:19):
what was happening, and then all these reaction videos are pushing it to, like, a super wide audience, and people are jumping on, being like, I'm the grandma, and I was... it's like, no, no, you're not grandma.
Grandma's sitting right here, grandma doesn't even understand how TikTok works.
Like, so the third issue is doxing, which is when you publicly
(17:41):
identify somebody or provide personally identifiable information about them without their consent.
So that ranges from, I'm watching a video and I'm like, oh my god, I know that person, and I write in the comments that Christopher D. Patchett, born... you know, like, list all the information.
SPEAKER_01 (18:01):
Damn, you even put
in my middle name, too.
SPEAKER_03 (18:04):
I I do know your
full middle name, I just didn't
say it here.
But I do know it.
Do you know my middle name?
Oh short shit.
Close enough.
I'll accept it.
SPEAKER_00 (18:18):
Fuckhead.
SPEAKER_03 (18:20):
There we go.
And I thought we were friends.
But yeah, so that's, that's not, like, the lighter end, obviously, of doxing, because it can still have serious consequences.
It also can range to, you know, and this is what we associate with it most often.
You're mad at somebody, you're pissed at somebody, or whatever,
(18:42):
and you're like, you know what, I'm gonna get you.
And you look up this information and then you publish it.
So the information may be publicly available, but you have taken that extra step of, you found it and you put two and two together and said, Hey, this person right here, this is them,
(19:03):
right?
Like connection.
Here you go.
So there's, you know, there's certainly a lot of discussion about, like, as we saw last week, using it... you know, like, we like it when it's used to find Nazis or cheaters, cough, cough, Andy
(19:26):
Byron and Kristen Cabot.
But people can be wrong, and this comes up a lot.
Was it... you know, again, with the Coldplay video, people were on that shit, like, really fast.
They identified, this is him, this is her, and there was another woman who was briefly identified.
You see her in the video, it's the woman with the brown hair who, she has her face turned and she's, she's covering it up a
(19:46):
little bit.
People originally identified her as another person in the company.
Astronomer has said, supposedly, no, that's not her.
That's... we don't know who that is.
SPEAKER_01 (20:00):
Oh, damn, because I
actually thought that she was
the other person in the company, too.
SPEAKER_03 (20:06):
I did too.
I did, because people correctly identified Andy and Kristen, and I was like, I mean, this woman seems to maybe be attached to them, and she clearly is, like, looking embarrassed and weird.
And people said, Yeah, like, this, this is her.
You know, looked at the photo and was like, close enough.
You a bitch.
But apparently, maybe not.
(20:29):
And, you know, I have to wonder, if that poor woman, if indeed it wasn't her, you know, I'm sure she got hate mail.
I'm sure she got people just, like, swarming all over her LinkedIn, leaving nasty messages.
There have been instances of people who, you know, harassment, again, in public spaces or in
(20:50):
public places online, people think they've identified the person and just go for them.
Swatting is really common.
I actually think, in connection, there was a guy who, after the Boston Marathon bombing, what was that, 2013?
I think.
unknown (21:04):
Yeah.
SPEAKER_03 (21:04):
2013, 2014, yeah.
You know, internet sleuths jumped on board and were like, we're gonna find, like, who's involved.
And if I remember correctly, there was somebody who was
identified and it was not him.
He committed suicide.
SPEAKER_01 (21:17):
So what we what was
the word that you used?
Swafting?
SPEAKER_03 (21:20):
Swatting.
Swatting is when you, swatting is when you call the SWAT police to somebody's house.
Okay, you report, you report something, and you're like, they got bombs, they got drugs, and the SWAT team obviously is like, well, and they suit up and they show up at your house, like, and
(21:41):
kick down your door, and you're having dinner with your family.
SPEAKER_01 (21:45):
Damn.
SPEAKER_03 (21:46):
Mm-hmm.
Yeah, those are extremes, but people be wildin', and you don't know what's going to happen, and especially in a case like this, where I think the vast majority of us, and if you're not in agreement, I have some questions for you and your spouse, are in agreement that what happened between Andy and
(22:09):
Kristen was horrible, and they are bad, bad, bad people for it.
You know, maybe, maybe contacting his ex or soon-to-be ex-wife on Facebook isn't, is not the right thing to do.
Because that's what people were doing.
People were like, No... they found her Facebook, they found her
(22:30):
LinkedIn too, and were commenting stuff.
And, you know, in general, it was relatively supportive, like, you got this, girl, like, drop his ass, we'll help pay for the lawyer fees, like, that type of thing.
But still, I mean, you're, you know, she's just found out that her husband is cheating on her with the head of HR at his company.
Let the poor woman process and deal with this in peace.
(22:53):
Like, oh my god.
She'll find out, trust me.
The whole internet knows.
And again, also, too, with the woman who was falsely identified, she now has to potentially live with this for the rest of her life.
That somebody was like, This is her, and quote unquote doxxed
(23:15):
her, but it wasn't.
But now her name is sometimes, like, tied up in things, and she's got that extra burden of proof, to always have to show, it's like, no, that wasn't me.
Like, yeah, my name sounds familiar.
Okay, I know why.
My final beef with reaction videos, and this is the biggest
(23:36):
one, is they normalize a lack of privacy.
And I'm particularly worried about that in the US right now.
Thank God I don't live there.
Sad, but true.
SPEAKER_01 (23:52):
Just keep bragging
about it.
SPEAKER_03 (23:56):
Look, we we have our
own problems here, okay?
We got problems here.
Don't worry.
SPEAKER_01 (24:02):
Now that I know what
swatting is, I'm gonna I'm gonna
see if there's, like, a swat.
SPEAKER_03 (24:07):
Swatting.
SPEAKER_01 (24:09):
Don't you fucking
dare.
SPEAKER_03 (24:13):
But this this is
something that we are currently
seeing in the US, and actually in other countries too.
Um, the UK is putting in place some laws that, I believe it's just around, like, porn websites, like you have to prove your age.
And that sounds really good, right?
Like, we don't want children having access to this, but it
(24:37):
can be used by governments who may not have your best interest at heart, and that is something that you have to always keep in mind.
We've definitely, when it comes to privacy in the US and in other places too, we're in that, you know, classic frog in the pot scenario, where if this happened all at once, we'd be
(24:59):
like, oh my god, no, but it's a little bit here and a little bit there.
And something that you have to keep in mind is that the same tech that allows us to mock dumb CEOs and heads of HR is the tech that ICE is using to deport your neighbor.
That's... yeah, it, it's, it's the same type of stuff, it's the same skill set that is being used for good and evil.
(25:24):
And also, in addition, I guess that's bad enough, but on a lesser, though I think similar, scale, it means that you are constantly living in fear of ending up in a video.
And yes, you could say, hey, I'm not doing anything wrong, so it's not a problem, and that's true.
You know, maybe don't cheat with people at concerts if you don't
(25:47):
want to be found out.
But, you know, are you now going to have to go through your entire life being like, oh wow, I don't want to dance in public, because somebody might film me and I'm gonna show up on the internet and people are gonna mock me.
This happens.
You know, we see videos all the
(26:07):
time of somebody doing some stupid little dance and they're just having fun.
They're just living their best life, but somebody filmed them and put them, put that online, and we may not have their name, but because doxing happens, like, now, now we do.
And there may also be videos that you look at and you're like, what fresh hell is this?
Like, what, what are we doing in this video?
(26:30):
And it was maybe a video that was done between friends, it was done as a joke, it, you know, it's been taken completely out of context, but now these people are being mocked all over the internet when all they wanted to do was, like, just have fun.
Think about the number of times that we have said stuff in our chat on WhatsApp that, if you just took a screenshot of that
(26:53):
and put that out in public, we would be cancelled in, like, negative three seconds.
There's a lot.
There's a lot that we say, but, you know, in context, you're like, oh no, this is fine.
And that's what we're doing to people, is that we're taking little snippets of them, often out of context, and that can
(27:16):
have, like, really outsized effects, because you're just existing, and it might even be for a quote unquote good reason.
Like, you know, you take a video of the person because you're like, wow, look at this, like, hot guy walking in front of me, or this hot chick, like, look at her dress, you know.
This is so cool.
But because you posted that, the person, that person may have a
(27:38):
stalker who now knows where they live.
That person may... that, that homeless person who you gave a hundred bucks to and filmed, which, you're a bad person for doing that, but you're like, oh, this is a good thing, like, shouldn't you be happy?
Like, watch this reaction.
That video goes viral, and they may have people in their life who didn't know they were in that situation, and that they
(28:01):
didn't want to know, or didn't want them to find out about that, and now they are forever known as, like, oh yeah, that homeless person who got a hundred bucks.
SPEAKER_01 (28:11):
I I I'm gonna have
to to counter to to save my my
my enjoyment here.
SPEAKER_03 (28:18):
Okay.
SPEAKER_01 (28:19):
So I think kind of
going from what you're saying,
there's two types of reactions.
There is the type where somebody puts out a video trying to get views, and somebody reacts to that video, whether it's negative or whether it is, uh, a positive or whatever, but
(28:43):
they're, they're using that, that video to talk about their own
thing.
SPEAKER_02 (28:48):
Yeah.
SPEAKER_01 (28:49):
Then there's the
other type of reaction where it
is somebody who just happened to get caught up.
So, like, I... one of the first, or one of the things I think of is, the first video to go viral was that kid who was filming himself at school and he was doing, like, Star Wars martial arts.
SPEAKER_03 (29:15):
Uh-huh.
SPEAKER_01 (29:16):
Do you remember
this?
SPEAKER_03 (29:19):
No.
No, no, I've got God and anime on my side, kid.
SPEAKER_01 (29:24):
No, the this was
this was before YouTube or
anything.
This is probably like 2003, 2004.
SPEAKER_03 (29:33):
Hold on, I'm gonna
I'm gonna look this up.
SPEAKER_01 (29:36):
I think you might be
able to find it under Star Wars Kid.
SPEAKER_03 (29:39):
Yeah, I know this
kid.
I've seen this video.
Yep.
SPEAKER_01 (29:43):
Yeah, so that this
was like the, the first, first video that actually went viral.
Yeah.
SPEAKER_01 (29:49):
And basically all it was, was him at his high school, like, during, during... he used the, uh, the, the, the film studio.
Like it was like a little high school photo or film studio.
And he was just, he was fucking around.
And then somebody found the tape and put it up online.
(30:12):
And obviously, I mean, you see, you see, he's, he's not exactly the world's greatest martial artist.
SPEAKER_02 (30:22):
No.
SPEAKER_01 (30:23):
But yeah, I mean, it
it really crushed him.
He was, he was a fucking teenager.
SPEAKER_03 (30:28):
Yeah, he was just
having his, living his best, like
stupid Star Wars life.
We've all done that.
Let's be honest.
We've all been like that.
SPEAKER_01 (30:34):
Oh my god.
I mean, like, I still, to this day, I will still do air guitar.
SPEAKER_03 (30:39):
I expect nothing
less.
SPEAKER_01 (30:41):
I mean, that would
if if people were to see a
46-year-old man doing air guitar to Nirvana and shit like that, yeah, I think I'd be pretty fucking embarrassed too.
SPEAKER_03 (30:56):
But you're you are
right that there are two types.
And there is the type that invites people to comment and react with it, and there's the type that was kind of put out without consent.
And the type that people are putting out there, and they know that they are doing something whack and they are purposely
(31:16):
doing it to get a reaction, that is very different from the person who is simply existing and did something weird and now there's a reaction.
While these are both reaction videos, they're going to get reactions, they're going to get people talking about them.
I would agree with you that there's a big difference.
(31:36):
And so I'm not saying that a hundred percent of reaction videos are bad.
SPEAKER_01 (31:41):
Damn right.
SPEAKER_03 (31:43):
There are some good
ones in here, but that's a
discussion I want to have.
Is, where do we draw that line of, this is, you know, something that we can make fun of?
This is a, you know, good reason to find people.
Because you could say, like, what's the difference between, you know, the Andy Byron, Kristen Cabot?
Because let's not forget, she also is married and was
(32:06):
cheating.
We all, we only talk about him, but it's both of them.
SPEAKER_02 (32:10):
Fuckers.
SPEAKER_03 (32:12):
That you know,
what's the difference between
them and, you know, somebody who is filming other stuff and maybe catches somebody, you know, doing something a little bit weird on the camera.
And they know that they're showing up on the Jumbotron, but it's not, you know, I'm cheating on my spouse with my coworker.
SPEAKER_01 (32:32):
One of the reactions
I I do like is it's uh Forrest
Valmore, I think his name is.
So he has his channel, it's, uh, Reacteria.
And he, he does science, like, uh, reactions, and one, one
(32:55):
of the things he, he does, he does, like, videos of different videos.
One is, you know, like, grandpop goes to college, and it's, it's basically, it's a guy who, he's in his 60s, he never did, like, a film before, he had a budget of, like, ten dollars, and it shows,
(33:17):
and probably just wanted to do something fun and do something for his church or whatever, but, like, in, in the process, uh, the guy is ripping apart, like, biology, which this guy is a teacher of.
Yeah, so, you know, he, he reacts to the video and he, he talks about, like, the, the points that this guy is, uh, making and trying to debunk, and he's, like, reacting to them and, and saying, like, this is what, you know, like, the actual facts are.
like, this is what you know likethe actual facts are.
So one of the things that he has said in the past is that he tries to be responsible when it comes to videos, that he'll go for videos that have, like, a decent following.
So, like, a lot of the videos that he reacts to is, like, Answers in
(34:04):
Genesis, where, you know, again, it, it's just, you know, it's a very large channel, and, and he'll do reactions off of videos like that.
Or, like, the, the ghost debunker, you know, like, somebody's posting up, like, an actual video, like, trying to show, oh look, my house is haunted, and blah blah blah.
(34:25):
And he's just like, yeah, you can obviously, like, if you cut it down frame by frame, you see the splicing right there.
Yeah, and things like that.
Another one that I really love is, it's, it's a music teacher, she's a singing teacher.
And she's... I, I, I absolutely love her innocence.
(34:49):
She's about my age, but she's never seen, like, Alice in Chains or Tool and things, like, like, big bands back in the 90s.
SPEAKER_02 (34:56):
Uh-huh.
SPEAKER_01 (34:57):
And she's she's
watching these performances for
the first time, and she's going off about, like, you know, like, their voices, and, like, how they're, they're using their voice for, like, certain things, and how they're getting the effects and everything like that.
But I, I love the fact that when you watch her, you just see her
(35:18):
eyes, like, glowing, like, where the fuck have I been for the past 40 years?
SPEAKER_03 (35:23):
Come to the metal
side.
SPEAKER_01 (35:27):
So so yeah, I mean,
obviously, Tool is known by
millions and millions and millions of people.
So her reacting to a Tool video is not going to shame them or anything, even if she, she wasn't amazed by, you know, his voice.
SPEAKER_03 (35:45):
Yeah.
She's also adding something to the discourse there, to say, hey, I am a professional music teacher and I can explain, potentially, to you why this is actually a really cool thing that they're doing, or a unique thing, or, you know, maybe give you some context as to why this is the way it is.
(36:09):
That is, that is what I would classify as a good reaction video.
Again, you're adding something to my life.
When I watch this, I come away smarter, not dumber.
SPEAKER_01 (36:23):
And and yeah, I I think that you know, the the whole thing that you know, I I spoke about like last week with Sniper Wolf, where if it's just you know, somebody who's doing dumb things and they post up on TikTok, and your reaction is wow, oh my god, why are you doing that? Huh?
(36:43):
You know, like, or oh my god, the the the ones I hate more than anything is where the person doesn't even say a word and just like points at the video and like shaking their head.
SPEAKER_03 (36:54):
Yes, right?
Like you have added nothing, you have taken away from the video because now I've got to look at the video and it's tiny because your stupid ass is like hovering around.
I hate those. Those need to be banned.
Yeah, those those are bad.
There is also, this is why I mentioned doxing and privacy
(37:18):
concerns.
Even when you think you're doing something good, kind of sort of, you may not be.
I had a really weird incident happen to me last year where I was watching a video, and it's normally the type of video I'd like to scroll away from or tell Instagram that I'm not interested in.
The reason I didn't is it was this dude who was reacting to,
(37:40):
you know how people will leave nasty comments on videos, and a really common thing to do is to go back to their profile and publish a photo of what they actually look like.
Like, do you know what I'm talking about?
SPEAKER_01 (37:56):
No.
SPEAKER_03 (37:57):
Okay.
So imagine that I make videos and you, Patches, hop on my video and you comment like, God, this fat bitch won't shut up, type thing.
What I would do is I would then click to your profile, find a
(38:18):
photo of you, either your profile photo on whatever platform we are, or go through other photos of you and publish another one as a way of kind of shaming you, to be like, you called me a fat bitch.
Look who's the real fat bitch.
SPEAKER_01 (38:33):
Okay.
SPEAKER_03 (38:34):
That that is something I see a lot, and sometimes it's warranted.
But I stopped kind of enjoying those at one point when I, for that reason, watched this video, and you know, these were women who were making nasty comments on this guy's page, and then he would like flip through to a photo of them, and he showed the
(38:56):
one, you know, woman who'd made the comment, and the next photo he showed was a person that I know, and it was not the same person in the photo, and it was this reaction of like, oh my god, I know who that is.
And I watched it multiple times, and I I get it, both of these people had like distinctly colored hair, and so I would
(39:16):
guess that he just did like a Google reverse image search type thing, and her like Facebook profile photo showed up, and he did not do due diligence and make sure that that was actually the right, you know, brightly colored haired woman that he was talking about.
Um, so I I reported the video and then I had to do like the
(39:39):
awkward thing of messaging her.
It's like, hey, we haven't talked in like 10 years, and this is a really weird thing to say, but I just want you to know I discovered this, I've reported it.
Like, I just want you to be safe, in case people... and it didn't look like it had gone out to you know millions of people.
But you know, I don't want her to get all of this negative, you
(40:03):
know, because people could then take that photo and be like, all right, you know, we're gonna Google reverse image search, and she now shows up, and she has people on her Facebook page being like, wow, I can't believe you said that.
You're a terrible person.
I'm gonna report you to your employer.
It was very disconcerting to see her be like, oh my god, no, no,
(40:26):
no, I know who this is.
Have I influenced you in any way to maybe be you know not as big of a fan of reaction videos?
SPEAKER_01 (40:40):
No.
But but wait, wait, wait.
SPEAKER_03 (40:45):
A little bit.
SPEAKER_01 (40:46):
This is this is, so I I think, because I'm already on that page that the, you know, like I said, the the reactions I uh, or reaction videos, I do like are not the reaction videos of dumb shit, or that it is just some dude like pointing at
(41:09):
somebody, or that it is a reaction of a teenager who who left a copy of a tape somewhere, but it's it is reactions of like actual, of actual videos that people put online to get the attention.
SPEAKER_03 (41:30):
This is true.
You were already going for like the more okay type of reaction videos, as opposed to the not okay ones.
I guess, even for instance, when it comes to stuff like red flag guy, do you feel any differently now that you know you take all of this type of thing into account?
SPEAKER_01 (41:47):
I uh, so I will say that I think that, yes, I do agree that it is bringing attention to videos that shouldn't get attention.
unknown (41:55):
Okay.
SPEAKER_03 (41:56):
I I will accept that as a small win.
And now, where do you see us going from here with reaction videos and how we react to them?
SPEAKER_01 (42:09):
So I I think this, and this kind of goes, I I I think that just kind of based off of that that whole thing of like giving videos attention that shouldn't get attention, this kind of goes along with uh an episode that we did way back when of you know building these platforms for like you know Jordan Peterson and Andrew Tate and things like that, where
(42:34):
the only reason why they have a voice is because people gave them that voice.
SPEAKER_03 (42:39):
Yeah.
SPEAKER_01 (42:41):
And same thing with this, is that you you have somebody who is the red flag guy, and they're bringing out videos that only have like two or three hundred uh views, but now all of a sudden they're they're getting all these other views because the wrong person found it, or or the wrong people
(43:04):
find it, who are like, yeah, you know, like I I I think so much, blah blah blah.
Because there is, there is even content where it is pretty big, you know, like, so like the first thing I think of is Nick, what's his name?
You know, your body, my choice.
SPEAKER_03 (43:26):
Ah yes.
SPEAKER_01 (43:28):
Like I never heard of that kid until that came out.
SPEAKER_03 (43:33):
Yes, and then suddenly his name and that fucking phrase was everywhere, and it still is, unfortunately.
SPEAKER_01 (43:39):
Yeah.
So like, yeah, I mean, even though the the intention was good, to to shame this person, dude still makes millions of dollars, and dude now has a bigger following because everybody was reacting to his video.
SPEAKER_02 (43:55):
Yeah.
SPEAKER_01 (43:56):
Yeah, I think that being able to kind of find that that that division of where are you giving this person a voice versus where are you trying to inform other people?
SPEAKER_03 (44:13):
I think that's fair, and and that falls in line with what I would like to see, which is for people to just maybe slow down a little bit before you dox someone.
Maybe slow down just a little bit before you hop on the train and make a reaction video about something that you don't, you
(44:35):
know, it it's breaking news.
We don't have all the information, or you don't fully understand.
I saw, oh my god, I saw this video on YouTube, and I knew about the drama.
And long story short, there was a woman on Instagram who suddenly appeared and was like, I'm gonna teach you how to have
(44:55):
a micro bakery out of your house and support your family.
And people were a little bit confused and they confronted her about it.
Like, you just suddenly appeared, and you know, all of your videos uh feature you wearing the same like two or
(45:15):
three outfits and the same type of clips, and they were all made at the same time, and like, you know, what what are your credentials for this?
And she got like really defensive about it, and you know, the rest is is not interesting, but there was a guy on YouTube who did a reaction video to it.
He had no knowledge about baking, he had no knowledge
(45:38):
about operating a business, a small business, especially one with food, out of your house.
He had no insider knowledge, he was just uh filling in the space between the videos and turned it into like a 20-minute video.
That was like, and and now he's profiting off of this.
And red flag guy sells merch, by the way.
(45:58):
That that is, that's a different problem that like we won't get into here.
But you know, just imagine that on the worst day of your life, your husband, maybe ex-husband now, smashes something at your baby shower, and somebody posts that video online without your consent, and this guy is reacting to it, pushing it to
(46:19):
you know, millions of people, and then he's the one who gets money from it.
Yeah, that that is shitty.
SPEAKER_01 (46:29):
And he also does shitty uh sponsorships with uh shitty companies, like he does BetterHelp, yeah.
Yeah, that's where, that's where I lost complete respect for him.
SPEAKER_03 (46:41):
I knew I, okay.
To be fair, I was always looking for a reason to hate him.
Something about him has rubbed me the wrong way.
So I'm looking for anything.
But yeah, I would love for people to just maybe take a couple moments to think about like, is this something that I should be publishing?
Will it do good? Because there are legitimate reasons.
It's like, again, name and shame people who are, as we talked
(47:03):
about last week, willing to admit that they are a fascist.
I do want to know that person's name.
We all want to know that person's name.
Like, I think that's good.
The people who are cheating, yeah, you deserve to be named and shamed and to have your life fall apart, you know.
But maybe take that step back and kind of evaluate and be like, wow, how would I feel if I was the person in this
(47:27):
video?
You know, and I wasn't doing anything wrong.
I was just making a weird face or having a bad day, and I ended up here.
Would I be upset?
Probably.
So on our scale of toxicity, where would you reate?
Where would you reate?
Where would you rate reaction videos?
(47:51):
Shut up.
SPEAKER_01 (47:54):
I was like, what the hell does re-ate mean?
SPEAKER_03 (47:59):
The new verb.
Oh my god, new verbs are dropping left and right.
Yes, would you reate reactions as a green potato?
They'll make you sick if you eat them, but you can also just peel off the green part.
Are they a death cap mushroom, 50-50 chance of death or coma, even when cooked?
(48:19):
Or are they a delicious but deadly last snack with lime jello and antifreeze?
SPEAKER_01 (48:26):
I I'm gonna have to put this at a solid death cap.
I think that, just for all the reasons I was kind of saying about like, where do we go from here?
That, yeah, first off, I I I'm gonna sound so old saying this, but I think that kids today are are ruining quality
(48:50):
entertainment.
Like, literally, if you are making, or if you are getting hundreds, thousands, millions of views for sitting there and pointing at a video and shaking your head, like come on, are you fucking serious?
Like I said, I I still, I feel bad for that that kid, the Star Wars kid, because you know, even today, if you look up his name,
(49:15):
like it is probably gonna show up like 10,000 like Star Wars kids.
SPEAKER_03 (49:20):
Yeah.
SPEAKER_01 (49:22):
Because again, that was the first video to go viral.
But yeah, I I think that shit like that, that that's really, you know, extremely toxic.
But then again, like, you know, maybe it's me being biased or whatever, but like the videos that I do watch, I think that, I think that it makes for good entertainment, and then also is
(49:46):
kind of teaching people things.
Just as you said, you're leaving the room smarter than when you walked in rather than dumber.
So I would say definite uh death cap.
SPEAKER_03 (50:00):
I'm gonna have to go with whatever death caps that Australian woman used, where three out of four people who ate her delicious meal died.
This is not antifreeze, but I do think the potential for misbehavior by people on the internet, like I've I've said
(50:21):
entirely too many times, you know, once you put that information out there, you don't control it.
And you may have had good intentions, but you don't know what people on the internet will do with that.
You don't know who is going to get a hold of that information and maybe mentally not be doing too hot and have a bad reaction to that and decide that they're going to handle it.
(50:43):
Like with Pizzagate, you know, somebody sees this and gets really, really upset about it and is like, I'm I'm gonna fix the problem.
And out they go.
And you know, you don't have control over that anymore.
You can't account for what everyone's going to do, but
(51:03):
because we know this is such a problem, I think it is something that you have to consider when you're propagating this type of information, making sure that it's, you know, truly, like you said, the music teacher.
That's fine.
That is more or less consensual.
Did Tool say, like, yes, you can do this?
No, but it's also clearly publicly available footage of a
(51:26):
concert that, you know, thousands of people went to.
So there's definitely a difference there versus something that you just found in the depths of TikTok that you don't have the context for or the background information for, and you're now commenting on it, you know, being like, wow, look at these people.
(51:46):
Like, you see this video, like, crazy how he would treat her like that.
It's like, well, you you don't, do you know what happened before?
Do you know what is happening right now?
Do you know who these people are?
No, you don't.
You're just reacting to whatever people send you, and you're not doing the proper checks to make sure that this is a legit video that you should be reacting to.
(52:07):
It's not actors, and it's not a parody, and it's not something fake.
Like, I think that's why.
Oh, and then also let's not forget that like, when we when we normalize the fact that like people are out filming all the time and that you could be filmed at any moment, I think that is very bad for the public.
That's why I don't want to live in China, for instance.
(52:29):
I'm not okay with that level of public surveillance.
Like, are there, you know, do I live in a big city?
And so are there going to be public surveillance cameras?
Yes.
Yes, there are.
But it's a very different level of, you know, you might walk behind, knowing or fearing that you might walk behind somebody who's doing a dance, and you end up infamous online because you
(52:50):
did, you, I don't know, you pulled a wedgie or something as you were walking along, you thought no one could see you.
And you know, some damn TikToker over there is doing a little dance, and you know, they see you, and now everybody knows you as like wedgie dude.
It's like, no.
So yeah, and and because we've already established too that
(53:12):
people be bullying a lot, I think that it's, you really gotta be careful.
And I would love to see us do fewer reaction videos and maybe steer more towards, you know, the nicer ones.
And then I would be willing to lower my rating, but as it is, I just see too many that come from unknown sources.
People are jumping on it because it's a trend, people are
(53:35):
profiting off of it, and that's why they keep doing this, and they don't care if the video is actually legit or not, because they just want to push their merch or their comedy show or you know, whatever it is.
That's their main goal, and they're using this as you know a way to promote that.
I think there are just a few too many of those people.
Again, not antifreeze, but we are, we're getting close.
SPEAKER_01 (53:57):
Okay, I'll I'll buy that.
SPEAKER_03 (53:59):
All right.
Ooh, that's two in a row we didn't agree on.
Look at us go.
So if you agree with Patches or with me, you should let us know.
You can write to us at toxic at awesomelifeskills.com.
You can, if you're listening on Spotify, you can use the text us feature.
(54:19):
Uh, we will get a ping because it it it's an email to both of us.
So, you know, have fun with that one.
Use it responsibly, do not dox anyone.
You can also find us on Facebook, Instagram, Blue Sky, technically Twitter, and technically TikTok.
(54:39):
Until next week, this has been the Toxic Cooking Show.
Bye.
SPEAKER_01 (54:44):
Bye.