Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Welcome back to It Could Happen Here, a podcast about
it, it being bad things happening, here, here being, you know, wherever you are. We're talking specifically about wherever you are.
I'm Robert Evans, one of the hosts of this podcast,
and with me today is a guy I have a
(00:27):
lot of admiration for probably my favorite YouTube documentarian, which
I guess would be the fastest way to sum up
who you are and what you do. Dan Olsen from
the channel Folding Ideas.
Speaker 2 (00:40):
Dan. Hi, hi, how you doing?
Speaker 3 (00:46):
I'm doing well. Thanks for, thanks for inviting me.
Speaker 2 (00:48):
Yeah.
Speaker 1 (00:49):
No, Dan, you and I have like a topic of
shared interest to discuss. But the first thing I wanted
to talk about is your name on the internet is
foldable human.
Speaker 2 (01:00):
Yeah.
Speaker 1 (01:00):
You... I don't feel like I could fold you very well. No? Okay. So back in high school, I used
Speaker 3 (01:08):
To be, like, I was a really small guy. Like, I was a really skinny guy. And you remember the, you remember all the ads from the nineties for exercise equipment?
Speaker 2 (01:20):
I do remember some of that. Yeah.
Speaker 3 (01:22):
So the tagline that they always used for, like, the as-seen-on-TV exercise equipment was that it folds for easy storage. Ah, and being, being dumbass kids, you know, it's like, one person in our friend group, like, has a car, but there's, like, seven of us, and so someone's got to ride in the trunk. And it's like, well, Dan gets to ride in the trunk,
(01:43):
Like we're going to stick Dan in the trunk because
he folds for easy storage. Because I was a small guy. And so, I don't know why, when I was, like, busy trying to brand the channel, like, you know, a decade ago, uh, I was like,
(02:04):
I had this phrase that I was using with students that I was interacting with, which was like, well, let's unfold that idea, you know. Yeah, yeah, yeah. But, like, that
Speaker 2 (02:10):
Was kind of like on my mind.
Speaker 3 (02:12):
So I was like, ah, well, we'll, like, call the channel, like, Unfolding Ideas. Unfolding didn't really, like, just sound good, so it's like, oh, Folding Ideas.
Speaker 2 (02:20):
Uh. And then well.
Speaker 3 (02:23):
Aesthetic parallel to that, you know, Foldable Human. I don't know, it just, it, it came to me and it sounded good, and it was nowhere on the internet. There was, like, no overlap. So I'm like, all right, interesting. We're good to go, SEO locked in.
Speaker 1 (02:42):
That's an example of a thing that, you know... we're talking,
We're gonna be talking a lot about stuff that's unsettling
about our modern era and how the Internet has sort
of altered human dynamics. One of the things that I
think is kind of neat about it is its ability
to kind of preserve in amber aspects of you from
the deep past. Like I have one of my emails,
(03:03):
like my personal email is a Gmail that I got
back when you had to get an invite to get
a Gmail, right like right when Gmail first became a thing.
And it's like, I'm not going to say it on
here because then my email will get bombarded with shit.
But like it's like a stupid joke that doesn't really
make any sense. And every time I give it to someone,
(03:24):
they're like, why is that your email? Because I was, like, twelve. Like, I don't even remember why I set this thing up. It's just this, like, moment of something I thought was funny when I was pre-pubescent, frozen in amber forever, because the Internet does that in little ways for each of us.
Speaker 3 (03:39):
I definitely abandoned my original Something Awful account, because at some point I was like, you know what, maybe
Speaker 2 (03:46):
Not that user name anymore.
Speaker 1 (03:48):
Yeah, yeah. So, Dan, if people are not familiar with you, and I'm going to guess a significant chunk of our listenership is, kind of one of the biggest touchstones for you recently was you put out a video about the NFT craze that a lot of people have credited with helping to kill it, me among them.
Speaker 2 (04:11):
It's a wonderful, wonderful video.
Speaker 1 (04:12):
Line Goes Up was the actual title. Line goes... yeah, Line Goes Up. Very good breakdown of how they work, why it was a con. And you've been doing, you know, I think kind of the first, really the first time I became aware of you was you did a Flat Earth documentary, which is, is very good.
Speaker 2 (04:30):
You know.
Speaker 1 (04:30):
I did an article recently on, like, AI kids' books that was partly inspired by an investigation you did into these kind of Audible-slash-Kindle grifters, the Mikkelsen Twins. So you do that kind of thing, right? Like, you basically run across things that are troubling or confusing to you, and then you investigate them to a pretty impressive extent and put together very clear,
(04:54):
uh, video investigations. You know, that's, that's, uh, I think, in a nutshell, probably pretty accurate.
Speaker 3 (05:01):
Yeah, that's kind of where the, where the channel's ended up. Yeah, it's, it's been a few different things over, over the years, but that's kind of the phase that it's in right now, is this kind of, like, I don't know, like, yeah, documentarian
Speaker 2 (05:15):
Place.
Speaker 1 (05:16):
Yeah, and you know, a lot of your stuff seems to focus on basically, like, the topic of kind of online grifter culture, and, and sort of its intersection with, like, different weird cultic milieux. Like, there's kind of a crossover, especially, like, with NFTs, a real big crossover betwixt the two, right? Like, it's kind of, I think, a
(05:38):
lot of crypto culture was kind of this intersection between
old school cons and kind of internet cult dynamics. So
I wanted to talk today about the problem of scam culture in the United States. Because, by any
(05:59):
sort of, like, objective reckoning, and I've been looking into this, there are more scams and more fraud right now in the United States than at any point previously, and basically from all sides. Like, phone scams are at the highest rate they've ever been.
Speaker 2 (06:15):
People are getting.
Speaker 1 (06:16):
Like, like, I think the statistic I've got is that March twenty twenty one, I think, is kind of when that peaked, at like four point nine billion robocall scams, which is, like, just kind of an outrageous increase over where it was a few years ago. The rate of, like, fraud against elderly people seems to be at
(06:37):
an all time high, at least in terms of dollar amount.
One of kind of the unsettling quotes that I came
across when I was looking into the degree to which
old people are being scammed, and it's often through various
email scams that are kind of based on gaining trust or frightening them that, like, someone else is trying to scam them and so they need to protect themselves from it. Anyway,
(06:59):
the thing, the quote I came across, was a regulator talking about this and being like, yeah, it's not, it's no longer, like, small or even medium-dollar cons, people are stealing generational wealth. Which was really interesting to me.
And then there's kind of like phishing attacks are at
(07:19):
pretty close to an all-time high. I mean, there's a, I'll send you, there's a graph that I came across in a, what was the source on this? In a Comparitech article. I mean, it's just a straight line up from January twenty nineteen to, like, the end of twenty twenty two. And so I'm, I'm, I'm
kind of looking at all this, and there's a couple
(07:40):
of different causes, right? Like, some of the stuff the FCC did under Ajit Pai gets blamed for why it's gotten even worse with, with phone scams, although that's not the whole story. AI-powered tools have been a big part of, like, why phishing attacks have increased so much.
But then like you've got like the degree to which
the elderly are being conned, which is like this kind
(08:01):
of at this intersection of a few different things. How
much more online old people are today than they were,
you know, ten to fifteen years ago.
Speaker 3 (08:08):
Or the population bubble, yes, you know, multivalent dynamics.
Speaker 1 (08:12):
Yeah, but kind of the commonality is that like scams
are all around, like people, we're all kind of being
assaulted by scams.
Speaker 3 (08:21):
I mean, I just, I just grabbed my phone, and, like, of my last, like, fifteen text messages,
Speaker 2 (08:28):
Maybe it's like.
Speaker 3 (08:33):
Your pickup is available... a couple, like three with, you know, four with actual friends, and then: Interac, you just received an Interac e-Transfer. Hi, long time no talk, just got your money, okay, we'll send soon. Three hundred and eighty-nine can now be routed to your institution, submit. Why the Canada Revenue Agency has sent you money. Your
(08:55):
verification code for Visa transfer is... And it's like, it's like, it's just
Speaker 2 (09:01):
Constant, constant. I hadn't even, because, like,
Speaker 3 (09:05):
I'm not going to be shocked at all if I
get a phishing text message during this conversation.
Speaker 1 (09:11):
Yeah, like, between my email and, and, like, my text messages, or just phone calls, right? Every day I get two or three calls from Scam Likely. You know, yeah, my good friend Scam Likely. Yeah, my old buddy, he's always got, always got something cooking. But yeah, it's, and I, this kind of, like, I started focusing on this
(09:33):
more a couple of months ago because you know, I
had vaguely noticed, boy, it's just like nothing but fucking
scams coming into me through my phone these days. And
then a couple of months ago, I got a phone
call from my bank and it was one of those things,
like everyone else, my cell phone lets me know when, like, a call is from Scam Likely, or when it's
(09:53):
from like you know, and it had the name of
my bank on there. It was the right number, you know.
So I pick it up and human being is on
the line and they're like, is this you know, Robert
Evans And I'm like yeah, And then they're like, we've
seen some like fraudulent activity on your account. Can we
ask you a couple questions? And, that is, I've gotten that call before, legitimately. You know, it's not a weird
(10:14):
thing for your bank to be like, hey, let's talk
about these charges.
Speaker 3 (10:17):
Are you in the country right now? It's like no,
I'm I'm in New York. It's like, okay, so we're
seeing activity out in New York.
Speaker 2 (10:23):
That's you.
Speaker 1 (10:23):
Yeah. Did you just buy something in Florida? No, I never go to Florida. But, uh, so, yeah. So, and I didn't actually get to see where they were going with this. Nothing suspicious had happened. But, like, after they say that, I'm like, okay, yeah, like, what's, what's the charge? And then my phone disconnects, right? Like, it's, you know, again,
where I live, you know, Oregon, is the middle
(10:44):
of nowhere, so like sometimes connectivity is not great. So
I call them back, you know, and I get on
the phone with a person and they're like, yeah, what's
what seems to be up? And I'm like, well, you
guys called me saying that like that there was some
possibly fraudulent activity that we needed to talk about. The
lady on the other end is like, no one here
called you. Like, I'm looking at your record. I can
(11:05):
tell when someone's getting a call. We don't have any
record of that. And I explained what happened, and she
like goes back, talks to a supervisor and is like,
so that was a scam. This is something we've seen more and more lately. They're able now to actually just spoof our... and this is, my bank is a significant-sized institution. They're able to spoof our
(11:26):
phone number now and so you can't tell through the
caller ID, And like it was this whole thing where
like obviously I know, don't give certain things over the phone,
even if they're pretending to be your bank. We never
got to that point where anything was actually compromised, but
it was just like, well, shit, like, what are you... like, this is now well beyond a thing where, like, you're getting called and someone's offering you, you know, to
(11:49):
make a bunch of money, you know, holding a Nigerian prince's wealth or something if you let it sit in your bank account. This is, your bank calls you, and your phone tells you that it's your bank, and a human being who sounds just like the bank teller... like, it's gotten so... And I think, kind of the broad... obviously, each
of these individual vectors by which scamming has increased is
(12:12):
a worthy story and a separate story in a lot
of ways, but they also come together in this like well,
you know, it's not it's not like weird at this
point to note that everybody seems angrier, and everybody seems paranoid,
and you hear more stories about like people opening fire
on folks pulling into their driveway to turn around, and
(12:32):
there's you know that story. Obviously there's guns and stuff
that's also connected to that. But I wonder how much
of the paranoia and anger is at least exacerbated by
the fact that everyone is fighting off a million scammers
at all times.
Speaker 3 (12:48):
Yeah, I think that's a good observation, like that, just
we're seeing this erosion of public trust in reality. Yeah,
And and some of that is like deliberate and political.
Uh And a lot of it is just coming from
like the fact that technology has enabled spam in in
(13:12):
unprecedented new vectors and the fact that you can like
that you can automate bombarding people with.
Speaker 2 (13:23):
Noise. Uh.
Speaker 3 (13:25):
Is is just kind of it's eaten away at all
of us because it's like, how do I how do
I trust anything? I mean so like and this is
the thing is it's like, Okay, so I've been I've
been keyed into this and thus paranoid for like a
decade now. So if I get a message that's like,
you know, like that from my bank, if it comes
(13:46):
in text, then it's like I don't interact with the
original thing that it came from. I then go like
on the website. It's like all right, I like call
my bank to uh to to inquire about it. Like
never you know, it's like never communicate through the channel
that you're first contacted in.
Speaker 1 (14:03):
Yeah, if you're dealing with your bank. And it's like
and it's like, but is that level of paranoia healthy?
And it's like that takes that also takes effort.
Speaker 3 (14:11):
That means you have to have the foresight to be like: do not panic, see the thing, process it consciously, go somewhere else and, like, you know, activate a different channel.
You know, if they contact you through text, you know,
go through like email or or like live chat. If
they contact you through email, call them on the phone,
(14:33):
not with the phone number that was at the bottom
of the email, go to the website. And it's like
that's effort, that's effort. I don't even have that much
energy in me sometimes, and a lot of other people
just like absolutely don't. And and that leads to like
just exhaustion, vulnerability, you know, all of the things that
feed into like paranoia, distrust, et cetera, et cetera, et cetera,
(14:57):
et cetera. And it's it's relentless. Like online advertising is
basically useless at this point. Oh yeah, because like if
you ran, if you ran a legitimate ad, you know,
unless you have the money to run like a real
proper you know, basically TV commercial. Yeah, like banner ads.
(15:21):
I haven't I've seen like I don't know one legitimate
banner ad for like a car company in the last year.
Everything else is like a hearing aid scam or you know,
liquefy your belly fat using the metaverse.
Speaker 1 (15:38):
Yeah, and it's it's this constant like number one, It's
led me to the situation where when I see an
ad on social media in particular, but with any sort
of, like, print or online ad, my assumption is it's probably a con, right? Yeah, even if it's like, oh wow,
that shirt looks nice, well, that company is probably not
going to ship me that shirt right like, or it
won't be right like, yeah, it'll be that photo is
(16:00):
absolutely not from the company that is running this ad.
Speaker 3 (16:03):
Yeah. You just you assume you distrust as a as
a first measure. You see you see a banner ad,
you see the aesthetics of advertising, and the assumption is
that it's like, ah, that's gonna you know, that's gonna
get me to sign up for some subscription that's going
to be buried in like recurring payments that yeah, I
will never be able to cancel.
Speaker 1 (16:25):
And it's it's interesting because I, I mean, I'm not
sure if this has been your experience, but like I
can acknowledge, I think I morally have to acknowledge, like
part of my success financially as a creator has been
as a result of that. Because one of the things
that we've seen in the ad market is that text ads,
(16:45):
ads for print and shit do not work, do not
function in any way, shape or form, and a lot
of, like, random internet ads don't work well, but creator
ads work well, and so there's money in it, right
because people listen to like people have a degree of like, Okay,
well this is like number one. It's just like the
process of consuming a YouTube video or a podcast is
(17:07):
different from an article. But like the ads work better
because it doesn't feel the same as like the scrum
of shit that like is getting pushed into every conversation
you have on Twitter.
Speaker 3 (17:17):
Yeah, a human whose existence I can confirm has at least taken
Speaker 1 (17:21):
A look at this, like this is not at least
like not a complete con or whatever, right yeah, or.
Speaker 3 (17:29):
If it is, then like then the host has also
has also been.
Speaker 2 (17:33):
Kind of, like, conned by that and whatever. Like, it
Speaker 3 (17:36):
Ends up, yeah, we're in this together. Like, it ends up being at least, like, a little bit, sort of, sort of distanced from that, you know. Like the freakin', you know, buy a square foot of land in Scotland and become a lord, and that whole thing is, like, a scam being run out of China. Yeah.
Speaker 2 (17:57):
Yeah.
Speaker 3 (17:58):
I mean, one of the, one of the kind of, like, weird ironies for me is, it's like, okay, so Line Goes Up came out while the crypto ecosystem was in its, like, biggest ad blitz ever. You know, they had the Super Bowl ads coming up just, like, a month later, actually weeks, like, weeks afterwards. And so, you know, like, the vast majority of the, like, mid-roll ads that
(18:21):
ran on that video were Crypto.com, were FTX, were Binance. And, and the ad rates that they were paying, like, the CPM that they were paying to run on crypto-relevant videos, was insane. It was like, it was
(18:43):
like twenty eleven all over again.
Speaker 1 (18:46):
Twenty eleven was the only good time to be in
digital content creation. There's like a lot that's unsettling about that.
I think one of the things that is like most
(19:07):
frustrating to me is the degree to which it's meant
that we've, we've gone backwards. Like, there was this... people who, like, study tech and kind of the way socialization around big tech works talk about this thing called the trough of disappointment, right? Which is, when you get a new technology, everybody... we're in, like, the hype phase for, like, AI right now, right? Yeah. And then at a
(19:30):
certain point it becomes clear which aspects of the hype were right, you know, the degree to which the technology is capable of doing things that kind of the evangelists were claiming, and to which extent the hype was wrong, right? In what areas is the tech always going to fall short? And that's called, like, the trough of disappointment, when people start to reel, and then, you know, things kind of are supposed to level off after that. Dot-com
(19:52):
is not in fact magic. Yeah, exactly, exactly. You can't just keep shoveling money into this shit forever in the hopes of exponential returns.
a certain point, I can remember the time when phones
were exciting and I was, especially as a journalist, like, really interested every new year at, like, what new
things they're capable of. And then after a couple of
(20:14):
years it was like, well, every phone is like there's
no difference. Now there's no excitement in getting a new phone.
It's just like, well this my old phone is broken,
so I need a new phone. But like I'm not
like wow, the new capabilities of this device, but I
feel like there's another I don't even I don't really
know what to call it. But there's also this kind
of thing where we the Internet helps to create, or
(20:37):
is the method through which is disseminated a new labor
saving device, and then the scams reach such a density
that the amount of labor you're able to save is minimal, right,
Like that's that's I feel like there's like a that's
at least one of the things that I've noticed, especially
with like digital communication, with just communication in general. Right,
(20:59):
my smartphone made it easier to stay in touch all
the time, and now my smartphone like it. Obviously I
still carry the damn thing everywhere, but like my text
messages are mostly scams, and my emails are mostly scams,
and most of the calls that I get are scams.
Speaker 2 (21:13):
Like yeah, yeah.
Speaker 3 (21:15):
I've actually been finding myself drifting back towards email as
a communication medium just because the spam filters are better,
you know, mature and sophisticated and for the most part
they work. Yeah, like they're yeah, it's like I can
actually people can actually reach me by email.
Speaker 2 (21:32):
Yeah, that's pretty cool.
Speaker 3 (21:36):
And you know, like there's there's a whole tech like
really kind of the big thing is there's this whole
technological element to it.
Speaker 2 (21:42):
And you know, when you when you.
Speaker 3 (21:44):
Sort of pitched the idea of this conversation, the first two places my brain went to were John Romulus Brinkley, the goat testicle doctor, yes, yes, yes, and pioneer of new media, radio, and, of course, uh, Marshall McLuhan. Yeah,
you know, like those those were the those were the
(22:06):
two things that my brain immediately was like, this is
this is sort of like relevant to it because like
Brinkley was, he was a pioneer of radio. He he
absolutely advanced sort of the format of like what radio
could be and how you could use radio to not
just extract money from people, but get them onto your
(22:29):
side such that after they have given you their money,
like they're they're not just they're not just your your victims,
You're you're not just rolling into town and selling them
some, some snake oil and then, like, skedaddling as
fast as possible, you have made them into your fans,
into your followers. And you know the way that he
(22:52):
did that by connecting his scam to like a sense
of identity. You know that he wasn't just this fake doctor.
He was also so effectively a pastor.
Speaker 1 (23:02):
Yeah, yeah, people who would defend it after there was no longer any chance of them... like, after it had been sort of proven that the thing that he was promising was not real, right? Yeah, like, once there was no more... It's almost like, you know that play, The Music Man? Yeah, yeah, whether you at home know it or not. Like, I'm not
Speaker 2 (23:22):
A huge musical theater guy. This is a pretty famous play.
Speaker 1 (23:24):
But, like, the basic idea is, this con man comes to town, tells everyone he's gonna make, like, a big band, and raises money for it, and his plan is to, like, take the money and run. It's kind of what the monorail sketch in The Simpsons is based on, to a significant extent. And if I'm remembering correctly, I shouldn't have brought this up maybe, because
remembering correctly, I shouldn't have brought this up, maybe because
I'm actually not that knowledgeable about musical theater. But my
(23:46):
recollection of the way it goes is that, like, he falls in love or some shit and feels bad, and, you know, they wind up, he winds up becoming not a con man. But, like, I think the modern version of that is, he just, he gets people to, like, adopt as a religion the idea that these fucking trombones and uniforms and tubas and shit are on the way, and, like,
(24:06):
you know, then they attack the local newspaper and string a journalist up in the center of town for telling them that they're ten years into this and the city hasn't started a band. Anyway, whatever. Yeah, yeah.
Speaker 3 (24:17):
And so, so the other, McLuhan, you know, his, his famous postulate, the medium is the message, which remains a radical observation to this day, is just, it's this assertion that the medium itself is more important than any given message on it, or even, like, the combined weight of the individual messages. Now, I think in
(24:42):
some regards McLuhan kind of went, like, overboard with that
because he said that it's like content doesn't matter at all,
and it's like, ah, I think content matters. But the
point still stands that like the medium itself, the like
the invention of radio, the invention of television, the invention
of the internet, the invention of social media had a bigger,
(25:04):
like has had a bigger impact than any given thing
on it, because that's the thing that ultimately we warp
our lives around that, we restructure our homes around, we
restructure our physical environment around, we restructure how we spend
our days like that, our time usage gets warped around
(25:25):
the medium itself, and thus the medium becomes the portal
for information to travel.
Speaker 1 (25:31):
Through. Absolutely. And it also, I mean, I think there's an extent to which that is true of kind of the way parasocial dynamics impact things like political belief. I
think there are a lot of people, and I think there are a lot of things, especially when it comes to, like, radical politics, that people adopt because somebody who they had come to already like
(25:55):
expresses those politics, right? And so something that maybe never would have gotten any purchase with them suddenly is able to get purchase with them, because, like, a dude or a lady that they had a parasocial relationship with expressed this kind of stuff. And it just, it's not
that it like hacks their brain. It's not that like
people are you know, little robots. It's that, uh, this
(26:19):
is kind of the way influence works. It's the same
reason why, like, people often wind up believing similar things to their parents or similar things to their friend group.
Speaker 2 (26:29):
You know, if your friends are.
Speaker 1 (26:31):
All saying like you know, on the positive end of things,
if you're if you grow up like like I did.
Speaker 2 (26:37):
I don't know about your high school, but if you.
Speaker 3 (26:38):
Are like I like that. I was just thinking the
same thing. Like my vocabulary in high school.
Speaker 1 (26:43):
Was... yeah, yeah, there is, there's a slur that starts with F that was, like, every third word out of not just my mouth but everyone I knew. No, the movie Superbad captures this to a significant degree of fidelity.
To be honest, that's just the way shit was in, like, the early aughts. And then, you know, the people
I hung around with, suddenly there were more people who
(27:05):
were openly queer. And suddenly people weren't talking that way,
and I stopped talking that way. It's just how people are.
Speaker 3 (27:12):
Yeah, just from like somebody that I admired being like, hey.
Speaker 2 (27:15):
Yeah, I don't say that. I don't think that's cool.
Oh that is kind of fucked up.
Speaker 1 (27:20):
Yeah. And then, you know, it's, it's not, uh... like, yeah, I can, I can go to the
subreddit for my show and see people being like yeah,
I started getting interested in like anarchist politics and history
and stuff because of something Robert said. And I don't
think that's bad, because I think anarchist history and politics are useful even if you're not an anarchist, right? It's valuable to understand that history. It's often undertold. But this
(27:44):
is the same dynamic. This thing which has benefited me, and to some extent benefited some of the ideas that I think should be more widely known, this is also why there's more Nazis, right? Like, it, it cuts every which way.
Speaker 2 (27:57):
And so, so, McLuhan...
Speaker 3 (27:59):
You get these new you get the Internet, you get
the subdivisions of the Internet, like you get social media,
you get email, you get you know, instant messaging and whatnot.
And because those technologies have this gravitational effect around them
that alters the trajectory of how we structure our lives,
(28:21):
they become, because they are potent, because they are valuable communication vectors, they become prime targets for grift. And the thing is that all of these technologies that have
accelerated communication, you know, people have long been pointing out
like the negative impacts of social media, and just like
(28:44):
the, the effect on, like, self-esteem, self-perception, of just being exposed to other people's curated, idealized versions of themselves so constantly. You know, it's like, this, like, that's already, you know, uh, impactful in potentially negative ways.
Speaker 1 (29:08):
Uh.
Speaker 2 (29:08):
And that's when you're dealing with like real people.
Speaker 1 (29:11):
Uh.
Speaker 3 (29:12):
And but then you add on to that that it's like, oh,
you go on Instagram and like you can be following
a bot and not even know it. You know, you're
you're getting you know, and the algorithm is going to
float this stuff. And so particularly if you're looking down
these like these addictive infinite scroll feeds, uh, you know,
(29:34):
you don't have the filter of pre-interaction to, to
gauge those things. So like so like I follow you
on Twitter, and I know that if I see, like, oh,
Robert Evans has retweeted this thing, that it's like okay,
so like he's taken a look at it, uh and
and it's been through like the filter, the filter of
his brain, and so I can probably just like, you know,
(29:56):
take my trust in that thing up like one notch, right.
But if I'm just like scrolling down the like the
algorithmically curated like this is what our computer has determined
is similar to things that you have already looked at.
It's it's just it's so much more fraught. But there's
(30:17):
it's really easy to be complacent and just be like, oh,
I trust this thing, I trust this platform. And that's
where we get into the trough of disappointment, is, yeah, this, like, I trust these algorithms. These algorithms do a really good job of, like, oh, I watched Dan, and so the YouTube algorithm introduced me to, like, a bunch of
Speaker 2 (30:37):
Other really good creators. Cool.
Speaker 3 (30:41):
Uh, oops, I watched one video on flat Earth and now my, my recommends are full of, like, COVID denialism and anti-maskers and, you know, the trucker movement and, and all of these other, like, wedges to just sort of slowly rot in my brain.
Speaker 1 (31:03):
Yeah, it's like, it's, it's like kind of the way our parents told us, or DARE or whatever, told us drugs worked, you know, when we were little kids, where someone's like, oh, you want some pot? Here's some straight-up heroin, right? Like, you want some of this too? Like, you want some crack cocaine? No. And I, it is, you know, you were talking about, like, yeah,
(31:24):
see you follow someone and you see them share something,
and if they're a trusted source for you, you know,
it bumps it up a notch. And even that you
know that's the way it, like it works for me
as well. But there's a degree to which I find
it like problematic, especially because like we all fuck around
on the internet too. I had a thing go crazy
viral recently where like someone someone posted an obviously photoshopped
(31:47):
image of, like, a control... like, a Logitech controller at the bottom of the, of the sea, and was like, look, the controller survived. And I, like, I shared it to make a joke, right? And the joke was that, like, well, the joke was that, like, well, the controller, we're gonna find out, was one of the more functional things about that terrible sub. And that was... and I even posted underneath it, this, obviously this is not a real image, guys.
But like then I saw, like I wound up finding
(32:10):
it went... the post that I did of it went so viral that, like, it wound up, like, screen-capped in some different Reddit communities for people to talk about, and they, it was only the first post, not the one where I was like, obviously this is fake, and, like, it was a joke. You know, it was, it was a, it was a shitpost. We were bantering online. Yeah, yeah, yeah. But also I'm like,
I wonder how many people now think that literally there's
(32:33):
a Logitech controller that they found at the bottom of.
Speaker 2 (32:36):
The sea because of that?
Speaker 1 (32:38):
What is, what are the ethics now of, like, making a, making a, a jape, as, like, somebody who's got, like, a following? Like, where does that come into it? And, like, I don't know, and I don't, I'm certainly not, like, clear on it, because I seem to be incapable of not shitposting. I spent too much time on Something Awful as well.
Speaker 2 (32:57):
But it's so.
Speaker 3 (32:58):
It's so hard to give up.
Speaker 2 (33:00):
I miss it.
Speaker 3 (33:01):
I miss the days when I could just make like
tasteless jokes on on on Twitter and you know, a
couple hundred people would see them and go like that's funny. Uh.
And now it's like, ah, if, if I'm a little too ironic, someone's going to be like, oh crap, are, are you... like, did that happen? It's like, no, no, that did not happen. That did not happen. This is
(33:22):
this is fake.
Speaker 2 (33:22):
I am, I am telling you lies, for it was a bit, but it was a bit.
Speaker 1 (33:26):
I was doing a bit. But no, and that's like, you know, Something Awful, which is kind of the, the, the digit... it's like the, oh crap, now I've forgotten a very basic science term, you know.
Speaker 2 (33:38):
The big the big puddle of boiling goop.
Speaker 1 (33:41):
That life came out of. The prim... it's the primordial, the primordial soup of digital culture. That's what Something Awful was. It was a forum website that gave birth, in various ways, some direct and some indirect, to 4chan, to Reddit, to Twitter culture, you know, to Anonymous. To all of these different things
(34:03):
you can trace a lineage back to Something Awful. And
the motto of that website, as written by the terrible
person who founded it, was the Internet makes you stupid.
And, at the time, what that kind of meant was...
And if you're younger, or if you just weren't very
(34:23):
online in the late nineties early two thousands, you may
not remember this long period, but there was a fairly
long period where the default assumption in regular society was
whatever happens online doesn't matter, right, like, it can't matter.
Speaker 3 (34:38):
Probably fraudulent, it's almost certainly like made up. You can't
you can't trust anything online.
Speaker 1 (34:45):
And real people are not on the internet, right, Like
it's kids, it's nerds, but like, you know, guys who
run banks aren't online. You know, Like the idea that
the richest man in the world would spend all of his time shitposting was absurd. Like, so...
Speaker 3 (35:02):
He really should be busier than he observably is, eh?
Speaker 1 (35:05):
Certainly, it seems like it, although I guess so should I, if, if
Speaker 2 (35:09):
I'm being fair. But yeah, it's...
Speaker 1 (35:15):
There's this, uh, this degree to which digital culture is still very much... a huge chunk of it, like, we all want it to not matter. We all want a place where we can just shitpost and bullshit, because shitposting and bullshitting comes out of, like, the very same impulses that, like, determine a lot of how we
(35:36):
interact with, like, our friends, right? You know, we all need some times where you can just sit down, have a couple of beers or whatever and, like, say shit with your, with your buds, you know, and it's not, it's not being recorded, it's not going up anywhere, everywhere, forever. You can just kind of, like, talk. It's a, it's a field, almost. Social experimentation is a huge part of maturity, of growing up, becoming a person. Uh,
and I think we all get kind of there's a
degree of like the accessibility of the Internet that makes
that impossible to entirely get over even though it is
demonstrably untrue. What happens on the Internet matters quite a lot,
and you can have a real significant you can influence
your own life in very negative ways by saying.
Speaker 2 (36:21):
The wrong thing on the Internet at the wrong time.
Speaker 3 (36:24):
Yeah, I mean, lots of people have observed just this fact that it's like, on Reddit, you can... not on Reddit, I mean, on Reddit too, but, you know, on Twitter, like once, you know, once a month, Twitter
(36:47):
elects some ten-follower anime profile pic with a single tasteless joke and makes it the fulcrum of reality.
Speaker 2 (37:01):
Yeah, and it's like, that's a... and
Speaker 3 (37:04):
The thing, I don't think this is actually that far
off topic, just because like it's this warping of reality,
this warping of like what is real, what is trustworthy,
what are the like impacts of things, And the fact
that like you know, ten follower account can become can
become international news. Yeah, has to sit alongside the endless
(37:29):
bombardment of dick pills and global leaders. Like, I had
this joke that I was trying to formulate over the
weekend of, like, World War Two with Twitter, where it's like, you know, just a joke hinging on the idea that some, like, follower bot would observe this, like, ah, it's like, you know, two posts in a row, like, it's like
(37:51):
the USSR has rolled into Berlin, Stalin has unfriended the President.
I hope this doesn't mean anything. You know, it's like, the... you have, like, you have international politics happening in the same space as fake international politics, in the same space as just, like, this endless bombardment of, you know,
(38:14):
of curated reality, fictionalized reality, unreality, and spam, and no one knows what's real anymore, no one knows what to trust, and the instinct in a lot of people is to just give up trying to parse the difference. And that makes us, like, increasingly vulnerable.
Speaker 2 (38:33):
Yeah, And I think.
Speaker 1 (38:37):
A big part of what's, what's kind of at the core of the problem here is, what you've said here, makes us vulnerable. The degree to which this can be weaponized is really significant. Like, the, you know, one of the things that we saw that I think is kind of low-key a significant moment in sort of info conflict shit is, this last, this weekend, last weekend from, you
(39:01):
know where we're talking now, there was a mutiny by
the Wagner mercenary forces in Ukraine and southern Russia against the Russian government, or at least that's what it appears to have been. Now, right, this is Russia, a lot of this is really weird, so I'm not going to say we know. We don't, we certainly don't know entirely, like, what happened there, like, what's going on there. But
a couple of things happened very quickly. For one, folks on the right, and there were also a lot of kind of, like, shithead left people who adopted this too, decided that liberals were cheering on the head of Wagner, Yevgeny Prigozhin, because, like, they believed he was a reformer, and that, like, they'd all thought this guy, who was, like, objectively a piece of shit and a fascist,
(39:47):
is, like, they're cheering him on because they hate Putin so much and they've convinced themselves that he's, you know, going to fix Russia. And it's like, no, no, I didn't see that. Like, look, I love calling people out
when they have shitty takes, specifically on this specific war,
because I've been covering it since twenty fourteen, But like,
I didn't see that, and none of the people talking
about how liberals were doing this provided any evidence of it.
(40:08):
And it happens all the time, right? Sometimes people will, like, take a post that has, like, thirty likes and be like, this is what the left is saying. But, like, with this, there was even less. Like, I didn't see a single post where someone was like, Prigozhin's gonna, like, fix, you know, corruption in Russia or whatever. No one was saying that, they just invented that this was going on. And part of it is that, like,
(40:30):
you know, the way Twitter works now made it a
lot easier for disinfo to spread from this thing. Like, there was very famously a guy, who is absolutely a con artist, who just started sharing a bunch of videos from there with, like, bad commentary that was inaccurate, and Elon...
Speaker 2 (40:45):
Musk was like, this is the guy I've come to trust about... Can you say Elon Musk?
Speaker 1 (40:49):
Yeah, we can say Elon Musk. I don't know. It's a problem, Dan. So...
Speaker 3 (40:54):
You beat me to Elon Musk, because I was going to say this, like, the con artist was... but then it turns out that he was just retweeting, that is, like, he got, of course he got involved. Anyway.
Speaker 1 (41:02):
Yeah, yeah. And it's the... I don't think, the solution was not... because we lived, you know, our parents and grandparents lived in the day where most people would be like, well, you know, folks who are in politics maybe need to care about this, I might want to get the broad strokes of it, but, like, random people, you know, shouldn't be influencing what's going on with these
(41:25):
international relations. And that's how you get shit like the
Dulles brothers carrying out coups all over the world on
behalf of the US government, where most Americans are like.
Speaker 2 (41:33):
What did we do in Guatemala?
Speaker 1 (41:34):
I didn't know we had guys in Guatemala and that
wasn't great. But also this new thing where if you
are a personality, if you are in media, then you
are obliged to be a part of every big thing
that happens everywhere, even if you are demonstrably incompetent at
that and everyone is demonstrably incompetent at that past a
(41:55):
certain point.
Speaker 2 (41:56):
You know.
Speaker 3 (41:57):
Yeah, and that's been... oh boy, has that. That's been a lot to deal with.
Speaker 1 (42:01):
And, going back to the original thing that started this conversation, that's part of how so many of these cons perpetuate, is that, like, people are only competent, including famous people, including people with followings, in limited areas, and
once you get out of your area of competence, it's
easy to get fooled. And if there's a bunch of
(42:21):
people who trust you because of the things you were
right about, then they can very easily get fooled when
you get fooled.
Speaker 3 (42:27):
One of the big hazards there is that, and this is a long-standing observation, is that hucksters, con artists, are going to be more willing than anyone else to pretend to be up to date on it. They have no compunction about it. It's like, oh, yeah, I know, I'm an expert on submarines and Ukraine and Russia and Belarus, yeah.
Speaker 2 (42:55):
You know, so there.
Speaker 3 (42:58):
The reason it's a con man is, it's a confidence man, because they get your confidence, because they act confidently and give you reason to, give you reason to trust them, and they have no moral compunction about lying to you.
Speaker 1 (43:10):
Uh.
Speaker 3 (43:10):
And, and they are always going to be faster with the take, faster with the confident statement, faster with the solution, faster with the, with, with a call to action to buy their book or dick pills.
Speaker 1 (43:26):
Yeah, and it's, it's the... and often, I think, part, one of the things that's made this all so much harder to catch and so much more durable is that it used to be obvious. You used to be able to see, like, okay, well, this guy's a con man, but, like, I'm not a person who can be conned by someone selling diet pills, that's not my vulnerability, so I immediately recognize this guy as a con man. Or
(43:47):
I am not a person who can be conned by Christianity stuff, because I'm not a Christian, so I'm not vulnerable to this con man. And now, so much, the cons are downstream of the following and of the fame,
and so a lot of people are getting taken in by con men. And maybe, you know, the fact that person's putting in a link for their, their supplements, you know,
(44:09):
on every viral post you don't buy their supplements, but
they'll come up with something else for you once they
get you in, once you're in the funnel, or even
if they never convince you to buy anything, if you're
sharing their content, that's bringing more people into the funnel,
you know, and that really wasn't the case. That wasn't
the case with you know, you go back ten years
talking about like Young Living, right, or some other like
(44:30):
multi level marketing company where they're selling you know, essential
oils with fraudulent health claims. They weren't getting random people
to spread their business without paying for shit. And now
you can do that. If you're a con man and you've already got followers, because you bought a bunch, and you're, you're on the Ukraine shit, you just grab whatever videos and say whatever about them, you know, frame them
(44:52):
in whatever way is likely to get people to share them the most. Then suddenly you gain two, three hundred thousand followers in the space of a night or two, and your ability to scam people and get money out of them has increased several times. You know, the
con is downstream of the, of the platform, right? So, you know, that's, you get this guy, and maybe he's shilling thing X or thing Y, he's got a couple, you know, whatever different cons he has. But regular people
can be in the in the business of spreading his platform,
of increasing his profitability, even if they're not vulnerable to
the con. Maybe they're not the kind of person who's
ever going to buy weight loss pills or supplements or
whatever kind of thing. But if this guy starts, you know,
(45:37):
sharing all of these videos on the fighting in Ukraine,
you know, at a moment when it happens to be
the opportune moment to do that, and they go crazy viral,
well then that guy is able to triple his following
and you know, and have people who are not interested
in his con spread his shit, which gets him followers,
which brings more traffic to whatever the money generating part
of the con is. Yeah, it's it's all a sales funnel.
(46:02):
Our entire anxious lives. We have
Speaker 3 (46:06):
Built our society into just like a giant nested series
of sales funnels. Yeah, I don't know, that's bound to be a solid foundation. I don't see how that could go wrong; that seems like that will go well for us.
Speaker 1 (46:18):
How do we... got any ideas on how to fix it? Or should we just, should we just state a problem and then run away?
Speaker 3 (46:25):
I mean, the easy thing to do would be to
restore trust in our public institutions. You know, if we
could, uh, have sort of, like, I don't even want to say, like, a unifying cause, but just a sense of common, of, like, shared commonality, and, and trust in
(46:47):
like, our local, our local society, you know, strong, like, not necessarily strong families, but, like, strong family units, constructed or natural or however you want to, like, define or construct those, but, like, local, with, like, good infrastructure around us, so that our physical spaces are, you know, appealing and comfortable
(47:09):
to live in and and provide us a sense of
like enrichment and fulfillment.
Speaker 2 (47:14):
You know, the easy stuff.
Speaker 3 (47:16):
Yeah, just just fix infrastructure, fix society, fix media, and
uh and then I think we're good.
Speaker 2 (47:23):
Yeah.
Speaker 1 (47:24):
Yeah, so that's that's good. So if we fix everything,
then we won't have any more problems. That's great. We're
on the same page now. I mean, it is really, like... and there's also this kind of, like, problematic element of, when you're like, well, we want to, like... A problem is that there's zero trust in institutions. That's objectively a problem, because it means that when, say, the CDC is like, hey, guys, there's this,
(47:47):
there's a plague, we should probably do this and this
and this, it immediately becomes a culture war thing. And
so you can't actually you can't actually confront serious problems
the way that you need to be able to confront them.
It's just not possible anymore. Likewise, like, but the other issue is that, like, well, for significant chunks of the population, there's never been any good reason to trust
(48:08):
you know, the institutions, because, you know, they're marginalized groups and whatever. You know, when the institutional trust was higher, like, the government was fucking them in this way and that way. And, yeah, you know, that's also... so I wonder,
like, I think there's a significant extent to which we need new concepts of, like, what an institution is and should be. Like, we need... it's, it's such a ground-floor problem, because, like, I don't know, we're never getting
back to a point where Americans trust the CDC. Like
that's just not going to happen, you know. Like, whatever the way forward is on us having less, the overcoming of the anti-vax, anti-science shit around medicine, it's not getting everyone to love the CDC. You know, that's just
(48:51):
not ever going to happen again.
Speaker 2 (48:53):
Yeah.
Speaker 3 (48:53):
And part of the complexity here is that it's, it's really easy to sort of say that, you know, it's like, okay, well, the solution is, like, strong central institutions, and it's like, that's not, that's not correct at all either. Because, like, I mean, my go-to example for that would be that it's like, look at, look at the LDS Church, look at Mormons. They have a very, very
(49:15):
strong central institution that provides this like social anchoring point
for a lot of their lives, and yet Mormon communities
are incredibly vulnerable to affinity fraud and, and MLMs, you know. Like, Utah, Salt Lake, is like the locus of MLM culture,
(49:36):
and so, like, it's not the sort of, like, strongman, like, ah, this is why we need strong, like, you know, strong leaders... that is not, isn't the answer in its own way, even if it's a very tempting sort of, like, answer to gravitate towards.
Speaker 1 (49:53):
Yeah. And that's, that's... I don't know, I don't actually know. Part of the problem is that, like, there are little solutions, right? There are little things that you can do, stuff like advocating for, you know, a more functional idea of, like, a more, a more functional legal definition of, like, what
(50:13):
an auto-dialer is and what counts as, like, illegally sort of, like, flooding phone lines with, with, with cons and stuff, or restricting, you know, the ability of people like bill collectors and stuff to utilize, you know, the phone system in some of the ways that they do. Like... and that can make stuff better. Just like,
you know, at a certain point, we will develop tools
(50:36):
that mitigate some of the harm AI is doing in
the con space. Some of its ability to automate and
push shit to people at scale will get reduced at
a certain point. That will happen, right, because it happens
with everything.
Speaker 3 (50:47):
You know.
Speaker 2 (50:47):
AI is not unique.
Speaker 1 (50:49):
This is the... it's, it's a... you've heard the, you've heard the Red Queen hypothesis, right? Yeah. It's kind of a way of, like, for... it's kind of, like, a way of looking at evolutionary theory. There's this, this point in Alice in Wonderland where, you know, the Red Queen kind of, like, traps Alice in this situation where, like, she's got to keep running as fast as she can, but it's, like, a situation, like a conveyor-belt sort of
(51:13):
of situation, So no matter how how hard she runs,
she never gets ahead. Right, And that's kind of the
way that like the evolutionary arms race works, right, Like
you know, one one animal develops a defense against a
predator and the predator develops a way around it, and
like the the like, that's kind of the best case
scenario for how we adapt to cons I think actually,
(51:35):
like, technology just moves too fast now for us to, to be able to keep up, right? Like, we're not, we're not just standing in place. We're consistently falling behind. And I don't know, I don't know what we do here, uh...
Speaker 3 (51:49):
I mean, yeah, so, like, there will, there will be technological solutions to specific manifestations. I mean, a big one, like, in there, like, to not, to not bant...
Speaker 2 (51:59):
Is that, you know, the, uh...
Speaker 3 (52:02):
The legal system, the governments... like, governments need to do something about the robocalling and the text messages, because they're rendering a vital piece of, like, civic infrastructure unusable.
Speaker 1 (52:15):
Yeah, people don't trust their phones anymore, and that's, that's bad. Yeah, because it means they stop using them. You know, it's like...
Speaker 3 (52:24):
There's yeah, there's very real like consequences. Uh, and we
need to be able to trust that we're talking to
people who aren't just trying to get our money.
Speaker 2 (52:38):
Yeah.
Speaker 1 (52:40):
Yep. Well, Dan, anything you want to plug at the end of this here? YouTube channel Folding Ideas, everyone should check out if you have not already.
Speaker 3 (52:51):
Yeah, the YouTube channel, that's going to be the big one. I'm still on, I'm on socials at Foldable Human, though I'm trying to wean myself off of them because they're broken and being broken on purpose, and they're
Speaker 2 (53:02):
Bad for my soul.
Speaker 3 (53:04):
So, I still, I'm addicted, so I still keep coming back.
But I'm a lot less active than I used to be.
Speaker 1 (53:10):
Oh sorry, I didn't hear you. I was too busy
getting anxious because of a thing on Twitter. No. Yeah, Dan,
thank you so much for coming on today. I really
appreciate your thoughts on all of this. I'm looking forward
to your next video, your next investigation, whatever that happens
to be. Folks should check out, if you haven't, Line
(53:33):
Goes Up, your documentary on NFTs. You should check out Contrepreneurs, I think is what you called your Mikkelsen Twins documentary. Yeah, check out everything Dan has done. Thank you, Dan. And
that is the episode. You can all go home now
and deal with the fact that your bank information just
got stolen by somebody in Macedonia.
Speaker 2 (53:59):
It Could Happen Here is a production of cool Zone Media.
For more podcasts from cool Zone Media, visit our website
coolzonemedia dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you listen to podcasts.
Speaker 1 (54:11):
You can find sources for It Could Happen Here, updated
monthly at coolzonemedia dot com slash sources.
Speaker 3 (54:16):
Thanks for listening.