Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies. History is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. A
production of I Heart Radio. Hello, welcome back to the show.
(00:26):
My name is Matt, my name is Noel. They call
me Ben. We're joined as always with our super producer
Paul Mission Control Decond. Most importantly, you are you. You
are here, and that makes this the stuff they don't
want you to know. It's the top of the week.
It's a strange week, so what better time for some
strange news. Alex Jones is back in hot water. We'll
(00:49):
see how that shakes out. Steve Bannon might actually have
to answer for some things in court. We'll see how
that shakes out. But this is news you have probably
already heard about. And that's not the point of strange news.
We want to bring you the weird stuff from the
dark corners of the Internet. We want to bring you
the stuff that, if you looked on a map of
(01:11):
the news, would be from an area saying here be serpents. UH.
Today we're diving into some stories from the world of technologies,
stories that UH may frighten you stories that may inspire
you if you are a more vigilante minded person, and UM,
one story we really want to see what you think about.
(01:31):
And it may not be appropriate for everybody, but I
believe the three of us are are quite excited as
always to hear your opinions. So stay tuned. We'll give
you a heads up for that last one as it
may as it definitely is not appropriate for all listeners.
But let's let's start with the scary stuff. So a
(01:52):
lot of people, your faithful correspondents included, have been working
from home for the majority of the pandemic. Many many
people have UM. At first it was a safety measure
for a lot of folks, and then eventually for many
folks it evolved into a personal preference. I mean, before
(02:13):
we begin, I think that's that's a good place to
check in. UM. We don't talk about it too much
with each other because we hang out all the time.
But what have your experiences been working from home, primarily?
I know we've all been popping in the office every
so often. Yeah, I do like the occasional office pop
just to change things up. And they've been restocking the
(02:34):
snacks again, which is a sign civilization is returning to normal. It is. I don't mind it. I think we've
gotten used to the flow of doing things this way. Um.
I do like doing it in person every now and again,
but I really quite enjoy I mean, I have a
separate space, so I think that makes a big difference,
because in my last house I did not, and that
was not fun at all, which is why I moved.
(02:55):
But in general I'm good with it. Yeah. I think
setting my own hours has become a new thing. It's
always kind of been a thing for us at this company,
but actually having a place that I can walk a
couple of steps to and get some work done rather
than drive, you know, in my case it was fifteen
(03:15):
to thirty minutes to a place then come home. It's
really changed my ability to to work on different kinds
of hours. Um, get well and seriously and and actually
it's weird and hopefully I'm not going to get in
trouble for saying this, but to be able to take
an hour during the day to actually get an errand done,
like go to a place that's not open when I'm
(03:36):
usually not working, and then work an extra hour later
in the night or something. Um, I think a lot
of people have been taking advantage of that. It's just
the question is how much of an advantage is an
individual taking with their workday hours? That's what employers are asking. Yeah,
there we go. But we're in a results driven space, right,
(03:59):
not to sound too, you know, producerly. I mean, I think
it's going to be clear to our employers if we're
not pulling our weight. Uh, so, at the end of the day, no one's policing exactly how we spend
every second of our time, which I think for some people,
continuing to work from home is a problem because there's
all this kind of nanny state technology that's being implemented
that's policing how many clicks people are logging on their
(04:22):
keyboards and like, you know, all that kind of stuff.
To me, that would be a nightmare. I'd rather just
go to the damn office if that was happening. But
we're very lucky that's not the case for us as
far as we know, not to be too Orwellian or paranoid. Yeah. For me personally, um, I prefer
it because I keep late hours. I don't sleep as
(04:43):
much as um the average person. So uh my my
schedules all over the place, and I quite enjoy it. Um.
We've known we're a special case. Probably we have known
for a long time that if anybody bothered to check
our search histories on our work machinery, uh, they would
(05:04):
they would find some dark, disturbing things because of the
nature of our vocation. But I love what I love
the way that we are segueing here to the big,
big fear. Uh. Someone recently told me something quite insightful.
They said, when you are working from home, you're not
(05:24):
really what what what's happening is that you're just now
having your job invade your home. And for millions of
people around the world, some of whom are listening today,
that's not just uh, you know, a thought experiment anymore.
And this leads us to the story that you dug up, Matt, which,
(05:45):
to your credit, my old dear friend, is something that
you have been talking about since, like, the week we met, all those years ago. I guess it's true.
We did, yeah, and it was when we were originally
(06:05):
searching for the weird stuff that, as you said, shows
up in our search histories, and the websites that we end up on, uh, for way too long
because we have to read the entire thing. Sometimes other
parts of that website are probably not safe for work,
but hey, too bad, that's what we do. So with
all with this in mind, the concept of work invading
(06:28):
our home, the question is, is it just your
work that is now being done at your home or
is it somebody from your work or something from your
work that is physically looking at what you're doing, watching
like like Noel said, key strokes that you type in clicks,
maybe the number of clicks, maybe even having access to
(06:50):
your camera or your microphone. That would be a little
disturbing to imagine that somebody in some group at your
office is keeping track of that. Well, that's what the
Joint Research Centre at the European Commission and all kinds of other bodies like the UK's All-Party Parliamentary Group,
(07:13):
they want to know. They're looking into it. They're interested
because there's lots and lots of concern from the collective
of individuals who are aware of this kind of technology
existing um and they put out some reports and that's
what we're talking about today. You can go to the
Register dot com and you can read an article titled
(07:35):
Workplace surveillance booming during pandemic, destroying trust in employers. That's
trust in employers. Uh, the trust of the employee
is what's being destroyed here. Um. Okay, So if you
if you go through this, you can read about the
Joint Research Centre, which has put out a report titled
(07:57):
Electronic Monitoring and Surveillance in the Workplace. This is something
you can access. It's something you can download in PDF format.
You can find it, especially if you go to that
Register article. You can just click through and actually look
at the thing. You can see an abstract to download
the sucker and within this report. Guys, it's way more
(08:18):
prevalent than I thought. We've discussed before the monitoring of
the kind of the gig economy, just by the sheer
nature of what that means. Right, It's a task that
goes into a marketplace. Then someone from the gig economy
comes in and picks up that task. So the person
who sent the task in wants to know what's happening.
(08:41):
You know, how of these ten steps, which one are
we at right now? How long has it taken? How
efficient is this? Um? Now that one person from the
gig economy, we can now rate that person depending on
how well they did with that one task, and we
can predict how they will function on the next task
that they're assigned, right, and so on and so forth,
building an entire structure of just people trying to get
(09:07):
things done because everybody needs to make a little bit
of money, or as much money as we can, right?
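To make that rating-and-prediction loop a little more concrete, here is a minimal sketch in Python. The weighting scheme, field names, and five-point scale are assumptions for illustration, not any real platform's formula:

```python
# Minimal sketch: predicting a gig worker's next-task score from past ratings.
# The exponential weighting is an assumption, not any real platform's formula.

def predicted_score(ratings, decay=0.8):
    """Weight recent task ratings more heavily than old ones."""
    if not ratings:
        return None  # no history yet
    weighted_sum, weight_total, weight = 0.0, 0.0, 1.0
    for rating in reversed(ratings):  # newest first
        weighted_sum += weight * rating
        weight_total += weight
        weight *= decay
    return weighted_sum / weight_total

# Example: a worker whose recent tasks went well scores higher
# than their lifetime average alone would suggest.
history = [3.0, 3.5, 4.0, 4.8, 5.0]  # oldest -> newest, out of 5
print(round(predicted_score(history), 2))
```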
got to get those coupons, Man, who gets those coupons
before you die? You know what I mean? Neck and neck?
That's right. So you can imagine something we've discussed before
on this podcast very recently, Ben, I believe we just
talked about Amazon's Mechanical Turk system, which is the
(09:31):
thing that they have in place, uh that monitors that
very thing, right, how well is an employee doing. The
only problem is this is human directed AI functional technology,
where it is a computer that is actually looking at
all of the data and then sometimes making a decision
about how well someone is performing. And as is the
(09:53):
case there with some uh some workforces, the AI is
actually making the decision to send an email to an
actual human employee that says you are no longer needed here. Uh.
That is not always the case with many of these
because we're also talking about Uber, we're talking about Upwork,
We're talking about lots and lots of different companies and systems. Right,
(10:15):
So that so that exists, right, that whole thing monitoring
how someone within the gig economy is functioning and like
really looking at what everything they're doing while they're on
the job. The other thing is employees like us, Ben, Noel and myself and Paul and Alexis, looking at us
and seeing what we're doing on a day to day
basis by monitoring things like Microsoft Teams and Google Workspace,
(10:41):
and Slack, and checking to see how often we are responding,
how long it takes for us to respond to an email,
let's say, how many different meetings we have, what our
schedules look like, keeping track of when we are logged in,
if our computer is actually on, or if our screen
is locked. It's insane the amount of stuff that can
(11:03):
be tracked and looked at by somebody who essentially holds the keys to one of those enterprise accounts. Sorry, it just really freaks me out. Yeah, it's super freaky.
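For a sense of how little raw data a dashboard like that actually needs, here is a minimal sketch of turning a log of message timestamps into the "average response time" metric described above. The event format and names are invented for illustration; this is not how the Microsoft Teams, Google Workspace, or Slack admin tools actually expose data:

```python
# Minimal sketch: computing an "average response latency" metric from a
# message log. Event format and field names are assumptions for illustration,
# not the real Microsoft Teams / Google Workspace / Slack admin APIs.
from datetime import datetime

def average_response_minutes(events, employee):
    """Average minutes between a message to `employee` and their reply."""
    fmt = "%Y-%m-%d %H:%M"
    latencies, pending = [], None
    for sender, recipient, timestamp in events:
        ts = datetime.strptime(timestamp, fmt)
        if recipient == employee and pending is None:
            pending = ts                      # a message is now awaiting a reply
        elif sender == employee and pending is not None:
            latencies.append((ts - pending).total_seconds() / 60)
            pending = None
    return sum(latencies) / len(latencies) if latencies else None

log = [
    ("boss", "matt", "2021-11-15 09:00"),
    ("matt", "boss", "2021-11-15 09:42"),
    ("boss", "matt", "2021-11-15 13:00"),
    ("matt", "boss", "2021-11-15 13:05"),
]
print(average_response_minutes(log, "matt"))  # 23.5 minutes on average
```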
I don't know about you guys, but I mean, I do
a decent amount of work on my personal phone. I
research a lot on my phone, my personal iPad. I
do emails on my phone, I do our inter company chat.
(11:26):
I don't really go into my office or my studio
unless I'm needing to do stuff that requires recording or
full screen situation or like I mean, I often will,
you know, kind of split my time between several different devices, Uh,
some of which belong to me. And you know, we
use Microsoft Teams, but not like a lot. I don't know, like, do the companies have
(11:47):
to acknowledge that they're doing this or is it just
something that's been kind of discovered a little by little.
It's often in your contract, and it will be a simple, like a simple paragraph that just states the
company reserves the right to monitor what you do with
the equipment that it gives you. Yeah, and this is
this is more important for some industries than others. Consider
(12:11):
industries wherein individual employees may generate a lot of patents
or new research issue, or an engineer who creates something
new and innovative and you did it on the company
machine with company equipment, then of course they're gonna want
you know, they're gonna want their cut, and, uh, they rightly deserve it. But yeah, it's usually lumped in
(12:34):
in a contract. If you are going to be quote
unquote told, it's going to be lumped in with that
kind of statement, Which is like the reason why if
I if I, for instance, this is the opposite of
something productive like engineering. But if I, for instance, wrote
a novel on work machine and it became like the
(12:54):
next Song of Ice and Fire or something, then there's
a pretty valid legal argument that could be made if
our job were to be kind of evil, that they
own a piece of it because, even though these weren't their ideas, these were their pieces of equipment. This is real,
the camera thing especially. I believe, in the UK, um,
(13:19):
first, Microsoft will snitch on you at work like never before. Like Matt said, you've got no idea. But I know that in the UK, um, a pretty comprehensive study, or pretty thorough study, found that camera technology in particular had exploded.
Six months ago, about five per cent of people were
(13:42):
being monitored if they were working from home by their
company via camera. Now that number is at ten, and, uh, now in the UK thirty two percent of people are being monitored in some way, if not by cameras. And isn't the takeaway of some of this, this probe, that, like, this is bad, this is actually anti-productive? Yeah, yeah,
(14:04):
because all of this is a way for the employer
to see which employees are doing great, which ones need improvement,
which ones are going to get those sweet bonuses for
Christmas and Boxing Day, which ones are you know, maybe
on the outs or they need a performance enhancement plan
or whatever the hell companies call something whatever. Yeah, I mean,
(14:31):
it's just like that's the concept, right, But when each
individual human employee knows that this kind of thing is happening,
it may in fact make our productivity take a hit,
because it's almost like there's, oh no, it's not almost like, there is a resentment towards that kind of helicopter parenting that's being done through, via our computers,
(14:53):
by our computers. Um. And you know, we're in a
movement right now. Keep thinking back to several voicemails that
we've gotten. We're in a movement right now where people
are choosing not to go back to these specific office
jobs for various reasons. But you know, this is just
you can just add this one to the to the
(15:13):
mix tape of the reasons that people don't want to
do that crap anymore. Um. And I mean again, we're
really lucky us in particular in our type of industry
because it is results driven. We have specific metrics that
we have to meet. We have certain pieces of content
that were scheduled to produce every single week, and until
the moment we like fail to deliver, no one's really
(15:34):
gonna mess with us. At least that's my perhaps naive perception.
But to me, that's I kind of bank on that. Uh.
And it makes me feel like that's fair, I mean,
of course, but it also to me from the perspective
of, like, you know, we oversee people, guys, we're executive producers and we work with teams, and I wouldn't want
(15:56):
anybody on my team to feel like I'm up there,
you know, up in their business and that way. Um.
In fact, Portugal as a country just passed a law
that makes it illegal for a boss to text or
call an employee after hours. Um. And I think that's
that's something that's a little more empowering for the employee
(16:18):
than this kind of crap. Well, I really agree. That's work-life balance stuff, right? Um. And that's
the kind of thing that makes a job enjoyable or not.
If you know, you can turn the thing off and
go on with your life. And I think, I mean, yes,
I think that's generally true for most people, but for
(16:38):
me personally, UM, I like, uh, I don't know. Well,
I'm probably the worst example, but I'm pretty into this stuff,
like I kind of I kind of live it, so
I want to know what happens when it happens. I
don't save stuff for this win back points, right, yeah,
(16:59):
and that's your Yeah. And so what Portugal is saying is, uh,
it quickly can become abusive, which is absolutely true, and
that should be a matter of personal choice. Now, we also have to point out, for anybody who is listening who is not currently in the US or familiar with it, uh, the US's policy towards things like vacation,
(17:22):
it's cartoonish, you know what I mean. My friends in
Europe are horrified when they hear the way vacations or
things like paternity or maternity or parental leave work. So
this is probably even more of a necessary law in
the US than it is in Portugal. But still, but still,
it's it's the future and the monitoring thing is you know,
(17:48):
it's funny, and not ha-ha funny, but it's odd that it becomes a contentious issue when we're talking about workplace monitoring, or quote unquote workplace monitoring, because
it's only another extension, a smaller extension of the constant
surveillance and monitoring that's occurring regardless of what laws are
(18:12):
on the books. Cointelpro never ended. Uh. Mass surveillance is
just too powerful to not do it. It's the hottest thing. Yep.
I yes, And I know we could. I could talk
about this for a long time. I know you could too.
I want to leave you with a couple of statements
from the actual report that we mentioned at the top
here that this Register article is referring to, just to
(18:35):
kind of leave us with this, have us think about it,
and we can hear hopefully from you just what you've experienced,
perhaps when it comes to surveillance in your own workplace.
Um, here it is. Ben, you're gonna love one of these statements because it's basically, it's your mantra, like the one that you add, Ben, every time we talk about anything like this. It's in this report. Here we go,
(18:58):
quote, pervasive monitoring and target-setting technologies in particular are associated
with pronounced negative impacts on mental and physical well being
as workers experience the extreme pressure of constant, real time
micromanagement and automated assessment. AI is transforming work and working
(19:18):
lives across the country in ways that have plainly outpaced
or avoided the existing regimes for regulation. With increasing reliance
on technology to drive economic recovery at home and provide
a leadership role abroad, it is clear that the government
must bring forward robust proposals for AI regulation to meet
(19:40):
these challenges. So they're saying, what is it, Ben? Technology
always outpaces legislation and uh, you know that's the UK
saying we got to do something about this. We know
we do, but right, let's let's form some committees to
(20:05):
figure out the meetings. You know what I mean? That
is um an accurate quote, And we'll see, We'll see again,
speaking of mantras, we'll see again. Before civilization gets to
a post work environment, there is a very high probability
that it will become what is called a post worker environment,
(20:26):
meaning that the robots and the AI can do amazing things,
or the machine consciousness can do amazing things for people
but no one can afford to engage with it because
no one has a job anymore. Imagine the story of John Henry writ large for almost eight billion people at once.
(20:48):
That's cool, let's hang out in the metaverse some, play pinochle with our robot pals. Yeah, we throw a lot of parties. All right, with that, imagine the party where it is
the three of us and you, what are you gonna
say about this stuff? What are you gonna say? Send
it our way? Right now, we're gonna take a quick break.
(21:09):
Hear a word from our sponsor, and we'll be right back
with more strange news. And we're back with more strange news. Um,
I am here to talk about a story that is
definitely in line with some of the ones that I've
(21:29):
been bringing lately. I'm very fascinated with this new wild
West that is the cryptocurrency space, you know, and all
of the decentralized finance projects and n f t s
and all of that. I think it's interesting on a
couple of levels. One that it sort of signals this
weird new kind of gray area of the law when
(21:49):
it comes to finance. And, as has been mentioned, technology
will always outpace the law. Um, there was just talk
of passing a law in 2023 that will allow the government to tax NFTs, those non-fungible
tokens UM, which is a big deal because people were
looking at that as a kind of like a free
money tax loophole. Uh. Sorry, guys, not gonna be the case. Though.
(22:12):
This is probably gonna be decided more in the courts
than it is with like one instant law, because there's
gonna be new ways around it with technology. UM. Today
I'm not talking about that, though I just wanted to
mention it. Today, I'm talking about the rise of cryptocurrency
scams uh that are getting more and more elaborate and bizarre. UM.
(22:34):
I've got kind of two stories. The second one will
just be like a quick kind of follow up to
the first one. But the first story I want to
talk about is this particular scam that is like something
out of a horror movie. Honestly. UM. Vice reported on
this scam where individuals are hacking people's Instagram accounts UM
and using them to communicate with people in their close
(22:56):
friends circle. Uh in very you know conversation language, saying
hey buddy, long time UM, but been missing you. Hey,
I gotta tell you. I just found out about this
incredible opportunity where you can spend five dollars on ethereum
and you get ten thousand dollars back. Isn't that insane?
It's real and legit bro totes um. And then to that,
(23:21):
you're you're like, you know, I would probably do it.
I might look into it a a little bit or have
a little back and forth. But you know, you'd be
more likely to be taken in by someone reaching out
to you as a friend giving you an inside scoop.
than you would be, you know, from some cold call
or some kind of like email phishing scam or whatever.
Like nobody reads their emails anymore and friends very rarely
(23:42):
email each other conversational It would be something you do
through Instagram or or WhatsApp or you know, any kind
of peer to peer chat. But here's the creepy part. Um.
What they're doing is the people are investing the money,
they're giving them instructions how to do it, and then
realizing it's a scam. Oh crap, that wasn't Johnny at all,
(24:03):
that was some kind of sock puppet or some kind
of bot. And so then this person, Yerry Henfield,
who is a victim of the scam reached out to
the person who had clearly taken control of this friend's
Instagram and said, hey, what gives? Totally messed up, not cool, bro, give me my money back. To which the hacker responds, okay,
(24:24):
I'll give you your money back, but first you have
to make a video where you sound super straight faced
and sincere promoting my crypto scam. But I am then, yeah,
like a hostage style video of you saying I am
so and so and I am of sound mind. The
(24:45):
date is blah blah blah, and I think that this
crypto No, no, no, you seem too shell shocked. Dude,
Do it again, Do it again, do it again til
it's right, and then I will give you your money back.
Oh yeah, and when I give your money back, it'll be confirmed to you via a text that will
come to your phone that you need to accept. Okay,
(25:06):
it sounds good. What do you think that text going
through the phone was? It was one of those two-factor authentication texts that allowed this person access to change the individual's Instagram password, and repeat, no, rinse and repeat. Jeez. Isn't that sinister? Yeah. That's, uh, this guy Henfield. Um,
(25:34):
he got scammed, he told Motherboard, uh, quote, these people
are the definition of sin. It makes me so sick
thinking who else they scammed. My Instagram page is also
a shrine to my girlfriend who passed away almost six
years ago. The page is now a disgrace and it's
you know, it's not something that has been shut down yet.
No one has uh has fully figured out who the
(25:56):
individual is. Let's see, um, you know, this is, this is the part, um, where they realized what was happening with the text. Quote, unbeknownst to me,
it was my Instagram request to gain access and change
my password. And then the scammer takes that video and
posts it as a story on the compromised Instagram account,
(26:17):
to which you know, individuals and that person's friend group
see that and follow up potentially um. And the article
goes on to say the hacker, um, had gone from not only taking money from Henfield to using his own Instagram account and connections to continue the scam on down
the line. Um. Even though the video was only up for
(26:38):
twenty four hours, it did seem to have converted a
few people. He says he doesn't know why they took
it down. Maybe it wasn't so convincing. But stories only
last for twenty four hours, so I think I don't
know if he's confused. Um. But even the original like
patient zero account that they traced this back to Jane
and uh Jane everything. Uh, they realized even that was
(27:00):
a hijacked account. Um. And this is something that's generating
lots and lots of money. Um, and it's just gonna
get more and more elaborate and more you know, hard
to track. And these are these are not you know,
non tech savvy people that are getting you know, scammed
(27:20):
by this. These are like zoomers and and uh you know,
um gen x folks that are like pretty savvy about
this stuff. Um. Even someone who isn't, like, into crypto at all, if they hear someone that they know saying, hey dude, I spent five hundred bucks, you get a thousand bucks back, it's a trusted source, who wouldn't give it a shot? And then at the end of the day it's just five hundred bucks that you've lost. That's like not nothing. That's
(27:42):
certainly a lot of money to many folks. It's a
lot of money to me. Um, probably not worth, you know, going to the mattresses over. Well, it's also not worth taking money out of the proverbial mattress. I don't
mean to sound like a jerk. I think the world
of, um, most of my friends and family, but I have a very short list of people that I would take financial advice from, and I would prefer that to be completely transactional. I want to pay a pro instead of, instead of hearing about the next hot grift. Um. But sometimes, sometimes it does work out. Our very good
But sometimes sometimes it does work out. Our very good
friend Casey Pegram, friend of the show, if you listen
(28:27):
in Ridiculous History, you know him, you love him, if
you've heard him on a couple of other things, very good friend of ours. He was right about Bitcoin. There's no
walking away from that one. Casey nailed it. But that
is that is an exception, and you're right. The leveraging
of trust is so very effective. That's because it's it's
social engineering, That's what it is. It's just on a
(28:49):
new platform. I would strongly advise, um, and I very rarely do advise, but I would strongly advise assuming that any link you get sent as a cold call link is a scam. Just think of it that way and then contact them another way if you think it's legit, like a completely different way, and ask them about the link
(29:13):
and ask them if they can resend it, and if it's legit, they will. And my mom even, you know,
my mom is older, she's in her seventies, and she's not good at computers. Um. She doesn't have an
Amazon Prime account. But she got a telephone call saying
that she owed thousands of dollars on Amazon Prime or
that someone was you know, using her Amazon Prime account.
And she called me and asked him that. I'm like, Mom,
(29:35):
you don't have an Amazon Prime account. You use mine?
Like she just doesn't have it. So it's not a thing. Um,
but that you know, it's like older generations are being
scammed over the phone because they still kind of trust
the phone. You know, people in our generation couldn't trust
anything less. I mean, I I am like suspicious when
I get a phone call. You know, I think most
(29:55):
of us feel that way. Everyone's like, you know, just
texting or or social media. Um. But here's my follow
up really quickly. It's gonna wrap it up with some
good news. The good news is that UM the blockchain,
which is the you know, open source record that tracks
all of these transactions. It it does just that it
is like a breadcrumb trail of all of these transactions
(30:18):
that if you know, in the right hands, through the
right lens, can lead to the perpetrator. And there are
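As a rough illustration of what following that breadcrumb trail can look like, here is a minimal sketch using the web3.py library (v6-style method names). The RPC endpoint and suspect address are placeholders, and real sleuthing leans on much richer tools such as block explorers and exchange attribution:

```python
# Minimal sketch: listing outgoing transfers from a suspect Ethereum address
# over the last few blocks. The RPC URL and address below are placeholders.
from web3 import Web3

RPC_URL = "https://example-ethereum-node.invalid"        # placeholder endpoint
SUSPECT = "0x0000000000000000000000000000000000000000"   # placeholder address

w3 = Web3(Web3.HTTPProvider(RPC_URL))

latest = w3.eth.block_number
for number in range(latest - 100, latest + 1):            # scan ~100 recent blocks
    block = w3.eth.get_block(number, full_transactions=True)
    for tx in block.transactions:
        if tx["from"].lower() == SUSPECT.lower():
            eth_amount = Web3.from_wei(tx["value"], "ether")
            # Each hit is one breadcrumb: where the money went, and how much.
            print(number, tx["to"], eth_amount)
```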
And there are folks, or a community of folks, on Twitter that are, like, you know, becoming these modern day digital vigilantes who are hunting down and exposing these crypto scammers. Ooh, digilantes,
(30:38):
can we do digilantes? That was a missed opportunity. Thank you, Ben, for saving that one. Um, one
in particular, there's a really great article about in Wired, uh. The article is by Gian M. Volpicelli, um, and it talks about a particular one of these digilantes who goes by the
(31:00):
handle, I'm thrilled about this, Gabagool dot eth, which is a reference to the, uh, very popular, um, cured meat, uh, that is made famous, you know, in Italian cuisine, but also in The Sopranos. That's how they say it, gabagool. It's actually, I think, capicola, maybe, would be how a layman would say it, but it's
(31:21):
gabagool in The Sopranos parlance. Um, and eth is a reference to Ethereum, which is the type of cryptocurrency that's typically used in, um, these NFT exchanges, um, and a lot of decentralized finance, um, operations. And just really quickly, decentralized finance is, like, this peer-to-peer, middleman-gets-cut-out kind of way of funding projects. It could be around
like funding projects that we could it could be around
a particular type of token, it could be around a
particular type of you know, charitable contribution or something like that.
There there's different tons of different ones, and many of
them exist in these UM smart chains that are like
kind of separate you know, sub block chains of different
(32:02):
current different cryptocurrencies. But the point is a lot of
scams are going on. I talked recently about a rug pull operation where, like, you know, these influencers in the UK, where I think it was called, um, uh, CryptoEats. It was like, you know, an answer to Uber Eats that allows you to pay with crypto. The whole thing was an absolute sham. All of the influencers, they were scammed too,
(32:22):
but it also showed that how a lot of influencers
will just take the money and shill for whatever,
you know, not even really thinking through if it's real
or not, or if they even care. And you made a good point that, you know, if it sounded cool, if someone was reached out to and everything seemed, you know, legit, um, maybe it wouldn't be such a stretch to take the money and do
(32:44):
a little positive video about it. But I think that's
not I think that's the exception and not the rule.
I think a lot of folks just take money and talk
about whatever, not even caring who it affects or you know,
who it hurts. Um. And that's a big problem in the decentralized finance world. And there's one in particular where some of these projects do these things
(33:04):
called token drops, where, like, if you've invested a certain amount of capital into these projects, you will be rewarded occasionally with free tokens that have, like, a time limit on when you can cash them out, um, to keep them from, like, messing with the value of the tokens. But it's just
a way of rewarding people that are kind of early adopters.
(33:26):
And there's this one situation where Gabagool noticed that a company called Ribbon, or a DeFi project called Ribbon, um, did a token drop, and then one particular user was able to cash out all of those, uh, those tokens, um, from duplicate accounts and then convert them to Ethereum.
(33:47):
I think it was somewhere in the neighborhood of two
point three million dollars that they stole, well, quote unquote stole, um. They, you know, they broke the rules. They essentially had, you know, advance knowledge of this token drop, so they were able to time it right and game the system, essentially what the SEC would call insider trading, which is illegal.
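The pattern reportedly spotted here, many supposedly separate claim addresses all draining to one wallet, is the kind of thing that falls out of a very simple aggregation once the claim transactions are in hand. A minimal sketch with invented addresses, not Ribbon's actual data or Gabagool's actual method:

```python
# Minimal sketch: flagging an airdrop where many claimant addresses all
# forward their tokens to the same destination wallet. Data is invented.
from collections import Counter

# (claimant_address, destination_the_tokens_were_sent_to)
claims = [
    ("0xaaa1", "0xdead"),
    ("0xaaa2", "0xdead"),
    ("0xaaa3", "0xdead"),
    ("0xbbb1", "0xbeef"),
    ("0xccc1", "0xcafe"),
]

destinations = Counter(dest for _, dest in claims)
for dest, count in destinations.items():
    if count >= 3:  # threshold is arbitrary; real analysis is more involved
        print(f"{count} 'different' claimants all cashed out to {dest}")
```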
(34:11):
Two point three million, which isn't like a ton of money, considering that DeFi
projects are now worth two hundred and fifty billion dollars total.
But um, they actually this person traced it back to
an individual who worked for the parent company quote unquote
of this uh, this DeFi project UM, and they essentially said, Okay,
we didn't really break the law, because there's a lawless
(34:34):
world out there. Um, but we did cross a line. Uh.
The employee has been reprimanded and we will now return
those funds. Um. But again, that all happened because of
this, you know, blockchain sleuthing from Gabagool, and there's other ones, Zach, Sisyphus, um, there's a handful of these, you know, Twitter, kind of, you know, crypto detectives that
(34:58):
are out there doing God's work. And I think it's cool.
I agree. So, um, there you have it. I'm gonna keep an eye on this still, but I'm glad. I'm
glad to know that these uh Internet superheroes are out there,
you know, trying to keep people honest, a balance for
all things. Indeed, let's take a break and then we'll
(35:18):
be back with one more bit of strange news. Welcome
back to the show, folks. This is the part we
told you about at the top. This is the juice.
This is the umami, this is the creepy stuff. Uh.
(35:40):
This is something uh, this is something that I was
on the fence about bringing and uh checked checked with
the team beforehand. But perhaps one of the most important
things at the top here is to say the following.
If you are listening along to this show with your children,
fellow conspiracy realist, this may not be appropriate for them,
(36:00):
but also thank you for starting them young. It's very
important thoughts. UH. If you are somewhat skittish about graphic material. Uh.
If you are not the kind of person who thinks
gross things are funny, then this may be where you pause
the podcast. This is why we wanted this story to
(36:20):
be the story at the very end today. And then, um,
even if you're not interested in it, fast forward a
little bit because there is a special message I have
at the end. But here's the, here's the headline. Matt, Noel,
You guys know about this. We have talked a lot
about deep learning, about neural networks, about applying AI in
(36:42):
a generative sense. Uh. The probably the version of this
that's most familiar to people is actually a really cool
meme slash joke, wherein someone says, hey, I've fed thousands
of hours of Batman or sign fell Old, or um,
it's always sunny in Philadelphia or a president's speeches into
(37:05):
this AI program and it generated its own thing, and
then waca, waca waka. Look how wacky this is. There
are great examples of it. There are many are quite enjoyable.
The vast majority of those, of course, are written by
clever comedic people rather than machines. At this point, to
get into this, UM, I'll say something that is UH.
(37:30):
I don't know if we talked about too much on
the show. So we know that war drives innovation, right.
War drives science, war drives medicine, war drives transport, war
drives economies. UM. Sex also, in a very real way,
drives technology. For anybody old enough to remember VHS tapes, uh,
(37:51):
the reason VHS became like a world standard is because
that's what pornographic studios went with. And so we didn't
say this out loud when we've been talking about Ai.
We're talking about oh AI. It's kind of like a
child soldier, which is disturbing and true. We didn't talk
about it when we talked about all the beautiful potential
(38:12):
it has for art or the weird way it explores ethics.
But it was only a matter of time, folks, before
someone gazed upon the new exciting realm of the synthetic
mind and said, can we use this to make porn?
And that's what happens. That is exactly what has happened. Ah.
(38:34):
We have a guy named Eric "shardcore" Drass to thank for this. Uh. He is the creator of something called The Machine Gaze, which uses machine learning tools, uh, to create, to generate, um, what this software thinks is pornography,
(38:56):
just like CGI creatures, like CGI figures.
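As a rough sketch of the general idea behind a project like this, here is a tiny PyTorch generator network that maps random noise to images. This is not Drass's actual code or model; the layer sizes are arbitrary and the adversarial training on a corpus of adult imagery is omitted:

```python
# Minimal sketch of the general idea: a generator network mapping random
# noise to images. Layer sizes are arbitrary; training is omitted.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, 128, 4, 1, 0), nn.ReLU(),  # 1x1 -> 4x4
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(),          # 4x4 -> 8x8
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh(),            # 8x8 -> 16x16 RGB
        )

    def forward(self, z):
        return self.net(z)

# An untrained generator already "hallucinates" images from noise; training it
# adversarially on a corpus of images is what shapes the output.
generator = TinyGenerator()
noise = torch.randn(1, 64, 1, 1)   # one random latent vector
image = generator(noise)           # tensor of shape (1, 3, 16, 16)
print(image.shape)
```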
I'll show you, I'll show you. Okay, okay, alright.
So I am sending you guys in the chat here,
I am sending you a very short video. It's about
three minutes. We can just have it playing in the background of, um, what the machine learning algorithms seem to
(39:19):
think is intercourse. And I just want to be honest,
I don't know who the audience is. I don't think
it's humans. The music does not help. But, uh, but am I hearing a heartbeat? Yes, yeah, a drone. Uh huh. It's a little David Lynch, right? Oh my god, playboy.
(39:45):
I would totally project this like behind like a psychedelic
rock band. This is like something you'd see in a gallery, you know, like it's called Four Hands or something. It even looks like a digital art project. Yeah. I want to, no, I
don't see anything sexy about this at all. It's just cool. Yeah,
(40:06):
we're we're not the target audience again, I think, but
it was only it's weird because it was only a
matter of time before something like this happened, right, I mean,
that's just how people work. So here's what, here's what Drass finds interesting about this. He says, by turning the
(40:27):
AI on the domain of porn, you make imagery
that's not pornography but is fleshy. You can see the
roots that it's come from. He says, it starts to
trigger the similar parts of your brain, but in slightly
the wrong way. So a bit of an uncanny valley, dreamlike quality. Matt, your facial expression here is priceless, dude.
(40:52):
I am why I'm strangely finding myself becoming titillated and
aroused watching this video. I'm so yeah, okay, So here
for him. The underlying question, there's another quote, is what
new machines with no concept of sexuality or sexual behavior
see when they look at this content. And he fed
(41:14):
a lot of content will call it into these programs. Uh,
it's for me. What's strange about is it's it's a
raw shack of flesh, is really, I think a way
to describe it. And the the dreamlike quality is a
huge part of it is due to the music. I
(41:35):
wonder whether Drass had experimented with generating audio as well, and that was just too weird, like these not quite human kaleidoscopes and sonic fractals of moans and exclamations. That's what would have occurred, I imagine. And this, um, that sounds amazing, AI-generated sounds from that same data too. Well,
(42:02):
I will tell you watching this, we don't have the
sound off, UM, but if it had like a sexy
saxophone on top of it, I think that might really
take it to the next level. Listen, we're we're just
gonna do this real time for a second, just to
get the listener an idea. Okay, we're just gonna sample.
You're ready to see if you can hear this from
my headphone? You ready, know? Man? Yeah, go for it.
(42:27):
Very David Lynch, that's very David Lynch. All the flesh, all the flesh must be sacrificed. This is the new flesh they were talking about. Wouldst thou like to live deliciously? Uh, um, all the boundaries? Yeah, so, so we predicted, yeah,
(42:48):
I think civilization predicted the emergence of deep fakes, and they're still kind of emergent. But this is something different, because in a way, this is, um, this is like asking someone who's never seen something, someone who's
(43:10):
been, who hasn't been sighted since birth. This is like
asking them to describe an image in a dream, right,
there's not a frame of reference. And it immediately makes
me think of the Deep Dream software and the neural network kind of uncanny valley, psychedelic, kind of, you know, hellscape that was those videos, which I love. I
(43:31):
think those are super cool. They've sort of gotten a
little played out and almost like a little cliche at
this point, but at the time I thought it was
really neat, And this really makes me think of that,
only it's all just flesh and intertwined kind of bodies. Yeah,
and this is um. I I do think that this
is important work because reproduction is biologically the primary goal
(43:55):
of the human species. Everything else is just sort of
extracurricular stuff built around that. Right, And uh, it makes
sense that this fundamental, this potentially fundamental shift in civilization
would address one of the fundamental bedrocks of civilization as
it is commonly understood. Uh, the question is does the
(44:20):
do the programs, do the algorithms, understand what they're doing?
That's that verges into philosophy, and we're already kind of
in the deep water of poetry and poetic thought here, or poetical science, as Ada Lovelace would say. The next, um, the next question is, is this, is this truly generative? Is it only referential? Has Drass been steering it? Like, what kind of, did this human individual's predilections play
What kind of did did this human individual's predilections play
any role in it? Like? Uh, I don't think it's
offensive for me to point out, since we are in
the adult proportion of today's show, that it was based
entirely on Quentin Tarantino taste. We might just see a
bunch of like feet, right. That was that too low
(45:05):
a blow, not at all. I mean feat picks are
like internet currency, so weird. Well, you know, if you're
not hurting anybody, go forth, do your thing. Uh, live
your life. But this, this was fascinating to me because
of what it means for the future. This means that
there is a world in which I'm bringing I'm going
(45:28):
somewhere here that's not gross. This means there is a
world in which my dream of AI generated films and
scripts becomes even more possible, increasingly plausible. There could be
I like, think past the pornography and whatever silly taboos
humans have around that stuff, and think about the world
(45:49):
in which you could say, I want I want to
film computer. I want to film with Kianu Reeves and
uh Marilyn Monroe, and I wanted to be about the
night that Edgar Allan Poe died, and I want it to be, um, a musical, kind of like, you know, West
(46:13):
Side Story or Hamilton. And then there could be a
program that did that. This is a within our lifetimes
possibly thing, if humanity keeps it together. Um, yeah. Well, do you think stuff like this can replace, like, human imagination? And, like, do you think we'll have AI that will be smart enough to, I mean? I
think it requires a shift in what we expect out
(46:36):
of a story. Like I think, on a long enough timeline,
maybe a computer could generate a story that the society
of that day would consider acceptable. But I don't think
they could ever be as robust and imaginative as the
stuff we I mean, even like content has gotten dumbed
down over time in our lifetime and like it's it's
it's much more remarkable when something truly original and thoughtful
(46:59):
comes up, whether in music or film or television or whatever,
because so much stuff is basically algorithmically generated, you know,
by the Netflix machine, Like what kind of content do
people want? It's what you just describe, Ben, perhaps, Um,
do you think this is just going to create base
kind of broad stuff that would just be like the
(47:21):
lowest common denominator? Or do you think this has the
potential for like creative expression? I'm very glad that you
brought up the concept of imagination, because when humans think
of how to define success for the creation of a
machine consciousness, there's often a very easy, very self referential
(47:41):
error to make, and that error is thinking that the
aim should be for verisimilitude, you know, for something
that seems very much like the human mind, very much
capable of the human imagination. But I posit that that is not success. Success, if success is to be defined in this mold, is a completely original imagination, you know
(48:04):
what I mean. It's AI that is making stories
for AI that do not necessarily involve humans or the
human mind. And if civilization can make a truly distinct
creative imagination, then you know, maybe, uh, maybe all the
horrible stuff between the first time people discovered fire and
(48:27):
the day that happens will be in some way worth it,
which I'm trying to be optimistic. I'm not super great
at this. Totally. And that was beautifully said, Ben. It really was. But now I'm thinking what we need to do, Ben, is get images of motherboards and CPUs and feed
all of that into the system and then see what
(48:47):
kind of real machine gaze we can get. This gets us in weird places? Yeah. What would, what would, like, pleasurable, I don't know,
What would the equivalent of machine learning pornography be? Would
it be math? Would it be like like sexy math?
(49:09):
Is that a thing? Would humans even get it? You
know what I mean? Would you be like, oh no,
I caught my personal assistant. I caught Alexa doing the
dirty calculus again. I don't know. It's a weird
world to live in. But just like typing sixty nine
over and over. Yeah exactly again, like, well, what's the
frame of reference for what it is unless it thinks
(49:30):
about it in a human world? Um, from sin to sine and cosine. Yeah, exactly. I heard a really
good point about sort of this. Um. I've been really
getting into this podcast called Blank Check with Griffin and David. Um, it's like a film kind of critique podcast.
But they were talking about the movie, uh, Interstellar, and
(49:51):
if you'll remember, there's a robot in it called TARS that's really neat, because it just kind of looks like a wall, but it also, like, sort of, like, pops out with these little appendages that can, like, do different things.
And the thing, the point they made was, why would, if we were going to build an android, as we've seen in every science fiction film ever since the beginning of, you know, cinema, why would we make
(50:12):
it look like a person? We already have a person.
We have people that can do people things. If we
were going to build an effective android, like for like
war or something, why, we should build a form that we are incapable of replicating with a human body, like a moving wall that can generate, you know, arms, and become different shapes as needed. That kind of outside
(50:33):
of the box thinking, I think requires I don't know,
human mind in some ways. This is interesting. I think
it was. I thought it was a really good point.
Could also be a medium sized canine, yes, very much so. Uh,
one capable of carrying serious firepower on its back. Uh.
This this is the world in which you listening today
(50:54):
and your descendants may very well live in soon. I
would love to hear your take on this. Is this
over all a good thing. Is this just the ego
level set that human civilization needs? Or is it the end?
Is it the end? I I am hoping that machine
imagination and human imagination will be different enough that both
(51:17):
can coexist and perhaps even in some way complement one another.
And I'm trying to be optimistic. Um. And while I'm
being optimistic, just to close this out on air, I
wanted to thank everybody, and Matt and Noel and Paul,
I wanted to thank you guys specifically. I was out
um previously, and you guys held down the fort with
(51:41):
a cavalcade of voicemails from a very special listener mail segment. Um, I am going to, maybe towards the end of the year, talk a little bit more about what went on backstage.
But everything is fine, as the dog says in the
Burning House. Uh. And we are all collectively grateful for
(52:05):
every single one of your conspiracy realists who have tuned in.
You are, again, the most important part of this show.
I never get tired of saying that we can't wait
to hear from you. What's your crypto scam? Don't send
us the link? What are you? Uh? What are you
thinking about when you think about workplace surveillance necessary evil
(52:25):
or just another step towards a big brother society, as
Matt said, a nanny state. And what is the future
I mean, once we get past the pornography, what is
the future of imagination in the mind of a machine.
We cannot wait to hear from you. We try to
make it easy to find us online. Oh, the internet is rife with Stuff They Don't Want You To Know
(52:47):
communication options. You can find us on Facebook, Twitter, and
YouTube at the handle conspiracy stuff or conspiracy Stuff show
on Instagram. Yes, you can use your mouth to contact us,
and your phone. Call eight three three S T D W Y T K. Let us know if we can use
your name and voice on the air. Give yourself a
(53:08):
cool nickname. And that's a part of the whole scenario too.
And you've got three minutes say whatever you'd like anything
in the whole wide world. Use it how you will. Well,
except for machine porn, let's not do that. Let's not do that anywhere. No drones, no fleshy, navel-created knuckles.
But it's also hard to do on an audio format.
(53:31):
But if you've got too much to say that won't
fit into those three minutes. Please, please, please send us
a good old fashioned email. We are conspiracy at iHeart
radio dot com. Stuff they don't want you to know.
(54:01):
Is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.