
June 7, 2021 51 mins

It appears the government of China is testing a camera system that uses AI and facial recognition to reveal unspoken emotional states in the oppressed Uyghur population. After a creepy personal experience with social media, privacy expert Robert G. Reeve busts the myths about our phones hearing us -- and reveals, instead, what he sees as the much more disturbing reality of automated online stalking. All this and more in this week's strange news.

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. A
production of iHeartRadio. Hello, welcome back to the show.

(00:25):
My name is Matt. Noel is on an adventure today
but will be returning soon. They call me Ben. We
are joined as always with our super producer Alexis code
named Doc Holiday Jackson. Most importantly, you are you. You
are here, and that makes this stuff they don't want
you to know. It is the top of the week,

(00:47):
right after quite a wonderful long weekend for
us here in uh, not just in Atlanta, but in
Detroit and in New York, in the various places that
we find ourselves scattered. Uh, as usual, Matt, as we
like to do, you and I have scoured the internet.
We have, uh, we have dug deep into the weirdest

(01:11):
stories that often don't make it to the mainstream spotlight.
And one thing that I thought was interesting is you
and I both went with some
scary tech futurism, some scary tech stories. But
before we go, you know, I was thinking, because it's

(01:32):
just the four of us. Oh no, no, that's fine, Yeah,
go for it. I would feel like I was being
rude if I didn't back you up on that. We
can make it if we try. Uh, what's the other way?
It's just the two of us on mic currently. Unless
Doc, you know, gets a wild hair. Mm. Yeah, if
we say something that especially infuriates code named Doc Holiday,

(01:57):
she will not hesitate to put us back in line.
I said, oh, oh no, wait, we've got video and
she's gone. That was a close call. And
that was, that was Doc herself laughing there. Uh, so,
what I'm thinking we could do, Matt is let's explore
our stories together and then maybe if we have time,

(02:17):
if we have sort of a third act, we could
uh throw some headlines at each other, because I'm sure
there's a lot of stuff that you see that just
doesn't make the cut, to be your chosen one, or almost
a chosen one, uh, your, your chosen one
for the week. How do you feel about that? Sounds great.
Did you read the one about the sting rays in

(02:39):
the little touching area of a specific you know, the
place where you can actually touch the sting rays
in aquaria. No, I didn't. Okay, well, we'll talk about
it later. Well, now we have to do it. Did
you read the one about the rogue AI drone? No? Okay,
all right, well okay, we'll do the show first and
then we'll do the rest of this. Okay, So we

(03:01):
we said we picked two uh pieces of scary tech reporting,
and Matt, I've got to be honest, mine is going
to be familiar to quite a few of our fellow
conspiracy realist on the Here's where it gets crazy Facebook page. Uh.
This is a story about social media and big data collection,

(03:22):
and your story is I believe, also about technology that's
going to become increasingly dangerous and pervasive in the
next few years. And your story also deals specifically
with some problems that we have pointed out in, uh,
emergent AI tech. Right. I, I don't want to go

(03:44):
too much further without spoiling the surprise, my old friend,
Do you have a preference for what terrible thing we
learn about first? May I go first? And
it's only because it's a specific hardware software combo that's
kind of one piece of the larger story. That is
what you are going to talk about. I think let's

(04:05):
do it. Okay, so we'll build up. Well, I'll read
you what I've got here. This is the title of
the article: AI emotion-detection software tested on Uyghurs. This
is from BBC News, written by Jane Wakefield. I believe
it's May twenty of this year, and what I
want to talk about today isn't necessarily about the Uyghur

(04:28):
population and their treatment, um, by Chinese authorities and, you
know, the government there. This is a very important
aspect of this story, and it is worth going into
and diving into. What I really want to talk about
is the tech that is emerging here, um, and
the implications that it could have. So this is what

(04:50):
I will, I'll read you directly from that BBC article.
This is what it says: a camera system that uses
AI and facial recognition, intended to reveal states of emotion,
has been tested on Uyghurs in Xinjiang, the
BBC has been told. Now, this is important. The BBC
was told this, and they're not revealing their sources. It

(05:10):
says a software engineer claimed to have installed such systems
in police stations in the province, and the software engineer
agreed to talk to the BBC's Panorama program under the
condition of anonymity, and this is because he fears for
his own safety and the company for which he worked
is not being revealed. It was not revealed in that
entire article, but he did show Panorama five photographs of

(05:34):
specifically Uyghur detainees he claimed had the emotion recognition system
tested upon them. So, first of all, before we even
jump into this, this specific article and story that's being
told to BBC Panorama is being told by, you know,
a source that the BBC has chosen to trust with

(05:55):
that information. But, you know, as a listener, as a viewer,
as a reader, we have to then trust the
BBC as an outlet to have made that decision that, okay,
this information may be legitimate, but we, you know, we
don't have a specific name that can be cited, a
source that can be cited, so we have to take it,

(06:15):
I don't know, as a little less than absolutely true.
Does that make sense? Sure, we have to at least
be cautious about being overly credulous, right, That's that's something
that's tricky with any kind of sensitive reporting, because you know,
if you're a reporter, whether you're Seymour Hersh or a

(06:35):
cub just starting out. The last thing you want to
do is compromise a source, because it means other people
won't trust you in the future, and in this case,
it means the source could be detained, uh, incarcerated, tortured, murdered.
These are all options that are on the table. There.
Those are all options on the table. And simultaneously, we

(06:56):
cannot rule out that maybe, and again, I hate to even
say this, but we cannot completely rule out that it's
some kind of propaganda story. You just can't. So,
again, I guess you could assign probabilities
to either of those scenarios, but in this case we
just have to be aware that they're all possibilities. But

(07:16):
let's jump into what the article says. What information was
given to BBC Panorama by this anonymous source? This person,
the source, showed Panorama, like we said, five
photographs of these detainees that supposedly underwent, um, experimentation in
some way, or testing, with this new software-hardware combo.

(07:38):
And what was displayed, at least for the folks, there
were these pie charts that represented what the system was
seeing when it was using this new facial emotion recognition
software on each individual. And it's a pie chart, and
what it represents essentially is this person's state of mind,
at least that's what it purports to represent, and what

(08:02):
it's looking for are negative emotions, things like anxiety, stress, anger,
anything that could be considered negative. They are searching this
person's face, their temperature, all kinds of biodata on
this person to see whether or not they could

(08:23):
be trouble, right, whether or not they're suspicious. And that's
specifically what it's looking for: is this person suspicious? Right. And
this is going beyond your typical facial recognition match, which
uses still photos, because when you're, um, when you're
capturing video like this, you greatly expand the number of

(08:45):
things that you can measure. So we're also talking vocal tone,
we're talking you know, you and I were talking about
micro expressions off air earlier today. It also will be
able to check micro expressions. They're a real thing that
most people have unless they've been trained to try to
suppress those expressions, those reactions, those little facial tics. But

(09:06):
even if you have that kind of training, which again
is a real thing, uh, you probably wouldn't be able
to fool a system like this if it had enough
stuff to measure and if it had enough other faces
to reference. Yeah, you're right. Uh. Essentially, what you need
for this to function is time in front of the system.

(09:26):
You need to have a human like face and be
in front of the system for you know, longer than
a few seconds. And this anonymous source is saying that
they installed this system at prisons, with an s, and
that it was specifically being used on the Uyghur population.
And again, as I said before, I don't want
to jump too deep into that situation between, you know,

(09:50):
the Chinese government and the Uyghur population, but I
would just say that it's pretty obvious that multiple
sources have claimed, and seem to have confirmed, the
mistreatment of the Uyghur population by the Chinese government. That's
at least the way it appears at this time. Yeah, Yeah,
you're right, Matt. Uh. The quick and dirty version of

(10:11):
it is that the Uyghur population is considered one of
China's fifty-five officially recognized ethnic minorities. They are culturally
distinct in many ways. You know, they have lived in
the region for a long time, they're technically a Turkic
ethnic group. But when we say culturally distinct, we mean

(10:33):
like, a majority Muslim. Uh, the cuisine is different. They're,
they're not Han Chinese. And since about, the Chinese government
has been, if we're being extremely diplomatic, uh,
engaging in a policy of total non-consensual cultural assimilation.

(10:55):
So, things like secretive internment camps, lack of legal process.
You know, you and I have explored various aspects of
this story in the past, everything from allegations of organ harvesting,
which goes back to your how much can
we trust question, right? Allegations of organ harvesting, the detainment camps. Uh,

(11:18):
those have been proven. There are real threats against journalists, forcing
people, forcing children as well as adults, to learn
the language, Mandarin at least, and then, um, even
to the point of sending Chinese military members to live
in a Uyghur family's home after one of their parents

(11:41):
has been detained, just to keep an eye on people.
So it's, it is an unsustainable situation.
The majority of the Uyghur population feels that the overall
Chinese government is attempting to erase them from history, from
the present, and from the future. Now, the Chinese government

(12:02):
of course does not agree with that description,
but they are outnumbered by the multiple agencies who allege
that at least some of those practices are going on.
So you can understand how people who were already concerned about
the Uyghur population would be, uh, pretty terrified, pretty spooked
by the idea of adding this nascent, already problematic technology

(12:27):
to the mix. And that's something I think a
lot of people explore, you know, when you come to
the concept of what is race to an AI or
facial recognition algorithm, right, and we know there are major
issues with a lot of the existing technology for facial
recognition because of those very things. And there are specifically

(12:49):
major issues with this version of facial emotion recognition, or
emotion recognition software, um, and it has to do with
how it was tested, at least according to this anonymous
source who spoke with the BBC. This person said that
test subjects were placed into restraint chairs, which sounds very

(13:10):
stress-free already, just calling it a restraint chair, right, which
are widely installed in police stations across China. According to
the BBC, uh, quote, your wrists are locked in place
by metal restraints, and the same applies to your ankles. Again,
this is like a day at the spa. Sounds the
same to me. Uh, wow. Then what they do is

(13:31):
they use the AI system, you know, pointed at your
face and detect everything from your facial micro expressions as
you said, Ben, to the way your pores adjust, because
it can see that deeply into you, um, and, you know,
like you said, temperature, all these other things. And it's
really tough to even imagine this being real because according

(13:52):
to the source, the whole reason for this is to
be able to provide, quote, pre-judgment without any credible
evidence, or without evidence, so to be able to, uh,
pre-cog, right, pre-cog crime and/or threats to
a location or individuals. That's what this whole thing is about.
Or to determine, well, it will probably start with determining evidence

(14:17):
of a past crime, right, you know. The lie detector
idea is, I think, where they're going, where they're definitely going
first, right? Show you a picture of something, an explosion maybe,
and then ask how you feel about it. And,
well, that's certainly one way
that it's being used. I think the other issue is

(14:37):
that it's not just police stations. It's also being used in,
uh, assisted living facilities, in some schools. It's being used,
uh, just at random police checkpoints, you can set one
of these up, or when entering a large corporate building.
These systems have been tested in several different places. Again,

(14:58):
it's not all the same systems, not all the same
software and/or hardware, but there are various systems like
that that are meant to do the same thing and
just see how you're feeling. Yeah, you know. One of
the interesting things there to me was that there's a
list of emotions that they're looking for, right, beyond just
the pie chart stuff. There's a project manager who speaks on record

(15:20):
in an article quoted by The Guardian, and
they talk about, they have literally a list of emotions,
and one of them really stood out to me, man:
boredom. I don't think boredom should ever be listed
as a crime. I think it's okay to be bored.
I think there I don't think anybody should ever be

(15:41):
bored, because there's always something interesting to do, right? But,
uh, but I think it's okay if that's, like, the
choice you make or that's what you're going through. And,
like, none of this is happening in a vacuum.
There are multiple reports that Uyghur populations already have to
give DNA, have to get facial scans out

(16:02):
the wazoo, and then also have to download an app
on their phone. Now, if they
don't carry a smartphone, they may be seen as suspicious
because they may be seen as trying to avoid that
app and that tracking. So this is like, if there's
an emotion that's not terror that these folks are experiencing,
I say, let them have it, you know, let them

(16:24):
be bored if you have that luxury. People don't realize
boredom is a tremendous luxury. Imagine the other implications if
it's not just boredom, if it really is anger that
they're looking for, or just stress in general.
Let's say you're Tokyo and you've got
an Olympics planned and you want to have a way
to possibly check and see if anyone means other people

(16:49):
harm when they're entering a large you know, stadium or
facility or something like that, and you potentially could. I
can imagine, you know, an authority really liking
the possibilities that the software, or the system,
would provide. But then you imagine it in train stations,
and then you imagine it in airports, and you know,

(17:09):
at your job and at your kids school, and you
just imagine the world looking at you and judging if
you're having a bad day, and possibly if you are
having a bad day, maybe you're a terrorist, maybe you're
you know, a potential threat to somebody else. And that's absolutely,
I mean, that's absolutely what's happening. And I just want

(17:30):
to bring this up with you because I know you've
probably thought of this too. Cultural expressions of things are
not universal across the human population, uh, nor within the
human population, right? So what about misidentifying people, right?
What if you got the wrong angry bird? And then

(17:50):
what about people who are neuro-atypical, or for
some reason don't display emotions in, you know, in
a way the AI would expect? And I'll
say this just so I can be a
bit lighthearted about something that is terrifying and dystopian

(18:13):
very much on the way to you, by the way,
if you live in the US. What
if you have the good old RBF? You know, I'm
talking about the resting b-face, sure, resting belligerent face,
we'll say. So, like, you know, I'm sure this
stuff is supposed to have, in theory, safeguards that might

(18:35):
just say, oh, that person is not furious and angry,
they're just like this, it's Robert De Niro. You know,
I'm joking a little bit, but their mouth just goes down. No,
for sure. And again, if I'm in an
off mood and I'm walking by a random kiosk, uh,
I don't want to get tackled by some security force. Anyway, um,

(19:01):
it's a weird, it's a weird situation, right?
The last thing I want to say here, Ben, is
that this tech, at least according to the Guardian, and
you can read this in their article 'Smile for the Camera:
The Dark Side of China's Emotion-Recognition Tech.' You can
read about how the markets, the global markets, view this industry,

(19:24):
and there's a quote down here that it's being forecast
globally to be worth nearly thirty-six billion dollars,
this whole facial recognition, emotion recognition industry, big, big
emotion detection. And you can actually, you can read a

(19:46):
ton more about that as well if you head over
to GlobeNewswire, out of Toronto. They've
got an article called 'Emotion Detection and Recognition (EDR)
Market to Reach $33.9 Billion,' and
you can read all about it in various pieces of
information. Well, it's all Allied Market Research, a

(20:10):
lot of it is, anyway. It's just, that's a lot
of money, because I think the world and the markets
at large understand how excited security firms would be to
get ahold of this. And there are other uses, too,
not all of them damaging. You know, I'm fascinated
to learn about this technology. You know, I think you

(20:32):
and I talked about it earlier. One of the big
problems is that we're building these things that are
increasingly robust and intelligent, but we're building them too often
for what critics would call specifically warlike or punitive purposes.
You know, I'd like to point out that in another

(20:53):
world and another reality, uh, this kind of technology could
be very useful for innocuous reasons, for an acting
class, for example, right? Because, don't think that
I am reaching, I am free. Um. It's a shame
it won't be used for that, but you could make
you could help some people become pretty pretty talented actors

(21:18):
by training them to recognize and then later to mimic,
uh, you know, the proper human emotions, which a lot of
us do anyway, like, trying it in the mirror. So
I think that would accelerate it, because it's kind
of a biofeedback, right, like the way you can
learn to sing, um. But that's not what it's going

(21:39):
to be used for for a while, in my opinion. Yeah, no,
I agree right now, at least according to this one source,
it seems to be used to keep control over a
minority group inside mainland China. That's what it feels like,
that's what it seems to be used for. Anyway. Uh,

(21:59):
let's move on, let's talk about something else. I'm done
with this, you know, saying all of this while a
camera is just continually taking pictures of my face and
of your face and Doc Holiday's blank screen. It's just,
I don't like it. It's not done with you, Matt,
not by a long shot. All right, here's our sponsor.
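An aside on the mechanics described in this segment: the pie charts the source showed Panorama amount to normalized per-emotion scores, with a subject flagged when "negative" emotions dominate. Here is a minimal sketch of that kind of scoring. Every label, score, and threshold here is invented for illustration; the real system's internals are not public.

```python
# Hypothetical sketch of the scoring described above. The emotion labels,
# example scores, and threshold are assumptions made for illustration only.

NEGATIVE_EMOTIONS = {"anxiety", "stress", "anger"}

def emotion_pie(raw_scores):
    """Normalize raw per-emotion scores into pie-chart fractions summing to 1."""
    total = sum(raw_scores.values())
    return {emotion: score / total for emotion, score in raw_scores.items()}

def flagged_as_suspicious(pie, threshold=0.5):
    """Flag a subject when 'negative' emotions take up most of the chart."""
    negative_share = sum(v for k, v in pie.items() if k in NEGATIVE_EMOTIONS)
    return negative_share >= threshold

pie = emotion_pie({"anxiety": 3.0, "stress": 2.0, "anger": 1.0,
                   "calm": 2.0, "boredom": 2.0})
print(flagged_as_suspicious(pie))  # negative share is 6/10 = 0.6, so True
```

The point of the sketch is how little it takes: once a camera produces any per-emotion numbers at all, "suspicious" is just an arbitrary threshold on an arbitrary subset of them.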

(22:26):
and we have returned. So who has not had this experience?
If you have social media, if you have a smartphone, or
your friends have social media, your friends have a smartphone,
you have likely encountered some kind of ad that seemed
like it came out of the blue but knew too
much about you. We can make up the anecdotes. It's

(22:48):
a Mad Lib at this point. Everybody has a story
like this. Let's say you're over at, Matt, give
me an interesting name, please. Richard's Variety Store. Okay, that's perfect,
that's perfect. Uh So you're with your cousin Richard, and

(23:10):
you're at Richard's Variety Store, which is a neat little
store here in Atlanta, and uh Richard is let's say,
shopping for a surprise for their partner, right, and they say,
you know, things are getting pretty serious, uh, so
I'm gonna get my sweetheart a stone sculpture.

(23:32):
You were killing it with the specificity. A miniature
stone sculpture, there's an ad for it on the
website I'm looking at, and a whoopee cushion, so
she knows I'm also into the lighter side of life. Uh,
and let me say, okay, that's awesome. So you buy
these little gifts with your pal Richard, and then you
log onto your Facebook, or, probably, statistically

(23:54):
more and more likely, your Instagram or TikTok, and then
you start seeing sponsored content for miniature stone castles or
Whoopee 2.0, the last word in whoopee cushions.
And then you also start seeing recommended things that tell
you, and some, it almost seems like, with urgency. You know,

(24:14):
you're on Facebook and it says Richard likes Shake Shack.
There's a picture of a burger, and it's snitching on
Richard's diet to you. Uh, and you don't care. Maybe
you guys just went to Shake Shack. Who knows. Most
people have pretty reasonably assumed that this was happening because
social media companies and data aggregators were less than honest

(24:38):
about what they were collecting, how they were collecting it,
and where they were moving it or where they were
selling it, if we're being candid. And so today's story,
with help from our conspiracy realists over at Here's Where
It Gets Crazy, comes from a privacy tech advocate named
Robert G. Reeve, who had one of these experiences, and

(25:01):
he's one of those guys who, uh, living in that
tech space, he hears stories like this all the time,
you know, reported the same way somebody might report a
story about seeing Bigfoot. And I get the feeling that
he was he was pretty often the voice of reason
in the room, saying something like, well, I don't know
if I don't know if your device can hear you

(25:24):
the way that you seem to think it can. So
he recently went on Twitter, Matt, and he unraveled the mystery.
And what I'm hoping we can do with our
time in this segment is to walk through some
of his story and then stop and check in.
So we'll start going through some of the tweets. Awesome, Robert.

(25:47):
Here, in case you are listening, sir, thank you for
the fantastic work. Here's what you had to say. I'm
back from a week in my mom's house and now
I'm getting ads for her toothpaste brand, the brand I've
been putting in my mouth for a week. We never
talked about this brand or googled it or anything like that.
As a privacy tech worker, let me explain why this
is happening. And, for a long time, by the way,

(26:11):
I just assumed the apps were actively listening unless you
explicitly denied microphone access, and even then I thought, oh, well,
how much do I, you know, how much do I
trust it? The answer is zero, obviously. Yeah. Well,
just to be clear, Robert, and I know you're a
privacy tech worker. Now, I'm just, I'm not talking down.
Do you like that, Robert? We know you know

(26:32):
more than us. But you know, Siri and Alexa and
those things are always listening. That doesn't mean they're always
collecting data, right, They're not always sending data away. But
we also do know that sometimes they are sending random
sentences that are not, you know, official. 'Hey Alexa' or whatever, 'Okay
Google' and all that. We've talked about it, but, but

(26:54):
we get what you're saying. Sorry. Yeah. Can you
change the settings on those? I'm somewhat
technophobic in that regard, just because of the tracking. I wonder
if you changed the settings to, instead of, like, saying
'Alexa' or whatever, you could say something like 'I summon you.'
I bet you can. I'm sure you can. That's way

(27:15):
too much fun for that not to be a thing. Yeah.
My favorite way to make changes with our Google
Home is to just turn it off and then unplug
our WiFi. Yeah, same, same. That's what I do
when I'm in your house. Uh. So, you're welcome, man.
I'm looking out for you because you're already asleep, you
know what I mean. So it's it's not me, It's

(27:37):
an issue of responsibility at that point. I appreciate you
letting me get so much rest. Thank you. Uh, just
for the record, I have not broken into Matt's house. Uh,
I left the door open. All right, back to, uh, Robert,
and he says, first of all, he does a
bit of myth-busting here, and I'm grateful for it.
He says. First of all, your social media apps are

(27:59):
not listening to you. This is a conspiracy theory. It's
been debunked over and over again, which you know, as
you point out, Matt, it's not the same as um
the voice-activated surveillance devices, or, you know, fun cylinders,
or whatever you're supposed to call them. He continues with
his tweet, and he says, but frankly, they don't need
to meaning they don't need to listen to you, because

(28:21):
everything else you give them unthinkingly is way cheaper and
way more powerful. Your apps collect a ton of data
from your phone: your unique device ID, your location,
your demographics. We know this. Data aggregators pay to
pull in data from everywhere. When I use my discount
card at the grocery store, every purchase, that's a data

(28:43):
set for sale. They can match my Harris Teeter purchases
to my Twitter account because I gave both of those
companies my email address and phone number, and I agreed
to all that data sharing when I accepted those terms
of service and the privacy policy. Here's where it gets
really nuts, though. Whoa, whoa, whoa, whoa. Here's where it
gets truly nuts, though. I think Robert, Robert's trying not

(29:06):
to, like, use our phrase. Nobody knows what he's doing.
I think that would make our day, you
know what I mean. According to that Chinese facial
recognition software, I have like three emotions a year, and
it would be, it would be awesome
if being pleasantly surprised were one of

(29:27):
those emotions. But I'm saving it, like, you know me,
I like to save it towards the end. Ben is allowed
to be bored three times a year? Is bored
an emotion? I guess so, I guess. I wonder if
you can, I wonder how long someone can be bored.
It's such a foreign concept to me. I don't know,

(29:50):
because at some point doesn't it just become, like, listlessness,
or a symptom of something larger? Like, you can't, you
probably aren't going to go to a psychologist or therapist and say, hey,
I've been feeling bored for five years, I wonder what
it is, and have them come back with a diagnosis
that's just, like, you're super bored. Yeah, and

(30:12):
you know, depression and hopelessness and all those things can
lead one down, I bet, paths of boredom. And I
bet that's why boredom, actually, I just thought about it, Ben,
that's why boredom is tested, because in some of
the prison systems, not to jump back too hard. It's great, great.
But yeah, they really do. In some of the prison

(30:34):
systems, they're specifically looking to see if an inmate has
suicidal thoughts or thoughts of self-harm, or of
harming others, but specifically self-harm, because the rates of
self-harm in some of the prisons in China are
so high, they're attempting to just make sure nobody is
planning to hurt themselves that day. Yes, I see. That

(30:55):
is an astute, an excellent point, Matt. And, you know,
both of these stories are really about the erosion of
privacy and the obsessive tracking of people with the ultimate
goal of predicting their actions. You know, you're not far
off at all if you were thinking like Matt did

(31:15):
about pre-crime. Both of
these things absolutely lead down that path. But this one,
this story now from Robert, is primarily at this point
about messing with your head and priming you to buy stuff.
Not an Amazon Prime reference. But yeah, I guess yeah, sorry, Jeff,

(31:38):
I'm not paying you for that one. I just call
him Jeff. So Robert goes on. He says, here's where
it gets truly nuts. Though. If my phone is regularly
in the same GPS location as another phone, they take
note of that. They start reconstructing the web of people
I'm in regular contact with. The advertisers can cross-reference

(32:00):
my interests and browsing history and purchase history to those
around me. It starts showing me different stuff based on
the people around me, so family, friends, co-workers. Knowing this,
this is just a pause in his tweets, knowing this, man,
just think about how sticky and strange those events can get.
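The cross-referencing Reeve describes here, loyalty-card purchases tied to a social account through a shared email address, is ordinary record linkage. A toy sketch with entirely invented records shows why no microphone is needed:

```python
# Toy record linkage in the spirit of Reeve's thread: two datasets that never
# "listened" to anyone are joined on a shared identifier. All records invented.

grocery_purchases = [
    {"email": "a@example.com", "item": "toothpaste"},
    {"email": "b@example.com", "item": "coffee"},
]
social_accounts = [
    {"handle": "@a", "email": "a@example.com"},
    {"handle": "@c", "email": "c@example.com"},
]

def link_purchases_to_accounts(purchases, accounts):
    """Match purchase records to social accounts via a shared email address."""
    handle_by_email = {acct["email"]: acct["handle"] for acct in accounts}
    return [
        {"handle": handle_by_email[p["email"]], "item": p["item"]}
        for p in purchases
        if p["email"] in handle_by_email
    ]

print(link_purchases_to_accounts(grocery_purchases, social_accounts))
# [{'handle': '@a', 'item': 'toothpaste'}]
```

Real aggregators work at vastly larger scale and join on phone numbers, device IDs, and GPS co-location as well, but the underlying operation is this same kind of key-matching.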

(32:23):
Like, let's say, oh, let's take Richard. Okay, let's say
Richard's miniature stone sculpture and whoopee cushion were a success,
and, uh, let's say Richard got a ton of boyfriend
points for it. He's living high on the hog and
he is scrolling through some social media app or something,

(32:46):
and he starts seeing ads for wedding rings and he thinks, well,
I'm against the idea of matrimony. I am never gonna
get married. Why the hell is this showing up for me?
Then your significant other really, really wants to get married,
and that puts you, that puts you in a weird situation.
That's possible. And that's where we continue Robert's

(33:08):
tweets here. It will serve me ads, he says, for
things I don't want, but it knows someone I'm in
regular contact with might want, to subliminally get me to
start a conversation about, I don't know, back me up
on this, Doc, toothpaste. It never needed to listen to
me for this. It's just comparing aggregated metadata. And then

(33:32):
he also says, look, this is out in the open,
tons of people report on this. It's just no one cares.
We decided privacy isn't worth it. It's a losing battle.
We've already given way too much of ourselves. Uh. And
then he goes on to share some fantastic resources that
I highly recommend. Reply All, excellent podcast. They have a

(33:53):
great episode, 'Is Facebook Spying on You?' We have a
pretty solid episode on it too, that could definitely use
an update because it's several years old, but most of
the information in there is still relevant and correct. And
then one that I thought would be great for everyone
to read is, uh, the New York Times op-ed 'Twelve
Million Phones, One Dataset, Zero Privacy.' And so Robert

(34:17):
walks through this. He says: so they know my mom's
toothpaste, they know I was at my mom's, they
know my Twitter. Now I get Twitter ads for mom's toothpaste.
Your data isn't just about you. It's about how it can
be used against every person you know, or people you don't,
to shape behavior, unconsciously. Yeah, I know, and it's not,

(34:37):
it's not science fiction anymore. It's not the realm of
speculative novels or screenplays. This is happening, and it's going
to continue. And you know, you can read more of
Robert's writing and some of the responses to this. But
I, I want to, this is what I keep thinking about, Matt,

(35:00):
and I'd love to have your thoughts on this. How
close does this get us to a system of Sesame Credit?
By this, I mean, like, um, when we were all
going into the office every day, and, you know,
we work on any number of shows, we were recording
all the time, how much of our data was interlinked, right?

(35:22):
Would you or Noel get an ad for, you know,
like, tactical gear or something because we hang out and
I was looking at it? Or would, uh, you know,
like, Paul "Mission Control" or Alexis "Doc Holiday" get
an ad for Magic: The Gathering because

(35:43):
you were looking at new sets, right? And I would think, well,
that's weird. I wonder why Mark or whoever wants me
to play Magic: The Gathering. That's strange. I never got
into it. You know, I'm a Ghost of Tsushima guy
or whatever. Well, and I would just be like, well,
you should really try Strixhaven, because, you know, it's
got like this Harry Potter vibe with all these different,
you know, uh, clans within a school; they all study

(36:06):
different things. You know, I think you'd really be into it. Yeah, yeah,
like you just recommended it anyway. You were trying to, you just
did. So, side note: I used to collect Magic: The
Gathering cards when I was little, so it actually wouldn't
be that strange. Ah, awesome, awesome. Wait, do you
still have them? I was trying to find them and

(36:27):
I couldn't, because I was thinking they might actually be
worth a lot of money. If you can find them, they well may be.
Well, according to Robert, it doesn't matter if you
say Magic: The Gathering out loud again seven more times,
because your social media isn't listening to you. But
you have been on a Zoom call with me a lot,
and I've been searching Magic a lot, so it could be happening. Well, also,

(36:48):
the algorithms could get it wrong. I mean, how hilarious
would it be if we were talking about Magic: The
Gathering, but the algorithm just picked up the idea of
magic, and we started getting ads to, like, become
stage magicians or something like that? We have a conversation,
and it's like, oh, how was your weekend? Ah, Matt,

(37:09):
I got really into magic. Like, no way, what cards
do you have? And then I just, uh, pull out
like a deck of fifty-two and I'm like, one
of these is yours. Yeah, I don't know if they're
that concerned. That's a terrible joke. But the reason I'm bringing
this up is because having your data and your

(37:30):
life be bound to the company the software believes you
keep could, in the future, have a negative impact on you.
What if, for instance, as the surveillance net tightens, um,
you are seen to hang, like you hang out with people
who are considered to have bad credit, right? Uh, will

(37:53):
you find your credit impacted in the future by that
just because you are associated with those people? You know,
It's it's possible. And I don't know what laws have
been written that might restrict that. They're certainly not applied
at a federal level as far as I can find.
But I'd love I'd love to learn more, maybe in

(38:13):
a full episode. Um, I think we can, we can
pause here for another word from our sponsor. Wouldn't it
be funny if it was stage magicians and Magic: The Gathering? Uh,
we can hope, we can hope, Matt. But while we're
on this brief break, before we come back and throw
headlines at each other, folks, we want to hear from you.

(38:34):
What are some of the strangest things that you have
run into? What social media ads have mystified you? And
did you ever find an explanation for them? Also, what
do you think about Robert G. Reeve? He seems on
the up-and-up to us. He looks like he

(38:55):
is, uh, acting in good faith. He's relaying events the
way he understands them. Do you believe him? Do
you believe him when he says Instagram isn't tracking you
through the microphone, because they already have so much other stuff?
Um, and if not, what's the alternative? I cannot wait
to hear these stories. One eight three three STD WY

(39:16):
TK, conspiracy at iHeartRadio dot com. Uh,
stick with us, we'll be right back to try something
new and, knowing us, strange. And we're back, everyone. I
have to tell you this and then I do not

(39:37):
want to embarrass you with this. I need everyone to
know that Ben reverted into his, um, his extraterrestrial language
that he sometimes uses. It's something he's learned; it's not,
like, inside of him. But he jumped into it
for just a moment right before he went to break
and just played it off like it was nothing, and

(39:57):
the emotion-recognition software didn't pick anything up. So, just
an update. Uh, yeah, that was my bad, you know,
that was my, was my bad. My accent slipped for
just a second, you know. But luckily we've got each
other's backs, the three of us here and you, listening
at home, or in the car, or in the spaceship,

(40:19):
or wherever the wide world finds you today. Uh, and
today, Matt, speaking of terrible segues, uh, you and I
have decided, live on air while we were recording this
show, that we were just gonna spend the last few
minutes throwing some headlines at one another. And, just
a real quick check-in for the

(40:42):
folks, from the beginning of this episode: when Matt and
I were each throwing out a headline, the
other person was honestly just checking to see if
this could work. And if we had both heard of
those things that we talked about, then we might not
have done this. So it's quite fortuitous; we're working live.
Matt, hit me with it. What's going on, man? What'd
you see? Oh, sure, let me make sure the Chicago

(41:05):
Tribune will load for me. Yes, it will. Uh, this
just popped up on the news subreddit, and here's,
here's the title from the Chicago Tribune: "More than thirty
stingrays died at the Shedd Aquarium over the winter,
and officials still don't know why." Now, the reason,
the reason I was bringing this up is because any time

(41:26):
there's a mass die-off in any population, uh, first
of all, it stinks. Uh, second of all, it's puzzling,
because this species of stingray is specifically the kind
that, if you've ever been to an aquarium, you will
sometimes see a small area where anybody can walk up
and touch some of the wildlife. So you can actually put

(41:48):
your hands into the water and interact physically with the wildlife,
and that's what these stingrays were. And to me,
I was just wondering: is this some kind of pathogen
that went from human to stingray? Is that
even possible, that it could go
across species like that? Or, you know, is it
something with the water quality? And this is the Shedd

(42:11):
Aquarium in Chicago, Illinois. It reads as strange to me
because they still don't know what the heck killed
all thirty of these stingrays. They noticed that they were
acting really strangely in January, and then all of a
sudden they just all went kaput. You know what's
really weird about this, man, is that when I guessed Florida,
I was wrong, maybe, about your zoo, but I was

(42:32):
right about another one. Just four days ago, uh, all,
all the stingrays in a tank at ZooTampa died. What?
Oh, that's the, that's the one, Ben. That is the one.
The Chicago article is from twenty nineteen, but still. Coincidence?
There's a stingray killer. Yeah, I wonder.

(42:56):
And it's different, because this mystery may be a little bit
easier to solve, given that the stingrays are located
in a very well-defined, smaller environment than the ocean, right?
All right, Matt, here's here's what I wanted to throw
at you. This is partially about the terrors of building

(43:16):
thinking machines as weapons of war, and partially about how
tricky headlines can be. Business Insider South Africa reports a
rogue killer drone quote hunted down a human target without
being instructed to. This is according to a UN report.
Here's the problem: the thing they're talking about, specifically, is

(43:37):
what's called a Kargu-2 quadcopter. It autonomously attacked
a human being during a battle between Libyan government forces
and a breakaway military faction. It's built in Turkey. It's
designed for asymmetric warfare and anti-terrorism operations, and it's designed

(43:58):
to, well, not always need data connectivity between the
operator and the drone. So it works such that, like,
imagine you're flying the drone. Uh, Doc, we're gonna
pick on you on this one. So, you're Doc Holiday,
you're flying the drone, and you're the best drone pilot

(44:19):
in, insert military here, right? You are, you are the
Doc Holiday of this drone program. But there's a
spotty connection, and all of a sudden your screen freezes
and you cannot steer or direct your Kargu-2 quadcopter.
What the Kargu-2 does instead is just continue on

(44:42):
its own with the instructions of the programming it believed
it was given. So I think it's a little unfair
to say this thing went rogue, but I do think
it's fair to say this means there are autonomous, uh,
flying killing machines out there, and they're out there

(45:03):
with network issues. Yes, yeah, absolutely. Uh, that's a
bad one for us to end on, Matt. No, it's okay.
The situation you described, Ben, feels very similar to this
Zoom call, where, Ben, you've got a narrative
that you're speaking to, and then you have a
co-host who has a terrible connection, who hears

(45:23):
you most of the time, then tries to say something,
and then the information comes in way
later, and by then it's too late. It's just really bad. Well, well,
how twenty twenty-one is this, Matt? Uh, you and I
are in the same building right now; we're like two
creepy rooms away. People must think our office is the

(45:47):
creepiest place, if they haven't seen our YouTube videos,
on the same network somehow. Uh, so true. I think
that is illustrative of just how finicky networks can be.
But I would argue this makes, this could make, drone
warfare even more dangerous as these entities become increasingly autonomous.

(46:13):
You know, what kind of safeguards can we build in?
Should we try? Well, I, I don't know how hot
of a take this is, but should we try not
to specifically design the earliest ancestors of artificial intelligence to
be war machines? Can't we just make them, like, really

(46:33):
happy to play Magic: The Gathering? You know, I'm sure
there's an algorithm that would be amazing at it. We
tried that already. That AI's name is Sparky. You can
play against Sparky as much as you want. Sparky even
says, like, hey, let's play again! Oh man, that was
a great play! Oh dude, you're good at this! Ah,
let's, let's meet up again soon, after you defeat Sparky.

(46:56):
So they're already working on MTG AI. Wow, that's
fun to say. Oh, I like that. Okay, well, let's
end it there today. Uh, folks, we'll do this again;
we'll throw some random headlines at each other. But you
know what we'll do, Matt, I think, well, um, let's
each get like eight that we don't know about and
then just throw them rapid fire, and maybe, maybe we

(47:20):
can even, maybe I'm doing like the end of a
Rick and Morty episode, I'm sorry, maybe we can even
make it like one of the headlines is fake, and
you have to guess which one is fake. I'm kind
of ripping off Jonathan Strickland, so we'll think of something different.
Maybe we could put some stakes on it. Maybe we
could gamble. Gamble steaks? I really thought, okay, gamble snakes,
why not? So I'm gonna read, I'm gonna read you

(47:41):
two, really fast, rapid fire. Jeff Bezos will step down
as Amazon CEO, so you're gonna have to name a
different guy next time you make a reference to the
CEO of Amazon. Last one, all right, this is
an old one, but: Amazon to buy MGM Studios for
eight point four five billion. Oh, okay, none

(48:03):
of those matter; they're not conspiratorial. Amazon wants to get
into healthcare. That's conspiratorial. MGM, dude, we're talking about
Mayer, Goldwyn, m... yeah, nailed it: Metro-Goldwyn-Mayer. I believe,
there it is, Metro-Goldwyn-Mayer. Uh, we, we would

(48:24):
love to hear more random headlines from you, fellow conspiracy realists.
We've been a bit light on this one; there is
some incredibly disturbing, heavy stuff going on in the world
as we record, as has been the case ever since
the creation of this show, and we typically are going
to return to those in the form of full episodes.

(48:48):
For now, we need your help. We want to hear
your stories. What's your experience with facial recognition, with all
this near-future tech that is bleeding over the edge
from fiction into fact? Do you think a private company like
Amazon would be great for healthcare? What do you think
we should do about rogue drones? And what are your weird

(49:11):
social media stalking stories, when the algorithm does it? Not,
not some weird "oh, you met on the internet," that's great. Yes,
you can find us all over the place: on Twitter
and Facebook we are Conspiracy Stuff, on Instagram Conspiracy
Stuff Show, and on YouTube we are also Conspiracy Stuff. Check
out all of our videos. There are so many of them,

(49:32):
even videos of these conversations. This one might be a
little choppy considering the network situation in which we currently
find ourselves. Um, but hey, if you don't want to
use social media, because you know, we just talked about
how that's kind of a weird thing, you can always
give us a call. We have a phone number. Yes,
we do have a phone number. It is one eight

(49:53):
three three STD WYTK. Three minutes. Those
three minutes belong to you. You'll hear a brief message
from me, and then you are off to the races,
my friends. Just tell us your name, give yourself a
nickname if you prefer, we always love those, and tell us
what's on your mind. If there's anything private that you
would rather not be stated on air, let us know

(50:16):
that in the message. I'm suggesting, Matt, that going forward
from now on, we just tell people: if you're calling
us and you do not want us to use your
name or voice, tell us that you don't. We're switching
from an opt-in to an opt-out, which is a
pretty weird move. We'll see how it works out. And
if you, like many of us, hate being on the

(50:40):
phone for any non-emergency reason, and you hate social
media because you heard our earlier episodes on it, never fear:
we have one more way for you to contact us,
anywhere in the world, any time of day. That's our
good old-fashioned email address, where we are conspiracy at
iHeartRadio dot com. Stuff they don't want you

(51:17):
to know is a production of iHeartRadio. For
more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your
favorite shows.

Hosts and Creators: Matt Frederick, Ben Bowlin, Noel Brown