
July 12, 2019 59 mins

Ads are ubiquitous on the internet, and even if you use an ad blocker or two, you're bound to see a few things slip through. Luckily, those ads don't really give advertisers any new information about you unless you interact with them... right? Not so fast. Eye tracking technology can glean an enormous amount about your attention, as well as your reactions to a given image or piece of language, just by watching how you watch, gaze or glance at an ad. So how much can they learn, exactly? Does eye tracking allow companies to, in some sense, read your thoughts?

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies. History is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. A
production of iHeartRadio and How Stuff Works. Welcome back

(00:24):
to the show. My name is Matt. My name is Noel. They call me Ben. We are joined with our
super producer Maya, nickname to be decided as of today, so
write in and give us suggestions for cool nicknames. Uh,
Mission Control is already taken. Paul is on an adventure;
he will return soon. Most importantly, you are you. You are here,

(00:47):
and that makes this stuff they don't want you to know.
Quick question, folks, as you are listening today, what are
you looking at? What do you often do when you're
listening to a podcast. A lot of our fellow listeners
have written in to say I listened to stuff they
don't want you to know when I'm exercising or when
I'm doing housework. Sure, yeah, when I'm in the middle

(01:10):
of another smash and grab on a crime spree. We're kidding,
don't do that. A lot of it is about people
doing hobbies, whatever their hobby is, at their home. They're
listening to a podcast. Ben and I, off of mic,
just had a discussion about folks working on a
computer or playing a video game, even on mute, while

(01:32):
listening to podcasts. But what do your eyes do while
you're busy in your ears with us? Wow? Yeah, I
after doing some research on this, I actually started wearing
sunglasses in the studio at least for today's episode. That's
not just, like, to look awesome and be

(01:52):
slightly intimidating. I don't know. I'm flattered, but I
think most people just find it annoying. But
honestly, it's a bit freakish, what we're exploring today,
and we would like you to be part of this conversation.
So as we always like to say at the top
of the show, if you have something you want to
let us know immediately, posthaste, before you forget. And

(02:13):
you're like, I don't have time to write out an email,
I need to hop on the horn right now. Well, you,
my friend, are in luck because you can call us directly.
We are one eight three three S D D. Yes,
that's our number, calling right now. Leave a message, tell
us anything you want. You have three minutes, and remember
if you do not wish to be identified on air,

(02:35):
that's all jolly good. Just explicitly tell us that. Or
if you just want to tell us something and you
don't want it to go on here, it all tell
us that too. Or you use one of those voice
disguisers the serial killers use. That's entirely up to you
and your serial killer profile or whatever, and that's fair.
That's fair. Here's a thing we never mentioned at the
top of the show that I would like today, If
you would be so kind, please leave us a review

(02:57):
on whatever podcast platform of your choice allows you to,
preferably Apple Podcasts, I think that's what they're calling it
these days. Say something nice; it helps with the algorithm.
It helps raise the profile of the show. And honestly,
considering how many people listen to the show, the number
of reviews we have is a travesty. So help us
with that. Yes, vote your conscience please. So before you

(03:19):
heard our weird voices coming through your ear holes,
you probably heard an ad. Maybe it was a promotion
for one of our many new podcasts. Maybe it was
some offer to go to a website like Great
Courses or something like that. A long time ago, years

(03:41):
and years back, Matt, you and I had an unsuccessful experiment.
We've had many experiments. You remember this, right? Years ago,
we decided, while looking at big data and advertising in
general in these, our modern days, to try and count
every single ad we saw in the space of one day,

(04:04):
and we failed, because eventually you just give up. I
lost count after two hundred. Yeah, yeah. And I remember
we had a lengthy discussion about what constitutes an ad.
Is a logo, like just a singular logo without any
other words, or a picture that represents a company, is
that an ad? I would argue yes. I think so too.

(04:24):
And in that case, I mean, we weren't
even counting correctly, because there were so many of those
that we encounter every day. Although the nature of who
pays for the ad, or how the ad is distributed,
probably should be on the table, because if I have
an Adidas hoodie, I paid money for that Adidas hoodie
because I think it's cool, but it's also technically an
ad for Adidas. And they didn't buy it, though; I

(04:45):
paid them for the privilege of wearing their ad. Right,
you paid them for the privilege of advertising for them,
So congrats. That's why I usually am averse to many
brands. But, you know, that's the nature of the beast,
right? People like those identifiers. Ads are not necessarily a
terrible thing. It would be

(05:07):
ridiculous for us to argue that on this show, right,
because there will be an ad here at some point. However,
ads are so ubiquitous that we encounter advertisement in some
form every day. Most of us are surrounded by institutions,
technologies, and various groups that try ardently to collect as much information about

(05:27):
us as possible and then analyze it six ways from Sunday.
And you know, often they get things hilariously wrong.
You've probably had this thing where you
order something that's maybe a one-time purchase, like
a toilet seat replacement or whatever, and then the
next thing you know, for a week or two, you're
inundated with these ads for toilet seats, like you're some

(05:48):
kind of regular toilet seat enthusiast, like you're some sort
of addict. And Facebook is thinking, well, if we just
keep showing them the ad, what are they gonna do?
Are they gonna say, just one more, I'll treat myself?
I guess this guy's on a custom toilet seat kick
right now; we really got to strike while the
iron is hot. Or you buy a
car and they're like, you know what, this person that

(06:09):
just bought a Corvette needs three more Corvettes. Just wait
until you have a child, Ben. One day
you're going to have a child; I can
foresee it. I've had a prophecy. And when you have
a child, you realize just how many ads are thrown
at you about having a baby. Yeah, and it gets
pretty invasive. Right. There are not as many privacy safeguards

(06:32):
as one would expect when it comes to the world
of data aggregation. We always talk about the famous, and
completely true by the way, folks, story about Target predicting
a pregnancy in such a way that surprised the parents
of the expecting mother. At times, advertising now, especially in

(06:52):
the digital space, can feel close to telepathy, or an
attempt at it. But it's not quite there yet,
and that is what today's episode is about in large part.
So here are the facts. It's no secret that advertisers are
infamous for their relentless pursuit to learn everything they can
about you. No matter who you are, in the hopes

(07:14):
of selling you something at some point. Yeah, but there's
this one thing that's always eluded all of their
efforts, all of these companies and all the time
and money they've spent on this, and it is your
physical, bodily reaction to everything that they're doing, to the
ads that they're putting in front of your face, to
ads that they're putting in front of your face, to

(07:34):
them, how bright, how dark, what color, all
of these things. They cannot tell precisely how you are
reacting in real time. Just to give you a
sense of the stakes here and how much information these
organizations would like to collect: if you had, for some reason,
I'm not making fun of anybody, if you had, for

(07:56):
some reason, a physical reaction where you had, like, a little
anxiety every time you saw something, they would want
to know that. Yes, yes, yes. Just so, that was one.
There's not really a limit for data collectors. There's not
a moment where they would say, well, this is too much,

(08:18):
this is, we're getting a little weird right now. Whether
we're talking billboards or banners or, you know,
ads on your phone or your freemium game or whatever
you're playing, advertisers have always had to rely on other
indicators to see whether they could grab someone's attention, so

(08:38):
they know, for instance, um, in a YouTube ad or
something where you have five seconds before you can click, right,
they know that you are doing something during those five seconds,
but are you paying attention to the ad, or are you, you
know, like, oh, I'm gonna, I don't know, drink soda? Well, yeah,
especially just that concept. If, let's

(08:58):
say you're using your phone, right, and you've got YouTube
playing, and a YouTube ad comes up, how are they going
to know if you're actually looking at that ad, or
just looking somewhere else while it's playing, or even putting
your phone down? Right now, there are ways on
most smartphones to tell if you've put it down, if
you've turned the screen off, if you've clicked. They definitely
know if you've done anything like that. But as of

(09:19):
right now, nobody is, well, at least, oh god, we're
about to get into it, but nobody's using the camera
to actually look at you while you're looking at it. Officially. Officially.
There are hacks aplenty, and who reads
their terms of service? You know what I mean? Who does?

(09:40):
I had an old friend, a professor at a local university
here, who was a graphic design guy, and he was
working on a project to make terms of service and
those agreements more readable. Oh, like a really cool thing
that automatically sorted the stuff you should be concerned about
and popped it up at the top. That's really great

(10:03):
and nice and cool. His funding was pulled. Yeah, there's
just not an incentive for it. Yeah. So here's a
little experiment that we found in Slate, and we want to
know what you think about this, just for a moment. Don't
do this if you're driving. Uh, if you're in the
middle of, like, your MMA training or whatever,
just wait till you have a safe place to
sit down. Oh wait, I've got an idea. If

(10:24):
you can find a mirror, look into that mirror while
we read off this list. It's a little different, yeah;
we're mixing the senses here in the reaction, but
we just want to see if you notice anything. Okay,
so consider for a moment, mirrors at the ready, the
following list: Republican, abortion, Democrat, future, Afghanistan, healthcare, same-sex marriage.

(10:55):
Now we've tried, I think, all of us, to just
say these as straight as we can, just, like, say
the word without trying to add too much to
it. But we want to know if you noticed:
was there any reaction in your face
when you heard any of those words? Maybe a dilation
of the pupils at one point, right, or a narrowing. Maybe
your eyes flicked in one direction or another. Did you

(11:18):
notice a blink that ordinarily would have passed you by.
There is a tremendous, profound, enormous amount of information reflected
in the way you just read that list, if you
read it written down, or if you're watching
your own reactions. Honestly, we don't know exactly how it's
going to be if you just hear it, precisely, but

(11:41):
for certain there was some kind of reaction. Sound is
a tremendously powerful thing. I learned a cool fact about sound.
This is a total tangent. Okay, So this came to
us from Will Pearson, a friend of the show and
a host of our podcast, Part Time Genius. Will kicked
a couple of studies to us that showed sound can

(12:01):
affect your experiences in subconscious ways that you
are not going to be aware of. If you're eating,
for instance, stale Pringles potato chips, and someone is playing
the sounds of some crunchy thing, then your brain will
believe that the chips themselves are fresh and not stale. Brilliant.
And it's so easy to affect people's sensory input.

(12:24):
But here's what happens. If you read those
words that we just intoned, then you may notice that
your eyes paused for a fraction of a second on
certain words. And this is definitely true for most
of us, because we put in some phrases that could
be considered oppositional, right? Democrat, Republican, same-sex marriage, et cetera.

(12:48):
These things can be divisive for some people in
the US. And probably what was happening was, our brains,
as we read along, were trying to connect these things,
even though they just exist independently, right, as fragments of
a sentence. But did your pupils dilate a little bit
when you were reading the list? At which word, you know?
Did you blink at a different rate? Did you backtrack

(13:10):
to reread any words? And if so, which ones, when
and how long? Eye tracking can show you all of this.
That is, if you are an advertiser or, even more disturbingly, a government.
This concept has also been called the Holy Grail of advertising.
It uses the image from one or more cameras to
capture changes in movements and structure of our eyes, and

(13:32):
it can measure all of these things with scary, Orwellian
accuracy. Yeah, that's where we're at right now.
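The measurements just described, where the eyes paused, for how long, and whether they backtracked, are what eye-tracking software computes from raw gaze samples. The episode doesn't name an algorithm; a common technique in the eye-tracking literature is dispersion-threshold fixation detection (I-DT), and everything below, the data, thresholds, and function name, is an illustrative sketch, not any vendor's actual implementation:

```python
# Minimal sketch of dispersion-threshold fixation detection (I-DT), a
# common technique in the eye-tracking literature. The sample data,
# thresholds, and function name are illustrative assumptions, not any
# vendor's actual implementation.

def detect_fixations(samples, max_dispersion=25.0, min_duration=0.1):
    """samples: list of (timestamp_sec, x_px, y_px) gaze points.
    Returns (start_sec, end_sec, centroid_x, centroid_y) tuples for each
    window where gaze stays within max_dispersion pixels for at least
    min_duration seconds."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Grow the window while the points stay tightly clustered.
        while j + 1 < n:
            window = samples[i:j + 2]
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration:
            window = samples[i:j + 1]
            cx = sum(p[1] for p in window) / len(window)
            cy = sum(p[2] for p in window) / len(window)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations

# Toy 60 Hz data: a dwell near one screen region, then a jump to another.
samples = [(t / 60.0, 100 + t % 3, 200 + t % 2) for t in range(30)]
samples += [(t / 60.0, 500.0, 400.0) for t in range(30, 60)]
print(detect_fixations(samples))
```

Real trackers layer saccade and blink classification on top of this; the point is just that a handful of (x, y, t) samples is enough to recover where attention lingered.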
That's right now. It hasn't really mainstreamed yet. Well, yeah,
it's still one of those, um, technologies and approaches
that you basically have to use in an environment

(13:55):
where you're testing right now. Because can you imagine,
if, say, everyone that has an Apple phone,
some kind of Apple phone, it just rolls out and
Apple lets you know, hey, just so you know,
your front-facing camera is going to be watching you
at all times? Whenever you're looking at your phone, it's
gonna be tracking exactly what your eyes are doing, and
it's going to be taking that data and sending it

(14:15):
back to Apple. Do you think they'll keep Siri as
the branding or do you think they'll make a new
ostensibly friendly entity for that? It's like, you know, every
time you pull up your phone, if you don't have
time to type, don't worry: Danny can tell what you're
looking at, and Danny will help you. I feel like
Siri has been looked at as kind of a flop anyway,

(14:35):
so maybe it's time to rebrand. Siri has been doing
some great work for memes, though. Is that right? Yeah,
just, like, the meme economy. I don't know
any Siri memes. I don't use Siri. So wait, okay,
I'm trying to think about your question with, uh,
Siri and what's it gonna be? And it definitely feels

(14:58):
like a Black Mirror episode. We were just talking with
our producers out there, um, Maya and Seth, by the way.
Shadow producer Seth. Seth, it's literally his first day. It's
his first day, yeah, and we were talking with him
about how Black Mirror this topic and this episode
is. Um, oh, I'm trying to get back to what
I was going to talk about. I got lost on

(15:19):
Seth there for a second. Okay, so, um, we were
discussing how Siri as a thing feels like
it will be what the new phone is. It won't
be, like, a phone; it will be a Siri. Does
that make sense? Yes, yes. Um, some kind of personified
thing that exists that you just interact with as

(15:43):
another being, something approaching AI as a personal assistant. That's
what it feels like. But it will just be the
device you have on you at all times, and
eye tracking would be a fundamental piece of that. Again,
this stuff is not a theory. This is not necessarily alarmist;
we're working to stay away from being

(16:03):
too scary about this, but this is where we
are right now, and we'll pause for a word from
our sponsors, and we'll tell you what's happening pretty soon,
sooner than you think. Yeah, make sure you do not pause.
Just listen all the way through this and keep your
eyes as open and staring as straight forward as you

(16:23):
can while listening to the sponsor. Take your hands,
take your pointer fingers and your thumbs, and
hold them ratcheted open, and let's see where everybody is. I
feel silly. No one can see us doing that. We
both did it. We're back. And before we vilify eye tracking,

(16:50):
which you know, honestly I am totally down to do,
we should mention that it's not an inherently evil or
bad thing, right? There are some advantages, yeah. So, um,
for folks who are into this technology, there is
an enormous benefit for people who have disabilities, or, for example,
in the creation of immersive games. But how far does it

(17:12):
go, and what are the implications? Um, these are questions
with answers that may disturb you. Definitely. Here's where it gets crazy.
You're absolutely right. Okay. First, the argument about disabilities is
entirely compelling and valid. If someone is, for example, paralyzed
below their neck, then they'll rely on mouth movements maybe

(17:36):
or, uh, there are already forms of eye tracking, I
believe, that exist in order for them to maybe type
with their eyes or something like that. This takes it
to another level, and it could be of tremendous and
significant benefit to these people. It also is, as
you said, probably great for VR games.

(17:58):
I haven't tried it, but it could be incredible. It
would make a lot of sense. Also, I just
don't plan to. I wear sunglasses all the time,
but it's not because the lights are too bright; it's
because I already, in real life, don't want people to
know where I'm looking, you know what I mean? I
can't handle taking that to a digital realm. It's all

(18:19):
good, man. Well, here we are right now. Eye tracking
may seem like a lovely, nice convenience, but the implications are
far-reaching and, to many, deeply disturbing and deeply concerning,
at the very least. And we're always really careful to
separate our opinions from the facts, so I will say it.

(18:39):
We always try to be very explicit about this. It
is my opinion that this is a bad thing, and
I'm glad to defend why, because
usually the three of us are on the same page.
But if there's a disagreement, I am glad; I
want to be persuaded that this is
overall a good thing. I don't buy it. Okay, well,

(19:00):
I'm gonna go into it as though, just for funsies,
I believe that it's the greatest thing ever and
we have to institute it, just to make sure that
we know everything about you that we need to know.
Well, first things first,
as we said, this isn't theoretical, it is happening. There

(19:20):
is virtually no government oversight. Zero government oversight. Sounds good,
sounds like something the private entities want. And that's,
I mean, that's a larger trend we're seeing, you know:
private control or private subversion of things that once
existed under legal oversight in the governmental sphere. For example,

(19:43):
there used to be the Do Not Call list. Do
you guys remember that? I was on it. It's still
a thing, right? It is, but does it work? Now
I guess it doesn't. I mean, I've had
people who, you know, when you call and it's a human...
that's the problem, though: right now, more often than not,
it's a robot. So you can't really yell at a
robot and say, put me on your do not call list, right? Yeah,
exactly right. And we have to ask ourselves, does the

(20:06):
Do Not Call list still exist? I mean, these
kinds of intrusions are becoming more and more frequent
and more and more invasive, and really, private industry
is leading the charge. Apple has filed a patent for
a three-dimensional eye tracking user interface. There's a European

(20:27):
company called, uh, Sense. I get it, because it's
like an eye; they worked on that name. It
wants to have eye tracking software baked into smartphones very soon.
And as this stuff becomes increasingly deployed in your laptops
and your tablets, in your phones and your TVs, in

(20:49):
your freezer, your smart refrigerator. Yeah, this will open a
new front in the debate over privacy. As it stands now,
privacy is very much an endangered species. Yeah, we talked
about it as a good thing with respect to virtual
reality game playing or whatever. However else you're gonna be

(21:09):
using VR or AR glasses, like the old Google
Glass or some of the new prototypes that are coming out. Um,
eye tracking will, pretty much in the near future, enable
all these companies to collect your most intimate and,
for the most part, subconscious or unconscious responses to the
entire world around you, any and all stimuli. They will

(21:32):
have cameras facing you for so long and at all
times that literally everything you encounter will be recorded. I
thought we weren't going to go dystopian this time around. Well,
I mean, unfortunately, this is... okay. I'm supposed
to be playing the part of the one who's positive about this, right?

(21:52):
I'm not gonna have any kind of existential breakdown right
now about this. It's fine. Um, I'll do it. We
can switch sides if you want. Can we switch sides?
Okay, here, uh, let's see. Let me take a swing
at this on the positive side. All right. So, eye
tracking technology. My position now is that it's great, it's superb,

(22:12):
it's top notch, because it helps advertising become
more effective. Okay. So I'm only going to see the
ads that really do pertain to or interest me, right? So,
you're looking at your phone, or you're looking at your
tablet or your laptop, and advertisers receive this feedback that says, oh,

(22:33):
Matt Frederick's eyes lingered for a few seconds on an advertisement,
but then he didn't click on it. So how does
he really feel about, you know, toilet seats or Corvettes? Uh,
how did Matt's eyes move as he looked at an ad?
What did he look at first? Should we prioritize that
in the future, right, to bring the Corvette more to
the forefront? Are there certain words, phrases or topics that

(22:57):
are things Matt Frederick loves, or things
that are super, like, turnoffs for him? Like
whenever he sees something that mentions Velcro, he's like, screw this,
burn the system down, I'm going outside. Yeah, or pickled
okra: get out of here, get out of here.
Just don't even put that on my phone. So then

(23:17):
this means that successive ad campaigns will be more tailored
to you. The argument on paper is probably something like,
they're more organic, or they're integrated, or that, wait, I'm
going back to my real side, hang on, or
they're less invasive, because it's something that you are predisposed

(23:38):
to show an interest in. Okay, all right, well, I'm
coming back to your side now. So when
companies are trying to figure out what kinds of stories
we want to hear, what kinds of movies and
television should be produced, they can think about all of
these different things in a giant cloud of words and
phrases that they can mix together and make me the

(24:00):
stories that I want to see. That's great. And the
way they're able to do this is through the collection
and preservation, the hoarding, it would be fair to say, of
the data they collect from the, again, largely unconscious
and largely uncontrollable movement of your eyes. So we're talking about

(24:21):
millions of people's eyeball footprints, for lack of a better word,
and then hundreds, probably thousands, of companies in the
near to mid future who will collect this across
the digital ecosystem. What will they do with all that information?
Whatever they want. That is ultimately the answer.

(24:46):
That's probably, I mean, there will be people who
want to split some hairs over it, but ultimately, whatever
they want. It's like anything else on the internet, right?
I mean, it's all of the stuff that you pump
into Facebook knowingly, knowing that that is what pays for
your privilege to use Facebook. It's just gonna get more
and more specific. Before you know it, we're gonna have,
like, biometric scanning that we're agreeing to allow to be processed,

(25:09):
and we have no way of knowing what the end
user is going to do with
that information. Anybody that works at malship already gave this
stuff over. Well, that's the thing too. Like, for example,
this is a little bit off the subject, but it's similar:
when marijuana was recreationally legalized for the first time,
like in Los Angeles, and probably even when it was,
you know, prescription, you had to let them scan your

(25:32):
ID. You had to let them scan your ID
into some database, and you don't really have any
control over what they do with that. It gives you
the privilege of buying marijuana legally, but you're sort of
rolling the dice as to, is this going to come
back and bite me in the ass if it's not
legal in my state? Like, there's all kinds of different things.
What if you get a job at a federal organization?
You know, they're doing the same thing with

(25:54):
nicotine sales in Georgia right now, or at least they're
proposing it. And I think it's not officially in place yet,
but it's being done in some places here in Georgia
where, if you buy any nicotine product, they scan your
ID. And imagine the implications of that. Then it
goes back to your healthcare provider and all of that.
It's like Flannery O'Connor said: all that rises must converge, right,

(26:16):
but now in a really uncool way. Sorry, Flannery. Yes,
but also, this is true, and that's
an excellent point. The purpose is to aggregate this and
get as close to predictive potential as possible. In theory,
this stuff would be anonymous data that they would

(26:40):
describe as non-personal. But in practice, it's going to
be very easy to penetrate the anonymity of that, because,
think about it: your eye movements will be largely unique.
They will be largely tied to a device that is
already identified with you, your smartphone, right? And
the GPS is trackable as well, so they know where

(27:00):
you are when you're looking at things. Uh, sort of
like Uber, I believe it was. Uber is using some
of that same technology to float the idea of a
system or a function in their software where they can
decline to pick up people who have been drinking, which to
me sort of defeats the purpose of a lot of
Uber rides. But they can see, and they'll say, okay, well,

(27:24):
this person is calling a ride,
but their phone says they've been at this place for,
you know, three hours, and it's a bar or something,
and you're just like, I'm not picking up that drunk patron,
right, or something like that. But again, that's what Uber
drivers and Uber customers have signed on for. I'm just
saying, there's a lot that we can get from phones

(27:45):
and these other similar devices that you might not suspect.
And now this is going to
kick it up to an entirely new level. Eye tracking
from tablets and smartphones is tied to a unique device
identifier associated with one specific device. So maybe they don't

(28:06):
know that your full name is Noel Brown or Matt
Frederick or Michael, but they will know
that it is, you know, an AT&T
iPhone 6s or whatever, and they'll know where
it is. So because
this ties into location tracking information, it can also tie

(28:31):
to the locations that you show up at all the time. Right,
most of these studies on this eye tracking
stuff are being conducted in closed environments, but the
technology is already out there. What we're
telling you about is not just happening; it already happened.
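The re-identification risk described here, joining "anonymous" gaze logs to a device ID and its location history, can be sketched in a few lines. Every identifier, record, and field name below is invented for illustration; no real service's schema is shown:

```python
# Sketch of how "anonymous" eye-tracking logs keyed to a device ID can be
# joined with location history to re-identify a person. Every identifier,
# record, and field name here is fabricated for illustration.

gaze_log = [  # (device_id, timestamp, ad region the eyes lingered on)
    ("device-7f3a", "2019-07-12T09:01", "corvette_ad"),
    ("device-7f3a", "2019-07-12T21:40", "toilet_seat_ad"),
]

location_log = [  # (device_id, timestamp, place), e.g. from GPS
    ("device-7f3a", "2019-07-12T09:00", "123 Main St (home)"),
    ("device-7f3a", "2019-07-12T21:30", "123 Main St (home)"),
]

def build_profiles(gaze_log, location_log):
    """Group both logs by device ID. A device seen nightly at one address
    is, in practice, one identifiable person, no name required."""
    profiles = {}
    for device, ts, region in gaze_log:
        profiles.setdefault(device, {"gaze": [], "places": set()})
        profiles[device]["gaze"].append((ts, region))
    for device, ts, place in location_log:
        profiles.setdefault(device, {"gaze": [], "places": set()})
        profiles[device]["places"].add(place)
    return profiles

profiles = build_profiles(gaze_log, location_log)
print(profiles["device-7f3a"])
```

The join needs nothing but the shared device ID, which is the point the hosts are making: "non-personal" data plus location history collapses into a personal profile.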
All those rumors that you've been reading about regarding Facebook's

(28:54):
microphone being on, or something like that? Uh, it's quite
possible that you already knowingly have an app or some
piece of software on some electronic device that is trying
to do something like this with your eyes. Yeah. And
you know, just as Ben said, when we come back

(29:14):
after this quick sponsor break, we're going to talk about
how, towards the end there, this is
literally, like, one tiny cut, right, of the thousand cuts
that we're giving ourselves that will eventually bring about the
Gray Goo. The Grey Goose. I'm just kidding, don't worry.

(29:36):
I'm just saying that, um, the machines will
be taking over really soon, and eye tracking is
just one tiny part. Here's our sponsor: the Singularity. Right. Yeah, nanobots.
We're talking about premium vodka here, gents: the Grey Goole,

(29:58):
the Grey Goose, the Grey Google, the Goose. You gotta
get that goose, get goosed. You heard it here first, if you've
never heard of vodka. Okay, it's potato juice, right? Yeah, yeah, yeah.
Sweet, sweet, sweet, sweet Moscow potato juice. Let's
just start calling vodka that. Can we call it Moscow
potato juice? Drink responsibly, and only if you're over twenty-one
or whatever the legal age is in whatever country you reside,

(30:21):
or if you're in Russia, whatever
age. Or don't drink alcohol; that's also fine. I think that's
cranberry juice. You can just eat the potato. I feel
like we helped a lot of people with that. That's good.
Could be a shirt, guys. I like that: you can
just eat the potato. Hey, guys, where did
your eyes go when we just started talking about

(30:42):
potatoes and vodka and all that stuff? What happened
to your eyes? We need to know. Send us
your metrics, please. Wow. Oh, please feel free to not
do that, okay, because we don't want to be held responsible.
It turns out that there are many applications for this
eye tracking technology, and this is incredibly dicey, ethically fraught territory.

(31:06):
We talked about cracking privacy; it's gonna be easier than
ever once this stuff is out in the mainstream and
it's fully acknowledged as the new normal. You're just gonna
have to treat everything you do as though it is
a matter of public record, at least online. But there's
another, more troubling thing, I would argue, which
is the concept of pre-crime. Remember Minority Report? Yes,

(31:29):
this could happen. As a matter of fact,
it is happening. That's the thing that I keep trying
to emphasize here. Back to the gray goo: you're talking about
the goo that the precogs are floating around in?
That's it, is it? No, no, I was talking about
the nano, essentially, just the nanobots functioning as

(31:50):
like the Philosopher's Stone, right? It's transformative; it
can create whatever it wants, but it could also just
cover the earth if it continues to replicate without any control.
What's the gray goo? It's like a gray substance, like a
gooey substance of nanobots. It's just a thought. Yeah,
it's not real yet, but we do know that.

(32:14):
Oh wait, we have to check my spoiler statute. When did Minority
Report come out? Okay, early two thousands, so we're solid.
But anyway, spoiler alert, here be spoilers after the countdown:
three, two, one, spoilers. So, Minority Report, and
we've all seen this film, but in case you haven't seen it, uh,

(32:38):
Tom Cruise attempts to act as a member of
law enforcement in the near future, in a pre-crime division.
And the way that the pre-crime works is through the
precognitive triplets that you mentioned earlier. Right, now, they
have these dreams while they float in this pool, and

(32:59):
their dreams are inscribed on these little wooden balls, and
then Tom Cruise reads them and attempts
to stop a crime before it occurs. Now, that's
still kind of sci-fi, but not as sci-fi
as you might think, all the while manipulating things on
kind of a holographic touch screen, which is largely the

(33:22):
way we use tablets and stuff now. That's sort of
pre-precogged, you know, our use of touch screens. There
we go. Yeah, so where are we at in
real life with pre-crime? It is a thing.
We have to recognize that although companies and private organizations
will take the brunt of criticism for this, law enforcement

(33:45):
and security will have big roles to play, and this
becomes very dangerous when we talk about harsher nation states.
Researchers in the US and the UK have mapped the
correlation between blink rates, pupil dilation, and deceptive statements already.
Right, this is like a polygraph, if a polygraph actually worked.
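For the curious, blink-rate metrics like the ones these researchers describe are commonly derived from the "eye aspect ratio" (EAR) of six tracked eye landmarks, a ratio of the eye's height to its width that collapses toward zero when the eye closes. Here is a minimal sketch of that idea; the landmark layout and the 0.2 threshold are illustrative assumptions, not details from this episode:

```python
import math

def eye_aspect_ratio(landmarks):
    """Eye aspect ratio from six (x, y) eye landmarks p1..p6:
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).
    p1/p4 are the eye corners; p2, p3, p5, p6 sit on the lids,
    so the value drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = landmarks
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def blink_rate(ear_series, fps, threshold=0.2):
    """Count closed-then-reopened transitions below an EAR threshold,
    returned as blinks per minute."""
    blinks, closed = 0, False
    for ear in ear_series:
        if ear < threshold and not closed:
            closed = True          # eye just closed
        elif ear >= threshold and closed:
            closed = False         # eye reopened: one full blink
            blinks += 1
    minutes = len(ear_series) / fps / 60.0
    return blinks / minutes if minutes else 0.0
```

With a real tracker you would feed in one EAR value per video frame; the resulting per-minute rate is the kind of signal those deception studies correlate against.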

(34:08):
So the Department of Homeland Security has already been developing
a pre-crime program aimed at identifying criminals before they act. Yikes. Seth?
Yes, Seth, we did it before you even got in the room, buddy.
We're totally joking, of course. Seth is wonderful, although we
have only met him for the first time today. That's true,

(34:30):
but I feel a connection. I'm having like a Cloud
Atlas moment. Okay, yeah. Maya's been solid the whole time,
I can tell. So it's true, though. This DHS
program is called, in a burst of creativity, Future Attribute
Screening Technology, or FAST. That one is for you. Hey,

(34:51):
thanks, man, you know I'm a real acronym buff.
So what is FAST? What does it do? It's when
things move at an alarming rate of speed. Yes, essentially
what's gonna be happening here very soon. But yeah,
this program is designed to take images and analyze them,
specifically at airport security checkpoints, which is already being instituted.

(35:14):
Those facial scans. Have you guys seen them? If
you travel fairly often, as we do here,
then you'll notice they've become sort of the way
that Sesame Credit was originally instituted: it was opt-in,
and now it's mandatory. Yeah, and it's to literally
scan your face and your eyes. And what they're doing
as you're going through the airport and through the security

(35:37):
process and all the other things, is just to see:
are you doing any kind of weird movement? What
is your eye position and your gaze at any given time? What are
you staring at? Are you looking at the security guards?
Are you looking at this certain bag? Are you giving
off any kind of facial expression that would seem suspicious?
Why are you breathing so heavy?

(36:00):
Were you running? That's it? Is your heart rate elevated
right now? Why is your heart rate up so high?
It's just the dogs that are walking past right now
checking everybody's bags. It's fine. So this is a very
controversial application of this because yes, we could say that
this has the potential to prevent crime and save lives. Right.

(36:22):
For instance, America has a tremendous, ongoing, and tragic problem
with mass shootings, which occur at a cartoonish
and fundamentally offensive rate, one being too many. Well, so,
in theory, if there were cameras set up in
a way that they could do eye tracking in a public
space, and someone was being what one might describe as shifty,

(36:46):
as in darting their eyes around at a
weird kind of clip, that might be a flag, right?
So that's the question: could this prevent those sorts of tragedies?
It's weird, because it's tough to argue this. Let's say,
for instance, that someone snaps
and they go to a public place with guns

(37:09):
or maybe an explosive or something, but they're clocked by
a camera and then they're arrested. Well, unless they wrote
down or publicly stated what their intent was, they would
just have legal possession of a firearm, you know
what I mean. And so how can we prove that what
would have happened would have happened? That's why pre-crime
doesn't work, not yet. It basically amounts to depriving someone

(37:33):
of due process. That's a good point. Yeah. And
the technology always outpaces the legislation; it's something that
we see time and time again. But the
effects go beyond the computer screen as well. Researchers are
testing a new product called Sideways. Sideways can track what

(37:54):
products catch your attention on the shelves in a local
brick-and-mortar store. I have seen this recently at
a grocery store or something. I don't know if it's this
particular Sideways version, but there will be
a camera set up in a specific aisle where, you
know, it looks as though it could just be for

(38:16):
security purposes, to check and see. Maybe you're in
a grocery store and there's a security camera right next to
some of the medications or something, maybe a hot-ticket
item that's expensive and small, right? But I swear
to you, not only is a camera set up there,
it's got the display right there, and
it says it's recording while it's doing it. And again,

(38:38):
it could be just CCTV. But why would you not
take something like that and then be looking at people's
eyes and how they're looking at things and how
they're reacting? Because as a security officer,
you could probably ascertain whether something wrong is going on
by using the same things DHS is doing. Why

(39:00):
would Kroger, sorry I said it, why would Kroger
or Publix or one of these other big companies not
be working with some advertising company to have data on
that stuff? Absolutely, yeah. The overhead investment is just pinto
beans in comparison to, you know, the possible profit potential.
beans in comparison to you know, the the possible profit potential.
And is that the kind of thing you would have

(39:21):
to opt out of, or could you even opt out
of it if you didn't even know what's going on?
Right. Where's the informed consent? You're in the store.
But that could be one of those
things where it's like, okay, let's say I go to
a music festival and there are signs saying, by the very
fact that you are putting yourself in this environment, you
are de facto consenting to be filmed. Right,

(39:42):
which maybe doesn't hold up in court. I don't know,
it does in public. Like if we're walking
down the street. Is it public, though? It's a gated
thing I paid money to go into, you know what
I mean? Yeah, I guess it's up to them then, right.
I just wonder how this would work. Obviously
we're not there yet, but I think this begs the
question of, like, how responsible do these companies have to

(40:04):
be to let people know that this is going on,
that this kind of data collection is going on. Cough, cough,
not very, cough. Yeah, they don't. I mean, again,
there's not too much oversight. At least, like, let's go
back to Sideways, or stuff like Sideways. So you
walk up to the item. Let's say you're
deciding between the store-brand, we-have-cereal-at-home

(40:26):
type of cereal, or maybe you're gonna splurge and get
the Kashi. There you go, like, I earned it,
I earned this fiber or whatever. That's right,
that's how you party. And the grocery
store has this system that will look at not just
what you ended up getting, but will tell us a

(40:49):
little bit about the decision process you went through,
what's called the decision tree, that led you to Kashi, right.
And from that, you will probably have some sort of
linked contact avenue, right? You know, even if you
don't sign up for the
whatever Kroger app or something, you will be

(41:10):
able to get targeted ads through Instagram for instance, and
eye tracking is going to be huge for Instagram. What
kind of stuff can people learn about you from your
eye movements? This is where things become
very dangerous, possibly fatal, and we'll explain why. So it
shows not just what you're focusing on, but the order

(41:31):
of operations that you use when thinking through a
visual stimulus: both where you tend to look in general,
like, does this person tend to read the captions
first or something (nobody does), or what you tend to
look at and look for specifically. Are you just looking

(41:52):
at the price? Is that how you're shopping, right? Are
you looking at the name brand? Is the appearance of
a certain thing, a keyword, more likely to sway you
toward buying something? Is a bright color standing out to
the majority of people who walk up to this section
of the store? Does someone just have to slap Tom
Cruise's face on anything and you will buy it? Yes.
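As an aside for the technically inclined: gaze analytics of this kind typically reduce the raw stream of (x, y) samples into "fixations," clusters of samples that stay within a small dispersion window, and then total the dwell time on each labeled region, like a shelf section. A rough sketch of the standard dispersion-threshold idea; the pixel thresholds, sample timing, and region boxes below are invented for illustration:

```python
def _dispersion(window):
    """Bounding-box size (width + height) of a run of (x, y) gaze samples."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=30.0, min_samples=3):
    """Dispersion-threshold (I-DT style) fixation detection.
    A fixation is a run of at least min_samples gaze points whose
    bounding box stays under max_dispersion pixels.
    Returns a list of ((centroid_x, centroid_y), n_samples)."""
    fixations, i = [], 0
    while i + min_samples <= len(samples):
        if _dispersion(samples[i:i + min_samples]) <= max_dispersion:
            j = i + min_samples
            # grow the window while it stays compact
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            window = samples[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append(((cx, cy), len(window)))
            i = j
        else:
            i += 1  # saccade sample: slide past it
    return fixations

def dwell_per_region(fixations, regions, sample_ms=16.7):
    """Total fixation time (ms) inside each named box (x0, y0, x1, y1)."""
    totals = {name: 0.0 for name in regions}
    for (x, y), n in fixations:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += n * sample_ms
    return totals
```

Fed a shopper's gaze trace plus boxes for each product facing, the dwell totals are exactly the "which shelf item held your attention" metric being described here.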

(42:16):
I thought when you said that, I was just picturing
somebody slapping Tom Cruise's face when he jumps up on
the couch on Oprah, like, get off my couch,
Tom Cruise. Were you raised in a barn? That's
not where your feet belong, sir. We were raised
in a land where all the floors are couches. That
sounds like a lovely place. I actually twisted my ankle

(42:36):
a lot. You did, stepping between the cushions, and
then you'd go down. At least it was soft when you went down. It's
got like an ottoman vibe. Oh, that's it. Oh,
we got there, alright. Alright, but wait, there's one
last thing here that we haven't even talked about yet.
And this, this blows my mind. Just by looking

(43:00):
at your eyes, something like Sideways... Ben, I'm speaking
to Ben, and he's covering his eyes with his very
dark sunglasses. But just by looking at your pupils, these
companies will be able to tell if you're having an
emotional reaction to anything on that shelf. Yeah, kind of
reminds you of the Blade Runner test, right? Remember that?
Like, there's a turtle on its back, it can't get up.
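The pupil measurement being described here is usually quantified as a baseline-corrected percent change: mean pupil diameter during the stimulus compared against a pre-stimulus baseline. A tiny sketch of that normalization; the millimeter values in the comments are made up for illustration:

```python
def pupil_arousal_score(baseline_mm, trial_mm):
    """Percent change of mean pupil diameter versus a pre-stimulus baseline.
    Pupillometry studies read larger positive values as stronger arousal;
    negative values indicate constriction (e.g., a brightness change)."""
    base = sum(baseline_mm) / len(baseline_mm)
    trial = sum(trial_mm) / len(trial_mm)
    return 100.0 * (trial - base) / base
```

In practice the baseline window is sampled just before the ad or product appears, which is what lets a camera separate an emotional response from ordinary light-driven pupil changes only imperfectly; that ambiguity is one reason these inferences are contested.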

(43:20):
I have a question. Can you guys tell by looking
at my eyes right now that I'm tripping balls on acid
right now? Wait, I cannot. Good. Hold on, I'm seeing
a little melting. Now, that's just how I look.
Oh, is that my acid? Yeah. Okay, sorry. I have
a new pitch. If you're encountering too much Tom Cruise

(43:42):
in your life, you should have something. Let's
leverage all this crazy invasive technology for good. If
you are seeing too much of Tom Cruise
in something, I propose we have the option of Tom
Cruise Control, which will just, like, lower his... All right,
there should at least be a plug-in for, like, Google
Chrome or something, you know, just to strip Tom Cruise

(44:05):
from the Internet. He's probably the nicest guy. And
he does his own stunts, and he's like three feet tall.
But he's a little Lilliputian, you guys, he's completely harmless.
He's bantam, and he has a lot of energy, but
he's, what, like fifty-seven at this point? He's basically
a god. Yes, he can materialize, dematerialize, whatever he needs

(44:28):
to do. His thetans got kicked out, you know, years ago.
Be careful, he's probably in the room, where he can stop
your heart with one smoldering glance. Yes. I don't know why
I was a million percent with you there, but I
felt that passionately. No, I was like, yeah, I don't
mess around with him. Anyway, let's think about that. It
can measure your emotional reaction to anything, right?

(44:50):
So it could... we're joking about this Tom Cruise
Control thing, but it could do something like that,
wherein it says, okay, we're going to remove the following
things from a normal ad because this will make so-and-so
more likely to buy Kashi. Right, this
is the really dangerous part. Yeah, not just the emotional reaction,

(45:13):
but multiple studies confirm that someone's sexual orientation can be
very easily discerned from observing their eye movements when exposed
to certain images. Or at least their overall attraction.
Right, exactly. And this seems to be proven

(45:34):
across virtually every kind of orientation one might imagine.
It seems that the rules for our eyes apply across
the board. This is dangerous because in some countries it
is a crime to have certain sexual orientations, right,
and in the ideologies of

(45:57):
certain groups, and you think about power and how it
can change hands and all these things. It's a pretty
terrifying technology, right? Especially when we consider that, not too
long ago, in various countries across the world, for instance,
being attracted to people of the same sex was an automatic
prison sentence, if not effectively a death sentence, right? And

(46:20):
that still remains true in places across the world today. So
imagine that you're flying into this country, you know,
going on vacation, name a country, I don't want to
pick on any specific one. And then you are stopped
at customs, which is normal. They look at your passport,
they may have a connection with Five Eyes or
Interpol to make sure you're not an international taffy

(46:41):
thief or whatever, and then they deny you entry because
of some kind of test or facial scan or some
data they had from you literally just walking in. Yeah,
walking in, or looking at people on Instagram or something,
or on Facebook, or what have you, or maybe

(47:02):
on the displays on the plane. Insidious. I like it. Yeah.
Also, maybe just data collected from the TV you're
watching that is also watching you, which is the best
place to put these things, TVs and phones. So this
means that someone without their informed consent could suffer legal

(47:23):
consequences for something that is not a crime in their
home country, right, or something that really isn't other people's business,
if we're being honest. Mm-hmm. So this leads us
to our conclusion for today, which is really opening the
door to more questions. At what price convenience? You know

(47:46):
what I mean? What are the advantages? Are you for this?
Are you against it? Or, as we say in the South,
are you fer it or agin it? If so, why? And
if you are one of those people who accepts the
argument that if you're not doing anything wrong, you
don't have anything to hide, if you're one of those

(48:08):
people on that side, we would love to hear
from you. We would love to hear your reasoning. Maybe
this is a necessary evil to some of us. Maybe
the benefits ultimately outweigh the disadvantages. Either way, no
matter how you feel about it, if you're a
hundred percent on board or a hundred percent opposed, it's happening.

(48:29):
It doesn't matter. It's kind of like not liking the weather;
the weather doesn't care what you think about it. And
the companies pursuing this eye tracking technology
don't really care what you think about it either.
The money got too good, and the potential is there.
It's Pandora's jar: the lid has been unscrewed, and that
lid doesn't go back on. Well, and

(48:53):
that's... well, here's what you may be thinking,
because it is certainly a thought I had. I'm in
here with a cell phone that has the front-facing
camera covered by just a terrible-looking piece
of duct tape. But that's just how I roll, because,
you know, like you listening out there, I'm aware
of some of these things. You may think, well,

(49:15):
what I'll do is I'll just cover all the cameras
that I'm aware of that I can, and that way
nobody will be able to tell what my eyes are doing,
at least in my own home, or wherever I reside.
Because the only other solution is to wear sunglasses all
the time, and people, you know, people think something's up,

(49:38):
but it does look cool. We all
look really cool in sunglasses. I like that Noel's got
some sunglasses with white frames; he should keep them on.
My little neck piece here, and when I say neck piece,
I mean my diamond-encrusted icy "I see it" piece.
I like it because it's kind of like a tie
analog, the way it catches the light too, you know,

(50:00):
and it really catches the eye and the light. And
in this instance, I can see myself in your chest,
which is, yeah, it's important. It's sort of a power
move, really. And you look like a cop
with your glasses. You have, like, these aviators, right?
No, they're just the old-school sunglasses. Who is that guy?

(50:20):
Then? I'm not sure. Okay, I think I'm being followed
by a police officer, and it looks like, yes, it's working.
Ah, finally, the dream. But yeah, this stuff
is happening, and it is tempting to think that there
are more low-tech solutions, like, hey, guys, stop
freaking out, just put some tape over your camera, which
obviously we do. But that is not going to work

(50:43):
in the future. No. Right, Apple had a patent
application, as have other technology companies, for display
screens that include thousands of tiny image sensors built into
the screen itself. Yeah, so if you are looking at
the screen, it would be looking at you. I was
about to say, you look into the abyss, the abyss
looks back at you, my friends. Here's the

(51:06):
crazy part, this filing, this patent that Apple filed.
Do you know what year it was from, Noel? You want
to take a guess? It was two thousand four.
But seriously, though, think about when the iPhone came out.
Think about, like, how often we've been
using phones since then and the technology behind it. Apple

(51:29):
filed a patent for that in two thousand four. Well, okay,
I never thought I would say this on this show,
but in Apple's defense, a lot of times technology companies
just file a patent as kind of a way of
calling shotgun. No, totally, I get that. Or
they file patents to suppress technology, or the government does

(51:51):
that, all of which has actually happened. We keep
saying "the government," and to be specific, that's the US
government, heavy G. It is true that there is a
law on the books wherein, if you file
for a patent that is considered to be a threat
to national security, you will lose your patent. It will

(52:12):
become property of the US government, it will be removed,
and you automatically get a gag order preventing you from ever
speaking about what happened. It can also happen if the
government already has a secret patent for one of those things.
Oh yeah, secret patents. That's cheating. I get it,
I would do the same thing in their position. But anyhow,
here we are. This stuff is not being suppressed. This

(52:33):
is proliferating, spreading like wildfire to a cell phone near you.
So the question is, what next? What do we do?
Do we fight against this, this new way of
aggregating data? I mean, have you ever been reading something
online and thought, wow, it's crazy that I can have

(52:57):
my own thought about this and it still be private?
The last sacrosanct thing in the human experience now is just
the mind. That's it, you know. I think what's
going to happen, and it's already starting to occur a
little bit, is that companies will begin to thrive

(53:18):
whose specific goal in creating products is
to subvert this kind of tracking and this kind of surveillance.
And again, there are some companies out there
that are trying to create wearable tech, things that
you can put on your body, that you wear on your face,
face paint, different things like that, that will not allow

(53:41):
facial tracking to work, like the camouflage that's used on
some warships. Exactly. And I think those kinds of companies
are going to proliferate and flourish, and you're gonna have
ones that make specific types of glasses that you
wear that aren't actually, you know, there to protect
you from UV; they're just to prevent
whatever light-sensing camera from seeing your eyes.

(54:06):
I really think that stuff is going to proliferate. Maybe
disruptive contact lenses, those being a little easier. I don't know,
what do you think? I mean, is it gonna make
my Snapchat filters better? Probably. Yeah, then I'm all for it. Okay.
Yeah, that's the way we're gonna end it, I think. Well,
that's how it's going. I think that's how the
conversation is going to continue. You know what I mean,

(54:29):
because right now, here where this show is based in
the United States, there doesn't seem to be an immediate
threat posed by this. Right, it will be an erosion
of privacy rather than a complete, sudden removal. The frogs
in the water slowly boil. We're already in the water, dudes.
We are. We're already in the water. We're practically dead.

(54:49):
We're too far from shore now. It's true, there's no
coming back. You can't roll back the tide,
despite that being the Alabama football team's... whatever you call it.
I still don't know what that means. I have
no idea. What is this Crimson Tide? I know there's
Roll Tide, but, like, why? When they say roll tide,
does that mean, like, trample the other side? Let us know. Let

(55:12):
us know, give us your football hot takes, and also
give us your advice. This is clearly not
a sustainable situation, right, over the long term. We don't
know what's going to happen. What advice do you have
for your fellow listeners? What is your take? Is this
something people should be concerned about, especially in countries where
their involuntary reactions can lead to their imprisonment, their

(55:35):
physical abuse, or their death? Is this just alarmist? Is
this a thing that will be a flash in the pan
and then disappear in the news cycle? What happens when
that Black Mirror episode really does come true, where if
you're not looking at the ad, it pauses and
you can't proceed? That's what Seth was talking about earlier.

(55:56):
What's your favorite Snapchat filter, kiddies? You like the puppy
one, where you stick your tongue out and then the
puppy tongue comes out? How do they do that? I
don't know. Let us know, write to us: is the
facial recognition worth it? A lot of people say yes,
you know, let us know. You can call us directly.
We are one eight three three. You can also contact

(56:23):
Now that we've said all these terrible things about social
media using eye tracking to learn your innermost secrets, hey,
find us on Facebook along with your fellow listeners at
Here's Where It Gets Crazy. Yeah, we only use a
segment of Facebook. We don't use the wider
Facebook; we only use the Facebook group, because that's better.
That's right. And Twitter, Conspiracy Stuff, and Instagram, Conspiracy Stuff Show.

(56:44):
You can check me out on Instagram at How Now
Noel Brown if you want to see me dabbing with
my kid, and I don't mean dabbing like the weed thing,
I mean like the thing, the dance or whatever. Thank god.
You can find me getting kicked into and out of
various countries at Ben Bowlin on Instagram. I'm at Ben
Bowlin HSW on Twitter, if you've got a
hot take you want to send our way. Yeah,

(57:06):
And find me on Instagram. It's Maya Seth Big Fan
Twenty Four, and that's the Instagram handle. You'll find it.
Yeah. Well, what's weird is that you
made that before Seth's first day. I know, I had
no idea, and it's just kismet that we're having
this recording session today. Kismet, man, there we go.
So wait, what if someone says, guys, I hate the phone.

(57:28):
Real friends text. I hate social media, it's twenty nineteen,
we all know there's something screwy going on there. But
I have something I need to tell you or, more importantly,
my fellow listeners. Where, oh where, can I find
a way to talk to you? We have good news
for you, friend. You can still send us a good
old-fashioned email. We are conspiracy at iheartradio dot com.

(57:52):
Oh, and one last thing to mention here. On Monday,
I'm gonna be a guest of sorts, I guess, on
the finale episode of the season of Ephemeral. It's a
newer podcast that's put out by iHeartRadio, and
it's a show about ephemeral media. And in the episode,
the creator and host, Alex Williams, and I talked about
this tape that I found of my grandfather trying to

(58:15):
record the first words of my mother. It sounds a
little weird in this context, but I assure you,
within the context of that show, it's gonna make a
lot of sense. I hope you'll listen. Find Ephemeral right
now on Apple Podcasts, the iHeartRadio app, or
wherever you listen to podcasts. Stuff They Don't Want You

(58:50):
to Know is a production of iHeartRadio's How Stuff Works.
For more podcasts from iHeartRadio, visit the
iHeartRadio app, Apple Podcasts, or wherever you listen to
your favorite shows.
