
November 12, 2024 57 mins

Ads are ubiquitous on the internet, and even if you use an ad blocker or two, you're bound to see a few things slip through. Luckily, those ads don't really give advertisers any new information about you unless you interact with them... right? Not so fast. Eye tracking technology can glean an enormous amount about your attention, as well as your reactions to a given image or piece of language, just by watching how you watch, gaze or glance at an ad. So how much can they learn, exactly? Does eye tracking allow companies to, in some sense, read your thoughts? Strap in for the answers to these questions and more in tonight's Classic episode.

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Fellow conspiracy realists, welcome to tonight's classic episode. What are
you looking at? Statistically, you're looking at your phone and
you're hearing this while you're doing something else. What did
you look at? Did you look at the top right
corner of your phone? Did you look at the bottom left?

Speaker 2 (00:13):
Did you buy that thing that the Instagram ad told
you to? I did. It was not a very good product.

Speaker 3 (00:19):
When the pop-up ad occurred, how long did it
take your eyes to move down to that ad and
then move away?

Speaker 1 (00:26):
When your Instagram or your TikTok algorithm pops up and
you get the whole, you know, cornucopia of little bits,
which one do you stare at? It doesn't even matter
if you click.

Speaker 3 (00:39):
How long did you look at that lady bouncing around?
I bet it was too long. Trampoline stuff, 'cause...

Speaker 1 (00:50):
Yeah, guys buy so many trampolines. Yeah, you gotta have
the context, folks. This is a favorite of ours. Ads
are ubiquitous on the internet. They're the Achilles heel of
many Internet businesses, including podcasts, and we had a lovely
and disturbing conversation back in twenty nineteen about ad blockers

(01:17):
about the old question regarding how much does my social
media platform or my phone know about me? And this
is if I recall correctly, guys, this is the one
where we talk specifically about eye tracking.

Speaker 3 (01:32):
Yeah, in twenty nineteen, that stuff was pretty good, that
tech that could just watch what you're doing. So you know,
before you even start this episode, go ahead and cover
up that camera, and then let's get... and then let's begin.

Speaker 1 (01:47):
No, pull up a picture of us and look at it. Yeah,
we should get a we should get a little clip
of all of us just jumping on a trampoline.

Speaker 3 (01:57):
Oh man, I do want a trampoline. So many... of course, we

Speaker 1 (02:02):
Can be cool.

Speaker 3 (02:05):
And we'll jump right in on the trampoline, and then
also into this

Speaker 4 (02:09):
episode. From UFOs to psychic powers and government conspiracies, history
is riddled with unexplained events. You can turn back now
or learn this: stuff they don't want you to know.
A production of iHeartRadio's How Stuff Works.

Speaker 3 (02:34):
Welcome back to the show. My name is Matt, my
name is Noel.

Speaker 1 (02:37):
They call me Ben. We are joined with our super
producer Maya, nickname to be decided as of today. So write
in and give us suggestions for cool nicknames. Mission Control
is already taken. Paul is on an adventure. He'll return soon.
Most importantly, you are you, you are here, and that

(02:58):
makes this stuff they don't want you to know. Quick question, folks,
as you are listening today, what are you looking at?
What do you often do when you're listening to a podcast.
A lot of our fellow listeners have written in to say,
I listen to stuff they don't want you to know
when I'm exercising or when I'm doing housework. Sure. Yeah,

(03:20):
when I'm in the middle of another smash and grab
on a crime spree. We're kidding, don't do crime.

Speaker 3 (03:28):
A lot of it is about people doing hobbies, whatever
their hobby is, at their home. They're listening to a podcast.
Ben and I, off mic, just had a discussion
about us working on a computer or playing a
video game, even on mute, while listening to podcasts.

Speaker 2 (03:44):
But what do your eyes do while you're busying your
ears with us?

Speaker 1 (03:49):
Wow. Yeah, after doing some research on this, I
actually started wearing sunglasses in the

Speaker 3 (03:57):
Studio, at least for today's episode, not just for like
to look awesome and be like slightly intimidating.

Speaker 1 (04:04):
I don't you know, I'm flattered, but I think I
think most people just find it annoying. But it's honestly,
it's a bit freakish what we're exploring today, and we
would like you to be part of this conversation. So
as we always like to say at the top of
the show, if you have something you want to let
us know, immediately, post haste, before you forget. And you're like,

(04:24):
I don't have time to write out an email, I
need to hop on the horn right now. Well, you,
my friend, are in luck because you can call us directly.

Speaker 2 (04:33):
We are one eight three three STD WYTK.

Speaker 3 (04:38):
Yes, that's our number. Call it right now. Leave a message,
tell us anything you want.

Speaker 1 (04:41):
You have three minutes. And remember, if you do not
wish to be identified on air, that's all jolly good.
Just explicitly tell us that.

Speaker 3 (04:50):
Or if you just want to tell us something and
you don't want it to go on air at all,
tell us that too.

Speaker 2 (04:54):
Or you use one of those voice disguisers the serial
killers use. That's entirely up to you and on your dime,
serial killers.

Speaker 1 (05:00):
That's profiling.

Speaker 2 (05:01):
That's fair, that's fair. Here's a thing we never mention
at the top of the show that I would like
to today. If you would be so kind, please leave
us a review on whatever podcast platform of choice allows
you to do so, preferably Apple Podcasts. I think that's
what they're calling it these days. Say something nice. It
helps with the algorithm, it helps raise the profile of
the show, and honestly, considering how many people listen to
the show, the number of reviews we have is a travesty.

(05:24):
So help us with that.

Speaker 1 (05:25):
Yes, vote your conscience please. So before you heard our
our weird voices coming through your ear holes, you probably
heard an add Maybe it was a promotion for one
of our many new podcasts. Maybe it was some some

(05:45):
offer to go to a website like Great Courses or
something like that. A long time ago, years and years back, Matt,
you and I had an unsuccessful experiment. We've had many experiences,
yes this right, Yes, I do. Years ago we decided
we're looking at big data and advertising in general in

(06:06):
these our modern days, to try and count every single
ad we saw in the space of one day. And
we failed.

Speaker 2 (06:16):
Yeah, because eventually you just give up.

Speaker 1 (06:18):
I lost count after two hundred.

Speaker 2 (06:20):
Yeah.

Speaker 3 (06:21):
Yeah, and I remember we had a lengthy discussion about what
constitutes an ad, right? Is a logo, like just a
singular logo without any other words, or a picture that
represents a company,

Speaker 2 (06:32):
is that an ad?

Speaker 3 (06:33):
I would argue yes. I think so too. And in
that case, I mean, we weren't even counting correctly,
because there were so many of those that we encounter
every day.

Speaker 2 (06:42):
Although the nature of who pays for the ad or
how the ad is distributed probably should be on the table,
because if I have an Adidas hoodie, I paid money
for that Adidas hoodie because I think it's cool, but
it's also technically an ad for Adidas, and they didn't
buy it, though. I paid them for the privilege of
wearing their ad.

Speaker 1 (06:58):
Right, you paid them for the privilege of advertising for

Speaker 2 (07:00):
Them. That's right. So congrats.

Speaker 1 (07:03):
That's why I usually am averse to many brands.
But you know, that's the nature
of the beast, right? People like those identifiers. Ads are
not necessarily a terrible thing. It would be ridiculous for
us to argue that on this show, right, because there
will be an ad here at some point. However, ads

(07:26):
are so ubiquitous. Advertisement reaches us in some form every day, and most
of us are surrounded by institutions, technologies, and various groups
that try ardently to collect as much information about us
as possible and then analyze it six ways from Sunday.
And you know, often they get things hilariously wrong. Like
have you... you've probably had this thing where

(07:48):
you order something that's maybe a one time purchase. You
get like a toilet seat replacement or whatever, and then
the next thing, you know, for a week or two,
you're inundated with these ads for toilet seats, like there's
some kind of regular toilet seat enthusiast, like he's some
sort of addict. And Facebook is thinking, well, if we
just keep showing them the ad, look, what are they
gonna do? Are they gonna say just one more? I'll

(08:10):
treat myself. I guess this.

Speaker 3 (08:11):
Guy's on a custom toilet seat kick right now. We
really got to flood in, strike while the
iron's hot.

Speaker 1 (08:17):
Or you buy a car and they're like, you know what,
this person who just bought a Corvette needs three more Corvettes.

Speaker 3 (08:23):
Yeah, just wait till you have a child. Ben, one
day you're going to have a child. I can
foresee it. I had a prophecy. And when you
have a child, you realize just how many ads are
thrown at you about having a baby.

Speaker 1 (08:38):
Yeah, and it gets pretty invasive, right. There are not
as many privacy safeguards as one would expect when it
comes to the world of data aggregation. We always talk
about the famous, and completely true by the way folks,
story about Target predicting a pregnancy in such a way
that surprised the parents of the expecting mother. At times,

(09:01):
advertising now, especially in the digital space, can feel close
to telepathy, or an attempt at it, but it's not quite
there yet, and that is what today's episode is about
in large part. So here are the facts. It's no
secret that advertisers are infamous for their relentless pursuit to
learn everything they can about you, no matter who you are,

(09:24):
in the hopes of selling you something at some point. Yeah.

Speaker 3 (09:28):
But there's this one thing that's always eluded all of
their efforts, all of these companies, however much
time and money they've spent on this. And it is
your physical bodily reaction to everything that they're doing, to
the ads that they're putting in front of your face,
to the... how bright, how dark, what color? All

(09:49):
of these things they cannot tell precisely how you are
reacting in real time.

Speaker 1 (09:55):
Which just to give you a sense of the stakes
here and how much information these organizations would like to collect.
If you had, for some reason, I'm not making fun
of anybody, if you had, for some reason a physical
reaction where you had like a little anxiety fart every
time you saw something, they would want to know.

Speaker 3 (10:13):
That. Yes, yes, yes. Just so. Well, that was one.

Speaker 1 (10:19):
That was one. There's not really a limit for data collectors.
There's not a moment where they would say, well, this
is too much. We're getting a little weird right now.
Whether we're talking billboards or banners or ads
on your phone or your freemium game or whatever you're playing.

(10:40):
Advertisers have always had to rely on other indicators to
see whether they could grab someone's attention. So they know,
for instance, in a YouTube ad or something where you
have five seconds before you can click right, they know
that you are doing something during those five seconds, But
are you paying attention to the ad, or are you, you know, like, oh,

(11:02):
I'm gonna, I don't know, drink soda.

Speaker 3 (11:05):
Well, yeah, especially just that concept. If, let's
say, you're using your phone, right, and you've got
YouTube playing, a YouTube ad comes up, how are they
going to know if you're actually looking at that ad
or just looking somewhere else while it's playing, or even
putting your phone down. Right now, there are ways on
most smartphones to tell if you've put it down, if
you've turned the screen off, if you've clicked. They definitely

(11:27):
know if you've done anything like that. But as of
right now, nobody is... well, at least... oh god, we're
about to get into it, but nobody's using the camera
to actually look at you while you're looking at.

Speaker 1 (11:41):
It. Officially, officially, officially. There are hacks aplenty. And
who reads their terms of service? You know what I mean?
Who does? I had an old friend, professor at a
local university here who was a graphic design guy, and
he was working on a project to make terms of

(12:02):
service and those agreements more readable. Oh, like a really
cool thing that automatically sorted the stuff you should be
concerned about and popped it up at the top.

Speaker 2 (12:12):
That's really great, nice and cool.

Speaker 1 (12:14):
His funding was pulled. Oh yeah, there's just not an
incentive for it. Yeah. So here's a little experiment that
we found in Slate, and I want you to
think about this just for a moment. Don't do this
if you're driving, if you're in the middle of like
your MMA training or whatever. Just wait, wait till you
have a safe place to sit down.

Speaker 3 (12:33):
Oh wait, and I've got an idea. If you can
find a mirror, look into that mirror while we read
off this list.

Speaker 1 (12:41):
Oh, interesting. That's a little different.

Speaker 3 (12:43):
Yeah, we're mixing the senses here in the reaction.
But we just want to see if.

Speaker 2 (12:48):
You notice anything.

Speaker 1 (12:48):
Okay, so consider for a moment, mirrors at the ready
the following list.

Speaker 2 (12:54):
Republican, abortion, Democrat, future, Afghanistan, healthcare.

Speaker 3 (13:03):
Same sex marriage. Now we've tried, I think all of
us to just say these as straight as we can,
just like say the word without trying to add too
much to it. But we want to know: was
there any reaction in your
face when you heard any of those words?

Speaker 1 (13:20):
Maybe a dilation of the pupils at one point, right, or a narrowing.
Maybe your eyes flicked in one direction or another.
Did you notice a blink that ordinarily would have passed
you by. There is a tremendous, profound, enormous amount of
information reflected in the way you just read that list,

(13:40):
if you read it written down, or if you're watching
your own reactions.

Speaker 3 (13:45):
Honestly, we don't know exactly how it's going to be
if you just hear it precisely, but for certain there
was some kind of reaction.

Speaker 1 (13:53):
Sound is a tremendously powerful thing. I learned a cool fact
about sound. This is a total tangent. Okay, okay, So
this came to us from Will Pearson, a friend of
the show and a host of our podcast, Part Time
Genius. Will kicked a couple studies to us that showed
sound can affect your experiences in subconscious ways

(14:15):
that you are not going to be aware of. If
you are eating, for instance, stale Pringles potato chips
and someone is playing the sounds of some crunchy thing,
then your brain will believe that the chips themselves are
fresh and not stale. Brilliant. It's so
easy to affect people's sensory input. But here's what happened.

(14:36):
If you read those words that we just intoned,
then you may notice that your eyes paused for a
fraction of a second on certain words. And this is
definitely true for most of us because we put in
some phrases that could be considered oppositional right, Democrat, republican,

(14:56):
same sex marriage, et cetera. These things can be divisive
for some people in the US. And probably what was
happening was our brains as we read along. We're trying
to connect these things, even though they just exist independently,
right as fragments of a sentence. But did your pupils
dilate a little bit while you were reading the list, and at

(15:17):
which word, you know? Did you blink at a different rate?
Did you backtrack to reread any words? And if so,
which ones, when, and how long? Eye tracking can show
you all of this if you are an advertiser or,
even more disturbingly, a government. This concept has also been
called the holy grail of advertising. It uses the image
from one or more cameras to capture changes in movements

(15:40):
and structure of our eyes, and it can measure all
of these things with scary Orwellian accuracy. Yeah, that's
where we're at right now. That's right now. It
hasn't really mainstreamed yet.
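To make the mechanics concrete, here is a minimal Python sketch of the consumer-grade version of what's being described: OpenCV's bundled Haar cascades find a face and eyes in webcam frames, and a frame where the face is visible but the eyes are not is counted as a likely blink. This is our own illustrative toy under those assumptions, not any advertiser's actual pipeline; real eye trackers model pupil center and diameter far more precisely, often with infrared illumination.

```python
# Toy sketch of camera-based eye watching. Haar-cascade dropout is a
# crude blink proxy; commercial trackers measure pupil position/size.
import cv2

face_c = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_c = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)          # default webcam
closed, frames = 0, 0
while frames < 300:                # roughly ten seconds at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_c.detectMultiScale(gray, 1.3, 5):
        # Search for eyes only inside the detected face region.
        eyes = eye_c.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) == 0:         # face present, eyes not: likely closed
            closed += 1
    frames += 1
cap.release()
print(f"{closed} likely eyes-closed frames out of {frames}")
```

Even this crude dropout trick yields a blink-rate signal; add pupil measurement and fixation mapping and you approach the "which word did you pause on" data described above.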

Speaker 3 (15:56):
Well, yeah, it's still one of those technologies and
approaches that you basically have to use in a
testing environment right now. Because can you imagine,
let's say everyone right now has an
Apple phone, some kind of Apple phone, and it just rolls
out and Apple lets you know, hey, just so

(16:16):
you know, your front facing camera is going to be
watching you at all times. Whenever you're looking at your phone,
it's going to be tracking exactly what your eyes are doing,
and it's going to be taking that data and sending
it back to Apple.

Speaker 1 (16:27):
Do you think they'll keep Siri as the branding or
do you think they'll make a new, ostensibly friendly entity
for that? It's like, you know, every time you pull
up your phone, if you don't have time to type,
don't worry. Danny can tell what you're looking at, and
Danny'll help you.

Speaker 2 (16:43):
I feel like Siri's been looked at as kind of
a flop anyway, so maybe it's time to rebrand.

Speaker 1 (16:47):
Siri has been doing some great work for memes though.
Is that right? Yeah, just like the meme economy. All right,
I don't know how.

Speaker 2 (16:54):
I don't know any Siri memes. I don't use Siri,
so.

Speaker 3 (16:58):
Wait, okay. I'm trying to think about your question,
sure, with, uh, Siri and what it's going to be.
And it definitely feels like a Black Mirror episode.
We were just talking with our producers out there, Maya
and Seth. By the way, shadow producer Seth, it's literally

(17:18):
his first day.

Speaker 2 (17:19):
It's his first day, yeah.

Speaker 3 (17:21):
And we were talking with him about how Black Mirror
this topic in this episode is. Oh, I'm trying
to get back to what I was going to talk about.
I got lost on Seth there for a second. Okay,
So we were discussing how Siri as
a thing feels like it will be what the new
phone is. It won't be like a phone. It will

(17:43):
be a Siri. Yes, yes, some kind of personified thing
that exists that you just interact with as another.

Speaker 1 (17:54):
Being, something approaching AI as a personal assistant.

Speaker 2 (17:57):
That's what it feels like. But it will just be
the device you have on you at all

Speaker 1 (18:00):
Time, and eye tracking would be a fundamental piece of that. Again,
this stuff is not a theory. This is not necessarily alarmist.
We're working to stay away from being
too scary about this. But this is where we are
right now, and we'll pause for a word from our sponsors. Yeah,

(18:23):
and we'll tell you what's happening pretty soon, sooner than
you think.

Speaker 3 (18:27):
Yeah, make sure you do not pause. Just listen all
the way through this, and keep your eyes as open
as you can, and stare as straight forward as you can
while listening to the sponsor.

Speaker 1 (18:36):
Take your hands, take your pointer fingers and
your thumbs, and hold them ratcheted open.

Speaker 2 (18:42):
And let's see where everybody is.

Speaker 1 (18:44):
I feel silly. No one can see us doing this.

Speaker 2 (18:47):
We both did it though.

Speaker 1 (18:56):
We're back. And before we vilify eye tracking, which you know, honestly,
I am totally down to do, we should mention that
it's not an inherently evil or bad thing. Right. There
are some advantages.

Speaker 2 (19:10):
Yeah, so for folks who are into this technology, there
is an enormous benefit for people who have disabilities, or,
for example, in the creation of immersive games. But how far
does it go, and what are the implications? These are
answers that may disturb you.

Speaker 1 (19:28):
Definitely. Here's where it gets crazy. You're absolutely right. Okay. First,
the argument about disabilities is entirely compelling and valid. If
someone is, for example, paralyzed below the neck, then they'll
rely on mouth movements maybe, or there are already forms

(19:48):
of eye tracking I believe that exist in order for
them to maybe type with their eyes or something like that.
This takes it to another level, and it could be
of tremendous and significant benefit to these people. It
also is, as you said, Noel, it's probably great for
VR games. Oh yeah, I haven't tried it.

Speaker 3 (20:09):
But it could be incredible instead of having to... right, yeah,
it would make a lot of sense.

Speaker 1 (20:15):
Also, I just don't plan to. I wear sunglasses
all the time, but it's not because the lights are
too bright. It's because I already in real life don't
want people to know where I'm looking, you know what
I mean. I can't handle taking that to a digital realm.

Speaker 2 (20:29):
It's all good man, Well.

Speaker 1 (20:31):
Here we are right now. Eye tracking seems like a lovely,
nice convenience. But the implications are far-reaching and, to many,
deeply disturbing, and deeply concerning at the very least. And
we're always really careful to separate our opinions from the facts,
so I will say it. We always try to be
very explicit about this. It is my opinion that this

(20:54):
is a bad thing, and I'm glad
to defend why, because usually the three of us
are on the same page. But if there's a disagreement,
I am glad to... I want to be
persuaded that this is overall a good thing. I don't
buy it.

Speaker 3 (21:10):
Okay. Well, I'm going to go into it as though,
just for funsies, I believe that it's the greatest
thing ever and we have to institute it, just to
make sure that we know everything about you that we
need to know. Yes, we,
all of us.

Speaker 1 (21:26):
Right. Well, first things first, as we said, this isn't theoretical,
it is happening. There is virtually no government oversight. Zero
government oversight.

Speaker 3 (21:36):
Sounds good, sounds like something the private entities.

Speaker 1 (21:39):
Want. And that's, I mean, that's a larger trend
we're seeing, you know, private control or private subversion
of things that once existed as legal oversight in the
governmental sphere. For example, there used to be the do
Not Call list. Do you guys remember that?

Speaker 2 (21:56):
Oh yeah, I was on it. It's still a thing,
right? It is. But does it work? And I guess
it doesn't. I mean, I've had people... you
know when you call and it's a human? That's the
problem though, right now: more often than not, it's
a robot. So you can't really yell at a robot
and say, put me on your do-not-call list.

Speaker 1 (22:11):
Right, yeah, exactly right. And we have to ask ourselves,
does the do-not-call list still exist? I mean,
these kinds of intrusions are becoming more and
more frequent, and more and more invasive, and really, private
industry is leading the charge. Apple has filed

(22:32):
a patent for a three-dimensional eye tracking user interface.
There's a European company called Senseye (I get it, yep,
because it's like an eye) that worked
on this. It wants to have eye tracking software baked
into smartphones very soon. And as this stuff becomes increasingly

(22:53):
deployed in your laptops and your tablets, in your phones
and your TVs (the freezer, yeah, the smart refrigerator), this
will open a new front in the debate over privacy.
As it stands now, privacy is very much an endangered species.

Speaker 2 (23:12):
Yeah.

Speaker 3 (23:13):
We talked about it as a good thing with respect
to virtual reality game playing, or however else you're
gonna be using VR or AR glasses, like the old
Google Glass or some of the new prototypes that are
coming out. Eye tracking will, pretty much in the near
future enable all these companies to collect your most intimate

(23:33):
and, I mean, for the most part, subconscious or unconscious
responses to the entire world around you, any and all stimuli.
They will have cameras facing you for so long and
at all times that literally everything you encounter will be recorded.

Speaker 1 (23:51):
I thought we weren't gonna go dystopian this time around.

Speaker 2 (23:54):
Well, I mean, unfortunately, this is... this is...

Speaker 3 (23:59):
Okay. I'm supposed to be playing the part of the one who's
positive about this, right? I'm not gonna have any kind
of existential breakdown right now about this.

Speaker 2 (24:06):
It's fine, I'll do it.

Speaker 1 (24:08):
We can switch sides if you want.

Speaker 2 (24:09):
Can we switch that? Okay?

Speaker 1 (24:11):
Okay, here, let's see. Let me take a swing at
this on the positive side. All right. So, eye tracking technology.
My position now is that it's great, it's superb, it's top
notch, because it helps advertising become more effective.

Speaker 3 (24:29):
Okay. So I'm only going to see the ads that
really do pertain to or interest me.

Speaker 1 (24:35):
Right, So you're looking at your phone or you're looking
at your tablet or your laptop, and advertisers receive this
feedback that says, oh, Matt Frederick's eyes lingered for a
few seconds on an advertisement, but then he didn't click
on it. So how does he really feel about toilet
seats or Corvettes? How did Matt's eyes move as he

(24:57):
looked at an ad? What did he look at first?
We prioritize that in the future, right, to bring the
Corvette more to the forefront. Are there certain words, phrases,
or topics that are things Matt Frederick loves,
or things that are super, like, turn-offs for him,
Like whenever he sees something that mentions velcro, he's like,
screw this, burn the system down, I'm going outside.
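As a hedged illustration of the feedback loop being described here, below is a short Python sketch that turns raw gaze samples into per-ad dwell times. The region names, coordinates, sample data, and the 30-samples-per-second rate are all invented for the example; no real ad platform's format is implied.

```python
# Hypothetical sketch: aggregate gaze samples into per-ad dwell times.
from collections import defaultdict

# Each ad slot is a named axis-aligned box on screen: (x0, y0, x1, y1).
# These regions are made up for the illustration.
AD_REGIONS = {
    "banner_top": (0, 0, 1080, 180),
    "feed_ad": (90, 600, 990, 1200),
}

def dwell_times(gaze_samples, regions, sample_dt=1 / 30):
    """Sum seconds spent inside each region.

    gaze_samples: iterable of (x, y) screen coordinates captured at a
    fixed rate (assumed here to be 30 samples per second).
    """
    totals = defaultdict(float)
    for x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += sample_dt
    return dict(totals)

# Two seconds of fake gaze data hovering over the feed ad, no click:
samples = [(540, 900)] * 60
print(dwell_times(samples, AD_REGIONS))  # {'feed_ad': 2.0}
```

The point of the sketch is that dwell time alone, with no click at all, is already a usable signal.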

Speaker 3 (25:22):
Yeah, or pickled okra, like get out of here, get
out of here with you.

Speaker 2 (25:26):
Just don't even put that on my phone.

Speaker 1 (25:28):
So then this means that successive ad campaigns will be
more tailored to you. The argument on paper is probably
something like they're more organic, or they're integrated, or that wait,
I'm going back to my real side, hang on, yeah,
or they're less invasive because it's something that you are
predisposed to show an interest in.

Speaker 3 (25:50):
Okay, all right, well I'm coming back on your side now.
So when companies are trying to figure out what kinds
of stories we want to hear about, what kinds of
movies and television should be produced, they can think about
all of these different things in a giant cloud of
words and phrases that they can mix together and make

(26:11):
me the stories that I want to see.

Speaker 2 (26:13):
That's great.

Speaker 1 (26:14):
And the way they're able to do this is through
the collection and preservation, the hoarding it would be fair to say,
of the data they collect from the, again, largely unconscious
and largely uncontrollable movement of your eyes. So we're talking
about millions of people's eyeball footprints for lack of a

(26:36):
better word, and then hundreds, probably thousands of companies in
the near to mid future who will collect this across
the digital ecosystem. What will they do with all that information?
Whatever they want. Yay. That is ultimately the answer. That's probably...

(26:57):
I mean there will be people who want to split
some hairs over it, but ultimately, whatever they want.

Speaker 2 (27:03):
It's like anything else on the internet, right, I mean,
it's all of the stuff that you pump into Facebook
knowingly, knowing that that is what pays for your privilege to
use Facebook. It's just going to get more and more specific.
Before you know it, we're gonna have, like, biometric scanning
that we're agreeing to allow to be processed, and we have
no way of knowing what the end results are or
the end user is going to do with that information.

(27:24):
Anybody that works at MailChimp already gave this stuff over.
That's right. Well, that's the thing too. Like, for example,
this is a little bit off the subject, but it's similar.
Like when marijuana was recreationally legalized for the first time,
like in Los Angeles, and probably even when it was
you know, prescription, you had to let them scan your ID.
You had to let them scan your ID into some database,

(27:46):
and you don't really have any control over what they
do with that. It gives you the privilege of buying
marijuana legally, but you're sort of rolling the dice as
to is this going to come back and bite me
in the ass if it's not legal in my state. Like,
there's all kinds of different things.

Speaker 1 (27:59):
What if you get a job at a federal organization,
you know.

Speaker 3 (28:03):
And you know they're doing the same thing with nicotine
sales in Georgia right now, or at least they're proposing,
and I think it's not officially in place yet, but it's
being done in some places here in Georgia where if
you buy any nicotine product, they scan your ID and
imagine the implications of that. Then it goes back to
your healthcare provider and all of that.

Speaker 1 (28:22):
It's like Flannery O'Connor said, all that rises must converge, right,
but now in a really uncool way. Sorry, Flannery. Yes.

Speaker 2 (28:30):
But also, thank her for saying that. It's true.

Speaker 1 (28:34):
So this is true, and that's an excellent point.
The purpose is to aggregate this and get as close
to predictive potential as possible. In theory, this stuff would
be anonymous data, which they would describe as non-personal.
But in practice it's going to be very easy to

(28:54):
penetrate the anonymity of that. Because think about it: your
eye movements will be largely unique. They will be largely
tied to a device that is already identified with you,
your smartphone, right, where the GPS is trackable as well,
so they know where you are when you're looking at things.
Sort of like Uber, I believe it was. Uber is

(29:17):
using some of that same technology to float the idea
of a system or a function in their software where
they can not pick up people who have been drinking,
which to me sort of defeats the purpose of a
lot of Uber rides. But they can see and they'll say, okay, well,
this person is calling, this person is calling a ride,

(29:37):
but their phone says they've been at this place for
you know, three hours and it's a bar or something.

Speaker 2 (29:44):
And you're just like, I'm not picking up that drunk patron.

Speaker 1 (29:47):
Right or something like that. But again, that's what Uber
drivers and Uber customers have signed on for. I'm just
saying there's a lot that we can get from phones
and these other similar devices that you might not suspect.
And now this is going to
kick it up to an entirely new level. Eye tracking

(30:09):
from tablets and smartphones is tied to a unique device
identifier associated with one specific device. So maybe they don't
know that your full name is Noel Brown or Matt
Frederick or Maya Cole, but they will know that it is,
you know, an AT&T iPhone six s or whatever,

(30:32):
and they'll know where it is. So because this ties into
location tracking information, it can also tie to the locations
that you show up at all the time. Right now, most of
these studies on this eye tracking stuff are being conducted
in closed environments. But the technology has already

(30:54):
been out there. What we're telling you about is not
just happening; it already happened. And like the rumors you've been
reading about regarding Facebook's microphone being on or something like that,
it's quite possible that you already unknowingly have an app
or some piece of software on some electronic device that

(31:16):
is trying to do something like this with your eyes.
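To illustrate why "anonymous," device-keyed eye data is so easy to re-identify, here is a toy Python example: joining a gaze log against a location trace on the shared device ID. Every record, field name, and value here is fabricated for the illustration; this is a sketch of the general quasi-identifier problem, not any company's actual schema.

```python
# Toy re-identification example: 'anonymous' gaze events are keyed to a
# device ID, and the same device ID appears in a location trace.
gaze_log = [
    {"device_id": "a1b2c3", "looked_at": "ad_corvette", "dwell_s": 3.1},
]
location_trace = [
    {"device_id": "a1b2c3", "place": "same home address, most nights"},
]

by_device = {row["device_id"]: row for row in location_trace}
for event in gaze_log:
    where = by_device.get(event["device_id"])
    if where:
        # The 'anonymous' eyeball data now has a home address attached.
        print(event["looked_at"], event["dwell_s"], "<->", where["place"])
```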

Speaker 2 (31:19):
Yeah.

Speaker 3 (31:20):
And you know, just as Ben said, when we come
back after this quick sponsor break, we're gonna talk about
how, towards the end there... this is
literally like one tiny cut, right, of the thousand cuts
that we're giving ourselves, that will eventually bring about

Speaker 2 (31:41):
The gray goo. Grey Goose.

Speaker 3 (31:44):
I'm just kidding. Don't worry. I'm just
saying that the machines will be taking over really soon,
and eye tracking is just one tiny part.

Speaker 1 (31:53):
Here's our sponsor, the Singularity, right, That's it.

Speaker 2 (32:03):
Yeah, nanobots. We're talking about premium vodka here, gents: the
Grey Goose, the gray goo, the goose.
You gotta get that goose. Get goosed.

Speaker 1 (32:12):
You heard it here first, if you've never heard of vodka. Okay,
that potato juice, right? Yeah, yeah, that
sweet, sweet Moscow potato juice. Let's just start
calling vodka that. Can we call it Moscow potato juice?

Speaker 3 (32:26):
Drink responsibly, and only if you're over twenty one, or
whatever the legal age is in whatever country you reside.

Speaker 2 (32:32):
If you're in Russia, then whatever age. Yeah,
or don't drink alcohol. That's also a thing. Cranberry juice.

Speaker 1 (32:39):
You can just eat the potato. I feel like we
helped a lot of people with that. I think that's good.

Speaker 2 (32:45):
It should be a shirt.

Speaker 1 (32:46):
Guy.

Speaker 2 (32:46):
I like that you can just eat the potato.

Speaker 3 (32:49):
Hey, guys, where did your eyes go
when we just started talking about potatoes and vodka and
all that stuff?

Speaker 2 (32:55):
What happened to your eyes? We need to know. Send
us your metrics, please.

Speaker 1 (33:00):
Oh wow. Oh, please feel free to not do that, okay,
because we don't want to be held responsible. It turns
out that there are many applications for this eye tracking technology,
and this is incredibly dicey, ethically fraught territory. We talked
about cracking privacy. It's gonna be easier than ever once
this stuff is out and mainstreamed and it's fully

(33:23):
acknowledged as the new normal. You're just gonna have to
treat everything you do as though it is a matter
of public record, at least online. But there's another more
troubling thing, I would argue, which is the concept
of pre-crime. Remember Minority Report? Yes, yes,
this could happen. As a matter of fact, it

(33:43):
is happening. That's the thing that I keep trying to
emphasize here.

Speaker 2 (33:49):
The gray goo. You're talking about the goo that the
precogs are floating around in. That's it, is it?

Speaker 3 (33:54):
No, no, I was talking about the nano... essentially, just the

Speaker 1 (33:58):
Nano nanobots functioning as like the philosopher's stone.

Speaker 2 (34:01):
Right.

Speaker 1 (34:02):
It's transformative.

Speaker 3 (34:03):
It can create whatever it wants, but it
can also just cover the earth if it continues to
replicate without any controller.

Speaker 2 (34:10):
What's the goo?

Speaker 3 (34:12):
It's like a gray substance, like a gooey substance of nanobots.

Speaker 1 (34:16):
It's a thought experiment. Yeah, it's not real
yet, but we do know that... oh wait, we
have to do my... When did Minority Report come out?
Are we past a

Speaker 2 (34:29):
Statute? Ninety nine? Okay, early two thousands, so we're solid.

Speaker 1 (34:33):
But anyway, spoiler alert: here be spoilers after the countdown.
three two one.

Speaker 2 (34:38):
Two thousand and two. Spoilers.

Speaker 1 (34:41):
So, Minority Report, and we've all seen this film.
In case you haven't seen it, Tom Cruise attempts to
act as a member of law enforcement in the
near future in a pre crime division. And the way
that the pre crime works is the precognitive triplets that

(35:04):
you mentioned earlier, right? Now, they have these dreams while
they float in this pool, and then their dreams are
inscribed on these little wooden balls, and then Tom Cruise
reads them and then attempts to stop a crime before
it occurs. Now, that's still kind of sci-fi,
but not as sci fi as you might think, all.

Speaker 2 (35:27):
All the while manipulating things on kind of a holographic touch screen, yeah,
which is largely the way we use tablets and stuff now.
That sort of pre... precogged, you know, our use
of touch screens.

Speaker 1 (35:40):
There we go. Yeah, so where are we at in
real life with pre crime? Yes, it is a thing.
We have to recognize that although companies and private organizations
will take the brunt of criticism for this, law enforcement
and security will have big roles to play, and this
becomes very dangerous when we talk about harsher nation states.

(36:04):
Researchers in the US and the UK have mapped the
correlation between blink rates, pupil dilation, and deceptive statements already, right, yeah,
this is like a polygraph, if polygraphs actually worked. So
the Department of Homeland Security has already been developing a
pre crime program aimed at identifying criminals before they act.
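For a sense of how such screening could work mechanically, here is a hedged Python sketch that flags large deviations in blink rate and pupil diameter from a person's own baseline. The thresholds, baseline numbers, and readings are invented for the example; DHS has not published FAST's actual algorithm, so this is a generic anomaly check, not the real thing.

```python
# Hedged sketch of baseline-deviation screening. All numbers fabricated;
# not DHS's actual FAST method, which is not publicly documented.
from statistics import mean, stdev

def zscore(value, baseline):
    mu, sigma = mean(baseline), stdev(baseline)
    return (value - mu) / sigma if sigma else 0.0

# Fabricated 'calm' baseline readings for one subject.
baseline_blinks_per_min = [14, 16, 15, 17, 15]
baseline_pupil_mm = [3.1, 3.0, 3.2, 3.1, 3.0]

def flag(blinks_now, pupil_now, threshold=2.5):
    """Return (flagged, z-scores): flag if either signal deviates hard."""
    scores = (zscore(blinks_now, baseline_blinks_per_min),
              zscore(pupil_now, baseline_pupil_mm))
    return any(abs(s) > threshold for s in scores), scores

print(flag(28, 4.0))  # elevated blink rate and dilation -> flagged
```

Note the obvious weakness, which the hosts raise next: an elevated reading proves arousal or stress, not intent, which is exactly the due-process problem with pre-crime.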

Speaker 2 (36:27):
Yikes, Seth, Yes, Seth, we knew it before you even
got in the room, buddy.

Speaker 3 (36:35):
We're totally joking. Of course, Seth is wonderful, although we
have only met him for the first time today.

Speaker 1 (36:40):
That's true. But I feel a connection. I'm having like
a Cloud Atlas moment.

Speaker 3 (36:43):
Okay, yeah, I see him. Okay, yeah, Maya's been
over there the whole time.

Speaker 2 (36:47):
I can tell. So it's true.

Speaker 1 (36:50):
Though, this DHS program is called, in a burst of creativity,
Future Attribute Screening Technology, or FAST. Huh? How's that for you?

Speaker 2 (37:01):
Hey, thanks, man. You know I'm a real acronym buff.
So what is FAST?

Speaker 1 (37:06):
What does it do?

Speaker 2 (37:06):
It's when things move at an alarming rate of speed.

Speaker 3 (37:10):
Yeah, yes, that's essentially what's going to be happening here very soon.
But yeah, this program is designed to take images and analyze them,
specifically at airport security

Speaker 1 (37:22):
Checkpoints, which is already being instituted. Those facial scans, have
you guys seen them? If you travel fairly often, as
we do here, then you'll notice they've become sort of
like the way that Sesame Credit was originally instituted: it
was opt-in, and now it's mandatory.

Speaker 3 (37:39):
Yeah. And it's just to literally scan your face and
your eyes and what they're doing as you're going through
the airport and through the security process and all the
other things, just to see: are they doing any
kind of weird movement? Is there any kind of, like...
what is your eye position and your gaze
at any given time? What are you staring at? Are
you looking at the security guards? Are you looking

(38:00):
at this certain bag? Are you giving off any kind
of facial expressions that would seem suspicious?

Speaker 2 (38:07):
Are you just are you.

Speaker 1 (38:10):
Why are you breathing so heavy? Were you running?

Speaker 2 (38:12):
That's it? Is your heart rate elevated right now? Why
is your heart rate up so high?

Speaker 3 (38:15):
It's just the dogs that are walking past right now
checking everybody's bags.

Speaker 2 (38:20):
It's fine.

Speaker 1 (38:21):
So this is a very controversial application of this because yes,
we could say that this has the potential to prevent
crime and save lives.

Speaker 2 (38:32):
Right.

Speaker 1 (38:32):
For instance, America has a tremendous, ongoing and tragic problem
with mass shootings, which occur at a cartoonish and fundamentally
offensive rate, one being too many.

Speaker 2 (38:46):
Yeah. Well, so, in theory, like, if there were cameras
set up in a way that could do eye tracking
in a public space, and someone was being what one
might describe as shifty, as in darting their eyes around
at a weird, you know, kind of clip, that
might be a flag.

Speaker 1 (39:02):
Right. So that's the question: could this prevent those sorts
of tragedies? It's weird, because it's tough to argue this.
Let's say, for instance, that someone
snaps and they go to a public place with guns
or maybe an explosive or something, but they're clocked by

(39:22):
a camera and then they're arrested. Well, unless they wrote
down or publicly stated what their intent was, they would
just have legal possession of a firearm, you know
what I mean. And so how can we prove what
would have happened would have happened.

Speaker 2 (39:38):
That's why pre-crime doesn't work, right? Not yet. Yeah,
it basically amounts to depriving someone of due process.

Speaker 1 (39:45):
That's a good point. Yeah, yeah, And the technology always
outpaces the legislation. It's something that we see time and
time and time again, but the effects go beyond the
computer screen as well. Researchers are testing a new product
called Sideways. Sideways can track what products catch your attention

(40:07):
on the shelves in a local brick and mortar store.
I have seen this, really, at a grocery store or something.

Speaker 3 (40:14):
I don't know if it's this particular, like, Sideways™ version,
but where there will be a camera set up in
a specific aisle where you know, it looks as though
it could just be for security purposes to check and see.
Maybe if you're in a grocery store and there's a
security camera right next to some of the medications or something,

(40:36):
maybe a hot ticket item that's expensive and.

Speaker 2 (40:38):
Small, right, okay, but I swear to you.

Speaker 3 (40:41):
Not only is a camera set up there,
it's got the display right there, and it says it's
recording while it's doing it. And again, it could be
just CCTV, But why would you not take something like
that and then be looking at how people are moving their
eyes and how they're looking at things, how they're reacting?
Because as a security officer, you could probably ascertain whether

(41:05):
something wrong is going on by using the same things
DHS is doing. Why would Kroger... oh, sorry, I said it.
Why would Kroger or Publix or one of these other
big companies not be working with some advertising company to
have data on that stuff?

Speaker 1 (41:20):
Absolutely. Yeah, the overhead investment is just pinto beans in
comparison to, you know, the possible profit potential.

Speaker 2 (41:30):
And is that the kind of thing you would have
to opt out of or could you even opt out
of it if you didn't even know what's going on? Right?

Speaker 1 (41:36):
Right? Where's the informed consent?

Speaker 2 (41:38):
Yeah, you're in the store. But
that could be one of those things where it's like, okay,
let's say I go to a music festival and there are
signs saying, by the very fact that you are putting
yourself in this environment, you are de facto consenting to
be filmed, right right, right, which maybe doesn't hold up
in court.

Speaker 1 (41:54):
I don't know that it does in public, Like if
we're walking down the street.

Speaker 2 (41:59):
Is it public though? If it's a gated thing
you need money to go into, you know what I mean?

Speaker 1 (42:03):
Yeah, I guess it's up to them then, right.

Speaker 2 (42:06):
I just wonder how this would work. Obviously
we're not there yet, but I think this begs the
question of, like, how responsible do these companies have to
be to let people know that this is going on,
This kind of data collection is going on.

Speaker 1 (42:19):
Cough cough, not very cough. Yeah they don't. I mean
there's not. Again, there's not too much oversight at least,
like, let's go back to this Sideways, or stuff like Sideways.
So you walk up to the item. Let's say you're
deciding between the store brand, "we have cereal at home"
type of cereal, yeah, or maybe you're gonna splurge and

(42:40):
get the Kashi. Yeah, get the Kashi. There you go
like I earned it, I earned this fiber or whatever.
That's right. That's how you party. You're that person,
and the grocery store has this system that will look
at not just what you ended up getting, but will
tell us a little bit about the decision process you

(43:01):
went through, what's called the decision tree, that led you
to Kashi, right. And from that you will probably have
some sort of linked contact avenue.

Speaker 3 (43:13):
Right.

Speaker 1 (43:13):
You'll get more. Even if you don't
sign up for the Kroger app or whatever, you
will be able to get targeted ads through Instagram, for instance.
And eye tracking is going to be huge for Instagram.
What kind of stuff can people learn about you from
your eye movements? This is where things

(43:34):
become very dangerous, possibly fatal, and we'll explain why. So
it shows not just what you're focusing on, but the
order of operations that you use when thinking through
a visual stimulus: both where you tend to look in general,
like does this person tend to read the captions
first or something (nobody does), and what you tend to

(43:59):
look at and look for specifically?

Speaker 2 (44:02):
Are you just looking at the price? Is that how
you're shopping?

Speaker 4 (44:05):
Right?

Speaker 2 (44:05):
Are you looking at the name brands?

Speaker 1 (44:07):
Is the appearance of a certain thing or keyword more
likely to sway you toward buying something? Is a

Speaker 3 (44:15):
Bright color standing out to the majority of people who
walk up to this section of the store.

Speaker 1 (44:20):
Does someone just have to slap Tom Cruise's face on
anything and you will buy it? Yes?

Speaker 2 (44:26):
When you said that, I was just picturing
somebody slapping Tom Cruise's face when he jumps up on
the couch on Oprah. It's like, get off my couch,
Tom Cruise. Were you raised in a barn? That's not where

Speaker 1 (44:37):
Your feet belong, sir. Were you raised in
a land where all the floors are couches?

Speaker 2 (44:43):
That sounds like a lovely place. Actually, you'd really twist
an ankle a lot. You'd be stepping between the
cushions and then you'd go down. At least
when you went down, it's got, like,

Speaker 1 (44:53):
An Ottoman vibe, right atom, Yeah, Oh that is it.
Oh we got there, all right? Well, that that's our show.
Oh clearly, Wait all.

Speaker 3 (45:03):
All right, but wait, there's one last thing here
that we haven't even talked about yet, and this
blows my mind. Just by looking at your eyes,
something like Sideways... Ben, that's who I'm speaking to.

Speaker 2 (45:14):
Ben? And he's covering his eyes with his very dark sunglasses.

Speaker 3 (45:16):
Yes, so just by looking at your pupils, these companies
will be able to tell if you're having an emotional
reaction to anything on that shelf.

Speaker 1 (45:24):
Yeah. Kind of reminds you of the Blade Runner test, right,
remember that, Like there's a turtle, it's on its back,
it can't get up.

Speaker 2 (45:31):
I have a question, can you guys tell by looking
at my eyes right now that I'm tripping balls on acid?
Right now? Wait?

Speaker 1 (45:37):
I cannot. Good. Hold on, I'm seeing a little melting.

Speaker 2 (45:42):
No, that's just how I look. Oh, is
that my acid? Yeah? Okay, sorry.

Speaker 1 (45:48):
I have a new pitch. If you're encountering too much Tom Cruise
in your life, we should have something... let's leverage all
this crazy invasive technology for good. If you
are seeing too much of Tom
Cruise in something, I propose we have the option of
Tom Cruise Control, which will

Speaker 2 (46:07):
Just, like, lower his... All right, at the very least,
be a plug-in for, like, Google Chrome or something,
you know, just to strip Tom Cruise from the Internet.
I support that. Actually, that's not nice.

Speaker 1 (46:19):
You know, he's probably the nicest guy.

Speaker 2 (46:21):
And he does his own stunts, and he's like three
feet tall, but he's a little impish one, you guys, he's
completely harmless.

Speaker 1 (46:27):
He's bantam and he has a lot of energy. But—

Speaker 3 (46:30):
He's OT, like, what, seven at this point? He's basically
a god.

Speaker 2 (46:35):
Yes, he can.

Speaker 3 (46:36):
Materialize, dematerialize, whatever he needs to do.

Speaker 1 (46:39):
His thetans got kicked out, you know, years ago.

Speaker 2 (46:42):
Careful, he's probably in the room, where he can
stop your heart with one smoldering glance.

Speaker 1 (46:46):
Yes, I don't know why I was a million percent
with you there, but I felt that passionately. No, I
was like, yeah, I don't mess around with him anyway.

Speaker 2 (46:56):
But think about that.

Speaker 3 (46:56):
It can measure your emotional reaction
to anything.

Speaker 1 (47:00):
Right. So, we're joking about this Tom Cruise
Control thing, but it could do something like that,
wherein it says, okay, we're going to remove the following
things from a normal ad because this will make so-and-so
more likely to buy Kashi. Right, and this

(47:20):
is the really dangerous part.

Speaker 2 (47:22):
Yeah, here it comes.

Speaker 1 (47:22):
Yeah, not just the emotional reaction, but multiple studies confirm
that someone's sexual orientation can be very easily discerned from
observing their eye movements when exposed to certain images,
or at least their overall attraction. Right, right, exactly.
And this seems to be proven across virtually every

(47:49):
kind of orientation one might imagine. It seems that the
rules for our eyes apply across the board. Yes, this
is dangerous because in some countries it is a
crime to have certain sexual orientations, right, and

Speaker 3 (48:04):
In the ideologies of certain groups.
And you think about power and how it can change
hands and all these things. It's a pretty terrifying technology.

Speaker 1 (48:14):
Right, especially when we consider that, not too long ago, in
various countries across the world, for instance, being attracted to
people of the same sex was an automatic prison sentence,
if not effectively a death sentence.

Speaker 3 (48:30):
Yeah, right. And that still remains true in places across the
world today.

Speaker 1 (48:34):
So imagine that you're flying into this country, you know,
going on vacation, name a country, I don't want to
pick on any specific one. And then you are stopped
at customs, which is normal. They look at your passport.
They may have a connection with Five Eyes or Interpol
to make sure you're not an international fugitive
or whatever, and then they deny you entry because of

(48:59):
some kind of test or facial scan or some data
they had from you literally just walking in, yeah, walking in,
or looking at people on Instagram or something, or on
Facebook or whatever, what have you.

Speaker 3 (49:12):
Or maybe on the displays on the plane that you
rode in.

Speaker 1 (49:16):
Ooh, insidious. I like it. Yeah. Also, maybe just
data collected from the TV you're watching that is also
watching you. Yeah, which is the best place to put
these things, TVs and phones. So this means that someone
without their informed consent could suffer legal consequences for something

(49:37):
that is not a crime in their home country, right,
or something that really isn't other people's business, if we're being honest.
So this leads us to our conclusion for today, which
is really opening the door to more questions. At what
price convenience? You know what I mean? Yeah? What are

(50:00):
the advantages? Are you for this? Are you against it?
Or as we say in the South, are you for
it or agin it? If so, why? And if you
are one of those people who accepts the argument that
if you're not doing anything wrong, you don't have anything
to hide, if you're one of those people on that side,
we would love to hear from you. We'd love to

(50:21):
hear your reasoning why. Maybe this is a necessary evil
to some of us. Maybe the benefits ultimately outweigh the disadvantages.
Either way, no matter how you feel about it,
whether you're one hundred percent on board or one hundred percent opposed,
it's happening. It doesn't matter. It's kind of like not

(50:42):
liking the weather. The weather doesn't care what you think
about it, and the companies pursuing this eye
tracking technology don't really care what you think about it.

Speaker 2 (50:53):
Either.

Speaker 1 (50:54):
The money got too good and the potential is there.
It's Pandora's jar. The lid has been unscrewed, and that
lid doesn't go back on.

Speaker 3 (51:03):
Well, you're correct, Ben. Well, here's what you
may be thinking, because this is certainly a thought I had.
I'm sitting here with a cell phone that has the
front-facing camera covered by just a terrible-looking
piece of duct tape. But that's just how I roll,
because, you know, like you listening out there, I'm
aware of some of these things. You may think, Well,

(51:25):
what I'll do is I'll just cover all the cameras
that I'm aware of that I can, and that way
nobody will be able to tell what my eyes are doing,
at least in my own home, right wherever I reside.

Speaker 1 (51:39):
Because the only other solution is to wear sunglasses all
the time. And people... you know, people think something's up,
but it does.

Speaker 2 (51:49):
It does look cool on you, Ben, I wish.

Speaker 1 (51:51):
Okay, we all look really cool in sunglasses. I like...
Noel's got some sunglasses with white frames.

Speaker 2 (51:56):
Those are... I keep them on my little neck
piece here. Nice. When I say neck piece, I mean
my diamonds, my diamond-encrusted icy "I see it" piece.

Speaker 1 (52:05):
I like it because it's kind of like a tie analog.

Speaker 2 (52:08):
But the way it catches the light too, you know,
and it really catches the eye and the light well.

Speaker 3 (52:13):
And in this instance, I can see myself in your chest,
which is an odd feeling.

Speaker 2 (52:17):
Yeah, it's important. It's sort of a power move. Really. Yeah.

Speaker 1 (52:20):
And Matt, you look like a cop with your glasses.
Oh really? You have, like, these aviators, right?

Speaker 3 (52:25):
No, they're just the old school sunglasses.

Speaker 1 (52:30):
Who is that guy then? I'm not sure. Okay, I
think I'm being followed by a police officer, it looks like. Yes,
it's working. Finally, the dream. But yeah, this
stuff is happening, and it is tempting to think that
there are more low tech solutions, like hey, guys, stop
freaking out. Just put some tape over your camera, no problem,
which obviously we do, but that is not going to

(52:54):
work in the future.

Speaker 2 (52:55):
No right.

Speaker 1 (52:56):
Apple, as well as other technology companies, also had a
patent application for display screens that include thousands of
tiny image sensors built into the screen itself.

Speaker 3 (53:08):
Yeah, so if you are looking at the screen, it
would be looking at you.

Speaker 2 (53:12):
I was about to say, you look into the abyss,
the abyss looks back at you, my friends.

Speaker 3 (53:15):
Here's the crazy part: this filing, this patent
that Apple filed.

Speaker 2 (53:22):
Do you know what year it was from?

Speaker 4 (53:23):
Noel?

Speaker 2 (53:23):
Do you want to take a guess?

Speaker 3 (53:24):
Nineteen ninety three? No. It was two thousand
and four. But seriously, though, think about when the iPhone
came out. Think about, like, how often we've been using
phones since then and the technology behind it. Apple filed
a patent for that in two thousand and four.

Speaker 1 (53:43):
Well, okay, I never thought I would say this on
this show. But in Apple's defense, a lot of times
technology companies just file a patent. Yeah, as kind of
a way of calling shotgun.

Speaker 2 (53:56):
No totally, I get that, I totally.

Speaker 1 (53:58):
Or they file patents to suppress technology, or the government
does that, all of which has actually happened. God, we just
keep saying "the government," and by that I mean specifically the US.

Speaker 2 (54:08):
Yeah, heavy g.

Speaker 1 (54:11):
It is true that there is a
law on the books wherein if you file for a
patent that is considered to be a threat to national security,
you will lose your patent, it will become property of
the US government, it will be removed. You automatically get
a gag order preventing you from ever speaking about what happened.

Speaker 3 (54:30):
Yeah, it can also happen if the government already has
a secret patent for one of those things.

Speaker 1 (54:34):
Oh yeah, secret patents. That's so that's cheating. I get it.

Speaker 2 (54:38):
Yeah, I would do.

Speaker 1 (54:38):
The same thing in their position. But anyhow, here
we are. This stuff is not being suppressed. This is proliferating,
spreading like wildfire to a cell phone near you. So
the question is, what next? What to do? Do
we fight against this, this new way of aggregating data?

(55:00):
I mean, have you ever been reading something online and thought, Wow,
it's crazy that I can have my own thought about
this and it's still private? The last sacrosanct thing
in the human experience now is just the mind.

Speaker 2 (55:16):
That's it, you know.

Speaker 3 (55:18):
I think what's going to happen, and it's already
starting to occur a little bit, is that companies
will begin to thrive whose specific goal
in creating products is to subvert this kind of tracking
and this kind of surveillance. And again, there

(55:40):
are some companies out there that are trying to create
wearable tech, things that you can put on your body,
on your face, face paint, different things like that, that
will not allow facial tracking.

Speaker 1 (55:53):
To occur. Like the camouflage that's used on some warships? Exactly.

Speaker 3 (55:57):
And I think those kinds of companies are going to
proliferate and flourish, and you're gonna have specific
types of glasses that you wear that aren't
actually, you know, there to protect you from UV rays.
It's just to prevent whatever light-sensing camera from seeing
your eyes. I really think that stuff's gonna proliferate.

Speaker 1 (56:18):
Maybe disruptive contact lenses. Oh yeah, that'd be a little easier. Oh,
I don't know. What do you think?

Speaker 2 (56:24):
I mean, is it gonna make my Snapchat filters better?

Speaker 1 (56:27):
Probably? Eh?

Speaker 2 (56:28):
Well then I'm all for it. All right. And that's

Speaker 1 (56:33):
Our classic episode for this evening. We can't wait to
hear your thoughts. We try to be easy to find online.

Speaker 2 (56:38):
Find us at the handle Conspiracy Stuff, where we exist
on Facebook, X, and YouTube, and on Instagram and TikTok we're
Conspiracy Stuff Show.

Speaker 3 (56:45):
Call our number. It's one eight three three STD WYTK.

Speaker 1 (56:50):
Leave a voicemail, and if you have more to say,
we can't wait to hear from you at our good
old fashioned email address where we are conspiracy at iHeartRadio
dot com.

Speaker 3 (57:18):
Stuff they Don't Want You to Know is a production
of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.
