Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. A
production of iHeartRadio's How Stuff Works. Welcome back
(00:24):
to the show. My name is Matt. My name, they
call me Ben. We are joined as always with our
super producer Paul, Mission Control, Decant, in spirit, because
today we have our returning super producer Seth Johnson. Everybody
give him a hand. Most importantly, you are you. You
are here, and that makes this the stuff they don't want
(00:45):
you to know. I was thinking about this. How many people,
on average, do you think, are listening to today's show on
a cell phone or maybe on a laptop with another
electronic device on in the background. I would say many,
if not most. I see. I'm thinking a lot of
people probably have their phone in their pocket or on
one of those cool little arm bands. Yeah yeah, Or
(01:10):
people who want to look like they have run or
will run somewhere. Yeah, there's... So I should
chase those people. Is that what you're saying? Yeah, give them,
give them some steaks, you know what I mean. I
think that's what's missing for joggers. I feel like, you know,
I feel like jogging is just practice for you know,
running away from something. You know. I feel like these
people are very paranoid. This might be
(01:30):
a hot take, but one of my very old and
dear friends, years back, you guys know him, he told
me that one of the first signs of
gentrification in the neighborhood was people running when no one's
chasing them. Brilliant. He's a weird guy, but I thought
it was a good point. So. But the reason
we're even mentioning this like multiple devices going on in
(01:51):
the same audio spaces, because that's one of the main
things we're gonna be talking about today: our devices, and
the potential monitoring they do themselves. Right. Are you addicted to your phone,
your television, or your tablet? Are you one of the people,
like many of us, who can't really hold a conversation
without at least checking in on this thing. I'm guilty
(02:14):
of that; he's doing it right now. Yeah, well, I'm kind of
waving it around. Do you find yourself tuning out, you know,
when the conversation is taking its course, and then
finding yourself tuning into the closest device, whatever that is.
Like right exactly. So just for the lay of the land, here,
we have three laptops out open right now, we all
(02:34):
have our phone somewhere near our person, and we have
you know, a bevy of AV equipment that is
probably the most innocent of the contraptions here right in
this room. And if you are one of those people
who constantly finds yourself clicking in or dropping out of
conversation tuning into something else, you are not alone. Let's
(02:57):
reiterate that, because at first it sounds kind of nice, right,
it sounds kind of like ah, warm, fuzzy, huggy time. No, no, no,
think about it. You are not alone in multiple senses
of the phrase. Here are the facts. Smartphones, smart devices,
smart everything, really. It's everywhere, and, uh, you're probably
(03:19):
working with one right now just because you need the
technology to listen to this today. Let's look at
the Pew Research organization there. You know, they
do statistics like nobody else. It is estimated, they say,
that more than five billion people have mobile devices, and
over half of these connections are smart phones. So they're
doing more than just making a connection via satellite to
(03:42):
another phone somewhere. You know, it's not
a landline anymore. It's a phone, it's an encyclopedia,
it's absolutely everything you could possibly want. It's like
Ziggy from Quantum Leap. Yes, we literally have that in
our hands now. It doesn't quite predict the future, but it
comes damn close. Yeah. And that's a global
number, a throughout-the-world number. So more than half of
(04:04):
the people alive today have one of these things. It's
I'm holding up the phone again like a prop. It's
one of technology's biggest breakthrough success stories of recent decades
because just a few decades ago, no one had them. Exactly.
Now five billion people have these. So get this. In
the US specifically, we've got nine in ten or
(04:25):
more Americans aged thirty-four and under who have had
a smartphone since twenty fifteen, um, while the ownership rate
among the fifty-and-older age group has risen sixty
seven percent over the same period. One of my dearest
oldest friends, who is in his um early fifties, very
purposefully still has a dumb phone. Oh, Harry, Harry, Yeah,
(04:47):
very much on purpose. He just like he is old
school in that way where he rejects a lot of
this like over connectedness, and consequently he's a much more
thoughtful person than a lot of people that I know. Well,
after this episode, most of us are gonna want to
switch back. I know. I know Verizon, where I am,
offers a flip phone option, so no kids ever break two. Well,
(05:12):
there you go. It seems more and more like a
good option. We'll get into it though, because it doesn't
just stop with smart phones. We also are talking about
smart TVs. And yes, Matt, you're absolutely right. This may
well ruin some people's day, but this is important
to know. So we checked out Statista when we wanted
(05:33):
to find some more stats and specs on smart TVs.
The global TV set unit sales are projected to increase
from two nine million units in six to two nine
million by twenty. That's pretty nuts, because, you know, televisions
shouldn't be this sort of disposable resource. They never were, right,
(05:57):
but now more and more people are buying TVs, more
frequently. I mean, I have a smart TV that
is made by Amazon. I'm sure it's made by some
third-party manufacturer from overseas. Do you know why, though,
you guys? Because it was dirt cheap, and it
works great, and I like the interconnectedness of it. But
we'll get into some of the features that this uh,
(06:19):
well, you call it a feature; it's really more like
a bug. It literally is a feature, just not for you. Feature
or bug is in the eye of the beholder. That's actually right,
and in this case, the eye of the beholder is
big data. Yeah. A huge share of the televisions being
sold across the planet are smart TVs, and a smart
(06:42):
TV at the most basic explanatory level, is a television
that combines a lot of features one would associate with
a computer. Right. So if you, like, now own a
smart TV, you can watch your favorite shows, but you
don't have to just watch them when they're on. You
can also, you know, dial them up on demand. And,
for instance, you can connect it with your phone. Not
(07:03):
to mention, it's very customizable. You can combine
all of these different services into one kind of widget box,
let's call it. Where you have your Netflix, you got
your Hulu, you got your Amazon, which, obviously, my Amazon
TV leans pretty heavily on. The global search on my
Amazon TV searches, like, all of Amazon, and it gives
you products, it gives you TV shows, it gives you
(07:26):
other stuff that's you know, in your set of apps
or subscriptions, but very much leaning towards the Amazon side
of things. Yeah, the most important thing about a smart TV,
when we call it that, is that it's able to
communicate with your network or the network that it's attached to,
and could possibly see all of the other devices that
are attached to that network. Could possibly? Sure. This is
(07:50):
This is true, though: Android is probably the most
widely used operating system among smart TVs, but that by
no means should be taken to indicate that other OSes
aren't in there. iPhones are in there as well. Apple
has a hand in this. And while smart device addiction
is real, especially when we talk about mobile devices, it's
(08:11):
I think we should bracket that as the subject
of its own episode in the future, assuming we don't
get black-bagged or disappeared. There's more to the story
behind the purposefully addictive technology here. You see, while we
stare into that electronic abyss, even though we might not
know it, sometimes things in that abyss, in this sort
(08:31):
of black mirror, are staring back at you. And we'll
get into that right after a quick word from our sponsor.
Here's where it gets crazy. So we talked about smart devices.
They're popular, everybody loves them, the hottest things since uh
(08:51):
fresh-baked sliced bread, whatever. Yeah, they do all the
stuff we need, they do all the stuff we want,
right? But with smart devices comes the concept of surveillance.
And we've talked about this a little bit before in
our previous episodes on big data, Big Data, whichever your
preference may be. Oh yeah, and if you're worried about, you know,
your smart devices tracking you or anything, and you still
(09:13):
have one of these Amazon echoes or maybe a Google
personal assistant like a home or something plugged in and
turned on, you can stop worrying. They've already got you.
It's over. It's just too late. Yeah, we're being a
little bit funny there. That's not fully true, you
see. But there is some substance to
this idea, and we're gonna get into it a little
(09:34):
bit later. Remember back when PDAs were
a thing, personal digital assistants. And now we have robot
overlords that are, like, doing our bidding. But are they,
really? Are we not really just doing their bidding? Oh god. Right,
it's like the old oh Man there's so many weird
ways to go with this, but we should talk about
the nuts and bolts too, right, because we know our
(09:55):
smart devices have to keep track of the user. You
have GPS, you have the Waze app or something when you're driving,
you have Lyft, you have Uber, what have you. You know,
all things that make the stuff function and make it
convenient for you. And then there are also a lot
of apps that say, hey, we want permission to access
your microphone or your location, and you're like, wow, Candy Crush,
(10:16):
this is getting serious. You know I'm picking that as
an example; that's not one. Okay, great. Well, at least
we have Candy Crush to remain a sacrosanct example
of good programming. But when our smart devices are keeping
track of us, the kind of surveillance that they have is,
as we can tell, uh, squarely aimed at tracking our preferences.
(10:38):
Let me figure out what you like, says your mobile device,
such that I can give you better offers, make it
easier for you to say yes to things in the future.
And that's why, if you're in our Facebook group,
Here's Where It Gets Crazy, you'll see all
the strange, insidious examples, ranging from hilarious to disquieting, about
(11:01):
just how these algorithms can home in. I think, Noel,
you posted a meme recently that was Facebook related. Yeah,
I did. It was one of these great Simpsons memes
where it was just an image of like Bart Simpson
in bed and Homer leaning in really creepily like eyeball
to eyeball, and it was I think Homer was labeled
(11:21):
as Facebook ads and Bart was labeled as things I
said out loud but never actually Google searched or whatever,
or some people even say I was just thinking about this.
It definitely started a conversation of people giving examples and
these things. And I've experienced it too. We have a
lot of advertisers that like vet stuff through us, and
sometimes I feel like I just say it or like
(11:43):
I'm talking to you guys about it, and I've never
even like read copy or seen an email or gone
to the site. Next thing you know, Facebook's serving me
up, you know, Tushy or whatever. Bad example. But,
like, you know, you know what I'm talking about; you seem
to have them. Sure, there's a reason for that. So,
you know, we get into why: what are the motives
behind all of this surveillance? We kind of talked a
little bit about tracking our preferences and everything, but honestly, like,
(12:07):
what are you gonna do with all of that? If
you start to really think about and understand
what the economic model is behind all this stuff, you
realize that it's because we, you, me, each of us,
we are the batteries of the economic system, the products. It's
literally the matrix. We are living in the matrix. Everybody
we are inside our pods. Our pods consist of your
(12:28):
smartphone and your smart TV and all the things you
interact with, your laptop. That is us. We are
the byproducts of a lifestyle obsession. A Fight Club quote, Noel?
And here's the thing too, we mentioned, I
mentioned this off air, um, I got that Amazon TV
because it had a lot of features, It had really
high resolution, and it was dirt cheap. And TV prices are
way down and, as we saw at the beginning of
(12:49):
the show, TV sales are way up. And I think that
you can't ignore that there's an exchange going on there
with like we're giving up this part of ourselves in
exchange for cheaper, better, more efficient technology. Well,
at a certain point one must ask where
the income for the company is actually coming from. Right,
(13:11):
We've mentioned before one of my favorite examples, and I
won't go into it now because long time listeners have
already heard this for a time. Target the corporation, the
retail store, was not making most of its money off
of selling, you know, baby toys and trousers to people.
People still say trousers, knickerbockers, whatever. They were making the
(13:32):
bulk of their income, a huge proportion of it from
selling their security system infrastructure to other companies, kind of
like the way McDonald's makes most of its money through
real estate. So it's okay to sell a television at
a phenomenal loss, right when you know you're going to
recoup that money and then some on something else. And
(13:54):
I think that's what's happening with the televisions. Would you agree?
I mean, it sure seems like that to me. So
you know, we know that no matter who you are,
no matter where you are in this wide world, or
just orbiting around it, you have something that really
wants to be your friend, wants to be your best friend,
your teacher, your mother, your secret lover, to quote
(14:17):
Homer Simpson, and this thing that can't wait to be
your best friend is called the advertising industry. It's had
its eye on you for a while. It already knows
a lot about you right now. Oh yeah, it knows
a great deal about you. But it wants to go deeper.
It wants to go therapist-style on you. It
wants to know: what do you love, what do you hate,
who do you trust? And also, how much liquid cash
(14:39):
can you get your hands on in the short term. Okay, again,
we're joking a little bit, but you get the point. Uh,
for real, They want to know how much you can
spend and what you would want to spend it on
if you absolutely could right now, If somebody just popped
something in front of your face right now, what is
the number one thing you would buy? Because we'll find
it and we will show it to you and you
will buy it. Yeah, and companies just gobble up all
(15:00):
of this data because, um, the level of technology
that we're at right now isn't quite as sophisticated as
they would like. We're getting there, they're certainly pushing it
every day, but right now it's just kind of like
a throwing everything at the wall and seeing what sticks approach,
you know, artificial intelligence, It turns out in this regard
at least is not that intelligent and it needs a
(15:22):
ton of help. So it's efficient as all hell, but
it is not. It cannot make the connections a
lot of times unless it is helped out by a
human user. It's still a black box, though. Great example of how it works:
I often get served ads for things that I've already bought.
That's... yeah, I was gonna say, too,
we're at the stage where, in the Terminator franchise, the
(15:44):
original cyborgs were easily discernible from organic humans, right, because
it's a bit ridiculous. There's not any human advertiser who
would say, well, this person just bought a toilet, so
you know what, they need five more toilets. You know.
That's like, what's gonna happen? You
just bought a toilet, so you'll see an
(16:06):
ad for it, and then you'll go, oh, I don't know,
maybe I'll just treat myself? It's bizarre. Or maybe
you're a contractor. Maybe it thinks we're all contractors and
we're all building out. Fair points. Yeah, and we know
that we know that companies have a lot they want
to do with this data, even if they can't entirely
(16:26):
get the rubber to meet the road in practice. And
we'll expand that picture in a frightening way a
little bit later, but for now, let's think of it
this way. Companies grab all the data they can get
their hands on, even the stuff you would think is unimportant.
They give all of that equal weighting when it comes
to picking it up. There's nothing that gets ignored if
(16:49):
it's able to be monitored and captured. And that's because
AI programs, as we said, just aren't that intelligent yet.
They are efficient, Matt, but they're not that intelligent. Like,
think about home surveillance systems. You have, you know,
uh, Noel, you and I probably have Amazon Echoes. Yeah,
and you have a Google Home or something like that. Okay.
So these home surveillance systems, which is the correct term
(17:13):
for them, these home surveillance systems have these, uh, assistants,
these programs that will guess at what they think you said,
but they still frequently miss the mark, you know what
I mean. But they also aren't listening, I think,
in theory, until, unless, you say that wake word, right,
whatever that might be. Well, they are always listening. Oh, hey,
(17:36):
because they have to hear the wake word. I
love doing this. Let's prank someone who's listening to one
of those in their house, right? Alexa, play... Alexa, play...
Don't! Please don't! ...the remix or the original. We're kidding,
we're kidding. Hope we're all still friends with our
our various devices. But we say this to point
(17:57):
out that these things are far from perfect, and there
are a lot of people employed by these companies human listeners, right,
just like many of us listening today, who are tasked
with going through these things: listening to the
recording of what someone said, then seeing what the assistant
thought they said, and then reconciling the two to build
(18:19):
a better mousetrap for your personal information. Yes, and
that is where we get into, uh, the Amazon Echo
story out of Bloomberg that we were going to talk about,
and that is the fact that there are thousands of
Amazon employees and contractors who, like Ben said, are tasked
with, listen, literally listening to what the microphone recorded in
(18:43):
your living room. So, you were asking, Noel:
is it always listening? Yes. The microphone is always
turned on. As long as you've got your Alexa plugged
in or your Google Home plugged in, that mic is
on and it is listening. It doesn't record anything until
you say Alexa or computer or echo or whatever the
key is, right, the wake word. Which is a
(19:04):
creepy phrase in and of itself if you ask me.
But that is literally an open mic sitting in your
house, and, uh, that is where it gets really creepy.
But it also gets really creepy. Like, Okay, on the surface,
it makes complete sense. It's what Ben was saying. It's
quality assurance, right, it's trying to make that AI better.
It's an educational thing for the system. But below the surface,
(19:26):
like if you really break it down and you take
away some of the words that are in there that
make it feel like a fun and exciting new thing,
there is this mic in your room. It's recording information,
things that you're saying in your private home, and it's
sending it to some person that you've never met, and
then this stranger is going to transcribe exactly what you
(19:46):
said in your living room. Then it's going to feed
it back into that system so that, when that mic
hears you talk again, it knows exactly what you said.
But again, it's only after you've said the wake word.
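The always-on-but-rarely-recording behavior described here can be sketched as a ring buffer: audio is continuously overwritten and only persisted once a wake word is heard. Below is a toy model in Python, not any real device's pipeline; frames are stand-in words, and the wake-word set, pre-roll size, and capture length are all invented for illustration.

```python
from collections import deque

# Invented parameters for illustration; real devices operate on audio
# frames with a trained wake-word model, not on words.
WAKE_WORDS = {"alexa", "echo", "computer"}
PRE_ROLL = 2      # frames kept from just before the wake word
REQUEST_LEN = 5   # frames captured after the wake word

def gate_stream(frames):
    """Continuously buffer a stream (modeled as words) but persist only
    the spans that follow a detected wake word."""
    ring = deque(maxlen=PRE_ROLL)   # always-on buffer, constantly overwritten
    captured, current, remaining = [], [], 0
    for frame in frames:
        if remaining:
            current.append(frame)
            remaining -= 1
            if remaining == 0:
                captured.append(current)
                current = []
        elif frame.lower() in WAKE_WORDS:
            # Wake word heard: keep a little pre-roll plus the next frames.
            current = list(ring)
            remaining = REQUEST_LEN
        else:
            ring.append(frame)      # discarded unless a wake word follows
    if current:
        captured.append(current)    # stream ended mid-request
    return captured
```

Feeding it `"private chat alexa play the remix please more private chat".split()` persists only the span around the wake word; everything said before and after is silently overwritten in the ring buffer, which is the distinction the hosts are drawing between "always listening" and "always recording."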
It's not, they're not transcribing your conversations in
your living room. It's all in theory stuff that you
were attempting to communicate to the device. Otherwise it wouldn't
(20:08):
be any use to them, It wouldn't help them improve
the algorithm at all. But my point here is that
that is how it functions according to the way the
creators wanted it to function, right, the people who made this device, and the
FAQ. Yes, that is the forward-facing thing.
And I'm not saying Amazon is doing anything illegal or,
you know, scary like that, but there is
(20:29):
an easy route there to exploit that microphone that's in
your living room. That's all I'm saying. So he's saying,
maybe if someone, a bad actor, let's say, got ahold
of this... Or do you think Amazon could potentially be
the bad actor? Amazon's partners, let's say. Let's foreshadow
it that way: Amazon's buddies, the folks in bed
(20:49):
with it. But what's interesting to me about this
is that you are talking in terms of an above-the-surface
level. Yes, my spider sense tells me you've
got a below-the-surface take, Noel. Well, the surface take
is just, in my mind, the reality of the situation.
We've talked about it before on here. We kind
(21:10):
of hit it a couple of times in this episode already,
just that we are literally bugging ourselves. And, you know,
when you think about a world in
which perhaps the powers that be end up ruling, let's
just say, this United States that we live in. Let's
say that some group comes along and takes over, and
now it is illegal in this land to do X.
(21:32):
And let's say your family or your living situation is X.
Now there is a microphone in your living room or
your kitchen or wherever it is, and if you're just
having a regular conversation about what your life is and
what you are doing, but it is illegal in this
land, and there's a mic in there, there's potential that
you could be abused in some way or persecuted.
(21:55):
That's all. Reminds me of the telescreens in 1984,
which were, like, on one level, seen as
like a luxury and it's like a really cool technological
gadget where you could watch whatever entertainment
you so wished. But it was a two-way thing.
It was monitoring you. But there's a certain acceptance of it,
you know, like it's not secret monitoring. Everyone knows they're
(22:16):
being monitored. They just know to stay in line and
not fall outside of the party, you know, doctrine or whatever.
Then we've kind of found ourselves in a very similar
situation, where we're, like, complicit in our own surveillance. Orwell
was nothing if not prescient in that regard. Here's
a real-life example, or something that could
play out plausibly. And this is heavy stuff. So imagine
(22:39):
that you live in a country that is a least
economically developed country, and that country has
an authoritarian government and they have strict religious laws of
one sort or another. Let's say that for a time
there was a different regime and you were maybe in
(22:59):
a same-sex relationship, and you and your partner lived
your normal everyday life. Right, you just happen to
have your device on because you like to hear music
when you cook. Who doesn't like that? But then the
regime changes and now, uh, same-sex
relationships are forbidden, or haram, or whatever. And now that's
(23:20):
stuff that you said that got hoovered up into the cloud.
Now it makes you complicit in what that government sees
as a crime. And that means that, according to that government, uh,
the stuff that you did, which was perfectly fine, your
relationship was perfectly fine until someone retroactively decided it wasn't.
And now because you wanted to hear uh the remix
(23:44):
to "Ignition," just because of the, you know, conversation
that occurred around that time, you are now in hot
water, and there's no recourse to help you. That's
a terrifying possibility. Matt. You were telling us off air
that at least some companies, like Amazon, attempt to quell
(24:04):
those fears by publicly stating there are hard constraints on
how long an Echo can record something. Oh yeah, absolutely.
And again, in Amazon's defense, I am being
completely conspiratorial in this, um, and it's just one
of those weird foresight things that both Ben and I
were talking about there. Um. But Amazon has stated that
(24:24):
only a fraction of one percent of interactions with their
devices actually gets transcribed in this way by a human,
where it gets sent off, they transcribe
it, and they send it back in. Since the beginning of twenty nineteen, and
this is, by the way, from August of twenty nineteen,
from The Ambient, so eight months of transcribing had
(24:45):
gone through and only point two percent of all requests
to Alexa had actually been transcribed. So that's a
very, very small number of conversations that actually get listened
to and transcribed. Interesting how they put it in percentages, though,
because putting it in percentages can make something seem
smaller than it is in reality. Right. It was a
massive sample size, right, and so that small percentage um
(25:10):
is actually a massive number, right? Right. It just sounds
a little more reasonable. Right. Well, here's the great thing.
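The percentage-versus-absolute point is easy to make concrete with arithmetic. Amazon has not published its total request volume, so the volume below is purely hypothetical, invented only to show how large a "small" percentage of a huge sample still is:

```python
# Hypothetical volume, invented for illustration; Amazon publishes no
# official figure for total Alexa requests.
monthly_requests = 1_000_000_000

# "Point two percent" means 2 requests out of every 1,000.
# Integer arithmetic avoids floating-point rounding surprises.
reviewed_per_month = monthly_requests * 2 // 1000

print(reviewed_per_month)  # 0.2% of a billion is still 2,000,000 clips
```

So a fraction that sounds negligible in percentage terms can still describe millions of human-reviewed recordings, which is exactly the framing the hosts are questioning.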
When you say, "Hey, whatever," spread it out enough, but
when you do that, generally you ask a very short
request or a very short question or something to that effect,
and then it does the thing and then like that
(25:31):
window of monitoring is over, kind of, right? Generally, yeah,
it lasted for about two seconds, that's the average. So
when somebody is transcribing, that's literally all it is.
Can they make good money doing that? Or is it,
like... They don't. It's Amazon; they're not getting paid very
well at all. Is it sort of like, we as
a company, we, um, do a lot of transcribing of interviews
and stuff? You think it's very similar to that? You
(25:52):
think they even outsource it? Some of it's contractors,
some of it's employees. And this practice
currently has no real legal constraints because, as we know,
technology always outpaces legislation tale as old as time. However,
I get the feeling that a lot of us were
(26:13):
sort of aware that something's off with these home assistants,
or that there is some kind of transaction at play.
If it's not terrible, if it's something we're okay with,
we knew there was still something. And you can hear,
you know, when things go wrong, and 911
calls, and all these other spooky stories about things going
south with Google or Amazon. But what about the other devices?
(26:36):
We have some news for you about smart TVs. Look
around your room or wherever you happen to find yourself,
is there a TV in there? Things are about to
get very interesting for you. After a word from our sponsors,
all right, we're back. Let's jump in to something we
(26:56):
learned about thanks to The New York Times, in a
two thousand eighteen article. Ah, the failing New York Times?
They are not failing, I don't believe, I mean, not
after this bump. I mentioned this article in the Gang
Stalking episode. So you're welcome, New York. Yes, you did. Um,
Then you, Ben, did bring this up, and we decided
we're going to look into it, and we did, and
(27:18):
now we can't look away, forever. So, it's a thing
called Samba TV. That sounds fun. Samba TV, it sounds
really fun. You know what else sounds fun? I'm gonna skip
down just a tad, um. How does this sound
to you guys? Hey, how about: you want to interact
with your favorite shows, get recommendations based on the content
you love, connect your devices for exclusive content and special offers.
(27:42):
How about: Samba Interactive TV lets you engage with your
TV in a whole new way. That sounds great. Sounds
damn good, I'm into that. So what is Samba TV? Oh, hey, man,
what's Samba TV? Okay, I'll tell you. Um. It is
a piece of software that is present in a
lot of television models, models from nearly a dozen
(28:04):
smart TV brands. And again, this is as of late twenty eighteen;
that has changed, there are more included now, but it
is Sony, Sharp, Philips, a lot of them.
This software, in particular, identifies what is being
watched on the monitor, the television, by literally analyzing the
(28:25):
pixels displayed and then comparing that data to a set
of known media that exists out there. It's similar to the
way audio, or even YouTube videos, get flagged
for copyright violations. Yes, but in this case it is
the end user, that is, the actual piece of
hardware, that is being monitored. Right. And this was always
coming; Nielsen ratings, like, the Nielsen institution wanted this. Yes,
(28:50):
there needed to be a way to find out
who was watching what, when. Um. And in particular, when
you're talking about who, it means everybody who is
around, not just that this household is watching something. Um.
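The pixel-matching just described, comparing what's on screen against a catalog of known media, resembles perceptual hashing, the same family of techniques used for audio and video copyright flagging. Here is a deliberately tiny Python sketch using a difference hash over toy "frames"; real automatic content recognition systems are far more robust, and every name, value, and threshold below is invented for illustration, not taken from Samba TV:

```python
def dhash(pixels):
    """Difference hash: fingerprint a tiny grayscale frame (rows of ints)
    as bits recording whether each pixel outshines its right neighbor."""
    return tuple(
        1 if left > right else 0
        for row in pixels
        for left, right in zip(row, row[1:])
    )

def hamming(a, b):
    """Count of differing bits between two equal-length fingerprints."""
    return sum(x != y for x, y in zip(a, b))

def identify(frame, catalog, max_distance=2):
    """Match a frame against a catalog of known fingerprints; None if
    nothing is close enough (e.g. a home video not in the catalog)."""
    fp = dhash(frame)
    best = min(catalog, key=lambda title: hamming(fp, catalog[title]))
    return best if hamming(fp, catalog[best]) <= max_distance else None
```

The design point is that a frame with slightly different pixel values still hashes to the same fingerprint, so the match survives compression artifacts and brightness shifts, which is what makes this kind of screen monitoring practical.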
But here's the idea. Your viewing history
is then, in part, used to suggest, as Noel was saying
(29:10):
with the flowery language there that's actually present in the
PR from Samba TV. Oh yeah, they wrote that copy.
It's not only the PR, it's the opt-in message. Yes, yes,
it's used to suggest the next content that Samba TV
believes you yourself will enjoy. Um. But that is not
all that Samba TV does. It also identifies all the
(29:31):
other devices that are connected to the same network through
which it is accessing the internet. So your friend comes over,
they have a phone with WiFi. Now they're in
the loop as well. Yes, if you are
Netflix and chilling or whatever the kids call it these
days at somebody else's house or apartment, uh, and Samba
TV is there. It knows that you're there because it
(29:52):
can identify your device and the MAC address and all
those things. And here's the thing: this company claims
that it is adhering very closely to privacy guidelines set
forth by the Federal Trade Commission, that it does not
directly sell any of this data. Instead, advertisers can pay
the company to kind of guide the hand of the
ads and their placement, which makes sense, doesn't sound
(30:14):
too insidious, right? Well, it's directing ads to the other
devices that are present, whose owners they believe are watching the
television program. Right, right, right. And so your opt-in
stuff happens at the television, right? It doesn't
happen at your smartphone, necessarily, if you walk into someone
else's house. So how do they get around the
(30:37):
legality of other people? Like, are they automatically part of
your opt-in when they join? That's what I'm saying:
there's not informed consent. Really interesting. And also, think about this:
This technology is amazing. If our species was less of
a garbage fire, we could use this to do wonderful
things for, you know, say, someone's mental health, right? And
(30:58):
someone's like, Okay, I see you've watched Faces of Death
four nine times in a row. I'd like to recommend
the Great British Baking Show. Do you know what I mean?
Do a better job with your content consumption. No,
it's totally true, um. And when, let's say, you
plug in your new, uh, smart TV, um, and it
has Samba on it, it will present you with that
(31:20):
very flowery language. That's the opt-in message. There is
a giant terms of service and privacy policy, you know,
page that you can peruse if you wish. Um, I
believe it's six thousand words for the terms of service and
four thousand words for the privacy policy. But why would
you even bother doing that when you can interact with
your favorite shows? Get recommendations based on the content that
(31:42):
you love? The terms of service is a real page-turner.
Oh yeah, but why would you want to do that
when this seems so innocuous and you just want to
start playing your Fallout four. Yeah, well, see, that's not the real insidious thing here. Let's put yourself in the position of: you've spent, let's
say the last couple of months saving up money because
(32:03):
you really, you know, you need to get this new TV.
You're really excited about it. Uh, you finally get it right,
and you're installing it your you know, your hands are
all sweaty because you know Fallout four, the next play
through is about to happen. You're like, oh my god,
this is I'm so excited. You plug this thing in,
you start going through the initialization process, the Samba thing
pops up, and, uh, you know, you literally
(32:26):
have to decide if you're going to spend an hour
parsing through all of that legalese, or if you're gonna get to whatever it is you wanted to
get to. And most people just click enable and move forward.
And it has nothing to do with the flowery language
or anything like that. It says enable, Okay, yes, this
gets me to the next thing. Just click enable, And
(32:46):
that is exactly what most people do. Right, the vast majority of people do click enable. And once this stuff is up and running, it's "Katy, bar the door," as they used to say in days of yore. Samba sees everything that is
displayed on the monitor, regardless of what you're watching or
(33:09):
playing or how you're displaying it. It doesn't matter if
you're watching TV. Uh, it doesn't matter if you're watching
a film. It doesn't matter if you're broadcasting a home video, right, yeah,
it could be. It could be literally anything. You know,
if you are broadcasting a home video of something that
you wouldn't want anyone else to see, Samba TV is
analyzing it. It isn't necessarily matching up with any known media,
(33:32):
but if you broadcast the same kind of home videos,
let's say, of your kids, maybe a romantic video
made with your partner. Um, I mean, honestly, who knows.
That's again taking a little bit further than the known
technology or the known reasons for using it. But it
could be used in the future by someone to figure
(33:53):
out very personal, intimate things about you. But it's sort
of like when we read about the NSA and the way the NSA was monitoring people's
phone calls. They weren't recording the actual audio. They were
just capturing the metadata so they knew how long a
call lasted, or, like, you know, this web of kind of interconnectedness or whatever. It's similar with this.
It's not like they're recording actually what you're streaming. They're
(34:15):
just capturing the data of what it is, how long
you watched it for, etcetera, of the pixels of the pixels.
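Samba hasn't published how its content recognition actually works, but a common building block in automatic content recognition systems is a perceptual hash of heavily downscaled frames, compared against a fingerprint database by Hamming distance. Here is a toy sketch of that general idea in Python; the function names, the flat 8x8 frame representation, and the distance threshold are all illustrative assumptions, not Samba's real pipeline:

```python
def average_hash(gray):
    """64-bit perceptual hash of an 8x8 grayscale frame, given as a flat
    list of 64 brightness values (0-255). Each bit records whether a
    pixel is brighter than the frame's mean brightness."""
    mean = sum(gray) / len(gray)
    bits = 0
    for value in gray:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming(a, b):
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def match_known_media(frame_hash, fingerprint_db, max_distance=5):
    """Return the title of the closest stored fingerprint, or None if
    nothing in the database is within max_distance bits."""
    best_title, best_dist = None, max_distance + 1
    for title, stored in fingerprint_db.items():
        distance = hamming(frame_hash, stored)
        if distance < best_dist:
            best_title, best_dist = title, distance
    return best_title
```

The point of a scheme like this is that the TV never has to ship the video itself, just a compact fingerprint per frame, which is consistent with the metadata framing above: lightweight to transmit, and still very revealing.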
So so this, okay, this, this is true. Even if
even if we want to be as skeptical or I
should say, as credulous as we can, and if we
take those pieces of stated pr copy at their word,
(34:38):
this still has a ton of hilarious, cartoonish vulnerabilities. You
can learn too much about people, and there's no way
for the end user to stop it other than try
to opt out. But opting out it doesn't delete all
the stuff that has already learned about you. Are we
being paranoid? Perhaps, or perhaps we should introduce you to Alfonso.
(35:04):
Oh God, I love that. Alfonso. Okay, so just to break this down thus far, we've got our personal assistants that are always on and listening, no matter, no matter if
we're saying the keywords or not. They have their microphone
on and they're listening. They aren't necessarily recording all the time,
but they are. Now, you have your smart TV over there
(35:24):
that is literally watching what you're watching, too, and it
is making informed decisions about what you watch and sending
ads to all the devices in your house. Now, let's
say you're on one of those devices. Let's say it's
an Android device. Let's say you went to the Google
Play whatever it is app store thing and you've downloaded
(35:45):
some apps and some games. Well a lot of these
apps and games, not all of them, but a lot
of them have partnered with this this thing called Alfonso.
So this is uh, really interesting little little piece of
software that's attached to these apps, and what it will
(36:08):
do is prompt you to enable the use of your microphone. Right,
and these would be things that do not ostensibly need
that kind of access. Pool 3D, Beer Pong: Trick Shot, Real Bowling Strike Ten Pin. You know, these kind of word-salad-y names, little fun waste-of-time apps. Well,
and not just those, some anti-spying software. There's, there's
(36:31):
a ton of apps out there right because it's it's
this Alfonso is app agnostic. So here's what happens. Here's
why they want that microphone access, because when you're using
this app, or when you grant this app microphone access,
Alfonso can figure out what you happen to watch by
(36:52):
identifying audio signals in television ads and shows and even
matching that information with the places people visit and the
movies they see. Really quickly, here is how it works.
So you're we're all hanging out, we're watching some television
show that we're into. Let's go with Lost, something with commercials.
So when the show switches to commercial, there is a
(37:15):
pitch, an audio signal, that goes out to the room.
You cannot hear it, your pets cannot hear it, your
kids cannot hear it, no one can hear it, and
no one is supposed to hear it. It's only for
your phone. And that's what they do. They communicate
with your phone, and then the phone will also let
(37:36):
people know via Alfonso. The phone will let the users of the app, the real app, the users of
Alfonso understand who is in that room, where they came from,
maybe where they're going, and what they would like to buy.
That's not what you sign up for when you walk.
(37:58):
You know, you go to a potluck at your
friend's house to watch some kind of film. Right, and
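Neither Alphonso nor its partners publish their detection code, but picking one near-ultrasonic tone out of a microphone stream is classically done with the Goertzel algorithm, which measures energy at a single target frequency far more cheaply than a full FFT. A minimal Python sketch; the 19 kHz beacon frequency, the 48 kHz sample rate, and the threshold ratio are illustrative assumptions, not the real system's parameters:

```python
import math

SAMPLE_RATE = 48_000   # assumed phone microphone sample rate (Hz)
BEACON_FREQ = 19_000   # hypothetical near-ultrasonic beacon tone (Hz)

def goertzel_power(samples, sample_rate, target_freq):
    """Estimate the signal power at target_freq via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def beacon_present(samples, threshold_ratio=100.0):
    """Flag a beacon when energy at BEACON_FREQ dwarfs an audible reference bin."""
    beacon = goertzel_power(samples, SAMPLE_RATE, BEACON_FREQ)
    reference = goertzel_power(samples, SAMPLE_RATE, 1_000)
    return beacon > threshold_ratio * max(reference, 1e-12)
```

In a real deployment the beacon would presumably encode data across several tones, identifying which ad or show is playing, but the detection step reduces to exactly this kind of per-frequency energy check running quietly in the background.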
how many apps are we talking here? Okay, So according
to the New York Times, there were over two hundred fifty apps on the Google Play Store with this feature, right, um,
And if you want, if you head over to the
Google Play Store and you type in, in quotations, "Alphonso Automated."
(38:19):
That's A-L-P-H-O-N-S-O A-U-T-O-M-A-T-E-D, and you
will find all of the various apps that have this
thing installed. But then if you if you look at
an interview with some Alfonso UM people, they said that
there are thousands of apps that they've partnered with and
they didn't want to disclose all of them because they
have competitors who are trying to basically get in on
(38:43):
their territory, yeah, poach their territory. Remember when spyware was a big concern? This is like some next-level spyware that's, like, different, right? It's like, it literally is opt-in, right? It's spying on you. It's crazy, and now
we know it's cyclical. So it's all of the devices
functioning together in this web of trying to figure out
(39:03):
what you want the most and how to display that
thing to you the most effectively. I guess Shelton effect. Yeah.
This this also, this problem becomes complicated even further when
we realize that private entity institutions are not the only
actors in this sphere. Indeed, they may be some of
(39:23):
the more innocuous. I love that you mentioned spyware, Noel,
because the best spyware right now is being built not
by private industry but by state actors. We mentioned Amazon's partners, right,
Amazon's partners using the data. Amazon's partners are alphabet soup
intelligence agencies, or strongly thought to be so, especially
(39:49):
thanks to that early cash injection. Right, exactly so. According
to a Washington Post article from 2017, the United States government has already turned theoretical exploits and vulnerabilities and this kind of stuff into functioning attack tools. One of
these goes by the objectively badass name Weeping Angel. Weeping
(40:10):
Angel is specifically meant to target Samsung TVs. This is
just a small, microcosmic example, and this is at
least what it was doing two years ago. According to
WikiLeaks, after infestation, Weeping Angel places a target TV
in a fake off mode so that the owner believes
the TV's off when it's still on. And then in
(40:32):
this fake off mode, the TV operates as a bug,
recording conversations in the room and then sending them over
the cloud to a covert CIA server. This
sounds bonkers, This sounds bananas. I can't believe it's real.
Why would anybody ever be paranoid? Why would they? And
(40:53):
I, I hope whoever is listening to this around a
smart television has unplugged their headphones and is listening on speaker.
You know, Okay, look everything, I just have to say this,
everything we've been discussing today. If you are of a
certain mind, perhaps like myself quite a lot, quite frequently, UM,
it could lead you down a dark pathway where it feels
(41:16):
as though there's surveillance everywhere and you're being targeted in
some way. We can assure you this is not just
about you, no matter no matter what you may think,
or no matter what you may believe. Uh, it is, it's mass, it's everybody. And again, it is not necessarily nefarious. Um, but it's real. That's a matter
(41:41):
of perspective. There is a certain self-importance or self-aggrandizement that occurs when people are suffering from paranoid delusions. Right,
But being paranoid about this sort of stuff does not
make you delusional. It means that you have unfortunately turned over the rock and you've seen the thing squirming
(42:03):
in the darkness beneath. This is very real stuff. I
love that, and I also am terrified by it. But
you know this, it's not all bad. I mean, there
are ways of kind of at least stemming some of
this stuff a little bit. Right. Um, so, How-To Geek actually has an easy-to-follow guide on
(42:23):
how to stop Google Home from recording you all the time.
Google Home has a thing where it actually saves your
voice memos. You can check that out. Um, you have
to opt in for constant recording, allegedly, while you can, if you're an existing user, opt out. Yeah, that's the, that's the whole thing. They've updated their terms of service, basically, Google Home has, right, and actually, uh, Amazon has
(42:45):
done something similar there where you have more choices now. Um,
but if you're I think, if you're a legacy user,
you actually can't get out of some of the agreements
you already signed into. Yeah, somebody, somebody fact-check me on that, but I recall reading that. Um,
here's the good thing. Remember Samba TV we were talking
about that felt so creepy. Literally, all you have to
(43:06):
do is say disable when you get to that screen
and you're installing your TV. That's all you have to
do and you're done. Do you really not think it has something to do with, like, why do you think people are so prone, it's such a massive amount, like, so prone to click enable? Because you've got a new toy and
you want to take full advantage because it's presented, like
I said, as this lovely way of like making this
(43:29):
a better experience for you the user. Why wouldn't I
want that? Well, and it's a menu that you have
to click through, right, So think about it this way.
If it's on enable, so your your cursor is on enable,
when the screen pops up, you'd have to go down
to terms of service, down to privacy policy, down to
learn more, down one more to disable. The clicks, as stupid as that sounds, and, you know, benign as
(43:54):
five clicks or four clicks, people will take the easier
route and just say, Okay, fine, enable, I'm in a hurry.
I can't. I got M F places to be,
you know what I mean. So, it is true, and it's, uh, it's an exploit not just of technology, but an exploit of our own hardwired physiology. Our
(44:15):
brains are built to function this way, right, and this
leads us to uh some conclusions and what is very
much an ongoing event. Right. The first conclusion is that there are some issues remaining. There's a lack of accountability. One
of the primary issues in this conversation is the utter
(44:36):
lack of accountability on the part of private institutions as
well as government agencies. It is not difficult to imagine
these companies cooperating with intelligence agencies, further exacerbating the legal
pitfalls involved. And again, just as a cheapskate, it's important to point out that
(44:57):
the people getting their data gathered are not paid for
that information. Quite the opposite. It used to be, you know, what's that old adage we always said: if you're not paying for it, you're not the customer, you're the product. Right.
But now the pendulum swings a little bit further in
the in the wrong direction, in my opinion, because we
are paying for these services, we are paying Amazon, we're
(45:21):
paying Google to spy on us to whatever end, and
we are not accounting, not only for this, but for the larger problem, which is that
insurance companies aggregate this information, your financial institutions aggregate this information,
and there is nothing that stops them from cooperating together
(45:43):
to build a footprint of you. The idea is that this, this footprint, this, uh, this digital impression of you will one day have such fidelity
that it can predict future actions you will take. So
you're saying that it could, in theory, be used against you? Yes,
very much. So we're saying they're gonna make Android versions
(46:05):
of you, Noel. I'm cool with that. Yeah,
but it's not going to be about you anymore. No,
it's gonna be an Amazon Android cloud clone of me. And
and it's, it's fascinating when you think about it, because
I know you're not joking because it's it's it's like
a shape of me that is my data, you know,
like it's out there in the matrix. It's a Noel
(46:26):
shaped data cluster or a Matt-shaped data cluster, a Seth-shaped cluster. Seth-shaped cluster sounds like a band. It
does like a tasty treat. It does. I feel like
it's maybe a, it's maybe like a Hostess thing. So, brilliant ideas for desserts and treats to the side:
(46:46):
We are all in this together. We are looking at
the end of privacy as we recognize it. And that's
sort of tricky. That sounds more dramatic than it really is,
because the concept of privacy as we know and enjoy it today is relatively recent, yes, right. And everything we've
learned indicates that type of privacy we idealize may end
up becoming a short-lived fad to future historians. We're entering,
(47:11):
You know that you and I talked about this a
long time ago, an inequality of privacy. Right, Privacy is
a new currency. Some of the world's most influential, powerful,
successful people still have this kind of privacy. Right. A
weird example is just think about how much it costs
to get a good tint on your windows. I'm not kidding.
(47:35):
I'm not kidding. If you if you see someone drive
by with perfectly tinted like the darkest windows you've ever seen,
that is expensive. M m well, and that's literally privacy
just in your car anyway. I'm sorry, I feel like something's going on with your car? Okay. No, no, I'm
just saying that that amount of privacy just to be
(47:57):
on the road driving costs money. I see. And if you
think about really good shutters on a home or something
like that, in those little examples, it takes quite a
bit of means to protect yourself just from someone viewing
with their eyeballs where you are at any time. And
(48:17):
then if you apply that to the digital space, uh,
it gets more and more expensive. It's it's creating something
very similar in nuts and bolts in the mechanics of
it to the infamous Sesame Credit that's occurring in the
Chinese mainland. And I'm not being alarmist about this, and, you know, I don't want people to be any more frightened than is absolutely appropriate. You should be
(48:41):
a little. But we're at the Pandora problem, right? Once the Pandora's jar is unscrewed, once the lid is off, there
is no going back. There were some rumblings in Congress
about investigating what is essentially a smart TV spy ring,
but the advantages of keeping the technology and play for
now seem to outweigh the problems of consent:
(49:03):
the fact that consent is not occurring, and the fact
that yes, this could partner up with your with your
information from other places, such that it might affect your
ability to get a car loan, It might affect where
you can live. This can get very dirty, very quickly.
It's a slippery slope, and especially once I mean, what
(49:24):
what if, like we can't opt out anymore? You know,
if if that goes away, Like are we really owed
that right to opt out? Like it's sort of like
almost a PR move to allow us to opt out,
Like, you could very easily say, as the manufacturer of a product, well, if you don't want us
to have your data, don't buy the products. Like it's
sort of almost a courtesy if you think about it,
to allow people to opt out of this. Yeah, but
(49:46):
what about, are you owed a fancy television? But are you owed, are you required to participate in some of these systems? Insurance, one is required to do so. Absolutely, that's different, I think.
But I guess what I'm saying is like with the
the television, all these are gadgets that, like you could
not necessarily call necessities. Well, think about think about most
(50:09):
people with a steady job, think about email communication nowadays,
or an app that's used wherever you work or you
know some there. You you have to have some kind
of connection like that, you really do, and with most
of these devices, you're going to run into these issues
or add to that the compounding, complicating factor that, for
(50:33):
a huge proportion of people who have mobile phones,
it's their only way to access uh, not just the Internet,
but it's their primary tool for any financial deals like
people's lives hinge on this thing. So I I see
both sides of that. But here's what I think we
(50:53):
can end with. We can say it's not just the
United States. Way back in 2017, when WikiLeaks released this, uh, they showed that digital spying is going
to continue to grow. It's not going to go away.
We're talking about the Weeping Angel thing, right? Right, Weeping Angel in specific, which again is just for Samsung TVs
and it's relatively low tech. You have to put a
USB uh stick in there. But now you don't. Now
(51:15):
it's now it's a whole different thing. Uh. You know,
anybody get, uh, sort of irritated when you bought a new cell phone and it had stuff pre-installed that you can't remove? Think about this times a thousand. That's
what's happening. That's what's going to happen. And it's not
just happening in the US. Other advanced nations China, Russia, Britain,
Israel and so on are creating newer, more robust, powerful
(51:40):
tools to do this, and any, any nation that can gain access to this kind of spying technology is going
to do so, and they can do it through a
web of private industry. There are no laws. This is the Wild West, and it's very bad. It's very bad
for the people who are not at the top of
the food chain. And then we all threw away
(52:01):
our devices. We got out our acoustic guitars or ukuleles, started singing kumbaya. I started writing my version of Ralph Waldo Emerson's On Nature. Right, that's correct. I got my djembe out and we just started playing. Cymbals for me. Yeah,
and we just, you know, communed with nature for the
(52:22):
rest of our lives and watch the sunset dissipate over
the horizon. Had an ashram in Quebec. Yeah, and then
we all woke up and went back to work. That's true.
And hey, listen, I'm just playing devil's advocate about, like, are you owed a smart TV? And I agree it's
a great question with the with the phone though, You're right,
the phone is for many an affordable entry point to
(52:43):
the Internet because it doubles as a you know, a
crucial communication device and a way to access the Internet,
which we can all agree is a necessity for things
like banking everything, you know, everything starts on the Internet. Well,
of course, but I would argue that for the things
like you know, an Alexa, do we do we need
(53:04):
an Alexa? Maybe maybe for accessibility, Maybe maybe that's the thing.
Maybe maybe for like people with disabilities, an Alexa could
be a very important addition to a home. For others,
I think it's more of a luxury and sort of
like a neat little gadget. You know, you're absolutely right,
and, y'all, nobody needs a smart television monitor, right?
but very soon the only available televisions will be smart televisions,
(53:28):
at least ones that are easily purchased. I think what
I was getting at when Ben talked about the Sesame Credit and the slippery slope of all this is what
if there does come a time where you can't opt
out anymore, and just by buying the thing and installing
it in your home, you're opting in. The only way
to opt out is to not buy it, or to
buy something else. There you go. And while we're
(53:51):
talking about needs and wants, there is one thing that
we definitely want, and that is to hear from you.
Do you think this is alarmist? Do you find yourself more on the side of, well, if you don't do anything wrong, what do you have to worry about?
You know, it's a big deal, right? Or are you
more on the side of where you think this is
an even more um potentially dangerous situation than what we've
(54:15):
outlined today. Let us know we would love to hear
from you. You can find us any number of ways.
We're all over the internet, not as, not as, like, widespread as the NSA. But, you know, we're
out there. You can find us. We're on Facebook at Here's Where It Gets Crazy, where you can talk to, our favorite part of the show, your fellow listeners. We're on Twitter. We're on Instagram, the whole, the whole kit and caboodle,
(54:37):
the whole Bag of Badgers, the whole nine. And that's
not all. If you want to give us a phone call,
you can leave a message. Our number is 1-833-STDWYTK. If you
do leave a message, let us know if you'd like
us to use your name on the air, if you
want to remain anonymous, if it's just for us, it's
for all the listeners. Just give us all that info upfront. Uh.
(54:58):
And if you don't want to do that, you can
send it... Oh yeah, you can opt out. You can opt out of this whole thing. Just turn off
your podcast player right now. But if you don't want
to do that and you want to send us an email,
you can. We are conspiracy at iHeartRadio dot com. Stuff
(55:30):
They Don't Want You to Know is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.