
February 14, 2025 57 mins

If you're like more than 90% of Americans, you own a cellphone of some kind -- and it's no wonder. Like tablets or smart speakers, these handy devices are incredibly convenient, providing a universe's worth of knowledge at your fingertips... at a price. What exactly do your smart devices know about you, and where is all that information going? Tune in to learn more with Ben and Matt in tonight's Classic episode.

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Okay, all right, let's set the stage for this evening's
classic episode. Fellow conspiracy realists, cast your memory back to
twenty nineteen. Nobody was really sure what would happen. Alexander
Dugin had some ideas, he had some wishes on his
wish list, and we finally

(00:25):
did an episode on something we had been talking about
for gosh, more than a decade at this point, the
idea of smart devices and surveillance. And I think this
might have been the one that radicalized us a little bit.

Speaker 2 (00:39):
Maybe a tiny bit, because this is definitely a
moment in time when, in real time, we realized just
how much mass data collection was happening within our phones,
not necessarily by the government, but by individual third-party
companies that we all say yes to when we down

(01:00):
load their applications or we use their services.

Speaker 1 (01:03):
Who reads the terms and conditions, right? Who reads down
to paragraph seventeen of the agreement, especially when you're scrolling
on a phone and you're in a hurry because you've
got to get the app, dude?

Speaker 2 (01:17):
Well, but then, when you combine that with companies
potentially giving that information to governmental entities, and you combine
that with actual governmental entities that have stated before, or
it's been leaked before (Stellar Wind), that communications on individual
phones are being picked up, and even communications through the

(01:38):
massive pipelines that go underneath the oceans.

Speaker 3 (01:41):
Mm hmm.

Speaker 1 (01:42):
Yeah, it's a PRISM of conspiracy, and it probably takes
Five Eyes to see it in full, right? This is
also a shout-out for anybody who ever thought, oh,
my phone is listening to me, the speaker's picking
up something. It's so much deeper.

Speaker 2 (02:02):
Oh my gosh, Ben, I'm gonna say this so I
say it on record. That is happening so much to
me recently where it's that echo on the telephone. It's
that thing where you're getting feedback when you're talking to
one other person when you don't have the speakerphone on,
or when you're using a Bluetooth headset that's got a

(02:22):
microphone that's close to it, when you're talking on PlayStation.

Speaker 1 (02:26):
Oh buddy, oh man. Well, Matt, I'm with you there.
If this is the year we get hauled off to
the crazy house, then I think we'll be in the
same Uber, because we're not paying for the ambulance.

Speaker 2 (02:45):
That is the joke, anyway.

Speaker 1 (02:46):
Anyway, this is our classic episode, twenty nineteen's Smart Devices
and Surveillance.

Speaker 4 (02:52):
From UFOs to psychic powers and government conspiracies, history is
riddled with unexplained events. Turn back now, or learn this
stuff they don't want you to know. A production of
iHeartRadio's How Stuff Works.

Speaker 2 (03:16):
Welcome back to the show. My name is Matt.

Speaker 3 (03:18):
My name is Noel.

Speaker 1 (03:19):
They call me Ben. We are joined as always with
our super producer Paul, Mission Control, Deckant, in spirit,
because today we have our returning super producer Seth Johnson.
Everybody give him a hand. Most importantly, you are you.
You are here, and that makes this stuff they don't
want you to know. I was thinking about this. How

(03:41):
many people, on average, do you think are listening to today's
show on a cell phone, or maybe on a laptop
with another electronic device on in the background?

Speaker 3 (03:51):
Mmm, I would say many, if not most.

Speaker 2 (03:53):
I see. I'm thinking a lot of people probably have
their phone in their pocket or on one of those
cool little armbands when they're just yeah, oh yeah, dude.

Speaker 1 (04:01):
Runners, or people who want to look like they have
run or will run somewhere in the future.

Speaker 3 (04:06):
Yeah, so I should chase those people. Is that
what you're saying? Okay.

Speaker 1 (04:10):
Yeah, yeah, give them, give them some stakes, you know
what I mean. I think that's what's missing for joggers.

Speaker 3 (04:14):
I feel like that, you know. I feel like jogging
is just practice for you know, running away from something,
you know. I feel like these people are very paranoid.

Speaker 1 (04:22):
This might be a hot take, but one of my
very old and dear friends, years back, you guys know
him, told me that one of the first signs of
gentrification in a neighborhood was people running when no one's
chasing them.

Speaker 3 (04:35):
Brilliant.

Speaker 1 (04:36):
He's a weird guy, but I thought it was a
good point.

Speaker 3 (04:38):
I think so too. So.

Speaker 2 (04:40):
But the reason that we're even mentioning this, like multiple
devices going on in the same audio space, is because
that's one of the main things we're going to be
talking about today: our devices and the potential monitoring of them.

Speaker 1 (04:54):
So, right, are you addicted to your phone, your television,
or your tablet? Are you one of the people, like
many of us, who can't really hold a conversation without
at least checking in on this thing. I'm guilty of that.

Speaker 3 (05:07):
He's literally doing it right now.

Speaker 1 (05:08):
Yeah, well, I'm kind of waving it around. Do you
find yourself tuning out, you know, when the conversation
is taking its course, and then finding yourself tuning into
the closest device, whatever that is? What... like, right, exactly. So,
just for the lay of the land here, we have
three laptops open right now. We all have our
phones somewhere near our person, and we have, you know,

(05:31):
a bevy of AV equipment that is probably the most
innocent of the contraptions here right in this room. And
if you are one of those people who constantly finds
yourself clicking in or dropping out of conversation, tuning into
something else, you are not alone. Let's reiterate that, because
at first it sounds kind of nice, right, it sounds

(05:54):
kind of like ah, warm, fuzzy huggy time. No, no, no,
think about it. You are not alone in multiple senses
of the phrase. Here are the facts.

Speaker 2 (06:06):
Smartphones, smart devices, smart everything. Really it's everywhere, and you're
probably working with one right now just because you need
the technology to listen to this today. Let's look at
the Pew Research Center. They, you know, they
do statistics like nobody else. It is estimated, they say,
that more than five billion people have mobile devices, and

(06:28):
over half of these connections are smartphones, so they're doing
more than just making a connection via satellite to another
phone somewhere. You know, it's not a landline anymore.
It's a phone. It's an encyclopedia. It's absolutely
everything you could possibly do.

Speaker 3 (06:44):
It's like Ziggy from Quantum Leap. Yes, we literally have
that in our hands now. It doesn't quite predict the future,
but it comes damn close.

Speaker 1 (06:52):
Yeah. And this is, that's a global number, yes, a
throughout-the-world number. So more than half of the people
alive today have one of these things. I'm holding
up the phone again like a prop. It's one of
technology's biggest breakthrough success stories of recent decades, because just
a few decades ago, no one had them.

Speaker 2 (07:11):
Exactly.

Speaker 1 (07:12):
Now five billion people have these, so get this.

Speaker 3 (07:14):
In the US specifically, we've got nine in ten or
more Americans aged thirty four and under who have had
a smartphone since twenty fifteen, while the ownership rate among
the fifty and older age group has risen from fifty
three percent to sixty seven percent over the same period.
One of my dearest oldest friends, who is in his
early fifties very purposefully still has a dumb phone. Oh, Harry, Harry, Yeah, yeah,

(07:40):
very much on purpose. He just like he is old
school in that way where he rejects a lot of
this, like, over-connectedness, and consequently he's a much more

Speaker 2 (07:50):
Thoughtful person than a lot of people that I know.
Well, after this episode, most of us are going
to want to switch back. I know, I know Verizon
where I am offers a flip phone.

Speaker 1 (08:00):
Option, those Nokias never break, too. Well, there you go.

Speaker 2 (08:05):
It seems more and more like a good option. We'll
get into it.

Speaker 1 (08:09):
Though, because it doesn't just stop with smartphones. We also
are talking about smart TVs, and yes, Matt, you're absolutely right.
This may well ruin some people's day, but this
is important to know. So we checked out Statista when
we wanted to find some more stats and specs on
smart TVs. The global TV set unit sales are projected

(08:34):
to increase from two hundred and twenty nine million units
in twenty sixteen to two hundred and fifty nine million
by twenty twenty. That's pretty nuts, because you know, televisions
shouldn't be this sort of disposable resource. They never were, right,
but now more and more people are buying TVs more
and more frequently.

Speaker 3 (08:53):
I mean, I have a smart TV that is made
by Amazon. I'm sure it's made by some third party
manufacturer overseas.

Speaker 2 (09:01):
It is Amazon.

Speaker 3 (09:02):
You know why, though, you guys? Because it was dirt
cheap and it works great, and I like the interconnectedness
of it. But we'll get into some of the features
of this... well, you call it a feature, it's really
more like a bug. No, literally a listening surveillance feature,
but just not for you.

Speaker 1 (09:20):
Yeah, feature or bug is in the eye of the beholder.

Speaker 3 (09:22):
That's absolutely right, And in this case, the eye of
the beholder is big data.

Speaker 1 (09:27):
Yeah. As of twenty eighteen, seventy percent of the televisions
being sold across the planet are smart TVs. And a
smart TV at the most basic explanatory level, is a
television that combines a lot of features one would associate
with a computer. Right. So, if you, like Noel, own
a smart TV, you can watch your favorite shows, but

(09:48):
you don't have to just watch them when they're on.
You can also you know, dial it up on demand
for instance. You can connect it with your phone.

Speaker 3 (09:55):
Not to mention, it's very customizable. You can
combine all of the different services into one kind of
widget box, let's call it, where you have your Netflix,
you got your Hulu, you got your Amazon, which obviously
my Amazon TV leans pretty heavily on. The global search
on my Amazon TV searches, like, all of Amazon, and
it gives you products, it gives you TV shows, it

(10:18):
gives you other stuff that's in your set of apps
or subscriptions, but very much leaning toward the Amazon side
of things.

Speaker 2 (10:26):
Yeah, the most important thing about a smart TV, when
we call it that, is that it's able to communicate
with your network or the network that it's attached to,
and could possibly see all of the other devices that
are attached to that network.

Speaker 1 (10:39):
Could possibly, sure. This is true, though. So
Android is probably the most widely used operating system among
smart TVs, but that by no means should be taken
to indicate that other OSes aren't in there. iPhones are
in there as well. Apple has a hand in this.
And while smart device addiction is real, especially when we

(11:01):
talk about mobile devices, I think we should bracket
that as the subject of its own episode in the future,
assuming we don't get black bagged or disappeared. There's more
to the story behind the purposefully addictive technology here. You see,
while we stare into that electronic abyss, even though we
might not know it, sometimes things in that abyss, in

(11:23):
this sort of black mirror, are staring back at you.

Speaker 2 (11:27):
And we'll get into that right after a quick word
from our sponsor.

Speaker 1 (11:35):
Here's where it gets crazy. So we talked about smart devices.
They're popular. Everybody loves them. They're the hottest thing since
fresh baked sliced bread. Whatever.

Speaker 2 (11:45):
Yeah, they do all the stuff we need, they do.

Speaker 1 (11:47):
All the stuff we want, too, right? But with smart
devices comes the concept of surveillance. And we've talked about
this a little bit before in our previous episodes on
big data, whatever your preference may be.

Speaker 2 (12:01):
Oh yeah, and if you're worried about you know your
smart device is tracking you or anything, and you still
have one of these Amazon echoes or maybe a Google
personal assistant like a home or something plugged in and
turned on, you can stop worrying. They've already got you.
It's over. It's just too late. Yeah, we're having a
little bit of fun. There's that's not fully true. You
think that's fun, Matt, But there is some sand to

(12:25):
this idea, and we're gonna get into it a little
bit later.

Speaker 3 (12:27):
Remember back when PDAs were a thing? Sure, personal digital assistants. Yeah,
now we have robot overlords that are like doing our bidding?
But are they really? Are we not really just doing
their bidding?

Speaker 2 (12:39):
Oh, God.

Speaker 1 (12:40):
Right, it's like the old... oh man, there's so many
weird ways to go with this. But we should talk
about the nuts and bolts too, right, because we know
our smart devices have to keep track of the user.
You have GPS, you have the Waze app for when
you're driving, you have Lyft, you have Uber, what have you?

Speaker 3 (12:56):
You know, all things that make the stuff function and
make it convenient for you.

Speaker 1 (13:00):
And then there are also a lot of apps that say, hey,
we want permission to access your microphone or your location,
and you're like, wow, Candy Crush, this is getting serious.
You know, I'm picking that as an example that's not one. Okay, great, Well,
at least we have candy Crush to remain as a
sacrosanct example of good programming. But when our smart devices

(13:21):
are keeping track of us, the kind of surveillance that
they have is, as we can tell, squarely aimed at
tracking our preferences. Let me figure out what you like,
says your mobile device, such that I can give you
better offers, make it easier for you to say yes
to things in the future. And that's why if you

(13:42):
were on our Facebook group, Here's Where It Gets Crazy,
you'll see all the strange, insidious examples, ranging
from hilarious to disquieting, about just how these algorithms
can hone in. I think, Noel, you posted a meme
recently that was Facebook related.

Speaker 2 (14:02):
Yeah, I did.

Speaker 3 (14:02):
It was one of these great Simpsons memes where it
was just an image of like Bart Simpson in bed
and Homer leaning in really creepily like eyeball to eyeball,
and it was I think Homer was labeled as Facebook
ads and Bart was labeled as things I said out
loud but never actually Google searched or whatever.

Speaker 1 (14:22):
Or some people even say I was just thinking about this.

Speaker 3 (14:26):
It definitely started a conversation of people giving examples of
these things. And I've experienced it too. We have a
lot of advertisers that like vet stuff through us, and
sometimes I feel like I just say it or like
I'm talking to you guys about it, and I've never
even like read copy or seen an email or gone
to the site. Next thing, you know, Facebook's serving me
up, you know, Tushy or whatever. That's a bad example. But

(14:46):
like, you know what I'm talking about. You've seen it,
haven't you?

Speaker 2 (14:48):
Sure. No, there's a reason for that. So, you know,
we get into why, what are the motives behind all
of this surveillance. We kind of talked a little
bit about tracking our preferences and everything, but honestly, like, what
are you going to do with all of that? If
you start to really think about and understand
what the economic model is behind all this stuff, you
realize that it's because we, you, me, each of us,

(15:11):
We are the batteries of the economic system, its products.
It's literally The Matrix. We are living in The Matrix, everybody.
We are inside our pods. Our pods consist of your
smartphone and your smart TV and all the things you
interact with, your laptop. That is us. We are the
byproducts of a lifestyle obsession, à la Fight Club.

Speaker 3 (15:29):
No, and here's the thing too, I mentioned
this off air. I got that Amazon TV because it
had a lot of features, It had really high resolution,
and it was dirt cheap and TV prices way down,
and as we saw at the beginning of the show,
TV sales way up. And I think you can't ignore
that there's an exchange going on there with like we're
giving up this part of ourselves in exchange for cheaper

(15:51):
and better, more efficient technology.

Speaker 1 (15:54):
Well, one must ask, at a certain point, one must
ask where the income for the company is actually
arriving from. We've mentioned before one of my favorite examples,
and I won't go into it now because longtime listeners
have already heard this. For a time, Target, the corporation,
the retail store, was not making most of its money

(16:15):
off of selling people baby toys and trousers. Do people
still say trus trousers? Okay, sure, knickerbockers whatever. They were
making the bulk of their income, a huge proportion of
it from selling their security system infrastructure to other companies,
kind of like the way McDonald's makes most of its

(16:36):
money through real estate. So it's okay to sell a
television at a phenomenal loss, right when you know you're
going to recoup that money and then some on something else.
And I think that's what's happening with the televisions. Would
you agree?

Speaker 3 (16:49):
I mean, it sure seems like that to me.

Speaker 1 (16:51):
So we know, we know that no matter who you are,
no matter where you are in this wide world or
just orbiting around it, you have something that really wants
to be your friend, wants to be your best friend,
your teacher, your mother, your secret lover, to quote Homer Simpson.
And this thing that can't wait to be your best

(17:13):
friend is called the advertising industry. It's had its eye
on you for a while. It already knows a lot
about you, right Matt.

Speaker 2 (17:21):
Oh, yeah, it knows a great deal about you. But
it wants to go deeper. It wants to go all
therapist-style on you. It wants to know what you love,
what you hate, who you trust. And also,
how much liquid cash can you get your hands on
in the short term? Now, okay, again, we're joking a
little bit, but you get the point. For real. They
want to know how much you can spend and what
you would want to spend it on if you absolutely could,

(17:42):
right now, if somebody just popped something in front of
your face, right now, what is the number one thing
you would buy? Because we'll find it and we will
show it to you and you will buy it.

Speaker 3 (17:51):
Yeah, and companies just gobble up all of this data
because the level of technology that we're at right now
isn't quite as sophisticated as they would like. We're getting there,
they're certainly pushing it every day, but right now it's
just kind of like a throwing everything at the wall
and seeing what sticks approach.

Speaker 2 (18:09):
You know.

Speaker 1 (18:09):
Artificial intelligence, it turns out, in this regard at least,
is not that intelligent and it needs a ton of help.

Speaker 2 (18:16):
So it's efficient as all hell, but it is not intelligent.
It cannot make the connections a lot of times unless
it is helped out by a human user.

Speaker 1 (18:25):
Still a black box, though. Here's a great example of how it works.

Speaker 3 (18:28):
I often get served ads for things that I've already bought.

Speaker 1 (18:31):
That's... yeah, I was gonna say, too. It's like
we're at the stage where, in the Terminator franchise, the
original cyborgs were easily discernible from organic humans, right? Because
it's a bit ridiculous. There's not any human advertiser who
would say, well, this person just bought a toilet, so
you know what, they need five more toilets.

Speaker 2 (18:53):
You know.

Speaker 1 (18:54):
Like, what's gonna happen? You
just bought a toilet, you'll see an ad
for it, and then you'll go, no, I don't know,
maybe I'll just treat myself. It's bizarre.

Speaker 2 (19:05):
Well, maybe you're a contractor. Maybe it thinks we're all
contractors and we're all building out bathrooms. Fair point.

Speaker 1 (19:10):
Yeah, and we know that, we know that companies have
a lot they want to do with this data, even
if they can't entirely get the rubber to meet the
road and practice. And we'll we'll expand that picture in
a frightening way a little bit later, but for now,
let's think of it this way. Companies gather all the

(19:31):
data they can get their hands on, even the stuff
you would think is unimportant. They give all of that
equal weighting when it comes to picking it up. There's
nothing that gets ignored if it's able to be monitored
and captured. And that's because AI programs, as we said,
just aren't that intelligent yet. They are efficient, Matt, but
they're not that intelligent. Like think about home surveillance systems.

(19:53):
You have a... you know, Noel, you and I probably
have Amazon Echoes, yes, yeah, and you have a Google
Home or something like that.

Speaker 2 (20:00):
Correct.

Speaker 1 (20:01):
Okay, so these home surveillance systems, which is the correct
term for them. These home surveillance systems have these, uh,
these assistants, these programs that will guess at what they
think you said, but they still frequently miss the mark,
you know what I mean, right? But they

Speaker 3 (20:19):
Also aren't listening and I think in theory and unless
you say that wake word right, whatever that might be.

Speaker 2 (20:26):
No, they are always listening, because
they have to hear the wake word.

Speaker 1 (20:31):
I love doing this. Let's prank someone who's listening to
one of those in their house right now. Alexa... no.
Alexa, play... don't, please don't... the remix or the original?
We're kidding, we're kidding. I hope we're all still friends
with our various devices. But we say
this to point out that these things are far from perfect.

(20:52):
And there are a lot of people employed by these companies,
human listeners, right, just like many of us listening today,
who are tasked with going through these things,
listening to the recording of what someone said, then seeing
what the assistant thought they said, and then reconciling the
two to build a better mousetrap for your personal information.

Speaker 2 (21:15):
Yes, and that is where we get into the Amazon
Echo story out of Bloomberg that we were going to
talk about, and that is the fact that there are
thousands of Amazon employees and contractors who, like Ben said,
are tasked with literally listening to what the microphone recorded

(21:35):
in your living room. So, Noel, you were asking,
is it always listening? Yes, the microphone is
always turned on. As long as you've got your Alexa
plugged in or your Google Home plugged in, that mic
is on and it is listening. It doesn't record anything
until you say Alexa or computer or Echo or whatever.

Speaker 3 (21:53):
The key, right, is the wake word? Yeah, sure. Which
is a creepy phrase in and of itself, if you
ask me.
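
(A quick aside for the technically curious: below is a minimal sketch, in Python, of the wake-word gating the hosts are describing, where the microphone feeds a small rolling buffer that is constantly overwritten and audio only leaves the device once a detector fires. The frame sizes, the toy detector, and the capture window are illustrative assumptions, not Amazon's actual implementation.)

```python
# Minimal sketch of "always listening, rarely recording": the mic fills a small
# rolling buffer that is constantly overwritten; audio is only kept once a
# wake-word detector fires. Frame sizes and the detector are stand-ins.
from collections import deque

PRE_ROLL_FRAMES = 25   # ~0.5 s of context, so the wake word itself is captured
CAPTURE_FRAMES = 100   # ~2 s window, matching the average request length cited later

def detect_wake_word(frame: bytes) -> bool:
    """Stand-in for an on-device wake-word model (e.g. one listening for 'Alexa')."""
    return frame == b"WAKE"  # toy condition for this sketch

def listen(frames):
    """Consume an audio-frame stream; only short clips around a wake word survive."""
    pre_roll = deque(maxlen=PRE_ROLL_FRAMES)  # rolling buffer, silently discarded
    frames = iter(frames)
    for frame in frames:
        if detect_wake_word(frame):
            clip = list(pre_roll) + [frame]
            for _ in range(CAPTURE_FRAMES):   # record only a bounded window...
                nxt = next(frames, None)
                if nxt is None:
                    break
                clip.append(nxt)
            yield clip                        # ...and only this clip is sent anywhere
            pre_roll.clear()
        else:
            pre_roll.append(frame)            # everything else is overwritten

# Toy run: silence, one wake word, a short command, more silence.
stream = [b"..."] * 50 + [b"WAKE", b"turn", b"on", b"lights"] + [b"..."] * 200
for clip in listen(stream):
    print(f"captured {len(clip)} frames; the rest never left the device")
```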

Speaker 2 (21:59):
But that is literally an open mic sitting in your house.
It's a hot mic, and that is where it gets
really creepy. Like, okay,
on the surface, it makes complete sense. It's what Ben
was saying. It's quality assurance, right, it's trying to make
that AI better. It's an educational thing for the system.
But below the surface, like if you really break it

(22:20):
down and you take away some of the words that
are in there that make it feel like a fun
and exciting new thing. There is this mic in your room.
It's recording information things that you're saying in your private home,
and it's sending it to some person that you've never met.
And then this stranger is going to transcribe exactly what
you said in your living room. Then it's going to

(22:41):
feed it back into that system, so that when that
mic hears you talk again, it knows exactly what you said.

Speaker 3 (22:47):
But again, it's only after you've said the wake word.
They're not transcribing all your conversations in your living room.
It's all, in theory, stuff that you are attempting to
communicate to the device. Otherwise it wouldn't be any
use to them, it wouldn't help them improve the algorithm
at all.

Speaker 2 (23:03):
But my point here is that that is how it functions,
according to the way the creators want it to function
right now, with this device on. Yes, that is the
forward facing thing. And I'm not saying Amazon is doing
anything illegal or you know, scary like that, but there
is an easy route there to exploit that

(23:24):
microphone that's in your living room.

Speaker 3 (23:26):
Sure. So you're saying, maybe if someone, a bad
actor let's say, got a hold of this? Or do
you think Amazon could potentially be the bad

Speaker 1 (23:33):
actor? Amazon's partners. Let's say, let's foreshadow it that way:
Amazon's buddies, the folks in bed with it. But, Matt,
what's interesting to me about this is that you are
talking in terms of an above-the-surface level. Yes,
my spider sense tells me you've got a below-the-surface

Speaker 2 (23:53):
take. No, well, the below-the-surface take is just, in
my mind, the reality of the situation. We've talked
about it before on here. We kind of hit it
a couple times in this episode already, just that we
are literally bugging ourselves. And, you know, when
you think about a world in which perhaps the
powers that be end up ruling. Let's just say this

(24:17):
United States that we live in. Let's say that some
group comes along and takes over. And now it is
illegal in this land to do X. And let's say
your family or your living situation is X. Now there
is a microphone in your living room or your kitchen
or wherever it is, and if you're just having a
regular conversation about what your life is and what you

(24:37):
are doing, but it is illegal in this land, and
there's a mic in there, right, there's potential that
you could be abused in some way or persecuted.

Speaker 3 (24:48):
That all reminds me of the telescreens in Nineteen Eighty-Four,
which were, like, on one level, seen as
a luxury and, like, a really cool technological
gadget where you could watch whatever entertainment you
so wished. But it was a two-way thing. It
was monitoring you. But there's a certain acceptance of it,
you know, like it's not secret monitoring. Everyone knows they're

(25:08):
being monitored. They just know to stay in line and
not fall outside of the party, you know, doctrine or whatever.
And we've kind of found ourselves in a very similar
situation, where we're, like, complicit in our own surveillance.

Speaker 1 (25:20):
Or well was nothing if not prescient in that regard.
There's a there's here's a real life example or something
that could play out plausibly. And this is heavy stuff.
So imagine that you live in a country that is
least economic developed country economically developed country, and that country

(25:41):
has an authoritarian government and they have strict religious laws
of one sort or another. Let's say that for a
time there was a different regime, and you were maybe
in a same sex relationship, and you and your partner
lived your normal, everyday life. Right. You just happen
to have your device because you like to hear music
when you cook. Who doesn't like that? But then the

(26:04):
regime changes, and now again same sex relationships are forbidden
or haram or whatever. And now that stuff that you
said that got hoovered up into the cloud, now it
makes you complicit in what that government sees as a crime.
And that means that, according to that government, the stuff

(26:26):
that you did, which was perfectly fine, your relationship was
perfectly fine until someone retroactively decided it wasn't. And now
because you wanted to hear the remakes to ignition, now
just because of the conversation that occurred around that time,
now you are in hot water and there's not a

(26:46):
recourse to help you. That's a terrifying possibility. Matt, you
were telling us off air that at least some companies
like Amazon attempt to quell those fears by publicly stating
there are hard constraints on how long an Echo can
record something.

Speaker 2 (27:04):
Oh yeah, absolutely, And again to Amazon's defense, I am
being completely conspiratorial in this and it's just one of
those weird foresight things that both men and I were
talking about there. But Amazon has stated that only a
fraction of one percent of interactions with their devices actually
gets transcribed in this way by a human, by a

(27:25):
human, where it gets sent off. They'd been transcribing
since the beginning of twenty nineteen, and
this is, by the way, from August of twenty nineteen,
from The Ambient. So eight months of transcribing had
gone through, and only point two percent of all requests
to Alexa had actually been transcribed. So that's a
very, very small number of conversations that actually get listened

(27:48):
to and transcribed.

Speaker 1 (27:49):
Interesting how they put it in a percentage, though, because putting
it in a percentage can make something seem smaller than
it is in reality.

Speaker 3 (27:57):
Right, it's a massive sample size, right? That small percentage
is actually a massive

Speaker 1 (28:03):
number. Right, right. It just sounds a little more reasonable.

Speaker 3 (28:07):
Right.
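
(For scale: the only figure cited here is the point two percent, so the request volume below is a deliberately made-up round number, but it shows how a small percentage of a huge base is still a huge count.)

```python
# 0.2% is the share cited above; the total volume is a hypothetical round
# number chosen only to illustrate the scale effect, not a real figure.
transcribed_share = 0.002
hypothetical_requests = 10_000_000_000  # assumed requests over eight months
print(f"{transcribed_share * hypothetical_requests:,.0f} clips reviewed by humans")
# -> 20,000,000 clips under these assumptions
```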

Speaker 2 (28:07):
Well, here's the great thing. When you say, hey, whatever,
Alexa's gonna do it, you're gonna do it. Spread it
out enough. But when you do that, generally you ask
a very short request or a very short question or
something to that effect, and then.

Speaker 3 (28:22):
It does the thing and then like that window of
monitoring is over kind of.

Speaker 2 (28:26):
Right. Generally, yeah, it lasts for about two seconds, that's
the average. So when somebody is transcribing, that's literally
all it is.

Speaker 3 (28:34):
Can they make good money doing that?

Speaker 1 (28:35):
Or is it like they know it's Amazon, They're not
getting paid very well at all?

Speaker 3 (28:39):
Is it sort of like, we as a company, we
do a lot of transcribing interviews and stuff. You think
it's very similar to that? Do you think they even outsource it?
They couldn't.

Speaker 2 (28:46):
It is contractors.

Speaker 3 (28:47):
It is contractors.

Speaker 2 (28:48):
Some of it is contractors, some of it is employees. Got it.

Speaker 1 (28:50):
And this practice currently has no real legal
constraints because, as we know, technology always outpaces legislation. Tale
as old as time. However, I get the feeling that
a lot of us were sort of aware that something's
off with these home assistants, or that there is some
kind of transaction at play. If it's not terrible, if

(29:13):
it's something we're okay with, we knew there was still something.
And you can hear, you know, when things go wrong
in nine one one calls and all these other spooky
stories about things going south with Google or Amazon. But
what about the other devices? We have some news for
you about smart TVs. Look around your room or wherever

(29:34):
you happen to find yourself, is there a TV in there?
Things are about to get very interesting for you. After
a word from our sponsors.

Speaker 2 (29:45):
All right, we're back. Let's jump in to something we
learned about thanks to the New York Times, in a twenty
eighteen article. The failing New York Times? They are not failing.

Speaker 1 (29:57):
I don't believe, I mean, not after this bump. I
mentioned this article in the Gang Stalking episode. So you're welcome,
New York.

Speaker 2 (30:05):
Yes you did, Ben. Ben did bring this up, and
we decided we were going to look into it, and
we did, and now we can't look away forever. So
it's a thing called Samba TV. That sounds fun. Samba TV, it

Speaker 3 (30:18):
Sounds really funny. Know what else sounds finn I'm gonna
skip down just just a tad. How does this sound
to you?

Speaker 1 (30:22):
Guys? Hey?

Speaker 3 (30:24):
How about you want to interact with your favorite shows,
get recommendations based on the content you love, connect your
devices for exclusive content and special offers. How about Samba
Interactive TV lets you engage with your TV in a
whole new way. That sounds great sounds.

Speaker 2 (30:42):
Damn good.

Speaker 3 (30:42):
I'm into that.

Speaker 2 (30:43):
So what is Samba TV?

Speaker 3 (30:45):
Oh?

Speaker 1 (30:45):
Hey man, what's Samba TV?

Speaker 2 (30:47):
Okay, I'll tell you. It is a piece
of software that is present in a lot of television models,
some models from nearly a dozen smart TV brands. And
again this is as of late twenty eighteen. That has changed.
There are more included now, but it is Sony, Sharp, Philips,
a lot of that, all the hits, all the big

(31:09):
ones. This software in particular identifies what is
being watched on the monitor, the television, by literally analyzing
the pixels displayed and then comparing that data to a
set of known media that exists out there.

Speaker 3 (31:23):
It's similar to the way audio, or even YouTube
videos, are flagged for copyright violations.

Speaker 2 (31:28):
Yes, but in this case it is the end user,
the actual piece of hardware, that is being monitored.
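
(To make the pixel-matching idea concrete, here is a minimal sketch of automatic content recognition: reduce each frame to a tiny fingerprint and compare it against a database of known media. Real ACR systems are far more sophisticated; the eight-by-eight average hash, the distance threshold, and the one-entry database are illustrative assumptions.)

```python
# Minimal sketch of automatic content recognition (ACR): reduce a frame to a
# 64-bit "average hash" fingerprint and match it against known media.
# The hash choice, threshold, and database here are illustrative assumptions.

def average_hash(frame):
    """frame: an 8x8 grid of grayscale pixel values (a heavily downsampled frame)."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: brighter than the frame's mean or not.
    return sum((p > mean) << i for i, p in enumerate(pixels))

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def identify(frame, known_media, max_distance=10):
    """Return the closest known title, if its fingerprint is near enough."""
    h = average_hash(frame)
    title, fingerprint = min(known_media.items(), key=lambda kv: hamming(h, kv[1]))
    return title if hamming(h, fingerprint) <= max_distance else None

# Hypothetical database mapping titles to precomputed frame fingerprints.
known_media = {"some known show, frame 10234": 0x00FF00FF00FF00FF}

# Toy 8x8 frame: alternating bright and dark rows.
frame = [[255 if row % 2 == 0 else 0] * 8 for row in range(8)]
print(identify(frame, known_media))  # -> "some known show, frame 10234"
```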

Speaker 1 (31:35):
Right. And this was always coming. Nielsen ratings, like,
the Nielsen institution wanted this.

Speaker 2 (31:41):
Yes, there needed to be a way to find out
who was watching what when, and in particular, when you're
saying about who, it means everybody who is around, not
just that this household is watching something. But here's the
idea: your viewing history is then in part used to suggest,

(32:02):
as Noel was saying, with the flowery language there that's
actually present in the PR from Samba TV.

Speaker 1 (32:07):
Oh yeah, they wrote that copy.

Speaker 3 (32:09):
Well, it's not only the PR, it's the opt-in
message.

Speaker 2 (32:11):
Yes, yes, it's used to suggest the next content that
Samba TV believes you will yourself enjoy. But that is
not all that Samba TV does. It also identifies all
the other devices that are connected to the same network
through which it is accessing the internet.

Speaker 1 (32:29):
So your friend comes over, they have a phone with
Wi-Fi. Now they're in the loop as well.

Speaker 2 (32:35):
Yes. If you are Netflix and chilling, or whatever the
kids call it these days, at somebody else's house or
apartment, and Samba TV is there, it knows that you're
there, because it can identify your device and the MAC
address and all those things.
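
(A sketch of the underlying mechanism: any software on a network can enumerate its neighbors' IP and MAC addresses with an ordinary ARP scan. This example uses the scapy library, usually requires root privileges, and assumes a common home subnet; it is a generic technique, not Samba TV's actual code.)

```python
# Sketch of LAN device enumeration via an ARP scan, using the scapy library
# (pip install scapy); typically requires root. The subnet is an assumption.
from scapy.all import ARP, Ether, srp

def discover(subnet="192.168.1.0/24"):
    # Broadcast an ARP "who-has" request for every address in the subnet.
    packet = Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=subnet)
    answered, _ = srp(packet, timeout=2, verbose=False)
    # Every reply reveals a live device's IP and MAC address; MAC prefixes
    # can even identify the hardware vendor.
    return [(reply.psrc, reply.hwsrc) for _, reply in answered]

if __name__ == "__main__":
    for ip, mac in discover():
        print(f"{ip:15} {mac}")
```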

Speaker 3 (32:48):
And here's the thing. This company claims that it is
adhering very closely to privacy guidelines set forth by the
Federal Trade Commission, that it does not directly sell any
of this data. Instead, advertisers can pay the company
to kind of guide the hand of the ads and
the placement, which makes sense. Doesn't sound too insidious, right?

Speaker 2 (33:09):
Well, it's directing ads to the other devices that are
present, whose owners they believe are watching the television program, right?

Speaker 1 (33:17):
Right, right. And so your opt-in stuff happens at
the television, right at the TV, but it doesn't happen
at your smartphone, necessarily, if you walk into someone else's house.

Speaker 3 (33:28):
And so, how do they get around the legality of
other people? Like, are they automatically part of your opt-in
when they visit?

Speaker 1 (33:37):
That's what I'm saying, there's no informed consent. Interesting. And
also, think about this: this technology is amazing. If our
species was less of a garbage fire, we could use
this to do wonderful things for, you know, say, someone's
mental health, right? And someone's like, okay, I see you've
watched Faces of Death 4 nine times in a row.
I'd like to recommend The Great British Baking Show. Do

(33:58):
you know what I mean?

Speaker 3 (34:00):
You do a better job with your content consumption. No,
it's totally true. And when, let's say, you plug
in your new smart TV and it has Samba on it,
it will present you with that very flowery language that's
the opt in message. There is a giant terms of
service and privacy policy, you know, page that you can

(34:22):
peruse if you wish. I believe it's sixty five hundred
words for the terms of service and four thousand words
for the privacy policy. But why would you even bother
doing that when you can interact with your favorite shows,
get recommendations based on the content that you love.

Speaker 1 (34:36):
That terms of service is a real page turner.

Speaker 3 (34:38):
Oh yeah, yeah, But why would you want to do
that when this seems so innocuous and you just want
to start playing your Fallout four?

Speaker 2 (34:46):
Yeah, well, you see, here's the real insidious thing.
Let's put yourself in the position of:
you've spent let's say the last couple of months saving
up money because you really, you know, you need to
get this new TV. You're really excited about it. You
finally get it right, and you're installing it your you know,
your hands are all sweaty because you know Fallout four,

(35:07):
the next playthrough is about to happen. You're like, oh
my god, this is I'm so excited. You plug this
thing in, you start going through the initialization process, the
Samba thing pops up, and, you know, you
literally have to decide if you're going to spend an
hour parsing through all of that legalese, or if
you're gonna get to whatever it is you wanted
to get to, and most people just click enable and

(35:31):
move forward. It has nothing to do with the flowery
language or anything like that. It says enable, Okay, yes,
this gets me to the next thing. Just click enable.
And that is exactly what most people do.

Speaker 1 (35:41):
Right, on the order of an estimated ninety percent, a
vast majority of people do click enable. And once this
stuff is up and running, it's Katie bar the
door, as they used to say in days of yore. Samba
sees everything that is displayed on the monitor, regardless of
what you're watching or playing or how you're displaying it.

(36:04):
It doesn't matter if you're watching TV. It doesn't matter
if you're watching a film. It doesn't matter if you're
broadcasting a home video.

Speaker 2 (36:12):
Right yeah, it could be literally anything, you know, if
you are broadcasting a home video of something that you
wouldn't want anyone else to see, Samba TV is analyzing it.
It isn't necessarily matching up with any known media, sure,
but if you broadcast the same kind of home videos,
let's say of your kids, maybe a romantic video you
made with your partner. I mean, honestly, who knows. That's

(36:36):
again taking it a little bit further than the known technology
or the known reasons for using it. But it could
be used in the future by someone to figure out
very personal, intimate things about you. But it's sort of like...

Speaker 3 (36:49):
When we read about the NSA and the way the
NSA was monitoring people's phone calls. They weren't recording the
actual audio. They were just capturing the metadata so they
knew how long a call lasted, or like you know
who this web of interconnectedness or whatever, it's similar with this.
It's not like they're recording actually what you're streaming. They're
just capturing the data of what it is, how long

(37:10):
you watched it for, et cetera. Of the pixels, of
the pixels.

Speaker 1 (37:13):
Right. So, okay, this is true even if we
want to be as skeptical, or, I should say, as
credulous as we can, and if we take those pieces
of stated PR copy at their word. This still
has a ton of hilarious, cartoonish vulnerabilities. You can learn

(37:37):
too much about people and there's no way for the
end user to stop it, other than to try to opt out.
But opting out doesn't delete all the stuff it
has already learned about you. Are we being paranoid? Perhaps.
Or perhaps we should introduce you to Alphonso. Oh God,

(37:57):
I love that Alphonso.

Speaker 2 (37:59):
Okay. So, to break this down thus far: we've got
our personal assistants that are always listening, no matter
if we're saying the keywords or not. They have their
microphone on and they are listening. They aren't necessarily recording
all the time, but they are. Now you have your
smart TV over there that is literally watching what you're watching, too,

(38:20):
and it is making informed decisions about what you watch
and sending ads to all the devices in your house. Now,
let's say you're on one of those devices. Let's say
it's an Android device. Let's say you went to the
Google Play whatever it is app store thing, and you've
downloaded some apps and some games. Well a lot of

(38:40):
these apps and games, not all of them, but a
lot of them have partnered with this thing called Alphonso.
So this is a really interesting little piece of software
that's attached to these apps, and what it will do

(39:01):
is prompt you to enable the use of your microphone.

Speaker 1 (39:06):
Right, and these would be things that do not ostensibly
need that kind of access: Pool 3D, Beer Pong
Trick Shot, Real Bowling Strike Ten Pin, you know, these
kind of word-salady names, little fun waste-of-time apps.

Speaker 2 (39:20):
Well, and not just those, some anti-spying software. There's
a ton of apps out

Speaker 1 (39:25):
there, right, because this Alphonso is app agnostic. Yes.
So here's what happens, here's why they want that microphone access.
Because when you're using this app, or when you grant
this app microphone access, Alfonso can figure out what you
happen to watch by identifying audio signals and television ads

(39:48):
and shows and even matching that information with the places
people visit and the movies they see really quickly. Here
is how it works. So we're all hanging out, we're
watching some television show that we're into. Let's go with lost,
something with commercials. So when the show switches to commercial,
there is a pitch an audio signal that goes out

(40:12):
to the room. You cannot hear it, your pets cannot
hear it, your kids cannot hear it. No one can
hear it, and no one is supposed to hear it.
It's only for your phones, and that's what they do:
they communicate with your phone, and then the phone will
also let people know via Alphonso. The phone will let

(40:36):
the users of the app, the real app, the users
of Alphonso, understand who is in that room, where they
came from, maybe where they're going, and what they would
like to buy. That's not what you sign up for
when you walk. You know, you go to a potluck
at your friend's house to watch some kind of film. Right,
and how many apps are we talking here?

Speaker 2 (40:58):
Well, okay, so according to the New York Times, there
were over two hundred and fifty apps on the Google
Play Store with this feature. Right, And if you want,
if you head over to the Google Play Store and
you type in, in quotes, Alphonso Automated. That's A L P
H O N S O A U T O M
A T E D, and you will find all of

(41:18):
the various apps that have this thing installed. But then
if you look at an interview with some
Alphonso people, they said that there are thousands of apps
that they've partnered with, and they didn't want to disclose
all of them, because they have competitors who are trying
to basically get in on their territory. Yeah, approach their territory.
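
(Here is a minimal sketch of how an app with microphone access could detect an inaudible beacon like the one described above: measure the signal's energy at a single high frequency with the Goertzel algorithm. The nineteen-kilohertz beacon frequency and the detection threshold are assumptions; real systems reportedly encode data in such tones rather than merely emitting them.)

```python
# Sketch of detecting a near-ultrasonic beacon in microphone audio with the
# Goertzel algorithm (energy at one frequency, no full FFT). The 19 kHz
# beacon frequency and the threshold are assumptions for illustration.
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Signal energy at target_hz, via the Goertzel recurrence."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)      # nearest frequency bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

RATE = 48_000        # a sample rate high enough to represent a 19 kHz tone
BEACON_HZ = 19_000   # hypothetical beacon pitch, above most adult hearing

# Toy input: a faint 19 kHz beacon buried under a loud, audible 440 Hz tone.
t = [i / RATE for i in range(4800)]  # 0.1 s of audio
audio = [math.sin(2 * math.pi * 440 * x)
         + 0.05 * math.sin(2 * math.pi * BEACON_HZ * x) for x in t]

power = goertzel_power(audio, RATE, BEACON_HZ)
print("beacon detected" if power > 1.0 else "no beacon")  # -> beacon detected
```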

Speaker 3 (41:37):
Remember when spyware was a big concern. This is like
some next level spyware.

Speaker 1 (41:42):
This is spyware.

Speaker 3 (41:43):
Yeah, but it's like different, right? Like, it literally
is opt-in, right? It's spying on you.

Speaker 2 (41:48):
Yeah, it's crazy, and now we know it's cyclical. So
it's all of the devices functioning together in this web
of trying to figure out what you want the most
and how to display that thing to you most effectively.

Speaker 1 (42:01):
I guess, gestalt effect. Yeah. This also, this problem becomes
complicated even further when we realize that private institutions
are not the only actors in this sphere. Indeed, they
may be some of the more innocuous. I love that
you mentioned spyware, Noel, because the best spyware right

(42:22):
now is being built not by private industry but by
state actors. We mentioned Amazon's partners, right, Amazon's partners using
the data. Amazon's partners are alphabet-soup intelligence agencies, or
strongly thought to be so.

Speaker 2 (42:40):
Allegedly. Especially thanks to that early cash injection.

Speaker 1 (42:44):
Right, exactly so. According to a Washington Post article from
twenty seventeen, the United States government has already turned theoretical
exploits and vulnerabilities in this kind of stuff into functioning
attack tools. One of these goes by the objectively badass
name Weeping Angel. Weeping Angel is specifically meant to target

(43:05):
Samsung TVs. This is just a small, microcosmic example, and
this is at least what it was doing two years ago,
according to WikiLeaks. After infestation, Weeping Angel places a target
TV in a fake off mode, so that the owner
believes the TV is off when it's still on. And then
in this fake off mode, the TV operates as a bug,

(43:28):
recording conversations in the room and then sending them over
the cloud to a covert CIA server. This sounds bonkers.
This sounds bananas. I can't believe it's real.

Speaker 2 (43:37):
Why would anybody ever be paranoid? Right?

Speaker 1 (43:42):
Why would they?

Speaker 2 (43:43):
That's crazy?

Speaker 1 (43:45):
And I hope whoever is listening to this around a
smart television has unplugged their headphones and is listening on speaker.

Speaker 2 (43:53):
You know, okay, look, I just have to say this:
everything we've been discussing today, if you are of
a certain mind, perhaps like myself, quite a lot, quite frequently,
it could lead you down a dark pathway where it
feels as though there's surveillance everywhere and you're being targeted

(44:13):
in some way. We can assure you this is not
just about you, no matter what you may think or
no matter what you may believe. Sure it is, it's
mass it's everybody, and again it is not necessarily nefarious.
But it's real.

Speaker 1 (44:33):
Eh, that's a matter of perspective. There is a certain
self importance or self aggrandizing that occurs when people are
suffering from paranoid delusions. Right, But being paranoid about this
sort of stuff does not make you delusional. It means
that you have unfortunately turned over the rock and you've

(44:54):
seen the thing squirming in the darkness beneath. This is
very real stuff.

Speaker 2 (44:59):
I love that, and I also am terrified by it.

Speaker 3 (45:03):
But you know, it's not all bad. I mean, there
are ways of kind of at least stemming some of
this stuff a little bit. Right? Yeah. So, How-To
Geek actually has an easy-to-follow guide on
how to stop Google Home from recording you all the time.
Google Home has a thing where it actually saves your
voice memos. You can check that out. You have to

(45:24):
opt in for constant recording, allegedly, while you can, if
you're an existing user, opt out.

Speaker 2 (45:29):
Yeah, that's the whole thing. They've updated their
terms of service, basically, Google Home has, right, and actually
Amazon has done something similar there, where you have more
choices now. But I think, if
you're a legacy user, you actually can't get out of
some of the agreements you already signed into. Yeah, somebody

(45:50):
fact-check me on that, but I recall reading that
this morning. Here's the good thing. Remember Samba TV we
were talking about it felt so creepy. Literally, all you
have to do is say disable when you get to
that screen and you're installing your TV. That's all you
have to do and you're done.

Speaker 3 (46:05):
Do you really not think it's something? Like, why
do you think people are so prone, ninety percent is
such a massive, like, amount, so prone to
click enable?

Speaker 1 (46:12):
Because you've got a new toy and you want to
take full advantage because it's presented, like I said, as
this lovely way of like making this a better experience
for you the user.

Speaker 3 (46:24):
Why wouldn't I want that?

Speaker 1 (46:25):
Well?

Speaker 2 (46:25):
And it's a menu that you have to click through, right,
So think about it this way. If it's on enable,
so your cursor is on enable when the screen
pops up, you'd have to go down to terms of service,
down to privacy policy, down to learn more, down one
more to disable. The clicks, as stupid as that sounds,
and, you know, as benign as five clicks or four clicks are,

(46:49):
people will take the easier route and just say, okay, fine.

Speaker 1 (46:52):
Enable. I'm in a hurry. I can't. I got
MF places to be, you know what I mean.

Speaker 2 (46:57):
It's true.

Speaker 1 (46:58):
So it is true, and it's an exploit not
just of technology, but an exploit of our own hardwired physiology.
Our brains are built to function this way, right, and
this leads us to some conclusions in what is very
much an ongoing event, right? The first conclusion is that

(47:20):
there are some issues remaining. There's a lack of accountability.
One of the primary issues in this conversation is the
utter lack of accountability on the part of private institutions
as well as government agencies. It is not difficult to
imagine these companies cooperating with intelligence agencies, further exacerbating the

(47:42):
legal pitfalls involved. And again, it's important to point out,
just as a cheapskate, that the people getting their data
gathered are not paid for that information, quite the opposite. It used to be,
you know, that old adage we always said: if
you're not paying for it, you are not the customer,
you're the product. Yes, right, But now the pendulum swings

(48:06):
a little bit further in the wrong direction in my opinion,
because we are paying for these services. We are paying Amazon,
we're paying Google to spy on us to whatever end,
and we are we are not accounting not only for this,
we're not accounting for the larger problem, which is that
insurance companies aggregate this information, your financial institutions aggregate this information,

(48:31):
and there is nothing that stops them from cooperating together
to build a footprint of you close enough. The idea
is that this footprint, this digital impression of you, will
one day have the fidelity such that it can predict
future actions you will take.

Speaker 3 (48:50):
So you're saying that it could, in theory, be used
against you? Yes, very much.

Speaker 2 (48:55):
So we're saying they're going to make android versions of
you, Noel. I'm cool with that.

Speaker 3 (48:59):
That's what some help.

Speaker 2 (49:00):
Yeah, but it's not gonna be about you anymore. No, No,
it's gonna be Amazon.

Speaker 1 (49:04):
And it's fascinating when you think about it,
because now I know

Speaker 3 (49:11):
you're not joking. It's like a shape of
me that is my data, you know, like it's out
there in the matrix. It's a Noel-shaped data cluster.

Speaker 1 (49:20):
Or a Matt-shaped data cluster, a Seth-shaped cluster.
Seth-shaped cluster.

Speaker 2 (49:26):
That sounds like a band.

Speaker 1 (49:27):
Yes, it does, like a tasty treat, it does. I
feel like it's maybe, it's maybe like a Hostess thing.
So, brilliant ideas for desserts to the side:
we are all in this together. We are looking at
the end of privacy as we recognize it. And that's
sort of tricky. That sounds more dramatic than it really is,

(49:48):
because the concept of privacy as we know and enjoy
it today is relatively recent. Yes, right, and everything we've
learned indicates that type of privacy we idealize may end
up becoming a short-lived fad to future historians. We're entering,
you know, Matt, you and I talked about this a
long time ago, an inequality of privacy. Right? Privacy is

(50:10):
a new currency. Some of the world's most influential, powerful,
successful people still have this kind of privacy, right yeah.

Speaker 2 (50:19):
A weird example is just think about how much it
costs to get a good tint on your windows. I'm
not kidding. I'm not kidding. If you see someone drive
by with perfectly tinted like the darkest windows you've ever seen,
that is expensive. Well, and that's literally privacy just in

(50:41):
your car anyway.

Speaker 1 (50:42):
I'm sorry, I feel like something's going on with your car. Okay.

Speaker 2 (50:47):
No, no, I'm just saying that that amount of privacy
just to be on the road driving costs money, right yeah.
And if you think about really good shutters on a
home or something like that, in those little examples, it
takes quite a bit of means to protect yourself just
from someone viewing with their eyeballs where you are at

(51:09):
any time. And then if you apply that to the
digital space, it gets more and more expensive.

Speaker 1 (51:15):
It's creating something very similar, in the nuts and bolts and
the mechanics of it, to the infamous Sesame Credit that's
occurring on the Chinese mainland, precisely. And I'm not being
alarmist about this, and, you know, I don't
want people to be any more frightened than is absolutely appropriate.
You should be a little. We're at the Pandora problem, right?

(51:37):
Once the Pandora's jar is unscrewed, once the lid is off,
there is no going back. There were some rumblings in
Congress about investigating what is essentially a smart TV spy ring,
but the advantages of keeping the technology in play for
now seem to outweigh the problems of consent and the
fact that consent is not occurring, and the fact that yes,

(51:59):
this could end up, with your information from
other places, such that it might affect your ability to
get a car loan. It might affect where you can live.
This can get very dirty, very quickly.

Speaker 3 (52:12):
It's a slippery slope. And especially, I mean, what if,
like, we can't opt out anymore, you know? If
that goes away, like, are we really owed that right
to opt out? It's sort of, like, almost a
PR move to allow us to opt out. Like, you
could very easily, as the manufacturer of a product,
say, well, if you don't want us to have your data,

(52:33):
don't buy the products. Like, it's sort of almost a courtesy,
if you think about it, to allow people to opt
out of this.

Speaker 1 (52:38):
Yeah, but what if it could just...

Speaker 3 (52:39):
Are you owed a fancy television?

Speaker 1 (52:41):
But, no, but are you owed? Are you
required to participate in some of these systems?
Insurance, one is required to

Speaker 3 (52:50):
Do so. Absolutely, that's different, I think. But I guess
what I'm saying is, like with the television, all these
are gadgets that, like you could not necessarily call necessities.

Speaker 2 (52:59):
Well, think about most people with a steady job,
think about email communication nowadays, or an app that's used
wherever you work, or, you know, there. You have to
have some kind of connection like that. You really do,
and with most of these devices, you're going to run

(53:20):
into these issues.

Speaker 1 (53:22):
Or add to that the compounding, complicating factor that for
a huge proportion of people who have mobile phones,
it's their only way to access not just the Internet,
but it's their primary tool for any financial dealings like
people's lives hinge on this thing. Yes, so I see

(53:42):
both sides of that, But here's what I think we
can end with. We can say it's not just the
United States. Way back in twenty seventeen, when WikiLeaks released this,
they showed that digital spying is going to continue to grow.
It's not going to go away.

Speaker 2 (53:57):
We're talking about the Weeping Angel thing, right right.

Speaker 1 (53:59):
Weeping Angel in specific, which again is just for Samsung
TVs, and it's relatively low tech. You have to put
a USB stick in there. But now you don't. Now
it's a whole different thing. You know, anybody
got sort of irritated when you bought a new cell
phone and it had stuff pre-installed that you can't remove?
Think about this times a thousand. That's what's happening. That's

(54:20):
what's going to happen. And it's not just happening in
the US. Other advanced nations, China, Russia, Britain, Israel, and
so on, are creating newer, more robust, powerful tools to
do this, and any nation that can gain access
to this kind of spying technology is going to do so,

(54:41):
and they can do it through a web of private industry.
There are no laws. This is the Wild West, and it's
very bad. It's very bad for the people who are
not at the top of the food chain.

Speaker 2 (54:52):
And with that, we all threw away our devices. We
got out our acoustic guitars or ukuleles and started singing Kumbaya.

Speaker 1 (55:00):
I started writing my version of Ralph Waldo Emerson's On Nature.

Speaker 2 (55:04):
Right, that's correct. I got my djembe out and
we just started playing.

Speaker 3 (55:09):
Finger cymbals from the... yeah.

Speaker 2 (55:13):
And we just you know, commune with nature for the
rest of our lives and watch the sunset dissipate over
the horizon.

Speaker 3 (55:20):
Had an ashram in Quebec.

Speaker 2 (55:22):
Yeah, and then we all woke up and went back
to work.

Speaker 3 (55:25):
Yeah, that's true. And hey, listen, I'm just being devil's
advocate about, like, are you owed a smart TV? And
I agree, it's a great question. Well, with the phone, though,
You're right, the phone is for many an affordable entry
point to the Internet because it doubles as a you know,
a crucial communication device and a way to access the Internet,
which we can all agree is a necessity for things

(55:45):
like banking everything, you know, everything starts on the Internet,
the Web, of course. But I would argue that for
the things like, you know, an Alexa, do we do
we need an Alexa? Maybe maybe for accessibility? Maybe maybe
that's a thingy for like people with disabilities, and Alexa
could be a very important addition to a home for others.

(56:08):
I think it's more of a luxury and sort of
like a neat little gadget.

Speaker 2 (56:10):
You know, you're absolutely right, and, y'all, nobody needs
a smart television monitor right, But very soon the only
available televisions will be smart televisions, at least ones that
are easily purchased.

Speaker 3 (56:23):
I think what I was getting at when Ben talked
about the sesame credit and the slippery slope of all
this is what if there does come a time where
you can't opt out anymore. Just by buying the thing
and installing it in your home, you're opting in. The
Only way to opt out is to not buy it,
or to buy something else.

Speaker 2 (56:39):
There you go, or live in the woods.

Speaker 1 (56:42):
And that's our classic episode for this evening. We can't
wait to hear your thoughts.

Speaker 3 (56:48):
That's right, let us know what you think. You can
reach us at the handle Conspiracy Stuff, where we exist on
Facebook, X, and YouTube; on Instagram and TikTok we're Conspiracy
Stuff Show.

Speaker 2 (56:57):
If you want to call us, dial one eight three
three STDWYTK. That's our voicemail system. You've got three minutes.
Give yourself a cool nickname and let us know if
we can use your name and message on the air.
If you got more to say than can fit in
that voicemail, why not instead send us a good old
fashioned email.

Speaker 1 (57:14):
We are the entities that read every single piece of
correspondence we receive. Be aware, yet not afraid. Sometimes the
void writes back conspiracy at iHeartRadio dot com.

Speaker 2 (57:45):
Stuff they don't want you to know. Is a production
of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.

Hosts And Creators

Matt Frederick

Ben Bowlin

Noel Brown