Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
I'm Manny, I'm Noah, and this is No Such Thing, the show where we settle all our dumb arguments and yours by actually doing the research. On today's episode: is my phone listening to me right now? No such thing, such thing, such thing.
Speaker 2 (00:29):
First question: do you guys use voice assistant devices? I use them very infrequently.
Speaker 1 (00:36):
I'll tell Siri to put something on my calendar. Okay, for most of my life I have not used a calendar at all, until very recently. Now I'm needing to use a calendar. I'm busy.
Speaker 3 (00:48):
Yeah, yeah, man.
Speaker 1 (00:50):
You've never known me to use a calendar. It's not that I wasn't busy enough to use a calendar. It's that I just would remember everything. But now I'm starting to, like, I can't remember everything.
Speaker 3 (01:00):
If it's not on the calendar, it doesn't exist to me.
Speaker 2 (01:02):
Yeah, it's very important to have it on the calendar.
Speaker 3 (01:04):
So that's the only way you use Siri? Yeah, to put something on your calendar.
Speaker 1 (01:08):
It's only to say, "Hey Siri, add this to my calendar." You know, add this to my calendar.
Speaker 2 (01:13):
And that's, like, on your phone? On my phone, yeah. Not like, you don't have an Echo or something.
Speaker 4 (01:18):
I'm an Alexa head, so a lot of shit in my apartment is set up to go through Alexa, and I really notice it when I go on vacation. You forget how ingrained this stuff is in your life until you go somewhere where you can't do the thing, because.
Speaker 3 (01:33):
I'll be like, Alexa, what was the weather? And I'm
in a hotel room talking like a crazy person.
Speaker 4 (01:38):
It's like, or they're like... I'm actually having some issues this week with my Alexa. It's like, I've got to restart it or whatever, it's not working, and I've just been putting it off. And then I'm like, oh, how am I going to turn off the lights? Because the lights are tied to the Alexa, the music, like everything in my house is Alexa. So when
(02:01):
Alexa doesn't work, I'm like, oh, I'm very reliant on this.
Speaker 1 (02:05):
Yeah.
Speaker 2 (02:05):
I don't really do voice. The only voice activation stuff I do is, like, on my Roku remote, where you press the mic and say a movie or whatever.
Speaker 1 (02:12):
Oh yeah, because it.
Speaker 2 (02:13):
Especially in certain apps, it's the way they set up keyboards. It's insane, like, why would you put the alphabet in one straight line? It makes no sense. I would rather they randomize it in a cube, and it would be faster to type stuff in than do it that way. Yeah, so that's the only voice thing I can think of that I use. I never used Siri, and I've never
(02:34):
had, like, one of those, like an Echo or anything like that. But anyway, this is generally a topic that we've actually discussed for almost a decade now. Yeah, taking us back to our days coworking in the office. Yep, we would, you know, have our discussions that would eventually become podcast episodes. Yeah, and if we caught someone, you know, changing their story or backtracking, one of us
(02:57):
would yell out something like.
Speaker 1 (02:58):
Hey, Alexa, play that back. Yeah, you know, as if Amazon's Alexa had a feature that was recording and ready to replay whatever we're referencing whenever in an argument. Now, to be clear, this feature doesn't exist. Yeah, it didn't then, doesn't now, never really has.
Speaker 3 (03:15):
Do you guys remember any of these?
Speaker 5 (03:16):
Oh?
Speaker 4 (03:16):
I remember one specifically. We got in a very heated argument about Katy Perry, when she was hosting.
Speaker 3 (03:25):
You remember this?
Speaker 1 (03:26):
Not really. I mean, I might. I need to hear more.
Speaker 4 (03:28):
It might be the most heated argument we've ever had as friends. We might not be ready to do this, but she was hosting the VMAs, and I was saying, this is a really bad look for Katy Perry. She's too famous to be doing this, and this is going to be not great for her career, like, this is a sign that she's on the downslope of her career. Okay, obviously this week Katy Perry went to space and got a
(03:48):
lot of backlash for it, people not liking the music over the last ten years.
Speaker 3 (03:52):
I'm not saying because she
Speaker 2 (03:55):
did that, that was the beginning of the end.
Speaker 3 (03:57):
It was the beginning and the end.
Speaker 1 (03:58):
Your argument has been validated, yes.
Speaker 4 (04:00):
But there was a lot of back and forth when we were... there was a lot of, because Miley Cyrus had done it recently, I think right before, yes. So we had asked, and, you know, when you get into those sorts of conversations, every line you start to nitpick.
Speaker 1 (04:14):
Yeah.
Speaker 4 (04:14):
So during that conversation... popular, yes, wow, exactly, yeah. And there was a back and forth about Miley and whether or not that was good for her career. And my argument was Miley Cyrus was not on the level of Katy Perry. And that's when we had to bring in Alexa to play back the specifics of that conversation. So that was one argument where we, yes, asked Alexa to play it
Speaker 1 (04:36):
Back. I remember... I don't remember that. That's hilarious. I'm sure I was chiming in, and that just proves, like, back then I didn't even fucking care what I was arguing about.
Speaker 3 (04:46):
No, no, that was the thing, that was the fun aspect of it.
Speaker 2 (04:49):
It's like, who really cares about?
Speaker 4 (04:51):
Yeah, yeah, I think this may have been too early, but the first time Manny and Noah met.
Speaker 3 (05:00):
Oh, this would be a great one.
Speaker 1 (05:01):
Yeah, it's not that.
Speaker 3 (05:03):
It's no one else, Okay, who cares?
Speaker 4 (05:10):
But, uh, Manny introduced himself to Noah, and I would put money on it that Manny said, "Hi, this is Manny," yeah, to introduce himself to Noah, and we're like, why would you introduce yourself by saying "this is" your name?
Speaker 3 (05:29):
Which is funny because now I do that.
Speaker 2 (05:32):
Honestly, it's like, I don't even know now what I even think happened, like.
Speaker 1 (05:36):
You know, you two thought I said, "This is Manny." I mean, that's kind of the stance I've held. I don't even know if I even believe it. It would have had to be... it would have been a complete, like, accident. But I feel strongly that I didn't say "This is Manny." So what did you say? I would have said, "Hi, I'm Manny," "My name is Manny." Kind of sounds like "This is Manny." Turns out
(06:00):
it was "This is Manny." "Hey, this is Manny," shaking your hand.
Speaker 3 (06:06):
All right.
Speaker 2 (06:06):
So Alexa has been on our minds for a long time, and, you know, most people have had some sort of suspicion about their phones and other devices listening all the time. You know, you're talking about something that you haven't searched for, not something you've bought before or anything, and then you're on your computer or whatever later and you're getting served ads for something related to that.
Speaker 1 (06:25):
We've been talking about whether this is happening for a long time. I'm now fully convinced that it's happening. For a while, I've been like, okay, it makes sense: when you type something, your browser's saving the cookies or whatever, it knows what you want to search for. But recently Mia has been getting into gaming, and so she's been playing the Harry Potter video game. Pretty good game.
(06:48):
I'll say nothing about the controversy surrounding it. I don't know what you're talking about. But I've been joking to friends that she's been, keyword, hooked on this game, because she played it a lot. And so I've been telling people, aha, Mia has been hooked on this.
Speaker 3 (07:05):
And it's "hooked," specifically? Hooked.
Speaker 1 (07:08):
Hooked is coming up now because immediately after I've been
saying that, I've been getting Instagram ads for gaming addiction.
So I'm getting ads that are like, hey, is your
kid addicted to gaming?
Speaker 3 (07:25):
Child?
Speaker 1 (07:27):
Big Brother doesn't know that it's my partner, not my child. But I want to show you guys some of the ads I've been getting. So here's one: addiction lawsuit. These are for lawsuits. Join the lawsuit against these gaming companies making your child addicted. How many hours a day?
Speaker 3 (07:43):
Three plus hours a day? Like normal?
Speaker 1 (07:46):
That's crazy, I'm doing that easily. Here's another one. Oh wow, multiple. Oh, I still get these. These are different lawsuits. The previous page is from Right Choice Legal. Yeah, these are different. This one's from Gaming Addiction
Speaker 4 (08:01):
Claims. Okay, and this one says Gaming Addiction. Yeah: gaming giants profit six point six nine million dollars an hour.
Speaker 1 (08:11):
While your child struggles to quit. And so, to me, what other information could my phone be going off of, other than me telling people?
Speaker 3 (08:23):
And you've never gotten gaming addiction ads before.
Speaker 1 (08:26):
I didn't even know it was a thing until now.
Speaker 3 (08:28):
What have you?
Speaker 4 (08:30):
Have you yourself started playing different video games or anything?
Speaker 3 (08:33):
Have your gaming habits changed?
Speaker 1 (08:35):
If anything, less. I'm a huge gamer, but in the past couple of years, my gaming rate has been slowing. And so this one comes up, all right, very soon after I start joking to people that Mia is hooked on video games.
Speaker 2 (08:52):
I have read articles that do debunk this. It's just one of those things where it's, like, impossible to convince yourself, yeah, because you read it and it says, okay, well, they just get demographic info on you. But at the same time, it's always, when it happens to you, you're just like, oh, in this moment, they can't know me that well right now for the timing to happen.
Speaker 4 (09:10):
Yeah. But my theory would be the advertisers know that you and Mia are a couple, and she's looking at something on her phone, and now they are serving it to you.
Speaker 5 (09:20):
Yeah.
Speaker 1 (09:20):
Do you think that's a lot of it too? Is it how close you are to other people who might be looking at stuff? Yeah, I remember you saying this, about how it's not necessarily that they're listening, they're doing something worse, which is, like, seeing who's around you. Just getting a gaming addiction ad, I would be like, what the hell, that's weird. This one trips me up because it's about my kid.
Speaker 3 (09:38):
Yes, yeah, that's right.
Speaker 1 (09:39):
And when I'm saying like, hey, this person's addicted and.
Speaker 4 (09:43):
It is very... yeah, it is very... that's a serious one.
Speaker 1 (09:53):
So, moving forward, today's episode came from some news from our friends at Amazon. This is from an article in Ars Technica. In March, Amazon stopped letting users opt out of sending their Alexa voice recordings to Amazon's cloud. This is meant to support their new AI feature, which is called Alexa Plus. The article points out that people might have privacy concerns after Amazon's prior mismanagement of Alexa voice
(10:15):
recordings.
Speaker 6 (10:17):
And Amazon accused of violating the Children's Privacy law.
Speaker 2 (10:20):
Amazon has agreed to pay twenty five million dollars to
resolve this case.
Speaker 6 (10:24):
Amazon confirms thousands of employees around the world listen to voice recordings captured by the speaker in homes and offices every day. Questions, family discussions, financial matters, even intimate.
Speaker 2 (10:36):
Moments, and then they've also been used in criminal trials and things like that. Right, I think a lot of people shrug at a lot of this stuff, and they go, well, I don't have anything to hide, so who cares? Like, if this thing makes my life easier in whatever way, then I don't really care if some Amazon person is listening to me. Yeah, and beyond the tech thing, I remember, like, politically, a lot of people would say that too,
(10:58):
when, like, Edward Snowden was leaking all his stuff, and, you know, we know the NSA can just, you know, kind of wiretap anyone and listen to whatever, and people would say the same thing. Well, like, I'm not a terrorist, so, like, if Obama wants to listen to me, like.
Speaker 1 (11:10):
Yeah, go for it, you know. Yeah, and, you know, whether it's like, well, I'd rather the country be safe if this is the cost, or, you know, I don't personally value my own privacy, or whatever.
Speaker 2 (11:20):
Yeah, so I kind of want to explore the erosion of privacy, not just in big tech, but also beyond. Of course, in more recent months, we've seen the Trump administration kind of using this stuff as more of a scare tactic, where, even, you know, you can write some op-ed or whatever and then they're deporting you, these sorts of things. Yeah, I guess, what are your thoughts on that generally? Like, how concerned do you guys consider
(11:41):
yourselves with your personal tech and privacy?
Speaker 5 (11:43):
Yeah?
Speaker 1 (11:43):
I definitely. I've been on both sides of it. Like
I used to be when I was younger. I used
to be the person who was like, you know, I'm
not trying to kill anyone, so I don't care if
they're tapping into my thing, Like I'm not that worried
about it. Now as an adult, I'm like, oh, yeah,
wait a second, I don't want it doesn't matter if
you're not gonna you know, if you're your privacy is
(12:03):
your privacy. You don't want, just like any corporation or
politician to be able to tap into it. And I
and I ran into this recently because years ago for
a Business Insider video, I did one of those twenty
three and me ancestry tests and recently they just announced
that they're just like I think they're bankrupt and there
(12:26):
and anyone can buy them and just have that information.
And back then I didn't care, but now I care,
and I'm like, it just came back up, and I'm like,
wait a second, I'm contacting twenty three and me now
to get my stuff deleted. And there's a process you
can do it with, but like it's so long. It
hasn't been deleted yet. Is my deadline until a company
(12:48):
buys twenty three? Yeah, and then they can just say
no and that's now. Yeah.
Speaker 3 (12:53):
Yeah, who knows.
Speaker 1 (12:54):
It's a situation where it's like, all right, I don't know what it looks like if my ancestry gets into the wrong hands, so to speak, but it's still weird. It's your information.
Speaker 4 (13:05):
Yeah, yeah. Well, they've shown that they've used, like, the ancestry stuff to, like... say you have a cousin that commits a crime, or they think committed a crime. They could get the DNA from a scene and say, oh, we know from Manny's DNA that this is this guy. Actually, yeah.
Speaker 1 (13:19):
So, yeah, these days, I'm a little older. I think, like, you know, your information is so unique, and it's like one of the only things about you that's private anymore, and I think it's important to, like, you know, find ways to keep it private.
Speaker 4 (13:34):
I lie to myself about this though, because I try
to pretend like oh, I'm all about my privacy right,
Like there are certain things I won't do.
Speaker 3 (13:39):
Right.
Speaker 4 (13:40):
It's like, Whole Foods has the thing where you can use your palm to pay for things, and I'm like, absolutely not, Amazon, you're not getting my palm. But then it's like, I do all the fingerprint stuff with Apple. Apple has my Face ID.
Speaker 1 (13:53):
Oh, yeah.
Speaker 2 (13:53):
I avoided Face ID for so long, and then once I maybe
Speaker 1 (13:56):
Had to do it, I was like, oh yeah, this makes it easy. Like, yeah, it'll be things like that, where it'll be like... I used to not log in, like, when you're logging into a site and there's, like, okay, you can just log in with Google.
Speaker 3 (14:07):
Yeah.
Speaker 2 (14:07):
I used to always just make my own password and
keep everything separate, and then now I'm.
Speaker 1 (14:12):
Just like, yeah, I'll just log in. Now it's like, if anyone gets into my Google account, yeah, that's over.
Speaker 3 (14:16):
Yeah, yeah, they get my whole life. It's over.
Speaker 1 (14:18):
We're clearly hypocrites. Like, I pay for Clear, which... like, this is like thirty seconds after I do a spiel about keeping your information private.
Speaker 3 (14:28):
It's the thought that counts. It is.
Speaker 4 (14:30):
I guess the thing, right, is this convenience, right? It's like, yeah, that's really it. I care about my privacy until it's too inconvenient to care about my privacy, and then I don't care.
Speaker 1 (14:41):
Right.
Speaker 3 (14:41):
It's like the Alexa is so convenient.
Speaker 4 (14:44):
It's so convenient to be able to, like, turn on my music with my voice, and, like, my lights. And, like, I have a Ring camera, which is horrible, right, but it's like, I have a dog and I need to know when the dog walker comes and at what time. And, you know, it's all this stuff that's horrible, and it's surveillance, and I'm, like, voluntarily giving up my privacy, but I'm justifying it by thinking, yeah, this
(15:07):
is making my life better in X amount of ways, and what is the likelihood that this is going to come back?
Speaker 3 (15:12):
Yeah, to bite me.
Speaker 4 (15:13):
But that is always what people think in any situation. They're never like, I'm giving up my privacy, oh no. It's like, they're always like, it's no big deal, and then obviously it comes back to bite them.
Speaker 5 (15:25):
Yeah.
Speaker 1 (15:25):
So, to spin out some of these questions, I'm going to talk to someone who can actually explain how tech companies like Amazon might use our recordings, and why we should care, or not.
Speaker 2 (15:34):
That's after the break.
Speaker 3 (15:44):
All right, we're back. I'm Manny, I'm Noah, I'm Devin.
Speaker 1 (15:48):
So we're looking at how tech and voice-activated devices are impinging on our privacy. So I reached out to Andrew Couts, he's senior editor of Security and Investigations at Wired, to help debunk some rumors and put our fears to rest. Okay, not quite, but he was super informative.
Speaker 2 (16:04):
So I wanted to start with the basic scenario we
talked about, where you're chatting about something and then you
get served ads for something related to that that you
never searched for before.
Speaker 1 (16:11):
Here's what he had to say.
Speaker 5 (16:12):
There have been many investigations to try to figure this out,
and as far as I know, none of them have
found evidence that the phone is actually listening to you.
Though now even as I'm saying it, I'm doubting that
there's probably some instance at some point where somebody's found something.
But in most cases, to the best of my knowledge,
there's not widespread surveillance of your conversations from your smartphone
(16:37):
happening all the time. I think typically what ends up
happening is that you do search for something or someone
close to you searches for something, somebody you're associated with
searches for something, and then because the proximity of your
device your locations are aligned, then like you end up
getting served ads for things that other people had searched for,
(16:57):
or you just happen to input some information somewhere that
made that ad pop up and that and you just
didn't realize it because we're not all able to keep
in our heads every single thing we do constantly every day.
The way that these ad systems work is basically, like, whatever data your devices are generating through apps, through web browsing,
(17:18):
et cetera, you end up getting put into these buckets. They call them segments. And so it's like, white guy in his thirties who lives in Brooklyn, and that's going to be, potentially, a segment. And they can get really granular, and they can also be pretty broad, and so we're not all as unique as we think we are in a lot of ways. That explains away a lot of the more creepy elements of online advertising.
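To make the "buckets" idea above concrete, here's a minimal, purely illustrative sketch of how inferred profile attributes might be mapped to audience segments, including the proximity/household effect he describes. All field names, segment labels, and thresholds are hypothetical, invented for illustration; they don't reflect any real ad platform's schema or API.

```python
# Purely hypothetical sketch of ad "segments": users get grouped into coarse
# audience buckets from inferred attributes, not from live eavesdropping.
# All names and thresholds below are invented for illustration.
from dataclasses import dataclass

@dataclass
class UserProfile:
    age: int
    city: str
    interests: set[str]       # inferred from apps, web browsing, purchases, etc.
    linked_devices: set[str]  # devices repeatedly seen on the same network/location

def assign_segments(user: UserProfile) -> set[str]:
    """Map one profile to a set of coarse segment labels."""
    segments = set()
    if 30 <= user.age < 40 and user.city == "Brooklyn":
        segments.add("30s-brooklyn")
    if "gaming" in user.interests:
        segments.add("gaming-enthusiast")
    # Proximity/household targeting: interests of linked devices bleed over,
    # one non-listening explanation for eerily well-timed ads.
    if user.linked_devices and "gaming" not in user.interests:
        segments.add("gaming-household")
    return segments

if __name__ == "__main__":
    me = UserProfile(age=34, city="Brooklyn",
                     interests={"podcasts"}, linked_devices={"partners-phone"})
    print(assign_segments(me))  # e.g. {'30s-brooklyn', 'gaming-household'}
```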
Speaker 2 (17:41):
So not a particularly exciting truth, but I think it
makes sense.
Speaker 3 (17:45):
Thoughts.
Speaker 1 (17:46):
Wow. So it's possible that Mia was googling, is my boyfriend addicted to gaming? That's not where I went, but yeah. And then it's because of her proximity to my phone.
Speaker 4 (18:02):
God, I was just gonna ask, who's signed in when she's playing
Speaker 3 (18:08):
Harry Potter? Your account?
Speaker 1 (18:11):
It's my PlayStation, but she has her own account.
Speaker 3 (18:14):
See, I think this is it.
Speaker 2 (18:16):
That actually kind of makes sense.
Speaker 4 (18:17):
I think this is it. It's a new person who
was not playing video games before.
Speaker 2 (18:23):
Audibly and like she's playing kid games no offense, but
like you know.
Speaker 1 (18:29):
Harry Potter, Harry Potter could be... it's like, it's probably teens. Yeah, yeah, a teen, a new teen. Get her playing some... get her, get her on... see what happens, like.
Speaker 4 (18:42):
We do so many things day to day that we don't even think about, like he's saying. And, yeah, we might not google those exact words, but we may be with someone who is looking that thing up. Or we may go to a certain place that, you know, is associated with it.
Speaker 2 (18:54):
And you're less unique than you think. Just like the general thing of, like, no one thinks advertising works on them. It's the same thing of, like, well, there are things where, based on who you are, generally they can be like, okay, you might like this stuff.
Speaker 4 (19:08):
I feel like I come across this, like, once a week, where someone will make a TikTok and people will be like, oh my god, I thought I was the only one who had this experience, and it's like, there's a hundred thousand people who liked this particular TikTok. So, like, just because we don't talk about things doesn't mean we don't have shared interests or shared experiences. And I think people often forget that, like, yeah,
(19:31):
we're all growing up relatively the same.
Speaker 2 (19:33):
Yeah yeah, it's a good lesson for podcasting topics.
Speaker 3 (19:36):
Yeah yeah.
Speaker 2 (19:37):
So then, next, me and Andrew got into the Alexa news we mentioned earlier. Amazon changed its terms of service so you can't prevent your requests to Alexa from going to Amazon's cloud servers, removing a privacy feature that it used to offer. Andrew made a great social video on this and the concerns that people may have, which I wanted to expand on. He said that the response of, well, who cares, I have nothing to hide, when it comes to personal privacy,
(19:58):
is actually a logical fallacy.
Speaker 5 (20:04):
You don't actually know what you have to hide, and you don't actually know what you have to fear, and I think we are living through a moment in time that's making that abundantly clear, where people are being essentially kidnapped off the streets for an op-ed that they wrote for their college newspaper, or any other types of speech that, well, certainly isn't criminal, but is not
(20:25):
even that provocative. And so if you are a person who has a vulnerability at all, which we all do, you have to understand that that can be potentially exploited at some point. And so right now we're seeing that what it means to be a target of government scrutiny is changing by the day, it seems. You know, I
(20:47):
would argue that you can think you don't have anything to hide, and you can think you don't have anything to fear, but you aren't necessarily going to know that. When we're seeing things retroactively being used against people, things that were previously fine and seemingly not a problem and are now deemed problematic or even criminal, that's when we're into really scary territory, where you can't actually make that assessment.
Speaker 4 (21:10):
What that brought to mind for me is that we've all been thinking about privacy
Speaker 3 (21:14):
In the moment in which we're doing the thing, right, we're.
Speaker 4 (21:16):
Like, yeah, oh I currently am looking at x X
is not that bad currently if someone sees this who
cares not thinking that, you know, like the woman wrote
an op ed for her school newspaper. I don't think
she thought in that moment, oh my god, I'm gonna
get like deported because it.
Speaker 2 (21:35):
Is like maybe I'll get some mean emails.
Speaker 4 (21:36):
Yes. But, like, you know, if it were today, she probably would not have written that op-ed. So we don't think about, like, these things that we are searching or looking up. Like, our privacy doesn't just exist in a vacuum. There's, you know, there's going to be a time that comes after whatever we're doing when those things may no longer be kosher.
Speaker 2 (21:56):
So that's his main thesis, and then he kind
of breaks it into three major points for why people
should think about their privacy. So, number one, he says,
you don't know what's being recorded. So when you have
one of these devices in your home, you may not
be aware of when you're accidentally activating it or exactly
when it's recording. Andrew says it's not only recording when
you want it to, and there's no ability to edit
(22:16):
that in real time. So losing that control is part
of the price of having these devices around, no matter
how convenient they might be.
Speaker 4 (22:23):
This happens to me all the time, where, like, sometimes I'll be playing music and then the music lowers, which means Alexa is being activated, but, like, I haven't said anything. So, like, who knows, if music isn't playing, how many times throughout the day it thinks I'm saying something and is recording me, not just when I ask what the weather is and other things. And it's like, yeah, there's a lot of... I do a lot of talking in my,
(22:45):
you know, even just, like, work meetings and stuff. Right now, every time you talk, you're talking to your smart speaker or smart device too. Yeah, keeping in mind that you don't know what it's actually recording.
Speaker 2 (22:56):
Point number two is that you don't know how the company will use your data once they have it.
Speaker 5 (23:00):
They can take that data and kind of use it for business purposes that, I'm sure, are laid out in terms of service that nobody read, or a privacy policy that nobody read, that are probably more expansive than you think. We do know that they use it for training AI. We know that they are, you know, collecting this data and having certain employees listen to the recordings. So there
(23:22):
is a human element, or at least historically there has been. And then the other category is what other people are going to do with that. That can include Amazon employees who get fired and then are mad and take all the data and post it on the internet, or it can include hackers. Amazon is good at security, but it can get hacked too; like, anything that can be designed
(23:42):
can be hacked. And so you have to just assume that it's going to get out there at some point, and think about what that would mean for your life.
Speaker 3 (23:48):
That did so.
Speaker 2 (23:49):
That one reminded me of the recent story about 23andMe that you brought up earlier. I asked if there are risks of something similar happening to, say, Amazon. Andrew said that Amazon is, of course, an extremely huge and wealthy company, so it's unlikely they go bankrupt the same way, but we're still at the whims of their future business dealings. And companies also just don't last forever, so they could always kind of spin off their AI segment and sell that somewhere whenever. Fifty years ago, Sears
(24:12):
or something going out of business would have been crazy, and then they did. So it's like, you know, these things only last so long. Once that data is out of our hands, it's out of our hands, you know.
Speaker 1 (24:21):
Yeah, you never know, like, we might launch, like, No Such Thing warehouses that take Amazon out of business.
Speaker 2 (24:27):
We're planning on it. Yeah, soon, stay tuned. Yeah, twenty twenty six.
Speaker 3 (24:32):
Well I think about it too.
Speaker 4 (24:33):
When Amazon acquires a lot of companies, right, when they become
Speaker 3 (24:36):
Like medical companies and stuff.
Speaker 4 (24:37):
Yeah, it's like, okay, you know, when I signed up for this thing, I didn't realize that Jeff Bezos would one day be, like, controlling this data.
Speaker 3 (24:45):
So yeah, you never even if you trust the.
Speaker 4 (24:47):
Company currently that has your data, you don't know where
it's going to end up down the line.
Speaker 2 (24:51):
His last point, number three, is that privacy, just as a concept, is important.
Speaker 5 (25:00):
I've been covering privacy for probably about the past fifteen years or so. And, you know, one of the things that's striking to me is just how many of the doomsday scenarios that people in the early aughts laid out of what could happen actually have come to fruition, not just in
(25:20):
secretive ways, but, like, entire business models being built around privacy violations. So we have very, very little privacy now. I mean, just, you have your own thoughts, and then there's, like, two apps or something that actually have good privacy protections. The point of why you have to care about privacy as a concept is that you can't
(25:41):
take it back, right? Like, once the data is out there, it's out there, and there's no putting it back.
Speaker 1 (25:46):
In part of me thinks about the like the future,
and like there's almost a zero percent chance that one
hundred years from now will have like the same or
like more or privacy than we do today. And so
that kind of makes me feel like, all right, we're
kind of privileged right now. We should hold onto a
(26:08):
privacy I have, Yeah, while we have it, because our
kids certainly won't have it. I mean, that's a very
bleak worldview.
Speaker 2 (26:16):
So beyond just concerns of you know, big corporations profiting
off of our private data, what if a government wants it?
Can they just access anything with a subpoena? Andrew said, yeah,
it basically happens all the time. The FBI used geofence
warrants from Google to get location data for devices on
January sixth, for one example.
Speaker 1 (26:35):
From some patriots.
Speaker 3 (26:39):
But I think, oh, so it's a crime to be near the Capitol now?
Speaker 2 (26:45):
But more interestingly, he said that most of the time they don't even have to try that hard and do the whole subpoena thing. They can just buy that data from a data broker, and it's not even very expensive.
Speaker 5 (26:56):
We did a story that we partnered with some German publications on, where our German partners had reached out to this data broker and just asked if they could have some of the data, and they gave them, like, a month's worth of data on every device in Germany for free, just as a sample of what they have, because what they actually sell is the real-time access to
(27:17):
that data. And in that instance, we were able to track members of the US military, because there's a bunch of military bases in Germany, and see, like, everywhere they went over the course of a month: when they went to brothels, when they went to the bars, when they went home and back to work, what their route was to and from work. And, you know, we're talking about people at facilities where our nuclear weapons are
(27:39):
stored. And so, like, that's just from advertising data that they literally gave away for free.
Speaker 1 (27:43):
And Andrew says that law enforcement agencies like the FBI and ICE have in the past, and can today, buy the data and use it for whatever purposes they'd like.
Speaker 4 (27:53):
It's crazy too, because I think of, kind of for those Serial heads, Serial season one. There's so much about tracking and trying to figure out where people were. And it was before our phones were obviously tracking us to the degree that they are now. Yeah, the unreliable cell phone towers. And I'm like, oh, if you were, like, you know, if you think Adnan did it,
(28:14):
if you were to commit that crime today, it would be very easy to be like, you're
Speaker 3 (28:17):
Here, here, here, here, you're here and here, right.
Speaker 4 (28:19):
Like, this is what happened with the Idaho suspect: his phone data showed that, you know, he was
Speaker 3 (28:27):
At the house where the people were killed.
Speaker 4 (28:30):
So it is crazy to think just like yeah, and
that's not even something we think about in regards to privacy.
It's like you carry your phone everywhere and you're not
like saying yes, tell them like.
Speaker 1 (28:41):
Something, yeah. And say, you know, if a detective wants to see that data, is it just, like, a request to the company? Do you need, like, a subpoena or anything to be able to access that information, or is that the company's information?
Speaker 2 (28:57):
Yeah, because then there's, like, the case of the San Bernardino shooting a few years ago. Yeah, Apple refused to open it for the FBI. Yeah. Andrew said that Apple didn't do it because, you know, it looks good for them to say, like, we care about people's privacy, even though, obviously, right, you know. But then, like, the FBI was able to
Just get it.
Speaker 2 (29:17):
You know, they have their own hackers, and they were able to get in the back end anyway. But that's another thing where, like, depending on kind of the circumstances, and you can imagine a world where, you know, tech companies are cozying up to different administrations in different ways, they might be more lenient or not, just based on the circumstance and their personal stances, maybe. And again, you don't really know; that's their own assessment of, like, what's
(29:39):
more important, to, like, protect our customer in this sense, or catch some terrorists, or, you know, perceived terrorists or whatever.
Speaker 1 (29:46):
The dynamic of a tech company cozying up with an administration is happening right now with that company Palantir, where, like, ICE is now paying, they just signed, like, a thirty million dollar contract to build, like, a US immigration surveillance program. I mean, those things just change on a whim depending on who's in the administration.
Speaker 2 (30:06):
So then, anyway, I asked Andrew if there's any way
that things can get better or is it just all
too far gone.
Speaker 5 (30:13):
You can't do much about what's already out there, beyond the basically impossible instance where companies just start deleting data, or there's, like, some law that gets passed where they say, like, oh yeah, all that data collection that's happened for the past twenty years, that's illegal now, and you have to go back and delete all of it. That's not going to
Speaker 1 (30:32):
Happen. But he says we could see a digital privacy law passed at the state or federal level, which could make for more transparency on what's actually collected and maybe put some terms on it being deleted. Andrew says some countries in Europe right now, and even China, have better data protections than we do. So there are ways to improve on where we are, even if we can't quite ever reverse the destruction of our collective privacy so far.
(30:55):
It's good to have a little bit of optimism. Yeah, if you care, you can move to China.
Speaker 3 (30:59):
Yeah, I will say too.
Speaker 4 (31:02):
There is a lot more of a focus now, at least with kids, on their privacy and data with companies.
Speaker 3 (31:08):
I could see that at some point, once we do.
Speaker 4 (31:12):
A little bit better job of that, you know, that being a framework for expanding to us adults and our privacy as well. Because I think the assumption is that, oh, when you're an adult, you're aware of what you're giving up.
Speaker 1 (31:25):
Yeah.
Speaker 3 (31:25):
Yeah, but I don't think that's the case.
Speaker 4 (31:27):
I think people would be shocked by the things that
they are agreeing to on a daily basis.
Speaker 1 (31:32):
Yeah.
Speaker 2 (31:33):
So I was curious if Andrew himself used any voice
activated devices. He doesn't really, but he did make a
good point about why people might and how we can
live rationally with these privacy intrusions.
Speaker 5 (31:44):
I will say, like, a lot of the things that are really bad for privacy are great for accessibility. So if people have disabilities, there's just a lot of things that make people's lives a lot easier. And that's why I'm hesitant to call any of these things, like, bad on their face, because they are useful for a lot of people. Even, you know, my eighty-five-year-old mother-in-law, who doesn't get around as quickly
(32:06):
as she used to. She uses an Alexa device, I believe, and I would never take that away from her. Like, it makes her life a lot easier. It reduces the risk of falling and, you know, having a serious injury or going to the hospital, and that's really good. And I think the way you have to think about these things is to have kind of a personal threat model and understand what your risk thresholds are. For my mother-
(32:28):
in-law, tripping and falling is a much higher risk than a hacker stealing whatever she says to her Alexa.
Speaker 2 (32:34):
You have to think about how this can work in your life, and, like, if obviously these things really help you, then that's great. Just be mindful of those sorts of things. He also was just like, especially for certain things, read the terms and conditions and check, like, okay, what can they actually do with this? See if there is kind of what you would see as an overreach, that sort of thing.
Speaker 1 (32:53):
Yeah, it's good advice. I don't think I've ever once looked at terms and conditions.
Speaker 2 (32:59):
Oh yeah. I'm gonna... when, like, sometimes on a video game or something, it makes you
Speaker 1 (33:03):
Like, really go down? It's like, actually, I don't care about myself, it's just automatic.
Speaker 2 (33:08):
I mean, does hearing all this change how you think you guys will go about your lives? Or is it just kind of more, maybe, a thought?
Speaker 1 (33:18):
So I certainly learned more about, like, the extent of what, quote unquote, they have on us. Like, the story about the Germans, like, in Germany, kind of blew my mind. I already knew that all this stuff is compromised, so to speak. My concern is, like, what's the potential of
(33:38):
it being used in a harmful way?
Speaker 3 (33:41):
To me?
Speaker 1 (33:42):
Sure, your interview with him gave me a lot to
think about.
Speaker 2 (33:45):
You're welcome.
Speaker 3 (33:47):
Doing God's work.
Speaker 4 (33:48):
No, I think the thing that I didn't quite think about was, yeah, how can this stuff be weaponized in the not-so-distant future?
Speaker 1 (33:57):
Right?
Speaker 3 (33:57):
And I'm always thinking about things.
Speaker 4 (33:58):
Within a context of now, and not, like, oh, ten, fifteen years down the line, if, yeah, I don't know, someone knows that I listened to Kanye, it can be used against me in some court of law or something, you know, if it's no longer cool to do that.
And I always thought of privacy as in what am
(34:18):
I doing in this moment? If I were caught now doing this thing by someone, yeah, yeah, how bad would that look? And I'm like, I am not doing anything that bad. Yeah, but it's like, oh yeah, norms change, and.
Speaker 2 (34:29):
You never know if someone has some case they're trying to build against you, yep, and how they can spin anything.
Speaker 1 (34:34):
Yeah. All right, thank you for listening. We could cut that out too. Thank you. No Such Thing is produced by Noah Friedman, Manny Fidel, and Devin Joseph.
Theme music is by Manny, with additional music by Xeno Piccarelli.
(34:57):
Big thanks to our guest today, Andrew Couts from Wired. For related links and research, please subscribe to our newsletter at No Such Thing dot show, and if you have a question you'd like us to tackle or want to say anything, shoot us an email at mannynoahdevin at gmail dot com. Lastly, if you like the show, please follow us wherever you listen, and give us a five-star rating and write a nice review while you're at it.
(35:17):
It really, really helps a lot. Thank you to the loyal NST heads out there, and we'll talk to you again soon.