Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, this is Annie and Samantha.
Speaker 2 (00:06):
And welcome to Stuff I Never Told You, a production of
iHeartRadio, and today we are thrilled once again to
be joined by the charming, the clever Bridget Todd. Welcome Bridget.
Speaker 3 (00:25):
Thank you for having me every time I come.
Speaker 4 (00:27):
I'm always so stoked with whatever superlative you give me
in the intro.
Speaker 3 (00:32):
They get better every time.
Speaker 2 (00:34):
Thank you. I'm trying not to in my mind cheat
and use a thesaurus, but who knows, I might have
to break it out one day.
Speaker 3 (00:43):
One day. We're not there yet, not yet, but.
Speaker 2 (00:47):
Yes, Bridget, We're always so happy to have you.
Speaker 1 (00:50):
How have you been?
Speaker 3 (00:52):
Sweaty as hell?
Speaker 5 (00:54):
Uh?
Speaker 3 (00:54):
Big? I don't know if it hit you all, but
big heat wave where I'm at.
Speaker 4 (00:58):
Yeah, it's horrifying. Our listeners can't see me, thank god.
Speaker 3 (01:04):
But the glistening is not dewy skincare.
Speaker 4 (01:08):
It is literal sweat because it is hot as balls
on the show. Yeah sure, yeah, did you all get
that heat wave?
Speaker 1 (01:19):
Yes, yes, yes we did.
Speaker 2 (01:21):
I remember when we moved into kind of like home studio pandemic recording. Atlanta gets very hot, and I realized, like, you can't have the AC on, mine is really loud, and so I was trying to come up with a solution, and I came up with all kinds of wonky things, like putting ice packs all around me, putting my underwear in the fridge.
(01:45):
But we definitely did get it, and it was brutal,
it was real. It was real tough.
Speaker 4 (01:55):
Yeah, it impacts me as a podcaster because I have like a little soundproof booth, but you can't have a fan or AC in there because it would ruin the soundproofing. So when I'm in there, I'm just like, I mean sometimes, it's just, it's like being in a sauna.
Speaker 3 (02:11):
Yeah.
Speaker 2 (02:12):
I have what I call my sweat shirt, which is just a shirt I wear when I come in here, because I don't want to do, like, laundry every so often. So I'm like, this is the shirt I will sweat through while I'm podcasting.
Speaker 3 (02:27):
I love that. Oh, I don't like doing laundry, and I have to pay for it.
Speaker 2 (02:34):
Yes, a lot of my quarters, no way.
Speaker 1 (02:39):
So many things in those conversations.
Speaker 2 (02:43):
Yes, well, speaking of so many things, I feel a
little bit guilty about this one, Bridget, because it was
a big topic that I was like, you know what,
I would really love to discuss this, and it's just
a huge, spiraling thing, complicated, but I really appreciate you
(03:05):
bringing it, and I think it's really important and I'm
excited to talk about it and to learn more. So
what are we discussing today?
Speaker 3 (03:13):
Today?
Speaker 4 (03:14):
We are diving into the wide world of privacy, specifically
whether or not messaging apps like WhatsApp actually are private.
And I know I can already hear somebody in their
car or washing their dishes right now thinking snooze. Privacy
has nothing to do with me. I don't break the law,
I have nothing to hide. Whatever they have on me,
(03:36):
they already have it. Who cares? That is actually a
very common attitude called privacy nihilism, where you just are like,
the government has all that they're going to have on me,
it doesn't matter. And honestly, I get it. Most of
us probably don't really think twice before you hit send
on a message, right, but what if hitting send could
put you at risk? And the truth is, in twenty
(03:58):
twenty five, it literally can. If you're texting about abortion or
a protest or whatever, your messages could be watched, flagged,
or even used against you. Privacy isn't just for people
with something to hide, It is for people with something
to lose. And right now, in twenty twenty five, given
everything that is going on, that is all of us.
Speaker 2 (04:19):
Yes, and we're going to get into this more. But
it seemed that companies at least have tapped into this worry,
this concern, and they're competing with each other in ad campaigns.
Speaker 4 (04:34):
Well, Annie, you sent me this ad from WhatsApp that,
in my opinion, heavily implied that WhatsApp is meant to
be like a private, secure messaging platform and that one
could even trust it for a private conversation about healthcare,
like maybe if one was trying to pursue an abortion,
this would be a platform that you could do that securely.
Speaker 3 (04:52):
You were also telling me about another ad from Apple.
Speaker 4 (04:54):
Where it was like camera birds being launched at people
or something.
Speaker 2 (05:01):
Yes, yes, that ad is very frightening everyone. It reminds
me of the Black Mirror episode with like the robot dogs.
Speaker 3 (05:08):
Oh yeah, army robot dog.
Speaker 4 (05:11):
That's like like it's like the post apocalyptic army robot
killer dog.
Speaker 3 (05:16):
Oh my god.
Speaker 2 (05:17):
Yes, but there's just these like cameras that have bird
wings and they have this screeching sound they make, and
they fly after people on the street who are looking
at their phone, implying that like they're watching you if
you're on your phone or on your laptop, and they're
like, crash into windows, and there's this really piercing sound.
It's really disturbing. And then it's an ad for Safari
(05:41):
that's saying, like, other browsers watch you, Safari doesn't.
Speaker 4 (05:47):
Definitely, they got their finger on the pulse, and the
pulse of the moment is concern about our vast surveillance state.
Speaker 3 (05:53):
That much is clear.
Speaker 4 (05:55):
I think people are worried about it, and these I
think these companies are like, oh, let's tap into
people's understandable concern about the surveillance state right now.
Speaker 2 (06:05):
It's so odd though, because, like, when I sent you that WhatsApp campaign, it's strange to me in a way that we're living in such a dystopian world where there are ads that are like, you don't want anyone to know about your medical stuff. It just has this vibe of, overall, it's probably not safe
(06:28):
for people to know about these things. I don't know
which is true. It's absolutely true, but it's strange that
it's in an ad campaign.
Speaker 4 (06:37):
It's weird to get reminded of it while you're watching
Bob's Burgers. Like, I totally understand the like it does
feel rather dystopian.
Speaker 3 (06:46):
I totally get it. Yes.
Speaker 4 (06:49):
Oh, and I should say right off the top that
I'm not like an online privacy expert.
Speaker 3 (06:55):
I don't work in it. I would more call myself
a privacy enthusiast.
Speaker 4 (07:00):
One of the projects that I work on, in addition
to my own podcast, There Are No Girls on the Internet,
is a podcast that I make with the Mozilla Foundation,
the makers of the browser Firefox, called IRL, and it's
a podcast all about technology, AI, ethics, and of course privacy.
We actually just launched our brand new season a few
weeks ago, so folks should check it out. But what's
interesting about the show, I think, is that we're really
(07:22):
trying to make these conversations that might feel wonky or
inaccessible accessible for everybody. And I think that it's important
for folks like me, who aren't necessarily privacy experts or
don't work in it, to have a sense of the
privacy of the technology and the platforms that we do
use every day, like we shouldn't treat those as conversations
that are just for experts, And especially in twenty twenty
(07:43):
five when, as we were just discussing, the vibes: it's giving nineteen eighty four, at times it's giving dystopian, it's giving,
you know, something out of a scary novel, only it's real.
That's I mean, I don't know if that's how you
all are feeling, but that's how I'm feeling. And so, yeah,
privacy and the privacy around the apps that we use
(08:03):
every single day becomes even more important.
Speaker 2 (08:07):
Yes, and I know we're going to talk about it,
but we've seen instances of these apps turning over data
that people thought was private, weren't thinking that it was
going to be shared in this way. So, and you know,
I have to say, WhatsApp is an app that I
(08:27):
only have experienced generally with my international friends, but I
did always get this vibe that it was more secure
other apps. So when I saw this ad campaign, what
set me off was not WhatsApp. It's when I saw
Meta at the bottom of it. Yeah, and I was like, okay, wait,
(08:48):
now I have a lot of concerns. Now I have
a lot of concerns. So yeah, that's what's going on
with WhatsApp here.
Speaker 4 (08:56):
So let's get into it. For folks who maybe don't
know about WhatsApp, it is a free global messaging platform
that was launched in two thousand and nine, and it
allows folks to send texts, voice messages, video messages, make
voice and video calls, share images, documents, user locations, and
other content.
Speaker 3 (09:13):
It has over two billion, billion with a B, users.
Speaker 4 (09:17):
There are like eight billion people on planet Earth, so like,
that is a lot of users. And in twenty sixteen
it became the number one messaging platform in the world.
And Annie, you were just talking about how you associate
it with like international folks, and that is absolutely true
because it is particularly important for global communication. If you
have aunties or cousins in another country, odds are you've
(09:38):
probably used WhatsApp to stay in touch. And by twenty sixteen,
it was the primary means of internet communication in regions
including the Americas, the Indian subcontinent, and large parts.
Speaker 3 (09:49):
Of Europe and Africa. So yeah, it is a very
very popular messaging platform.
Speaker 4 (09:56):
In twenty fourteen, it was bought by Meta, the company that runs Facebook. But back then it was still just called Facebook. Listeners have rightly called me out for sometimes mixing up those two names. Facebook was Facebook in twenty fourteen; they changed their name to Meta to sort of avoid some scandals.
Speaker 3 (10:10):
But Facebook and.
Speaker 4 (10:11):
Meta for the most part, or like you could think
of them as interchangeable for the most part. And so
they bought WhatsApp in twenty fourteen. And that means that
like the same way that for me, getting online when
I was a kid meant America Online, like America Online
functionally was the Internet for me, Facebook is essentially the
Internet for a whole big swath of people
(10:35):
all over the world, right, And this is all controlled
by one person and one company, Mark Zuckerberg. And so
really think about that, this one private company basically has
total control of the main method of how people get
information and communicate with each other. So that gives him, specifically,
like an incredible amount of power, power that I would
(10:56):
argue has not always been used ethically or responsibly and
not always done in a way that keeps what's best
for people at the forefront. So if you're asking the
big question of like whether or not you would trust
WhatsApp, run by Meta, at their word that they are private, I mean, it really comes down to, like, do you trust Facebook, do you trust Mark Zuckerberg? Because
(11:19):
that's really what it comes down to. How Facebook and
Zuckerberg define what privacy is, like what is private and
what is not private, and like what that means. And
so this is where it gets a little bit confusing.
WhatsApp's messages are end-to-end encrypted by default, so
you don't have to turn anything on to message in
(11:39):
that way, both for messages and for calls. And bottom
line is like that is very good for privacy. So
I want to be clear that I'm not saying that
WhatsApp specifically is bad for privacy. However, the way that
Meta operates as a company is what, in my opinion, makes.
Speaker 3 (11:56):
Things a little bit murky.
Speaker 4 (11:58):
Mozilla's Privacy Not Included folks put it like this: WhatsApp
is owned by Facebook, which means Facebook can access some
data WhatsApp collects on you for specified purposes, which may be.
Speaker 3 (12:08):
Bad for privacy.
Speaker 4 (12:10):
So this means WhatsApp might be collecting data on you
when you use it, and we actually sort of know
this is the case. As of September twenty twenty one, it is known that WhatsApp makes extensive use of outside
contractors and AI systems to examine certain messages, images and
videos that have been flagged by users as possibly abusive,
and it turns over to law enforcement metadata, including critical
(12:31):
account and location information. So again it really comes down
to whether or not you consider that to be private.
Speaker 2 (12:40):
Yeah, and also, given like a lot of other news
that's happened with Meta and Mark Zuckerberg, where they just reverse everything that they said they were going to do. So I feel an extension of that is, do I believe they're actually going to stick with this thing?
(13:01):
Are they just going to change it?
Speaker 4 (13:03):
I mean, I think it's reasonable to look at how
companies historically have moved when they say things. When they
say one thing and then do the opposite, I think
all of that plays into the question of, like whether
or not you're comfortable with them having information about you
that could potentially land you in jail. This is not
a hypothetical. We'll talk more about this in a moment,
(13:23):
but like that has happened. Conversations people had on Facebook
platforms have led to them being in prison.
Speaker 3 (13:30):
So like, in my book, that's like you're
Speaker 4 (13:33):
Giving quite a bit of trust to these platforms. And
you know, as I said, Meta owns Facebook, WhatsApp, Instagram,
and also works with a sprawling network of third parties
and contractors, sometimes in like very opaque ways. And so
my sense is that Meta is like, oh, well, WhatsApp
itself is encrypted, so that means it's private, and then
(13:54):
conveniently leave out that users who are concerned about their
privacy might also be concerned about how and what data
is then being shared with third parties. Like I think
that Facebook is like, well, we're encrypted, so that's private.
And however Facebook and other companies access data, it's still private.
Speaker 3 (14:14):
I think that's their play there.
Speaker 2 (14:18):
Yeah, and it does feel very, I don't know, kind of like shuffling it. Like, even going back to that ad, the Meta symbol is very small in that ad, I have to say. I think they were like, we know people know that we've gone through these scandals, but also kind of shuffling it off to the user of, you should
(14:38):
figure out this was on you. If you were worried
about privacy, then I don't know why you didn't take
whatever steps.
Speaker 5 (14:47):
Yeah, I remember when WhatsApp was bought out, and everybody was really upset, because I think at that point a lot of protests and, like, a lot of gatherings had happened through different messengers like WhatsApp because they were supposed
(15:10):
to be encrypted, and then Facebook bought them out and everybody
lost that. I remember people talking about what to do next,
and that's when Signal kind of got its big boost.
Speaker 1 (15:18):
If I remember correctly.
Speaker 5 (15:19):
I mean, they've always been pretty big, but it got the bigger boost because it eventually became known that WhatsApp was now going to be Facebook, and then they were
real quiet about it.
Speaker 1 (15:27):
They made sure not to change the brand or.
Speaker 5 (15:29):
The logo, and then they were just I think, really
hoping people would forget who owned WhatsApp. And I feel
like the same way, like you were talking about, when people start clicking through the messengers and you get the little, you have to read this, this is our terms of service, and no one reads that. Let's be really honest, it's hard to even decipher. I actually had a similar incident with my loans when everything was
Speaker 1 (15:52):
popping off about DOGE, or doge, uh,
Speaker 5 (15:54):
Doing those weird things and looking at the loans and
everybody's coming up.
Speaker 1 (15:58):
With a that's illegal.
Speaker 6 (15:59):
Y'all told us if it gets breached by, like, organizations like this, then our loans have to be forgiven. My specific loan site came back with a whole agreement that tells me, by the way, a third party can get access if given permission by blah blah blah.
Speaker 5 (16:14):
If you agree to this, you can log in. If
you don't, you can't log in period.
Speaker 4 (16:19):
But like, let's like be so for real, who has actually read all of that? First of all,
you're not an attorney, you're not a lawyer, you're not
a contract expert.
Speaker 3 (16:29):
You don't have
Speaker 4 (16:29):
all day to read what is sometimes, like, thirty pages of fine print. Like, literally there are artists and activists who have printed out the terms of service that you have to, you know, agree to, and it'll be pages and pages and pages long. And that's not reasonable.
It's not reasonable to expect you to be able to
read and understand all of that just to get information
(16:52):
about your loan.
Speaker 3 (16:53):
So like, let's be, let's like be for real.
Speaker 6 (16:55):
They're literally holding your stuff hostage. I feel like
that's also I feel like this was a thing with Facebook,
especially when they started changing terms and like, yeah, if
you want to log in, you can, but you have
to give us access to your computer, your information. We
have ownership of all your pictures that you're posting. Like,
I feel like they were.
Speaker 1 (17:13):
Kind of the beginning of that as well.
Speaker 4 (17:15):
So it's so funny that you say this, because in twenty twenty one this all sort of came to a head when Apple and Facebook kind of had a
dispute about this very thing, right, So Apple made this
update to their app store which gave users privacy labels
that showed all the different data that that app.
Speaker 3 (17:33):
Will link to you.
Speaker 4 (17:34):
So when WhatsApp, owned by Meta, was in the App Store, Apple published this information that's like, oh, here's the data that this app will link to you, and people were like, uh, wait a minute, I was told this was private, like, why are you linking so much data to me? So then after that, WhatsApp updated their privacy policy, which inadvertently highlighted their years-old policy of
(17:57):
sharing certain user data, like phone numbers, with Facebook. So people who were concerned about their privacy obviously freaked out. And then Ireland hit WhatsApp with a record two hundred and
sixty six million dollar fine for an alleged lack of
transparency over how it shares data with Facebook.
Speaker 3 (18:15):
So when Apple actually gave users.
Speaker 4 (18:18):
The tool to be like, oh, I can make an
informed decision transparently about how my data is shared on
this app, people were like, nah, I don't actually like that.
Like it really revealed that perhaps WhatsApp is not as
private as some of these ads would have us believe.
Speaker 2 (18:36):
Yes, And so going back to the ad where I
was like, Bridget, I would love if we could talk
about this.
Speaker 3 (18:44):
It's a WhatsApp ad.
Speaker 2 (18:46):
And I thought it was about abortion, but I've seen
it since and now I'm like, Okay, it's just some
medical situation. It's never clearly stated it's an abortion or
anything like that. But the fact that my mind immediately went to, oh, this is, they're talking about abortion or, like, kind of implying abortion, and I recoiled
(19:10):
so viscerally because I was just like, no, that's that's terrible,
Like we can't be telling people this. And they have
like a nice little, they show people typing, and all the letters get all mixed up with numbers and
stuff like that, so you can't see what they're typing.
But I think that goes to show the concerns a
(19:34):
lot of us have of this technology that's telling you.
They're like, oh no, it's private, you can do this
thing that has unfortunately become dangerous or risky for a
lot of people. And my mind immediately was like, abortion, No,
that's bad.
Speaker 4 (19:52):
Yeah, I saw the ad too. I mean, I think they were trying to, you're not seeing things, like, they don't come out and say it. But
I had the same interpretation of that ad as you did.
And I think it's just, I think people should just know what's up about WhatsApp, like, they should understand what is actually happening. Because if I were to use an analogy to explain it, I would say,
(20:14):
there's private like you told your good girlfriend something and she took it to her grave, she never told a soul. And then there's private like you told your good girlfriend something and she shared it with
her husband.
Speaker 3 (20:26):
And it's one thing.
Speaker 4 (20:27):
When we're talking about like a juicy secret, it's another
when we're talking about something that unfortunately, like, could get
Speaker 3 (20:32):
You locked up.
Speaker 4 (20:33):
And you know, when it comes to abortion, we know
that Facebook does have a bit of a track record
with this. So Facebook turned over the chats of a mother and daughter to Nebraska police after they were served with a warrant as part of an investigation into an illegal abortion. Notably, these two women were not
(20:55):
using WhatsApp, they were using Facebook Messenger. But I do think it shows some insight into how this all works because, as I said, it's all the same umbrella company, Meta. So in June, before Roe was overturned, Facebook gave the
police department in Norfolk, Nebraska, access to their private messages
that Jessica Burgess and her then seventeen year old daughter
shared about how to obtain abortion pills. Burgess ended up
(21:18):
getting sentenced to two years in prison and.
Speaker 3 (21:20):
Her daughter got ninety days.
Speaker 4 (21:22):
The platform that they were using to communicate, Facebook Messenger, does offer end-to-end encryption, just like WhatsApp does,
meaning that the chats between the two women were only
visible to them on their phones and not readable by
Facebook or any government entity that makes a legal request
to the company. But that option is really only available
to folks who are using Facebook Messenger on the app
(21:45):
on a mobile device, not like on desktop or on
a laptop or something.
Speaker 3 (21:49):
And the messages are only
Speaker 4 (21:50):
encrypted after a user selects the option to mark those chats as secret. So you can see how that's less secure than WhatsApp, which has that as a default setting; you have to go in and click secret on mobile to get that end-to-end encryption. And what's worse is that Facebook
just immediately complied by giving these chats over to the police,
(22:11):
which they really don't have to do. Legal experts told
The Guardian that Facebook could have fought the warrant in court.
In there other instances where tech companies have refused to
comply with government demands. Apple refused to comply with federal
law enforcement's request to break into an iPhone involving the
Sin Bernardino shooting back in twenty sixteen, and Facebook itself
successfully refused to comply with a wiretapping request for messenger
(22:34):
calls back in twenty eighteen. So they don't have to
give this information over to police, but they do, and
they're able to have it.
Speaker 2 (22:43):
And that's the frightening thing, because all of us use this stuff in some way or another, maybe not Facebook Messenger or maybe not WhatsApp. But it's just a concern with all of these platforms of, you know what, I thought this was something that was private, and now, yeah,
(23:07):
I'm going to jail, or it's been turned over.
Speaker 6 (23:10):
We can talk about the fact that they have been hacked so many times. My WhatsApp has been hacked so many times.
many times. My messenger has been hacked so much.
Speaker 1 (23:17):
I was like, F this.
Speaker 5 (23:19):
I'm not doing this anymore, and I completely refuse to
use it except anytime anyone sends me a message, whether
it's my partner or someone.
Speaker 6 (23:25):
You have to download it if you're not on the laptop.
But even on the laptop, you can't get it unless you download the app and get another code, which is like, why are you making me do this?
Speaker 4 (23:37):
So I actually included all the different instances of WhatsApp
being hacked.
Speaker 3 (23:42):
There are many, there's a there's many, many many.
Speaker 4 (23:46):
At the last minute, I was like, well, hacking is different than privacy, but then again, it's not different than privacy.
Speaker 3 (23:52):
I made a game day decision to be.
Speaker 4 (23:54):
Like, let's let's exclude hacking from the conversation.
Speaker 3 (23:57):
But I hear you, your WhatsApp getting hacked, like, it's a real thing.
Speaker 6 (24:02):
I mean, they want to say, you know, like it's secure and private and all these things, but like, they go hand in hand. How am I gonna
trust you if every other day I get a message
from this person they're saying, Hey, did you send me
these things? Just because I opened an app and it
turns out that someone has hacked into my account? Like, obviously,
if you can't even get like the security to keep
my stuff, okay, how do you have the ability to
(24:24):
keep it private?
Speaker 3 (24:25):
Exactly. And just to be super clear, even
Speaker 4 (24:29):
If WhatsApp their encryption means that they cannot hand the
content of your messages over to the police, if the
police were trying to build a case against you that
relied on location data, like say, did you travel to
XYZ location to obtain an abortion and you live in
a place where you were not legally permitted to travel
across state lines to do so, or like were you
(24:50):
at XYZ protest? WhatsApp has metadata. So even if they can't give up the actual messages, they're fine giving metadata, which includes things like your location or
contacts who you were messaging. They're fine to give that
information over, and they have that information. And
so it's really about understanding privacy in a more holistic
way that it's not just about the content of your messages.
(25:12):
It's about how secure is the platform, like, is it
easily hacked?
Speaker 3 (25:16):
Is it hackable? Does it have a history of being hacked?
You know, what kind of metadata are they collecting? How
do they use that metadata?
Speaker 4 (25:22):
It's really about asking all of these background questions in
addition to just like how do they keep my content safe?
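[Editorial note: the content-versus-metadata distinction Bridget describes above can be made concrete with a small code sketch. This is not WhatsApp's or Signal's actual implementation; it is a minimal illustration, assuming the third-party Python cryptography library and a single shared symmetric key, whereas real end-to-end messaging negotiates keys between devices via the Signal Protocol. The point it shows is that even when the message body is unreadable ciphertext, the envelope around it, sender, recipient, and timestamp, remains visible to whoever relays it.]

```python
# Minimal sketch (not WhatsApp's or Signal's real code) showing that end-to-end
# encryption hides message *content* while *metadata* stays readable by the server.
# Requires: pip install cryptography
import json
import time
from cryptography.fernet import Fernet

# In real end-to-end messaging the key lives only on the two users' devices;
# here we simply generate one for the demonstration.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = "meet at the clinic at 3pm"
envelope = {
    # A relaying server can read all of this metadata...
    "sender": "+1-555-0101",
    "recipient": "+1-555-0102",
    "timestamp": int(time.time()),
    # ...but the content itself is opaque ciphertext without the key.
    "ciphertext": cipher.encrypt(plaintext.encode()).decode(),
}

print(json.dumps(envelope, indent=2))  # what the server (and a subpoena for metadata) can see
print(cipher.decrypt(envelope["ciphertext"].encode()).decode())  # what only the recipient can read
```

[This is why, as discussed above, a request for metadata can still place someone at a location or tie two accounts together even when the provider genuinely cannot produce the message text.]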
Speaker 2 (25:29):
Right. And on top of that, there are also some laws that we have that sort of allow for them to be like, yeah, we can track you. Right? Yeah.
Speaker 4 (25:45):
So I learned this in doing the research for this
episode that thanks to the Electronic Communications Privacy Act, they're
basically able to track users without probable cause. In January
twenty twenty two, an unsealed surveillance application revealed that WhatsApp
started tracking seven users from China in November twenty twenty one, based on a request from DEA investigators. The app collected
(26:06):
data on who the users contacted and how often and
when and how they were.
Speaker 3 (26:10):
Using the app.
Speaker 4 (26:11):
And this is reportedly not an isolated thing, as federal
agencies can use the Electronic Communications Privacy Act to covertly
track users without submitting any probable cause or linking a
user's number to their identity.
Speaker 3 (26:24):
And I should just.
Speaker 4 (26:25):
Add a sort of global note about this whole thing
is that WhatsApp has two different platforms, regular WhatsApp and
then WhatsApp Business that's meant for small businesses to communicate.
Speaker 3 (26:36):
That was rolled out in twenty eighteen. Everything that I've
said so.
Speaker 4 (26:38):
Far is really about regular WhatsApp, and that's because WhatsApp's
business platform is actually less private. So yeah, don't use
that to talk about anything.
Speaker 2 (26:51):
To go.
Speaker 4 (26:53):
Well, I mean it kind of makes sense because like
if you're if you're a business entity, you're communicating with
so many different people in a different way.
Speaker 3 (27:00):
But like I kind of get where they're coming from there.
Speaker 4 (27:03):
But I just wanted to say that in case someone's like, oh,
I have WhatsApp Business, like, let me use it the way that Bridget told me I should be using it.
Like that's a whole different beast.
Speaker 1 (27:13):
Yeah. I thought that was gonna be an upgrade.
Speaker 3 (27:15):
Not a downgrade.
Speaker 1 (27:18):
It's not business class.
Speaker 3 (27:19):
No, it's on a plane. It's like better, but if
it's worse, not this much.
Speaker 2 (27:26):
Okay, okay. Well, we have been, you know, focusing on WhatsApp, and surely there are all of these applications we
(27:52):
could talk about, but you know, WhatsApp in particular, how's it doing when it comes to privacy?
Speaker 4 (28:03):
So, as I said, ultimately, WhatsApp, I would say, is
like pretty private, like good for privacy. Here's how Mozilla's
privacy experts put it: from a technical perspective, no, WhatsApp itself is not really bad for privacy. WhatsApp uses strong end-to-end encryption for all texts, chats, and
video calls.
Speaker 3 (28:21):
This is great.
Speaker 4 (28:22):
WhatsApp cannot read your messages or see your calls. The flip side of this is that Facebook, a company infamous for
its vast and questionably ethical collection of so much data,
owns WhatsApp. This means that lots of metadata, things like
purchase history, location, device ID and more can be captured
and shared with businesses advertising on WhatsApp.
Speaker 3 (28:42):
So people looking for a true.
Speaker 4 (28:44):
Privacy centered messaging app can find much better options.
Speaker 2 (28:48):
Yes, and speaking of options.
Speaker 3 (28:53):
Teeing me
Speaker 4 (28:54):
up beautifully to talk about one of my favorite apps,
and that is Signal. I am such a Signal fangirl.
I had been a fangirl of Signal for a very
long time. I remember the very first time I encountered Signal.
Speaker 3 (29:07):
I was like trying to date somebody, and.
Speaker 4 (29:09):
This person eventually became a very good friend of mine, but initially I was like, oh, what's this platform? And I was telling a friend like, oh, I met this person and they didn't give me their number, they were like, here's my Signal. And my friend I was telling this to was like, are they an eco-terrorist? Because back then, the only person who would have Signal on their phone was an eco-terrorist. Like,
(29:31):
that is how long I've been with Signal. But Signal
is amazing.
Speaker 3 (29:35):
It is free. Everyone listening should download Signal right now.
Speaker 4 (29:38):
And it is run by an incredible nonprofit that is
helmed by this woman, Meredith Whittaker, who is this badass
woman in technology who does a lot of interesting writing
on surveillance.
Speaker 3 (29:48):
I genuinely, like deeply, deeply admire her.
Speaker 4 (29:51):
If you ever want to be really challenged in a
good way, like, look up videos of her lectures.
Speaker 3 (29:56):
She's just a fascinating
Speaker 4 (29:58):
human being. And so Signal is generally considered to be more secure and private than WhatsApp due to its just
general commitment to user privacy as a nonprofit and its
open source nature.
Speaker 3 (30:10):
While WhatsApp uses end-to-end
Speaker 4 (30:12):
encryption, Signal goes further by not collecting any user metadata
and being backed by this nonprofit organization, which generally does
minimize the potential for that kind of data exploitation, right.
And so, whereas they're both using end-to-end encryption,
WhatsApp does collect and share metadata, Signal does not.
Speaker 3 (30:31):
They don't collect any of that.
Speaker 4 (30:34):
So that is definitely the better platform if you're talking
about anything at all sensitive and honestly, it's just a
platform that I use, like as like, most of my
conversations are happening on Signal.
Speaker 6 (30:47):
I'm going to start this joke here. Unless you're dumb
enough to send messages to the people you're not supposed
to be sending, like, you know, top secret information to journalists.
Speaker 4 (30:57):
If you're listening, Pete Hegseth, we're talking about you. You know what makes me mad about that is that, for a lot of Americans, that was probably the first time they had ever heard about Signal,
and Signal is so awesome, it's so great, they have
such a great like mission, and it made me sad
that the first encounter that a lot of people were
(31:19):
having was with Pete Hegseth, like, completely misusing it, and even in some of the reporting about that, they were sort of trying to make it Signal's fault, like, oh, what is this shadowy app that they were using and how could this happen? And it's like, no, Signal didn't do anything wrong.
Speaker 3 (31:38):
It was user error, right.
Speaker 5 (31:41):
Honestly, I was really shocked that they used such a
public like platform that is free.
Speaker 1 (31:45):
I'm like, you're the government.
Speaker 6 (31:47):
Don't you have your own kind of, like, chat application that you use within your own people? Like, I don't know.
Speaker 1 (31:55):
There's a lot I have questions about in this moment.
Speaker 3 (31:58):
I can answer that question for you.
Speaker 4 (31:59):
The answer is absolutely, the United States government spends lots and lots and lots of money to develop and use super secure messaging apparatuses. Yeah, Signal is not the use case for this if you're talking about, like, secret military plans. I do love, like, I read
the different conversations that were happening.
Speaker 3 (32:19):
I can't remember who it was. It might have been
JD Vance. Don't quote me on that.
Speaker 4 (32:23):
Who is like only responding in thumbs up emojis?
Speaker 3 (32:28):
It's one of
Speaker 4 (32:28):
Them that it's like just stick a thumbs up there
and they'll they'll get the gist, Like I don't want
to be in print on this one, which looking back
was probably a smart idea.
Speaker 5 (32:39):
Yeah, the level of, like, comedy here is just
like what is going on, but it feels right.
Speaker 1 (32:45):
It feels about right, the disaster
Speaker 5 (32:47):
and doom in, like, a comedic form. Like, yeah, okay,
that's where we are.
Speaker 3 (32:51):
Sometimes you gotta laugh to keep from crying. Am I right?
Speaker 1 (32:55):
It's true.
Speaker 2 (32:57):
I have to admit to you all, I wrote in a fanfic lately, I had a plot point about this whole thing, because, yes, I was trying to
figure out how to make something that was incredibly stupid
happen in the government, and I was like, well, it
already happened, so I can point to real life evidence
(33:17):
and be.
Speaker 1 (33:18):
Like, you know what.
Speaker 2 (33:19):
I would have thought it was too too stupid, but
it did happen. So I just wrote it in and
it kind of saved me a whole headache of figuring
out next.
Speaker 6 (33:29):
I mean, it's true, like all of the things that
they have predicted, this level of stupidity, it's no longer
funny because it's like right on, yeah, like this should.
Speaker 1 (33:37):
Not be this real.
Speaker 4 (33:38):
You know, stuff is bad when like if you were
writing a fictional work about it and you would be like, oh,
that's too on the nose, nobody's going to buy that
it happened.
Speaker 1 (33:49):
From the headlines, you're good.
Speaker 3 (33:53):
Well.
Speaker 2 (33:55):
You know, one of the reasons I really wanted to talk about this, Bridget, is that, when I saw that commercial,
Speaker 3 (34:09):
I was so
Speaker 2 (34:11):
worried about people believing it. Like, you know, I
was really concerned that people would be like, oh, this
is a secure interface that I can use, and then getting in trouble or having this really terrible experience.
(34:32):
And I do know what you were talking about earlier, that a lot of people do have that sort of burnout of, like, you know what, they already know, they already have it, what's even worth trying? But it really worried me when I saw it, and I
(34:52):
think, given the environment we live in now, it is something that we need to talk about and be aware of and be educated on.
Speaker 4 (35:08):
Absolutely, I'm glad that you're bringing it to the Sminty
listeners because it is important and it can always be
really hard to tell marketing from reality, especially with this
because it is a little bit murky, But I guess
I would just offer, like, the reality is that we're
in a world where our rights are being rolled back,
where protest is criminalized, healthcare is criminalized. In a world
(35:32):
like that, privacy and security is not just a tech issue.
Speaker 3 (35:36):
It is a survival issue, and we.
Speaker 4 (35:38):
Have to protect our data and our conversations the way
that we also protect each other. I know that someone
listening out there is like, Bridget you sound paranoid. It
is not about being paranoid. It's about being prepared and
being cognizant of like the reality that we are in today.
And so if using Signal over WhatsApp and getting into
(35:58):
the habit of doing that, if that could potentially save
somebody listening a whole lot of stress, I want to
offer that, like, there are small, meaningful changes that we
can make to keep things more secure and private that
don't cost us anything that we should be doing. And
I honestly think, like if you are someone who is
(36:19):
I don't even know how to put this, Like I
am like prepper minded a little bit right, Like I'm like, oh,
the vibes are bad, the vibes are weird.
Speaker 3 (36:27):
Like, better get prepared.
Speaker 4 (36:29):
I think right now the big thing we should be
preparing for is cybersecurity, and, like,
Speaker 3 (36:36):
I think that it's probably more likely that we are going
Speaker 4 (36:38):
To see digital attacks and cyber attacks, probably more so
than other kinds of things you could be prepping for.
And so preparing and getting yourself situated for those kinds
of attacks is easy and free.
Speaker 3 (36:49):
It doesn't take a lot. And so I would say.
Speaker 4 (36:51):
If you're feeling unnerved and uncertain in these times, start there. Like I said, it's something that you can manage. It does not take a lot to get yourself in a good, protected place.
Speaker 2 (37:01):
Yes, and you over on your podcast There Are No
Girls on the Internet and in other episodes on Sminty, you've done a lot of episodes about that,
about how that can look and what you can do.
So thank you as always.
Speaker 4 (37:16):
Oh my gosh, thanks for having me, my pleasure. And
if you want more conversations about tech and privacy, check
out the newest season of Mozilla's IRL podcast all about
those very topics.
Speaker 2 (37:26):
Yes, yes, and thank you. We always love having you on. Thank you for taking this topic suggestion. But yes, where can the good listeners find you?
Speaker 4 (37:38):
You can find me at the IRL podcast, on my podcast There Are No Girls on the Internet, and on Instagram at Bridget Marie in DC.
Speaker 2 (37:46):
Yes, well, looking forward to next time, Bridget. In the meantime, if you would like to contact us, listeners, you can. You can email us at hello at stuffmomnevertoldyou
Speaker 3 (37:54):
dot com.
Speaker 2 (37:55):
You can find us on Bluesky at mom stuff podcast, or on Instagram and TikTok at stuff I never told you. We're also on YouTube. We have a book you can get wherever you get your books. Thanks as always to our super producer Christina, our executive producer Maya, and our contributor Joey. Thank you, and thanks to you for listening. Stuff I Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.