Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
All righty then.
Ladies and gentlemen, welcome back to another episode of Privacy Please.
Cameron Ivey here, alongside Gabe Gumbs.
We've got to catch up here a little bit, Gabe, since last week things have just been busy.
I know people didn't get to hear your voice, but I talked about three different privacy fronts.
Well, actually there were three different privacy threats
(00:22):
across multiple different applications.
There were a couple of different things.
Just to recap, I talked a little bit about 23andMe and them going bankrupt, then talked a little bit about the Honda fine under the CCPA related to a complicated opt-out process, and then also talked about the Signal app.
Let's dive into the Signal part, because I wanted to talk to
(00:45):
you about the security side of things.
First of all, when you heard about the Signal thing, because I know we've talked about Signal quite a few times on the show.
It's an encrypted messaging app.
It's, you know, use it at your own will, I guess.
But what were your first thoughts when you heard about this?
Speaker 2 (01:04):
I mean, there were a couple of layers to this, right?
So the first kerfuffle... got to use that word today.
Speaker 1 (01:09):
It's a good one.
Speaker 2 (01:10):
Yeah, it's a good one.
There were some senior folks in the administration discussing some sensitive topics around some national security issues and some attacks and things of that nature, conventional warfare attacks, and someone accidentally added a reporter into that thread.
Then after that, there was an announcement that there were
(01:36):
attackers, specifically Russian-linked attackers, that were targeting Signal for exploitation.
Now, that exploit was actually originally covered by Google Threat Intelligence Group back in February of this year.
You know, that actually wasn't new news at all, that threat actors were targeting Signal.
And here's the thing.
(01:58):
Let's break down what Signal is a little bit more and where the threat...
Wait, before you dive into that.
Speaker 1 (02:04):
I don't know why this is funny to me, but it just made me think of Russian spies and how it's 2025 and we're still dealing with Russian spies.
Speaker 2 (02:16):
Spies everywhere.
It's just spy versus spy, just everywhere, I guess.
Speaker 1 (02:20):
Well, let's dig into it. Okay.
Speaker 2 (02:21):
Spies aren't going anywhere.
Signal, as you mentioned, is a secure, encrypted messaging platform, a communication platform.
You can text, you can talk, you can group chat, you can do all that good stuff.
It does have a couple of flaws, though.
The first flaw is that it's not protected from human error, as nothing is.
(02:44):
And so accidentally addingsomeone to a secure group chat.
Well, you know, that's justaccidentally adding someone to a
secure group chat.
The real question is why werethese folks not using, you know,
a sensitive compartmentedinformation facility, aka a SCIF
, for this type of communication?
Why were they using Signal?
Presumably because Signal isquite trusted as a secure
(03:06):
communication application.
But one would assume that for that kind of top-secret information, Signal would not have been an approved channel.
So there's that problem.
The second problem is the attack itself.
The attack vector isn't against Signal directly, it's against how you use Signal.
So, like, you can link your Signal to different devices so
(03:28):
that you can chat on your laptop and then you can also chat from your phone.
And so the attack targets individuals, gets them to use a QR code to link their Signal to an attacker-controlled instance.
So you see, they didn't hack Signal itself.
(03:50):
They basically still just exploited the human weakness in all of these technologies, which is why SCIFs tend to be completely isolated, even from the regular internet, such that, you know, you can't cross the streams.
So there are, like, two layers of problems happening here, and they all come down to human error.
At the end of the day, the weakest link in any technology and in any attack vector is the human problem.
And so, you know, knowing that senior officials were leveraging
(04:15):
a platform that is being targeted, and, look, we have to assume that that platform has always been targeted because it is a secure platform.
Every spy, I'm certain, has been targeting it for exploitation.
Speaker 1 (04:28):
It's like a girl or guy where they're like, it's the challenge of trying to get into something that is secure.
That's like candy for these people that do this for a living, for a spy.
Speaker 2 (04:39):
I can't get into it?
Yeah right, that's where I want to go.
I want to get right to that information.
Speaker 1 (04:43):
For sure.
Speaker 2 (04:45):
Yeah, for sure.
So, you know, I mean, from my perspective, is Signal still secure?
I think, for anything below nation-state communications, sure, absolutely.
I'm still not necessarily worried about it.
Scanning QR codes, like, look, we've talked about that in the past too.
I'm not really convinced that QR codes are useful enough to
(05:09):
solve for that.
I just don't like them as a solution to the, you know, get-a-quicker-way-to-a-link problem.
I get it, and it is quicker, don't get me wrong, but you can't get me to scan one, really.
Speaker 1 (05:22):
So wait, wait.
Are you meaning, like, how does that work?
I know that there was something with the QR codes, with Signal, but how does that relate to scanning a QR code?
Are you saying, in general, if you have Signal and you use your camera on your phone to scan any kind of QR code, that makes you vulnerable to these types of hackers?
Speaker 2 (05:39):
I'm saying when you look at a QR code, you actually have no idea where that thing is taking you.
Speaker 1 (05:45):
True.
Speaker 2 (05:46):
I can write below the QR code: www.google.com.
That doesn't mean the QR code is taking you to www.google.com, right?
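For the curious, here's a minimal sketch of that "see where it's actually taking you" idea: decode a QR code image locally and print the embedded payload instead of opening it. This assumes the opencv-python package is installed; the image file name is just a placeholder.

import cv2

def reveal_qr_target(image_path: str) -> str:
    """Decode a QR code image and return its raw payload without visiting it."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(f"Could not read image: {image_path}")
    detector = cv2.QRCodeDetector()
    payload, _points, _raw = detector.detectAndDecode(image)
    if not payload:
        raise ValueError("No QR code found in the image")
    return payload

if __name__ == "__main__":
    # "menu_qr.png" is a hypothetical photo or screenshot of a QR code.
    print("This QR code points to:", reveal_qr_target("menu_qr.png"))

If the printed URL doesn't match what the sign or menu claims, don't open it.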
Speaker 1 (05:56):
And it's used a lot.
It's used if you want to connect your phone to your Netflix account on your TV, things like that.
Speaker 2 (06:06):
So COVID introduced QR codes at a lot of restaurants, because we weren't using physical menus for a while.
So it's still a bit persistent in the wild, so to speak.
If I walked around town and just slapped up a QR code that takes you to a site that I control that delivers a payload to your devices, there's no shortage of people that are just going to scan it, just because.
Now, if I get more intentional
(06:26):
with that, maybe I go to a restaurant and I put that QR code over the QR code that the restaurant had there.
Or airports have them now too, tons of airports have them: you pull up to the restaurant inside the airport and there's a QR code in front of you, and you scan that to order and pay.
So I create a site that looks exactly like theirs.
I create a QR code that takes you to my site that looks like
(06:48):
theirs.
Speaker 1 (06:51):
I go to the airport and I just put those over every place, man.
Speaker 2 (06:53):
That's like the new gas station credit card skimmer thing.
That's it.
Yeah, right.
Speaker 1 (06:57):
That's a way broader scope, too, for people to exploit.
That's dangerous.
That's genius, though. Just clicking on random things... it's smart, right?
Speaker 2 (07:09):
You just shouldn't go around scanning random things.
Speaker 1 (07:12):
No offense to people, but there's a lot of dumb people out there.
Yeah, well, there's a lot of trust. It's easy.
Yeah, yeah, yeah, and, I don't know, the convenience of it, and we've been...
Speaker 2 (07:21):
You know, we've collectively and socially asked people to trust it also.
Speaker 1 (07:26):
Like, you know, again...
Speaker 2 (07:26):
COVID really drove a lot of trust in QR codes.
People are like, oh, another QR code, I'll scan that thing, and, like, the same thing.
So if I did that at every place that there's a QR code, and, I'm just going to use that airport reference again, you use Signal and you scan that QR code and I get it to link your Signal account to my controlled Signal instance.
Boom, I'm in.
But you know, the thing is, best we can tell,
(07:50):
in the case of that sensitive information getting out from those government officials, someone literally just fat-fingered adding another contact to the group.
That's what it sounds like.
Speaker 1 (08:02):
Was it human error?
I mean, I don't know which is worse: admitting that that was the accident, or if they were hacked.
Speaker 2 (08:09):
Man, I almost prefer to know that it was just human error, right?
And when I say prefer to know, it's because you're never going to train humans out of this, but it certainly points at least to the necessity to then leverage a technology stack that tries to eliminate as much human error as possible.
Hence the reason SCIFs existed in the first place, right? So that even if someone messes up, there's a technology backstop
(08:31):
that just doesn't allow that to happen.
On the other side of that coin, you know, it's tough for me to call it a vulnerability.
Yeah, we can argue it is a vulnerability, but, you know, again, tricking someone into clicking a link is just another form of phishing.
Really, it's just another form of phishing, and phishing is wildly successful.
(08:52):
It is still the primary attack vector for most ransomware: link clicking, just clicking on links.
The old thing: you don't hack in, you log in.
Speaker 1 (09:04):
Yeah, get them to log in for you.
That's why it'll never go away, because it's just adaptable.
Speaker 2 (09:11):
Yeah.
Speaker 1 (09:11):
Yeah.
So, just for reference, if anybody isn't too familiar with this situation: the encrypted Signal app is what Defense Secretary Pete Hegseth and other leading national security officials within the administration used to discuss bombing Houthi sites this past month.
The Atlantic's editor-in-chief, Jeffrey Goldberg, was
(09:33):
inadvertently added, as Gabe mentioned, to the group, and was privy to the highly sensitive discussions.
They're calling it a spillage, Gabe. Are you familiar with that term?
It can be a career-ender for a military officer.
Spillage, yeah.
Speaker 2 (09:49):
I mean, I'm familiar with that term in different contexts. I presume that in this context it means, yeah, information spilled.
Yeah, yeah, yeah.
Was it really spilled, though, or did someone just, like...
Speaker 1 (10:08):
...they added an extra straw to the glass that wasn't supposed to be in there.
Well, hopefully the truth, the full truth, will reveal itself at some point, but it's interesting, I mean.
So what do you look at this in terms of?
So human error is the culprit here, but is there any advice that you have for anyone that uses Signal for sensitive information, or anything to leave the listeners with from this entire
(10:30):
situation?
What, I guess, is your conclusion, what you feel about it?
Speaker 2 (10:33):
Depending again on what you're using Signal for, I personally wouldn't link Signal to multiple devices in the first place.
I individually do not practice that.
So, you know, if something else showed up as a linked device, that should be a flag.
But back to those QR codes, I'd be super hesitant to scan any QR code.
You know, when you first scan a QR code, like in
(10:56):
your camera, you can see the little link that it shows first.
I don't know if you can actually filter that through something to check it first.
I don't know if you can, like, send the QR code to, like, VirusTotal first or something.
But you know, if that's a thing, we'll look into that and maybe report back on that.
Maybe that becomes a better way to solve for that problem.
The phone makers, it seems like that's a natural place for them
(11:17):
to insert some help for the rest of us.
When you pick up your iPhone, maybe when it looks at a QR code it should first run that through VirusTotal and tell you what's happening.
That's not 100% assurance that you're not going somewhere naughty, but certainly it would be helpful.
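In that spirit, here's a rough sketch of the "filter it through VirusTotal first" idea: submit a URL decoded from a QR code to VirusTotal's v3 API and report how many engines flag it. It assumes you have a (free) VirusTotal API key in a VT_API_KEY environment variable; the polling interval is arbitrary.

import os
import time
import requests

VT_API = "https://www.virustotal.com/api/v3"
HEADERS = {"x-apikey": os.environ["VT_API_KEY"]}  # assumes VT_API_KEY is set

def check_url(url: str) -> dict:
    # Submit the URL for analysis; VirusTotal responds with an analysis id.
    resp = requests.post(f"{VT_API}/urls", headers=HEADERS, data={"url": url})
    resp.raise_for_status()
    analysis_id = resp.json()["data"]["id"]
    # Poll until the analysis completes, then return the verdict counts.
    while True:
        report = requests.get(f"{VT_API}/analyses/{analysis_id}", headers=HEADERS)
        report.raise_for_status()
        attrs = report.json()["data"]["attributes"]
        if attrs["status"] == "completed":
            return attrs["stats"]  # e.g. {"malicious": 2, "suspicious": 0, ...}
        time.sleep(5)

if __name__ == "__main__":
    # The URL here would be whatever you decoded from the QR code.
    print(check_url("https://example.com"))

A nonzero "malicious" count is a strong signal to skip that link; a clean result is helpful but, as noted above, not a guarantee.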
I would avoid QR codes as best I can.
It's tough, because I'd be lying if I said I have never created a QR code for others to scan.
(11:41):
I have, and I know others will trust it.
But you know, it's a true statement that someone could hijack those QR codes as well.
So, you know, I always practice putting the actual link next to or below the QR code if I create one, so someone who doesn't want to scan it can just manually enter it in.
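Here's a small sketch of that practice: generate a QR code and print the destination in plain text beneath it, so nobody has to scan blind. It assumes the qrcode and Pillow packages are installed; the URL and output file name are placeholders.

import io

import qrcode
from PIL import Image, ImageDraw

def qr_with_caption(url: str, out_path: str = "qr_with_caption.png") -> None:
    # Render the QR code, then round-trip through a buffer to get a plain PIL image.
    buf = io.BytesIO()
    qrcode.make(url).save(buf, format="PNG")
    buf.seek(0)
    qr_img = Image.open(buf).convert("RGB")
    width, height = qr_img.size
    # Add a white band below the code and write the destination URL into it.
    canvas = Image.new("RGB", (width, height + 40), "white")
    canvas.paste(qr_img, (0, 0))
    ImageDraw.Draw(canvas).text((10, height + 12), url, fill="black")
    canvas.save(out_path)

qr_with_caption("https://www.example.com/menu")  # placeholder destination

Anyone who would rather not scan can just type the printed URL, and anyone who does scan can check that the decoded link matches the caption.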
Speaker 1 (12:00):
Yeah.
Speaker 2 (12:00):
Yeah.
Speaker 1 (12:01):
So businesses, you know, make sure you have that option, just in case, for people that actually care about not scanning things and keeping their stuff safe.
Okay, well, let's, uh, let's move on to 23andMe then. Talk about spillage.
Yeah, I mean, all of these companies, I feel like these were started just so they can get everybody's
(12:21):
DNA anyways.
Speaker 2 (12:24):
I'm inclined to agree with you.
I'm very much inclined to agree with you.
And the question then becomes, you know, in their filing for bankruptcy, who owns those assets?
Yeah, you know.
Do they just get destroyed?
Probably not.
Someone probably buys them out of bankruptcy, takes them out,
(12:44):
and then what happens to that data?
Is the use of that data transferable to the new entity for a new purpose?
This kind of goes back to some of the language that GDPR was originally formulated around, right? Like, you know, the original purpose of collection and processing.
If that original purpose of collection and processing, and the entity that was granted the rights to collect it in the
(13:04):
first place, changes ownership, like, what happens?
Speaker 1 (13:10):
It's a good question.
My thought when I first heard about it, like, you read the headline: 23andMe's recent bankruptcy announcement set off a wave of concern about the fate of genetic data for its 15 million users.
Who's actually, like, concerned about this?
Is this just, like, people that...
(13:30):
Because I guarantee you people are like, oh crap, I forgot about 23andMe.
Yeah, I actually used that.
Now I care about my privacy because they're going bankrupt.
I mean, you probably shouldn't have given them your DNA in the first place if you're worried about your privacy.
Speaker 2 (13:46):
You know, it's not untrue, but it's hard to expect the average consumer to have really thought about it that way.
There's also the inverse of that, which was, you know, a lot of people were using services like those to reconnect with family members.
Right, yeah, good point.
And to learn whether or not they maybe had genetic markers for cancers and stuff like that.
Speaker 1 (14:08):
Which is why it was a
perfect product to create,
because people are so curious.
Speaker 2 (14:13):
Yeah.
Speaker 1 (14:14):
And I wonder if it
was created for malicious intent
in the first place.
I'm going to argue that it wasn't.
Speaker 2 (14:21):
Yeah, I hope not, but I would also argue that it should have been known that malicious use was always possible, like, always possible.
And it depends on where you want to draw the line of malicious.
If that data ends up in the hands of, say, a healthcare organization who simply uses it to deny coverage because, oh look, you have a DNA marker that says, you know, you're going to
(14:42):
have, you might have an issue.
Like we've talked about in the past...
Speaker 1 (14:46):
I don't think a lot
of people create these kinds of
companies for evil.
I think it's smart because, again, they're thinking, well, if we can get their DNA and help them find family members, people
(15:08):
are going to be really interested in that.
This could be a moneymaker.
Clearly, going bankrupt, it kind of died out.
I guess I don't know the real reason, but usually you're not making enough to keep it going.
Speaker 2 (15:20):
That's usually the reason. Sales aren't going well, yeah.
Speaker 1 (15:23):
Well, any other
thoughts on this?
I mean, there's so much going on in the realm today.
Speaker 2 (15:27):
I think the key takeaway from our two topics today is a friendly reminder that security and privacy really do still have that weakest spot of humans being involved in the decision-making process, and so we should always be mindful of our activities.
It shouldn't be that we have to think, if I sign up for the
(15:51):
service, what does that mean 20 years from now?
But I think we now live, and have lived for a decent amount of time, in a world where that should be a question we ask ourselves: when I sign up for a service and I give information to company A, what becomes of that information in 10 years, 20 years, 30 years, 40 years?
Right, yeah, we should be thinking about those things as
(16:13):
individuals.
Speaker 1 (16:14):
Yeah, because you're
not only trying to protect
yourself, you're trying to protect your family name and your children.
It's just all digital now.
Think of it as a treasure map to your family's riches.
Sorry, I've been listening to a story about the Batavia.
You know that story?
Speaker 2 (16:36):
I'm not that familiar.
Speaker 1 (16:38):
It's a good one, if you haven't heard it.
It's from the 1600s, the wreck of the Batavia, where they crash, or, like, the boat starts to sink, and there's all these innocent families and stuff, because these workers, that are pirates or whatever, they would take their families, because they would try to move them to another part, wherever they're going to work and stuff, to get out of a certain area, and they, like, crashed. Or, not crashed...
(17:00):
Yeah, they did crash on a coral reef.
They crashed on a coral reef, the boat started sinking, so they went to nearby islands, and it just turns into a massacre.
You'll have to look it up. I didn't know about it either, but it's a pretty fascinating story.
Interesting.
Speaker 2 (17:23):
It makes you think, like, that long ago, people were horrible people.
I mean, the 1600s sound, just generally speaking, like a rough time to have been alive.
Like, whenever I ask the question, would you rather live in the future or the past, it's always future, like, always.
Yes, yeah, yeah, totally future, like, totally.
Speaker 1 (17:33):
Could you imagine living in the Black Plague times?
No, I could not.
Speaker 2 (17:37):
I absolutely could not, I absolutely do not want to, and there may be a Black Plague in the future.
But the thing about the past is I already know how much shenanigans have occurred, and I'm not interested in encountering that.
Yes, there will be shenanigans in the future too, but I'll take my chances with those.
Speaker 1 (17:54):
I'll take robots over the Black Plague, thanks.
Right, right, right. Well, anyways, good stuff.
All right, Gabe, thanks for digging into that and for your thoughts. And if anybody else listening wants more on these topics or any of the stuff that's going on with the Honda fine and the CCPA, I mean, that one's pretty self-explanatory.
(18:15):
They're just using outdated technology for opt-out mechanisms, making it too difficult for consumers.
I mean, if we learn anything here in privacy and security, it's this: make opt-outs and opt-ins as simple as possible for consumers.
Don't make it so difficult for people to understand that they're giving up their information.
It shouldn't be a quiz.
It shouldn't be like this puzzle and maze.
(18:36):
It should be a simple swipe right or swipe left, just like those dating apps.
There it is. Anyways, thanks for listening.
We'll see you guys next week.