Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
All righty then.
Ladies and gentlemen, welcome back to another episode of Privacy Please. I'm your host, Cameron Ivey, and I got a doozy for you. This week we're going to do a little broadcast on some news from the latest things that are going around in privacy and security. I'm sure you've heard about most of these. If you haven't, let's dig in, let's learn and let's see if we
(00:26):
can, you know, come up with some good ideas on how to handle the current situations.
So, without further ado, let's dive in. First, we're going to talk a little bit about 23andMe. Yeah, did you hear about this one? Holy cow. So, 23andMe. If you're one of the 15 million users of 23andMe, now's the time
(00:50):
to consider deleting your data. Maybe? Just a little bit. The company's bankruptcy filing is raising concerns about how your sensitive information, like DNA and genetic health data, is being managed, right? Everybody's on the internets these days, right?
(01:11):
So this is huge. There was a law professor, Craig Conneth, from the University of Virginia. He warns that the terms of service governing your data could change during the bankruptcy proceedings. That comes with a lack of strong federal regulations, and he's warning that your privacy is at risk, which I would have to agree with. So if you do have an account, I'd go on there and try to
(01:33):
delete that as fast as you can. I wouldn't say it's too late, but better safe than sorry. This is a pretty big deal, so I'd jump on that. Listen, if you want to protect your data, go log in to 23andMe. If you have an account, follow the steps to delete your information, and just remember: it's better to kind of knock that out and be proactive rather than deal with it down the
(01:54):
road.
Right, let's move on to the nextone.
So there's a lot that's happened in the last week or so. Honda CCPA violations. If you haven't heard about this one: in California, the California Privacy Protection Agency, the CPPA, fined Honda $632,500 for violating the California
(02:18):
Consumer Privacy Act.
That's not that big of an amount of money. Why can't I? I can't talk right now. $632,500 is not that much money for a large company like Honda.
However, this could be big.
This could be a lesson for other companies that might not really be prioritizing privacy and opt-out, opt-in mechanisms and
(02:43):
making that easier for your customer base, your consumers. You need to make that option as simple as possible to opt in as well as to opt out.
It shouldn't be this maze of chaos that confuses the consumer, tricking them into giving their data to you or to the company.
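Just to make that concrete, here's a minimal sketch, purely illustrative and not anything from the Honda case, of what a low-friction opt-out can look like in code: honoring the Global Privacy Control signal (the browser's Sec-GPC header) and keeping the opt-out path as simple as the opt-in path. The Flask app, route name and record_opt_out helper are hypothetical names for the example.

```python
# Illustrative sketch: treat the Global Privacy Control (GPC) signal as an
# opt-out, so consumers aren't forced through a maze of forms.
# The route name and record_opt_out() helper are hypothetical.
from flask import Flask, request, jsonify

app = Flask(__name__)

def record_opt_out(user_id: str) -> None:
    """Hypothetical helper: persist the consumer's opt-out preference."""
    print(f"opt-out of sale/sharing recorded for {user_id}")

@app.before_request
def honor_gpc_signal():
    # Browsers with GPC enabled send "Sec-GPC: 1" on every request.
    if request.headers.get("Sec-GPC") == "1":
        record_opt_out(request.cookies.get("session_id", "anonymous"))

@app.route("/privacy-choices", methods=["POST"])
def privacy_choices():
    # Symmetric choice: opting out asks for no more information than opting in.
    choice = (request.get_json(silent=True) or {}).get("choice", "opt_out")
    return jsonify({"status": "recorded", "choice": choice})

if __name__ == "__main__":
    app.run(debug=True)
```

The design point is symmetry: the opt-out path shouldn't demand more clicks or more personal information than the opt-in path.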
But OK, so the allegations include requiring excessive information for consumers to exercise their rights and failing to provide adequate verification processes for authorized agents. So basically, they were taking their data and doing whatever
(03:28):
they pleased with it without the consent of the consumer.
So the CPPA's order mandates Honda to reform its data request process and consult with a user experience designer to improve privacy measures.
Shameless plug.
Yo.
Hit up Transcend, we got you.
So the CPPA, actually, you know, let me take that back. This
(03:54):
is some action that you can take as a consumer.
So, if you're a consumer, stay informed about your rights under
the CCPA.
If you're not, look it up.
If you got questions, reach out.
We're here, we got you.
So you know that you can request changes to how your data is handled and advocate for better privacy practices for
companies you engage with.
Goes both ways.
(04:14):
We've got to be vigilant about our own private data. It's very valuable, especially in today's crazy, crazy world. Progress there, though. It's a good thing that that fine happened, although the one thing I don't like about situations like this is that it happened years ago and is only now coming to light. That's what I
(04:35):
have a problem with, but I understand that that's just the way things go.
Anyways, let's move on to our third and final topic. This just happened. This is the most recent thing. I think it happened a couple of days ago at most.
So the Signal messaging app that is, you know, supposed to be private messaging, all that kind of stuff, that messaging app
(04:56):
recently made some pretty big headlines after senior US officials used it for sensitive discussions. It's known for its encryption and privacy features, but this has raised some eyebrows about its security
(05:17):
, following a warning from the NSA regarding its vulnerabilities.
So Signal's end-to-end encryption and no-data-collection policy make it a really popular choice for individuals that are seeking, you know, that secure communication. If not, you're probably using, like, WhatsApp or something crazy like that. Don't use that.
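If it helps to picture what end-to-end encryption actually buys you, here's a toy sketch. To be clear, this is not Signal's actual protocol (Signal layers X3DH key agreement and the Double Ratchet on top of ideas like these); it just shows that when keys are derived on the two endpoints, the server relaying the message never has what it needs to read it. It assumes the Python cryptography package is installed.

```python
# Toy end-to-end encryption sketch: both endpoints derive the same key, so a
# relay server only ever sees ciphertext. NOT Signal's protocol; illustration only.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# Each party keeps its private key and shares only the public half.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

def derive_key(my_priv, their_pub) -> bytes:
    # X25519 Diffie-Hellman plus HKDF gives both sides the same 32-byte key.
    shared = my_priv.exchange(their_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"demo e2e key").derive(shared)

alice_key = derive_key(alice_priv, bob_priv.public_key())
bob_key = derive_key(bob_priv, alice_priv.public_key())
assert alice_key == bob_key  # same key on both ends, never sent over the wire

# Alice encrypts; whoever relays the ciphertext can't decrypt it.
nonce = os.urandom(12)
ciphertext = AESGCM(alice_key).encrypt(nonce, b"meet at noon", None)

# Bob decrypts with the key he derived independently.
print(AESGCM(bob_key).decrypt(nonce, ciphertext, None))  # b'meet at noon'
```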
So the app's use by government officials raises questions about its reliability in high-stakes situations like this, for
(05:39):
example, because it's, I don't know, it's silly to see this happen, but I'm not surprised.
I just don't.
I don't know, you can't.
This was a doozy.
So I think if you're an individual that's using Signal or any messaging app for sensitive conversations, just be aware, be smart about what you put on anything, and that's
(06:02):
including iMessages, that's including texts. It's all attainable, it's all something that can be found. So just understand the security features and limitations of whatever platform or messaging option you're using, and consider maybe some alternative methods for discussing highly confidential information.
You know, if we're talking about the normal consumer here,
(06:26):
we're talking about family stuff, we're talking about real deep stuff that you probably don't want out in the open or ever revealed, things that are very private. I would just make sure you're not putting that stuff out there if you don't want anybody to see it, if you don't want something like that to be surfaced.
You know, this is just kind of like a high level for those
(06:46):
three topics.
Pretty big things.
You're going to see some big changes with things like this.
This is also good, because you have to think about it: when events like this happen, it allows for stricter regulations to be proposed, and hopefully, you know, these kinds of things won't happen in the future. But that's kind of inevitable when it comes to this kind of
(07:09):
stuff.
We just have to be mindful, stay ahead of it, be smart and keep paying attention. Privacy and security in today's world, in 2025, three months into the new house and all the stuff that's going on. It's going to be a roller coaster as we continue in this
(07:32):
realm.
So thanks for tuning in.
Once I get with Gabe again, we'll dig a little bit further into these and get into some more in-depth ways that we can look at these situations and how we would want to handle them from a security standpoint, and maybe dig a little deeper. So if you have questions, if you're listening and you know a
(07:54):
lot about this kind of stuff and you want to come on the show or anything like that, or you know somebody that would want to be a part of digging deeper into these, we'd love to have you on.
So thanks for listening andkeep on.
We'll have some more updates and some cool stuff coming.
So thank you again for listening and we'll see you guys
next week.
Cameron Ivey over and out.