Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 2 (00:14):
We are back.
Ladies and gentlemen, welcome back to Privacy Please.
Cameron Ivey here with Gabe Gumbs.
Gabe, how you doing, man?
How was your Thanksgiving?
Speaker 1 (00:23):
It was good.
It was good.
I'm doing well.
How are you doing? Good.
How was your turkey day?
Speaker 2 (00:29):
Life is good.
I had some turkey.
I had a fried turkey.
That was delicious.
Boy, nobody blew up, I hope?
Nah, nobody blew up.
I'm watching those videos of people blowing up turkeys on Thanksgiving.
I saw one where someone was like lowering it with a string from another room.
That thing just exploded.
I think you're just not supposed to do it when it's
(00:49):
frozen. Like, come on. You're a hundred percent
Speaker 1 (00:51):
not supposed to put water and hot oil together, just generally speaking. Those two things... no point.
Speaker 2 (00:57):
Combustion, yes, I
mean, I'm pretty sure, one
episode or another.
Speaker 1 (01:02):
I'm sure that Bill Nye the Science Guy taught you that. Look, if he hasn't covered it once, he's probably covered it three times.
Man, just don't do it, don't do it.
Speaker 2 (01:11):
We need to bring
somebody like him back.
He was golden back when I was a kid.
Speaker 1 (01:15):
He was.
I don't know what his rep's like in the streets anymore, is the problem.
I think he's ruffled some feathers a couple of ways. Has he? I think so.
I don't pay too close attention to that kind of stuff, but either way, shout out to Bill Nye from our youth. Yeah, he's awesome.
Yeah, Bill Nye, Bill Nye the Science Guy.
But today we got Cameron Ivey and Gabe Gumbs, the privacy guys.
(01:37):
We're pulling on into the year.
It's the last month of the year.
Election's behind us.
We got a new administration coming in next year, but we also got a whole bunch of new privacy laws coming in next year too, don't we?
We do.
Speaker 2 (01:51):
I think there's a
total of eight for 2025 coming
in.
Speaker 1 (01:56):
Eight new privacy laws.
Still no, I presume still no federal laws.
So these are eight state laws.
What are they?
Break this down for us.
Speaker 2 (02:05):
Yeah, eight state
laws.
I mean this is definitely going to increase compliance requirements for businesses, especially ones that are offering consumer control over personal data.
On January 1st of 2025, we have new privacy laws for Delaware, Nebraska, New Hampshire and Iowa.
(02:26):
Sounds like all the swing states.
Yeah, they be swinging.
They be swinging.
They got a big swing.
Iowa's got the biggest swing, yeah, yeah.
So lots of things going on.
We don't have to get into the details and some of the resources that we're pulling this from.
I'll give a shout out to Transcend, Morgan Sullivan over
(02:49):
there putting together a great blog.
Speaker 1 (02:50):
I'll share it in the
show notes.
Yeah, get Morgan tagged in there.
Speaker 2 (02:55):
Yeah, definitely. Lots of things going on.
Let me see what else. In mid-year we have Tennessee, Minnesota, Maryland that are going to take effect.
There's a lot of little things that we can go into.
Those are really all the states, but we're talking about what's really nice that I pulled out from some of them, like New Hampshire, they're bringing in children's data privacy laws,
(03:18):
which I like to see, especially around 13 and under.
Speaker 1 (03:27):
There's a couple of states that are jumping on that, which is nice.
I see a trend where a lot of them are moving towards more opt-out versus opt-in, so default opt-out looks like it's a trend that these states are picking up on.
Speaker 2 (03:39):
Some are doing both.
I mean, what do you think about that?
Speaker 1 (03:43):
I think it's necessary.
I think the default mode should be you have to choose to opt in, not have to choose to opt out of it, right?
That's where I think a lot of the kind of vacuuming up of all of that data ends up: when you default to people having to choose to opt out versus them having to choose to opt in.
(04:05):
You know, you're just kind of backdooring their privacy.
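(A quick sketch for the show notes: here's what an opt-in default can look like in code. This is a minimal, illustrative Python example; the consent categories and field names are made up for illustration and aren't drawn from any specific statute or product.)

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    """Hypothetical per-consumer consent record with opt-in defaults."""
    consumer_id: str
    # Opt-in default: every processing category starts disabled and stays
    # that way until the consumer affirmatively turns it on.
    targeted_advertising: bool = False
    data_sale: bool = False
    profiling: bool = False
    granted_at: Optional[datetime] = None

    _CATEGORIES = ("targeted_advertising", "data_sale", "profiling")

    def grant(self, category: str) -> None:
        """Record an affirmative opt-in for a single category."""
        if category not in self._CATEGORIES:
            raise ValueError(f"unknown consent category: {category}")
        setattr(self, category, True)
        self.granted_at = datetime.now(timezone.utc)


# A brand-new consumer is opted out of everything by default.
record = ConsentRecord(consumer_id="consumer-123")
assert not record.data_sale           # nothing is on until they say so
record.grant("targeted_advertising")  # an explicit opt-in flips one flag
```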
Speaker 2 (04:08):
Right, yeah.
Like, specifically: on January 15, New Jersey has a new law, and specifically there's a six-month grace period for opt-out signals.
I don't know how common that is.
It's so weird how some of these states... there's little things like that that some of them kind of add in, or they have differences in that sense.
There's a lot of them that are similar, but
(04:28):
there's always some kind of... there's similarities, but some have their own nuances like that that are added in. You know, some have more strict controls around sensitive data.
It's a lot of stuff around targeted advertising, especially restrictions on minors, which I'd love to see.
New Hampshire has a data broker registration and biometric
(04:51):
data protection.
That's pretty cool, that's interesting.
Yeah.
Speaker 1 (04:55):
Yeah, make them
register, for sure.
Speaker 2 (04:59):
Yeah, there's some pretty neat stuff in here, which, you know.
It's interesting to see what this is all going to trickle into for the next...
You know there's obviously ones coming out in 2026 already, I think.
From my understanding, I think there's already Indiana, Kentucky and Rhode Island that
(05:20):
are going to be going into effect in 2026, that are already on the...
Speaker 1 (05:24):
You know what I find interesting about this also, although I am still a bit dismayed that we don't have privacy laws that are more defined, such as these, at the federal level.
If you operate across all 50 states, which many companies do, many companies do, especially if you're transacting digitally or you're trying to reach customers in other states, which pretty
(05:45):
much everyone does these days if you offer some kind of service that isn't physically only available within a geographic location, right.
If you're a business trying to adhere to 50 different privacy laws, your best option is to take the strictest of them and adhere to that one.
This way you cover everything.
So it's almost like California might still be the de facto
(06:09):
privacy law to follow, unless, of course, some of these other new ones have some provisions that are stricter than California, which might make things a little hairier.
But I think we're almost going to end up with still this de facto standard of: you follow what California does and you'll just get covered for the other 49.
Because otherwise, trying to align your privacy program to 49
(06:31):
different states' rights is just not tenable.
Speaker 2 (06:34):
Yeah, you're right, I agree.
There's a trend that I've seen a lot of, and I don't know if this just goes by... this could be me being ignorant, but maybe it just goes by the size of the state and how many residents are in that state.
But, for example, like California, they have a threshold to applicability around controlling and processing personal data of at least a hundred thousand consumers per year.
That seems to be the trend for a lot of them, like Virginia, Colorado, Connecticut, Utah.
But if you get into smaller states like Montana and Delaware, for instance, Delaware is only 35,000 compared to 100,000.
There's little things like that where there's subtle
(07:17):
differences.
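(Another sketch for the show notes: the volume-threshold idea boils down to a simple lookup. The numbers below are only the figures quoted in this conversation; real statutes add revenue tests and other applicability prongs, so treat this as a toy model, not legal guidance.)

```python
# Illustrative consumer-volume thresholds (consumers whose personal data is
# controlled or processed per year), using only the figures mentioned in
# this episode; actual laws have additional applicability prongs.
CONSUMER_THRESHOLDS = {
    "CA": 100_000,
    "VA": 100_000,
    "CO": 100_000,
    "CT": 100_000,
    "UT": 100_000,
    "DE": 35_000,   # Delaware's much lower bar for a much smaller state
}


def volume_prong_met(state: str, consumers_per_year: int) -> bool:
    """Return True if the consumer-volume prong of a state's law is met."""
    threshold = CONSUMER_THRESHOLDS.get(state)
    if threshold is None:
        return False  # state not modeled in this sketch
    return consumers_per_year >= threshold


# A business touching 50,000 residents' data clears Delaware's bar
# but would not clear a 100,000-consumer threshold.
print(volume_prong_met("DE", 50_000))  # True
print(volume_prong_met("CA", 50_000))  # False
```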
Speaker 1 (07:18):
Hold on, that makes
sense, right?
It makes sense for me.
If it's only going to apply at the state level and your state doesn't have a large populace, setting the threshold to a hundred thousand might effectively mean that no one in the state is protected.
Speaker 2 (07:33):
So lowering it makes
sense.
Speaker 1 (07:37):
But when someone like Delaware does it... Delaware happens to be the state where a lot of companies incorporate.
A lot of businesses are incorporating in Delaware because they are so friendly in the terms for which you can set up C-corps, S-corps, LLCs, et cetera.
I'm almost curious how that affects those folks.
(08:00):
We may need to pull some experts on to kind of dig into that one for us on that level.
But if you're incorporated in Delaware and Delaware's threshold is 35,000, it doesn't matter if you're doing business everywhere else.
You're going to have to adhere to that lowest threshold.
Speaker 2 (08:17):
Yeah, to your point, you just gave me an idea that I should have thought about earlier, but I think I'm going to ask Dave Barmore to come on the show.
I don't know why I haven't done this yet.
Speaker 1 (08:26):
He's a regulatory
expert.
I'm certain he's listening.
Dave, when are you coming on?
Speaker 2 (08:31):
Hopefully I'll try to get him on next week and we can dive further into some of these and he can give us even more insight.
I think that'd be pretty interesting for our listeners.
So let me do that.
Speaker 1 (08:41):
I think, with eight new laws and a new administration coming online in under 45 days, I think we should get into this conversation in a bit more depth, see if we can't help educate our listeners on what to look forward to.
Speaker 2 (08:54):
Well, we can do that, and we can also talk about what's to come in the new year under Trump, what that means for everything.
That's all changing, so I think that could be interesting to learn a little bit more about.
So, yeah, good idea, Cameron.
Thanks, all right.
Speaker 1 (09:07):
Nice work, Cameron, nice work.
Speaker 2 (09:12):
Good job.
Other than that, I mean there's a lot of little smaller details that we can dig into.
But I mean, you know, I think if you want to learn more about it, I'll share a link.
Speaker 1 (09:24):
You've got a link to
a blog, yeah.
Speaker 2 (09:25):
Yeah, I'll share a link, and if you have questions about anything, happy to get the answers for you.
Also, if you guys want to shoot any questions our way, and then we'll try to get Dave on next week if that's possible.
I think that'll be interesting.
Speaker 1 (09:38):
I think that'll be a great idea.
I think that'll be a great idea.
It is almost the end of the year, as I mentioned at the top of the show, which means we'll get Dave on, and the Salty Soothsayer is going to be on pretty soon.
We've got him coming up in a few weeks to get some predictions in for 2025.
But before we get to predictions for 2025, maybe we
(10:02):
just quickly cover some of the top things that happened in 2024.
I think one of the biggest things that happened relatively recently was a bit of espionage across our telecommunications networks. We discovered that a Chinese hacking group that's identified as Salt Typhoon infiltrated at least eight US telecommunication firms and a number of other ones globally.
And, from a privacy perspective and a security perspective, one
(10:23):
of the problems is some of these backdoors in these telecom systems were built in by, well, our governments themselves, but it looks like they may have been breached and accessed, effectively giving these foreign hackers direct access to our communications, which means all the things, you know: email, phone, like any non-encrypted communications. And we can go
(10:47):
all the way back to episode one.
Folks, we highly recommend that you use end-to-end encryption for all of your communications. For your own personal email, I suggest things like ProtonMail; for texting and communications, you can use things like Signal.
You don't even have to exchange phone numbers any longer.
But I think it's clear that this isn't just some emerging
(11:08):
problem, and you know it's not about boogeymen watching everything we do.
But from my perspective, it's a safe assumption that all the things are compromised from a communication standpoint, and if you value at all any privacy, you really should look at this breach of 2024 as probably one of the most public examples of
(11:28):
why what we talk about in this show, week in, week out, isn't fear mongering.
It isn't the what ifs.
This is the what now.
Speaker 2 (11:36):
That was the first one.
That's the... what's that I was reading, the...
Speaker 1 (11:39):
That was the Salt
Typhoon, guys.
Speaker 2 (11:40):
Yeah, Salt Typhoon.
Yeah, I don't even know.
That made me think about the game Rollercoaster Typhoon.
Speaker 1 (11:46):
Rollercoaster Typhoon, yeah.
Speaker 2 (11:49):
And does that show my
age?
Speaker 1 (11:51):
Just a little bit, it might.
That game goes back.
That game goes back.
I mean, you didn't say Oregon Trail.
Speaker 2 (11:58):
So wait, wait, to dig a little bit deeper on that one, Gabe, because talking about breaches and stuff, I know that you, you know, you kind of harp and preach on this, about, I guess, to dig into your world a little bit when it comes to storing data and databases and things like that.
How is this kind of related in that sense when it comes to
(12:20):
unauthorized access to private communications?
Speaker 1 (12:23):
Yeah, it's a good question.
The best answer is we have to focus on not just securing the things while they're in our possession.
Right?
I'll just use an overly simplistic example, right?
Like, simply encrypting data at rest in your environment, like just encrypting a file, isn't good enough.
You've got to encrypt data from end to end, and, unfortunately,
(12:46):
one of the primary communication mechanisms we most all use is email, and almost no one encrypts their email between sender and recipient.
It's just not as common a practice as it should be, which is why, you know (and, zero affiliation, just happen to be big fans of their work), it's why I suggest, you know, folks
(13:08):
use things like ProtonMail and maybe move away from, you know, classic Gmail, et cetera.
Can those services provide encryption?
The short answer is I know they can, but for your average everyday user it's not quite as straightforward.
More importantly, it isn't just there by default, and that's the real problem.
It's not just there by default, and so what we really need to
(13:31):
look at is ensuring that all of our communications, from where we send them to when they get to the other side, are fully protected, because if the actual networks they have to traverse, the telecommunication networks, have been compromised, there's nothing you can do about that.
We don't control any of that infrastructure.
(13:51):
We can't even really choose which of that infrastructure our data is going to traverse quite frequently, and because so many of them also use and share each other's infrastructure, you really don't have much in the way of guarantees that it's only on infrastructure by this telecommunication provider.
It's all the same.
Really, it's all one big melting pot.
So end-to-end encryption is the key.
(14:14):
The days of having to use PGP and GPG on your own as an individual are largely behind us.
I can tell you explicitly, for example, that my mother uses ProtonMail.
If my mother can use ProtonMail, you can all use ProtonMail.
Trust me, that's a good point.
I am not exaggerating, I'm going to bring a...
Mom uses ProtonMail.
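(One more sketch for the show notes: the difference between encrypting a file at rest and encrypting end to end, using the PyNaCl library as an assumed dependency, installed with pip install pynacl. The keys and message are invented; real messengers like Signal also handle key exchange, verification, and forward secrecy, which this toy example skips.)

```python
from nacl.public import PrivateKey, Box

# Each party generates a keypair locally; only public keys are ever shared.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their own private key plus the recipient's
# public key. PyNaCl picks a random nonce and bundles it with the ciphertext.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"call me about the audit tomorrow")

# Everything in between (mail servers, telecom links, backups) only ever
# sees ciphertext, which is the difference from merely encrypting a file
# at rest on one end.

# Only the recipient, holding the matching private key, can decrypt.
receiving_box = Box(recipient_key, sender_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"call me about the audit tomorrow"
```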
Speaker 2 (14:35):
I think I'm going to pull my card and bring on my ethical hacker correspondent from the field, Mr. Gabe Gumbs.
What do you think that Salt Typhoon... obviously, when it comes to a hacking group, they're not just hacking random things for randomness.
There's probably some kind of mission here.
What do you think that they were trying to get out of this
(14:57):
particular breach?
Speaker 1 (14:58):
So hard to say, but I think my intuition tells me that part of it isn't just about what you can get now, but we are on the precipice of quantum computing being able to break current encryption mechanisms.
If you've got access to telecommunication links, you can just vacuum up all of that data, even if it's encrypted right
(15:20):
now, and hold onto it until you can break it.
Speaker 2 (15:23):
Let me ask you this: when you get access to something like that, and let's say, you know, it gets taken care of and they lock you out or whatever, how that works, is there still a connection there, because they've already gotten in to where they can get in again, because they're connected already?
So it's kind of like they've already connected to another time zone or time travel, but they've gone to that place so
(15:46):
they can go back to it.
Does that make sense?
Speaker 1 (15:49):
No, it makes sense.
It makes sense.
I mean, basically you're asking: can they establish a foothold that, even if you root it out, they can simply revert back to it?
Right, how persistent can they make that threat?
Speaker 2 (16:02):
Yes, yeah.
Speaker 1 (16:03):
Got to tell you, at this level of sophistication, it is my assumption that their persistence can last almost indefinitely.
It is very, very, very, very difficult to know that you would have gotten all the things out.
Speaker 2 (16:17):
Sure, yeah, that... yeah, that's scary.
It's also kind of cool to think about.
Speaker 1 (16:21):
It's fascinating, it's a hell of an interesting digital world we live in, and you know, in the last, call it, three years we've watched a lot of evolution through things like AI, and we're going to see a big evolution in encryption again as we get closer to quantum computing.
It's going to change a lot of the conversations we're having around security.
Many things will not be as secure as they were, literally
(16:46):
overnight.
Speaker 2 (16:47):
Man, I love all these
things.
It's so fascinating.
Let's talk about the next one.
It was a Russian cyber attack, yeah, down on the Australian port.
Speaker 1 (16:57):
So this one happened last month, in November.
There was a cyber attack that was attributed to some Russian actors, targeting DP World.
They're a major port operator in Australia.
That attack in particular disrupted some imports and exports of over 30,000 containers.
Right, and really the importance of that is it's just economic disruption.
Right, goods can't move back and forth.
(17:18):
It creates lots of problems for a nation.
A lot of times, I think people forget that sometimes hacking isn't necessarily about getting into the system.
It isn't always necessarily about getting to the data.
Sometimes it's just about disruption.
Sometimes it's just about disrupting operations.
If we look at ransomware as an example, the primary impact
(17:39):
ransomware has is an availability impact.
It's not just that it stole the data and the data got leaked.
Yes, that's an obvious problem, but if we're being honest, data brokers are a bigger freaking privacy problem than ransomware is.
I'm sorry, it just is.
It's just... data brokers pose a much greater risk to society than ransom attackers getting a hold of PII.
But ransom attackers themselves, they're mostly interested in
(18:02):
economic disruption.
They want to take you offline, forcing you to pay, and in this case I don't know whether these Russian actors were state sponsored or not, but there's a lot of reasons why state sponsors might want to disrupt shipping industries in a country.
It is very harmful to the overall economics of those
(18:26):
countries.
So that was a pretty big attack.
Speaker 2 (18:29):
Pretty big attack.
Yeah, I mean, safe to say that was kind of Russian of them to do that.
Speaker 1 (18:35):
Little Russian of them to do that.
They've been busy this year.
Back in June there was another one.
There was another one back in June by some Russian attackers, right, that was the Microsoft email.
Speaker 2 (18:44):
Huge, oh, okay, okay,
I was going to say Australia
too?
No, okay.
Speaker 1 (18:48):
No, no, no, this is a Microsoft one, and so Russian hackers had compromised Microsoft systems.
That's right, yeah, accessing emails of both their staff and their customers.
Huge.
That was the one that prompted... Was this by the same people or no?
We don't have... I certainly haven't seen attribution to the same people, just to the same region. Gotcha. But not necessarily to the same
(19:09):
threat actors.
But it is quite plausible.
But again, if this were nation state, and I'm not saying it is, but it certainly looks like it might have been, a lot of times nation state attackers have different units that are engaged in different activities under one larger umbrella.
This was the breach that prompted a bit more regulatory scrutiny from Congress.
There were congressional hearings on this, and in fact
(19:31):
after that the US government put out quite the scathing note about Microsoft not taking security seriously, to which Microsoft responded and said we're sorry and we're going to start taking it seriously now, we're very sorry that we hadn't before. Which is wild, because Microsoft is the largest security vendor in the world, so
(19:52):
they're super invested in selling security products.
Speaker 2 (19:55):
It's like a backhand slap. Like, oh, you know what?
Speaker 1 (19:58):
We're sorry. This is one of those cases where you should be getting high on your own supply.
Microsoft, you need to take a couple of tokes of your own good stuff.
All right, just maybe get some of that in there.
I enjoy picking on.
Speaker 2 (20:10):
Microsoft.
This reminds me, by the way, and I don't know if this is way off topic, but do you think that we're going to start seeing more stuff like what just happened with the shooting of the UnitedHealthcare CEO?
You know what I'm talking about.
Do you think that there's going to be more like... I don't know why, we don't know why that happened, but it is interesting that it's like that.
(20:30):
Civilians targeting industry leaders because they are not pleased.
Speaker 1 (20:36):
Look, that's a damn good question.
And let's go back to our friends in the data broker world.
It is not implausible to think that somebody could be so upset and disgruntled about their information having made it into the hands of, say, a jaded ex-lover or whatever the case is, right.
There are some industries that are so disliked generally by the
(21:00):
public that what you're hypothesizing is very much a concern, I think.
I think, if I were to use the parallel in our industry, it would be data brokers.
Right, they are seen by and large as not really adding any value to our world, right, and at the expense of all of us and all of our privacy.
(21:21):
Could there be some person that goes lone wolf and gets mad and targets the CEO of a data broker?
I am no advocate for violence.
Speaker 2 (21:30):
No.
Speaker 1 (21:30):
No advocate for
violence.
But in that scenario you paint, yeah, I could see it happening.
Speaker 2 (21:35):
I could see it
happening.
It's believable, right.
It's very much believable.
Speaker 1 (21:37):
Right, it's, it's very much believable.
Yeah, it's scary.
Yeah, Mr. Robot style, right? Mr. Robot style.
Speaker 2 (21:43):
Yeah, gosh, he's a weird looking dude, isn't he?
He's got some weird beady eyes, but he did a good, uh, Freddie Mercury.
He did the best Freddie Mercury.
Speaker 1 (21:50):
That was a hell of a Freddie Mercury.
I'll even give him a shout out too.
Speaker 2 (21:53):
I don't know if anybody saw this, but there was a detective movie that, um, he was FBI.
I saw that one. Did you see it?
Speaker 1 (22:01):
I did, and I don't watch much.
It was good.
It was very good.
Nine out of ten times, if you ask me, Gabe, have you seen that movie, the answer is usually I haven't seen that.
But I don't know, yeah, I've seen that movie, though.
I know exactly who it was.
Speaker 2 (22:12):
I think it's the singer from Thirty Seconds to Mars.
He played the Joker, yeah, yeah, yeah, yeah, I forget his name.
I should know his name.
He plays the... I think he plays the killer, and then I can't remember who he works with.
Is it Denzel?
Speaker 1 (22:26):
It is Denzel, it's Denzel.
Denzel is the cop.
Speaker 2 (22:31):
Yes, who's the beady-eyed guy?
What's his name?
Malachi?
Speaker 1 (22:37):
Yeah, something like that. See...
Speaker 2 (22:39):
I'm not good with the
names.
Speaker 1 (22:41):
I'm not the celeb guy
.
Speaker 2 (22:43):
I couldn't tell you. That's okay, but what was it called?
Speaker 1 (22:46):
It was... See, this is even more questions I'm going to.
Speaker 2 (22:49):
Man, we're going down
a rabbit hole.
We are going down a rabbit hole.
Speaker 1 (22:51):
It's all good.
It's the end of the year.
That's how we wrap it up.
It was a good flick.
Speaker 2 (22:56):
I'm not going to
leave you all hanging.
If you haven't seen it and you like detective movies, it was actually worth a watch.
Speaker 1 (23:00):
It was kind of slow, but I thought it was a decent... If you've got a Sunday afternoon and you're not really getting into much, I'd highly recommend it.
Speaker 2 (23:08):
Yeah, it was a decent little... The Little Things.
Speaker 1 (23:11):
That's it.
That's it.
Rami Malek, Jared Leto.
Yes, Jared Leto's the... Jared
Speaker 2 (23:20):
Leto and Denzel
Washington.
Speaker 1 (23:21):
That was good, man, Denzel.
I'm leaving with something.
It was good, I liked it.
Yeah, it was good.
Well, Cam, my man.
Yeah, it's always good to catch up.
It's always good to have you folks along for the listen.
As promised, we've got a few things coming up before the year is out.
We're going to bring in the Salty Soothsayer for some
(23:44):
predictions for next year.
We'll cover a little bit more of what went down in security and privacy this year.
We'll bring on some of our friends to help close out the year and tell us about some of these privacy laws that are upcoming.
And until next time, friends out in listening land, we appreciate you always tuning in.
Speaker 2 (24:02):
Absolutely.
Thank you, Gabe, thanks everyone, and I've got some good changes coming for 2025 as well.
So I'm excited for all the things that we're going to be working on and doing, so just be aware of that, and thanks for sticking around with us.
All right, guys, till next time.