July 3, 2025 • 14 mins

Cameron and Gabe dive into Healthline Media's record-breaking $1.55 million settlement for CCPA violations, examining whether such penalties are sufficient deterrents against improper sharing of sensitive health data.

• Healthline violated CCPA by sharing sensitive user health data with advertisers without proper consent
• First U.S. regulatory action against a company for disclosing "inferred sensitive data"
• Violation included failing to provide mechanisms to opt out of sensitive data sharing
• Discussion of whether fines proportional to company revenue would be more effective
• Comparison of data brokers to other harmful entities in society
• Brief preview of upcoming episode about a major data breach potentially larger than Equifax

Stay safe this holiday weekend and don't put fireworks where they don't belong! Tune in next time for our breakdown of a massive data breach of "epic proportions."



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Alrighty then.
Ladies and gentlemen, welcome back to another episode of Privacy, Please.
Cameron Ivey here, hanging out with Mr Gabe Gumbs. Gabe, how you doing?

Speaker 2 (00:08):
I'm doing well, sir.
How are you, Mr Ivey?

Speaker 1 (00:11):
Doing well, had a little storm roll through.
You probably had some effects from that.
When it rains, it pours. It does indeed, and it also lightnings.

Speaker 2 (00:29):
When you live in the lightning capital of the world, that's a thing. That is a real thing.
Is that why they're called the Tampa Bay Lightning?

Speaker 1 (00:33):
I think it might be. It might have a tiny bit to do with it.
Yes, sir, yes, sir. That makes sense. Yeah, yeah, a world champion Tampa Bay Lightning.

Speaker 2 (00:38):
Is that right?

Speaker 1 (00:39):
That's true. It seems to be that even the Florida Panthers... I mean, we've had some, uh, the NHL has been owned by Florida teams, which is funny, yeah.

Speaker 2 (00:49):
I mean, we get a lot of Canucks that visit down this way, but Lord knows you couldn't freeze an ice cube on the coldest days of the year down there.

Speaker 1 (00:56):
No, I bet it makes so many Canadians mad.
But hey, it's the tax stuff.

Speaker 2 (01:04):
I think at the moment they're far angrier about other things.

Speaker 1 (01:07):
That's true yeah, that's, very true.

Speaker 2 (01:11):
We should let that lie there. Shout out to our Canadian brethren north of the border.

Speaker 1 (01:16):
Shout out, Canadians, we still love you and your geese. We love you, pal, it's true.
Canadian bacon.

Speaker 2 (01:24):
I like that. I don't even know, is that American, what you just called Canadian bacon? Probably, like french fries, it's probably on the list.
Oh yeah, sorry about that.
Yeah, freedom.

Speaker 1 (01:37):
there's a couple things that have been going on
in the security and privacyspace that we'll just kind of
touch on First we'll talk about.
We'll just throw it out there.
So one of the biggestsettlements for the CCPA right

(01:58):
now is the Healthline $1.55 million settlement under the CCPA, Gabe.
I don't know if you heard of this.
It's kind of recent, it just came out.

Speaker 2 (02:12):
You may not know.
I've heard of it, but I hadn't had a chance to really dig into it.
Was that a percentage of revenue of some sort, or just a fine based on number of records?
I'm curious because, to be honest, 1.5 doesn't really sound like a deterrent for doing better.

Speaker 1 (02:30):
Yeah, that's a good question, because it says Healthline Media agreed to a record $1.55 million settlement with the CCPA for violating the... let's see what they say.
Specifically, it resolves claims that Healthline shared sensitive user data with advertisers and data brokers without proper
(02:53):
consent and opt-out mechanisms.

Speaker 2 (02:57):
Classic, classic.
I was going to say it's par for the course.
I mean, we know lots of folks continue to still engage in those types of noncompliance behaviors, usually not intentionally.
A lack of guardrails internally tends to be behind this.
More often than not, you know, the average business isn't

(03:17):
intentionally trying to benon-compliant or, for that
matter, even unethical.
Say what you might aboutcapitalism or, for that matter,
even unethical Say what youmight about capitalism.
But yeah, 1.5 still justdoesn't seem like the guardrail
I would want it to be.

Speaker 1 (03:31):
Yeah, I mean, it's a lot of money, but it's really not that much, you're right.
So here are three things that will kind of shed light on some more specifics.
So they're paying this fine due to a couple of reasons.
So, one, Healthline Media: prohibiting the

(03:53):
sale or sharing of personal information linked to specific medical diagnoses; providing notice and the right to limit the use and disclosure of sensitive personal information before sharing it for advertising; and implementing a program to assess the functionality of opt-out mechanisms and ensure third-party contracts meet CCPA

(04:14):
requirements.
Those are the three things that they did not do, that's right, which are pretty big.
I mean, that's... it's pretty big.
Yeah, you know, it's pretty important.

Speaker 2 (04:28):
That's quite a bit. That's quite a bit.
You know, back-of-the-napkin search suggests that Healthline is a wildly profitable business with, you know, revenues in the high double-digit millions and profits that aren't that far off of that.
So, you know, that again goes right back to that.

(04:51):
I'm never really a fan of compliance being the first guardrail for these kinds of challenges, and I'm not certain that imposing a record-breaking or otherwise 1.5 million is really a deterrent to others. Yeah, I'm trying to see if there's

(05:14):
more information on what I mean.

Speaker 1 (05:17):
I wish they would kind of break down why that number, why only that, and why they settled on that.

Speaker 2 (05:23):
Like GDPR, for example, their fines are, if I'm not mistaken, based on a percentage of revenue, right? Right?

Speaker 1 (05:30):
Because, I mean, the last one before this was what, 1.2 million? Which was the Sephora one that we were talking about earlier.
Right, right, right, right. That was back in '22. But what's funny is this was also... oh wait, okay, yeah, they were mentioning it.
So some of the... I'm trying to see if there's any more, like,

(05:54):
specific details.
So this is the first US regulatory privacy enforcement action where a company has been fined for disclosing inferred sensitive data.

Speaker 2 (06:05):
What's inferred?
So not direct, but it means that they may have been able to de-anonymize individuals based on it.
I mean, inference and de-anonymization are kissing cousins, so I'm drawing a straight line there.
But it means that they were able to infer who Cameron Ivey was without direct reference to who Cam Ivey was.

Speaker 1 (06:27):
Right.

Speaker 2 (06:28):
That's extra naughty, yeah.

Speaker 1 (06:30):
Inferred based on articles read is what...

Speaker 2 (06:34):
Interesting.

Speaker 1 (06:36):
I don't know. This is interesting.
So, and obviously, you know, we're dealing with health-related data.
I don't know the company... I've never really, honestly, heard of them before.

Speaker 2 (06:49):
There are about a billion healthcare companies none of us have heard of, and they're all making ungodly amounts of money.

Speaker 1 (06:56):
Oh, what? We can sell all of this personal health information for a lot of money? And yeah, that's it.

Speaker 2 (07:02):
Payout claims.

Speaker 1 (07:03):
Yeah, all right.

Speaker 2 (07:05):
That sounds like a good idea.
Let's do it, let's go.

Speaker 1 (07:07):
What do we do?
My other question is, are the people that made that decision still there, or are they gone already?

Speaker 2 (07:15):
Come on, they're still there and they're not going to be. For what it's worth, again, I'm not even sure I'm inclined to levy blame upon those individuals. Right?
Like I said, it's hard to prove malice, and I don't usually wake up in the morning and assign malice to these types of things.
Just, most people wake up and they just want to do their jobs,

(07:38):
they want to do it well, they want to be compensated fairly, and they want to go home.
That's not everyone, of course, but I don't necessarily subscribe to the, you know, "all of these folks are evil," even when they're dealing with data brokers. Although all data brokers, on the other hand, I might not have the same appreciation for. Yeah, but you know, I also don't have a strong appreciation for

(07:58):
people that, you know, like, sell drugs or whatever.
Yes, I'm equating the two.
Yes, they're both damaging to the community, for freak's sake.

Speaker 1 (08:08):
Hey, you know, I mean you got to put food on the
table somehow.

Speaker 2 (08:11):
Well, yeah, I understand, I understand, I understand.
Even scumbags have to eat, right? Like, yeah. No, that's interesting.

Speaker 1 (08:20):
Let them eat cake.

Speaker 2 (08:21):
So, yeah, I don't know that those individuals should be held personally accountable.
We obviously don't know enough about it, right?
Yeah, I certainly don't know what the future for CCPA is.
If this is the signal we're going to send to businesses to protect our data, I don't know that this signal is the right

(08:44):
signal to send.
It seems, quite in my personal and professional opinion, it might be the exact wrong signal to send, and I'm not suggesting you fine them into oblivion such that they go out of business.
But I don't know, maybe we need more DOJ-style actions where, you know what, now we are going to embed a data privacy officer from the government into your business for the next 18 months to make
(09:07):
government into your businessfor the next 18 months to make
sure that you know what to do.
Well, not to babysit it, but to help you to that point.

Speaker 1 (09:17):
I was just thinking, well, maybe they got this number based on how much information they had sold.
Maybe, you know what I mean?
Like, maybe because it's based off of... isn't it like a percentage per word or per letter or something

Speaker 2 (09:30):
like that.
I don't think it's per word.

Speaker 1 (09:32):
Yeah, um, listeners, or, you know, anybody out there that knows, I mean, shoot us a message, or you can always come on and talk further about it, if you're more knowledgeable about that kind of stuff.
But I think that's... I think that makes sense. It would make sense to, but, you know, they should tack it on to whatever else. I don't know.

(09:53):
I think you're on to something there, because it's like, well, if it's just a slap on the wrist, you know... and how much did they make on some

Speaker 2 (10:02):
data? Anytime the cost of doing business exceeds the cost of the fine substantially, right? That's just, uh, that's just a luxury tax, really, right?

Speaker 1 (10:13):
And the other thing is, like, it's not going to affect them. And like, oh well, these consumers aren't going to trust this company now. It's like, it doesn't matter, they already have your information.
Yeah, that cat's so far out of the box. Yeah, I don't like it, but I'm glad that there's, you know...

Speaker 2 (10:32):
I'm glad CCPA exists in some form, at least now.
You are correct.
Prior to the enactment of that regulation, there'd be zero repercussions and recourse.
There'd just be a little salacious article, and maybe we don't even mention it on the show, and others aren't informed.
Then everyone keeps it moving, right?

Speaker 1 (10:52):
Yeah, yeah, agreed.
Um, I know that there was, like, uh... well, we can touch on this. We'll do a couple other special episodes.
This is a little bit shorter, of course, but we just wanted to kind of dive into that on this episode, but we'll have another one coming out.
Uh, Gabe will talk about...

(11:13):
Um, we'll both talk about it, but there was, like... there was a recent hack. It was one of the largest ones in history.
There were a few.

Speaker 2 (11:24):
Yeah, we're gonna dive into the details and peel back the curtain on one of the larger hacks, uh, of modern times, certainly, uh, maybe ever. Of epic proportions, quite frankly.
Yeah, I've got some information and I've been having some conversations with some folks, both near and far to

(11:44):
the scenario, and, yeah, we'll lay that out for our listeners.

Speaker 1 (11:49):
Yeah, because this is going to surpass Equifax, right?
I think it was actually Google, Apple, I don't know.
We'll get into it on the next episode, but let's just say it was pretty massive.
Indeed. Gabe, always a pleasure. A pleasure indeed.

Speaker 2 (12:18):
Happy 4th of July to everyone out there.
Happy, uh, 249th to the, to the Union.

Speaker 1 (12:22):
Hope everybody has a safe weekend and, uh, we'll see you guys on the next one. Don't put any fireworks in your buttocks or between your fingertips and, uh, yeah... do that. Don't, don't do that.
Don't do that, don't do that, don't do that.
Nobody wants nubs.

Speaker 2 (12:40):
Nobody wants nubs in either location.

Speaker 1 (12:43):
We out. Be safe.