
November 10, 2025 38 mins

In this episode of Serious Privacy, Ralph O'Brien and Dr. K Royal discuss the weekly news, including the Google settlement in Texas, Clearview AI and much more.

If you have comments or questions, find us on LinkedIn and Instagram @seriousprivacy, and on BlueSky under @seriousprivacy.eu, @europaulb.seriousprivacy.eu, @heartofprivacy.bsky.app and @igrobrien.seriousprivacy.eu, and email podcast@seriousprivacy.eu. Rate and Review us!

From Season 6, our episodes are edited by Fey O'Brien. Our intro and exit music is Channel Intro 24 by Sascha Ende, licensed under CC BY 4.0, with the voiceover by Tim Foley.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_00 (00:54):
You're listening to Serious Privacy, powered by TrustArc. Here are your hosts, Paul Breitbarth, Ralph O'Brien, and Dr. K Royal.

SPEAKER_03 (01:08):
Well, welcome to another week in privacy, and this one's probably going to be a little bit of a quick one, because not only do we like to keep these short to make sure that you're available to walk the dog or do whatever it is while you listen to our podcasts, but also because you catch us packing our bags and we're ready to move on to a busy

(01:29):
week next week. So we should be able to get plenty of content out on what's going on next week. But in the meantime, I'm afraid that our good friend Paul Breitbarth is not with us this week. So it's Kay Royal and myself as we go into a week in privacy. So my name is Ralph O'Brien.

SPEAKER_01 (01:49):
And I'm Kay Royal, and welcome to Serious Privacy. I think Ralph's a bit optimistic in saying this is going to be a shorter than usual episode. I doubt we've ever hit less than 30 minutes, but we do try to hit right at 30 minutes. Let's see. Unexpected question. I'm being judgmental on this one, because I'm trying to

(02:09):
make sure I haven't asked these questions before. Let's go with this one. What do you buy way more of than most people do?

SPEAKER_03 (02:19):
Wow. Now, there's an honest answer and there's a not-honest answer.

SPEAKER_01 (02:24):
Honest answers only. Honest answers only. Although, wait a minute. If you're going to talk about your personal sex life, I don't want to know what you spend money on.

SPEAKER_03 (02:31):
Okay. Well, what do they say in the US? I plead the Fifth? No, no. Well, probably my hobbies. Probably my hobbies, to be fair. There's a saying in the gaming community that I belong to: they call the little plastic miniatures "plastic crack", because it's fairly addictive.

SPEAKER_01 (02:47):
I have a friend that does a Discord channel on painting those.

SPEAKER_03 (02:51):
Yes. Yeah. I've probably seen it.

SPEAKER_01 (02:53):
Yeah, you probably have. Oh, that is hilarious. I'm gonna say that most people here could probably guess what I spend more money on, because I almost got out the ones that I'm customizing to show you. Yes, shoes. I spend way more money on shoes. But to be fair, and I will defend this to my dying day, it's not that I go out and buy Christian Louboutins or whatever

(03:16):
the heck those shoes are. I don't like spending a lot of money for any particular shoe, because, one, I've never noticed that there is any noticeable difference in comfort or lasting, because, you know, with over 200 shoes, not counting boots and sandals and flip-flops, I don't wear most of them more than once a year.

(03:36):
So I don't need them to last for everyday wear for 10 years, right? But I will say there are some that I'm starting to like a little better than the others, some particular designers, but only because of their quirky designs. Like Betsey Johnson. I love her little quirky shoes. I love Charles by Charles David,

(03:57):
something like that. They had a wonderful pair of shoes I loved before. And now a new one I've discovered is Azalea Wang. And I just love the quirky style. Of course, as you can imagine, they do lots of flash and beading and sparkling and weird stuff. So yeah, that's what I like.

SPEAKER_03 (04:13):
I mean, I love shopping for unusual suits and, um... I quite often go down to Camden Town, and I found a shoe supplier there that I believe I sent across to you guys, because they were just unusual. Yes, and I love those.

SPEAKER_01 (04:29):
I love them, love them, love them. Okay, so, week in privacy. Let's start with perhaps some of the biggest news we've heard, that most people have tuned in on, which is the Google fine by Texas. I guess it's not a fine if it's a settlement, right?

SPEAKER_03 (04:44):
Don't mess with Texas, right?

SPEAKER_01 (04:49):
Yes. And the largest one before then was by Arizona, by a single state. I know that a coalition of states came together to, uh, assess Google, and there was a big settlement there. Texas is the biggest settlement, I believe, against Google. Now, I didn't actually go read the full settlement document, which I should have, but one of the points that really gets me

(05:11):
is that Arizona had a specific provision in their settlement where Google had to pay, I think it was $5 million, to an Arizona law school in order to set up privacy and data protection education for the attorneys general's offices as well as judges. And I'm part of the Center for Law, Science and Innovation

(05:32):
there. I teach privacy law there. So we were very excited that I would be able to spin up a fabulous program that we've been looking forward to. And the governor at the time gave the money, I don't know, to some law school out in Virginia. And so the governor after that, or the attorney general after that, went, nope, nope, we're pulling that money back to

(05:53):
Arizona. You're not allowed to do that. And then the new governor took the money. It might have been the old attorney general who allocated it to a law school in Virginia, but the new governor took the money. And it was a bunch of settlements and fines like this, across a wide variety of activities, that went to the Arizona Supreme Court, and they said the governor was allowed to do that

(06:14):
because essentially the money would be used for the purpose for which it was dictated. Not true. Maybe on the one particular big one they were conversing about, but on the privacy education one, that was very specifically to go to an Arizona law school. And the governor did not give the money to an Arizona law school. So to me, the terms of the settlement were broken, but the

(06:34):
only one that was going to fight it would be Google, and they don't give a damn, right?

SPEAKER_03 (06:38):
That's a real shame. I mean, we were talking with Paul last week about one of the settlements over here that was to allocate a certain amount of funds into data protection education and awareness, from the Dutch Lawn Tennis Association. So I'm a huge fan of creative enforcement, and of channelling that sort of penalty into data protection education.

(07:01):
But yeah, it's got to get there, as you say, right? It's got to get there. I think that Attorney General Paxton has got the bit between his teeth. I mean, it's not only 1.375 billion on Google, but that follows up 700 million and 8 million for anti-competitive practices, and 1.4 billion with Meta for biometric data

(07:23):
collection. It's a huge amount of money added up, isn't it?

SPEAKER_01 (07:27):
Oh, yeah, absolutely. I mean, Texas is a big state. I guess they need big money. So, you know, that's one thing to take into account. But I just love seeing them continuing to push forward with these penalties and these fines under state law, even though it's a variety of state laws that they're doing it under. I love seeing their activity with this. But I almost reached out to Ken Paxton, like, can I carry

(07:49):
your briefcase? Let's just stay here, dude. You're quickly becoming one of my heroes. Although, frankly, I don't know how big of a personal hand he's bringing to this. But he is the attorney general, so he has to approve all activities that his staff is undertaking. And I would think something like this would definitely have his personal handiwork on it, wouldn't you?

SPEAKER_03 (08:11):
Yeah, I mean, actually, you know, unusually for someone in Europe... you know, we can turn our noses up at the US occasionally and go, oh, we've got the GDPR, we've got all these data protection laws. But yeah, when an attorney general really goes after somebody... I mean, you've got some powers to settle. I mean, 1.375 billion, 1.4 billion. These are the sort of

(08:31):
fines and penalties that were almost unheard of. These are big. And I saw that Tom Kemp this week launched some sort of model... yeah, the CCPA data subject access request tool on his website.

SPEAKER_01 (08:44):
I was going to tell him I think it's time for him to come back on the podcast. Let's talk a little bit about, you know, what he's doing at the agency and the information that they're pushing through. But another one under the CCPA is the California AG, because they got a $530,000 settlement with Sling TV. I mean, California is Texas, right?

(09:05):
If it ain't Texas, it's California.

SPEAKER_03 (09:08):
Yeah, well, they do seem to be the two biggest names in state data protection as far as the U.S. is concerned. I mean, I know they're the two biggest states, I guess, in terms of income, but I'm surprised we don't see more out of places like New York, actually. But they've struggled to get their law through.

SPEAKER_01 (09:23):
Oh, they're in the news a lot anyway on what they're doing. Their focus ain't privacy. It might be another P word, right? But it ain't privacy. There's a lot going on there. Well, since we're over in the U.S. and talking about New York, there is a voice actors' lawsuit against Lovo, after the court ruled that AI-generated

(09:46):
voice replicas can violate state personality rights. Voice cloning is not just an ethical frontier; now they're saying it's entering the legal realm. But we all knew, when they were arguing about it ethically, that somebody was going to have to find a legal underpinning for them to base it on. So I thought that was really interesting there, because I have

(10:07):
a lot of friends that are personalities in the media. And they were posting that they were being contacted by AI companies saying, can we use your voice, record these 30 minutes' worth of reading for us, and then we'll pay you a few pennies, you know, for doing this? They're like, no, this is my living. This isn't just my voice.

(10:28):
This is my living. And any of you out there who are not voice actors, or not personalities with recognizable voices, it's true for you too. Why would you want to sign up to be an AI voice model when they're really just capitalizing on your voice and not paying you anything for it?

SPEAKER_03 (10:48):
No, I mean, we've seen this happen, especially with deceased actors recently, with, uh, people, you know, in the world of Star Wars and things like that. But, you know, as I heard it, or as I understand it, James Earl Jones, instantly recognizable as the voice of Darth Vader, wasn't paid one cent for the last time the Darth Vader voice was used in Star Wars, because they were able to sort of generate his voice, as I understand it, from previous

(11:12):
samples.

SPEAKER_01 (11:17):
And that's just... that's horrible, right? You know he's from Mississippi, just saying. I have to throw that in. Most people don't think Mississippi has anything good in the world going for it. It absolutely does. Very strong in the arts. Oprah's from there, the King of Rock's from there, the King of Country's from there. I mean, you know, just keep going. But okay, I'm getting off topic.

(11:38):
Squirrel. What do you have going on? Those are some of the bigger ones here in the U.S. I'm sure there are some others to go through, but you've got a couple of notable ones coming out of Europe. I'm liking the phone.

SPEAKER_03 (11:50):
Yeah, we do. Before you do that, I just want to follow up on the AI story, because we actually had a court case today that came through, from a company called Stability AI against Getty Images, in the world of AI. Now, as you know, the Seattle-based Getty Images has accused Stability AI of infringing copyright and

(12:13):
trademark by scraping 12 million images from its website to train Stable Diffusion, the popular image generator. This went all the way to the British High Court, and it's really the first of a wave of lawsuits involving Gen AI, with sort of movie studios, authors and artists, you know, exactly as you were saying, over using their work to train AI chatbots.

(12:35):
And what really struck me about this case is it didn't go the way I would expect, you know, because both sides kind of claim victory. Yeah, of course.

SPEAKER_01 (12:45):
That's the way they spin it, right?

SPEAKER_03 (12:48):
Yeah, I mean, Getty won the argument that Stability had infringed its trademark, but lost the rest of the case, because the courts, when you read into the judgment, started talking about what was actually in the AI model and stored in the AI model. And they ruled that Stable Diffusion's AI doesn't infringe

(13:10):
copyright, because it doesn't store or reproduce the copyrighted works inside the model. It kind of scans them, uses a mathematical representation of them, and then uses those mathematical representations to produce new materials. But what was really interesting about the court case is it said,

(13:32):
well, they haven't stolen it. Because they're only using these mathematical models, which are their own creation, they haven't sucked the actual data into their databases or into their platform. It's just a mathematical representation.
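
To make that concrete, here is a minimal Python sketch of the distinction the court described: training distils a pile of images into a small set of learned weights, and generation mixes those weights into new material. This is an illustration only, with invented arrays and sizes, and a toy SVD decomposition standing in for Stable Diffusion's actual diffusion architecture; the point is just that the stored artefact is far smaller than the data and the originals cannot be read back out of it.

```python
# A toy "train then generate" pipeline: the stored artefact is a small
# weight matrix derived from the data, not the images themselves.
import numpy as np

rng = np.random.default_rng(0)
images = rng.random((1000, 64 * 64))  # stand-in for scraped training images

# "Training": keep only a few principal directions of the dataset.
mean = images.mean(axis=0)
_, _, components = np.linalg.svd(images - mean, full_matrices=False)
weights = components[:8]  # the model ships these weights, not the images

print(f"values in the training data: {images.size:,}")               # 4,096,000
print(f"values stored in the model:  {weights.size + mean.size:,}")  # 36,864

# "Generation": blend the learned directions into new material. The
# original images are not recoverable from the weights alone.
new_image = mean + rng.normal(size=8) @ weights
```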

SPEAKER_01 (13:48):
So... okay, that's just wrong. I'm sorry, it may be right, but it's wrong.

SPEAKER_03 (13:55):
That's kind of where I am. And you know, it's a really... it's a real sort of difficult one to kind of talk about, because, yeah, I think that a mathematical representation of the data is the data, right? You know? Yeah. But that's not how the court saw it. That's not how the court saw it. So it'll be interesting to see if that's then used as a precedent

(14:15):
for all of these other court cases going forwards. And whether... I mean, the decision sort of leaves the UK without a meaningful verdict on the lawfulness of AI models. So again, it's kind of one of these weird, wishy-washy court cases that doesn't really give us a direction to go.

SPEAKER_01 (14:34):
Damn those judges, what is wrong with them?

SPEAKER_03 (14:37):
Yeah, why can't they just say it's illegal copyright theft and have them... Right.

SPEAKER_01 (14:41):
I mean, why can't we get some clear guidance on something that is absolutely clear, and is not so limited by very specific details that you can't extrapolate it to any other decision you need to make as a company, to know if what you're doing is legal or not?

SPEAKER_03 (14:55):
Or is that just sour grapes on my part, because it didn't go the way I wanted it to? I don't know.

SPEAKER_01 (14:59):
Well, it could be that too, but still.

SPEAKER_03 (15:01):
The bigger news, however, is we've got an opinion from the European Data Protection Board today on the draft adequacy agreement for Brazil. So you remember a while back that Brazil, uh, or sorry, the European Commission brought out an adequacy draft to say that

(15:23):
they wanted Brazil to be adequate, and looked at the law and said they kind of think it is. Well, we've now got the European Data Protection Board's sort of analysis and opinion on it. Broadly, they agree. Broadly, they agree. Broadly, yeah. I mean, broadly they say that it's adequate, but they do have

(15:46):
some sort of questions, I guess, mainly around international transfers and public bodies and redress and sanctions. So again, I think they're saying that on paper the law looks good, but obviously you still need an enforcement body that actually is taking actions to enforce the law.

(16:12):
And to do with that, and to do with international transfers, and especially the potential for onward transfers from Brazil, they have some questions still. I mean, it's very politely worded, but it essentially says, you know, these are areas that require a closer look, and we're not quite happy with them. So whilst they broadly agree, I think they're saying that there

(16:36):
are still some concerns around onward transfers, international transfers in general, and of course the ability of Brazil to carry out enforcement action. So I don't think that's any surprise, but I don't think the European Data Protection Board's opinion will in any way stop the adequacy agreement going

(16:56):
through, is my understanding. Right. That being said, we are recording this in November, and it's actually worth people remembering that the UK's own adequacy expires in December as well. So we should expect to see some news on that pretty shortly as well.

SPEAKER_01 (17:11):
Yeah.

SPEAKER_03 (17:12):
Other big news from Europe: you remember, a couple of weeks ago, the Latombe judgment, which basically was sort of touted as Schrems III, trying to get the EU-US data transfer arrangement ruled illegal. Well, Latombe lost, as we know, but he's going to seek to

(17:34):
overturn the court's assessment. So there's basically an appeal there. So that's got to be interesting. That's not dead yet. That's going to go up another level.

SPEAKER_01 (17:46):
Never will be.
Never will be.
Always.

SPEAKER_03 (17:54):
I mean, if it was operating okay, I probably wouldn't have an issue. But, you know, given some of the issues with the Civil Liberties Oversight Board and its makeup, and what Trump has done since coming to power with the people who are on the board, you know, there are some questions about whether it's even functioning, I guess, at the moment.

SPEAKER_01 (18:13):
Well, and it's not whether or not it's even functioning, it's whether or not it's even fixable. I mean, with the current administration, I don't see them trying to make the thingy legit. If Europe points out several issues, saying you need to fix this, I don't really see the current administration prioritizing

(18:34):
that. So I mean, the only thing they could do would be kill it, and then hope that they can fix it in three years. And that's not a way out.

SPEAKER_03 (18:44):
They're too worried about ballrooms and bathrooms,
right?

SPEAKER_01 (18:47):
Right. Exactly. I'm not getting into that. I'm talking about the administration in general. But yeah, and everybody knows I was a little disappointed that we didn't see a lot more data privacy and protection movement under Kamala. But we didn't. I mean, let's be honest. We thought it was going to come in and be a rock and roll show. You look back on it, there were some significant things done,

(19:08):
absolutely. But I would be shocked if we saw significant movement in the direction we would like to see under the current administration. So yeah, I'm with you on that one. So, I mean, in other words, why bother even fixing it? I mean, if you kill it, there's no putting something additional in place that's going to bridge the gap this time. Under all the others, there was always the possibility that we would

(19:32):
work on it and put in some, you know, revisions that would make everything a little bit more palatable. And yes, I'm hedging myself with lots of adjectives here, but there was always that hope. Now, if Schrems III happens and it kills the thingy, you've got nowhere to go, nowhere to run to, nowhere to hide.

SPEAKER_03 (19:54):
Yeah, and I think that's, you know, one of the reasons why Latombe was sort of unsuccessful. You know, they have tried their best, but what's the alternative? And, you know, maybe there isn't one. Maybe all data transfer ceases between the EU and the US, because the laws are just too different. I don't see that happening, by the way.

SPEAKER_01 (20:13):
I mean, that's a shutdown of pretty much global commerce, right? But on the other hand, a lot fewer companies signed onto the thingy than had historically been under the Safe Harbor or the... what was the second one? Privacy Shield, yeah, that's right. Yeah, the Safe Harbor and the Shield, and incrementally you're

(20:34):
seeing fewer and fewer companies signing up for those.

SPEAKER_03 (20:37):
Yeah, yeah, I think it's fair to say that everybody is now going back to SCCs and TIAs and things like that as well.

SPEAKER_01 (20:43):
Which... the new SCCs aren't as horrendous as the old SCCs were. So there's that.

SPEAKER_03 (20:51):
Yeah, there is that. But I mean, you know, considering the concerns with the US, I think, you know, if I was doing a transfer impact assessment on transfers to the US, given all of the media coverage and the concerns that there have been, you know, you'd have to fudge your TIA to make it in any way look

(21:15):
like you can transfer the data.

SPEAKER_01 (21:18):
Well, but here's the thing. If you're transferring data to the US and you're doing a TIA, it depends on what kind of data you're working with. Just put some standard contractual clauses in place and stop that nonsense.

SPEAKER_03 (21:30):
Well, I've always had a problem with the idea that two lawyers sign a document and then walk away whistling. What actual practical protections does it give to the data subject? I've always had a problem with that.

SPEAKER_01 (21:41):
Well, but the thingy doesn't either. Let's just be honest.

SPEAKER_03 (21:45):
Yeah. I mean, paper-based... Yeah.

SPEAKER_01 (21:48):
Interesting. Very interesting. You know, this could be a whole episode all on its own, as to what actually needs to be in place. And I know what Paul's gonna say: rights guaranteed to people who are not US citizens.

SPEAKER_03 (22:01):
Yeah, I'm doing something similar, actually, in that I'm doing a bit of a roadshow next year. One of the places I'm going is the UAE, doing sort of a couple of days' privacy program management course, where I'm sitting down with senior managers and I'm saying, right, you know, you've got a choice. You know, you've got tick-box, cover-your-bum,

(22:22):
paperwork-based compliance that will cost you money and not do anything, you know, what I'm gonna call the paperwork route. Or, what I'm gonna call the more technical route, which is where we actually get in place proper data protection by design, embedded in apps and applications and software, that allows individuals to claim their rights because they're a

(22:44):
global customer, let alone if they're an EU or a US one. Put simply, who do you want to be and how do you want to operate? You know? Yeah. So I think that little roadshow is going to be fun. Going back to more news from Europe, talking of Schrems: our good friend Max Schrems. I mean, Latombe aside, you know, noyb has filed a criminal complaint against Clearview AI and its

(23:07):
managers. So this isn't to a regulator. This isn't like... you know, we've already seen Clearview fined by the UK, by the Dutch, the Italians, the Greek, the French. Clearview, as we know, aren't fond of even accepting that they're under jurisdiction, so most of them they don't even reply to. Right. But you know, put all together, that's about 100 million euros

(23:30):
on Clearview. And all of them have said that Clearview AI has acted illegally, including bans. And Clearview is just ignoring EU authorities. You know, only in the UK did they even appeal the decision.

SPEAKER_01 (23:43):
A note to Ken Paxton.

SPEAKER_03 (23:46):
Yeah, exactly. But you know, what this really hinges on is the fact that EU data protection authorities haven't really come up with an effective way of enforcing territorial extent. You know, they haven't really come up with a way of saying: hey, if you're covered by Article 3(2)(b), which is, you know, you're outside, but you're targeting products and services

(24:08):
or monitoring behaviour inside, you know, you fall under the law, you should follow the GDPR, and then we can, you know, hit you with a fine, and you should have an EU representative set up within the jurisdiction. Right. Actually, Clearview AI is just saying, we're not gonna do that. You know, we're not gonna have an EU

(24:31):
representative, we don't believe we're covered by EU law, and if you send us letters from the EU saying you should follow our laws, we're just gonna laugh at you, because we're sitting in America covered by American law, and how dare you try and, you know, hit us with fines and penalties when we're not in your country, we're not a legal entity established in your

(24:53):
country. So this is a different view. So Max Schrems is basically saying, you know, you can run a cross-border criminal procedure for a stolen bike. So they're hoping that the public prosecutor takes action when the personal data of billions of people is stolen, as has been confirmed by multiple authorities. So instead of looking for a regulatory penalty from a supervisory

(25:16):
authority, they're actually looking at a criminal complaint with public prosecutors in Austria. So they're trying to get Clearview AI and its executives to face jail time and be personally liable if they ever travel to Europe at all. So that would be under section 63 of the national Austrian Data

(25:36):
Protection Act, which allows criminal sanctions. So this isn't a sort of an EU GDPR thing. This is actually a national data protection act thing, that does allow criminal sanctions for the unauthorised obtaining of personal data. We've got something very similar in the UK as well. So even though the GDPR, you know, doesn't allow

(25:58):
personal liability, in our own local data protection act we've got criminal offences such as the unlawful obtaining of personal data without the consent of the controller. So the sorts of people who normally get caught by that are your call centre worker who unlawfully accesses someone's record and, you know, looks up their family, or your police

(26:21):
officer who looks up the daughter's boyfriend, or your car salesman who takes the database to their next employer. You know, and these people who are done for unlawful obtaining or unlawful selling of personal data can actually be held criminally liable by the regulator. And you know, the ICO has taken a number of criminal actions

(26:42):
against such individuals. So this is a very similar provision in Austria, under section 63 of its national Data Protection Act, for the unlawful obtaining of personal data. And noyb is going to use it to try and go after Clearview AI. Instead of a fine or a penalty on the actual organization,

(27:05):
this is deliberately against its managers themselves. So... wow.

SPEAKER_01 (27:11):
That's a bold move.
I like it.

SPEAKER_03 (27:14):
Yeah, I mean, you know... yeah, I quite like his quote, you know: if we can do cross-border collaboration on stolen bikes, why can't we do it on stolen personal data? Yeah.

SPEAKER_01 (27:27):
Absolutely. Absolutely. So there's a few other things. I think we're coming to a close once we add in Ralph's thing, but there's a couple of things on the October roundup. And by the way, y'all know we get these from, you know, newsletters from law firms, online stories, LinkedIn people's postings, TrustArc, the Privacy Pulse, their monthly roundup,

(27:48):
their weekly posts, as well as IAPP et al. So if you want to know where we get these from, there are our sources right there. You are our source. We pay attention to what you post too. So anyway, we had a few more things happen in October that I don't think we've actually mentioned. So we're not going to dive into any great detail here. But Bangladesh had a Personal Data Protection Ordinance 2025;

(28:11):
Gambia, a Personal Data Protection and Privacy Bill 2025, which is awaiting presidential approval, and I don't think I've heard that he's signed it yet; Algeria, amendments to its current data protection act; and Vietnam has completed the public consultation phase on its draft decree to enforce its personal data protection laws. So there's movement there.

(28:32):
Y'all may have heard, and I think we may have mentioned, that the Colorado Privacy Act was actually amended to protect minors' data online. So that one's old news. But here are some movements in California that we've all seen, and I think these all passed: the Digital Age Assurance Act; the privacy for health data, location and research; the California Opt Me

(28:53):
Out Act; the account cancellation on social media platforms; the amendments regarding data brokers, data collection and deletion; the health and care facilities information sharing; and the data breach customer notification one. And the other thing I heard is that Tom Kemp is rolling out the phrase CalPrivacy, rather than CCPA or CPPA or any other

(29:17):
acronym. He says, we are now going to be known as CalPrivacy. Well, you know, someone's going to shorten that to CalPriv or CalPri or something.

SPEAKER_03 (29:26):
CalPri, yeah.

SPEAKER_01 (29:27):
Yeah, it's gonna be something. But they did, let's see, they got a Tractor Supply Company penalty. I think we talked about that. And then AI developments. So I think we've mentioned a lot of these a little bit in passing, but maybe not in detail. Vietnam released a draft law on artificial intelligence. Namibia is currently drafting its AI law.

(29:50):
Tanzania is preparing its national AI guidelines. The United States: we've had, you know, California with the frontier AI and the companion chatbot interactions. That said... I hate the chatbot litigation with the wiretapping. Oh yeah.

SPEAKER_03 (30:08):
Well, we're seeing a lot of older wiretapping laws being applied to AI, aren't we? And to be honest, it freaks me out. Wherever I go on the internet at the moment, I'm being offered some sort of AI companion or other.

SPEAKER_01 (30:19):
And you probably can't opt out of anything on your phone when you open up a website and it tells you, we do so-and-so. The only thing you can do is close it, or accept it, or leave. There's no closing, there's no modifying permissions. I mean, all of that just smacks of dark patterns. There are consultations. I think the EU is closing a consultation by the time we post

(30:41):
this episode. So the consultation period on the high-risk AI system providers will close November 7th. So if you didn't get it in, too late now. Other EU news: ChatGPT may be the first AI model regulated under the EU Digital Services Act. We talked about OpenAI there. And let's see, what else do we have?

(31:03):
There were some other AI and some child privacy acts passed over here on the East Coast that we did talk about, but I can't think of anything else large that is looming, unless something breaks today and I just have no idea it was coming.

SPEAKER_03 (31:20):
The only ones that I've spotted that I didn't hear on your list were Chile, which has an AI law coming in, and they're bringing it into their constitution as well, which is interesting.

(31:41):
And I spotted that Sri Lanka has passed new data protection amendment laws as well.

SPEAKER_01 (31:46):
Oh, we love that, don't we? We love anything that's Sri Lanka. I named a cat Sri Lanka before. I have a habit of naming animals after where I found them. I had a cat named Waffles and a dog named 8th Street. I named a cat Sri Lanka, but I've never been there.

SPEAKER_03 (32:00):
Oh, I love Sri...

SPEAKER_01 (32:01):
That means we need to go to Sri Lanka.

SPEAKER_03 (32:04):
Sri Lanka was amazing. And I suppose the last thing we then need to talk about is your travels, Kay.

SPEAKER_01 (32:10):
Yeah. Next week I will be in London. And there's a lot happening in London at the time. There's the, what is it, the compliance risk symposium. I'm not going, but I've got friends that are going to be there. So Martin is going to be there. We're going to meet him for lunch. We have the PICCASO Awards. I'm so excited. On Tuesday night, that is a black tie affair.

(32:31):
We've got custom bow ties. I'm hoping I get the custom ties.

SPEAKER_03 (32:35):
You're not wearing black tie. We're wearing Serious Privacy bow ties.

SPEAKER_01 (32:38):
That's right. Serious Privacy ties. Look at me, saying "privacy" like a true Brit. And I'm customizing a pair of shoes. I'm not going all out with bling. I'm just doing some glitter and stuff. But yeah, anyway, I'm customizing. I'll probably be a little overdressed, because when I think black tie, I think pageant dress on stage, right? I mean, that's what I've got. But I'll go with the podcast colors.

(33:01):
Whether we win or not, we're going to be a good-looking group. I'm excited about that. And then Privacy Space is happening on Thursday, up in Leamington Spa. I'm very excited to be there. The podcast is going to be speaking, as well as I'll be speaking on the conflicts of being the AI governance officer as well as the DPO, and how do those line up together.

(33:23):
It's going to be fascinating, utterly fascinating. Got to figure out what I'm wearing to that.

SPEAKER_03 (33:29):
I look forward to hosting you. I look forward to having you here in the country, and I look forward to buying you a drink and introducing you to a few professionals who will no doubt worship the ground you walk on, from all of that.

SPEAKER_01 (33:42):
Until they meet me and they're like, oh my God, this is really her. But my husband's coming with me. He's trying to get us to, you know, mark down an agenda. I'm like, we're just there. They're gonna have Christmas stuff up. We're gonna go see Christmas lights. We're gonna go to Hyde Park. We're gonna go back to Buckingham Palace, which I'm assuming is gonna be decorated for Christmas, and Kensington

(34:03):
Palace. I don't know if I'm gonna make my way to Windsor or Oxford, but we're looking at, you know, what are we doing?

SPEAKER_03 (34:11):
So we're actually recording this... we don't often give out dates of recording, but we're actually recording this on November the 5th, which is Guy Fawkes Night. Um, which is: remember, remember, the 5th of November, gunpowder treason and plot, for there is no reason the 5th of November should ever be forgot. So we are recording this, you know, a good two months out from

(34:34):
Christmas, the holiday season itself. And I went shopping today and I heard my first piece of Christmas music in the shops.

SPEAKER_01 (34:42):
Nice, very nice. So I am bringing... I'm probably... I'm saying I am, but I think I am... bringing some Halloween candy over, because Paul loves the little Halloween packets of candy, because they're small and they're not U.S. size. So I'm bringing some Halloween candy, but it won't be a whole suitcase worth like it was last time. I do believe that we might still have a seat or two open at our

(35:05):
Privacy Space table that we're looking to fill. I'm sure Ralph is on top of that, so we'll have a full table there. I think there's 10 people. So it'll be a full table, and everybody is under orders to show up in pink and blue.

SPEAKER_03 (35:18):
Pink and blue.

SPEAKER_01 (35:19):
Pink and blue. That's what we're doing. But I'm excited for that. So we'll have a wonderful episode that comes out next week... or, it doesn't come out next week, it'll be recorded next week, to come out and talk all about, you know, when the American privacy officer went to Britain.

SPEAKER_03 (35:35):
And then the week after that is IAPP Brussels. And I think myself and possibly Paul are going to be there, really at the fringe events. So come and find us in the BrewDog across the road, or at TrustArc's party in the Hard Rock Cafe on the Grand Place in Brussels.

(35:56):
That'd be fun.

SPEAKER_01 (35:57):
I couldn't justify staying for two weeks, though.
I tried.

SPEAKER_03 (36:01):
Yeah. So if you want tickets to the TrustArc party in Brussels, come and seek me out on the socials, as usual. And we're in for a packed couple of weeks, and we'll be bringing it all to you on the Serious Privacy Podcast.

SPEAKER_01 (36:15):
Yes, and I'll bring stickers. Free stickers. I'll bring my favorites, which is the "why so serious privacy" one, with the Joker on it, that Paul absolutely hates. So I found an unopened pack of them from our first year. They're very large. They're like three inches. I'm bringing them with me. These will never be printed again.

(36:37):
Paul would stomp on my toes if I ever tried to get these again. So these are heirlooms, I'm telling you.

SPEAKER_03 (36:43):
And as ever, my concept for a short episode failed utterly. That was Serious Privacy.

SPEAKER_02 (36:50):
This wraps up our podcast for this week. Please do share the episode with your friends and colleagues, because we love to get more listeners. And join the conversation on LinkedIn or on Bluesky. You'll find us under Serious Privacy on both platforms. You'll find Kay as Heart of Privacy, Ralph as igrobrien, and myself as EuropaulB.

(37:11):
Until next week, goodbye. Goodbye.

SPEAKER_01 (37:15):
Bye y'all.