
May 15, 2025 45 mins

EU's complex regulatory environment creates both challenges and opportunities for businesses navigating data privacy, financial services, and healthcare regulations across member states. 

• Significant differences exist between EU-wide regulations and country-specific implementations
• Large companies like Meta and Uber have faced multi-million Euro fines for GDPR violations
• Financial institutions struggle with innovation due to contradictory and slow-moving regulations
• Healthcare organizations often have regulations but lack enforcement, creating security risks
• AI adoption faces resistance similar to the US, though its implementation is transforming industries
• Traditional banks create separate "baby banks" with modern infrastructure to work around regulatory limitations
• Companies often underestimate marketing costs when entering EU markets due to privacy restrictions
• Red teaming employees creates privacy concerns that must be balanced with security needs
• Local legal expertise is essential when entering European markets to avoid costly compliance mistakes
• Every regulatory challenge also presents strategic opportunities for companies who understand the landscape

To learn more about Bruning Media and our services, visit bruning.com.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Larissa.
So this is a unique opportunity for us to get a peek into what's happening in the EU, what's going on with EU regulations, how our industries and customers, especially your customers, because you know them intimately, how are they responding to changes in the regulatory landscape in Europe? So, Larissa, welcome to Cybernomics.

Speaker 2 (00:21):
Thank you.
Thank you, I'm very happy to behere, josh.
Europe is a wonderful place.
Where I'm living, it is like a fairy tale, full of castles, and sometimes Europe likes to have a lot of regulations and not implement them. Also, my clients usually deal around the world, so it's not

(00:48):
Europe-specific. So we interact a lot with US regulations, US customers and US requirements. So from that perspective...

Speaker 1 (00:53):
Yeah, yeah, that's a really weird regulatory thing. You would think that the companies that are under the jurisdiction of a particular country or municipality here in the US [transcript unavailable] and handling Nigerian data, and you

(01:35):
have to get audited and now make sure that you're compliant with this Nigerian law? So is it kind of that way? You're in Luxembourg, but any company that's handling US data would also be under the jurisdiction of the US. Is that the way

Speaker 2 (01:54):
that that works.
Whenever you have customers or clients that are living people, so not companies themselves, you have to be very careful with the data. Meta has a long history of getting fines from the European

(02:14):
Union. They have themselves a wonderful hater that created all that for them, and he was actually an ex-Meta employee, so he was a PhD. Max Schrems, he's the one that I'm talking about, and he has two European Court of Justice decisions named after

(02:34):
him. So he went to work for Meta. He got grossed out by what's happening there and then, when he came back to the European Union, where he is from, he started a long war with them on the grounds of how they handle personal data, and they have been fined hundreds of millions

(02:55):
of euros because of that. So Nigerians are on the right path of saving themselves from the headache. Yes, as a company, you have to be very careful when you handle personal data. Why? Because it can lead to horrible, horrible situations. For example, we had a case recently where a psychiatrist

(03:18):
disposed improperly of the patient information and all those files ended up on the streets of the city. So imagine your treatment, and imagine everything that a psychiatrist knows about you. Now the entire town has access to it, so we have to be very careful, because the consequences can be dire. This is why it's so important.

Speaker 1 (03:41):
Yeah, especially if it's a small town. I'm thinking of the town that I grew up in in Guyana, South America, very, very, very tiny, probably, I don't know, 500 people. They would love it if everybody's medical history was laid out on the streets. They'd have a field day with that. But yeah, we don't want that to happen.

(04:02):
And you gave me a brief history, and I'm going to confess something that I'm really embarrassed to confess today. I didn't know what Luxembourg was and where it was, and my first stab at it was, hey, is this a city in Germany? You're like, nope, it's its own country. A lesson in how some of the European countries came about,

(04:25):
right, in that they were sort of these territories and properties that lords and kings and all these people owned. So that must make, if we're fast-forwarding to today, that must make regulations so much more complex, because you've got so many countries so tightly knit together, right?

(04:47):
So how do you navigate this? Do you just come under the European Union's regulatory governance, or does each country have its own regulations, and how do they interact?

Speaker 2 (05:02):
The answer is: it depends. So in the European Union, there are two types of regulation. There are more types, but these are the two main types of regulation that a country can be subject to. One is you have to take the law as the European Union presented it to you and you have to apply it. But again, you are allowed to have bylaws, so small little

(05:24):
laws that help to apply that big giant law that the European Union sent your way, and that would be the case of DORA. And then you have regulation that the European Union passes and says, okay, please implement this, where you have a lot more freedom, in the sense that you can adapt it to your system, or maybe add some spice, take out a little of it, to make it more

(05:48):
applicable for your scenario, for your situation. These are the two main types, but of course there are a lot more other mechanisms through which the European Union regulates. But then again, the European Union is made out of countries that are extremely different. So, for example, we have countries that are made up of

(06:09):
small lands, like Germany, for example, that can also have their own local regulations that apply over everything. So it's very interesting how a law can come down to the person itself, through so many layers and so many ways of being adopted. This is why the law slightly differs, for example, in Holland

(06:34):
from the law in Romania, where I was born. So there are slight differences, and this is why it's always best to work with a local lawyer and not think, oh, I will have my US lawyer that can cover GDPR, for example, in Europe. It might work, it might not, and the consequences we have seen.

(06:55):
For example, the last one that I remember is Uber. Uber took an 80 million euro fine in Holland because they didn't apply GDPR as the Dutch envisioned it. So, again, using a local lawyer would have saved them from a

(07:15):
very big fine. This is why it's very important usually to work with a local lawyer, and that's the situation also in the US, if I remember correctly, because you have the French regions, like New Orleans. They have a slightly different way of working with the law than the traditional American system that we know from movies and

(07:38):
everything else.

Speaker 1 (07:39):
So you know, that makes total sense, because I had a client in New Orleans and there was so much red tape, and I'm like, what are you guys doing down there? It's just swampland. Let's just get this thing across the finish line already.

Speaker 2 (07:51):
But that makes total sense. It's completely different from the rest of the United States, in the sense that they have applied the law in a more European way, let's call it. So this is why it differs so much, and the recommendation is, as much as possible, work with local lawyers where you can,

(08:12):
where you can afford that, and when we think of Uber, we think they can afford that.

Speaker 1 (08:17):
They can afford it. Yeah, so what would the stack look like for a large organization? In terms of, okay, you need local lawyers, they probably need a GRC team, so they need a compliance person, they need someone who's going to help them do audits, they need a CISO, probably, to help them navigate the security gaps once

(08:38):
they're starting to remediate. So I'm thinking about the little guys, the smaller businesses. What would their stack of resources look like to make sure that they avoid these fines?

Speaker 2 (08:50):
Because Meta and Uber can afford to pay large fines, but smaller businesses might not. It's difficult to get an 80 million euro fine if you're a small shop, so it would be very difficult to get there. I can't imagine what you would need to do. And also, you have to take into consideration the country you

(09:14):
will work in.
For example, in Romania, theGDPR fines.
I haven't seen anything morethan one hundred and fifty
thousand euros, so somethinglike that, and that was for a
very big bank, for a horriblebridge.
So about 5,000 euros.

(09:35):
That's something you can expect. So even if you make a mistake, they take into consideration how many people you serve and if you did your due diligence. So if you can prove that you did your due diligence and you took care of the data and you did your best, but something happened, you will find a lot of understanding on their side. This is my experience with the European Union when it comes to

(09:57):
fines.
Of course, for big companies that sometimes maybe have been a little snotty, they will receive the big fines.

Speaker 1 (10:07):
Yeah, let them pay for it and they can set the example. So as long as we're still seeing big numbers in those fines, I think it's just enough to scare the people who are handling the most data, and so maybe that's why the little guys aren't as burdened. Is there a particular industry that you feel is more

(10:29):
regulated and is more burdened by regulations in Europe?

Speaker 2 (10:35):
There are two types of industries. One that I think is really burdened with regulation, to the point of not functioning properly, and that's the financial industry. I worked in the financial industry for about three years on the cybersecurity side, and because they have so much regulation, we

(11:02):
end up not innovating, which creates a whole other layer of risk over everything. And then we have the medical industry, which in my eyes should be fined way more, because they have the regulation but they are not checked on whether they apply it, and they don't. And then we have a lot of medical data that is being stolen. So this is the paradox that I don't like: we put money above

(11:26):
our personal information and mental health.

Speaker 1 (11:31):
Yeah, I wonder if that's the same in the US. I think that tracks. But in the US, let's take NYDFS, for example: even if you're not in New York, but you're handling any data that's passing through New York, which is the financial space, they'll grill you about that, because that is a problem. I mean, what do you think is the risk of not innovating?

(12:16):
What do you think these regulations are stopping these companies from doing?

Speaker 2 (12:29):
It would be very nice to actually work with people that have worked for 20 years in the financial industry, in a lot of positions, before creating a new regulation, because we have cybersecurity regulation that contradicts financial system regulation, and then we are stuck in the situation where, if I innovate, I

(12:52):
need three years of approvals and testing from a regulator in order to change one piece of the system, and when I get the approvals, that piece of the system will already be obsolete. So the old banks are kind of stuck using what they have, and they can't really compete with a new bank that has a new core that works a lot faster, is a lot easier to maintain,

(13:15):
doesn't have the security costs that an old core will have. And there's also this risk of an old bank creating a very hard shell on the outside. But if you manage to penetrate that, and that can happen with a disgruntled employee, for example, really easily, then, because you can't really do much with the old pieces of

(13:37):
infrastructure and software, you are stuck with some very mushy cybersecurity inside. So once somebody manages to get in, it's game over. Working with people that have been architects in the banking system for a very long time, and when I say very long time, over 20

(13:57):
years, please, thank you very much, could help the regulator understand better the needs and the catastrophic systems that somebody who works every day with that type of information and those types of systems has to deal with, and that are not really protectable, to be frank, right, when it comes to AI.

Speaker 1 (14:23):
So, talking about innovative systems, right, AI is going to change everything overnight. It's already changing everything, but in the next year we're going to see agents, we're going to see AI creating systems, and the systems are creating systems, right, and they're going to change much, much faster than the

(14:48):
regulations are going to be able to keep up with.
So what that tells me is that, in Europe at least, they're on the cusp. You're on the cusp of a nightmare, at least for the bigger banks, because, let's say, the big banks and the big healthcare industries, they want to adopt AI. It's not going to work so well, because two things are going to happen. One, when a small company shows up, they're going to be able to

(15:12):
spin up systems that are so efficient and that are so fast. There's just no way that the big ships are going to be able to turn in time. So what are you seeing? What are the big banks doing to curb that, or at least to future-proof themselves from being so vulnerable to these new

(15:35):
incoming companies?

Speaker 2 (15:38):
It has already started. A lot of traditional banks, because it's so difficult to change, find it easier to create a new baby bank and try to move customers onto the new bank, because it's that difficult to obtain all the approvals. So you will see every old traditional bank spawning baby

(16:00):
banks that are new and that are presented as being an online bank, and that's all. So they have no physical presence anywhere; they are only online. You can create the account online, you can have everything online, you have the support online, everything online. So it already started about three years ago and it's

(16:25):
catching speed, because you can't bank the way you banked 30 years ago. In the days we live in now, it's just not financially responsible to do it that way, because having systems that old costs money. Every transaction costs a lot of money. And then you will have the small people that have small

(16:47):
shops that cannot afford to have a POS, because the bank requires so much money for every transaction that they can't sell. So it has already started and it's going great, and what I'm seeing in the medical industry is not much. So there I expect to see the biggest progress when it comes
(17:10):
to acquiring products that can serve the patients better. I had a procedure done not long ago and the doctor was absolutely phenomenal, and I would go to him a thousand times more. He was perfect, but everything else was horrible.

(17:30):
It was a horrible experience. So, from paying: it was a very big payment and they expected me to make it on that morning. You know, when you go into surgery, you're not your best self, let's put it lightly. Well, yeah. And also, when the payment is really big, the bank might refuse it, although you warned them a week ago that

(17:51):
you will make that payment. You see how nothing was prepared, from my perspective, for the surgery. Again, I had to have a lot of documents that I could have provided a week ago. So making a patient go through all that on the morning of surgery is not what I'd describe as a good experience.

Speaker 1 (18:13):
Well, what's the holdup?
Why do you think it is that way?

Speaker 2 (18:17):
Because they are used to how they worked in the traditional hospital, and not private clinics. And in the traditional hospital you have to do what they tell you, because it's basically free, because in Europe most of us have free healthcare. It's not free, like we pay for it from our paycheck, but it's
(18:38):
kind of free, and nobody complains because you got it for free. So why are you so upset? And they are used to putting you through whatever, because you won't say much. But when you pay for it, you expect a flawless experience, and you know that experience is possible, and it would have been easier even for them, because not dealing with the mess of a

(19:00):
person at 6 am in the morning... it is a lot easier to have a good day when you don't have to deal with 35 people like that at 6 am. Yeah, so is it?

Speaker 1 (19:12):
So is it fair to say that the smaller clinics... Smaller clinics are analogous to the small banks, in terms of how, as an industry, maybe the hospital isn't able to spin up a small clinic overnight, but if you're a small doctor and you've got a small office, could you spin up offices that run much more

(19:37):
efficiently than the big hospitals? The way that you're nodding your head, I'm going to take that as a yes.

Speaker 2 (19:43):
Definitely, definitely. You have to think how much money a hospital pays to everybody else that is not a medical person, let's call them. So they are not a doctor, they are not a nurse, they don't deal with patients directly. They're accountants,

(20:04):
they try to fit you in with your schedule and so on. So all of that is really easy to pass on to something that's automated, and I've seen my issues getting solved a lot easier with robots, let's call them, and chatbots than I've seen with real people, which to me was disturbing, because I always

(20:28):
wanted to believe that people have enough empathy to know that you don't call them or you don't get into their inbox because you're bored; you actually have an issue. So it will be a lot easier, because you don't have to hire, you don't have to train, you just get it, you take it out of the box and, if you're a good doctor, boom, you have a clinic

(21:04):
or a bank or a clinic.

Speaker 1 (21:05):
The regulations will be favorable to you, because you are introducing systems that can comply right out of the gate. Is that a good summary?

Speaker 2 (21:09):
Yes, and I think there are a lot of steps that have been taken to provide automations for this, even for GDPR. We have a lot of software that does this for you, so you don't have to manually send emails, and when you receive a request for deletion, you don't have to manually go through everything.

(21:30):
It is done for you. So it's already started on this front. I appreciate it a lot. I've seen some products, I've consulted on some products. Of course, we're not there yet. So maybe a year or two from now it's going to be amazing. We're not there yet, but it's getting a lot better and a lot
We're not there yet, but it'sgetting a lot better and a lot

(21:51):
easier to exist in this world ofregulations and of craziness
and new regulations every dayfrom everywhere on the globe.
So then you can have it all inone place.
You get an alert, the systemadapts and you can go on with
your life and do what you loveand you're good at and not have
to be so tight about.
Oh my God, am I doing the rightthing with GDPR?

Speaker 1 (22:16):
What does "there" look like? You say you're not there yet. What would "there" look like in a year?

Speaker 2 (22:22):
There would be a lot more security. The problem that I've seen lately with a lot of new automations is lack of security. So the API keys are not properly used, not properly secured. I'm not going to go into a lot of details, because it's pieces

(22:45):
of software that everybody knows, and when I inspected them I didn't feel confident in recommending them for use for my clients or for friends. So the products are not secure enough for me to trust them with my database, or to trust them with anything that is truly important in the company.

(23:06):
It works, the basic principles work, so that's very good. It can adapt, it knows how to pull information and it knows how to validate information, so it won't just smush in information from the internet. It will validate it through a person, through a lawyer or through somebody that's qualified to actually look over

(23:27):
it. But again, we're not there yet with how flawlessly it should work when it's sold to a company. Okay, we'll talk offline.
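The API-key problems alluded to here (keys "not properly used, not properly secured") usually come down to a few basics: keep the key out of the codebase, compare it in constant time, and never write it to logs. A minimal sketch, with the environment variable and function names being illustrative assumptions:

```python
# Minimal API-key hygiene sketch: key loaded from the environment rather
# than hardcoded, constant-time comparison, and only a hash in the logs.
# SERVICE_API_KEY and the function names are illustrative.
import hashlib
import hmac
import os

def get_api_key() -> str:
    """Load the key from the environment so it never lives in source code."""
    key = os.environ.get("SERVICE_API_KEY")
    if not key:
        raise RuntimeError("SERVICE_API_KEY is not set; refusing to start")
    return key

def key_matches(presented: str, expected: str) -> bool:
    """Constant-time comparison to avoid timing side channels."""
    return hmac.compare_digest(presented.encode(), expected.encode())

def loggable_fingerprint(key: str) -> str:
    """Log a short hash of the key instead of the key itself."""
    return hashlib.sha256(key.encode()).hexdigest()[:12]
```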

Speaker 1 (23:40):
I might have some suggestions. There are some really cool companies in the US that are doing this very thing and are really working on this problem. And two words, or is it three words? AI agents. Is that... A is one word and I is another word? Okay, so let's say three words, AI agents. And you know, okay,

(24:07):
that's a really good segue. In the US there's this weird resistance to AI. Everybody wants to get on board with AI, but I think eliminating the human presence is very scary to people for a number of reasons. One, employees are afraid they're going to lose their jobs, which is going to happen anyway. Two, companies, vendors, they don't want to scare away their clients by making the clients kind of put their trust in the

(24:28):
Terminator.
Right. People are very wary of AI, just in general, because they've seen too many Terminator movies. So what is the general temperature of the AI climate in Europe? Are people embracing AI, whether in security or in other IT systems, or do you see that resistance as well?

Speaker 2 (24:53):
Well, in my eyes, people are just as scared, but I think we have formulated the premise in a wrong way. For example, I can't wait for it to be embraced even more in the cybersecurity world and the regulations world

(25:13):
, but we have presented it as something that will replace, and we are not there yet for a lot of positions. I think it's more important for colleagues and parents and everybody in our lives to learn how to work with it, not necessarily expect it to do things.

(25:34):
Because they try it, they expect it to work, you know, like the perfect employee. It doesn't work like that, because we are not there yet, and also they are very, very bad at creating prompts, so it can't read minds either. And then they are disappointed and they just say, oh my God, I

(25:56):
don't want to be in an environment where they use AI, because I had a very bad experience with it and it's just going to repeat. So no, thank you, I will mind my own business. I think this is the problem. They tried to present it as the silver bullet, and it is a silver bullet. It will become even better and it will do even more.

(26:18):
I don't think it will take away jobs. I think the jobs will turn around in different ways. Like, yes, you will not input data in a computer, but did you want to do that anyway with your life for 30 years? I don't think so. So you would have quit the job anyway. Come on, let's be real. So you will have an even better job, an even more interesting

(26:42):
job, because we will do even bigger and better things. So when we created the mobile phones, the mobile phones only created more jobs and more things, because then they had apps on them and somebody had to make the apps. Then they had games on them; somebody had to make the games. Then they had the covers that are glittery, and then they had...

(27:04):
So when you get a new thing, it will only spin up so many more other things that it won't be fewer jobs. I think it's going to be even more jobs that it will create.

Speaker 1 (27:19):
So I think the fear is that people are like, yeah, it's going to create more jobs, but my job is in danger. What if all I wanted to do was sit and enter data into a spreadsheet all day? I was perfectly happy with that, or at least not motivated enough to go find something else to do. AI is going to take that job.

(27:39):
And it's true, AI will take that job.

Speaker 2 (27:41):
Yeah, yeah, but we had a really funny case. So it's happening in all directions, so it's not just AI. I don't know if you know the old ways of making heat: you had a coal thing and you took it with the shovel and you shoveled it inside a furnace and it made the school hot, for

(28:04):
example. So there was a lady that did this for 20 years, and then the school modernized and purchased a new gas installation and, of course, she had no work, and she insisted that somebody has to watch the new installation. There was no need, and the school offered her three other

(28:24):
positions, but no, she wanted to do that and, of course, she lost the litigation, because everybody told her you can't do something that doesn't exist.

Speaker 1 (28:33):
You sit there and watch the gas.

Speaker 2 (28:36):
Yeah, some jobs will go away. It is what it is. Even, for example, if we take the legal jobs: so many paralegal jobs will go away. So in time, a lot of them will go away. Why? Because now we can file a lot of things online. That was a very big market.

(28:57):
You had to go and file a lot of things in person. That was eating a lot of time, so there were a lot of paralegals doing that. You couldn't take documents out to study them at your office, so there had to be somebody that went there and did that. Now, at least in Romania, you can study it online. You don't have to go and take the file and read it in the

(29:19):
library. It's fine. So there already are a lot of jobs that, because of the internet, have gone away, even in the legal industry. So things will change, whether we like it or not, and the best way to deal with it is to actually embrace change and accept that it will make our lives better.

Speaker 1 (29:39):
How are your customers reacting to these changes?

Speaker 2 (29:44):
I want to believe that it's good, because I love the changes. Are they a little scared? Everybody's a little scared, because now you don't know in what direction they will take it. So you don't really know how to adapt your product so that the product still exists in five years.

(30:05):
If I look at Google Sheets, for example, now they added a small icon called Extensions where you can add JavaScript code over the Google Sheet, where you have your database, and you can create a micro app that does a lot of things. So a lot of products that were doing that, and they were sold as

(30:28):
doing that, will not exist anymore. So when you have a small company, you're kind of thinking, okay, am I next? Am I next to be automated out of my small business? So there is a dose of being scared, but I also think it's

(30:51):
normal, and we just have to learn to work with it.

Speaker 1 (30:56):
Awesome, great. I think your clients are in good hands, especially if you're forward-thinking, because I think there are a lot of consultants, MSPs, when it comes to the AI revolution, and I think the clients that have a forward-thinking, steady hand at

(31:24):
the helm will do a lot better in the long run, because people are afraid. I think that if you're in cybersecurity today and you're doing things with AI, or if you're in any tech industry and you're doing things with AI, you're part psychologist.
You have to sort of distill and break down the complexities of

(31:46):
AI, but in a way that eases people's anxieties, makes them feel better, doesn't keep them up at night and doesn't create more problems.

(32:19):
[transcript unavailable] ...A researcher was sick or had to go to an emergency or something, and so she told her team, the next day we've got to get this thing done. So I want you guys to get this all done. I don't care how you do it. We need to present this research paper tomorrow. So the team said, bet, we're going to use AI. They weren't using AI agents, but they were using large

(32:40):
language models, and these AI bots, large language models, produced a very good research paper: impeccable data, wonderful graphs, good visuals. The only problem was the research sources were all fake. Yeah, so they had to start over from scratch and they spent

(33:02):
weeks. I mean, that was the best-case scenario. Um, the worst-case scenario would have been if they went ahead with the research paper and those billions of dollars were then, uh, spent on erroneous information. So that could have... So I think it's justified. What do you think?

Speaker 2 (33:24):
This is what I've mentioned before. People expect AI to be another person that can work at, like, 200 miles per hour, and it's not yet justified. It will be, but it's not yet. So if you slice the work into small pieces and you have checkpoints, the chances of it being wrong are lower.

(33:47):
This is what I'm seeing in code. So I was having fun with another friend. An intern of ours submitted a piece of code that looked suspiciously good and well done for an intern and, of course, okay, it's AI. And I called him in, like, come on, I've asked you to do this,

(34:08):
not because I need it, but because you need to learn, and I spent my time correcting this, and I'm not going to correct the AI. So what did you do here? So I'm somewhat of a vibe coder. That's the danger. Everything looks very good, and if you expect AI to take a task from beginning to end, it will be all wrong and fake.

(34:29):
If you give AI small tasks and you have checkpoints, it's going to be amazing. The problem is that we are giving these Ferraris that are AIs to kids that have zero experience. So to me, that's the most dangerous thing. So you give an intern that's 21 years old and has zero

(34:49):
experience a Ferrari. What do you think he will do with it? He will plunge into the first light pole that he finds. Yeah, and then you will have to pull the car back and deal with the mess of the car and hope your intern is still well enough to function, right? Yeah, are you still alive,

(35:13):
intern? Is there anybody? Yeah, the lights are on.

Speaker 1 (35:16):
Yeah, and I mean, the equivalent of that is, if you're fortunate... um, not only if you're fortunate, but it's not... the worst-case scenario is if you get fired. You know, but you're not going to fire an intern for that. I think it's just a really good opportunity for everybody to learn how to live with this new technology.

Speaker 2 (35:35):
My problem is when this stupidity stacks. So you will see, for example, in the network, they change something in the network with AI. They don't check it, they just put it in. Over that, they install some software, also half-coded with AI, God knows what's there. And when you stack so many mistakes and they are, you know,

(35:58):
sky high, the chances of it coming down in flames are huge. So the problem is not the AI. The problem is how we work with it, and we need to establish rules and checkpoints and small itsy-bitsy pieces that the AI can actually perform successfully, because it can perform successfully a lot of tasks as it is now, but it

(36:20):
cannot take a project from beginning to end and go through it flawlessly.
So that's not the case.
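The "small pieces with checkpoints" idea that keeps coming up here can be sketched as a trivial pipeline harness: each step is paired with a validation check, and the run stops at the first failed check instead of letting mistakes stack. The step structure and names are illustrative; the `work` functions stand in for real model calls.

```python
# Hypothetical sketch of slicing AI work into small checkpointed steps.
# Each step is (name, work, check): `work` transforms the data (a stand-in
# for a model call), `check` is the human- or rule-defined checkpoint.
from typing import Any, Callable

Step = tuple[str, Callable[[Any], Any], Callable[[Any], bool]]

def run_with_checkpoints(steps: list[Step], data: Any) -> tuple[Any, list[str]]:
    """Run steps in order; halt at the first checkpoint that fails."""
    passed = []
    for name, work, check in steps:
        data = work(data)
        if not check(data):
            # Stop here so one bad step never feeds the next one.
            raise ValueError(f"checkpoint failed after step: {name}")
        passed.append(name)
    return data, passed
```

The design choice is simply fail-fast: a wrong intermediate result is rejected at its own checkpoint rather than discovered after the whole project is done.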

Speaker 1 (36:26):
Do you think agents will change that?

Speaker 2 (36:30):
Somewhat. I mean, agents are very different, because agents are coded to learn. So once you put the effort in and you try to teach them the tasks, I think they will be able to do small tasks. So not coding something from beginning to end, but something small: for example, reservations, and putting meetings in the calendar

(36:53):
with the correct people andsending them the correct
information prior to the meeting, and all that.
I think they will be able to dothat flawlessly once we put the
effort in.
But it will take a good two orthree years in order to see
results and financial proof thatit works.

Speaker 1 (37:11):
Yeah, yeah, we'll get there. Okay, in the time that we have left, I want to talk about some hidden costs. One: the hidden costs of red teaming your employees. Right, this is something I haven't really considered as much as I should have.

(37:34):
But what are some of the downsides of red teaming your employees, and what are some of the benefits?

Speaker 2 (37:38):
One of the downsides is having a very trigger-happy red team, because sometimes they want to win so much that they overstep the legal aspects, and they forget it's their colleague we are talking about. So that is one of the downsides. On the plus side, I think it's a good exercise, because you know

(38:03):
those marriages of 35 years, when she discovers he's a serial killer, and she's like, I didn't know who I married, and I lived with him for 35 years. Sometimes you discover some things that are crucial for the business to become healthy again, and we've had some recent examples. We had the North Korean hirings that happened, and that should

(38:26):
have never passed HR and other verifications, and we also had those cases with the FBI agents that were double agents for 20-plus years. So you don't really know who does what until you check, and, you know, going through and checking employees that are of a

(38:48):
higher level is a healthy exercise for any company that feels spied on or knows that there is a high interest in their technology or their customer list or whatever. So it has a lot of benefits, and having a red team that only does that also brings a lot of benefits.

(39:10):
But again, we have to be careful with regulations and people's privacy, so that we don't infringe on that.

Speaker 1 (39:20):
Yeah, yeah, so that's really interesting. I'm curious about where is that line? I mean, does private data end at the office? Like, you leave the office, and your geolocation, where you're driving to, okay, that's private data. But if you've got your geolocation turned on and you're

(39:44):
on the clock, let's say you're driving from one client to another client, does the organization get to claim that data as their own? So then they can essentially red team you, spy on you and see what you're doing on company time.

(40:06):
Even in the European Union, you have the right to privacy, even during office hours.

Speaker 2 (40:10):
So they are not allowed to have cameras that face your screen. They are not allowed to record your screen, or to record how many clicks you took, for the purpose of evaluating you. But, again, anonymous data can be collected. So if you collect the information anonymously, and then

(40:33):
you look for trends, and you find some really disturbing trends, like, for example, you always have some calls or some contact that shouldn't happen from the company network, or you see some information leaving the company network that should not leave the company network, then

(40:55):
you can investigate. So you are allowed to investigate and to dig deeper to see what's happening with your own company. That is allowed. But calling somebody's mother and telling her you're calling from the hospital in the name of her kid, who is a director in your company, to find out some information about him, because

(41:16):
you're doing an exercise, when you don't even have a suspicion: that is infringing on that person's freedoms, and that should not be treated lightly in my eyes.

Speaker 1 (41:29):
So just to wrap up and close out, I wish I could talk to you for much longer, but maybe we'll have to have you back for a follow-up. But here is what I want to close with: what is the number one pitfall that companies fall into when doing business in the

(41:50):
EU? And we're talking about large, small, like, what's the one thing that they keep getting wrong, that you wish they could finally get right?

Speaker 2 (42:00):
Because of our privacy regulation, they underestimate how much marketing will cost, and then they try to force their way through privacy regulation, and it costs even more because they get a fine. So the cybersecurity and the privacy costs also have to be evaluated from a marketing perspective, because if you have

(42:23):
some limitations, it would be really nice to have a holistic view. So whenever you enter a new market, at first it would be very nice to talk to a lawyer from there that can explain the biggest hurdles you have to go through, and it would be best to be there with your team. So have somebody from marketing, have somebody from sales, have

(42:45):
somebody from HR. For example, everybody thinks that in China, Chinese people speak a lot of English. It's not the case, so you will have such a difficulty finding Chinese people that speak English and that can work with the cybersecurity audit and so on and so forth. So always have somebody from there first, have a talk, have

(43:08):
your heads of departments there asking questions, and only after that reassess your business strategy. Don't try to use what you have for the US in the EU, or in China, or in the Arab countries, because everybody has their tweaks, and depending on how you talk with that specific lawyer,

(43:30):
you could gain such a great advantage over your competitors, because you go in already knowing the pitfalls, and you go in already knowing what the advantages are, because in every pitfall there's also opportunity, but it has to be presented to you. So to me, that's one of the biggest issues, and that's

(43:51):
where I see clients not understanding properly how cybersecurity, for example, could affect their sales. Yeah, so talk about hidden costs.

Speaker 1 (44:01):
Do you know a good lawyer?

Speaker 2 (44:04):
Oh, I have a clue, yeah. Okay.

Speaker 1 (44:07):
Well, if you're looking for a good lawyer and you're migrating your business to the EU, or you're doing business in the EU, call Larissa and she'll hook you up with a good lawyer. All right, Larissa, thank you so much for this time. I've gotten so much information and such a great look into the EU market and how you guys operate. I feel like this is just the tip of the iceberg.

(44:28):
There are so many questions that I have left. If people want to find you and they want to follow you, how could they find you?

Speaker 2 (44:37):
For the moment, there's only LinkedIn.

Speaker 1 (44:53):
I answer to everybody there, so don't be afraid to reach out. And if you want to learn more about Bruning Media and what we do, check out bruning.com. Thanks for listening to this episode of Cybernomics, where we talk about the hidden costs and opportunities, and sometimes the opportunity costs, of cyber. Thanks. Bye.

Speaker 2 (45:14):
Thank you, bye-bye.