Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Aaron Pritz (00:06):
Thanks for tuning
into Simply Solving Cyber.
I'm Aaron Pritz.
Cody Rivers (00:10):
And I'm Cody
Rivers.
Aaron Pritz (00:11):
And today we're
here with Shelley Jackson.
She is a partner at Krieg DeVault.
She's the chair of the Labor and Employment Practice, heavily focused in privacy and security, and newly appointed chair of the AI Task Force.
We're excited to dive into that and hear your perspectives as a lawyer in that space.
So welcome, Shelley.
Shelley Jackson (00:31):
Thank you,
Aaron.
Thank you, Cody.
I'm excited to be here.
Aaron Pritz (00:34):
Awesome.
So let's get right into it with AI.
So maybe from a law practice standpoint, how is AI or related technology coming up in your practice?
Like what questions are coming your way, or what issues are you having to deal with?
Shelley Jackson (00:49):
Well, I would
say just like any other
organization, both as a law firm and then also as a business, it seems that AI is suddenly everywhere, and it has been coming for many years.
This is not something entirely new, but it feels like it's really exploded and has advanced at a level that we haven't seen previously.
And so we have lots of clients asking us questions, and those questions can range across a variety of things.
(01:12):
They can be business-related questions, like intellectual property rights as it relates to generative AI, for example.
I get a lot of questions, because I am in the employment law space, about data privacy and security as it relates to artificial intelligence that may be used to make employment-based decisions.
We have some guidance coming out from some of the regulatory
(01:33):
bodies; the Equal Employment Opportunity Commission has recently issued some guidance about using algorithmic decision-making tools, including artificial intelligence, in making employment decisions, so we're seeing a lot of that.
I think also, just entering into agreements and contracting with different organizations that are providing maybe software-as-a-service technology, they're incorporating AI in various
(01:57):
forms, whether it's generative or not generative.
We're doing some deep dives into what exactly is this technology doing?
What product is coming out of it?
What risks or benefits are attached to that product?
And that's really a consideration, like I said, not just for clients, but it's part of the business environment that we're all finding ourselves in.
Aaron Pritz (02:15):
Yeah.
So linked to employment screening or making offers: I remember, probably almost 20 years ago, when I was in an internal audit function at a corporation, doing some audits of HR providers that provided screening tools.
So AI has changed, but algorithms and weeding through
(02:36):
thousands or hundreds of thousands of resumes isn't new.
I'm just curious what's different in that space and what's changing?
What do people need to think about or do differently or be aware of?
Shelley Jackson (02:47):
Well, I think
one of the things that you've
seen in that space is an explosion of technological capabilities.
Those capabilities allow employers to get more bang for the buck.
In other words, I can go through a larger group of resumes.
There are the basic sort of screening technologies that might be used to screen out applicants based on keyword searches or things like that; that's been
(03:09):
around for a really long time.
Artificial intelligence is coming in to do things like profiling potential candidates, creating the profile of the ideal candidate, taking that one step further and creating new content based on harvesting information or pulling that data in, in different formats.
And I think that's one of the areas that's really important
(03:29):
for employers to understand.
On the outside, it may sound like just a slick new feature that can be added.
But if it is adding, let's say, an artificial intelligence component where it is actually creating new decision making about a candidate, or perceiving information about the role that's open, let's say profiling a particular role, think about some of those behaviors and some
(03:51):
of the data that it can be collecting, especially if it's about an individual.
Let's say a candidate or a current employee.
Cody Rivers (03:59):
Sure.
Shelley Jackson (04:00):
There could be
regulatory impacts in terms of
how is that information being collected and used?
Is it creating issues of, for example, reaching discriminatory results?
Cody Rivers (04:10):
Yeah
Shelley Jackson (04:11):
Not
intentionally.
The vast majority of these tools are not set up to create a discriminatory impact, but it is what we would call a disparate impact, meaning that there's not a specific discriminatory action that's occurring, but because of the way an algorithm or a decision-making process or artificial intelligence is employed, you're getting to a result that results in a disparate impact, or an increased impact to
(04:34):
one particular group that may be protected under applicable law, so that's one of the things that we're seeing.
Aaron Pritz (04:40):
So let's, before we
move on from this topic, let's
flip the tables.
'Cause we've talked about employers.
I've heard in the last two months, from senior leaders at two companies, that they've started to see an influx of very similarly worded resumes and cover letters and communications, where I'm assuming they inputted their resume and said, make this better
(05:02):
for company A.
And in both cases I heard and discussed the outcome, where they're like, flush 'em all.
So I guess, from your perspective, less of a legal issue, but on the employee side, as people are gravitating to, how do I make this tool work the best for me?
Is there any impact from a legal standpoint as you start to represent yourself through what a bot is saying about you?
Shelley Jackson (05:26):
Absolutely.
That's a great question, and I think it's something we all have to individually wrestle with, not even just as organizations.
I think the bottom line is we have to remember what artificial intelligence and other technologies are, and what they are not.
They are not a substitute for human thinking.
Mm-hmm.
And human discernment and human intelligence; they are
(05:47):
artificial by designation.
They are essentially a really sophisticated computer program that's been taught, that can teach, yeah, or that can learn.
Aaron Pritz (05:54):
To predict right
words, not even thoughts.
Shelley Jackson (05:56):
Exactly.
It has the perception of being very sophisticated, and in many cases it is.
But what we also find is these pockets of inaccuracies, or where the data somehow fails the process.
Mm-hmm, in other words, it's pulled the wrong data, it's relying on the wrong data, or it doesn't have the data, so it makes something up.
(06:17):
This extends beyond candidates for a job.
I think it's just professionals in general.
You have to understand what is it that's being created, and do you have a process to ensure that that's accurate?
Yeah.
There are a couple of higher-profile cases right now involving lawyers in my profession.
In one case, an attorney submitted a brief that was
(06:37):
written by ChatGPT, and ChatGPT, in its effort to give that person what they wanted, created fake case law.
And now that blew up and it became like a little sensationalized news story.
Aaron Pritz (06:48):
The world's worst
shortcut.
Shelley Jackson (06:50):
It's exactly
right.
When you're moving quickly through space, whether you're trying to submit lots of different applications for jobs, you're trying to screen lots of applicants, or you're trying to get work product out the door for whatever your profession is, it can be a really great tool.
But it's just that: it's a tool.
It is not a substitute for the human element, which I think is a really key part of this.
(07:11):
And if you do any work in this space, and I imagine you see this all the time as well, there's not a substitute for a human being getting in there and using their expertise and their skill and their training and their education and whatever it is they've put together to create their professional abilities.
There's not a substitute for that.
(07:32):
Yeah.
You can roughly approximate it.
You can help the process, you can help prepare a draft, for example, but there is no substitute for that individualized feedback from a human perspective.
Cody Rivers (07:41):
Yeah.
I'm thinking about putting 'Cody is six foot tall' on a lot of
Aaron Pritz (07:46):
Smart enough to
say, uh, it's Cody.
I don't think so.
Cody Rivers (07:48):
I don't think it
is, man.
We'll just see what happens.
Aaron Pritz (07:50):
Yeah.
The last thing on GPT and the fails is, I went to a healthcare conference a couple months ago and got a lot of follow-up emails just from being there.
Cody Rivers (08:00):
This is great.
Aaron Pritz (08:00):
Uh, and one of the
articles had HIPAA spelled like hippo, which, if you're in privacy, is about the worst thing you can do to blow away your credibility.
But it was spelled that way; I had somebody tell me the female version of hippo is 'hipa', like Spanish, um, which is not Spanish, but anyway, it was spelled wrong throughout.
And after reading about half of it, I'm like, this is clearly
(08:23):
written by a ChatGPT-like thing.
And to your point, Shelley, no human who was an expert looked at it to say, hey, this is wrong, or this basic spelling of the thing I'm trying to share thought leadership on is way off.
So I emailed the company; they didn't acknowledge my feedback, but they updated it within about 30 minutes.
Cody Rivers (08:43):
Yeah, they did send
a follow up.
Shelley Jackson (08:44):
That was quick marketing mode.
Cody Rivers (08:46):
That was great.
That was great.
Shelley Jackson (08:47):
It's impressive
quickness.
Cody Rivers (08:48):
I have a question
going back to employees and
policies, and them keeping up with AI.
What do you think around there as far as employees using it, employers using it, but then you've got these old policies that are written from years ago and are hopefully used, but not always used?
So what are you seeing there?
Shelley Jackson (09:04):
Well, I think
one of the things that happens
is that we have technology advancing at such a rapid pace.
This is not new to the AI discussion.
This is part of, I think, just generally managing risk.
Your policies, of course, are only as good as either the paper they're written on or the database that they sit in, if they're not living, breathing parts of your organization.
So when you're bringing in, let's say, a new technology,
(09:26):
it's really important to think about: do we have policies in place that govern, or that give instruction as to, how we will use this technology and how we will make this technology work for us?
And as for the limitations, there's a lot of really sophisticated, great opportunity out there to use technology in a way that will really leverage information, leverage
(09:47):
data, leverage time.
It helps you create more time.
But I think the policy piece of it is just being intentional about: do we have the framework set up to use this great new technology, and do we understand the risks that come with that technology?
I feel like artificial intelligence and technology in general run in parallel: there's huge risk, but there's also fantastic opportunity.
(10:09):
Mm-hmm, and as I view it, this is not, do we want to engage with AI?
This is just, how do we do it?
Cody Rivers (10:15):
Yeah.
Shelley Jackson (10:16):
And how do we
grow our organization?
How do we update our policies?
Do we make sure that our employees are trained?
Having training sessions, you know, doing tabletop exercises when you're looking at risk, putting together scenarios for people.
Testing people, giving them some sort of opportunity to exercise those decision-making skills that can be driven by
(10:36):
policy.
Cody Rivers (10:37):
Yeah.
Shelley Jackson (10:37):
And then also
staying up to date because at
the same time you've got all of this innovation going on on the technology side.
You've got lots of legal updates.
Yeah.
You've got regulatory items coming up, and we have a new privacy law that will take effect in the next couple years here in Indiana; it just got passed and, I think, goes into effect in 2026.
(10:58):
So we're gearing up for the future.
And again, there are ways to do this, and no one's gonna be perfect in doing this.
Yeah, these are big issues and big items, but I do think being thoughtful and proactive can really head off a lot of potential challenges along the way.
Just because you're staying in front of it and you're thinking, okay, great, we've got this great new technology.
(11:18):
Let's figure out how it works for us.
But let's also make sure that our work population, our employees, are empowered to use it.
It's an empowerment issue.
Creating consistent policies that track the current legal landscape and the regulatory landscape is an empowerment tool for employees.
Yeah, because it gives them the boundaries they need to effectively leverage this very powerful technology that's
(11:39):
available to them now.
Cody Rivers (11:41):
Very cool, very
cool.
Aaron Pritz (11:42):
I'd like to pivot
to a new segment of our show
called, and let's say it together: Fun Facts.
Fun Facts.
Yes.
So let's start out easy.
What was the first concert you ever went to?
Cody Rivers (11:53):
Live concert?
Shelley Jackson (11:54):
I was thinking
about this earlier.
I don't know if it was Debbie Gibson or Public Enemy, which is an interesting blend.
Aaron Pritz (12:00):
Those are the
bookends of the options that you
can have.
It's great.
And like we talked about when we warned you that the segment was coming, can you please recite one of the choruses from either song?
No, I'm just kidding.
I won't put you on the spot.
That's okay.
Let's think: any other fun facts?
What do you do for hobbies, when you're with family?
Shelley Jackson (12:19):
Well, I spend a
lot of time with my family.
We are crazy about our pets.
We have a dog and a cat.
We enjoy spending time with the dog and doing stuff with him.
But recently, as a family, none of us are golfers, but we decided that we would like to become golfers.
Nice.
And so the four of us, we have two adult sons, and my husband and I, we go out onto the golf course and we are terrible at it.
We choose the last tee time, and the golf course
(12:42):
pro shop pro has taken, I think, pity on us.
And so he's very kind to us when we come in.
He helps get us set up with the golf clubs that we need and any of the materials.
And we did that all last summer and, frankly, we did it in a very low-pressure environment, and it turned into this great bonding opportunity for our family.
That's awesome.
And now we like to do it
(13:03):
every few weeks in the summer and just spend time together.
Nice.
And not, like, worry about it... we do get frustrated.
And not like, worry about, we doget frustrated.
Aaron Pritz (13:10):
Everyone does in golf.
Shelley Jackson (13:11):
yes, we do get
frustrated, but it's okay
because we're all together and we're driving the golf carts and we're having a good time.
Cody Rivers (13:16):
That's like a
sport.
You can have like it's 17 badholes, but you get one good shot
and you're like, I could dothis.
Shelley Jackson (13:22):
We've seen some
questionable techniques too.
One of my sons decided he was gonna do one-armed golf swinging, which, okay, is a thing.
I did not know this.
Cody Rivers (13:30):
Really?
Shelley Jackson (13:31):
Apparently
Cody Rivers (13:31):
I didn't know this
either.
Aaron Pritz (13:32):
I can't imagine
that's gonna help my game, yes.
Uh, give us a fun fact that maybe 99% of your colleagues would've never heard.
Shelley Jackson (13:43):
Well, I have
like one recyclable fun fact
that they probably all have heard.
Okay.
So if they hear it, they'll be like, Shelley's using her fun fact again.
But my one fun fact that I use is for when you're in an icebreaking scenario.
Mm-hmm.
Sure.
And they tell you, you have to say something interesting about yourself, but you have like 30 seconds to figure out what that is.
Cody Rivers (14:01):
Yeah.
Shelley Jackson (14:01):
I always tell
the story that back in the day,
so this will age me.
There were these commercials for OnStar, which was a very early form of, I mean, it's still there, but you basically could call for emergency help when you were in your car.
Aaron Pritz (14:14):
Yep.
Shelley Jackson (14:14):
Yeah.
And we had OnStar in one of our vehicles, and I actually had a collision involving a deer with my son, who was about one at the time.
Oh, deer, yes.
So it was really scary.
Everyone was fine.
I think there was a little damage to the car.
Everyone was fine.
But it was maybe a couple months later, and OnStar contacted me and said, we wanna do a commercial, and we have a
(14:36):
recording of it.
Would you like to hear it?
Aaron Pritz (14:38):
Oh, was that an
instant yes for you?
Or did you have to reflect a bit?
Shelley Jackson (14:41):
The recording
was very embarrassing.
Aaron Pritz (14:43):
Okay.
Shelley Jackson (14:43):
Because it was semi-panicked.
I've got my little kid in the background, I've just hit a deer, I'm worried about the deer.
Aaron Pritz (14:50):
Did you say, show
me the money, and then press forward?
Shelley Jackson (14:53):
Well, we talked
to them, and they were glad to put together a commercial that's called Deer Damage.
So that's the name of the commercial.
It aired a few times.
Aaron Pritz (15:02):
We're going to
YouTube right after this.
Shelley Jackson (15:03):
I have tried to
find it.
I haven't. I have a recording somewhere in my house.
Cody Rivers (15:06):
Have you asked Chat
GPT yet?
Have you asked the AI to findit?
Shelley Jackson (15:08):
I have not
asked Chat GPT and I won't.
Aaron Pritz (15:10):
So from a legal
standpoint, at the point where you had the accident, is there now a deer crossing sign?
Are there any precautions that would set the right framework of prevention?
Shelley Jackson (15:21):
To my
knowledge, I do not believe
there is a deer sign in the area.
Aaron Pritz (15:23):
Is there a lawsuit
opportunity?
Could you get paid twice?
I'm just kidding.
This is not your professional opinion.
Shelley Jackson (15:30):
This was just, like, two decades ago.
Aaron Pritz (15:32):
Yeah.
What's the term?
Is the period of the statute of limitations...
The statute of limitations might be up.
Shelley Jackson (15:36):
expired many
years ago.
Cody Rivers (15:37):
We got it.
So we've got all these fun things.
We've got the golfing family, we've got the OnStar story.
Give us the Shelley story.
How did you get into this practice?
What did you enjoy?
You know, kind of your story, and then, well, let's start there.
Aaron Pritz (15:50):
Yeah.
Both the attorney side and the privacy and technology side.
Cody Rivers (15:55):
Yeah.
Shelley Jackson (15:55):
Okay.
So the attorney side, which came before the privacy side: I was actually a teacher.
I was a high school English teacher and loved my students and loved teaching English, but I knew at some point that I wanted to go to law school.
Cody Rivers (16:07):
Yep.
Shelley Jackson (16:07):
At one point it made sense for our family to make the decision to take the leap and try law school.
So I actually took a year leave of absence from my teaching position, with the idea that if law school was really terrible, I could just come back and say, boy, I dodged that bullet.
And I went into my first year, and I really just have never looked back.
(16:28):
I've been in the legal community now, it will be, oh gosh, like 17 years.
And so I've had an opportunity to build a career over that time that has spanned a lot of different things.
But as it relates to privacy and security, what I found is that, as a young attorney, I intersected, maybe I would say, incidentally with it.
In other words, I would be working, like I did some med mal
(16:50):
defense, some pharmaceutical defense, very early in my career.
So we intersected with it in that it was a part of the litigations.
Mm-hmm.
We needed to be aware of that, from, like, producing documents and reviewing information.
So it started in a really incidental way.
But what I found is, over time, I really enjoyed helping clients navigate decisions about information, personal
(17:12):
information, and so it expanded from just healthcare-related items into a broader array of consumer data, employee data, all kinds of different data.
And so I started to develop an interest in it, and at one point I just decided that it was time.
I'd been in a law firm for several years and decided that I'd like to explore that opportunity, and I became a Chief Privacy Officer.
(17:35):
So I decided to leave external legal practice in a law firm.
Cody Rivers (17:38):
Yeah.
Shelley Jackson (17:39):
And be in-house with one dedicated client.
And in that role I was both an assistant general counsel who had responsibility over human resources and litigation, and then I also was the Chief Privacy Officer, which was a global role; it was a really interesting, exciting, fun role.
That's where I met Aaron and Tim.
Yep.
And some of your team members.
So that's how I first got to meet Reveal Risk.
(18:01):
But it was a great experience, and ever since, there's a couple different things.
First of all, it's about how we create opportunities to handle information and data in a respectful manner.
Yeah, and it's hard.
It's about being respectful in the way that we use information that we have about people, but it's also a highly
(18:21):
regulated area.
It's an interesting area.
Yeah.
You know, sometimes people's eyes glaze over, right?
Like, privacy?
No, but if you dive into it, it's really interesting, I mean, how all of the different frameworks work together.
It's an up-and-coming area.
It's a rapidly changing area.
We have new laws all the time.
Cody Rivers (18:37):
Yeah.
Shelley Jackson (18:37):
And then
because I spend so much of my time in the employment space, privacy is so closely connected: how you treat employee data, how you deal with candidate data, how you deal with your employees' and former employees' data, but also how those employees are empowered to treat data in a compliant manner and in a way that really contemplates the best use of that data.
Cody Rivers (19:00):
Yes.
Aaron Pritz (19:01):
So what happens on the AI Task Force? That was done with no filter, by the way.
Um, what happens on the AI Task Force, and what are some of the top tips or good conversations you're having with clients about AI?
Shelley Jackson (19:14):
Well, we're
looking for someone to do like
voiceovers as we begin our meetings.
Aaron Pritz (19:19):
Sign me up.
Cody Rivers (19:20):
He also plays the
drums, so he can put in this little snare beat.
Shelley Jackson (19:24):
We could have like a whole
entertainment segment.
So one of the things we discovered at our firm, and I think this is happening everywhere: we're doing a ton of this work.
Our clients have really interesting, sometimes very complex, questions and issues that arise with respect to managing artificial intelligence and making sure that they are thinking about all angles of how to leverage,
(19:46):
um, and then also manage risk as it relates to artificial intelligence.
So the idea of our task force is simply to make sure that we are serving clients in the areas that are important to them, and also in the areas that maybe they're not as aware of, but where we can help bring awareness and help them manage risk.
A lot of times, from a risk management standpoint, so much of what I do in a day is managing risk, right?
(20:09):
There's the legal compliance piece, but a lot of times we know the legal compliance piece.
We just need to figure out how do we effectively manage that risk.
Yeah.
And how much of our resources do we devote to X activity versus Y activity?
How much of a benefit do we get from leveraging this new AI technology versus simply using what we already have, and how much risk are we creating here?
(20:30):
Yeah, so we're seeing it in every area.
I see it a lot in employment and human resources issues.
Certainly healthcare is a big place, because that's a highly regulated area and you've got lots of apps that can tell you all kinds of things about you from a health perspective.
You gotta think about how you're using that data, how that data is being processed, and how the inputs are set up to create any sort of artificial
(20:54):
intelligence, generative outcome.
If they're generating new content, how is that being pulled?
What are the security and the privacy controls?
Yeah.
Business negotiations and discussions.
And who owns AI?
There are some lawsuits pending right now about who owns something that's been generated through AI that was created based on examples of real artists, and,
(21:17):
yeah, is it a derivative work if that happens?
So there's a lot of discussion happening in that area.
Aaron Pritz (21:21):
So what I'm
hearing, or this is my kind of wrap-up of that, is you've asked AI to come up with the agenda for your task force.
Shelley Jackson (21:30):
That is it.
No, no.
That's not what it is.
Aaron Pritz (21:33):
client demand.
Shelley Jackson (21:33):
I will not lie
though, in any meeting that involves AI, it is not uncommon for someone to utilize AI in some way, just to use it as an example.
Cody Rivers (21:41):
Yeah.
Shelley Jackson (21:41):
Because again,
we are a business that is navigating really sophisticated new technology that frequently uses AI, and thinking about, as lawyers, what is our role in exercising that AI and using it.
So I think it's very client-driven.
It's based on what we're seeing from our clients, but it's also: what value can we provide to clients by maybe looking around a corner that they're not
(22:01):
experiencing yet, and staying ahead of it, rather than all of us running to keep up with it?
Aaron Pritz (22:07):
Maybe we've got
time for one more question.
Give us some of your top tips for AI, from the risks that are out there or the guidance that you're giving.
Shelley Jackson (22:16):
Yeah.
Well, I think, especially framing it for you, since you do so much in the cybersecurity space, when I think about it from like a data privacy and a data security space: there's so much sophistication to what is out there.
There are so many new opportunities, and they are incredible.
Again, this is not, do we utilize AI?
This is not, do we utilize technology?
It's, how do we do it?
(22:37):
Yeah.
And how do we manage risk?
So when I think of the big-picture issues, number one: what is the legal and regulatory framework that you're in?
And are you doing something, are you changing something about what you're doing, that could impact that?
Cody Rivers (22:50):
Yeah.
Shelley Jackson (22:50):
So, you know,
if you're on top of that, that's gonna help manage your risk at the outset.
Mm-hmm.
Rather than you implement something that seems super cool, and then you go back in time and you think, ooh, I wish I would've known that this was an issue.
A good example of that is the EEOC, which is regulating the use of decision making as it relates to candidates and employees, and holding employers responsible
(23:12):
for decisions that AI makes.
If that decision turns out to have a discriminatory impact, even if it's not intended, if it's just an adverse impact that's created by the decision-making process or the generative AI, that can result in some additional risk.
But if you think about it on the front end and take a look before you're using the technology, you can help manage risk.
(23:32):
Second of all, I think it's just to have conversations, and I always believe employees are your number one, your best weapon, right?
Yeah.
They are your best advocates.
They are out there doing the day-to-day work of the business.
Yeah.
And so think about what it is that your employees need from a business standpoint; what do employees need to feel empowered?
Whether that is understanding policies, staying ahead of
(23:55):
policy development, clear communication of expectations, or giving employees a voice to say, hey, I see this great opportunity here, can we explore this, and sort of outlining what those risks are.
Yeah, so I think that's the second.
And then the third, I think, is just to be curious.
There's so much going on and there are so many opportunities here, and I think being curious, doing a lot of reading, staying
(24:16):
up to date on what the opportunities are in your industry.
There are so many opportunities to learn, and you don't have to learn the hard lessons.
Cody Rivers (24:23):
Yeah.
Shelley Jackson (24:24):
You can learn
from other people who learned the hard lessons, so just be engaged in that industry or community.
Aaron Pritz (24:29):
So with AI: being careful about what you feed it, what the implications are, and how you're using what you're getting back.
Shelley Jackson (24:35):
Yeah.
Aaron Pritz (24:35):
I almost think
that's like an analogy to raising kids, like I accidentally said a bad word.
What did I feed my kids?
Cody Rivers (24:41):
Whoops.
Aaron Pritz (24:42):
What were the
implications?
Well, I won't go into that.
And then what did I get back from them?
Well, a lot of crap.
So that wasn't the word I used either.
All right.
Well, thanks for coming on, Shelley.
It's been great to have you here and discuss these topics.
Really appreciate you coming in, and talk to you soon.
Shelley Jackson (25:00):
My pleasure.
Thank you so much.
Cody Rivers (25:01):
Awesome.
Thank you.
Bye.