
December 13, 2024 • 17 mins


This episode is part of my initiative to provide access to important court decisions impacting employees in an easy-to-understand, conversational format using AI. The speakers in the episode are AI-generated and frankly sound great to listen to. Enjoy!

Can technology uphold fairness, or is it silently perpetuating bias? Discover the complex world of AI in the hiring process as we unravel the case of Derek Mobley versus Workday Inc. Mobley, a Black man over 40 with mental health conditions, challenges the algorithms that he claims have unjustly barred him from over 100 job opportunities. Despite the court's decision not to categorize Workday as an employment agency, the episode prompts a pivotal discussion about the responsibilities HR tech companies might bear when their software influences employment outcomes. We grapple with the concept of disparate impact discrimination and what it means when unintentional practices result in a skewed playing field for protected groups.

From the courtrooms to the broader tech landscape, the implications of this case ripple across the HR industry and beyond. We weigh the necessity for transparency, accountability, and fairness in algorithmic decision-making while acknowledging the delicate balance with innovation. Listen as we delve into the potential for increased scrutiny and regulation of HR tech companies, and encourage job seekers to critically engage with the data that drives these systems. Join us in exploring how technology shapes our employment landscape and what needs to change to ensure it does so equitably.

If you enjoyed this episode of the Employee Survival Guide, please like us on Facebook, Twitter and LinkedIn. We would really appreciate it if you could leave a review of this podcast on your favorite podcast player, such as Apple Podcasts. Leaving a review will let other listeners know that you found the content of this podcast important in the area of employment law in the United States.

For more information, please contact our employment attorneys at Carey & Associates, P.C. at 203-255-4150, www.capclaw.com.

Disclaimer: For educational use only, not intended to be legal advice.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome back.
Today we're taking a deep dive into a case that's making waves in the world of tech and hiring: Derek Mobley versus Workday Inc. It's not just a legal battle. It really gets you thinking. How are algorithms used in hiring? Can companies like Workday be held responsible if there's bias?

Speaker 2 (00:18):
Yeah, it's a really interesting case, isn't it?
It shows just how much AI is affecting our lives now, like even finding a job.

Speaker 1 (00:24):
And it all revolves around Derek Mobley, a Black man over 40 who claims he was rejected from over 100 jobs, and all of them used Workday software for some part of hiring.

Speaker 2 (00:35):
What's really striking is he wasn't just
applying to one company or even one industry. All sorts of jobs, different sectors, and every time he hit this Workday wall.

Speaker 1 (00:44):
OK, so before we jump in too deep, can you give us
some background on Workday?
What exactly do they do?
Why are they the focus here?

Speaker 2 (00:49):
So Workday is a big name in HR tech: cloud-based software for stuff like HR, payroll, and, well, this is important for the case, talent management. They work with tons of companies, lots of Fortune 500 firms even. So their software could be affecting a huge number of people applying for jobs.

Speaker 1 (01:05):
Wow, they're not just some small startup then. They're a major player in this hiring space.

Speaker 2 (01:11):
Exactly, and that's partly why this case is so big.
It's not just about one guy looking for work. It's about the potential for bias, you know, algorithmic bias, on a massive scale.

Speaker 1 (01:20):
Okay, back to Mobley. He was applying for all these jobs, each time running into Workday. What was that experience like for him?

Speaker 2 (01:28):
Well, he'd find postings on LinkedIn, pretty standard stuff. But clicking apply, he'd get redirected to a Workday platform on the company's website.

Speaker 1 (01:37):
So, even though he's applying to different companies, it's always Workday behind the scenes handling his application.

Speaker 2 (01:43):
That's right.
Every time: new Workday account, upload his resume, sometimes even these Workday assessments, like personality tests.

Speaker 1 (01:50):
So Workday is collecting a lot of data on him, then.

Speaker 2 (01:52):
And that's crucial, right? Algorithms need data to learn, and that data, it's not just what's on your resume.

Speaker 1 (01:57):
Like what else?

Speaker 2 (01:57):
Well, think about it. When you create a Workday account: maybe your age, location, education history. And those personality tests, they might show your personality traits.
Are you emotionally stable?
How about risk aversion?

Speaker 1 (02:11):
Hmm, yeah, I see your point.
It's all data that an algorithm could use to make decisions about you.

Speaker 2 (02:16):
Exactly, and this is where Mobley's concerns start.
He argues Workday's tools are discriminatory. They use biased data, things like personality tests that might put certain people at a disadvantage.

Speaker 1 (02:27):
So he's not saying he's just unlucky.
There's something wrong with how Workday's algorithms are making decisions.

Speaker 2 (02:33):
Yeah, inherently biased, that's what he's claiming: biased against people like him, Black, over 40, and with mental health conditions like anxiety and depression.

Speaker 1 (02:41):
He was rejected from over 100 jobs.
It's not just a few rejections here.

Speaker 2 (02:44):
And get this, some of those rejection emails came in the middle of the night, like 2 am.

Speaker 1 (02:49):
That is kind of creepy, got to admit.
Definitely sounds like automation was involved.

Speaker 2 (02:53):
Makes you wonder how much human judgment was really there, versus an automated decision made by Workday's software.

Speaker 1 (03:00):
That's a big question. Goes to the heart of this case. But before we go further, what does Mobley mean when he says Workday's tools are discriminatory? Does he mean they're designed to discriminate against certain groups?

Speaker 2 (03:11):
Not necessarily. He's arguing it's what's called disparate impact discrimination.

Speaker 1 (03:16):
OK, disparate impact. Sounds like legal jargon.
Can you explain that for us?

Speaker 2 (03:20):
It is legal stuff, but super important here. Disparate impact means that even if a practice doesn't mean to discriminate, if it ends up disproportionately harming a protected group, well, legally that can still be discrimination.
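
To make that concrete: one common first screen for disparate impact in the US is the EEOC's "four-fifths rule," which compares selection rates across groups and flags cases where one group's rate falls below 80% of the most-favored group's. Here is a minimal sketch of that arithmetic; all the counts are invented for illustration, since the case itself doesn't give figures.

```python
# Minimal sketch of the EEOC "four-fifths rule" screen for disparate impact.
# All counts here are hypothetical, purely for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who advanced past the screen."""
    return selected / applicants

# Hypothetical outcomes of an automated resume screen for two groups.
rate_a = selection_rate(selected=120, applicants=200)  # 0.60
rate_b = selection_rate(selected=45, applicants=150)   # 0.30

# Impact ratio: the lower group's rate over the most-favored group's rate.
impact_ratio = rate_b / rate_a  # 0.50

# A ratio below 0.8 (four-fifths) flags potential disparate impact.
if impact_ratio < 0.8:
    print(f"Impact ratio {impact_ratio:.2f} < 0.80: potential disparate impact")
```

A ratio below 0.8 isn't proof of discrimination by itself; it's the kind of statistical showing a disparate impact claim typically starts from, which the employer then has to justify as job-related.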

Speaker 1 (03:35):
Ah, so even if Workday wasn't trying to discriminate, if their algorithms have that effect, they could still be in trouble.

Speaker 2 (03:41):
Exactly, that's Mobley's point. Even if companies using Workday mean well, the software itself can lead to bad outcomes: discrimination.

Speaker 1:
Interesting. So it's not just about intent, but the actual impact.

Speaker 2:
Right, and in this case it makes us face the possibility of algorithmic bias in a system that's relying more and more on AI for big decisions.

Speaker 1 (04:01):
Okay, we've got the background on Workday, Mobley's experience of constantly being judged by their algorithms, and this idea of disparate impact. What are Mobley's actual legal claims?
What is he arguing in court?

Speaker 2 (04:15):
Actually, a couple of different arguments, and they both hinge on whether Workday can be held liable for the discrimination, not just the individual employers.

Speaker 1 (04:23):
OK, now I'm really interested.
What are those arguments?

Speaker 2 (04:26):
First one: Workday is an employment agency under laws like Title VII of the Civil Rights Act, the Age Discrimination in Employment Act, the ADA, that kind of thing.

Speaker 1 (04:36):
So he's saying they're in the business of
finding people jobs, like a regular employment agency.

Speaker 2 (04:40):
That was his initial argument, yeah. Because Workday is so deep in the hiring process, they're the gatekeepers; they should have the same anti-discrimination rules as any other agency.

Speaker 1 (04:49):
Makes sense.
I mean, they are screening candidates, right?

Speaker 2 (04:52):
They are.
But the court actually dismissed that specific claim. They said Workday doesn't technically procure employees, legally speaking; it's not actively finding people to fill jobs. They just provide the software, the platform.

Speaker 1 (05:06):
So Workday's off the hook then.

Speaker 2 (05:07):
Not entirely.
Here's where it gets a bit tricky legally. The court did say that, while Workday isn't an employment agency, Mobley's got a case, a plausible one, that Workday was acting as an agent of those employers.

Speaker 1 (05:20):
Hold on. Employment agency and agent, what's the difference? They both seem to be involved in helping companies find employees.

Speaker 2 (05:26):
It is a subtle but important difference. An employment agency's main business is connecting job seekers and employers. Think headhunting firms, temp agencies; they actively go out and recruit and place people.

Speaker 1 (05:39):
So Workday isn't doing that; they're just providing the software.

Speaker 2 (05:42):
Right. But being an agent of the employer, they take on some of the employer's responsibilities. The court's view is that if Workday is performing core hiring functions on the employers' behalf, it can be treated as their agent, and that's why this case is so big for the whole HR tech world. If the court sides with Mobley on this agent idea, big precedent: software companies could be held accountable for

(06:02):
algorithmic bias in their hiring tools.

Speaker 1 (06:05):
That is huge.
But Workday is fighting back hard, I bet. Not just going to accept liability.

Speaker 2 (06:10):
Of course not.
They've got their legal team working on their defense. Main argument: we're just the software provider, a neutral platform. Basically saying our customers, the employers, they set the hiring criteria, make the decisions.

Speaker 1 (06:21):
So don't blame us, blame the companies using our
software.

Speaker 2 (06:25):
That's the gist, but it's not quite that simple.

Speaker 1 (06:28):
Why not? It can't be that easy, can it? What's Mobley's counterargument?

Speaker 2 (06:34):
Well, think of it this way. Imagine buying a car, but the brakes are faulty. You get in an accident. You wouldn't just blame yourself, would you? You'd hold the carmaker responsible too.

Speaker 1:
Yeah, for sure, especially if they knew about the bad brakes and didn't do anything to fix them.

Speaker 2:
Exactly. And that's part of what Mobley's arguing. He's saying Workday knows their algorithms can be biased.

(06:55):
There are studies out there showing how AI can carry over biases from society and discriminate based on race, gender, all sorts of things.

Speaker 1 (07:03):
So he's saying Workday is aware, or should be
aware, that their software could lead to discrimination.
They can't just play dumb.

Speaker 2 (07:09):
That's right.
And he says they haven't done enough to deal with those potential biases.

Speaker 1 (07:14):
So we've got this back and forth, right? Workday saying: we're just the software guys, it's up to the employers to use it fairly. Mobley's side is: no, you built the tool, you knew it could be biased, you're responsible for what it does, even if you didn't want to discriminate.

Speaker 2 (07:28):
You've got it.
It's a really complex situation, not black and white at all.
Legal stuff, ethical questions; the court's got to figure it all out.

Speaker 1 (07:37):
OK, the case hinges on whether Workday's an agent of the employers and if they can be held responsible for any discrimination. But there's something else I'm wondering. Mobley hasn't said which companies he thinks actually discriminated against him, right?

Speaker 2 (07:50):
That's true, and Workday's using that in their defense: Okay, you say you were discriminated against, but by whom? Show us the proof that specific employers were biased against you because you're Black, over 40, or have a disability.

Speaker 1 (08:03):
So Mobley's got a challenge. He has to show the connection: Workday's algorithms plus what specific employers did led to those unfair rejections.

Speaker 2 (08:11):
You got it.
It's not enough to say Workday's software might be biased generally. He needs to show how that bias played out for him across all those job applications. And proving discrimination? Never easy. But proving it's from algorithmic bias, even tougher.

Speaker 1 (08:27):
Totally agree.
So what are the hurdles he's facing in proving his case?

Speaker 2 (08:31):
Well, for starters, he needs data showing exactly how Workday's algorithms were used in his specific applications. What were the screening criteria? What factors were weighted more heavily? How did he score on those Workday assessments? That kind of thing.
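
To picture what that discovery would be after, here is a purely hypothetical sketch of what a weighted screening rule inside such a system might look like. Workday's actual criteria are not public; every feature name, weight, and threshold below is invented for illustration only.

```python
# Purely hypothetical screening rule, invented to illustrate what "screening
# criteria" and "factor weights" could mean in practice. Not Workday's logic.

CUTOFF = 0.65  # invented pass threshold

WEIGHTS = {
    "skills_match": 0.5,             # overlap between resume and job posting
    "assessment_score": 0.3,         # e.g., a personality test score, 0 to 1
    "employment_gap_penalty": -0.2,  # facially neutral, but gaps can correlate
                                     # with disability or caregiving status
}

def screen(candidate: dict) -> bool:
    """Return True if the weighted score clears the cutoff."""
    score = sum(weight * candidate.get(feature, 0.0)
                for feature, weight in WEIGHTS.items())
    return score >= CUTOFF

strong_candidate = {"skills_match": 0.9, "assessment_score": 0.8}
print(screen(strong_candidate))                                     # True (0.69)
print(screen({**strong_candidate, "employment_gap_penalty": 1.0}))  # False (0.49)
```

The point of the sketch: a candidate who aces the skills criteria can still be filtered out by a facially neutral factor, which is exactly why the criteria and weights matter as evidence.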

Speaker 1 (08:44):
I bet getting that data from Workday is an uphill battle.

Speaker 2 (08:47):
Probably, yeah. Companies guard their algorithms closely: trade secrets, that whole thing. Mobley might have to fight tooth and nail for that information.

Speaker 1 (08:55):
Okay, let's say he gets the data.
What does he do with it to prove his case?

Speaker 2 (09:00):
He has to show a pattern of rejections that can't be explained by anything other than his race, age, or disability. For instance, if he consistently aced the skills assessments but kept getting rejected for jobs needing those skills, that could be evidence of bias.
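
As a back-of-the-envelope illustration of why a long streak matters: if a qualified applicant advanced past screening at, say, a 15% rate per application, the chance of zero advances in 100 tries is vanishingly small. The baseline rate is invented here, and a real analysis would have to handle non-independence and differences between jobs, but the arithmetic shows why a 100-for-100 rejection pattern invites scrutiny.

```python
# Back-of-the-envelope only: the 15% advance rate is an invented baseline,
# and this assumes independent applications, which a real expert analysis
# would have to justify or correct for.
advance_rate = 0.15
applications = 100

p_all_rejected = (1 - advance_rate) ** applications
print(f"P(0 advances in {applications} applications) = {p_all_rejected:.1e}")
# -> about 8.7e-08
```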

Speaker 1 (09:15):
It all comes down to showing a clear connection: Workday's algorithms, what the employers did, and the discrimination he faced.

Speaker 2 (09:21):
Exactly, it's a high bar to clear.
But if he can do it, big implications, not just for him but for the whole industry.

Speaker 1 (09:28):
Okay, let's play this out.
He wins the case.
What then?
What happens to Workday?

Speaker 2 (09:33):
Well, there's the money, of course. If the court says they're liable for discrimination, they could have to pay Mobley damages. Could be a lot of money. But the bigger thing is the legal precedent.

Speaker 1:
What do you mean by legal precedent?

Speaker 2:
If Workday loses, this case could open the floodgates for lawsuits against other HR tech companies. Sends a message: you can't just say we're a neutral platform.

(09:53):
You've got a responsibility to make sure your algorithms are fair and don't lead to discrimination.

Speaker 1 (09:58):
So a win for Mobley could change the whole game for
the industry.

Speaker 2 (10:02):
Definitely possible.
Companies like Workday might have to be way more transparent about how their algorithms work, more proactive about checking for bias, and taking responsibility for the decisions their software is involved in.

Speaker 1 (10:13):
That's a huge shift. It shows how important this case really is. It's not just one guy and his job search. It's about the role of algorithms in all our lives. Can technology make inequality worse, or can it challenge it?

Speaker 2 (10:25):
Exactly, and as AI gets more and more powerful,
that debate's only going to get more intense.

Speaker 1 (10:31):
So back to this specific case.
What's next?
Where do things stand now?

Speaker 2 (10:35):
The court's given Mobley a chance to revise his
complaint, provide more specific evidence to back up his claims. It's a crucial moment for him to bolster his case and make those connections we've been talking about.

Speaker 1 (10:46):
He's got to show that concrete link between Workday's
algorithms and the rejections he faced.

Speaker 2 (10:51):
Right.
He needs to prove Workday's actions, as that agent of the employers, directly led to him being rejected from those jobs.

Speaker 1 (10:58):
He's got a lot to do, but if he pulls it off, the
impact could be massive.

Speaker 2 (11:03):
Absolutely. A case worth keeping an eye on.

Speaker 1 (11:05):
This deep dive has been fascinating.
I can't wait to see how it all plays out. Thanks for helping us understand all the intricacies.

Speaker 2 (11:11):
Happy to do it.
Where law, tech, and ethics meet, there's always a lot to think about.

Speaker 1 (11:18):
Okay, we've covered a lot: Derek Mobley's story, Workday's role, disparate impact, the legal arguments. We even touched on what this case could mean for the whole HR tech world and algorithms in general.

Speaker 2 (11:32):
But let's take a step back for a second. What does this case really mean?

Speaker 1:
Good point.

Speaker 2:
It raises big questions going beyond just the legal stuff. How does tech fit into society? Can it do good, or can it do harm?

Speaker 1 (11:41):
Yeah, like, what does fairness mean when algorithms
are involved, and who's responsible when these systems make decisions that have real consequences for people?

Speaker 2 (11:50):
Exactly the questions we need to be asking.
We can't just embrace every new technology without thinking critically about how it might affect people.

Speaker 1 (11:56):
Right.
It's not about saying no to technology.
It's about using it responsibly, ethically, in a way that
benefits everyone.

Speaker 2 (12:03):
Couldn't agree more. And cases like this, as messy and complicated as they are, they help us have those conversations, figure out how to navigate this new world.

Speaker 1 (12:13):
OK, we've looked at Mobley's claims, Workday's defense, what this case might mean for the whole industry. But, as we wrap up, what are the key takeaways for our listeners, especially when it comes to understanding how algorithms might be affecting their own job searches?

Speaker 2 (12:27):
Be aware, that's the biggest thing. Know that algorithms are being used more and more in hiring, and that those algorithms can be biased, even if they weren't meant to discriminate.

Speaker 1 (12:37):
So not paranoia, just being informed.

Speaker 2 (12:39):
Right, know how these systems work, what data they're
using, what blind spots they might have. And speak up, demand transparency and accountability from the companies using this technology.

Speaker 1 (12:50):
That's a great point.
As job seekers, we have a right to know how these decisions are made.

Speaker 2 (12:54):
Absolutely, and the more we know about these systems, the better we can navigate them, make sure they're being used fairly and ethically.

Speaker 1 (13:00):
It's about being empowered, not just letting the
algorithms decide for us.

Speaker 2 (13:04):
Exactly. Technology is a tool. Any tool can be used for good or bad. It's up to us to decide how it's used, to make sure it reflects our values, our goals.

Speaker 1 (13:12):
Well said. A really thought-provoking deep dive. Thanks for sharing your insights.

Speaker 2:
My pleasure.

Speaker 1:
And to all of you listening, thanks for joining us on the Deep Dive. We'll be back soon with another Deep Dive into a topic that'll get you thinking. So, to prove his case, Mobley really needs to paint a clear picture for the court. What kind of evidence will they be looking for, specifically?

Speaker 2 (13:32):
They need to see a direct link, you know, from
those Workday algorithms to the rejections he got. Like, did the system red flag something about Mobley that caused his applications to be automatically tossed out? Did he get consistently lower scores on Workday's assessments, scores that don't match his actual qualifications? That's what the court needs to figure out.

Speaker 1 (13:52):
And on top of that, his legal team has to counter Workday's argument, the whole we're-just-a-neutral-platform thing, that it's the employers who should be held responsible for using the software fairly.

Speaker 2 (14:04):
Exactly.
They have to make a strong case that Workday was more than just a software provider, that they were acting on behalf of those employers, like an agent, and therefore share the blame for any discrimination that happened.

Speaker 1 (14:17):
It all boils down to proving that connection: Workday's actions, the algorithms they created, and the negative impact they had on Mobley's job search.

Speaker 2 (14:26):
That's the heart of the matter.
And if Mobley wins, it could send shockwaves through the whole HR tech world.

Speaker 1 (14:32):
What kind of impact are we talking about?
Paint us a picture.

Speaker 2 (14:35):
Imagine a future where companies like Workday
are required to check their algorithms for bias regularly, to be open about the criteria they're using to screen candidates, and to be held accountable for any unequal impact their software might be having. That's the kind of change this could bring.

Speaker 1 (14:51):
So this case, it could really change how these
companies do business, how they even design their products.

Speaker 2 (14:57):
It's definitely within the realm of possibility,
and it could give job seekers more power too. They could start demanding more transparency and fairness from the companies they apply to.

Speaker 1 (15:06):
It sounds like this case could be a real turning
point in this whole debate about making algorithms accountable in the hiring process. But I'm sure there are some people who worry about too much regulation in this area.
What are some of those concerns?

Speaker 2 (15:19):
Well, some folks argue that too much regulation could stifle innovation, you know, hold back progress in the HR tech sector. The worry is that if companies are constantly looking over their shoulder, afraid of lawsuits about algorithmic bias, they might be less likely to create new and innovative tools.

Speaker 1 (15:35):
So it's a delicate balance, protecting job seekers
from being treated unfairly but also not squashing progress in the field.

Speaker 2 (15:42):
Exactly, that's what makes this case so complicated and so important. We have to face these tough questions head-on and find solutions that encourage innovation while ensuring fairness and justice.
It's not an easy task.

Speaker 1 (15:55):
So, as we wrap up this deep dive, what's the one
key takeaway you'd want our listeners to remember about the Mobley versus Workday case?

Speaker 2 (16:03):
The big takeaway? Don't just sit back and watch. We can't be passive in this new age of algorithms. We need to stay informed, stay engaged, and be willing to ask the hard questions: How is this technology being used? How is it affecting our lives? Those are the questions we need to be asking.

Speaker 1 (16:20):
Well said.
It's about understanding the role technology plays in our world. Thanks again for walking us through this complicated and fascinating case.

Speaker 2 (16:28):
It's been my pleasure. Always enjoy these conversations.

Speaker 1 (16:31):
And to all our listeners, thanks for joining us
on the Deep Dive.
We'll catch you next time with another Deep Dive into a topic that'll get those brain cells firing. Until then, keep those questions coming.