March 17, 2025 • 16 mins

Comment on the Show by Sending Mark a Text Message.

This episode is part of my initiative to provide access to important court decisions impacting employees in an easy-to-understand, conversational format using AI. The speakers in the episode are AI-generated and frankly sound great to listen to. Enjoy!

Age discrimination in the digital workplace takes an alarming turn when algorithms become the gatekeepers of opportunity. The landmark case against iTutor Group reveals how technology can systematically exclude qualified workers based solely on age—with women over 55 and men over 60 automatically rejected by software regardless of their teaching qualifications or experience.
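To make the allegation concrete, here is a minimal, purely hypothetical Python sketch of what an automated age-threshold screen like the one described in the EEOC's complaint could look like. The function names and cutoff table are illustrative assumptions, not iTutor Group's actual code.

```python
# Hypothetical illustration only -- not iTutor Group's actual system.
# Shows how a hiring pipeline could silently reject applicants by age,
# the kind of automated screening the EEOC complaint describes.
from datetime import date

# Thresholds alleged in the complaint: women 55+, men 60+ auto-rejected.
AGE_CUTOFFS = {"female": 55, "male": 60}

def age_on(birth_date: date, as_of: date) -> int:
    """Whole years of age as of the given date."""
    years = as_of.year - birth_date.year
    if (as_of.month, as_of.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def auto_screen(birth_date: date, gender: str, as_of: date) -> bool:
    """Return True if the application advances, False if it is auto-rejected."""
    cutoff = AGE_CUTOFFS.get(gender.lower())
    if cutoff is None:
        return True
    return age_on(birth_date, as_of) < cutoff

# A qualified 56-year-old applicant never reaches a human reviewer:
print(auto_screen(date(1968, 3, 1), "female", as_of=date(2024, 6, 1)))  # False
```

The point of the sketch is that a discriminatory rule can live in a single configuration line, invisible to applicants, which is exactly why algorithmic hiring screens have drawn regulators' attention.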

When applicant Wendy Pincus was rejected, then offered an interview after reapplying with a younger birth date, she exposed a troubling reality facing many older workers in the digital economy. The Equal Employment Opportunity Commission's investigation uncovered evidence that over 200 qualified applicants were similarly denied consideration based on age thresholds programmed into hiring algorithms.

At the heart of this case lies a critical question that affects millions of remote workers: does the traditional distinction between employees and independent contractors still make sense in the digital age? iTutor Group attempted to evade age discrimination laws by classifying its tutors as independent contractors despite controlling their schedules and lesson plans and monitoring their work by video, highlighting how companies may use classification loopholes to circumvent worker protections.

The $365,000 settlement represents more than just compensation—it signals that discrimination laws apply even in virtual workplaces. As remote work continues expanding globally, this case establishes important precedent for how anti-discrimination protections extend into digital environments.

Perhaps most fascinating is technology's dual role as both problem and potential solution. While iTutor Group allegedly used algorithms to discriminate, other companies are now implementing AI to detect and prevent bias in hiring processes—raising complex questions about privacy, ethics, and the future of work. Who's monitoring your job application, and what criteria are they really using to evaluate you?
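For readers curious what the "AI as solution" side might look like in practice, here is a deliberately simple, hypothetical Python sketch of scanning a job posting for language commonly associated with age or national-origin bias. The phrase list and function names are assumptions for illustration, not any vendor's actual product.

```python
# Hypothetical sketch -- not DHI's or any vendor's actual system.
# Flags phrases in a job posting that are commonly associated with
# age or national-origin bias so a human can review them.
import re

FLAGGED_PATTERNS = {
    "age": [r"\brecent graduates? only\b", r"\bdigital native\b",
            r"\byoung and energetic\b"],
    "national origin": [r"\bU\.?S\.? citizens? only\b", r"\bno visa holders\b",
                        r"\bnative English speakers? only\b"],
}

def review_posting(text: str) -> list[tuple[str, str]]:
    """Return (category, matched phrase) pairs worth a human look."""
    hits = []
    for category, patterns in FLAGGED_PATTERNS.items():
        for pattern in patterns:
            for match in re.finditer(pattern, text, flags=re.IGNORECASE):
                hits.append((category, match.group(0)))
    return hits

posting = "Seeking young and energetic tutors. US citizens only, please."
for category, phrase in review_posting(posting):
    print(f"Possible {category} issue: '{phrase}'")
```

Real systems are far more sophisticated, but even this toy version shows why the privacy question raised in the episode matters: the same pattern-matching machinery pointed at applications or social media profiles would be scrutinizing people rather than postings.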

If you enjoyed this episode of the Employee Survival Guide, please like us on Facebook, Twitter and LinkedIn. We would really appreciate it if you could leave a review of this podcast on your favorite podcast player, such as Apple Podcasts. Leaving a review will let other listeners know that you found the content of this podcast important in the area of employment law in the United States.

For more information, please contact our employment attorneys at Carey & Associates, P.C. at 203-255-4150, www.capclaw.com.

Disclaimer: For educational use only, not intended to be legal advice.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
All right, let's dive in.
We've got a pretty interesting case this time around: some legal documents and press releases, all about this company, iTutor Group. iTutor Group. Yeah, they provide online English tutoring.

Speaker 2 (00:12):
Right.

Speaker 1 (00:13):
Mostly connecting tutors here in the US with
students in China.
Yeah, but things get a little messy here because there are accusations of age discrimination.

Speaker 2 (00:23):
It is a fascinating case.
We've got court filings from the EEOC. Right, it's the Equal Employment Opportunity Commission, the agency that investigates these kinds of claims, and from iTutor Group itself.

Speaker 1 (00:35):
Okay.

Speaker 2 (00:36):
We also have some press releases, you know, each side trying to control the narrative, and a docket report which gives us a timeline of what happened in court.

Speaker 1 (00:44):
So a lot to unpack here.
Our goal is to really walk you through what happened, what each side argued, and what it all means for folks who work online. Right, especially when it comes to age discrimination.

Speaker 2 (00:54):
Exactly. To understand the case, we really need to understand how iTutor Group operates. Okay, so we're actually talking about three interconnected companies, all under the iTutor Group umbrella. They were hiring tutors here in the US to teach English online. Right, and these tutors could work from their homes or anywhere with an internet connection.

Speaker 1 (01:12):
Sounds pretty convenient.

Speaker 2 (01:13):
Very much so.

Speaker 1 (01:14):
Yeah, but that's where things take a bit of a
turn.
The EEOC stepped in and filed a formal complaint.

Speaker 2 (01:21):
Right.

Speaker 1 (01:21):
Alleging that iTutor Group was using software to
automatically reject any applicant over a certain age.

Speaker 2 (01:26):
Yeah.

Speaker 1 (01:27):
We're talking 55 for women, 60 for men, pretty
blatant.
Yeah, that's a pretty hard cutoff.

Speaker 2 (01:32):
And a pretty clear violation of the Age Discrimination in Employment Act, which protects folks 40 and over from discrimination in all aspects of employment.

Speaker 1 (01:42):
Gotcha.

Speaker 2 (01:46):
So the EEOC was using this law to argue that iTutor Group was breaking the rules.

Speaker 1 (01:48):
Okay, and they had some compelling evidence to back
that up.
Right, did they highlight a specific person's story?

Speaker 2 (01:54):
Yeah, they did.
They used the story of Wendy Pincus.

Speaker 1 (01:57):
Okay.

Speaker 2 (01:58):
She applied to be a tutor with iTutor Group and was immediately rejected. Wow. But she reapplied using a younger birth date.

Speaker 1 (02:05):
Oh, interesting.

Speaker 2 (02:06):
And guess what?
She was offered an interview.

Speaker 1 (02:08):
Really.

Speaker 2 (02:09):
Yeah, this, combined with data that over 200 qualified applicants over the age of 55 were automatically rejected by the software, really helped build the EEOC's case.

Speaker 1 (02:20):
So how did iTutor Group respond? Did they try to explain these age cutoffs?

Speaker 2 (02:24):
Well, in their response to the complaint, they
denied the allegations point by point.

Speaker 1 (02:28):
OK, so basically pleading not guilty, but did
they offer any specifics to back up their position?

Speaker 2 (02:34):
Not really.
They claimed to have legitimate business reasons for their actions, but those reasons weren't specified in the documents. They also hinted that maybe the applicants hadn't done enough to lessen any potential harm, and they also suggested that the EEOC's demands were a little bit over the top.

Speaker 1 (02:51):
So a lot of legal maneuvering there.

Speaker 2 (02:52):
Yeah.

Speaker 1 (02:53):
But there's one argument they made that I found
really interesting.
They tried to claim that the tutors weren't employees but rather independent contractors.

Speaker 2 (03:02):
This is a key point because it gets to the heart of
how worker classification affects legal protection.

Speaker 1 (03:07):
Right.

Speaker 2 (03:08):
Because the ADEA, that law, protects employees.
Okay, so the question is, are these people employees or not?

Speaker 1 (03:14):
Especially in the world of online work, where more
and more people are working remotely, setting their own hours, taking on these project-based gigs.

Speaker 2 (03:22):
Exactly.

Speaker 1 (03:23):
So iTutor Group, even though they were controlling things like lesson plans, schedules, even videotaping the sessions, they were still trying to say these individuals weren't technically employees.

Speaker 2 (03:33):
That's right, and this really raises a very
interesting question of how we define employees, especially in an increasingly digital world where the line between employee and contractor can be pretty blurry.

Speaker 1 (03:47):
Yeah, we'll have to come back to that because I
think that has huge implications for worker rights. Huge. But before we get too deep into that, let's get back to the iTutor Group case. What happened with all this back and forth? Did it go to trial?

Speaker 2 (03:58):
Well, ultimately, it was settled with a consent
decree.

Speaker 1 (04:01):
Okay, so what does that mean?

Speaker 2 (04:02):
It means iTutor Group agreed to certain terms, but
they didn't admit any wrongdoing.

Speaker 1 (04:08):
Ah, the classic "we're not saying we're guilty, but we'll pay you to go away," right? So what did this settlement involve?

Speaker 2 (04:15):
Well, iTutor Group agreed to pay $365,000, which
was to be distributed among those who were potentially affected by the hiring practices.

Speaker 1 (04:24):
That's a significant amount of money.

Speaker 2 (04:25):
It is.

Speaker 1 (04:26):
But aside from the money, did iTutor Group have to
change anything about how they did business?

Speaker 2 (04:30):
They did. Even though they had stopped hiring tutors in the US by that point, they agreed to put anti-discrimination policies in place, right, and provide training on those policies. Gotcha. And if they ever decided to start hiring in the US again, they would have to notify and interview anyone that they had previously rejected.

Speaker 1 (04:49):
Interesting.

Speaker 2 (04:49):
Particularly those they might have rejected because
of their age.

Speaker 1 (04:59):
You know, this whole situation reminds me of another case involving the EEOC and a company called DHI Group, which runs the website Dice.com. Yes, a pretty popular job search platform. And in that case the EEOC found that the company was allowing employers to post job ads that discriminated against certain people based on their national origin.

Speaker 2 (05:20):
Yeah.

Speaker 1 (05:20):
Specifically whether they were American or not.

Speaker 2 (05:23):
That was a key factor.

Speaker 1 (05:24):
So it wasn't DHI themselves discriminating, right, but they were providing a platform for others to do it.

Speaker 2 (05:30):
Yeah, that's the gist of it.

Speaker 1 (05:32):
And what's interesting is that, as part of their settlement, DHI agreed to use artificial intelligence, or AI, to scan job postings and flag any potentially discriminatory language.

Speaker 2 (05:45):
Yeah, they're trying to essentially automate the
detection of discrimination.

Speaker 1 (05:49):
AI policing job ads.
That's both impressive and kind of unsettling at the same time.

Speaker 2 (05:54):
It is.
It really raises some interesting questions about how technology can be used to combat discrimination. Huge questions, huge questions. Could this type of AI be applied to other parts of the hiring process? What are the ethical implications?

Speaker 1 (06:08):
A lot to ponder there, for sure. But let's circle back to that distinction iTutor Group was trying to make. Right, employee versus independent contractor. Yeah, it seems like that's at the heart of a lot of these issues.

Speaker 2 (06:18):
It is, and in part two we're going to explore exactly why that distinction matters, especially in the context of discrimination laws. We'll also look at what it could mean for the future of work, where more and more people are finding jobs online.

Speaker 1 (06:32):
Looking forward to it.

Speaker 2 (06:34):
Yeah, stay tuned.

Speaker 1 (06:36):
OK, so we're back and ready to dig into this employee versus independent contractor thing.

Speaker 2 (06:41):
Right.

Speaker 1 (06:42):
It sounds like it gets pretty complicated, especially with these online jobs and these discrimination laws.

Speaker 2 (06:47):
Yeah, it definitely can be a thorny issue.
As we mentioned before, the ADEA is there to protect employees.

Speaker 1 (06:53):
Right.

Speaker 2 (06:53):
But when you have someone working as an
independent contractor, those protections might not apply.

Speaker 1 (06:58):
So iTutor Group was basically trying to argue that
they were off the hook because their tutors weren't technically employees.

Speaker 2 (07:05):
Right.

Speaker 1 (07:05):
But it seems kind of tricky because they had so much
control over how those people work.

Speaker 2 (07:09):
Yeah, that's where the legal waters get a little
bit murky.

Speaker 1 (07:12):
Okay.

Speaker 2 (07:12):
Because there isn't one simple test to determine if
someone is an employee or an independent contractor. Courts and agencies like the EEOC look at a lot of different factors.

Speaker 1 (07:22):
So what are some of the things they look at?
Is it just whether you work from home or set your own hours?

Speaker 2 (07:27):
No, no, it's definitely more complex than
that.
One of the most important things they look at is the level of control the company has over the worker. So, for example, if the company dictates the worker's schedule, tells them how to do their job, provides all the tools and materials, right, that points towards an employment relationship.

Speaker 1 (07:45):
And it sounds like that aligns with what we know
about iTutor Group.
They set the schedule, they provided the lesson plans. They even monitored the sessions by video.

Speaker 2 (07:54):
Exactly. Does it sound very independent? No, not really. Okay, so other factors they look at include whether the worker is integrated into the company's operations, the nature of the relationship between the two parties, how payment is handled, and whether the work requires specialized skills.

Speaker 1 (08:11):
Gotcha.
So it's really about the big picture, not just a couple little details.

Speaker 2 (08:14):
Exactly.
It's about the totality of the circumstances, and in this case, even though iTutor Group said these tutors were independent contractors, the EEOC argued, and I think pretty convincingly, that the company had enough control over their work to call them employees.

Speaker 1 (08:32):
Right.
And therefore bring them under the protection of the ADEA. So that's why they settled, even though they didn't admit wrongdoing. Right, they knew they were playing with fire. And this whole situation kind of makes you wonder about gig workers and how vulnerable they are to discrimination.

Speaker 2 (08:50):
Yeah.

Speaker 1 (08:50):
Because if they're considered independent contractors, they might not have those legal protections.

Speaker 2 (08:55):
Absolutely.
That's a huge concern, especially with the rise of remote work and these online platforms. It makes you wonder if our laws are really equipped to deal with the changing nature of work.

Speaker 1 (09:05):
It's not your typical nine-to-five office job anymore.

Speaker 2 (09:08):
No, it's not.
Policymakers and legal experts are definitely grappling with these questions, as technology transforms the way we work.

Speaker 1 (09:15):
I mean, are we even going to have offices in the
future?

Speaker 2 (09:18):
That's a good question, I don't know.
But we need to make sure the laws keep pace to protect workers from discrimination, regardless of where or how they do their jobs.

Speaker 1 (09:28):
So it feels like this case is just the tip of the
iceberg.

Speaker 2 (09:30):
Yeah, I think so.

Speaker 1 (09:32):
What are some of the bigger implications for the
future of work?

Speaker 2 (09:35):
Well, for one, it shows how important it is for companies to be really careful about their hiring practices, even if they operate online or use remote workers. They can't just hide behind algorithms or automatically label everyone as a contractor.

Speaker 1 (09:49):
Right, and for anyone out there looking for work
online or in the gig economy, it's a good reminder to know your rights and to understand the risks of being an independent contractor.

Speaker 2 (09:59):
Absolutely.
This case also raises important questions about how technology can both contribute to and potentially help fight discrimination.

Speaker 1 (10:06):
A double-edged sword.

Speaker 2 (10:07):
It is.
We saw how iTutor Group allegedly used software to screen out older applicants, right, but we also saw how DHI Group is using AI to try to prevent discriminatory language in those job postings.

Speaker 1 (10:20):
So technology can be used for good or for bad, and it's up to us to make sure it's being used responsibly.

Speaker 2 (10:25):
I completely agree.
You know, speaking of DHI and using AI to monitor job postings, this brings up a really interesting question. Okay, could similar technology be used to monitor other aspects of the hiring process?

Speaker 1 (10:40):
You mean like analyzing application data,
interview transcripts, even social media profiles, to look for potential bias?

Speaker 2 (10:49):
That's right.

Speaker 1 (10:50):
It's a fascinating idea.

Speaker 2 (10:52):
It is.

Speaker 1 (10:53):
But also a little bit scary.

Speaker 2 (10:54):
It does raise some serious questions about privacy
and the potential for these systems to be misused.

Speaker 1 (11:00):
For sure.

Speaker 2 (11:00):
But, on the flip side, if these AI systems are designed and used responsibly, with transparency and fairness in mind, they could be incredibly powerful tools in the fight against discrimination.

Speaker 1 (11:11):
So it's a tough balance to strike: harnessing the power of technology without compromising our values or our individual rights.

Speaker 2 (11:20):
That's the challenge and the opportunity I think
we're facing.

Speaker 1 (11:23):
It is.

Speaker 2 (11:23):
Hopefully this deep dive into the iTutor Group case and the broader issues we've discussed has given you a better understanding of the complex issues at play.

Speaker 1 (11:32):
It has.

Speaker 2 (11:33):
When it comes to age discrimination, employee classification and the role of AI in hiring.

Speaker 1 (11:40):
It feels like uncharted territory.

Speaker 2 (11:42):
It does.

Speaker 1 (11:42):
As technology keeps changing the way we work.

Speaker 2 (11:44):
It is definitely changing rapidly.

Speaker 1 (11:46):
But this case gives us some valuable insights and things to think about as we move forward.

Speaker 2 (11:51):
I agree.

Speaker 1 (11:51):
Wow, we've covered a lot of ground in this deep dive.

Speaker 2 (11:54):
We have.

Speaker 1 (11:55):
Age discrimination, employee classification, AI in
hiring.
It feels like everything's sort of converging in this world of online work.

Speaker 2 (12:03):
Yeah, it really does.
The iTutor Group case really just highlights all these challenges we face as our laws, and really our understanding, try to keep up with this digital world.

Speaker 1 (12:12):
It's like trying to fit a square peg in a round hole. Right, we're trying to apply these old rules to this new way of working. I mean, think about it.

(12:39):
The ADEA was written long before anyone imagined the gig economy. Maybe it should be updated to specifically include independent contractors, or maybe we even need to create new laws tailored to the gig economy.

Speaker 2 (12:46):
Yeah, those are definitely possibilities.
Updating the legal framework could give us clearer guidelines and stronger protections for workers who don't fit those traditional employment models.

Speaker 1 (12:56):
But legislation can be a slow process.

Speaker 2 (12:58):
It can be very slow.

Speaker 1 (12:59):
So, while we wait for the lawmakers to catch up, what
can we do?

Speaker 2 (13:02):
Well, the courts will continue to play a crucial role. They'll be the ones interpreting these laws and applying them to new situations.

Speaker 1 (13:09):
Right, like we saw with iTutor Group.

Speaker 2 (13:11):
Exactly. As more cases like this go through the system, they'll help establish precedents and refine our understanding of how these laws work in the digital world.

Speaker 1 (13:20):
So it's a combination of legislation and court decisions that will really shape the future of work in this new landscape.

Speaker 2 (13:26):
That's right, and it's not just up to lawmakers
and judges.
Companies have a responsibility too. They need to make sure their hiring practices are fair, no matter how they classify their workers.

Speaker 1 (13:39):
So what can they do? What are some steps companies can take?

Speaker 2 (13:40):
Well, for one, they can be more proactive about designing algorithms and AI systems that minimize bias.

Speaker 1 (13:47):
Right.

Speaker 2 (13:47):
Being transparent and thinking carefully about potential problems is key. Right, but even beyond that, they can foster a culture of inclusivity, valuing diversity and equal opportunity for everyone.

Speaker 1 (13:58):
It sounds like we all have a role to play in creating
a fairer future of work.

Speaker 2 (14:02):
We absolutely do. You know, before we wrap up, I want to circle back to the idea of using AI to monitor for bias in hiring. Right. It's a complex issue and it has a lot of potential, but it also raises ethical questions.

Speaker 1 (14:15):
You know I'm still a little unsure about that.
I see the potential benefits.

Speaker 2 (14:18):
Yeah.

Speaker 1 (14:18):
But the idea of AI scrutinizing applications, interviews, even social media profiles feels a little too Big Brother-ish.

Speaker 2 (14:27):
I understand your concerns.
It's definitely something we need to approach carefully. We know that AI can sometimes perpetuate existing biases if it's not developed thoughtfully, right. But if we can build AI systems that are transparent and accountable and designed with fairness as a core principle, then they could be really powerful tools to fight discrimination.

Speaker 1 (14:48):
So it's about finding that balance.

Speaker 2 (14:50):
It is.

Speaker 1 (14:50):
We want to use technology without sacrificing
our values or our rights.

Speaker 2 (14:55):
Exactly, that's the challenge and that's the opportunity. We've got a lot to think about. Hopefully, this deep dive into iTutor Group and all these broader issues has given you a better understanding of what we're dealing with.

Speaker 1 (15:05):
Oh, it definitely has. It's a reminder that the old rules don't always apply in this new world of online work. They don't, and it's a call to action for everyone: lawmakers, judges, companies and individuals. We need to be aware, think critically and work together to make sure there's fairness and opportunity in this digital age.

Speaker 2 (15:23):
I couldn't agree more. The future of work is being written right now, and it's up to all of us to make sure it's a future where everyone can thrive. Thanks for joining us on this deep dive. Until next time.