
January 14, 2025 · 35 mins

James Mackey and Steve Bartel explore how AI is transforming high-volume hiring. Learn how companies can use AI to enhance sourcing, streamline processes, and navigate compliance while balancing innovation with human oversight.


Thank you to our sponsor, SecureVision, for making this show possible!


Our host James Mackey

Follow us:
https://www.linkedin.com/company/82436841/

#1 Rated Embedded Recruitment Firm on G2!
https://www.g2.com/products/securevision/reviews

Thanks for listening!



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, welcome to the Breakthrough Hiring Show.
I'm your host, James Mackey.
We've got Steve Bartel back with us today, founder and CEO of Gem.
What's going on, Steve?

Speaker 2 (00:06):
Hey, it's great to be here.
Thanks for having me, James.

Speaker 1 (00:09):
Yeah, of course, looking forward to today.
As always, we have a lot of fun recording.
So, to start us off, we wanted to talk about AI applications in hiring, as you put it: what's real and what's not, what's actually driving value. I think particularly of when LLMs came out.
I guess ChatGPT.
It's been a couple of years, right?

(00:29):
Has it been over two years?

Speaker 2 (00:32):
Yeah, it's been probably just a little over two
years now.

Speaker 1 (00:39):
That's wild.
Time doesn't even make sense to me anymore.
I can't tell if something's been like nine months or two years.
It's just all a blur. In tech, everything moves so fast, right? Yeah, I guess it has been a couple of years as well.
But anyways, I think we saw initially a fair amount of companies get funding, or startups pop up even if they're bootstrapped, with some use cases that probably weren't as valuable as folks maybe initially thought they would be, and you see some of those

(01:02):
companies just disappear, right.
Really, they might have even been more of a product feature, not even necessarily a full point solution, certainly not a platform like Gem is creating.
So I think that this is a great place to start.
Let's talk about different AI use cases in hiring and try to share some insight with our audience on what we feel is most

(01:25):
valuable versus what is not really going to help companies achieve their goals, and might be more of a headache to implement than it's worth, or really is just a very small feature, a nice-to-have that comes with purchasing a bigger product.
So maybe you could start us off here.
What are your thoughts on this topic?

Speaker 2 (01:48):
Yeah, absolutely, and this is a super timely and important topic in terms of what's real and what's not with AI, because so many folks are looking to deploy AI. They've realized that the technology has gotten a lot better, but it's not entirely clear which use cases are real and which ones aren't.
And, to your point, there've been a lot of things

(02:11):
that have looked pretty promising but maybe have fallen short, but there are also dozens more popping up every single week, right, and so it's really hard to separate what's real from the noise.
Maybe, zooming out a little bit: a lot of industries, when there's a major tech disruption, go through what Gartner

(02:33):
calls the hype cycle.
Have you heard of this one, James?
The hype cycle.

Speaker 1 (02:38):
I have not heard of that.

Speaker 2 (02:39):
Essentially it's this idea that markets enter four different phases when it comes to really disruptive technology.
And the first phase is what's called the peak of inflated expectations, and that's when excitement's running really high.
People are really excited about all the different use cases but aren't really sure which ones are going to be real or not.

(02:59):
It isn't really backed by actual customers having driven value over an extended period of time.
So that's where we are today, in my mind: it's hard to separate what's real from what the noise is. And unfortunately, almost every major technology disruption from there enters the trough of disillusionment.

(03:20):
Disillusionment, because buyers and the market enter this phase where all of those lofty expectations (and they've been sold dozens of different things from all different angles) aren't met, and actually the market perception about this new technology pivots to be overly negative compared

(03:40):
to that overly positive, hyped, inflated-expectations part of the cycle.
Gradually things start to rebound; it's called the slope of enlightenment.
As the products get better, the use cases get more defined, the industry really understands what's real and what's not, and then things eventually level off in a much better place at the

(04:01):
end of all this.
But I would say we're at that peak of inflated expectations moment, where people are so excited about the possibilities but it's not 100% clear which of the use cases will be real or not.
Does that kind of resonate in terms of how these markets go through cycles?

Speaker 1 (04:21):
That definitely resonates with me.
It's nice to hear that laid out.
It sounds very intuitive and it makes sense.
I don't know if I really spent time to think through that, but yes, it's certainly aligned with what I'm seeing in the market.
I am seeing, at this point, teams really start thinking

(04:41):
they're really considering implementing AI into their workflows in 2025.
So I am starting to see more of those conversations occur in the past few weeks or so, talking with teams as they're doing planning.
So maybe we're entering that next phase, the slope of enlightenment. I don't know, maybe not. But I am starting to see folks overcorrect and think, oh, AI is really maybe

(05:02):
not going to make a lift, or this is really just note-taking.
It seems like all we're doing is organizing feedback and summaries, and I think, particularly in recruiting, people are just getting a little bit like, okay, this is just another AI note-taker, essentially.
They just feel like the market was flooded with that kind of content creation, which, of course, has its place, but

(05:23):
the market was just flooded with it.
But now I think, yeah, people are starting to dial in, and some of the top organizations, maybe not across the entire market, are starting to identify the use case that makes sense for their business.
Maybe that's what also makes it more of a nuanced conversation, and I think we can get into our products as well later on, but for us, in terms of our customers,

(05:45):
based on the business, the same use case may not be valuable for two different companies.
So I think it's also been a learning curve on not only which use cases are real, but which use case is actually valuable for an individual business, where there is a lot more nuance than maybe folks, including myself, realized.

(06:06):
I thought, okay, where's going to be the most disrupted place across every type of company?
And I don't think that's actually the case anymore.
I think the top use case for a SaaS company at startup or growth stage is going to be very different than the top use case for a staffing company.

Speaker 2 (06:22):
Yeah, totally agreed, a hundred percent, because these different types of customers need different things and care about different things.
So it's interesting in terms of what's real and what's not.
I think there's just this proliferation of tons of different companies doing things with AI.

(06:43):
I've tried to think deeply about how, as a buyer, you could understand at a glance whether something's going to work or not.
I have a few litmus tests that I think are really helpful.
So the first is: the thing that's gotten a lot better with LLM technology is anything text-based, and so there's a lot

(07:05):
of applications, actually a lot of really exciting ones, in recruiting as it relates to things that involve text.
The first and most obvious place is resumes, right, because at the end of the day, somebody's resume or online profile, their experience, their education history, the things they've worked on, all of that is just text, and so

(07:25):
there's a ton of really great applications on top of resumes.
We can dive into those in a sec.
But I think the other place that's just really rich when it comes to text and human language is the interface between companies and candidates, both as it relates to the

(07:46):
messages that are sent, but also, maybe even more excitingly, the conversations that are being had. And so I do think that's why a lot of the AI note-takers have really spun up, and they are driving real value.
But, taking that a step further, there's probably something pretty interesting to be done in the conversational

(08:09):
interface between recruiting teams and candidates, maybe even assessments and things like that, in terms of use cases being real.
So the interesting thing is, if a use case in recruiting doesn't involve natural language or text or things like that, there actually isn't anything that's fundamentally

(08:30):
changed about the technology that would make those other AI applications 10x better than they were two years ago.
Does that make sense in terms of an easy litmus test for which use cases could be real?

Speaker 1 (08:42):
Yeah, yeah, for sure, cool.
What I wanted to dial in on, too, is a product or a feature that you recently released at Gem that incorporates AI as well.

(09:03):
You talked about a couple. Which one was the first one we discussed?

Speaker 2 (09:06):
Yeah, totally. So we have three awesome new products leveraging LLMs and the next generation of AI technology, and it's all focused on use cases that have to do with resumes. Essentially, what we've done is we've built this matching and ranking layer, leveraging the new LLMs, where recruiters,

(09:30):
hiring managers, folks can enter the criteria that they care about for a role, and then we can use the new LLM technology to match and rank resumes based on that criteria.
And we're applying that to a few different really valuable use cases, and that's where those three different products

(09:51):
come in.
But all of it is built on that same underlying matching and ranking technology, built on top of the new advancements in LLMs.
And so the first product that we launched, back in the summer, was around AI sourcing. You tell us what criteria you care about for a role, and then a sourcing bot will get to work for you, scanning the hundreds of

(10:12):
millions of public profiles out there to surface folks that could be a good fit for the role you opened up. It's super complementary to all of the sourcing and CRM technology that Gem had already built, so it felt like a really natural place for us to start.
Then we took that same technology and we applied it to

(10:33):
your inbound applicants.
So, again, tell us the objective criteria that you care about for a role. But better yet, we'll actually ingest your job description. We'll ingest the spec doc that maybe you've collaborated with a hiring manager on, and we'll create that criteria for you as a first pass, using our best practices embedded directly into the AI. And then AI will help you identify the folks that are

(10:59):
applying that might be the best fit, so you can get to them first.
And this is really impactful because something like 20% of our customers have thousands of applications for a single role, and we've seen inbound applications increase more than 3x year over year across our customer base, which is a

(11:19):
tremendous amount of inbound that companies are dealing with, especially given the fact that so many teams these days are being asked to do more with less.
The third thing, coming in the next few months for us, is that same AI technology applied to

(11:39):
candidate rediscovery on top of your ATS and on top of your CRM. And when you combine all three of these things together, the vision is: for every role you open up, you enter that clear, objective criteria once. AI will get to work immediately on your current inbound applications for that role. Then, as soon as you've

(12:01):
exhausted that inbound queue, it'll go and find the people that could be a good fit from your CRM and your ATS.
And then, finally, if you still need help filling that role, an AI sourcing bot will get to work for you to go find external candidates.
And that's how all these things come together.
The great thing is you just configure that criteria once and it works across all the different channels. And, yeah, I think it's

(12:23):
going to have a huge impact in terms of helping companies fill those roles faster and with the most qualified candidates.
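
[Editor's note: to make that "configure the criteria once, run it across every channel" flow concrete, here's a minimal sketch in Python. The function names, the 0.7 threshold, and the toy keyword scorer are all assumptions for illustration, not Gem's actual product code; the point is just that one set of criteria drives inbound review, then ATS/CRM rediscovery, then external sourcing, in priority order.]

```python
# Hypothetical sketch only: one criteria list applied across channels in
# priority order (inbound -> ATS/CRM rediscovery -> external sourcing).
from typing import Callable, Iterable

def fill_role(criteria: list[str],
              channels: list[tuple[str, Iterable[str]]],
              score: Callable[[str, list[str]], float],
              needed: int = 5, threshold: float = 0.7) -> list[str]:
    """Walk the channels in order, stopping once enough profiles match."""
    shortlist: list[str] = []
    for channel_name, profiles in channels:
        for profile in profiles:
            if score(profile, criteria) >= threshold:
                shortlist.append(f"{channel_name}: {profile}")
            if len(shortlist) >= needed:
                return shortlist  # inbound may fill the role before sourcing runs
    return shortlist

# Toy scorer standing in for the LLM matcher: fraction of criteria mentioned.
def toy_score(profile: str, criteria: list[str]) -> float:
    return sum(c.lower() in profile.lower() for c in criteria) / len(criteria)

channels = [("inbound", ["Python dev, 5 yrs"]),
            ("ats_rediscovery", ["Java dev, 8 yrs"]),
            ("sourcing_bot", ["Python + SQL analyst"])]
print(fill_role(["Python"], channels, toy_score, needed=2))
```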

Speaker 1 (12:32):
Yeah, that's pretty cool, and the one that resonates the most with me is the inbound applicants one.
We are growing, and that's exactly it. As we've talked about offline, I'm founding a recruiting tech company called June, and essentially what June does is it

(12:52):
helps companies hire faster by screening all inbound applicants, so it's turning high-volume applicant pools into qualified candidate shortlists. That's primarily in high-volume environments, staffing companies, though we think there might be applications for large enterprise companies and retail, consumer goods, manufacturing. Anywhere you see high-volume inbound applications, you could probably leverage this solution.

(13:13):
But we're really dialing into staffing, and we definitely see that as an area where there's an opportunity for disruption, just given, again, the amount of inbound applicants and the fact that a lot of companies simply can't get to reviewing all of them.
So I like that use case a lot. And I've been hosting on the

(13:36):
show this AI for hiring series over the past several months, speaking with folks that are building different AI use cases, right. And this one in particular, managing inbound candidate flow, really making the most of it, giving every candidate the opportunity to be heard, and evaluating in a consistent, unbiased way, is really exciting.
And it's just thinking about the sheer number of hours it would take to screen every inbound applicant, which makes it impossible, or the sheer number of recruiters or sourcers it would take and the cost associated with that.
It just isn't possible.

(14:20):
And particularly for high-volume candidate pools, a lot of times with high-volume jobs, depending on the roles, it may not be clear from the resume whether or not somebody's a fit.
They may not have a resume that outlines everything they have experience with.
And then there are other people who stuff the resume with keywords.

(14:43):
That's what June is about: what we're building gets to the bottom of that. It clarifies, and then hiring teams can really dial into the resumes of folks that are essentially already qualified.
But yeah, I think it's a really interesting use case in general, the inbound applications and how to leverage AI to make that a much better experience.

Speaker 2 (15:00):
Yeah, totally, and if I can just expand on that: it really sounds like what we've built is a lot more focused on the in-house recruiter pain points for knowledge worker hiring, tech hiring, large enterprise hiring.
What you're focused on sounds uniquely valuable, both for higher-volume roles but also for staffing agencies. And I think

(15:24):
one of the key distinctions is, it sounds like with Gem, what we're doing is helping identify the folks from the resume that could be the strongest fit, so that recruiting teams can go focus on those folks. And for knowledge work hiring, for tech hiring, for large enterprises that are

(15:44):
hiring a bunch of knowledge workers, a lot of those resumes are pretty built out, so you have a lot of the information you need. But also, the actual assessment of whether that person's a good fit (today at least) might be better handled by a human hopping on a call, because when you're trying to understand and assess those candidates, recruiting tends to be way more high-touch, but also understanding the

(16:07):
complexity of their background might just be a little bit more nuanced for some of these knowledge worker roles, tech roles.
Now, for higher-volume roles, it does feel like there's a big opportunity to take that a step further and leverage AI in the actual assessment piece, which it sounds like y'all are focused on, which makes a lot of sense to me.

Speaker 1 (16:31):
Yeah, I think it's applicant pools where you have thousands of people applying and not necessarily the most clarity from the resumes. But also, there are industries and reqs out there where literally 75% of applicants are not even being viewed.
And the other thing is, I love the use case for staffing specifically because it puts June in a revenue budget bucket. Often in talent

(16:56):
acquisition, HR tech, we're put into this cost bucket.
We have to fight our way out of this perception that it's just a cost center, an overhead center, and keep explaining the value of people being your biggest asset and the biggest investment that you're making.
And so when you're making, in most cases for our customers,

(17:18):
literally eight-figure investments in payroll and people, this is a massive investment and you want the best ROI; you should have the best recruiters and the best technology.
And so we make those arguments and folks get it. But what's cool about June is, again, when staffing companies make more placements, they make more money, right.

(17:39):
So I love that for June, because we have been talking with staffing agencies, and the ones that are really interested operate in industries that are high volume, right.
Where it's just not possible for them to quickly look at all applications, or it's not possible for them

(18:00):
to look at resumes and truly identify who's worth having a conversation with. Or they're in a space where it's difficult to schedule, there's a lot of rescheduling, and then in the first five minutes of the screening call you ask your knockout questions and realize, okay, now I've got to jump off this call.
So for those types of use cases, people are really jumping on it.

(18:20):
It's been pretty easy, honestly, to engage with staffing leaders, explain the value, and get them to move forward, and I think that's our primary focus on go-to-market.
But I do think there are some in-house applications, again where it's really high volume. Though I don't necessarily know; that's one thing actually we should talk about from a product

(18:41):
strategy perspective, how dialed into staffing to be.
But I think that is our focus. But who knows, I could see it also for some in-house teams if it's really high-volume, entry-level, potentially blue-collar roles, right.

Speaker 2 (18:53):
Totally, and I could see it for both. But I do love the angle that you're taking with staffing to start, because those folks uniquely care about the bottom line. And, like you said, maybe 75% of these folks aren't even getting reviewed, and it's

(19:14):
not obvious which ones of those could be really killer fits, because so much of the time the resumes are sparse, so you don't even have much information to go on as to which folks you should even be spending time with, right.
So this vision that every single candidate could get an AI assessment or an AI interview, whether that's over text or over AI voice

(19:34):
in the long term, feels pretty compelling to me for the high-volume use case, where resumes are sparse and you don't always know about somebody's background until you hop on a call and ask these questions, but also where the types of questions you need to ask tend to be a little bit more well-defined and scoped. And especially for staffing agencies:

(19:55):
if you could just help turn a small percentage of that 75% that aren't being touched into placements, that's a ton of revenue for them.

Speaker 1 (20:05):
Yeah, and what's really exciting is that it's directly tied to revenue growth and actually healthier margins in general.
I think from a go-to-market perspective, we're going to be focusing more on top-line growth and value creation from that angle. But the reality is that services companies are valued on EBITDA, not just top-line growth.
In fact, you'd rather have a company at $5

(20:27):
million that generates a million in EBITDA than a company at $10 million that generates $1 million in EBITDA.
Of course, I guess that's true of any company, but with services the emphasis is really on that margin and the consistency of that margin.
And so what's cool is that, from a cost perspective, we can make

(20:48):
managing all these screenings something like 10x cheaper.
It's so much cheaper than hiring screeners at that entry level for the blue-collar or high-volume searches. There's still a need for a recruiting team, but a little bit further down the funnel, once folks are qualified. So it's decreasing that cost

(21:08):
and it's increasing revenue.
So it's creating a really nice spread for staffing companies, which traditionally have really low margins.
And again, just because a staffing company has a lot more revenue doesn't necessarily mean they have a higher valuation or that they're actually making more money, because the more you grow a staffing agency, the more overhead you have, significantly more, because

(21:32):
people are delivering the solution and people are managing. There's a lot of management oversight that needs to occur, and so it's just very heavy in terms of org charts and expanding your leadership and management team.
It's so cost-burdensome that staffing companies really have to look at how to limit costs as much as possible. So I love it.

(21:52):
It feeds into both of those needs, right? Talking about budget and cost is probably, I think, more important to staffing decision makers, maybe even more so than SaaS, right?
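
[Editor's note: a quick worked version of the margin point above, using the episode's own numbers. EBITDA margin is simply EBITDA divided by revenue:]

```latex
\[
\text{EBITDA margin} = \frac{\text{EBITDA}}{\text{revenue}}:
\qquad
\frac{\$1\,\text{M}}{\$5\,\text{M}} = 20\%
\quad\text{vs.}\quad
\frac{\$1\,\text{M}}{\$10\,\text{M}} = 10\%
\]
```

Same dollar of EBITDA, but the $5 million company runs twice the margin, which is why services buyers weight margin and its consistency over raw top line.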

Speaker 2 (22:04):
Yeah, totally, James. I'm curious, and I'm sure y'all run into this a lot.
This is certainly top of mind for our customers as it relates to AI.
Everybody realizes they need it.
They're starting to figure out which of these use cases are real and which ones aren't.
The next thing on a buyer's mind is the shifting

(22:26):
regulations and compliance.
How do you think about that? What do you see happening there?

Speaker 1 (22:32):
Yeah, so we actually had a guest on the show a couple of episodes ago, Wes Windler, the founder and CEO of Woven. Woven is actually interesting because it's a combination of technology and services, and it actually sounds like it's somewhat services-heavy to deliver on their solution.
But what I found interesting about Wes is that

(22:55):
he had an interesting take on how to navigate regulation as a company incorporating AI into the hiring process, both for products and for potential customers. And basically what he was saying is that we have to get clear on how we define evaluation and AI evaluations,

(23:18):
because he's saying there's nuance in the definition.
We have to really be clear on that. Because, is it considered evaluation if the hiring team puts together the role requirements and all the AI is doing is matching somebody's

(23:39):
background to those role requirements?
Is that considered evaluation? Or is it evaluation only if a hiring team says, I need to hire an engineer, and the AI puts together a job description and says, here's what you need? There, it's defining the role requirements itself and then doing the match based on the requirements it defined. So that's the interesting nuance: what are we actually considering

(23:59):
AI evaluation, and, from a regulatory perspective, how might that nuance impact how products are built?
I thought that was an interesting conversation.

Speaker 2 (24:08):
Yeah, and it's a really interesting distinction, as I'm just thinking about what you're building and your use case.
Companies a lot of the time have very simple knockout questions, especially for higher-volume roles where you just might need a certain certification to actually do the role, and so companies require that, or you might need a very specific skill. And if somebody doesn't have that

(24:31):
on their resume, and y'all are helping companies understand whether they match those very basic criteria that are defined by the hiring team, through conversational AI or an AI interviewer, you're not actually doing assessment in that world, necessarily.
You're just pulling out whether they match these

(24:55):
criteria that the hiring team's already defined, right. And so, at least to me, intuitively, that should be fine. It is a little bit of a gray area, but yeah, I can totally see where you're going with that one.
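
[Editor's note: a minimal sketch of that boundary in Python. The Knockout structure and the forklift example are hypothetical; the point is that the questions and required answers come entirely from the hiring team, and the AI's only job is to extract the candidate's answers, not to decide what matters.]

```python
# Hypothetical sketch only: team-defined knockout checks, no AI-defined criteria.
from dataclasses import dataclass

@dataclass
class Knockout:
    question: str         # asked by the conversational AI / AI interviewer
    required_answer: str  # defined by the hiring team, never by the model

def passes_knockouts(answers: dict[str, str], knockouts: list[Knockout]) -> bool:
    """True only if every hiring-team-defined knockout is satisfied."""
    return all(
        answers.get(k.question, "").strip().lower() == k.required_answer.lower()
        for k in knockouts
    )

knockouts = [Knockout("Do you hold a current forklift certification?", "yes")]
print(passes_knockouts(
    {"Do you hold a current forklift certification?": "Yes"}, knockouts))  # True
```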

Speaker 1 (25:08):
Yeah, it's a nuance, and it's about being on the same page on a definition, making sure everybody's speaking the same language on what we're considering assessments and what we're considering an AI-driven decision. Is AI making a decision if it's just taking role requirements? In June's case, before we had really dialed in on the staffing high-volume use

(25:28):
case, we thought, okay, maybe it would be valuable for somebody to just say, hey, I really need to hire an engineer for my startup, this is the project scope, and June pushes out a job description and comes up with custom questions that could be used for screening the applicant.
And we got away from that because, look, we're working with experts.
Our customers know what they want to ask. They know the role

(25:55):
requirements.
They're not looking for a tool to teach them those things.
They already know those things with a lot more clarity and expertise than an LLM has, right?
So now the shift is: the customers are giving us the role requirements. They're giving us the questions, the knockout questions, right, that they want asked, the questions that determine in five minutes whether you should continue on

(26:15):
the screening call. It's eliminating all of those conversations where you get off in five minutes, and that's best for everyone; it's best for the candidates and the recruiters.
There's just no point in having those conversations if there isn't a high-level fit, right? And I think what I'm seeing is that, ultimately,

(26:35):
evaluation is going to be when AI is defining what is important.
That feels like evaluation. Versus matching, which is essentially just more sophisticated and fluid matching with an LLM, if you're just pulling role requirements from a hiring team rather than coming up with what is going to make somebody successful in a role.
And you actually had a good point too, in terms of being able

(26:58):
to track how LLMs actually come to conclusions and how that's different from previous generations of AI.
So maybe you could talk about that a little bit too.

Speaker 2 (27:08):
Yeah, totally.
I think one of the cool things about how we've built our matching and ranking leveraging LLMs is, first off, just like you're saying, recruiters and hiring teams have full control over what the criteria are, based on their intimate knowledge of the role and what they're looking for.
And so they enter their five to ten pieces of criteria, and they

(27:31):
can even say, hey, these are must-have pieces versus nice-to-have, so we'll score those more.
And then all the LLM is doing in the background is looking for evidence on the resume that they match those criteria, and coming up with a score based on how much of the criteria they match.
The cool thing about that is all of it is transparent to

(27:52):
the end user.
So when we find those matches for you, we'll tell you this person is a 0.8 match and here's the breakdown of exactly the criteria they match.
And we can even point to where and why they match on the resume.
And so that still gives recruiters full control over the criteria they're inputting, and gives them full visibility

(28:14):
into the algorithm and why it's making the decision it is.
And then if, for whatever reason, the recruiter assesses that this person isn't the right match, or actually these other people are a stronger match, they can use that information to go adjust the criteria and make it better, again giving them full control.
Taking that a step further,

(28:34):
one of the cool things with LLMs is we can actually give you advice on the criteria and flag certain criteria that might be more biased, that recruiters might not even think about.
So we can build some best practices, run the criteria that the recruiters are entering through those best practices, and then have some guidance on the side as they're setting up their

(28:57):
criteria.
So ideally we even do better than what a recruiter would do on their own.
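
[Editor's note: here's a minimal sketch, in Python, of the kind of transparent, criteria-weighted scoring described above: recruiter-entered criteria with must-have weighting drive the score, the LLM only supplies per-criterion evidence, and the breakdown stays inspectable. Every name here, the 2:1 weighting, and the keyword stand-in for the LLM judge are assumptions for illustration, not Gem's implementation.]

```python
# Hypothetical sketch only: recruiter-defined criteria, LLM-found evidence,
# and a transparent weighted score (e.g. "this person is a 0.8 match").
from dataclasses import dataclass
from typing import Callable

@dataclass
class Criterion:
    text: str
    must_have: bool  # must-haves weigh more than nice-to-haves

def score_resume(resume: str, criteria: list[Criterion],
                 judge: Callable[[str, str], tuple[bool, str]]) -> dict:
    """Return a 0-1 match score plus a per-criterion evidence breakdown."""
    weights = {True: 2.0, False: 1.0}  # assumed must-have vs nice-to-have weighting
    earned = total = 0.0
    breakdown = []
    for c in criteria:
        matched, evidence = judge(resume, c.text)  # the LLM's only job: find evidence
        total += weights[c.must_have]
        earned += weights[c.must_have] if matched else 0.0
        breakdown.append({"criterion": c.text, "matched": matched, "evidence": evidence})
    return {"score": round(earned / total, 2), "breakdown": breakdown}

# Keyword stand-in for the LLM judge, so the sketch runs end to end.
def keyword_judge(resume: str, criterion: str) -> tuple[bool, str]:
    hit = criterion.lower() in resume.lower()
    return hit, ("found on resume" if hit else "no evidence found")

criteria = [Criterion("Python", must_have=True), Criterion("Kubernetes", must_have=False)]
print(score_resume("5 years of Python backend work", criteria, keyword_judge))
# -> score 0.67 with a breakdown the recruiter can inspect and adjust against
```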

Speaker 1 (29:02):
That's really cool.
So that's actually a use case I haven't specifically heard before: pulling in the requirements from the recruiting team and then aligning them with a list of best practices that, I'm assuming, you're providing to the LLM, saying, hey, make sure to look for these types of things.

(29:22):
That's really cool.

Speaker 2 (29:23):
Yeah, exactly.
So, two examples.
We can ask LLMs: does this specific criterion risk introducing bias? And give some feedback as to why.
Now, obviously, the recruiter is still in final control of what they put in the criteria. I think that's still really important, but at least we can nudge them in the right ways if they might be introducing some

(29:43):
risk there.
We can also do things to help them just set up more successful searches.
For example, if they enter criteria that are subjective in nature instead of objective based on someone's resume, like "good collaborator" or something like that, how would anybody be able to tell that from a resume?

(30:05):
We can guide them towards: hey, that's probably better asked and dug into in that initial recruiter screen than as a criterion you input for matching and ranking.
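
[Editor's note: a sketch of what that criteria "linting" could look like. The rubric text, the JSON shape, and the function name are illustrative assumptions, not Gem's actual prompts; the structure just mirrors the two checks mentioned, bias risk and whether a criterion is even verifiable from a resume.]

```python
# Hypothetical sketch only: run each recruiter-entered criterion past an LLM
# armed with best-practice guidance, and surface flags before the search runs.
import json
from typing import Callable

RUBRIC = """You review recruiting search criteria.
Flag a criterion if it (a) risks introducing bias (e.g. age, gender,
school prestige, employment gaps) or (b) is subjective and cannot be
verified from a resume (e.g. "good collaborator").
Reply as JSON: {"flag": "bias" | "subjective" | "ok", "why": "..."}"""

def check_criterion(criterion: str, complete: Callable[[str], str]) -> dict:
    """`complete` is any prompt->text function; the real LLM call is assumed."""
    return json.loads(complete(f"{RUBRIC}\n\nCriterion: {criterion}"))

# Canned completion so the sketch runs without an API key.
def fake_llm(prompt: str) -> str:
    return ('{"flag": "subjective", "why": "Not verifiable from a resume; '
            'probe it in the recruiter screen instead."}')

print(check_criterion("good collaborator", fake_llm))
```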

Speaker 1 (30:15):
Yeah, yeah.
And one other point, and we're running out of time here, but one other example of how use cases can differ significantly based on the type of organization.
I think there's a lot more than AI note-takers out there; I think "interview intelligence" is what it's being called at this point. So you have companies like BrightHire and Pillar out

(30:37):
there that essentially are the AI co-pilot component.
They do a lot more than that, of course, but they're essentially recording the interviews on Zoom, putting together summaries, making sure that any custom questions in the ATS are being addressed, and flagging if things are not addressed in an interview, and

(30:58):
it's really making sure that the interview process is thorough and rigorous and consistent across the entire candidate base.
An example of how the use case can be significantly different: for a SaaS company hiring an enterprise account executive or a full-stack engineer, in a third-round interview they might have four candidates they're considering.
You don't need AI doing an interview evaluation when you

(31:22):
have a small candidate pool. Where you need the AI evaluation, or matching, or whatever you want to call it, is at the top of the funnel, when you have thousands of applicants.
So it's just, again, what's real and what's not, and then also the nuance of what's actually impactful for your business and the problem you're trying to solve.

(31:42):
So there are some cases where what companies need is a BrightHire. In other cases they need help with inbound applicants, and sometimes it might be more on the resume side, sometimes it might be more on actually conducting screens. And it's just cool to see.
I think buyers are becoming slightly more sophisticated in terms of understanding what's going to make sense for their

(32:04):
business, which is pretty cool.
In the conversations I've been having over the past month or so, a lot of buyers are becoming a lot more sophisticated.
There's still a big gap, and a lot of folks maybe aren't, but I'm starting to have more of those conversations where people are really dialing in on what makes sense for their business, which is pretty cool.

Speaker 2 (32:22):
Yeah, totally.
And, just to maybe round that all out, I do agree that, depending on (a) the type of company you are, but (b) where you're looking to address problems in the funnel, you need a different approach.
And I do think down funnel, once you get to the actual onsite interviews, humans interviewing candidates, especially for

(32:44):
knowledge worker roles, that's where the BrightHires and the Metaviews of the world can really shine.
At the top of the funnel, matching and ranking, but also AI-assisted assessments like what you're talking about, those feel like where those use cases really shine, especially on the higher-volume side for full AI

(33:05):
assessments like what you're talking about.
And so I do agree there's this nuance of what type of customer, at what point in the funnel, is looking to address a problem, and, from there, what's the best application of AI?
Yeah, for sure, for sure.

Speaker 1 (33:21):
As usual, we're right up on time. We go right up to the top of the hour, Steve.
I just wanted to say thank you so much for joining me today on the show. It's always a lot of fun.

Speaker 2 (33:32):
Likewise. Thanks for having me.

Speaker 1 (33:33):
Yeah, of course.
Everybody tuning in, thank you for joining us, and we'll talk to you real soon.
Take care.