Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
James Mackey (00:00):
Hey, welcome to
the Breakthrough Hiring Show.
I'm your host, James Mackey.
We've got Steve Bartel on the show today, founder and CEO of Gem.
Steve, welcome back.
Steve Bartel (00:07):
Thanks, great to
be here.
James Mackey (00:09):
Yeah, it's great to have you. It's always a lot of fun recording with you.
I guess, just to start us off here, like the last few times you've come on the show, we've really dialed in to Gem's product, which is a little unusual. With a lot of CEOs that we bring on, it's not necessarily so product-centric in terms of conversation, but there's just a lot going on with Gem right now over the all-in-one
(00:30):
product suite that you're developing and rolling out, and how you're incorporating AI and the applicant tracking system functionality, and there's just a lot of cool stuff that you're doing.
So we've slowed down on that, I think, for today. I'm sure Gem's going to come up, because you're going to be talking about Gem, and that's going to happen.
But I think, just talking about how AI is currently being
(00:53):
leveraged by recruiters and hiring teams and hiring managers as well. We can talk a little bit about current functionality that recruiting technology companies have incorporated into their products, and then I think we could talk about the future.
There's also a pretty big debate in terms of how AI should be leveraged, and actually, when I was texting you, there
(01:16):
were a couple of points here, like we talked about the role of AI in terms of enabling recruitment versus replacing recruiters in certain stages of the interview or hiring process, and then AI's role in terms of evaluation.
Should AI just be essentially leveraged to package data in neat summaries and also to help identify gaps or thoroughness in
(01:39):
evaluation, or should AI be accountable for, to some extent, stack ranking or grouping top applicants out from the rest, essentially surfacing some of the top applicants, whether at the resume stage or down funnel, to essentially shortlist applicants, or candidates rather, for the roles?
(02:01):
So yeah, those are just a couple of things that come to mind for me. Are there any other topics or use cases that are a little bit controversial from your perspective, beyond those two?
Steve Bartel (02:12):
No, that makes a lot of sense, and I think there's even a third distinction between AI that's just doing summarization, maybe putting together packets, versus AI that's ranking and maybe bubbling up the most compelling candidates, versus AI that is actually making hiring decisions and maybe automatically doing so. And so we could talk about
(02:35):
the three differences there and maybe where talent acquisition leaders should be a little bit more careful.
James Mackey (02:41):
Yeah, let's do it. Can we just start talking about AI's role in potentially replacing recruiters in certain parts of the process? Because from my perspective, I don't know, I think there's some inherent bias if you're talking to folks in talent acquisition about AI's ability to potentially replace recruiters at certain interview stages, for instance,
(03:03):
or certain parts of the evaluation process. And I'm wondering, I don't even know where to start. I could throw out a couple of recent conversations I've had on the topic to start us off, but I don't know if you've had any recent conversations surrounding this.
I guess let me just provide some clarity, right. You have interview intelligence platforms like BrightHire and Pillar, for instance, just to name a couple,
(03:24):
which are essentially AI co-pilots that are integrated with your Zoom. Essentially, think about a Fathom note taker, but it's trained for interviewing, and it's essentially matching a list of requirements, taking the candidate's answers on the video call that they have with the recruiter, and packaging that data to show hiring teams how
(03:48):
essentially folks answered, highlights or whatever from the interview, and matching the evaluation to see if there's any gaps.
Hey, you said you need to know the salary range that they're targeting, but you didn't cover that in the interview, so you've got to ask that in the next one. Stuff like that. It's packaging data that's staying away from evaluations.
But then you have other products that actually are taking it as far as stack ranking candidates.
(04:10):
We're seeing a little bit more of that top of the funnel, right, like with resume matching. I see application for that down funnel too, so I'm just curious to get your thoughts on that.
It's a lot of context, right, but that's the conversation I was hoping to have with you, at least to start.
Steve Bartel (04:29):
Yeah, that makes sense. So my perspective is that we're very far away from AI replacing recruiters. I think AI, though, has the potential to really evolve the job, the role of recruiters, whether they're in-house or agency, and I'm actually pretty excited about what that means for recruiters and for the industry. Because, in my mind, AI has the opportunity to automate a lot of busy work
(04:51):
that goes into the role of a recruiter and allow everybody to show up more strategically.
Take the use cases that you talked about around summarizing and transcribing calls, so that recruiters don't have to focus on taking meticulous notes the whole way through and then
(05:12):
taking a first pass at putting together a summary of that conversation.
Now, that still feels very much like a co-pilot use case where the recruiter's in full control, and I think that's great. And I think all AI, in my perspective, should be thought of as a co-pilot experience, and what that allows recruiters to do is
(05:34):
be a lot more present on these calls, not have to be taking furious notes the whole time, and potentially save a little bit of time with the write-up.
But I still think it's really important for recruiters to look at that write-up, especially the summary, and make sure that it matches what happened. Because even if the AI is not
(05:55):
actually making a hiring decision in that case, it could still influence hiring decisions if it's helping to contribute to a scorecard, for example. And so I think it's super important for recruiters to pay really close attention to that and see it as a time saver, but not necessarily as the thing that's making the evaluation.
James Mackey (06:15):
Yeah, and I guess maybe a way to put it is like making the evaluation easier, and that's a sliding scale. So I'm thinking about it in terms of not AI making a final decision, but I do think that with a lot of the enablement tools, there could be, in some cases, more aggressive steps taken than simply just doing co-pilot work.
(06:37):
Some of the functionality I'm seeing too for interview intelligence platforms is doing some analytics. Think about Gong, for instance, right, the revenue tool, where it records the call, tracks how much time you speak during the call, just some different analytics around that. So we're seeing some of that, which measures interview effectiveness and provides feedback to the hiring team on doing a better
(07:00):
job.
Totally, I think it is pretty cool too. But okay, then we get to evaluation, right. I think the most clear-cut one, and we could just start top of funnel and maybe work our way down, is resume matching, right. For the AI for hiring series on the show, we had the co-founders of a company called Brainerd come on. It's actually in the LinkedIn post I tagged
(07:21):
you in today, one of the companies. Essentially it's resume matching, but it was a lot more than I thought it was going to be. It was pretty in-depth. But essentially there are certain industries, right, just even take light industrial, which, apparently, according to the founder and CEO of a company called Qual who came on the show, is like
(07:41):
50% of job openings or something crazy like that in the United States. A lot of companies, a lot of job openings, are within that space.
And Brainerd focuses on staffing and recruiting, where they're doing high volume hiring, where some of these companies will have thousands of applications and sometimes there's just not enough time to get to all of these potential screens. And so they're using a
(08:05):
lot more sophisticated resume matching technologies than we used to have, to essentially stack rank or evaluate fit. And in those circumstances it's a little bit difficult, because a recruiter might get to the top, the first 50 people, but then they're not even looking at the other profiles. And so you could say, arguably, by having AI evaluate all 1,000-plus
(08:30):
or whatever, it's actually giving more access and opportunity to more people to evaluate fit. So I don't know.
There are just two trains of thought here: AI shouldn't be doing any kind of stack ranking, it should just be packaging information but not essentially floating candidates to the top, right. I don't know if you have any thoughts specifically on this
(08:53):
use case, like top-of-funnel resume matching.
Steve Bartel (08:56):
I do, and I think there's also a distinction between top-of-funnel resume matching and ranking when it comes to inbound applicants, versus AI narrowing down the total addressable set of everybody out there to folks you might want to reach out to for sourcing. There's a key distinction there. I'm sure you've thought about it, James, but even the EEOC
(09:17):
makes a very clear distinction that once somebody has raised their hand for a specific role, either expressed interest or applied, they become a candidate, and the way that the law treats those folks is actually different from passive talent, whether that's folks that you're sourcing net new externally, or even folks in your CRM or your ATS that maybe applied for a
(09:40):
role in the past.
If somebody is actively interviewing for an open role or expressed interest in that open role, companies really need to approach that with a lot more care and thought, because you definitely don't want to discriminate against folks that are actively interviewing.
Now, I think there's a whole range of ways that you can apply AI, and varying degrees to which you need to be
(10:04):
careful.
On the one hand, if you have AI that is picking out the top candidates and auto-rejecting the rest, that starts to get into pretty dangerous territory in my mind, where AI is actually making hiring decisions with no recruiter oversight.
If AI is doing its best to rank inbound applicants, I think
(10:28):
that starts to trend towards something that's more okay, but it depends on how the AI works.
My perspective on this is that the AI needs to do a few things. One, I think it needs to be built on this new wave of generative AI technology that actually allows for better, more ethical algorithms.
(10:49):
It used to be that AI algorithms were a black box, trained on tons of human decisions which, inherently, are biased, and you don't know why it does what it does. So, unknowingly, you could be deploying AI and creating a bunch of bias.
Now, with these new generative AI algorithms from OpenAI, from
(11:13):
Anthropic and others, you can actually give the AI clear, unbiased criteria for what makes for both an ideal candidate and also the minimum requirements for the role, make sure those criteria themselves are unbiased, and then leverage that criteria in the matching and the ranking.
But you can also get the explainability of AI explaining
(11:36):
exactly why that criteria matched or didn't, and then feed that into making the criteria better.
Taking that a step further, you could even reduce bias in the process by asking AI to take a pass on the criteria before it even starts ranking people, and give recruiters tips on where their criteria might actually be unknowingly creating bias in
(11:57):
the process. And so I think there's actually a really ethical way to do this.
My perspective on this is still that AI shouldn't be auto-rejecting big swaths of candidates, and that recruiters should still be reviewing candidates and seeing this as more of an efficiency driver, and maybe as a way to respond
(12:19):
really quickly to the gems, so to speak. Because what we know to be true in recruiting is that timing is everything, and if you can bubble the folks up to the top and have a really fast response to the folks that are most likely to be a fit, based on clear, objective criteria, again, unbiased criteria,
(12:40):
with the minimum requirements for the role, straight from the job description and/or from the hiring manager intake process, that starts to feel more okay to me.
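The "clear criteria plus explainability" approach Steve describes can be sketched in miniature. This is a purely illustrative toy, not Gem's implementation: the keyword check stands in for a generative model's per-criterion judgment, and the criteria and candidates are invented.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str        # human-readable requirement, straight from the job description
    keywords: list   # toy stand-in for a generative model's judgment
    required: bool   # True = minimum requirement, False = nice-to-have

def evaluate(resume_text, criteria):
    """Score a resume against explicit criteria, keeping a per-criterion
    explanation so a recruiter can see exactly why it matched or didn't."""
    text = resume_text.lower()
    explanations, score, meets_minimum = [], 0, True
    for c in criteria:
        matched = any(k.lower() in text for k in c.keywords)
        explanations.append(f"{c.name}: {'matched' if matched else 'no evidence found'}")
        if matched:
            score += 2 if c.required else 1
        elif c.required:
            meets_minimum = False
    return {"score": score, "meets_minimum": meets_minimum, "explanations": explanations}

def rank(candidates, criteria):
    """Rank candidates by score. Nobody is auto-rejected: every candidate
    keeps a full explanation, for recruiter review."""
    results = {name: evaluate(text, criteria) for name, text in candidates.items()}
    return sorted(results.items(), key=lambda kv: kv[1]["score"], reverse=True)

criteria = [
    Criterion("Enterprise SaaS sales experience", ["saas"], required=True),
    Criterion("Carried a quota", ["quota"], required=True),
    Criterion("Upsell of existing accounts", ["upsell"], required=False),
]
candidates = {
    "A": "8 years SaaS AE, 120% quota attainment, new business and upsell",
    "B": "Retail sales associate, exceeded store targets",
}
ranked = rank(candidates, criteria)  # ranking plus reasons, no auto-reject
```

The point is the shape of the output: a ranking with per-criterion reasons attached, which can then be fed back into refining the criteria themselves.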
James Mackey (12:49):
What's your?
Steve Bartel (12:49):
perspective though
.
James Mackey (12:50):
Yeah, I don't think AI should be auto-passing on anyone, but I know there's this huge conversation around bias, and there absolutely should be. We need to be very careful here.
But with the way that AI systems can be trained, even now with generative AI, telling it what to do and what not to do and what information to exclude, I almost feel like it'd be easier to train an AI system than a staff of 5,000
(13:12):
people on some of these unconscious bias issues and whatnot. Totally. I don't know. I think the reality is that for a lot of these high volume jobs, the vast majority of applications may not even be looked at if the volume is too high. Is that a somewhat accurate assumption in a lot of cases? I think that's pretty accurate.
Steve Bartel (13:31):
I think it is. And for Gem, we're mostly focused on knowledge worker hiring, partnering with a lot of the leading tech companies and large enterprises, and so I have less insight into high volume. But you're totally right. Even for knowledge worker hiring these days, something wild: I think it's double digit, I think it's north of 20% of our
(13:58):
customers have roles with thousands of applicants, which you've never seen before. So I think even for knowledge worker hiring it's starting to happen where folks might interview a big batch of candidates but then never even get around to evaluating the rest, because they've already filled the role.
James Mackey (14:15):
I think the whole issue is that people already aren't happy with the process. Even pre-AI, or without leveraging generative AI, it's still, oh, I got this auto-reject email, or I got ghosted by this company, or I applied to 100 companies today and it didn't go anywhere. People are afraid of bias; there currently is bias. I think a lot of the issues we're talking about with AI are already issues. I don't see it as a reason not to use it.
(14:37):
I see it as a pause: okay, we have to focus on this. But I think that if it's leveraged properly, you're actually able to surface the most qualified candidates. The 900th applicant, this actually gives them a shot; it could actually surface that profile.
The other interesting one is there was a company called Qual. Their CEO is David Tell, and one of the things he mentioned is he does an AI voice agent, and the voice agent does screening calls. And what he was talking about is he does a lot of blue collar work and light industrial, very high volume, and he said one of the issues in that industry is that
(15:22):
people don't have very good resumes. So it's very hard to tell from a resume if it makes sense to set up a conversation. So by doing this voice AI agent screening, they're actually able to collect a lot more data, and literally anybody can do it, so any applicant could do it,
(15:43):
and so they're able to basically surface the candidates that are truly a good fit, instead of just going off a resume where they have very low confidence. Which, I was like, oh, that's a really interesting use case for AI, and how it could actually, I don't know, I think better serve people to make sure that their
(16:04):
skill set is truly seen and put in front of relevant opportunities.
Steve Bartel (16:08):
It is, and that is super interesting. And I know, again, we don't focus on this as much for Gem, but for high volume hiring there's oftentimes just basic requirements that folks need that might not always be clear from their background, for example, certain certifications for some industries.
And being able to have a set of just very basic questions to see if somebody passes the minimum set of requirements
(16:33):
makes a lot of sense, especially if, for whatever reason, your ATS doesn't let you do basic knockout questions. So I totally see a use case for that.
I also see a use case for AI chatbots for that same purpose. Maybe the video is more interesting, maybe candidates are more likely to try it, but I think AI chatbots could totally ask some of these very basic questions to understand somebody's background better when their resume is sparse.
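The knockout idea Steve describes, a few hard, objective questions asked by a chatbot or voice agent when the resume is too sparse to tell, can be sketched as a tiny filter. The questions and field names here are hypothetical, purely for illustration.

```python
# Hypothetical knockout questions for a high-volume light-industrial role.
KNOCKOUT_QUESTIONS = {
    "forklift_certified": "Do you hold a current forklift certification?",
    "night_shift_ok": "Are you available for night shifts?",
}

def screen(answers):
    """Return (advance, follow_ups). A candidate is held back only on an
    explicit 'no'; unanswered questions are queued for a recruiter to ask,
    not treated as a fail."""
    follow_ups = [q for key, q in KNOCKOUT_QUESTIONS.items() if key not in answers]
    advance = all(answers.get(key, True) for key in KNOCKOUT_QUESTIONS)
    return advance, follow_ups
```

Note the design choice in line with the conversation: missing data routes to a human follow-up rather than an auto-rejection.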
James Mackey (17:04):
What do you think of screening calls? We can just stick to knowledge work, right? That's a lot of our customer base, both of us, right, SecureVision and Gem, we work with a ton of tech companies. And screening calls, of course, at this point, are definitely done by recruiters, right, that's how it's done. But I'm wondering to what extent we could start leveraging AI to actually run screening calls, and I'm thinking from a
(17:26):
candidate perspective.
I don't know, man, I'm hearing from recruiters, okay, people need to be engaged, they want to talk with a person. I'm thinking to myself, yeah, but also they have to take time out of their day. They have to do it in between meetings and before or after dinner with the kids and stuff like that. Whereas, if they had the ability to essentially do a screening
(17:47):
process on their own time, and be able to collect the information that they need regarding benefits and professional development and these types of things, and interact with the system instead of dealing with all the scheduling and everything like that, I think there's a lot of people out there that would prefer to do that. And again, I see that as a way, like, why not let anybody take the screen?
(18:09):
Be very clear with the requirements, but you could let anybody do the screening call, or you could send out the screening call to candidates that the recruiter has reviewed.
But I just see a lot of the traditional screening call as, honestly, something that could be replaced. I'm open to being swayed in the other direction with some good
(18:30):
data points, but I don't see a ton of value in the majority of screening calls. I see that there's that sales element of pulling candidates in, but I also see a lot of inefficiencies in screening calls for both candidates and recruiters, which I think ultimately are going to outweigh this human, sales-driven approach.
Steve Bartel (18:48):
Yeah, and so the interesting thing with screening calls is that it definitely starts to get into that territory of starting to make hiring decisions, at least if it's fully automated. But let's forget for a second whether it's legal. My perspective on that is, if a vast majority of folks wouldn't
(19:14):
get screening calls anyway, it feels ethical to give everybody a shot, right?
And today there are just so many applicants that recruiters really have to pick and choose who they do a screening call with. And so for the folks that wouldn't get a shot, wouldn't
(19:36):
get that initial recruiter screen, giving them the option to do an AI-based screening call, which would only be upside for them, would actually give them a crack at getting a real interview. To me, ethically, that feels like a good thing, even if there's all this sort of legal gray area that companies need to navigate.
(19:56):
And, to be clear, we don't have this in Gem. We probably, I don't know, it might not even be worth us building it, because of all of the legal considerations.
Actually, did you see this recently, that Workday itself is getting sued for providing an algorithm that could be making hiring decisions leveraging AI?
(20:16):
This happened just a month or two ago, a landmark lawsuit, which is actually going to probably make vendors a lot more careful about any sort of AI that touches hiring. Did you see that, James?
James Mackey (20:31):
I'm looking it up now, that's what I'm doing. So, yeah, a class action lawsuit against Workday. Well, this one says 2023, so I don't know, maybe there's a further iteration or news in July.
Steve Bartel (20:44):
I'm pretty sure it's this year, and it's going to court, and it's very interesting, because previously the entire liability was on companies for discrimination or bias in hiring processes. This is the first ever case where it's actually going to court
(21:04):
on whether Workday could be liable for the algorithms they provide.
And so I think actually a lot of vendors that build the software are probably going to be very cautious about anything that touches hiring decisions in the recruitment process, unless maybe they're a small startup where they don't have anything to lose, because of this case that's happening right now. Which actually might stifle innovation.
(21:25):
And even if we think it's a good thing to give the longer tail of candidates that wouldn't get a screen the opportunity, to give them a shot, that could still result in a lawsuit. Because how does the candidate know if they were in the long tail or not? And how do you actually prove that?
(21:47):
And so I think vendors might end up just being a lot more cautious on some of these things that could be good for candidates but just might be too risky. And so might companies.
James Mackey (22:00):
Yeah, that's a really good point too. I was looking at the Workday one. Yeah, it looks like, yeah, AI-powered applicant screening tools discriminating on the basis of race, age and disability. That'll be interesting to see what happens. I'm going to look into that.
Steve Bartel (22:13):
Yeah, that is really scary stuff, because it could be discrimination at scale. First of all, it is scary, if that's truly the case, if their algorithm is discriminating at scale, and that's what these laws are here for: to protect candidates and applicants from discrimination. Part of me wonders if that was leveraging algorithms that were
(22:35):
based on the old way of doing AI.
James Mackey (22:36):
Yeah, yeah,
possibly.
(22:58):
Yeah, that's interesting. But here's the thing with screening, though. To me, just take a salesperson, for instance. What was your quota? What was your quota attainment? Deal length, was it transactional or consultative, six months? What was the ARR? Were you doing new business or upsell into existing accounts? I mean, more of that kind of thing. They're not necessarily
(23:20):
open-ended questions, they're just very pointed screening questions to check boxes on yes or no, and that's the only information that's being, you know, validated. There are, I guess, follow-up questions, right, that's the benefit of generative AI versus just a static form, and
(23:40):
then, of course, packaging that information.
But yeah, that's essentially what I'm thinking of with screening, which is, I guess, probably what you're thinking too. But I'm just wondering. That seems pretty clear-cut to me.
Steve Bartel (23:52):
So my read is, if AI is helping to understand whether somebody meets the minimum requirements for a role that are outlined in the job description and are very clear, objective criteria, like, yeah, what was your quota attainment, and there are clear minimum requirements for the role, I think that's okay. Especially if it could help plug gaps from
(24:16):
screening questions that maybe somebody didn't fill out in their application, or something like that. Yeah, whether they have a certain certification or compliance thing that's important for the role as a minimum requirement, that kind of stuff feels okay to me.
I think it's when AI starts to make an assessment on the less objective criteria that we just need to be a little bit
(24:36):
more careful, in the same way that recruiters need to be more careful.
James Mackey (24:39):
Yeah. There's nuance there, like even stuff like tenure, potentially. That's not necessarily a bias thing, but there are even nuanced things like that, where a lot of people lost their jobs during COVID due to layoffs. So if you see somebody in the tech industry who's had three one-year stints, and you could train an algorithm to say you want an
(25:00):
average tenure of X, Y, Z, then folks who literally had zero control over being laid off could be, essentially, yeah, discriminated against to some extent. I don't know if technically that would be considered discrimination, but that's just an example of something where you don't really think of how somebody could be essentially pushed out
(25:20):
of an opportunity.
Steve Bartel (25:22):
Yeah, and taking that a step further, there actually is a real chance of discrimination there. If, for example, women who have children might have briefer stints because they had a kid and, for a lot of reasons, maybe wouldn't go back after that, then there actually could be gender discrimination, depending on how the AI is thinking about that.
James Mackey (25:43):
No, you're
absolutely right.
Steve Bartel (25:44):
I think this stuff is really complicated and it's evolving really quickly. But my high-level thinking around this stuff, for that reason, is that companies and vendors should be really careful when it comes to any sort of AI in hiring decisions. And even this example of how looking at whether
(26:08):
somebody had short stints could inadvertently discriminate against women more than men, you wouldn't even expect that. At first glance you'd think that's not discriminatory. It's a great reason to tread carefully.
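One concrete way teams audit for exactly this kind of inadvertent disparate impact is the EEOC's "four-fifths" heuristic: if one group's selection rate falls below 80% of the highest group's rate, the screen deserves scrutiny. The numbers below are invented purely to show the arithmetic for a hypothetical tenure-threshold filter.

```python
def selection_rate(selected, total):
    return selected / total

# Invented outcome of a hypothetical "2-year average tenure" screen,
# applied to 100 applicants in each group.
passed = {"group_a": 60, "group_b": 40}
applied = {"group_a": 100, "group_b": 100}

rates = {g: selection_rate(passed[g], applied[g]) for g in passed}
reference = max(rates.values())                      # highest group's rate
ratios = {g: r / reference for g, r in rates.items()}
flagged = [g for g, r in ratios.items() if r < 0.8]  # four-fifths rule
```

Here group_b's rate is 0.4 against a reference of 0.6, a ratio of about 0.67, so this screen would be flagged for review even though tenure looks facially neutral.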
James Mackey (26:25):
That's a really good point. Yeah, I guess that's why you see some of the more established players, like BrightHire and Pillar, in the interview intelligence space. It really comes down to enablement, and they're going more of the co-pilot route. They're doing the data packaging, essentially summaries, identifying gaps in evaluation, which I thought was pretty cool, and they're, of course, doing
(26:45):
easier stuff. They're pulling together role requirements, customized interview questions, populating that, hiring teams can add their own, and putting that as part of the evaluation criteria. And then again tracking to see what's been covered and what hasn't, to ensure consistency and that different folks are being asked the same questions, and all that as well.
And then they're integrating with the applicant tracking
(27:07):
systems to essentially put together scorecards, which, maybe you start to get a little bit into evaluation, but it's reviewed by people. It's just essentially doing the AI note-taking and then putting the summary into a scorecard format.
So that's more of the path, but all of the interviews are still being done by recruiters, and it's just sitting in, like how
(27:29):
we have Fathom and Firefly sitting in on this conversation.
Steve Bartel (27:32):
Yep, exactly, and I think that's a good, thoughtful approach for now, and it's the same approach that we're taking with our applications of AI when it comes to ranking and matching your inbound applicants, ranking and matching applied to resurfacing people from your CRM or your ATS for a role,
(27:54):
or, yeah, ranking and matching at the top of the funnel for external AI sourcing. For sure, really thinking about it as a co-pilot approach.
James Mackey (28:03):
Yeah, it's all really interesting stuff on the AI front. Are there any additional use cases that you've been thinking a lot about recently?
Steve Bartel (28:14):
So when I think about AI use cases, it comes down to really what's been enabled by the new generative AI technology shift, in terms of where we think and focus. And for me it's anything text-based, which is why the resume matching and ranking piece is pretty compelling,
(28:35):
because, after all, resumes are text, and so are qualifications for a role, and so I think AI can do a really good job of ranking and matching there.
And then I also think transcription's gotten a lot better, to your point around the interview intelligence and call recording software, similar to Gong in the world of sales, and so I think that space is really interesting too.
(28:56):
For us at Gem, we've got an ATS and a CRM, and so, instead of building that, we're going to be focusing on strategic partnerships with some of these new startups that are doing that, like BrightHire and others. But I think that space is super interesting, and I do believe it can drive real value for recruiting teams and save a ton of time when it comes to the note-taking.
(29:17):
But I also love what you were calling out in terms of enablement and coaching recruiters: making sure there's consistency across interviewers, helping companies identify which are their best interviewers, which are the ones that need more enablement.
I also think, potentially, I don't know if these platforms do this, but if I were them, I would be building in little
(29:39):
feedback loops that run the feedback that recruiters and hiring managers and hiring teams are writing through a set of prompts that checks for bias, and maybe gives them some coaching in real time, in the moment, as they're putting together the feedback, about where they could be potentially introducing bias into the hiring process.
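A minimal sketch of that feedback-loop idea, assuming the check is done by prompting a generative model. The model call itself is omitted, and the prompt wording is hypothetical, not from any shipping product.

```python
# Hypothetical prompt for a real-time bias check on written interview feedback.
BIAS_CHECK_PROMPT = """You are reviewing written interview feedback for potential bias.

Feedback: {feedback}
Job-relevant criteria: {criteria}

Flag any statement that references protected characteristics or traits
unrelated to the criteria, and suggest a neutral rewrite for each."""

def build_bias_check(feedback, criteria):
    """Assemble the prompt that would be sent to the model in the moment,
    as the interviewer is writing up their feedback."""
    return BIAS_CHECK_PROMPT.format(feedback=feedback, criteria="; ".join(criteria))

prompt = build_bias_check(
    "Great energy, but seemed a bit young for the team",
    ["quota attainment", "enterprise SaaS experience"],
)
```

The coaching then happens before the feedback lands in a scorecard, rather than after a hiring decision has already been influenced.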
James Mackey (30:01):
Yeah, for sure. It's a really interesting space, and these products are evolving a lot faster over the past year, which is really cool. And definitely, Steve, another episode to check out is with Nikos, the CEO of Workable. They've done essentially a massive overhaul on their product suite and they're aggressively implementing AI in
(30:25):
each aspect of their business, which, honestly, for a larger, more established player, for whatever reason, going into the call I wasn't expecting to see that aggressive of a shift from Workable. But yeah, they're like, we're doing this, we're doing this, we're doing this, different stages of the funnel, everything from recruiting to
(30:47):
employee management, essentially more of the HR side of the house. So that was really interesting. It's a pretty cool episode. He drops a lot of cool insights there too.
Steve Bartel (30:57):
Oh, that's awesome. I'll definitely check it out.
Actually, here's another killer use case of AI that we've been thinking about, that we're going to be building into our platform. So we've talked about the ranking and the matching piece. I think it's going to be amazing for the industry if AI can be writing even more hyper-personalized messages and helping recruiters with that draft creation.
(31:19):
And here's the killer use case that I think we're really excited about at Gem. For us, AI is part of a broader end-to-end platform, and we think there's a lot of advantages to that. But imagine if somebody's already engaged with your company. Maybe they've attended a recruiting event nine months ago, or they were talking to James Mackey, like one of the agencies
(31:43):
that we partner with, sixmonths ago about a different
role, or that they applied twoyears ago.
Maybe they were silver medalistsand that from the rejection
reason and the interview notesall really valuable touchpoint
context in terms of who thisperson is in your relationship
with them really valuabletouchpoint context in terms of
(32:03):
who this person is and yourrelationship with them.
Now imagine they're a good fit for a new role that you just opened up.
AI helps you surface that person, but it also helps draft a highly personalized first message to that person, referencing the relationship that you have with them.
The fact that I know James was chatting with you about this other role that's really similar to this new role we just opened up, that we really enjoyed getting to know you as part of
(32:24):
the hiring process 18 months ago.
We think you'd be a really incredible fit for this new role, for personalized reasons A, B, and C, based on your background, maybe even based on the scorecards from back then.
And, by the way, we've got another event similar to the one that you attended nine months ago.
Wouldn't that feel amazing to a candidate, and wouldn't that just be a really great thing for candidates and companies alike?
James Mackey (32:47):
Yeah, yeah, that's awesome.
So is that something you're doing now?
When is Gem going to be able to do that?
Steve Bartel (32:55):
We're going to be able to do that in the next three to six months, and so we're just super excited about what the future holds.
Before we do that, we're heads down on bringing the same AI ranking and matching that we've built for AI sourcing, which is in general availability, to AI ranking for your inbound and also for candidate rediscovery.
(33:16):
And that's when I think this stuff is all going to come together in a really compelling way for our customers and for the market, because then you're going to have the AI ranking and matching technology threaded across all the important channels, whether that's rediscovery, inbound, or external sourcing.
Plus, you're going to be able to feed in all those touch
(33:39):
points and relationship context data, which you can really only do if AI is part of this end-to-end platform that has that complete source of truth for every relationship and touchpoint, and feed that into the messaging for each of these channels to make it hyper-personalized.
You're only going to have to set up the matching criteria
(34:02):
once, and then AI is just going to go to work for you across all three places, instead of having to configure this in different point solutions.
Yeah, we're really excited about how it's all coming together.
James Mackey (34:11):
Yeah, that's really awesome.
I know we're talking about once a quarter here.
We should definitely revisit what you're doing right now and over the next three to six months as that continues to develop, and more lessons learned.
It'll be really cool too when you start to roll it out to customers, just to see their feedback and the different iterations of those features and how you're making them better.
(34:33):
It's going to be really cool.
But yeah, I agree, that's just a totally different level of personalization, and that's also just really cool for Gem, because I know, with the product suite play, that your team's gone in that direction.
It's like all of the systems, all the products, essentially working together to pull in even more personalization, probably even smoother.
(34:54):
Would it still work if they have a different ATS but use Gem for sourcing?
Are you still pulling in that data for personalization, or is it essentially only if they're using your CRM and ATS?
Steve Bartel (35:03):
Yeah, that's right.
I think one of the unique things about Gem is, for that specific piece, we're going to make this work whether you use Gem's ATS or another ATS.
Now, things will work a little bit better together if you use Gem for everything, the same way Apple products do: the handoffs and your AirPods just know how to sync a little
(35:23):
bit better with your iPhone and your Mac than maybe with an Android.
But, yeah, no, I think we are very committed to remaining ATS agnostic, knowing that there's a lot of different ATSs out there, and we want to be able to support customers of all sizes and shapes regardless of what ATS they use.
James Mackey (35:43):
So on the ATS side, I know we've got to jump in a minute, I have a hard stop coming up here in a few minutes, but I knew you were rolling out the ATS in tiers, right?
Starting with SMB, then pushing up market, I think, initially.
I don't know if I have that wrong, but how's that going?
Has the ATS fully rolled out?
Are you aggressively selling that product now?
Where are you in that product life cycle and
(36:05):
iterations thereof?
Steve Bartel (36:07):
Yeah, so it's going incredibly quickly.
We only started building the ATS a year and a month ago.
It's wild, and there's so much to build for an ATS and so much to do well, but we already have a hundred-plus customers using it, and it's now in general availability for
(36:28):
companies of up to 500 employees.
So we're moving up market incredibly quickly, and we're starting to sign on customers that are in that thousand-plus range.
Actually, over the last three months, I personally went and talked to a lot of our biggest, best customers that leverage Gem about becoming design partners, and so we've got this amazing
(36:51):
enterprise design partner program where we have commitments from five to ten upper mid-market and smaller enterprise customers, many of them several thousand employees or more, lots of them public companies.
We've, of course, given them a really compelling offer on the ATS piece, but they've all pre-purchased the ATS as part of
(37:13):
a three-year contract and are working hand in hand with us on bringing it up market, which we're super excited about because, and I can't talk about these companies just yet, they're some of the most incredible logos in tech: top AI companies, amazing pre-IPO or post-IPO tech companies that anyone would have heard of, some of the best brands
(37:35):
in tech.
James Mackey (37:36):
Oh, that's amazing.
Congrats on that.
Yeah, I'm excited to have more of that conversation too, next time we record.
Steve Bartel (37:43):
Yeah, that would be great.
I'm really excited for it as well.
James Mackey (37:47):
Yeah, for sure.
Hey, look, everyone, thank you so much for tuning in.
We have a lot more episodes coming up for the AI for Hiring series.
It's going to be great.
We've had some incredible guests thus far, and we're going to have even more.
And, Steve, thanks so much for joining us today.
As always, it's a lot of fun, and you dropped a lot of great insight for our audience, so I'm really appreciative of that.
Steve Bartel (38:08):
Likewise, great to be here.
Thanks, James.
James Mackey (38:10):
All right, thanks.
Bye.