Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to the
Breakthrough Hiring Show.
I'm your host, James Mackey.
I've got my co-host with me, Elijah Elkins, with us today. How are you doing, Elijah?
I'm doing well, excited to be here.
All right, good stuff.
We also have Fede and Guillermo, who are the co-founders of
Brainerd.
We're really excited to have you both with us today.
Thanks for joining us.
Speaker 2 (00:25):
Thank you for having
us guys.
We are very excited to sit here and discuss what we're doing, and also to discuss recruitment in general. I think it's a great space that you guys created to discuss these topics.
Thank you for having us.
Speaker 3 (00:31):
Thanks for the
invitation.
Speaker 1 (00:34):
Yeah, no, it's great
to have you both here.
Just to start us off, we would like to hear a little bit about the founding story and, essentially, what problem you were looking to solve when you started the business. Guillermo, Fede, who would like to answer that one?
Speaker 2 (00:52):
I'll go ahead, I'll go ahead. Yeah, okay, you guys, everyone — this is Guillermo speaking, just so everybody knows, so you get used to my accent as we move forward.
Yeah, so we started about a year ago. We are friends — the three founders; we are three founders of the business — and we've been friends since high school, so we do know each other very well, and each of us has had their own sort of businesses before. Life got us again into a situation where we were looking to start a business, and we actually had this problem
(01:22):
on our radar, the problem of resume screening, and then also it was the rise of AI. I come from a technical background, very interested in everything that happened in AI and all the benefits and all the tools and things that you can use. And I think, discussing with a friend — I can't remember exactly at what point we said, well, yeah, this is the idea — it was just an exploration of ideas of where AI can help businesses and
(01:45):
where it can really make a big difference. And I think AI seems like it can do everything. But if you bring it down to the basics, where can it bring value today? So what we did: we actually probably spoke with maybe 50 different recruiters, talent acquisition managers,
(02:05):
hiring managers, CEOs. We've been working in the industry for 10, 15 years, so we do have a lot of contacts between the three of us, and we spent a long time just talking with people. That's really how we got started, and that's what I'd mark as our starting point. Once we had a really good understanding of the problem, we started building an MVP of the possible solutions, and then slowly we started growing into what Brainerd is today.
(02:26):
But I would say that the main story here, the founding story, was all just about talking with people. I think we had an idea of what we wanted to build, but we also wanted to make sure that it was aligned with how people are working today and how people are using these tools today.
Speaker 1 (02:42):
Yeah, I appreciate the background there. So let's dive into the primary value proposition from the top: what problem did you decide to initially tackle with generative AI?
Speaker 2 (02:54):
Yeah, so the first problem was very simple: it was just about resume screening. How can we speed it up? I think the first time that someone brought this up, it was: I have a lot of candidates in my pipeline and I'm not able to go through all of them. Can we use AI to at least help me extract some information from the resume, so it can help me make my decision faster? So that's how the problem started.
(03:16):
Recruiters are spending — it depends who you talk to, but it could be a couple of hours per day — just sitting in their ATS, opening one candidate at a time and looking at the resume, maybe doing a scorecard or advancing or tagging. There's some sort of very manual process of reviewing resumes, and so with Brainerd we created this product — we'll get into
(03:36):
more details later — but basically it helps you screen resumes faster and smarter. And this process that we are looking into, the resume screening process, is the top of your funnel. You've got a new open position. If you publish that open position, you're probably going to get maybe hundreds of applicants, probably in the first 24 to 48 hours from the moment that you publish a
(03:58):
job. There is a stat — I think it's from Ashby, the ATS — that says that between 2021 and 2024 there have been three times more applicants than we used to have, in terms of the volume of applicants applying to jobs. And that directly affects your cost to hire, your time to hire, your accuracy. The more candidates you have, if you don't have enough resources, you're not going to be able to cover all of
(04:19):
them. Maybe you get a hundred applicants, and in the top 20 — the first 20 that applied — you already found someone that you like, and then the other 80 are maybe not going to get the same level of attention. You will try, but this is the problem: we found these two people — do we keep looking for more talent or not? Do we stay with these?
(04:39):
What our solution does is make the process more fair as well, because it actually does an analysis of every resume according to your own criteria, and it just gives you a report. It doesn't tell you who's going to be the best candidate. It doesn't tell you who you should hire. It's only going to extract the information that you care about for each candidate, so you can then have a report that will
(05:03):
allow you to pick your top candidates much faster than just going one by one. So the main value is about how we can spend less time reading resumes and more time talking to the candidates. Are we doing more interviews if necessary? Are we doing more phone screenings? That gets you to the first interview faster, which gets you to hire faster and reduces the time that you take to hire. So we are focusing on that first part, which is resume screening, and just making sure that, as a company, you don't
(05:26):
miss anyone — and also, as a candidate, we want to make sure the candidate gets a fair chance as well.
Speaker 1 (05:33):
Okay, yeah, that sounds really good. I want to hear — I think everybody would like to hear — more about your customer base, which, as an early-stage company, I'm sure is evolving. I would love to get a sense of the early adopters, who you're seeing coming on as initial customers, and whether since then you've dialed in on an ideal customer profile, or you're going wide.
(05:54):
I'd just be curious to see, specifically for resume screening: are you seeing higher-volume types of customers, ones that do a lot of high-volume hiring? Are you seeing specific industries or types of roles? I would love to get some insight there from you.
Speaker 3 (06:19):
We have three different kinds of clients currently. We have a lot of traditional companies, like healthcare or banks, for example, but also tech companies. Then we are working with some RPO agencies, and also with staff augmentation companies.
(06:39):
So it's really wide, the kind of companies that we are working with nowadays. But something that we have noticed is that small companies don't need this kind of system, because they don't have a high volume of resume applications. That's something that I can tell you.
Speaker 2 (07:00):
Yeah, for sure. And I think the other way we approach it is: when we first started, when we first built our MVP, we didn't really have integrations with an ATS. We were just building. We were like, oh, just upload your resumes here and we're going to do the analysis for you. You can just upload a hundred resumes, and in a couple of minutes you will have your analysis, your report, done.
(07:20):
When we started talking to customers, of course, the customers that you look at — the ones that have many open roles and a high volume of applicants — are companies that use an ATS. If you have even maybe two or three open positions, you probably need an ATS already. Then we started looking more at, okay, let's just focus on customers that are using an ATS, and that already narrows down the market
(07:41):
as well, because if you're using an ATS, it already sets you as a profile. And especially, we started with Lever as our first ATS. So then we went more toward, okay, let's just focus on customers that are using Lever as their ATS and that have at least a couple of current open positions, and that kind of really helps you narrow down your customer base.
(08:02):
And the good thing is, all this information is public as well, so it becomes quite easy to spot a company and be like, I think this company could benefit from Brainerd. And we can see that, because when we reach out to those companies, we actually get a really high response rate, because you are talking their language. They're like, yeah, I'm dealing with this problem. Yes, I am spending a long time reading resumes. What do you have to say?
(08:24):
So the better you get at finding those companies, the more chances there are that people are actually interested in what you have to say.
Speaker 1 (08:28):
I'm curious — Elijah, we had, I think it was the founder and CEO of Qual, right? They were doing voice AI screening calls, and it was top of funnel, the initial touchpoint typically, sometimes the second touchpoint, but it sounded like it was typically being used at the top of the funnel. So essentially, as soon as somebody
(08:49):
would apply, they would do this voice AI screening call. So it was higher volume. And — correct me if I'm wrong here, my memory isn't serving me properly — but he had a fair amount of customers in the staff augmentation, contingent workforce, staff aug space, a lot within light industrial, right? Is that right?
Speaker 4 (09:11):
Okay. Yeah, I can't remember if he was sending everyone the link — if everyone actually gets the link to use Qual — or if it would be something like: Brainerd is used to reduce, let's say, 3,000 applications down to 2,000, and then those 2,000 are sent the Qual link to be able to go
(09:32):
through an automated kind of screening.
Speaker 1 (09:35):
Yeah, I don't know. I think the way that he was describing it — the CEO was describing it — was basically that for that industry, for light industrial, particularly for the RPO firms and the contingent-workforce firms that service it, a lot of the time folks in those industries don't
(09:56):
do a really good job putting together their resumes. So he's like, all right, we've got to screen these people so we can actually decide who it makes sense to set up a call with. So I guess the reason I'm going down this little side path here is I'm wondering, from a staffing perspective — whether it's staffing agencies, RPOs, or companies — is it similar, in the sense that you're getting a lot of the very high-volume types of roles? Where it might not
(10:17):
necessarily just be engineering openings at a growth-stage tech company, but you're seeing more adoption from companies that might be in light industrial, like Qual's CEO described, or maybe healthcare — different spaces where there's just a ton of applicants and maybe similar challenges to what Qual is solving, obviously from a different approach. But I'm just wondering what you're seeing more of right
(10:37):
now?
Speaker 2 (10:39):
I would say, if I had to pick one industry that, as a customer base, is more prominent, it's the tech industry. So far, that's been the majority of our cases. When we go into demos, there's always some sort of technical position that they are working with. And I think technical positions — you know, in many cases they're a remote
(10:59):
business, so they hire from an even bigger pool of people. That's why I think tech positions get a lot of applicants, and also they are the ones that sometimes require an extra level of knowledge to actually do a resume screening. You have to be a technical recruiter, because you need to understand AWS or Google Cloud or Azure, and you have to have the
(11:22):
language and understand what they're looking for — Python, JavaScript or Node. The good thing about using something like Brainerd is that the AI has that knowledge. So if you decide, okay, this is my position, I'm hiring for a technical role and I'm looking for someone that has at least five years of experience in Python, that's something where the AI can say, yeah, this candidate has it, and this
(11:44):
candidate doesn't — and that is how AI is helping recruiters navigate these terms. Then, the other day, we had a customer that, for example, was looking for a facilities manager, so they were looking for architects, and we got a lot of architects applying, and we had a challenge where many of those architects were actually
(12:05):
uploading resumes as images. They had this very nice designed image, and it was harder for our system to parse it, and we had to improve the system. I'm just telling you these details because every industry comes with different challenges. You know, like you mentioned, some people may not really prepare their resume very well. Some people may prepare their resume too well.
Speaker 1 (12:23):
That's the other big
issue, right.
Speaker 2 (12:26):
That's the other part, where you get a three-page resume with all these images and all these things — it's a way of expressing the resume, right? So whatever they decide to do, that's okay, but the system has to consider those cases as well. And then, yes, in the healthcare industry — I think this is the other one where there's a lot of rotation — nurses and healthcare practitioners, they need to continuously be hiring, so they have these
(12:48):
positions that never start and finish. They are continuously looking; they say, hey, if you can do this, we want to talk to you. So we do have that, and those jobs get a lot of applicants as well. So, yeah. And then we had another customer, an RPO customer based in the Netherlands, and they actually use our API and embedded our processes in their own
(13:10):
proprietary system. So they use our API to connect to their system, and they do the resume screening directly in their process. And that was another challenge for us. Another thing: okay, we need an API, we're going to build an API. And then, you're based in Europe, so now we have to have Europe servers and GDPR and all these sorts of compliance things. But those are growing pains. Those are the things that push you to make your product better
(13:31):
and bigger.
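To make the API integration concrete, here is a minimal sketch of how a customer might embed a resume-screening call into their own system. The base URL, endpoint, and payload fields are hypothetical stand-ins, not Brainerd's documented API:

```python
# Hypothetical sketch of calling a resume-screening API from your own system.
# The base URL, endpoint, and field names are illustrative, not Brainerd's real API.
import requests

API_BASE = "https://api.example-screener.eu"  # EU-hosted endpoint for GDPR data residency (assumed)
API_KEY = "your-api-key"

def screen_resume(resume_text: str, job_id: str) -> dict:
    """Send one resume for analysis against a job's criteria and return the report."""
    response = requests.post(
        f"{API_BASE}/v1/screenings",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"job_id": job_id, "resume_text": resume_text},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"score": 82, "criteria": [...]} in this sketch

report = screen_resume(open("candidate.txt").read(), job_id="backend-eng-01")
print(report["score"])
```

The point of an integration like this is exactly what the RPO customer wanted: the analysis runs inside their own proprietary workflow, while the screening logic stays on the provider's side.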
So it is a challenge. Again, there's no one type of customer. I think it's part of the process right now: we are continuously and constantly trying to refine and fine-tune who we are targeting. But so far, we try to go for one vertical, see what happens, get some response, and iterate like that.
Speaker 1 (13:49):
Yeah, just so you guys know too, the guests that we've had on thus far: we just had the CEO of Workable on the show, the CEO of BrightHire, the CEO of Pillar, and then there's one other — Elijah, I'm blanking on the name of the other company. We've had enough podcasts now in the series that I'm starting to blank out a little bit. But anyways, what's been interesting is, across the board, we're seeing
(14:11):
companies go really wide. I don't know why, but I just had this assumption in my head: okay, whether it's resume screening or AI note-taking or evaluation, packaging and organizing data, or whatever part of the workflow, I thought we were probably going to see more
(14:31):
companies specializing a little bit in a specific industry or a specific role. But I guess now it makes a lot more sense, because the more that I'm learning, it's like, look, the way that this technology is being built, you can apply it to, like, any role. Yeah, but it's still just interesting, because we're just hearing from different CEOs: oh, we have customers all over the place.
(14:51):
It doesn't seem like there's one industry per se that is leveraging different AI-driven technology these days.
Speaker 2 (15:01):
And I think it also has to do with the fact that, when you're just getting started, you're not going to say no to a customer. So in many cases it's, oh, you want to talk to us — or they come directly. They found us on LinkedIn or somewhere, they book a meeting, they tell us what they're doing
(15:28):
[inaudible] faster, blah, blah, blah. And maybe later we'll narrow our approach, but at this stage it feels more like we are trying to discover where this works the best. And the reality is, yes, there are many industries where this could work. Maybe we'll never really focus on one industry, and maybe we'll continue like this. Maybe at a certain point, especially for marketing efforts
(15:49):
and content, you do need to have more of a narrow focus. So from a go-to-market perspective we may have more of a narrow focus, but in the back, if a customer comes along that doesn't fit that focus, we can still serve them, for sure. Yeah, let's do it.
Speaker 1 (16:05):
That's the go-to-market question. I don't know what you're seeing out there from a go-to-market strategy perspective, but yeah, that's another interesting question, right? Do you try to dial in on a specific segment, knowing you can basically service other areas, or do you just present as going wide from the beginning — like, we can help you with any role, company-wide? I'm seeing more companies just do that, right? It seems like they're trying not to really say
(16:28):
which industry they're getting more traction in. They just really want to keep it as open as possible. Right now, I think everybody's still trying to figure out: is it going to be consistent across the board, in terms of which industries we're going to see traction in? I think another thing is, people want customers in technology. But just after the past few years, founders don't
(16:50):
only want customers in technology, right? They want more diversification outside of only tech, just because of the volatility. And so, at least for me — I'll just speak for myself — at SecureVision, my company's helped over 200 companies through our RPO solution. A lot of them are in the tech industry, and we're making a big push for diversification outside of tech. A lot of our relationships are still in tech, but I
(17:12):
want more customers in manufacturing, life sciences, banking and finance, construction, real estate. I'm really trying to push out as much as I can to diversify our customer base, so that when things like the Silicon Valley Bank crash happen, we have customers in different segments. I'm curious, from a go-to-market perspective, what
(17:33):
are you seeing out there? How are you positioning the company? Just curious to get your thoughts here.
Speaker 3 (17:39):
Yes, we are always targeting talent acquisition managers. That's something that we realized exploring the market, right? We started targeting recruiters, but we noticed that recruiters sometimes don't have the power to make a decision, for example, or sometimes they're just screening resumes and they don't want to go and push a new
(18:01):
initiative or a new software. So a great thing was to discover that talent acquisition managers or recruiting managers were the ideal persona, right? And then, as Guille said recently, we prefer the tech companies, of course, but we are always finding new kinds of companies. The staff augmentation companies are the thing:
(18:22):
because we are from LATAM, we have a lot of contacts in LATAM, and United States companies are hiring developers from Argentina, from Colombia, from Uruguay. That's the reason why we have a lot of staff augmentation companies — because we have contacts at companies in Argentina that are looking for those profiles, hiring them in Argentina and putting them to work
(18:42):
for the United States. We are finding cases every day. That's the thing — we don't have something really narrow.
Speaker 1 (18:51):
Yeah, the more that you think about staff augmentation firms being early adopters of these types of more AI-native solutions, it makes a lot of sense, because they're always hiring. In any market, if they're in business, they're hiring. Even in a slower market, if they're a bigger firm, there's still going to be a
(19:11):
high volume of hiring, particularly compared to corporate jobs at companies that are just in tech or whatever else. So I think it's going to be interesting. I think a lot of companies in the recruiting tech space that are rolling out new tech are probably going to have a decent segment that comes from that industry, just due to the need for efficiency gains —
(19:32):
particularly, too, because these are services firms that have to hire a lot of headcount to deliver business. To the extent that they can be a little bit more optimized and cut out inefficiencies, that helps them move a lot faster and not only increase revenues. The reality is, services firms do have to be
(19:55):
incredibly focused on sustainability and margins, and very careful. So my guess is that we're going to hear from more CEOs that some of the customers they're seeing right now are in that space.
Speaker 3 (20:07):
Yes, and the staff augmentation companies are always hiring, and hiring the same positions, the same tech positions. So they have a big database of candidates already analyzed with Brainerd — a lot of candidates that are a good fit, that they can continue contacting and hiring.
Speaker 1 (20:25):
Of course, yeah, for sure. Elijah, I feel like I've been talking my head off. Man, what questions do you have?
Speaker 4 (20:31):
Yeah, all good. So I'd love to rewind to when you were looking at starting the company. You mentioned you talked to, let's say, 50-plus recruiters and talent acquisition managers. What did you do next? Take us back to that time: how did you really get started building the AI? Even dipping into the technical side a
(20:53):
little bit — I just think it'd be really interesting for our listeners to hear that firsthand experience of actually building AI for recruitment.
Speaker 2 (21:04):
For sure. Yeah, thank you for the question. I think it's a great segue into how we did that. And yeah, once we received the feedback — I mean, we are three people, so it was mostly Fede and Ignacio having the calls. Some of the calls I would participate in as well, but I was in a dark room. They didn't let me leave that room and they made me code.
(21:26):
I had to be coding, and the only thing I could do was coding. That's how we got things done. They would just slide me pizza under the door and keep me coding. That's how we started building the software. No, but in a real sense, first of all, we had to see if the technology could do it. So the first thing that we did, I remember, we just went on
(21:47):
ChatGPT and we were like: here's a resume, here is a job description — is it a good fit? And honestly, that's something anyone can do today. You don't even need to pay. You can just do it on any sort of AI. You can just paste your resume, paste your job description, and the AI will actually give you a very decent analysis of that applicant versus that job description.
(22:08):
So that's why we were like, hey, this is something that we can use. So the technology is there.
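What Guillermo describes — pasting a resume and a job description into a chat model and asking for a fit analysis — can be reproduced in a few lines. Here is a minimal sketch using the OpenAI Python client; the prompt wording and model choice are illustrative assumptions, not what Brainerd uses:

```python
# Minimal sketch of the original experiment: ask an LLM whether a resume fits a JD.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resume = open("resume.txt").read()
job_description = open("job_description.txt").read()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works for this experiment
    messages=[
        {"role": "system", "content": "You are a recruiting assistant. Analyze fit objectively."},
        {"role": "user", "content": f"Job description:\n{job_description}\n\n"
                                    f"Resume:\n{resume}\n\n"
                                    "Is this candidate a good fit? Explain briefly."},
    ],
)
print(response.choices[0].message.content)
```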
Then we go back to the interviews and we're like, hey, so what do you think of this report? What do you think about this? Do you find this useful, if this report is automatically generated for every candidate? And they were like, oh yeah, that would be useful. So we're like, okay, let's build a very simple product — a very
(22:29):
simple UI, a very simple website — something that we can show, because you need something to show and you need to test it out. You can do so many conversations, but people need to see things. So we built what we call the MVP, your minimum viable product, and at that time it was just a simple UI where you drop your, like, 150 resumes and it does the analysis. It compares the job description with each resume and it
(22:49):
gives you a report and a score from one to a hundred of how good the fit is. And we thought that was a killer product. We were like, oh, this is amazing — it gives you a score, it gives a report. And then we started showing that. And when we started showing that, the first thing that came to the recruiter's mind was: but how do I control this? What is this report based on? How do I decide why this score is this score?
(23:10):
There were a lot of questions about why — how was the AI doing the analysis? How was the AI deciding that this candidate gets a 70 score versus the other candidate's 50? It was a bit of black-box magic that was happening. Initially we thought, hey, that would be great. But when we started talking to people that were using the software,
(23:31):
that black-box magic is not something that they want to see. They actually want to have more control over how that analysis is being done, and why — why the score, and where the score is coming from. Ultimately, they want to also set up their own logic, their own rules for the scoring.
So we went back to the whiteboard and were like, okay, what do we need to do here? And that's when — I think that was probably a tipping point, a turning point for our product — we
(23:53):
introduced the concept of defining your criteria. So instead of a black-box sort of magic thing that happens and gives you a score, we actually say: okay, just put in your resume, put in your job description, and from there we're going to extract the criteria — what are you really looking for in this candidate?
(24:13):
Number of years of experience with a certain technology, that they have worked at a high-growing company, that the candidate is based in a certain geographical location. So you go for objective things — not subjective, not soft skills so much, but more the things you'd put on a scorecard: has this candidate been working for more than five years?
(24:35):
Has this candidate had experience with this technology? So you list all your criteria, and then the analysis is done based on those criteria. The analysis the AI will be doing is saying: does this candidate fit this criterion partially or completely, or does the candidate fail to meet it? So let's say, five years of experience: this candidate
(24:55):
partially fits this criterion, because he's got three years of experience. And "has worked at a high-growing company": this candidate worked at Airbnb, so that's a full match. So then you can imagine you start having this kind of green-yellow view, per criterion, per candidate, and then, based on that, you get the score. Hopefully that makes sense. So you set up your job description, you set up your
(25:16):
criteria, and then those criteria start being analyzed for every candidate. So what you have at the end is a dashboard where you have all your candidates, with this color coding. If a candidate has everything green, it means they meet all your criteria, and they will probably have a 100 score. But maybe you have a candidate that fits almost everything, with a couple of those criteria partial, so they have an 80 score.
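To picture the green-yellow dashboard being described, here is a small sketch of the underlying data: one verdict per criterion per candidate, with a simple score falling out of the flags. The criterion names and point values are illustrative assumptions, not Brainerd's actual code:

```python
# Sketch of per-criterion verdicts rolling up into a score (illustrative, not Brainerd's code).
VERDICT_POINTS = {"meets": 1.0, "partial": 0.5, "fails": 0.0}  # assumed mapping

candidate = {
    "name": "Jane Doe",
    "verdicts": {  # one verdict per criterion, as described above
        "5+ years of experience": "partial",         # e.g. has 3 years -> yellow
        "Worked at a high-growing company": "meets",  # e.g. Airbnb -> green
        "Based in target location": "meets",
    },
}

def score(verdicts: dict) -> int:
    """Percentage of available points earned across all criteria."""
    points = sum(VERDICT_POINTS[v] for v in verdicts.values())
    return round(100 * points / len(verdicts))

print(score(candidate["verdicts"]))  # -> 83 for the example above
```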
(25:39):
And when we did that, that's when people started really understanding what the software does, really understanding how useful it would be, and things really started moving in the right direction. It was just a matter of not automating everything, not making everything so magic and so powerful, but actually taking a step back, giving more tools to the recruiter, and then letting the AI do a very objective analysis.
(26:00):
The AI is not telling you if it's the best candidate or not. It's just going to tell you: this candidate has more than five years of experience in Python — and that's it. Then it's up to you to move that candidate ahead, et cetera, and to set up the scoring. But that's something that came from just talking with customers and potential customers and really understanding their processes
(26:21):
and what they wanted to see. And I think that's a topic that we can discuss: AI can do so much. It could even be an AI avatar talking right now, instead of me — that technology is already possible. It's just that we are not really ready for that. We can't go from crawling to running. We have to go crawling, walking, and get there.
Speaker 1 (26:42):
Okay. So I have a question about scoring. I brought up scoring with those founders — the CEO of Workable, and I briefly touched upon it with BrightHire and Pillar. So Pillar, BrightHire and Workable are all staying away from scoring. There's not any kind of match score per se,
(27:04):
where they're saying, this is a 90 match based on criteria. They're not doing it in numbers; they're not even doing any kind of stack rank or upper percentile or best-fit batching or anything like that. Their focus is on organizing the data — I think that's how Nikos
(27:26):
put it — and packaging it in an easy format for hiring teams to essentially consume. Now, granted, these companies are focused down funnel, so they're in the interview process. But I specifically asked Nikos about scoring for top of funnel. I was like, you have a thousand applicants —
(27:46):
it would be good to be able to have some kind of matching criteria to essentially evaluate and score. And he basically — Elijah, do you remember what he said? I think he was basically getting into how mistakes can happen, this and the other. He shared a lot of really brilliant things; I learned a lot from him. I just didn't know if I totally understood or agreed on this
(28:07):
part, quite honestly. I'm a big advocate of trying to show relevancy, at a minimum as an evaluation, not just presenting data.
Speaker 4 (28:16):
But I think he might've meant an actual scorecard or evaluation perspective. If you go onto Workable's website and you look at their — I think it's workable-ai at the end of the URL — they do have something called a screening assistant that uses AI for semantic matching with candidate resumes, and they do
(28:43):
have, like, a best-match AI where somebody gets a 40 or 60.
Speaker 1 (28:45):
They do? Yeah, he didn't say that at all on there, 'cause I remember I asked him a few times.
Speaker 4 (28:49):
He might've thought you meant, right, an actual scorecard or evaluation. I don't know. Guillermo or Fede, how do you feel — can you speak to this?
Speaker 2 (29:00):
We can speak to this. Yeah, I can speak to this.
Speaker 4 (29:01):
Yeah, please do.
Speaker 2 (29:02):
Because this is what we do: we look at all of them, we try all of them. And, you know, the ATS is someone that we want to work with — we're not looking to compete with the ATS in any way. So, in some cases, some ATSs — Workable, and I'm sure some other ATSs — will eventually develop some sort of technology, and I'm not going to mention any names.
(29:24):
But when we do talk to the customers — this is not about any particular ATS, let's just talk about the model — they do mention these sorts of features: oh, they do have this thing, they do have this thing. And the normal feedback that we get is: yeah, but we don't know how it works. We don't understand where it's coming from, how it's making this decision — why this is the best match and why that is
(29:44):
the other one. It's definitely helpful, because the suggestions from the ATS tend to be accurate: if they say it's the best match, it probably is the best match. That's not incorrect. The problem is: how are you deciding that this is the best match, and why are the others not a best match? What's the logic behind that? And that — not to be repetitive — is why we
(30:05):
introduced the whole criteria concept, and that's why we talk about scoring based on: do you fit this criterion, yes or no? Yes or no? And then your score comes out of how many of those criteria you meet. And that's it. And I think for an ATS it's a much bigger product development. One thing is to just add a best match and AI analysis; another thing is to introduce a whole criteria definition and program. So I think that is what's happening.
(30:27):
Customers are like, yeah, we see the AI feature, but we don't really get it, we don't really use it, we don't understand where this is coming from. So it helps, but I think the recruiters and the talent acquisition managers want to take it one step further. And also, when you do a Brainerd analysis, it's not only about the best match. It's also about, maybe: don't look
(30:47):
at your top 10, look at the 10 below that, who may be really good for your next position. You may be able to say, hey, this candidate has 10 years of development experience. Yeah, he may not have this little thing that I need for this role, but he is going to be great for this other position — and that's information that the ATS is not going to give you; it's only going to tell you the best match. But what about the second-best match? That is so powerful as well, because sometimes your second-
(31:08):
best match tends to be the person that you end up hiring, and it's just a matter of a resume that was not showing the information well. Those AI tools are based purely on what's been written and what's on the job description. But what about information that wasn't on the job description but was important for you? Where does that sit? That's where,
(31:29):
again, the criteria thing comes in very handy: it allows you to put in criteria that may not be in your job description but are still critical for your process. And I understand why they don't want to put a score: when you put a score and you don't explain how the score is being calculated, it can be deceiving. So I understand why they end up moving away from the score. Unless
(31:50):
you give people the tools to control how the score is being calculated, like we do, it's better to just do a "best match" or "recommendations" kind of label.
Speaker 1 (32:05):
Yeah, one of the things he kept coming back to is: we don't want AI to make the decisions, we just want to organize the data. And I don't think assigning a score is that. No one is saying that the AI should be making the decision. It's just that when you have a thousand-plus applicants, we need a starting place — and it's just, yeah, organizing data. That's literally what it is. Scoring is still data. It's just doing it in a format where we can get the highest leverage, the best leverage on our time.
(32:27):
I think scoring is incredibly important. Maybe not down funnel, where you're comparing, like, four candidates — you may not need as much analysis on fit, because a hiring manager is like, okay, it's four people, we can look; the data can be packaged. That's a slightly different conversation. But when you're looking top of funnel, with volume, to me it feels
(32:51):
like that should be table stakes.
Speaker 2 (32:54):
It's almost like when you take an exam at school: they give you a score, and the only way to get your whole class to have some sort of relevancy between each other is that we all take the same test. We get 10 questions, and then we get to see how many we got right. In this case we're not giving the candidates a test, but in a way you're saying: hey, do you have this? Yes. Do you have this? Yes. This? No. All right, this is your score.
(33:15):
It doesn't mean that you're a better candidate or a better person. It's just a matter of taking a test, and that's what we're trying to do: organize the information and give you relevancy across every candidate. I think that's what you mentioned, and I think it's a great way to put it — it's a matter of how to organize this information.
Speaker 1 (33:32):
It's literally what junior recruiters or junior sourcers are doing: they're going through this motion, probably a lot slower, probably missing more, looking at resumes to find the relevant ones to push along to the recruiter. Elijah, you look like you want to say something?
Speaker 4 (33:48):
Yeah, I'm just curious. So do you find, if your customers have defined a scorecard in their ATS, that they copy and paste what they were using for the scorecard into Brainerd as the criteria, or is that different? I'm just curious if they're actually using the scorecard.
(34:08):
It seems like they should be, if they have one, right?
Speaker 3 (34:14):
Exactly. It's similar — like the same scorecard they are using. But we studied a lot and wrote a lot of guides about how to write good criteria. For example, quantify the criteria: it's not the same to put "experience in Python" as "2+ years of experience in Python" or "experience in Python in the
(34:35):
last two roles". Each implies a different level of accomplishment that the criterion is going to measure. Or, for example, we suggest they include things that weren't included in the job description. For example, you are targeting the competition and you want people that are working at the competition. So we write a lot of guides about how to write good
(34:57):
criteria, and yes, they are similar to the scorecard, but we include more things than the scorecard.
Speaker 1 (35:06):
That's a really good example — the whole competitor thing — because in a kickoff call, I'm sure Elijah and I have both been on a ton of calls where customers were like: do you have a target list of companies? Do you have companies you want us to stay away from?
Speaker 4 (35:18):
That's another good
one, right.
Speaker 3 (35:20):
Sometimes you want things like: they worked at a certain company in the past, right? There's all sorts of nuance to that. Right — we have a client that has a list of 300 companies that he's targeting, and he has a criterion with the names of those 300 companies, and we are telling him whether the resumes are aligned to that criterion or not.
(35:41):
Yeah, and he can put more weight on that criterion as well.
Speaker 2 (35:43):
So then that affects the scoring as well. If he gets a candidate that meets that criterion, it automatically bumps them up in the scoring, because, oh, this is very important to me. They may not have this other attribute, this other criterion, but they do have this one that is very important for me. Yeah, this is the interesting thing about having your own criteria on the side. And I want to also mention, for anyone
(36:06):
listening: when you upload your job description to Brainerd, we extract the criteria automatically. You don't have to do it from scratch. You start with a base that we already give you, based on our best recommendations. We have a little AI model that is trained on extracting criteria from the job description and giving them to you, just to save you time. So it probably only takes you maybe three to five minutes really to tune this.
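As a rough illustration of that extraction step, here is a sketch that asks an LLM to propose quantified, checkable criteria from a job description. The prompt and JSON shape are assumptions for illustration; Brainerd describes using its own trained model for this:

```python
# Illustrative sketch: extract screening criteria from a JD as structured JSON.
import json
from openai import OpenAI

client = OpenAI()

def extract_criteria(job_description: str) -> list[dict]:
    """Ask an LLM to propose quantified, checkable criteria from a job description."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content":
                "Extract screening criteria from the job description. Return JSON like "
                '{"criteria": [{"text": "...", "mandatory": true}]}. '
                "Prefer quantified criteria, e.g. '2+ years of Python'."},
            {"role": "user", "content": job_description},
        ],
    )
    return json.loads(response.choices[0].message.content)["criteria"]

for criterion in extract_criteria(open("job_description.txt").read()):
    print(criterion["mandatory"], "-", criterion["text"])
```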
(36:26):
It's not that you're starting from scratch, copy-pasting everything, and we're already embedding our best practices in there as well, to make the process much faster — because I feel like we're talking so much about it that it seems like a tedious process. It's not really tedious. It only takes you a couple of minutes to get it set up, and then you start tuning it. Once you get, like, 100 applicants, you're like, oh, this scoring — I need to go back. And then you go back to your criteria and you tune it: you
(36:47):
add more criteria, you reduce, you change the weights, and that affects the scoring, and you can see that on live data. You can continue to run the analysis on just those candidates that you already have there, which I think is very powerful, especially for staff augmentation as well. Say you have another role that is very similar to a previous one: you may need to open the role, you go back to the previous one, you tune the criteria, and then
(37:09):
you're like, oh, here's my other candidate, let's start with this one.
Speaker 1 (37:11):
Can we talk about weights for a minute? So you're talking about how you can weight the criteria. Can you walk us through that? Actually, I don't think we've talked about that yet. No, that's fine.
Speaker 2 (37:19):
Yeah. So what we hear from customers is that they want to give more weight to certain criteria — for example, like the example that Fede just gave about working for certain companies, or having more than five years of experience, again, with certain technologies. So they want those criteria to be above, let's say, another criterion that is not so
(37:42):
important — it could be a soft skill, it could be some technology or framework. I'm a technical person, so I always talk about tech roles, but in tech roles sometimes it's five years of experience as a developer, and maybe working with AWS is a plus. You know, in the job description they always say, oh, these technologies are a plus, or these technologies are
(38:03):
a bonus — you don't need to have them, but it would be really good if you do. Those go a bit below the ones like: five years of experience as a developer; you want them to have been a software engineer for at least five years; you want them to have graduated — sometimes they want people that graduated with master's degrees or technical degrees. Those things are very
(38:24):
important for the client, and they want a candidate who meets them bumped up above a candidate that only fits the more bonus-type criteria. So that's what we allow you to do: we allow you to give more weight, and that weight affects the scoring, so it ends up affecting how the candidate gets sorted in the table, because the table is sorted with top
(38:44):
scores at the top. Having the ability to weight the criteria has been very highly requested. When people understand the process, they're like, oh, how do I give more priority to this? Before we had weights, we had people actually putting the same criterion in more than once. They would put "five years of experience with Python" three times, because that would actually end up affecting the score — if you have it in there
(39:04):
three times, you win. So that's how people were working around this before we introduced weights on the criteria. But I don't know — what are your thoughts? Is that something that you guys consider, that some criteria are more important than others? I think it must be quite common in your industry as well.
Speaker 1 (39:20):
Yeah, yeah, I really like that functionality. It's not something I had actually thought of, but I feel like, yeah, if I was using your product, I would very soon be like, oh yeah, this is important, we need to be able to do this.
Speaker 2 (39:32):
Yeah, you don't want
to miss that one, you know.
Speaker 1 (39:35):
Yeah, for sure. So is it as simple as the product asking the customer what's most important? Is it really that simple? What does it look like?
Speaker 2 (39:44):
Yeah, we started with something as simple as saying whether a criterion is mandatory. We have two categories, mandatory requirements and non-mandatory, and we split the criteria into those two columns, because the mandatory ones you may want to have all green — for every criterion we give you a little color bar, right? So you have the mandatory criteria, which you probably want to see all green, and then you have the ideal
(40:08):
criteria, the ones that are not mandatory, where you're like, I'm okay if some of these are missing. That was the first division, the first thing that we introduced, and that already helps a lot of customers. And then, one step further — we don't have it live yet — is for you to say: this is very important, this is somewhat important, and this is just average. That will be the other dimension that we'll be
(40:30):
introducing — not only mandatory, but taking it one step further, saying this is very important, and then putting that into the scoring formula. The scoring formula is something that we publicly share with our customers. It's not a secret; we're like, here's the code, here is how we calculate the scoring. There are no magic numbers. There's no AI in the scoring — it's just purely looking at the
(40:51):
flags: yes, no, yes, partial.
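Since the scoring is described as deterministic — flags in, score out, no AI — here is a sketch of what such a formula could look like once weights and the mandatory split are added. The point values and weights are illustrative assumptions; Brainerd shares its actual formula with customers:

```python
# Deterministic, weight-aware scoring over criterion flags (illustrative formula).
FLAG_POINTS = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def weighted_score(results: list[dict]) -> int:
    """results: [{'flag': 'yes'|'partial'|'no', 'weight': float, 'mandatory': bool}, ...]"""
    earned = sum(FLAG_POINTS[r["flag"]] * r["weight"] for r in results)
    possible = sum(r["weight"] for r in results)
    return round(100 * earned / possible)

results = [
    {"flag": "yes",     "weight": 3.0, "mandatory": True},   # 5+ years Python, heavily weighted
    {"flag": "partial", "weight": 1.0, "mandatory": False},  # AWS is "a plus"
    {"flag": "no",      "weight": 1.0, "mandatory": False},  # master's degree, nice to have
]
print(weighted_score(results))  # -> 70

# The mandatory split is a separate, binary check on top of the score:
all_mandatory_met = all(r["flag"] == "yes" for r in results if r["mandatory"])
print(all_mandatory_met)  # -> True: the mandatory column is "all green"
```

Note how a weight of 3.0 has exactly the effect that pasting the same criterion in three times used to have — which is the workaround Guillermo says customers invented before weights existed.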
Speaker 3 (40:53):
Another important thing is that you can also filter by criteria. For example, if you only want to get those that have experience in Python, you can filter those candidates. And why is this important? Because companies also need to archive candidates. For example, if they want to archive a candidate because he's not
(41:14):
from the right location, they can filter the candidates that aren't from, I don't know, the United States, and archive them in bulk, instead of going into each single resume to archive that candidate. And that's important because of the candidate experience. It's not only about the good candidates, but also about
(41:35):
archiving, or giving a reason to those that applied and that you are not going to call. Yeah.
Speaker 2 (41:41):
I think we talk so much about looking for the best candidate, but the reality is, many times it's just about archiving all the ones that don't qualify. You get a thousand; maybe there are 500 that, with two filters, you can already archive. It's like, oh, do they have five years of experience? No? Okay, how many is that? 200.
(42:02):
You select them all, you archive them, and that already gets reflected in your ATS. So by the time you archive all the ones that don't fit your criteria, you're probably going to end up with a much shorter list. Even if you then want to go one by one, you can still do it. But it's the ability to just remove the noise, if you will.
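A sketch of the bulk-archive flow being described: filter candidates on a failed criterion, then archive them back to the ATS in one pass. The function names and the ATS client call are hypothetical stand-ins, not a real integration:

```python
# Illustrative bulk archiving: filter on criterion flags, then archive in one pass.
def fails_criterion(candidate: dict, criterion: str) -> bool:
    return candidate["verdicts"].get(criterion) == "fails"

def bulk_archive(candidates: list[dict], criterion: str, reason: str, ats) -> int:
    """Archive every candidate failing `criterion`; `ats.archive` is a stand-in for a real ATS API."""
    to_archive = [c for c in candidates if fails_criterion(c, criterion)]
    for candidate in to_archive:
        ats.archive(candidate["id"], reason=reason)  # hypothetical ATS client call
    return len(to_archive)

# e.g. archive everyone outside the target location, with a reason the candidate can be given:
# n = bulk_archive(candidates, "Based in the United States", "Role requires US location", ats)
```

Attaching a reason to each bulk action is what keeps this aligned with the candidate-experience point Fede raises: rejected applicants can be told why.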
Speaker 1 (42:26):
That makes a lot of
sense to me.
Elijah, do you have any other questions?
Speaker 4 (42:30):
Just a final question. So let's say, in your scenario, Guillermo, you did archive 500 of those candidates, but maybe a lot of those candidates would be relevant for other positions in the candidate database that they have. Does it suggest them? Can a candidate have a score for one role but
(42:52):
then a different score for another role? Yeah, I'm just curious, I don't know — could you have 15 scores, where you're a high score for this role based on this JD, but a much higher or lower score for that one?
Speaker 2 (43:06):
Totally! And I'm glad you're bringing this up, because it's very aligned to one of our latest features, which has to do with finding candidates in your database. So thank you for the flag.
But you know what we did: we allow you to search for candidates in your ATS. What we found from recruiters was that, because we are just
(43:27):
getting started with Brainerd, it's not so much the data we have on Brainerd — it's actually the data that you already have in your ATS. In many cases they already use tags, they already use stages; they already have this somehow organized. Not as good as having it in Brainerd, but there is some sort of system in general that they already have. So what we allow you to do is: you have your position, and then
(43:48):
you go on Brainerd and say, you know what, I want to search my ATS database for the candidates that applied to this other role, or candidates that have this tag, or candidates that have this source. There are many filters that you can apply, and with all those filters, we go get all those candidates from your ATS, we bring them into Brainerd, and we do the same analysis that we did for
(44:09):
the candidates that applied directly, the inbound candidates. We import the ones from your other positions in your ATS into Brainerd, we run the same analysis, and we basically allow you to spot really good candidates from your previous positions. That's what we have right now; that's something that customers are using right now. And we
(44:29):
want to take it even one step further, more aligned to your question: once you have enough candidates in Brainerd, you can start doing some sort of smart searching. You could go into your database and say, I want to get all the candidates that have more than five years of experience in Python, and Brainerd will be able to bring you just that. And then you start searching not by keyword, not by tags, but
(44:52):
by actual semantic meaning, by real things: candidates that worked at this company, candidates that do this. Because the AI is going to be able to run that analysis and find you the right candidates. So the more candidates we get into Brainerd, the more we're going to keep evolving this. I think the search in your database is a critical component, and it's something that
(45:12):
wasn't possible before. Keyword matching works, but you leave a lot on the table. And with keyword matching, people also game you, because they put all the keywords on their resume. The AI doesn't get confused by that — if the AI reads keywords, it doesn't matter. The AI will actually understand the text, understand the dates, and sort of how you've been moving around in
(45:33):
your career, et cetera, et cetera.
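Semantic search of this kind is typically built on text embeddings: each resume is embedded once, the query is embedded at search time, and cosine similarity ranks the results. A minimal sketch, with the embedding model choice as an assumption (Brainerd has not said which approach it uses):

```python
# Minimal semantic search over resumes with embeddings and cosine similarity.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in response.data])

resumes = [open(path).read() for path in ["resume_1.txt", "resume_2.txt", "resume_3.txt"]]
resume_vectors = embed(resumes)  # embed the whole database once, up front

query = "more than five years of experience with Python"
query_vector = embed([query])[0]

# Cosine similarity: keyword stuffing barely moves this, unlike Boolean search.
scores = resume_vectors @ query_vector / (
    np.linalg.norm(resume_vectors, axis=1) * np.linalg.norm(query_vector))
for rank in np.argsort(-scores):
    print(f"{scores[rank]:.3f}", resumes[rank][:60])
```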
Yeah, so we've started with importing from your ATS, and the next stage will be to search through your database in this way. But so far, what we found is the recruiters were telling us, I already have tags for —
Speaker 4 (45:51):
Or I think they call them, like, the second place or third place, or whatever they use for the process. Nice — yeah, that's really helpful, because I know, on the roles that I work on, whenever I need to search through a database and try to find candidates, or re-look at candidates, review a pipeline again, it's just irritating, honestly, right? You've either looked at the candidate before and now you're trying to reassess. It would be really nice to have a score based on objective
(46:15):
criteria, and not just trust the tags and, like you're saying, Boolean searches and keyword matching. Yeah, that's a powerful feature.
Speaker 2 (46:25):
Yeah, so we are getting there. It also has to do with — just one thing at a time. There are so many things we want to build, but yeah, that is one of them. And if you want, we can talk about where this is going, the other parts that we think are interesting. We see the search on the database as the number one.
(46:50):
The second one: what if you could take a step further and send customized rejection emails, not just a generic one? What if you could actually introduce a bit of feedback in your rejection emails? Or what about this: a candidate applies and already has a 90 or 95 score. Why don't you try to book a meeting already? Why don't you try to get the candidate on a phone call? If the candidate is already really good, why are you going to wait three days to reach out when you can just get them to book the meeting right away?
(47:12):
Imagine the candidate is applying, and we do the analysis live, and we can just get them to book a meeting to talk to the recruiter — so the recruiter wakes up the next day and already has a couple of meetings with potential candidates for the role.
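The book-a-meeting-right-away idea reduces to a simple rule on top of the live score. A sketch, where the threshold, the scheduler API, and the email stub are all assumptions:

```python
# Illustrative trigger: invite top-scoring applicants to book a call immediately.
AUTO_INVITE_THRESHOLD = 90  # assumed cutoff; tune per role

def on_application_scored(candidate: dict, score: int, scheduler) -> None:
    """Called right after the live analysis; `scheduler` stands in for a Calendly-style API."""
    if score >= AUTO_INVITE_THRESHOLD:
        link = scheduler.create_booking_link(event_type="intro-call")  # hypothetical call
        send_email(
            to=candidate["email"],
            subject="Next step: book a call with our recruiter",
            body=f"Thanks for applying! You can pick a time here: {link}",
        )

def send_email(to: str, subject: str, body: str) -> None:
    print(f"-> {to}: {subject}")  # stub; wire up SMTP or an email API in practice
```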
Speaker 1 (47:22):
That's cool. I really like that one. Just chiming in there — I think that's a really cool use case.
Speaker 2 (47:29):
And then the other one is about — you mentioned this before — resumes not having the right information. Many times, recruiters have a lot of questions for you when you go to apply. If we've all tried to find a job before, you know it personally: you go to apply for a job, you put in your resume, and then you have 10 questions that are like, do you have experience with this? And in my head I'm like, this is already in my resume —
(47:50):
what am I completing this form for? But I know how it works now: in the ATSs, they use that form to add tags and then do some Boolean searches and filters. But what if those questions could be dynamically generated? What if we just look at your resume and say, hey, it sounds like you have it all, but have you ever worked with this framework, or have you ever worked with AWS?
(48:11):
Maybe that little question is something the candidate didn't mention on the resume. They fill out the answer, we can put that back into the analysis, and now you save a lot of time, because the candidate was able to express something that they forgot to mention on the resume, and when the recruiter reviews it, they'll be like, oh, I see it in the question. So, no, not turning it into a full
(48:32):
interview — we don't want to be like, I think, that other company that you guys mentioned; we don't try to get so much into interviewing. But yes, you complement those knowledge gaps that sometimes people forget to mention in the resume, and it could make a big difference for the applicant. So yeah, those are some of the things — I haven't written them down here, but I feel like maybe you guys will find it interesting, and I'm curious to hear your thoughts about this stuff as well. You'll be my customer research. I love it.
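The dynamically generated questions could work roughly like this: run the criteria analysis first, then ask only about the criteria the resume left unresolved. A sketch with an illustrative prompt; the function name and flow are assumptions, not a described implementation:

```python
# Sketch: generate follow-up questions only for criteria the resume didn't settle.
from openai import OpenAI

client = OpenAI()

def followup_questions(resume: str, unresolved_criteria: list[str]) -> str:
    """Ask one short question per criterion the resume neither confirms nor rules out."""
    criteria_list = "\n".join(f"- {c}" for c in unresolved_criteria)
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Resume:\n{resume}\n\nThese criteria are not addressed in the resume:\n"
                       f"{criteria_list}\n\nWrite one short, direct application question per "
                       "criterion. Do not ask about anything the resume already covers.",
        }],
    )
    return response.choices[0].message.content

# The answers can then be fed back into the criteria analysis alongside the resume text.
```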
Speaker 1 (48:56):
Elijah, does anything come top of mind for you right now? No? Okay, go ahead then. Yeah, I guess I'm wondering, in terms of a feedback loop as candidates move down funnel — I'm just thinking out loud here, just thinking through this. As candidates are getting down funnel, I wonder if there's a way to evaluate: we said this is the role criteria, and we initially thought, okay, these folks are a 95% match,
(49:18):
but we're noticing the people at the final round are, like, 75% — they've matched different criteria. I wonder if there's any kind of feedback loop there. And maybe not — again, I literally just thought of this, but I'm just curious.
Speaker 2 (49:31):
Again, I literally just thought of this, but just curious.
I get it. Yeah, no, I think a lot of that goes on in the ATS. It has to do with the notes after the first phone screening. It gets a little harder there; it's not so easy for a third-party provider to get into that part of the process, because it becomes very sensitive information. You have the hiring manager putting in notes, and then you have
(49:51):
the notes from the recruiter and things like that, and ATSs don't necessarily share that information that easily. But I hear you that it may definitely all be linked together; there could be something further done there, but that feels like it's maybe not for right now. Right now it seems like a more sensitive process. What happens right after advancing the candidate?
(50:12):
We leave that to them, and then, yeah, I think every company has their own sort of little process there.
Speaker 1 (50:18):
What about you, Elijah? Go ahead, man.
Speaker 4 (50:19):
Oh, sorry, I was just going to say: part of your go-to-market, if you haven't considered it, could potentially be trying to power some matching features for different ATSs. Maybe they don't have the internal resources to develop this sort of feature. Being able to power that kind of thing in the background could
(50:41):
be really powerful for them, letting them have a differentiated product in the market. You don't have to go sell a thousand customers; you just get, at a lower rate, a thousand customers from some company X. Plus, if it went really well, it could turn into them just buying you out. I don't know, just something to consider.
Speaker 2 (50:59):
If anyone out there is listening and you match us with a buyer, we'll give you a finder's fee as well.
But no, I think the interesting part is, yes, 100%, you could become an infrastructure company, like the Stripe of this space, if you will, and I think that's a big dream. What's on our side is the fact that AI has this unique technology proposition:
(51:20):
you have an AI model that you are customizing, that we are training, and the more customers we get, the better we get at doing this. That becomes really valuable, because it's built just for you and for your company. And I can see how, if you wanted to start doing what we are doing, we're already one year ahead; you'd have to cover all that ground, and that's not something you can really speed up.
(51:40):
You need a lot of data, you need to train the model, you need to test it out, you need to make sure it works correctly, and that, yeah, becomes very valuable. So I could see how it could be an ATS, or it could even be a job board. Job boards try to get into that process as well, where they try to help you find the best matches for your job and things like that. There have been a couple of conversations, very early-stage conversations from very far out, but we'll see where that goes.
Speaker 1 (52:04):
All right, I have one more question. So we have Daniel Chait, co-founder and CEO of Greenhouse, on the show fairly often, about once every few months, and we've talked about AI on a few episodes. One of the things he talks about, he talks a lot about, is his concerns with AI. I think more than he shares excitement about potential applications, he's more just concerned.
(52:26):
He has a lot of big customers, yeah.
One of the things he talked about, from a resume matching perspective, is this arms race. He called it an arms race between AI resume evaluation tools, possibly similar to yours, and AI-generated resumes. He was talking about how you're seeing the market flooded by all these AI-generated resumes, and
(52:50):
then we're starting to see more evaluation tools for resumes, matching tools.
So I'm curious to get your thoughts on that topic. And then, more specifically, do you worry about that? Do you think, okay, this tool needs a way to try to figure out if a resume was generated by AI, or is
(53:11):
that really just not a concern? What do you think about this topic?
Speaker 2 (53:15):
Yeah, definitely. It's something we discuss, because once you get your resume made by the AI and the AI is evaluating your resume, it's just AI talking to AI. And I think, if you take it one step backwards, this is more long-term, right? I don't even know; this is my personal opinion. I think we have to look at the resume as a medium. The question here is: are we going to go after AI in general, or are
(53:37):
we just going to look at the way the process works right now?
It's a process that has been done this way for, I don't know, forever. We've always used the resume as a way to introduce ourselves to a company. And yeah, why does the resume have to be in a particular format, in a particular way? Why is the resume the best way to sell me as a candidate? If you think about it, most people have one resume and they send the same one to everybody.
(54:00):
Now there's this new trend where they tell you you should tweak your resume for the job position, and that makes a lot of sense, because why would I tell you about all this experience that doesn't apply to this job? Let me show you the relevant experience. So it does make sense that you should have almost a unique resume per job application. Maybe people don't have the time to do it and the tools are maybe not there yet, but it almost feels like you could have
(54:22):
one resume per job that you apply to, and the resume is maybe going to become more like a letter, something like: hey, this is my story, these are the places I've worked, but here's what you really care about. I did work with this technology, I did work with these people, and so on.
So, if anything, if the AI is writing the resume and the AI is doing the analysis, then I think,
(54:44):
candidates are just going to enter their work experience, they're going to answer some questions, and that's going to be the process. The resume will become something other than what we know it as today. AI technology is here to stay; we're not going to be able to just say, no, we don't want AI, we only want resumes in a Word document that we read, and that's it.
Speaker 1 (55:14):
It's just going to happen, so it's more about how we adapt to this process.
So I agree with you, I definitely agree with you. One of the examples he brought up, and everybody listening should definitely go back and listen to it, I think it's in the most recent episode I did with Daniel, but you can check the episode description; I think it's highlighted as a chapter, so you guys should definitely check that out too.
But he brought up AI being used for matching something. I don't know if it was being done in healthcare or something like that, but the gist was that the type of ML being used picked a very
(55:37):
random piece of information, a correlated trend associated with a ton of images, and then basically said that piece of data was what should be used to group them into a cluster. But it was something totally irrelevant, something that wasn't relevant to the decision at all, and he cited that as an example of why we have to be careful.
(55:58):
But again, this is happening. I think the tools are already becoming a lot more sophisticated than that; this isn't even basic clustering or anything like that, this is totally different. It was just an interesting conversation. This wasn't that long ago, probably within the past few months, and he was pretty, I don't want to say
(56:20):
pessimistic, but he was cautious. And I think, why wouldn't he be concerned?
Speaker 4 (56:26):
All of the enterprise companies using Greenhouse that you talk about, versus a smaller ATS with a bunch more startups. Yeah, I don't know. Anyways, go ahead.
Speaker 1 (56:37):
I don't know.
Speaker 2 (56:38):
There's definitely something there. We did something that was interesting; maybe let's talk about the bad side. Bias is something that is embedded in AI models; it exists, that's clear, and you can actually test it. A very simple test for anyone out there who wants to test the bias: you take a resume, you make three copies, and all you change in each copy is the name of the person, using names from different
(57:01):
ethnicities, different cultures. Then you submit those three resumes and you ask the AI: who is the better fit for this role? Same resumes, just different names, and the same job description. Ask whatever provider you want to try, and you'll be surprised. In some cases they will weight one candidate more than the other based on, maybe, what country they're from or what
(57:23):
name they have, and that's not cool.
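For anyone who wants to try that test, here's a minimal sketch of it: three identical resumes where only the name differs, combined into one prompt for whichever model you want to audit. The client call itself is omitted, and the example names are arbitrary:

```python
# Sketch of the name-swap bias test described above: identical resumes,
# different names only. Send `prompt` to the model you want to audit.

RESUME_BODY = "5 years of Python, 3 years of AWS, BSc in Computer Science."
JOB_DESCRIPTION = "Backend engineer: Python and AWS experience required."
NAMES = ["Emily Johnson", "Mohammed Al-Farsi", "Wei Zhang"]  # vary as you like

candidates = "\n\n".join(
    f"Candidate {i + 1}\nName: {name}\n{RESUME_BODY}"
    for i, name in enumerate(NAMES)
)
prompt = (
    f"Job description:\n{JOB_DESCRIPTION}\n\n{candidates}\n\n"
    "Which candidate is the best fit for this role, and why?"
)
print(prompt)
# Because everything except the names is identical, any preference the model
# expresses can only be coming from the name. That's the bias being tested.
```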
I can imagine the CEO of Greenhouse being like: that is really bad.
Yes, and we all agree that we don't want that to happen, and that's why we can't let the AI do the work for us. We can't let the AI make the decisions for us, and we also have to be very smart about the questions that we ask the AI. What do we ask the AI to do? Do we want it to just analyze everything and tell us who's the
(57:45):
best match? Or just to tell us this candidate has five years of experience, that's it?
Are you gonna put a child to doto start cooking?
Or maybe you're gonna get thechild to just ask you can you
pass me the eggs?
I'm doing pancakes with my kids.
Is he really doing all thecooking in the fire?
No, he says give me the eggs,grab the eggs.
And that is how we should treatai.
That's like a kid.
Not, he's not ready for themainstream and but he's really
(58:08):
good at those tasks for a kid.
He's really good at giving youthe eggs.
He's really good at giving youthe flower.
Speaker 1 (58:12):
I hope the analogy is good, but I just frame it that way.
Yeah, I think that makes a lot of sense. It's also about putting in the specific criteria of what good looks like. It doesn't sound like it's necessarily working from a database of "this is what good looks like"; it's more taking the search criteria and then using that to determine what good looks like. In terms of preventing bias, and I'm not an engineer: do you
(58:35):
train systems to specifically not make a decision based off certain information? Can you get it to ignore names completely, or is there a way to-
Speaker 2 (58:45):
No, you definitely can, but you do it by asking the AI not to give you an analysis. The whole thing here is that you don't want the AI to give you an analysis. You don't want the AI to give you conclusions, the way I'd say: this is a good candidate because of A, B, C. You don't want the AI to do that, because that's when the bias comes in; that's when you don't have control. Where is that reasoning coming from?
(59:06):
AI nowadays is not able to reason; reasoning is not something that AI can do. What AI can do today, from all this internet data, is come up with some sort of conclusion based on what it's been trained on, but it's not really thinking. It's really good at reading text, but it's not really good at coming to its own conclusions.
(59:27):
All it's doing is matching text and coming up with the most likely text that follows.
Sorry, it's a bit technical, but what I'm trying to say here is that you can't have the AI think. Because the AI can't think, and all it does is use the text it has been trained on, the data on the internet, it has bias. We all have bias; we write articles, blog posts, and things like that, and that bias is
(59:48):
embedded in the model, and it's hard not to have bias.
What you can do is just ask questions where the bias is almost none, objective questions: is this green or red? Is this blue or black? Is this black or white? Those are the questions you can ask the AI, and the AI is really good at that. And that's the new thing versus a keyword search: you can actually ask, has this candidate worked at Amazon
(01:00:09):
, yes or no? Has this candidate worked for more than five years at Amazon, yes or no? The AI can do that, and it can do it very well and very efficiently; it doesn't get confused, it doesn't overlook things, and it does it in a matter of seconds.
That is really the idea. It all has to do with how you design the software and how you prompt the AI and things like that, but the bias is something that is just always going to be there, I think, until we get to
(01:00:31):
the next version of AI, where AI will start thinking, and then I don't know what will happen.
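To illustrate the difference between the two kinds of questions, here's a sketch of the objective, yes/no style of prompt being described. The wording is mine, not Brainerd's:

```python
# Sketch: constrain the model to objective yes/no checks, one criterion at a
# time, instead of asking it for an overall judgment where bias can creep in.

CRITERIA = [
    "Has this candidate worked at Amazon?",
    "Has this candidate worked for more than five years at Amazon?",
]

def criterion_prompt(resume_text: str, question: str) -> str:
    return (
        f"Resume:\n{resume_text}\n\n"
        f"Question: {question}\n"
        "Answer strictly YES or NO based only on the resume text. "
        "Do not evaluate the candidate or give an opinion."
    )

resume = "Software engineer, Amazon, 2016-2023. Previously at a startup."
for q in CRITERIA:
    print(criterion_prompt(resume, q))
    print("---")
# The recruiter then sees a checklist of facts per candidate and draws the
# conclusions themselves; the model never says who is "better."
```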
Speaker 3 (01:00:37):
From another perspective also: humans have bias when they're analyzing a resume. So I prefer having the AI extract whether the candidate has five years of experience in Python or not, and then provide the recruiter with the full list of candidates and which criteria each one meets. And you can do that in Brainerd: you can hide the names and see the list of candidates
(01:01:01):
and which one meets which criteria. So we decided to put the AI to work on something it can do more productively than people and more impartially than people. That's the thing.
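One way to picture that name-blind criteria view, as a toy sketch with made-up field names rather than Brainerd's data model:

```python
# Toy sketch of a name-blind criteria matrix: the recruiter sees which
# anonymized candidate meets which criterion, with names hidden until later.

candidates = [
    {"name": "Jane Roe", "meets": {"5y Python": True, "AWS": True}},
    {"name": "John Doe", "meets": {"5y Python": True, "AWS": False}},
]
criteria = ["5y Python", "AWS"]

print("Candidate | " + " | ".join(criteria))
for i, cand in enumerate(candidates, start=1):
    # Names are replaced with an index so the review is criteria-only.
    row = " | ".join("yes" if cand["meets"][c] else "no" for c in criteria)
    print(f"#{i}        | {row}")
```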
Speaker 1 (01:01:14):
Yeah, it makes a lot of sense. And we're going a little bit over time today, which has been great. Thank you for spending more time recording with us. This was very insightful; I really enjoyed it. So I just wanted to say thank you so much for coming on the show and sharing all of your insight with us today.
Speaker 2 (01:01:33):
Thank you for having us, guys. I think it was a great conversation, and yeah, it would be interesting to do this again next year, not only from our perspective as a company, but just in terms of the trends in general in tech and HR. I think it's going to be a very interesting couple of years ahead of us.
Speaker 1 (01:01:50):
I think it's going to be a very interesting couple of years ahead of us, yeah, definitely. We have probably at least 15 more episodes in this series, so make sure to check those out and go back; in this series, I think this is our fifth or sixth episode. And for folks tuning in, if you're interested more in the AI topic as well, there are, I think, two or three episodes with Daniel Chait over at
(01:02:10):
Greenhouse about AI, and then also with Steve Bartel, co-founder and CEO of GEM. We've done a few episodes with him discussing how GEM is implementing AI and his thoughts on how AI is going to be incorporated into hiring. So we have a fair amount of really good content on the topic already, but we're really upleveling the amount of knowledge that we're going to be able to share with everybody here.
(01:02:32):
Anyways, thanks again, guys. It's been a great conversation. Elijah, as always, great to have you here too. Thanks. Cool, thanks for joining us. Talk to you soon. Bye.