Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
What are some of the cool ways that you are seeing frontline leaders leverage this to kind of work with the team, and not just help to close deals but maybe develop people, etc.?
Speaker 2 (00:11):
Everything that I mentioned can be really valuable for a frontline or second line leader.
You're increasingly able to personalize ChatGPT.
Where I have seen leaders take this to the next level is how they also help it understand your team.
For example, I have my team do user guides.
So with every member of my team, whether they're a frontline leader or whether they're an IC, we understand how they like to work.
We understand their working hours, what their preferences
(00:31):
are, their communication styles.
You can put as much of it in that as you want, whether this is a GPT and so on.
Suddenly you sort of have a version of that individual through which you can now ask, you know, ChatGPT to support communication to them.
You can ask it to, you know, gut check the sort of coaching that you're providing, as long as you are continually feeding it the feedback that you're giving,
(00:52):
the conversations that you're having, the stellar performance that you're seeing or the areas of growth and coaching that you want to provide.
Speaker 1 (00:59):
Where do you think we're headed?
What do you think people are going to be doing around Gen AI over the next, like, six months, or what's going to be possible in the next, you know, six to 12 months?
Speaker 2 (01:09):
What I will say is we're seeing some suggestions of it already: an AI-powered seller.
Speaker 1 (01:15):
Connor, I'm pumped for the conversation here, so why don't you tell people a little bit about your background and the role that you play at OpenAI?
Speaker 2 (01:24):
Yeah, sure, and excited to be here.
Thanks for having me.
So I lead our mid-market enterprise sales team at OpenAI.
So it's a pretty big segment by go-to-market standards.
It ranges from smaller-sized businesses all the way up to the lower end of true blue enterprise companies.
We have two products, so we sell ChatGPT Enterprise.
(01:45):
We have our API, and I guess the way I think about our team's role is, like, we translate the capability of these products into value for our customers.
So that's what I do at OpenAI, happy to expand on that role.
The last 10 years have mostly been spent at early-stage companies.
I was fortunate enough to start out as an SDR at a company that
(02:08):
I still think is pretty world class, AppFolio, cut my teeth there, and between that entry point and where I am today, I worked for about three different startups, some for close to five years, like Productboard, others not as long, didn't quite have the legs to go the distance.
We've all seen that movie, and mostly where I've really enjoyed
(02:28):
spending my time is sort of acting in that role of translator and working really closely with product teams and, especially as it relates to OpenAI, our folks in research and applied, and then understanding how we can make the sort of foundational work that they're doing understandable and valuable for our customers. And so I joined
(02:49):
OpenAI just about two years ago.
The go-to-market team at that time was, I think, probably fewer than 20 people, maybe fewer than 15, whereas today, I think, we're pushing 500.
Speaker 1 (03:00):
That's awesome.
Yeah, that's good.
It's a little different than, like, maybe it'll work out, maybe it won't.
It's like, I'm pretty sure this is a safe bet, I'm pretty sure this is going to be a thing, as a part of it.
Well, that's great, man.
Yeah, and I have to imagine, I mean for you guys in particular, just, you know, how much of your work and sales team
(03:23):
is like that translation. Because my first kind of set of questions are really around, you know, how revenue teams and go-to-market are actually implementing generative AI today. Because what I see is still a lot of, like, wild, wild west, right? Where it's like, oh yeah, these people have some licenses to this and these people have some licenses to this. And so what are you seeing
(03:43):
from, call it, like, best in class?
Like, what are the companies around go-to-market in particular?
Like, what are the things that you see?
Like, have they identified pain points?
It's not all about AI.
Like, what are the things you see from the people that get it and are, you know, kind of running forward towards some type of solution?
Speaker 2 (04:00):
Yeah, that's sort of the question.
Thankfully, a year ago I would probably tell you we're still sort of identifying exactly what that looks like, but now I think we have a much better sense.
We sort of moved from the era of experimentation and pilots and, like, is this hype, is this not, into real-world deployments, and that shift feels very stark.
(04:29):
And what I have noticed, when it comes to the workforces that, to use your words, really get it, is there is this concept around the importance of having an organization that is AI-fluent or AI-enabled, and they sort of view it in very clear steps.
The first is really just around the technology.
They know they need to give access to it.
This is table stakes, and it goes to your point.
It goes beyond, oh, we know this is available and so our team is
(04:51):
using it, or this was, like, turned on for free by insert cloud provider and so our team is using it. Rather, there's a lot of intention behind making sure that they are democratizing access across the org to the technology first and foremost.
The second thing that I noticed is that they recognize the criticality of being enabled and educated. Like, there is some
(05:12):
type of curriculum, whether that's external, whether that's internal.
They are leaning in to the level of education required to get to, at minimum, a 101 level of AI fluency across the organization.
And then there's this appreciation that goes beyond time savings in terms of the recognition of value, the
(05:32):
business justification for it.
What are the metrics?
It goes beyond access and education.
How are we actually measuring the efficacy or the value of this work?
Speaker 1 (05:40):
Real quick, Connor, is that usage-based?
When you're talking about that last one, because that resonates with me a lot, is it about usage, or are you guys trying to quantify business impact too?
Or maybe it's both?
Speaker 2 (05:54):
It is definitely both.
I think, whether this is in the context of pilots that we run or in a version of QBRs, it's always really important to help stakeholders, the buyers, the exec team to understand what that usage looks like, what the utilization is, and that itself has nuance.
It goes beyond, like, yeah, maybe it's being used daily, but in what way, and what models are being used?
(06:16):
Who are the teams using something like deep research, for example, or some of the more advanced models, to sort of back into those use cases?
But that only goes so far.
I think, you know, it's very reasonable for an exec to say, like, that's great if we're saving time, but where is that time being reinvested?
What's the higher-value work being done?
You know, it's a very common, very reasonable question, and so
(06:36):
we then have to go that extra distance to help map that back to the things these businesses care about.
Sometimes that's more straightforward and other times it's not, right?
This is very new and the technology is advancing very quickly, and so, at minimum, there needs to be this understanding of how teams are rethinking the way that they do work with AI-first principles in mind.
Speaker 1 (06:59):
Yeah, and I think for many leaders out there, you know, I want to ask you the question of what's stopping people.
The saying that I use frequently is: look, I think a lot of people are treating generative AI like we've got to have the perfect solution across every role, and it's like, guys, no,
(07:22):
you don't.
Generative AI is a department-by-department and use-case-based deployment, and I think so many leaders just let it get too big. And a lot of what I feel like I've been doing lately is like: stop, yes, we're going to be able to do all this stuff, but we're going to start with this group.
We're going to solve this challenge that they're having on pipeline generation, you know,
(07:43):
deal evaluation, whatever it might be, and just getting them to focus there, because I feel like there's very few people that are like, I get it big picture.
And then the other issue I see that's hurting deployments at times is sometimes IT is also behind on the art of possible.
Sometimes IT is like, well, especially when it comes to ChatGPT,
(08:07):
it's like, well, what about our data?
It's like, guys, that's been solved for, like, two years, you know, as a part of this.
So, again, I want to hear from you what you feel like is stopping people.
For me, I feel like they're making it too big.
It's like trying to say, what's our internet strategy, in 1996?
And it's like, well, the answer is, it depends on this person or
(08:28):
this person.
So I'd love to hear from you what you're seeing around what's holding people back.
Speaker 2 (08:36):
I think you said it pretty well.
We see something similar in the conversations that we're having.
I often characterize it, or we often characterize it, in three myths that slow companies down.
The first, you've touched on it: it's that we need to have this big, massive AI project and that is the only way that we're going to succeed.
(08:56):
What's the big bet?
That's not to say that you shouldn't have it, but many view that as the singular way that they'll be able to prove that it's valuable.
I think the second, and we did mention this for a moment there, is that general accessibility leads to adoption. Simply by having it there and inviting your
(09:18):
employees to raise their hand, and if they do, with a very clear, maybe big, project to use it for, they will then have access, and that that is then the thing that is needed.
And then I think the third is that there's just this sort of waiting game until, like, AI sort of, quote unquote, matures.
And I think those three things are probably where we've seen companies get really hung up, either on one or all of them,
(09:39):
when in reality, like, if you think about one, you know, the misconception that you need this big AI project, the reality is that AI literacy speeds up time to value and gets you closer to that big bet that you want.
The more that you put this technology in the hands of your employees and just have them start using it, the more likely it is that you'll, A, discover use cases that you hadn't thought
(10:01):
of previously, because you have subject matter experts using this for the stuff that they do best, but also, B, you'll get a much clearer picture of the rough edges of the current use, right?
So if you're using chat, you're maybe bumping up against, you know, you mentioned before we started, like, the usage of Operator. Like, where's the limitation of the way that you might be using this today?
(10:21):
And then what, maybe, do you have to take that extra step for, to sort of see a more sophisticated use case?
And then, for something like accessibility, adoption depends on the usefulness of the tool itself. And then, finally, when it comes to waiting for AI to mature, first movers are seeing impact.
I know that this is something you're seeing
(10:42):
every day, and so you really can't afford to wait, frankly.
Speaker 1 (10:48):
I like that.
I wrote it down, these three myths of Gen AI, call it deployment or accessibility. And then let's get tactical and maybe go kind of throughout the go-to-market roles, right?
You have kind of the vantage point of both being a sales leader, right, and also helping sales leaders or go-to-market
(11:09):
teams to implement.
So let's start with the front lines.
What are maybe the top one or two use cases where, when you're talking to a sales leader, you say: look, I know you want to do all these things, but here are one or two for frontline salespeople or frontline go-to-market folks?
What are the top few "start heres" that you typically hear or see?
Speaker 2 (11:34):
I think I'm, like, deciding between more than a few.
Oh yeah, go ahead.
Speaker 1 (11:41):
Give me a second.
Speaker 2 (11:44):
My mind is going a lot of different places, but I will say some of the first ones are probably at least obvious to you, maybe not obvious to everybody, but they're, like, productivity-driven use cases, right?
These are things around first mapping out the areas within your workflow that you just understand are sort of ripe for AI usage.
And this comes into the things that LLMs are really good
(12:05):
at: writing, summarization, data analysis and research.
And so I'll be specific about one.
I think deep research has sort of fundamentally changed the way that our team, and certainly that of many other go-to-market organizations, do their job.
Certainly in the world of sales, like, what used to take us hours, especially for enterprise reps who are genuinely doing the homework and showing up with a very
(12:29):
sharp point of view and a very deep understanding of their customer's business.
This was a labor of love for the best of the best, and suddenly you have deep research at your fingertips and you're now doing something in minutes that could take hours or days. And I'm seeing that even the sort of account directors, as we call them, on our team leverage this in all sorts of creative ways,
(12:50):
not only in an effort to sort of prepare for these conversations, but also they're doing some of the work on behalf of their customers to show them the art of possible, and that in and of itself is a form of selling.
When they, you know, deeply understand the problems that their customers are trying to solve and use our technology to take some early steps in terms of how they would think about solving them, you suddenly have,
(13:10):
like, asynchronous demos taking place, where they can send these research reports that are valuable to the customers before you even have a conversation with them.
Speaker 1 (13:18):
I love that.
Yeah, are they sending them? Again, we ran a play with our private equity partners where it was basically creating, like, white papers on trends for the company that they just invested in. And it's, like, an eight-page white paper on: here are the trends, the headwinds that this company is going to face, based on their market, and where we think we can help. And it crushed, right?
(13:40):
I mean, it's such a no-brainer.
That's why outbound is so frustrating right now, I feel like, because, to your point, the level of richness and insights and, like, specificity that I can get is higher than it's probably ever been.
But I feel like so many companies, when it comes to outbound, have gone the other way, right? It's like, how can we do more and automate everything,
(14:00):
you know, as a part of it. Which, there's certainly things that we can have. So okay, so productivity, again, kind of number one.
I think we're seeing the same thing.
What are some other use cases you're seeing at the frontline, or cool, like, again, you talked about deep research being one. Any other kind of frontline, either, you know, sales or account management use cases that you're
(14:20):
seeing across clients that are successful, yeah?
Speaker 2 (14:24):
I think another, and I'll sort of give a simple version and then maybe a more sophisticated version.
But a simple version of this is either a GPT or a project through which you are funneling the types of internal data and information that you might use. Think call transcripts, think notes that you've taken. And you're just feeding these in so that you can create a much more holistic picture of the work
(14:47):
that needs to be done for a strategic account, and then sort of breaking that out into a very specific project plan, but then also using chat as sort of the daily assistant to, like, hold you accountable to that plan, and then to do a lot of the work in terms of the sort of daily steps that you take to create a very,
(15:10):
like, valuable, differentiated experience for the customer.
And so this might be the way that you are leveraging deep research to send this. It could be how you are, like, analyzing a lot of the different conversations that you're having, to sort of challenge you on some of the blind spots that may exist or the areas within a deal in which you are exposed.
I think so much of what we do as sales leaders is we try to understand, in these engagements,
(15:30):
like, where are the exposure areas, like, what have we missed?
And chat's actually really good at that, the more context that you feed it. And I think projects have been a really good place where I've seen my team and other go-to-market organizations create a home for all of this.
I have folks pinging me all the time asking, like, when are we going to be able to share this across teams?
That's something our org is working on, but that's another
(15:51):
really good example of just, like, creating homes for each account, through which you can just continually bolster them with context and information, and then use that as a vehicle for ongoing coaching, actions to be taken, project management, and then also some of the customer-facing content that you create in terms of the stuff that you'll put into a deck or
(16:14):
into a proposal.
Speaker 1 (16:15):
Love that.
Yeah, that's a great one. And so you guys are using projects within ChatGPT.
Okay, cool, awesome.
Speaker 2 (16:22):
Yeah, I love that.
Projects and GPTs.
Speaker 1 (16:24):
Yeah, and I think that's the future I think a lot of people need to think about.
It's like, imagine you'll have this through line. Imagine the beautiful handoff processes that should be happening, right?
It's like, everything is there, it's all in one place.
You can query it, you know.
It's so easy now. Maybe if you have, like, a setup where you have SDRs and salespeople and an account manager, imagine how much this is going to improve the customer
(16:47):
experience.
Right.
Where it's like, I'm not repeating myself 50 times to the account.
Like, once I close the deal, the account manager is immediately up to speed on my business.
Like, that is a true, true game changer.
And we built a GPT, a custom GPT, called competitive deal winner.
And it's like, you upload your transcript, or whatever it might be, and it'll help to say, like, hey,
(17:08):
typically your deals have these five people, I don't hear any of those people on this call. Compared to the competition, like, concede this point, focus on this point.
Right? And, like, the ability to help stress test some of these as a frontline rep, it's a no-brainer.
I mean, the alternative used to be, and Connor, you remember this, it's like you would sit up at night and play out the scenarios.
(17:32):
You're like, okay, this person needs to be involved, and then we do this.
But it's now like, by creating a project using different GPTs, you know, you have your own, like, to your point.
We kind of use the word assistants a lot because we feel like it resonates, where it's like each role has your own little two or three buddies that are kind of sitting there helping you to do your day, and with your ChatGPT, I can @-mention the next buddy and bring him into that, and then @-mention the next one.
And so I think, for anybody who's in sales, you know, if you're on
(17:53):
the front lines or leadership, I think this concept of kind of creating a home for a deal in a project is kind of a no-brainer.
So I love that use case.
Let's go to the leaders now.
Ok, let's talk about maybe youknow, and we're going to go to
like executive leaders next, butlet's talk like frontline
leaders.
What are some of the cool waysthat you are seeing
(18:15):
frontline-line leaders, or youknow front or you know
director-level leaders leveragethis to kind of work with the
team and, you know, not justhelp to close deals, but maybe,
you know, develop people etcetera.
Speaker 2 (18:28):
Yeah, I mean, the use cases will blend, and then, in some cases, really sort of stand out as being very singular and specific to roles. And so, like, everything that I mentioned can be really valuable for a frontline or second-line leader in the way that they support their team.
But as you start to think about, like, for example, you're increasingly able to personalize something like, you know, ChatGPT,
(18:49):
through which it is more and more valuable in the way that it understands your body of work, like what your function is, how you communicate. And where I have seen leaders take this to the next level is how they also help ChatGPT, or, you know, whatever it is that you're using, also understand your team.
(19:12):
And so, for example, I have my team do user guides. And so, with every member of my team, whether they're a frontline leader, whether they're an IC, we understand how they like to work, we understand their working hours, what their preferences are, their communication styles. You can put as much of that in as you want, whether this is a GPT and so on.
(19:32):
And suddenly you sort of have a version of that individual through which you can now ask ChatGPT to support communication to them.
You can ask it to, you know, gut check the sort of coaching that you're providing. And as long as you are continually feeding it the feedback that you're giving, the conversations that you're having, the stellar performance that you're seeing or the areas of growth and coaching that you
(19:54):
want to provide, you now suddenly have this running list that, A, is valuable in the day-to-day conversations that you have.
You can consult it before one-on-ones.
But suddenly it's mid-year, time for check-ins and reviews, or you're doing annual performance reviews, and you have this rich history and context of your relationship that not only understands the work that you've done up to this point, but it understands that individual that you're responsible for, their
(20:15):
career growth and progress.
And not only does it save a ton of time, we all know how much time we spend during performance reviews, but it's substantive because it's backed by that sort of rich data set and context.
So that's one of the areas.
I think that's been really, really cool and super valuable
(20:35):
just from a person development standpoint.
Speaker 1 (20:37):
That is dope.
Yeah, can you explain to everybody, user guide, just in case they're not familiar with what that is?
Speaker 2 (20:42):
Yeah, in our case it's quite simple.
We initially did it in Notion, and we've sort of since moved it.
You know, it might now live in their individual GPTs or projects, but these user guides are a template that you can provide out to your team that asks them both basic and, increasingly, you know, maybe complex questions that help you
(21:02):
understand who they are as individuals, as human beings, not just workers, but then also how they like to work, what are their preferred working hours, what are the ways that they like to receive feedback or give it, what are their communication styles.
You know, you can get into, like, personality tests and things like that and incorporate that data if you like, but these are really good ways to sort of cut through the noise and understand, at a deeper level, who the human being is on the other side of
(21:26):
the desk from you, so that you can tailor your coaching accordingly, because, of course, you have to modify that at the individual level when you're working as a leader.
Speaker 1 (21:36):
That's so good.
Yeah, I'm definitely stealing this idea. And, so, you know, if you're a frontline leader out there and you're listening to this, for your team, are you creating, then, individual custom GPTs for each person, or is it like a team-wide one?
person, or is it like ateam-wide one?
Speaker 2 (21:52):
I would ChatGPT will
get so much better at this, at
just the universal level, whenyou think about it as the
orchestration law.
But in order to keep it reallyclean and consistent, I would
recommend GPTs for now, becausein all of the custom
instructions you can put inthose user guides and then you
can continue to add to it, andso the same goes with feedback
loops and so on, and suddenlyit's just your sort of partner
(22:14):
in the process of being a moreeffective leader.
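A minimal sketch of what "putting the user guide into a GPT's custom instructions" could look like in practice. The template and field names here are invented for illustration; nothing in it is an OpenAI feature or the speakers' actual template:

```python
# Hypothetical sketch: render one team member's user guide into the
# custom-instructions text for a per-person GPT. All field names are
# made up; swap in whatever your own user-guide template captures.
USER_GUIDE_TEMPLATE = """\
You are a coaching assistant for {name} ({role}).
Working hours: {hours}.
Preferred way to receive feedback: {feedback_style}.
Communication style: {comm_style}.
Use this context when drafting messages to, or coaching plans for, {name}.
"""

def render_instructions(guide: dict) -> str:
    """Fill the custom-instructions template from one filled-in user guide."""
    return USER_GUIDE_TEMPLATE.format(**guide)
```

The point of keeping it templated is consistency: every team member's GPT gets the same structure, and a leader can re-render and re-paste the instructions as the guide (or the feedback history) grows.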
Speaker 1 (22:16):
Oh my gosh, this is so good.
Oh my gosh, I'm just like, oh gosh, I cannot wait to talk to people about this use case. It's such a no-brainer one. You know, we talk a lot about, you know, a rep can use, like, your version of a good discovery call GPT, right, and then they're kind of putting their call recordings through it and presenting that to you in your one-to-one.
(22:36):
But this, to me, goes further. As a leader, you know, I think we like to think our minds are a steel trap. Let's be perfectly honest.
You know, as soon as we finish a one-to-one, we're kind of off to the next, you know, thing.
And this ability for me as a leader to have this kind of running dialogue, and, like, hey, we finished a call where we just had our goals one-on-one, put the transcript in there and it uploads it. That is gold, that is absolute gold,
(23:01):
in my opinion. Any other kind of gems around frontline leadership that you're seeing, that, you know, you feel like are moving the needle?
But, you know, I love this use case.
I'm also thinking we'll create a user guide GPT that will help them to fill out the user guide.
Yeah, and then it'll get the output, and then that can be your knowledge doc,
(23:22):
as a part of it.
Speaker 2 (23:24):
That's exactly right.
I think before we even started recording, we sort of talked about how hard it is to break habits.
Like, starting somewhere like a doc or Notion seemed obvious at the time, but, of course, like, you should start with chat. You have to remind yourself of that.
(23:45):
I think the other would just be, on a number of different levels, taking advantage of data analysis. And so, as a frontline leader, let's talk about that. You might be using something like conversation intelligence already, a Gong or a Chorus or insert any number of the ones that are out there, and you can have that.
But you can take that many layers deeper.
You can also take the information that you have around team performance.
You can use that plus deep research to understand sort of
(24:10):
the competitive landscape and compare that relative to what you're encountering in the field.
This can be, you know, information around your history and beyond. And maybe we'll talk about this if we have time, maybe we won't, but this starts to get more sophisticated as these frontline leaders, as they work with cross-functional partners, start to work with the technology in a
(24:30):
rawer form, maybe through the API, to think through replicating a lot of the work that an SDR once did, right? All the way up from enrichment to routing, to outreach and beyond.
We have certainly moved past what that function traditionally did and rely on our technology to do it.
(24:50):
And frontline managers have a role to play here, because they understand very deeply and intuitively what it is that these roles once performed. And so they're now as much architecting the responses and the quality and the nature of this work that's happening through the technology as they are actually working with the evolved roles themselves.
Speaker 1 (25:13):
All right, let's talk about this, because this is a good transition.
I want to talk also about, like, senior leaders and how execs, you know, in the go-to-market. But I want to talk more about data and analysis, right?
And maybe, where are the areas, you feel like, hey, if you had to put yourself as a frontline leader, director, if you had to name one or two use cases around data analysis that ChatGPT is
(25:38):
really good at now?
Are there a couple that kind of come to mind, of like, you know, okay, definitely this? And also the flip side: are there areas where you're like, look, it's not there yet, and why, right?
So are there a couple of kind of tactical data analysis ones?
Because, you know, the call transcript and kind of noticing patterns,
(25:58):
I was just talking to one of our partners about this earlier today, and it's like, great. Like, you know, with the API I can take all the Gong recordings, like, from Jake Dunlap, I can then run them through my call score GPT, you know, via the API, and then I can turn back: here are the trends for Jake. And it crushes, so good.
So that's one. But I have, you know, when I try to do more
(26:21):
detailed, like, data trend analysis of large amounts of data across reps, you know, are we there yet?
Or where is that level of, like, being able to use a custom GPT or even the API to really spot those meaty trends in individual performance?
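For readers who want the shape of the "run recordings through a call-score GPT via the API" workflow just described, here is a rough sketch. It assumes transcripts have already been exported (e.g. from Gong) as plain-text files; the model name, scoring prompt, and folder layout are all illustrative assumptions, not the speakers' actual setup:

```python
# Hedged sketch: score exported call transcripts with the OpenAI API.
# Assumes OPENAI_API_KEY is set in the environment and transcripts live
# in a folder as .txt files. Prompt and model choice are illustrative.
from pathlib import Path

SCORING_PROMPT = (
    "You are a sales call scorer. Rate the call 1-10 on discovery, "
    "objection handling, and next-step clarity, then summarize the trends."
)

def build_scoring_messages(transcript: str) -> list:
    """Build the chat messages for scoring one call transcript."""
    return [
        {"role": "system", "content": SCORING_PROMPT},
        {"role": "user", "content": transcript},
    ]

def score_transcripts(folder: str) -> dict:
    """Score every .txt transcript in a folder; returns {filename: score_text}."""
    from openai import OpenAI  # requires the `openai` package
    client = OpenAI()
    results = {}
    for path in sorted(Path(folder).glob("*.txt")):
        resp = client.chat.completions.create(
            model="gpt-4o",  # illustrative; use whichever model you have access to
            messages=build_scoring_messages(path.read_text()),
        )
        results[path.name] = resp.choices[0].message.content
    return results
```

Collecting the per-call outputs into one file and feeding that back to the model is then the natural way to ask for cross-call trends for a single rep, which is the "here are the trends for Jake" step.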
Speaker 2 (26:41):
I think we are there.
I think it will get better.
It often, and I know you probably understand this deeply, just comes down to the work that you're able to do with the data itself, like the quality and the nature and the categorization of the data.
And so we're at a point now, I think, where you may not have an
(27:03):
FLM, if they don't have these skills developed yet, take all of this data in its sort of unstructured form and supply it and get the type of analysis and insights that would help them really move the needle.
However, if they have a cross-functional partner that can help get that data into a state where the model can interpret it much more easily, suddenly you have something
(27:24):
there. And so it's a little bit less about what the model can do from a performance standpoint and more about what is necessary to get that data into such a state that the model can make sense of it.
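To make that "get the data into a state the model can interpret" step concrete, here is a hypothetical sketch of the kind of pre-processing a cross-functional partner might do: flattening loosely formatted call notes into a structured table before any model sees them. The note format, field names, and sample data are all invented for illustration:

```python
# Hypothetical pre-processing: turn pipe-delimited call notes into CSV
# (and structured aggregates) before handing them to a model for trend
# analysis. Note layout and fields are invented for illustration.
import csv
import io

RAW_NOTES = """\
Jake | 2024-05-02 | discovery | strong open, weak next steps | 6
Jake | 2024-05-09 | demo | clear value story | 8
Dana | 2024-05-03 | discovery | missed budget question | 5
"""

def notes_to_csv(raw: str) -> str:
    """Convert pipe-delimited notes into CSV with explicit headers."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["rep", "date", "stage", "summary", "score"])
    for line in raw.strip().splitlines():
        writer.writerow([field.strip() for field in line.split("|")])
    return out.getvalue()

def average_score_by_rep(raw: str) -> dict:
    """A structured aggregate that free-form notes alone wouldn't allow."""
    totals = {}
    for line in raw.strip().splitlines():
        rep, _, _, _, score = [f.strip() for f in line.split("|")]
        totals.setdefault(rep, []).append(int(score))
    return {rep: sum(s) / len(s) for rep, s in totals.items()}
```

Once the notes carry explicit columns, both a human and a model can answer questions like "which rep's discovery scores are slipping" directly, which is exactly the across-reps trend analysis discussed above.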
Speaker 1 (27:37):
Yeah, that's it.
Yeah, and I think, for a lot of people, that last part you said is critical. Like, don't throw the baby out with the bathwater, meaning it might not be giving you the insights because the way that the data is structured is just not as easy.
I mean, it's pretty good at taking a lot of, like, unstructured points, but we've had to kind of learn that lesson the hard way a few times, on, just, like, why can't we get it to
(27:59):
look at this data point?
And then, you know, we kind of figure out, oh, we need to feed it this way, or let's focus on, like, one or two people at a time, or something like that.
Speaker 2 (28:12):
Totally. And, you know, to be fair, companies have built up this massive amount of internal intelligence.
They have data, they have tools, they have insights, they have research.
It's the most valuable asset they have, in many cases, but most employees can't access it easily.
Right? We've had this whole job category of data scientists just to help solve this problem.
But that data is unstructured or it's hard to find.
And so I mention all this because I think, until recently, and still to a certain extent,
(28:34):
you have a lot of knowledge workers trying to work within their specific domain or function who then feel underwhelmed when they go into chat, either through lack of understanding or training on how to give the right context to the models to get valuable results.
They feel like, this is too generic, or, this is not relevant to my job, and then they go away.
And that's fair.
It's hard to, like, go in with expectations, see a result
(28:55):
that's lackluster, and then suddenly you're like, this is not for me. And so that kind of goes back to our initial comments around, like, the steps that are really important to take to support adoption at scale.
But it certainly isn't just plug and play. Like, in some cases it is, but in most cases there are those extra steps that really make the difference.
Speaker 1 (29:14):
Yeah, yeah, I think that's such a good callout.
We're just seeing it.
We did a session in January with one of our clients; they've got about 400 sales reps. I asked, how many of you use ChatGPT or something similar every day, or at least weekly? And probably 90% of hands went up.
Next question: how many of you have had any training from your
(29:37):
company on how to use it?
Every hand goes down, and I think that's your point.
You have to be enabling your team, or obviously partnering with a company like ours, right, shameless plug, to get the education.
We are used to talking to machines in, like, five words of broken English. We're so used to
(30:01):
talking to machines thinking, if I give it too much, Google quits: no search results. And so we're kind of reprogramming, and I think this is important for anybody in leadership.
It is your job as a leader.
This is like when companies had to lead their employees through the internet revolution and they had to train them.
They had to train Brenda how to send an email, and how
(30:23):
to use Google. And I think employers really need to realize you have to invest in this.
This is a transformation in the way we solve problems as humans, and it's not an easy one, because we're unlearning.
For me, it's 20-plus years of behavior that led to me being successful, and now I've got to reprogram that to solve
(30:44):
problems in a foundationally different way. And so I think it's just such a good callout around making sure that your teams are being enabled to learn how to use the tools, and it's your job, as a company, to do that.
So, all right, I've got a couple more here.
So let's talk exec level.
Okay, if I'm an exec, I'm a CRO or even a CEO, but I'm
(31:07):
go-to-market focused.
What are some of the ahas that, when leaders implement this, make them go, oh my God, my life is so much easier now that I'm doing X?
Speaker 2 (31:26):
I think there are a few things, and I'll keep it simple.
I'll begin with the primitives, and we can always expand to how we think about, at the company level, leveraging the technology to reimagine workflows, and how that might translate to a very new experience or a very important outcome for executives.
But if you think about it from the context of a VP or C-level, they are suddenly seeing, using the example of
(31:51):
an AI-enabled team, an organization that's reimagining the way it can approach the work that it's been doing the same way for a very long time.
And in our experience, we've used this example a few times, but for good reason: you have a company like Moderna, for example, and in probably
(32:14):
the first couple of months, or maybe it was within the first 90 days, they had a team that created 750 custom GPTs. And it wasn't just their researchers or their scientists; they had their legal team creating the contract GPT.
(32:34):
They had their marketing team creating the brand GPT, and what they saw was this transition from individual impact to team impact to organizational impact.
And so when you're an executive thinking about the importance of deploying this technology, it really is that sort of maturity curve.
That's when you start to feel the difference, when it goes
(32:57):
from individual to team to organization. Because, as we talked about at the beginning, initially it's like, wow, time savings, amazing, that's great. But what are we then doing with it?
And going back to the Moderna example, if you take something that they call the Dose ID GPT, basically this had the potential to boost the amount of work that they were
(33:17):
doing as a team.
They were comprehensively evaluating these extremely large amounts of data, and suddenly they're measuring this in the
(33:41):
context of how quickly they're either growing their business or reducing costs, because the organization is now reapplying that time saved into much higher-value areas of work.
Speaker 1 (33:58):
Yeah, I think that's great.
Once they start to see that type of behavior... We did something similar.
We brought everyone in the company together in March, blocked out three hours, and taught everyone how to create a custom GPT and what to do with it, and had everybody create their own custom GPT.
And once that starts to trickle up, they're like, oh,
(34:20):
I'm just going to do that for this problem now, and for this problem, and for this, and they see how easy it is.
So for the leaders where this type of behavior is happening, what do you think these execs are doing differently? Versus execs who are like, well, my enablement person's figuring this out, or my IT is. What are those execs doing differently
(34:43):
than execs that aren't seeing that type of individual, team, and company performance?
Speaker 2 (34:51):
It's a really good question.
The first thing that comes to mind is really quite simple, but you called it out earlier as something that you possibly hadn't seen in surveying a room, which is that the first thing they do is they just sponsor it.
Something that we encounter, and I'm sure that you do all the time, is that when you're speaking with frontline leaders or individual contributors, there is a certain uncertainty as to
(35:14):
whether or not they can use it at all, or whether they can use it for a specific use case, and there's a little bit of a reticence, frankly, to admit that they're using it.
There's kind of a culture, or at least this pervasive sort of concept, that by admitting that you're using AI, you are somehow cheating or taking shortcuts in your
(35:37):
work, and they don't want to admit that, and that's something that needs to be addressed very quickly.
And so where I see executives facing that head-on is, first, they are sponsoring it; they're leading from the front, not only through the way that they are communicating to the company. The communication is important: how they message this out to the company, and the way that they think about AI policy and AI governance and the effective,
(35:58):
safe usage of it, is very, very important. But then also how they use it themselves; they're leading by example.
I can't tell you how many exec hackathons we have run, where we're going on-site doing workshops with exec teams, and that's very often where the magic happens and you start to see a lot of light bulbs turn on. Because, to
(36:20):
your earlier point around the art of the possible, that's where that starts to become a reality.
I'd say the second thing that execs are doing is that they are orchestrating, or delegating to the right people, the types of curriculum and education that need to happen.
So they're sponsoring the hackathons, they're bringing in
(36:41):
folks to help educate their team on the right ways to use this, and then they're also elevating champions, and they're rewarding or highlighting or recognizing the good work that's being done with the technology itself.
And so when you're celebrating it and recognizing it, and then also sponsoring the safe, responsible use of it,
(37:01):
suddenly you see folks leaning in much more. And what really needs to happen is you need to see the proliferation of these use cases.
You want your employees sharing how they're doing it, because that's where innovation starts to happen and you get the flywheel going. And if employees don't feel like they can do that, the impact will be stymied. And so that's where I see
(37:22):
execs really make a difference.
Speaker 1 (37:23):
Yeah, the exec leader's like, yeah, you guys do it, but I'm still doing it this way. And you're like, well, that's not going to work.
And the other thing that I'd add, that we're seeing lead to success, is that for each department we're setting up Kanban boards of the custom GPTs, really making the departments take ownership over
(37:44):
solving their problems, and saying, okay, we're going to do these two first. And we create, like, business case validation.
What would it be used for?
Is this going to increase productivity by more than 20%, yes or no?
Great, move into testing.
I really feel like, with gen AI, each department in the go-to-market needs to have its own kind of roadmap.
Not a full product-type roadmap, but you've
(38:05):
got this kind of production-based roadmap, and we're seeing that lead to a lot of success.
Because then we get all these ideas that people have, and, great, Jake's running on this one. But we're also making sure that we're capturing all of those best practices so we can share them out.
Because we do see that too. It's like, oh yeah,
(38:26):
Stephen's team built this assistant, but not everybody is on a Team version of ChatGPT.
You're like, well, that doesn't make any sense. If he built that, why don't we share that with everybody?
Right, and so if you're on Enterprise or Team, you can do that.
So, all right, this has been awesome.
I literally have, and you can't see it, this scribble of notes on different points that are going to be
(38:48):
amazing.
But I've got one last question for you: where are we headed?
Okay, and obviously I know there's only so much you can say. I know we've got GPT-5 coming down the pipe here, and that's going to be pretty dope.
I'm pretty excited for that.
But where do you think we're headed?
(39:08):
And again, God knows where we'll be in two years. So how about over the next, you know, six to 12 months?
Speaker 2 (39:29):
Yeah, it's the pace of everything, and I appreciate that you gave six months.
I've heard five years many times.
Speaker 1 (39:35):
That's max. Five years?
Come on. Five years? Are you kidding me?
My robot's going to be over here interviewing... my avatar will be interviewing you in five years. My avatar will be interviewing your avatar in five years. Yeah, no kidding. But sometimes even six months can feel like a lifetime away.
Speaker 2 (39:55):
But what I will say is we're seeing some suggestions of it already, and it's making its rounds and, I think, in many cases becoming a bit of a buzzword. But this idea of agentic orchestration is real and meaningful, and I think we'll increasingly see that play out in ways that are much more accessible to your average user, in the context of something that I think most people at
(40:18):
this point are familiar with: a chat experience like ChatGPT.
Increasingly you will view this, I think, as the orchestration layer for work, wherever work is happening, and some of the stepping stones to that have been laid out in the context of things like GPTs and being able to connect those to data.
But as we increasingly release agentic offerings into
(40:42):
it, so you mentioned Operator before, and we have Deep Research; these are some of the first examples of agentic technology within chat. You can imagine that that continues to expand, and this concept of chat as a genuine assistant for work, wherever work is happening, begins to take very real, very meaningful shape as it increasingly understands more
(41:03):
and more about you. You know, there was memory, which was released not long ago.
You now have connectors through which you can connect it to your Google Drive or your email or your calendar, and you also have this concept of tasks, through which it can proactively prompt you to do things.
It's increasingly that sort of super assistant that you don't just go to here and there for one-off use cases.
(41:26):
It's this always-on assistant through which you're doing work.
I think that is a pretty safe way to think about some of the future potential of the technology.
And then, from an org perspective, I do think we'll see the pyramid flatten a little bit. You'll see more roles emerge that work very closely
(41:48):
with the technology to architect workflows, to drive change management.
And I think there will be a higher degree of expectation around the technical aptitude of sellers, frankly, and those in technical success will become the new force multiplier, and already have, frankly. All of us in
(42:09):
sales should hold ourselves accountable to raising the bar for what we are capable of from a technical success standpoint.
Speaker 1 (42:18):
I love that.
You were mentioning tasks just in passing, but for those of you who don't know, I agree. The future seller, man, I'm going to have 600 tasks running.
I've got my whole account book, and every day my tasks are looking: did they release a new annual report?
Did they release a new blog?
(42:40):
And so, you know, we're so hung up on these tools that are like intent signals.
Imagine a world where one seller has 600 tentacles going out every day, and it's coming back and saying, Jake, these are the four things that happened yesterday that are going to allow you to go out and have a better conversation with a current customer, or get your foot in the door with someone. And then it can run those through.
(43:00):
He runs it through the GPT, and it says, okay, this is how you should talk about it.
And then that's where, I think, for so many people, you turn your brain on. What I tell a lot of people:
(43:23):
look, if you're copying and pasting purely AI responses, you're not going to be employed, because the agents will start to do it.
So we have to realize that for the people in the loop, it's expanding my ability to do more and higher-quality work, but you still have to turn yourself on. And, to your point, you have to take ownership over learning the tools, not
(43:44):
sitting around waiting for your boss to teach you, although it would be good if they did that.
So, parting thoughts, Connor, parting thoughts.
This has been an awesome conversation.
Like I said, I've got literally a page of notes. Parting thoughts for anybody out there who's on the front lines or up to the exec team?
Speaker 2 (43:58):
Yeah, my guidance, at the broad level of just being asked what to do or how to get started with the technology, is usually the same, and it's very simple and maybe unimpressive: just go out there and start using it.
(44:19):
Too often we fall back into old habits, and it's reasonable. The pace is crazy.
Everything is moving very quickly.
It can be very overwhelming. But the fastest way to start to mitigate what I think may be a growing sense of anxiety, which is very fair, in a lot of people, in terms of, can I keep up with this? Am I doing enough? is, first, give yourself some grace.
It's okay if you don't feel like you're an expert yet.
(44:39):
There's a lot of great content out there, and you can take steps to slowly immerse yourself.
But the best way to do it is just to get in there and start using it, start innovating, and share what you're doing with your team or with your leaders, assuming that safe and responsible use is happening. Go out and use it, and challenge
(45:01):
yourself to try to learn something new each week with the technology, and see how you can deploy it in a way that's relevant and valuable to you in your personal and professional life.
And it just gets easier from there.
Speaker 1 (45:16):
That's right.
Try one thing.
I think that's great.
I tell everyone, I have this concept in time management: 80, 15, 5. 80% of the time you're doing things that are going to impact your world now and over the next month, 15% is the next six months, and 5% is that six-, 12-, 18-month horizon.
Dedicating an hour or two is absolutely critical.
(45:38):
This is not optional.
You have to learn this; it is the future.
So, Connor, I appreciate you, man. That was fun.
It was a good conversation. I enjoyed it. Absolutely, I enjoyed it as well. I appreciate you having me. All right, amazing.
Thank you very much for joining, everybody.
I hope you got a ton of value out of the episode.
Make sure to subscribe, if you are not subscribed already, to
(46:01):
the channel.
If you're listening on podcast, make sure that you get alerts for when new episodes come out, and download.
And, Connor, appreciate you, man. Great conversation. Thank you.