Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to the Sales Enablement Society Stories from the Trenches, where enablement practitioners share their real-world experiences. Get the scoop on what's happening inside sales enablement teams across the global SES member community. Each segment of Stories from the Trenches shares the good, the bad and the ugly practices of corporate sales enablement initiatives. Learn what worked, what didn't work
(00:25):
and how obstacles were eliminated by corporate teams and leadership. Kick back, grab a cold one and join host Paul Butterfield for casual conversations about the wide and varied profession of sales enablement, where there is never a one-size-fits-all solution.
Speaker 2 (00:39):
Hello and welcome to another episode of the Sales Enablement Society podcast, Stories from the Trenches, the only by-us, for-us podcast format that we're aware of, where we're bringing together enablement leaders from across the globe, hearing about the new and innovative things they're doing, the successes that they're seeing and sometimes even, just
(01:00):
as importantly, where they've fallen and failed and how they backed up and did even better the second time around. We learned from all of it. I am excited to introduce you to this week's guest. I'm going to start off with a nickname that a lot of you may know him by.
He goes by Coach K.
The name his mom gave him is Jonathan Kvarfordt.
(01:22):
Welcome, Jonathan.
Speaker 3 (01:23):
Thanks, Paul.
It's so good to be here. I've been looking forward to this.
Speaker 2 (01:26):
Thanks. For a little while, for a while. So the cool thing, for everybody that may not have seen your announcement, is you recently started a new role, so maybe share a little bit about that.
Speaker 3 (01:36):
Yeah, I actually just started last week, but I'm the head of revenue enablement for a startup called Simetrik. It's a reconciliation SaaS company out of Colombia, actually, but they're going international, really, and they needed some scaling help, which is why I'm there, to help them go to the next level.
Speaker 2 (01:51):
Wow. So you have the opportunity to help lead their go-to-market motions into North America.
Speaker 3 (01:56):
Yep, North America and Europe.
Speaker 2 (01:56):
It'll be fun. That will be fun. Well, congratulations. Thank you. I want to have a little fun before we get into the serious stuff we're going to talk about, and so we're going to do our signature Jimmy Kimmel challenge. Yeah, so Kimmel passes away. Through your connections, you're offered his show. You can have anybody you want as your first guest.
(02:17):
Who do you choose, and why did you bring them on?
Speaker 3 (02:22):
Well, after listening to the show many times, hearing everyone's stuff, I thought a lot about it, about famous people, but honestly, for me it would be my mom. She passed away two and a half years ago from cancer, and she's one of the people who inspires me to be the best I can be, and so I'd love just to... She would hate to be in front of an audience, but it'd be a lot of fun to pick her brain and just see what inspires her and
(02:44):
give her some kudos, too, for inspiring me.
Speaker 2 (02:46):
Wow, that's great. I'm almost positive that that is the first time a parent has come up in that context, so thank you for that. You're welcome. All right, AI. Unless you've been living under a rock, you've probably already been hearing a lot about it and hopefully maybe even using it a little bit. But what you wanted to take time and talk about with us today are
(03:09):
some very specific use cases and some of your experience with that. So let's jump right in. Again, unless people have been under a rock, they must know what AI is. But I think it's still helpful to start with a definition of AI in enablement, so let's kick off with that.
Speaker 3 (03:26):
It's kind of funny, because AI, to me, is like the sexy thing that everyone calls it, but it really comes down to predictive analytics, natural language processing and some sort of machine learning of some kind. So a lot of times people are calling predictive analytics AI just because it's the sexy thing to call it, but they're not exactly the same. No, I think it's good to understand the differences between those, because then you can really release the power in
(03:48):
each platform if you understand what it actually is, not just AI, because a lot of people get confused, thinking it's this other thing out there that thinks for itself and is going to take over the world, which is not what it really is.
Speaker 2 (04:00):
It's like years ago, I led a sales team at Intuit, and it wasn't the financial products, it was an online database product they acquired from MIT, and they called it... they always talked about it being in the cloud. It wasn't. It was actually a server farm out in the Northwest. The cloud sounded so much cooler in 2009, 2010. Yeah, right.
Speaker 3 (04:19):
Cool thing to say now. Yeah, it was.
Speaker 2 (04:22):
All right. So, just for those that might be wondering, what are the one or two key differences between predictive analytics and true AI?
Speaker 3 (04:30):
Yeah, so an actual AI is like its own entity that can think and solve problems by itself without much guidance. Predictive analytics is basically looking at patterns and saying, based on the stuff that I see, what is the most likely thing to happen next. So, like, if you put into a chat, please tell me the top 10 things, or the top 10 books that would be about business,
(04:50):
and it pops up 10 things, it's just doing that based on what's gone on in the past. And then, with natural language processing, that's where you have your Siri, anything that recognizes your voice, recognizes patterns of speech in most languages and can thereby give answers, because it recognizes our voices
(05:10):
and language, puts it into computer code and pops back out whatever the response is. So okay.
Speaker 2 (05:15):
All right. So where are you seeing the intersection between AI and your work in enablement?
Speaker 3 (05:22):
Yeah, it's a good question. I've used lots of tools. There's things from just making my life easier and not so messy, to making it a lot faster to get stuff done. So, for example, through the job interview process I had to put together a one-pager, and I went to Midjourney to do a couple of images to make the branding look like the brand I was looking for. Went to Canva to use their AI, mimicked a download of one of their
(05:46):
PDFs, and Canva duplicated their branding. Popped that into a thing.
Speaker 2 (05:50):
I didn't know we could do that.
That's awesome.
It's pretty cool.
Speaker 3 (05:54):
And then I put my picture in Midjourney so it looks branded for the product they're going for, and then created this interactive PDF out of it, which was freaking cool. And then the guys were like, how'd you do this? This is amazing. Like, well, it's a, you know, $10-a-month Canva tool. It's pretty easy.
Speaker 2 (06:10):
So that's... it was 1,000 hours of working time, but you should pay me for this, consultant style.
Speaker 3 (06:16):
So there's that. And then there's little things, like there's tools of all kinds, from voiceovers, where you can duplicate your voice and do lots of voiceovers to, you know, leverage your time, to things like, I think I posted one in the workspace that we have, there's a couple of AI tools that integrate with Slack that can, like, summarize chats going on, so in case you miss anything,
(06:38):
it can review it for you, which is super nice. So there's tons of stuff out there. It kind of depends on what pain you're feeling, what you want to solve for.
Speaker 2 (06:46):
Tell everybody about There's An AI For That, theresanaiforthat.com. I hadn't heard of that until you brought it up either.
Oh, really.
Speaker 3 (06:52):
Oh yeah. If you have not gone there, it's free to subscribe to it. They're just kind of like an aggregator of all things AI, and you can go in there and search for literally anything and figure out and find stuff. So if you want, like, an image generator that's free versus Midjourney, you can go in and find that. There's, like, coaching platforms, just all sorts of things in
(07:12):
there. So if you haven't gone there, go in there, start messing around and searching things. Most things that I find are either from the newsletter from them, or because I went in and researched out some options. So pretty cool stuff.
Speaker 2 (07:25):
We publish this podcast on Buzzsprout, that's our platform that feeds Apple and Google and everything from there, and just a few months ago they introduced new AI, which I find has reduced my workload significantly. Now, when I upload this transcript in the next couple of weeks, it will read it, it will suggest titles, it will write a
(07:47):
summary of the episode, and I usually end up tweaking that stuff, but the fact that they're just, boom, putting it out there and all I have to do is just make it in my voice, it's pretty cool. So even just little things like that, that weren't a big deal to do, but it's a lot better when somebody else, or AI, is helping you go through and do it.
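This is not how Buzzsprout actually builds that feature, but a rough sketch of the same idea, handing a transcript to an LLM for title options and a draft summary, could look like this; the file name and model are placeholder assumptions, and very long transcripts would need chunking first.

```python
# Sketch: draft episode titles and a summary from a transcript file.
# Assumes the transcript fits in the model's context window.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def draft_episode_copy(transcript_path: str) -> str:
    transcript = Path(transcript_path).read_text(encoding="utf-8")
    prompt = (
        "You are helping a podcast host prepare show notes.\n"
        "From the transcript below, suggest 5 episode titles and write a "
        "150-word episode summary.\n\n" + transcript
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(draft_episode_copy("episode_transcript.txt"))  # hypothetical file
```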
Speaker 3 (08:08):
It's kind of like when you and I met before. You had that note-taker, I had my own note-taker.
Speaker 2 (08:14):
There's lots of them out there. Yeah, Fireflies is amazing.
Speaker 3 (08:16):
There's a lot of good stuff out there for free that does all the summary for you on your calls. Like, why are you not using that? Yeah?
Speaker 2 (08:22):
No, that's a really nice, a really good point. Yeah. All right, so let's talk about, then... You talked about Canva, Midjourney, you went through all of that. Have you discovered or identified any potential downsides? Because so far, you know, everything we're talking about sounds like, you know, everything's good, but,
(08:44):
everything's awesome, right? What's the flip side? Or is there a flip side?
Speaker 3 (08:47):
Yeah, I think a lot of times... Well, there's several. I'm going to give you a few. One is, because content is so easily made, that means the market's going to be flooded with content that's not always the best. It's mediocre sometimes. I love ChatGPT for several reasons, but you can tell when
(09:08):
someone posts on LinkedIn a ChatGPT post.
Speaker 2 (09:10):
Oh my gosh, oh, yes,
yeah, yeah, yeah, yeah.
Speaker 3 (09:13):
And not that that's bad. I mean, sometimes it's good to see that, but it's just... I think it can lead to laziness and, like you said, I'd rather take what it produces for me and tweak it to my own style or words. But it can... maybe, sort of, it can make you become lazy if you let it. The other side of it is, a lot of people are concerned
(09:34):
about data privacy with AI, because of a lot of the integrations and some of the content you put in. Where does that go when you put in a prompt about some financial modeling? Who knows? And then the other one was, some of them, like Midjourney, it's its own language, and there's a learning curve around how to prompt Midjourney, because it's not like ChatGPT, where it's going to do what you want.
(09:54):
You have to kind of work it a little bit. Less intuitive. Yeah, it's awesome, it's a good tool, but to get the exact thing you want, you have to kind of finesse it a little bit. So there's a little education on it.
Speaker 2 (10:04):
Yeah, I know that LinkedIn's algo favors longer posts. I personally don't get that, because I look at LinkedIn on my phone more often than not, and if I have to scroll I'm rarely engaged enough to do that, but it is what it is. And I think the ChatGPT... the times when you read something that's just so
(10:28):
clearly written by some AI, it reminds me of when that newbie BDR rep reads, right? They read the script as opposed to internalizing it and talking to somebody. Yeah, there's something about ChatGPT style. It likes to be really fancy or formal, or I don't know what it
(10:50):
is. Anyway.
Speaker 3 (10:51):
So you can prompt it different ways. You can make it, like... I saw one post where someone said something like, make this more bro, and then I said, make it uber-bro. So, like, it kept going more. Like, I gotta try that one next time.
Speaker 2 (11:06):
That's actually pretty funny. Yeah, huh. All right, I gotta try that. That's a new thing to do, new thing to do. All right. So it sounds like you're saying that the downside is really if you're not being thoughtful with it, if you're not using this as a springboard and you're just expecting it to essentially do your work, the creative side of your work. And that's a really
(11:27):
good point about financials. I guess, how do you fact-check that? Right, it's a dumb tool in the sense that it synthesizes from whatever it finds on the web. So if it finds bad numbers on the web, it doesn't know that; it doesn't have a way to validate that. I'll bet that changes. It'll evolve to do that, I'm sure it will.
(11:48):
Points well taken. So, speaking of evolve... Yeah, based on your experience with AI, and I know you've done some research into this topic, what does the future hold for enablement teams using AI?
Speaker 3 (12:04):
I thought about this a lot, actually, and actually put a post up on LinkedIn about this a few weeks ago, because I was thinking about where it could go. But it made me think of one of the Iron Mans, I don't remember which one it was, or one of the Marvel movies with Iron Man, and he has that virtual thing. Number two?
Speaker 2 (12:18):
I hope not, I hated that one.
Speaker 3 (12:20):
It wasn't that one. It might have been Civil War, but anyways, he had this interactive thing where he watches a younger version of himself talk to his parents.
Speaker 2 (12:28):
Right, right, yeah, I
do.
Speaker 3 (12:29):
And so I was thinking about something like that, going, how cool would it be if we had an AI that could go out and research a persona, like literally the person, the VP of whatever at some company, look at all the articles, look at their job history, look at their company, and then create this AI persona of this person that a salesperson could then pitch to in practice and get ready for a big presentation, right? And
(12:52):
then actually get feedback from both AI and people on real-time stuff, because right now it's not to that level. You need a little more of an experienced sales director, or someone who knows what they're doing, to kind of give that feedback. But how cool would it be to have stuff from the actual company and person be the content that they produce, that you have to be ready for? You know, I think it'd be a blast.
Speaker 2 (13:13):
What about concerns
and objections?
Is it smart enough to do that, or is that where your sales leader has to filter?
Speaker 3 (13:19):
I would say... Well, I think it could go that way if it had, like... A lot of people, I don't think, know that with, like, ChatGPT or a lot of them, you have to kind of train it. So I think if you loaded it with a bunch of FAQs, it could probably replicate it, if you knew kind of the theory behind it. I look at it more like, it'd be cool if you could go into a Gong or Chorus, identify all the questions that are being asked, get
(13:41):
all the answers, have some way to fact-check those answers, and then create a thing out of it. So if you asked the question in a chat or by voice, it could replicate that answer in real time. It'd be really cool if we could do that.
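Here is a minimal sketch of that pipeline, assuming the call transcripts have already been exported from Gong or Chorus as plain text files (no vendor API is called here), with the fact-check left to a human reviewer; the model name and file paths are placeholders.

```python
# Sketch: pull the questions buyers actually ask out of exported call
# transcripts, draft answers grounded only in approved FAQ content, and flag
# anything the FAQ doesn't cover for SME review.
import glob
from pathlib import Path

from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # placeholder model name


def extract_questions(transcript: str) -> list[str]:
    """Ask the LLM to list the customer's questions, one per line."""
    reply = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content":
                   "List every question the customer asks in this sales call, "
                   "one per line, nothing else:\n\n" + transcript}],
    ).choices[0].message.content
    return [q.strip("- ").strip() for q in reply.splitlines() if q.strip()]


def draft_answer(question: str, approved_faq: str) -> str:
    """Draft an answer from approved content only; a human still verifies it."""
    return client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content":
                   "Using ONLY the approved FAQ below, answer the question. "
                   "If the FAQ doesn't cover it, reply 'NEEDS SME REVIEW'.\n\n"
                   f"FAQ:\n{approved_faq}\n\nQuestion: {question}"}],
    ).choices[0].message.content


if __name__ == "__main__":
    faq = Path("approved_faq.txt").read_text(encoding="utf-8")  # hypothetical file
    for path in glob.glob("call_transcripts/*.txt"):  # exported transcripts
        for question in extract_questions(Path(path).read_text(encoding="utf-8")):
            print(question, "->", draft_answer(question, faq))
```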
Speaker 2 (13:53):
You just sparked an interesting idea for me, so what do you think of this? In my experience, product marketing typically owns an ICP or personas; however, enablement is a big stakeholder in that, and typically RevOps is as well, right? Do you think that if a company doesn't have that ICP or persona
(14:17):
figured out, that they could get AI to, you know, put in some, like you say, FAQs or some data about the problems they solve and use cases and things like that, and it would be smart enough to start generating some ideas for customers they should go after? Not customers by company, but, yeah, general customer types.
Speaker 3 (14:36):
Yeah, that's
something that Thomas actually
went through.
On the sec thing.
He just went the same thing.
It said, hey, let's find yourideal persona job and train that
train.
Gonna quote chat.
You did say, okay, here's thejob description, here's a couple
of paragraphs about the company.
Give me the top concerns ofthis role for this product and
(14:57):
then brand that popped out abunch of stuff.
Wow, okay, that's the kind ofthing I suggest doing all the
time is like really getting someif you don't know, like Just
like that example, if you don'tknow exactly what they do or
what the response before, go, gograb a job description, fill it
up and say, okay, here's theproduct I'm trying to pitch.
What are some concerns thatcould have?
Now, the challenges with chatto be tea, specifically, is
(15:19):
limited to 2021 info, but if yougo into things like AI, ai, prm
, if it was called, yeah, AI.
PRM is a is like a Add-on youcan put to chat GBT and then has
access to more modern stuff andthen you can have real-time
2023 concerns.
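A small sketch of the prompt pattern described here, pasting in a job description plus a company blurb and asking for that persona's top concerns, might look like the following, assuming the OpenAI Python SDK; the file name, blurb, product and model are placeholders.

```python
# Sketch: turn a job description and a short company blurb into a list of the
# persona's likely concerns about the product you're pitching.
from pathlib import Path

from openai import OpenAI

client = OpenAI()


def persona_concerns(job_description: str, company_blurb: str, product: str) -> str:
    prompt = (
        "Act as the person described by this job description.\n\n"
        f"Job description:\n{job_description}\n\n"
        f"About the company:\n{company_blurb}\n\n"
        f"I sell: {product}\n\n"
        "List this persona's top 10 concerns or objections about buying this "
        "product, each with one sentence on why it matters to them."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(persona_concerns(
        job_description=Path("vp_finance_jd.txt").read_text(encoding="utf-8"),  # hypothetical
        company_blurb="Mid-market retailer expanding into three new countries.",
        product="a reconciliation SaaS platform",
    ))
```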
Speaker 2 (15:37):
So I saw Thomas speak on this in New York City. Yeah, but, gosh, that was way back in March, right? So it sounds like I missed a more recent thing, but he's, yeah, he's doing some amazing thought leadership on this topic, I definitely agree. One that I've heard about, but haven't had the chance to try in a live enablement environment, is analyzing 10-Ks.
(15:57):
I know, you know, in the past... because how many times do you hear that from sales leaders? Oh, they're a public company, go read their 10-K. But if a salesperson didn't go to B-school, they didn't really always know what to look for. And I know of people that are using that very effectively, teaching their reps how to use it: upload the 10-K and just get an analysis of, you know, the top takeaways and,
(16:21):
you know, from certain perspectives, that sort of thing. I thought that was pretty cool.
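One hedged way to do that 10-K exercise is a simple map-reduce pass: split the filing into chunks, summarize each from a seller's point of view, then roll the notes up into top takeaways. The sketch below assumes the OpenAI Python SDK; the chunk size, model name and file name are arbitrary placeholders, not any specific vendor's tool.

```python
# Sketch: map-reduce summarization of a 10-K for sellers.
from pathlib import Path

from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # placeholder model name
CHUNK_CHARS = 12_000   # rough chunk size; tune to the model's context window


def _ask(prompt: str) -> str:
    return client.chat.completions.create(
        model=MODEL, messages=[{"role": "user", "content": prompt}]
    ).choices[0].message.content


def analyze_10k(path: str) -> str:
    text = Path(path).read_text(encoding="utf-8")
    chunks = [text[i:i + CHUNK_CHARS] for i in range(0, len(text), CHUNK_CHARS)]
    # Map step: summarize each excerpt from a seller's point of view.
    notes = [
        _ask("Summarize the risks, priorities and initiatives in this 10-K "
             "excerpt that a B2B seller should know about:\n\n" + chunk)
        for chunk in chunks
    ]
    # Reduce step: roll the notes up into a short set of takeaways.
    return _ask(
        "Combine these notes into the top 5 takeaways a sales rep should "
        "reference when prospecting this company:\n\n" + "\n\n".join(notes)
    )


if __name__ == "__main__":
    print(analyze_10k("acme_corp_10k.txt"))  # hypothetical file
```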
Speaker 3 (16:26):
Yeah, I think there's a ton of stuff out there, but I think there's actually one... I'm going to look this up on, I mean, There's An AI For That. Right now I'm looking at 10-K financials. You can find anything.
Speaker 2 (16:37):
I think there might be... people I know doing it are just using ChatGPT. Oh yeah, you can totally do it. There may be a specialized tool as well, but to me that one's kind of a big deal, because in the past, for example at Vonage, we developed what we called a mini-MBA program for sales, and that had a few components to it. For example, our head of IT,
(16:57):
right, we had him do a session with us on who gets through his gatekeeper to him, because we sold to IT as one of our personas, like, who got through to Dara, and why did he choose to listen to that person versus the other, you know, 20, right? Yeah, we did a few things like that, but one of them was actually how to read them, how to analyze them, and, you know, make
(17:20):
use of that analysis in selling, and I think that's still a useful skill. But, man, with the acceleration of business and everything, just in the time I've been gone from Vonage, what, three, three and a half years? Yeah, a while. This is so much better, so much better. You talked about Iron Man earlier. What about, like, a Jarvis for sales?
(17:41):
That would be pretty cool. That would be awesome.
Speaker 3 (17:44):
I think they're in
close, honestly, with some of
the companies they have, likeClary and other places like
they're.
We're really advancing all thethings that can be capable of,
and I think the next step islike a Jarvis type AI, where
it's like a salesperson andthey're Jarvis chat, you BT
person Talking to them, say,okay, let me make this thing and
let's talk about this thing,and they explode it up and can
see some really cool diagrams.
I think it's gonna be way funto be a part of and, to be fair,
(18:07):
though, I thought you probablysaw that Gartner article about
how CRO is going to beimplementing like an AI specific
revenue generator manager whoyou want to call that title
there's got to be, with someonefocusing on that full time.
I think that's going to makehuge strides in the organization
, with revenues specifically,which would be fun.
Speaker 2 (18:27):
I agree, and not to scare anybody, but my thought with that is, there are some entry-level RevOps positions that are going to disappear too. Yeah, you know, which I think we all have to be a little cautious about, to one degree or another. But I mean, that's the kind of stuff where you don't need nearly as large a team, because you're not trying to generate all of
(18:49):
that, I think. Yeah, it'd be interesting to see how that evolves.
Speaker 3 (18:55):
I think you'll see it in RevOps or enablement, or maybe both, whatever. But I think there will be a position, sooner than later, that will be an AI-specific enablement role, whatever that title is, you know. But there will be something.
Speaker 2 (19:07):
That would be an interesting one to explore. Well, you know, you're going to be putting together a team sooner than later, so maybe you're the first one to try that out, an AI specialist.
Speaker 3 (19:17):
There you go, that's
right, you're right.
Speaker 2 (19:18):
An enablement AI specialist. AI enablement specialist. There you go. Very cool. All right. So AI is pretty cool. I mean, we're having a lot of fun talking about it here, but what is your advice for enablement teams, or maybe even companies? You know, because I know you've done a broad range of types of work when it comes to AI. Anything, any specific advice or
(19:44):
first steps to get started, anything like that, would be helpful.
Speaker 3 (19:49):
Well, I think it's really identifying what the core need is. I think with any tech tool of any kind, it's really easy to get the shiny new thing and be like, hey, we need to get this thing because it's so awesome and everybody else has it. Yeah. I just don't think that way. I think more of, like, what are we actually trying to solve for? It comes down to enablement basics. What are we trying to change? Is there a behavior? Is it a KBI? And what is it?
(20:11):
And is there a tool that will be the best fit to do that thing? Because you'll find most of the time that you could probably get a core of three or four techs and you'll cover 80 to 90% of what you need. Right, I agree. Versus saying, I need 10 things and have all the stuff, and it just makes it overwhelming for you and for the sales team and everyone else involved. So I'd say keep it simple with what you actually want to
(20:31):
change, and then, secondly, just start to research out some different options. Most of them have some sort of free offering. Like, I tried your Fireflies note-taker, I tried three or four of them. I'm just trying to figure out which one I like, you know. They all have different styles, yeah. Different styles, different approaches. And then, lastly, I'd say, don't be
(20:53):
afraid, like, don't be afraid of trying something new and figuring it out, because it's well worth it.
Speaker 2 (20:57):
On the other side, I think it would be helpful for our audience to get a little bit inside your head on how you've gone and done that, because probably a lot of them haven't really gone and done that level of evaluating and implementation and that sort of thing. So let's start with what you said on outcomes. Now, hopefully everybody in enablement by now, if they
(21:19):
haven't been already, is defining outcomes, or forcing their stakeholders to define outcomes with them, before they just start enabling stuff. But do you have a couple of examples of specific outcomes you've identified that AI is going to... where AI will shine over maybe other things, ways that we've been doing it?
Speaker 3 (21:40):
Yeah, so I'm actually in the middle of... I won't say the name of the company, but I'm actually in the middle of implementing a content... a CMS/LMS combo, which is built on an AI platform. Okay, and the cool thing is that in the old iterations of content management systems, I'd have to manually tag everything. My current company speaks three different languages. Yeah, I'm tagging...
(22:01):
It speaks three languages, Spanish, English and Portuguese. I don't speak Spanish or Portuguese, so for me to have to tag in Spanish and Portuguese, I'd have to have someone translate for me what in the world they're talking about. It would take forever. So this new AI auto-tags based on the content itself and the title and the language, and then it will pull up at the right time
(22:22):
or right sequencing, either in email or Salesforce, based on the persona, the sales stage and what they're looking for, which makes my life light-years better than what it would have been two years ago when I was doing this all manually, right?
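A rough sketch of what that auto-tagging step could look like is below, using an LLM to return language, persona and sales-stage tags as JSON instead of tagging by hand in three languages. The tag values, prompt and model name are illustrative assumptions, not the actual platform being implemented.

```python
# Sketch: ask an LLM to tag one content item with language, persona and
# sales-stage metadata, returned as strict JSON.
import json

from openai import OpenAI

client = OpenAI()

PERSONAS = ["finance ops", "revenue leader", "IT"]                    # example values
STAGES = ["discovery", "evaluation", "negotiation", "onboarding"]     # example values


def auto_tag(title: str, body: str) -> dict:
    prompt = (
        "Return a JSON object with keys 'language', 'persona', 'sales_stage' "
        f"and 'topics' (a list). Persona must be one of {PERSONAS}; sales_stage "
        f"must be one of {STAGES}.\n\nTitle: {title}\n\nContent:\n{body}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},  # ask for strict JSON back
    )
    return json.loads(response.choices[0].message.content)


if __name__ == "__main__":
    tags = auto_tag(
        "Guia de conciliação para times financeiros",                 # Portuguese example
        "Este guia explica como automatizar a conciliação de pagamentos...",
    )
    print(tags)  # e.g. {"language": "pt", "persona": "finance ops", ...}
```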
Speaker 2 (22:35):
Oh my gosh. Yeah, we went through that. We went through that at Instructure, because we had deployed an enablement platform that let us serve up content, right stage, right time, right persona, all of that, right in Salesforce. But the amount of meta-tagging that had to go in... And then I happened to have Spanish speakers and Portuguese speakers on my team, because a bid-writing team reported to me, and
(22:57):
then we had to bring them in to do everything you just said manually. I don't even know how many work hours that was. That's a pretty cool use case. That's a really cool use case.
Speaker 3 (23:08):
Yeah, I mean, and as an enablement person of one, I'm building it from scratch. I don't have time to do all this, like, I've got to do it now. So anything that can help me get that done quickly is, for me, just so helpful.
Speaker 2 (23:21):
Plus, you get to look like a badass because you did it with AI, right? Yeah.
Speaker 3 (23:24):
Because, "How'd you do this?" "Well, you know, I study Portuguese." That's right.
Speaker 2 (23:29):
And then the other question that your last answer sparked for me was, can you share some of the criteria that you've developed for yourself in evaluating? So you talked about the importance of evaluating platforms, not trying everything on earth. Any recommendation for people? What should they be looking for?
(23:50):
What do you use when you evaluate AI?
Speaker 3 (23:53):
Well, it really comes down, for me, to who's going to be using it. For me, it's more about how often will I use this tool. Is this a once-a-month thing or is this a once-a-day thing? That's number one. And then, two, for the team, specifically for revenue teams, it's more about, do they have to log in somewhere else, or can this go to where they are and help them?
(24:14):
Because if they have to go somewhere else, it's just another shiny tool that salespeople have to keep track of, or the CSM or whoever is using it. So ease of use, and being able to understand what they're going through in their own workday, is the best use case. And I've got to see it in action, which, of course, takes a demo or some sort of experience. But if anyone tries to convince me to go into their platform and
(24:36):
all this other stuff, I guess it's just not helpful. And then, like any other tech stack, I want to compare it to what else is out there, which is, again, why There's An AI For That is great, because you can look and see, you know, any other options you have out there, to see what's going to be the best for your situation.
Speaker 2 (24:55):
That makes sense, and I would think that, where we already should be re-evaluating our tech pieces on a regular basis, with AI we probably need to re-evaluate those at maybe 2x the speed, because this is evolving so much, and there are going to be tools online in six months that aren't there today. So, very good. Well, thank you. This has been a great conversation. But before
(25:18):
I let you go, I want to give you a chance to, you know, drop a truth bomb on us all. You've been given the gift of time travel and you can go back and you can coach young Jonathan on anything you want, but the only restriction is it has to be, you know, only one thing. So what would you choose? And tell us a little bit about that.
Speaker 3 (25:41):
Oh gosh, even in the moment... I've thought about this for weeks now, and I still have a hard time with this, but the biggest thing is to trust, trust myself. I know it sounds kind of cliche, but okay. There have been many times over the years I have doubted my career path, doubted my skill levels, doubted all sorts of things, and if I could have my future self come to me, I'd be
(26:02):
like, okay, just trust yourself, you're on the right path, it's going to be okay. Because a lot of times I think that doubt is what has held me back professionally, financially, all sorts of ways, because I didn't trust my initial instinct. So that makes sense. And if I could add one thing to that, it's, like, always, always be
(26:22):
progressing. Not perfect, but progress, whatever that is. One reason why I love being on the edge of things is because I love learning, I love not knowing stuff and figuring stuff out, and so it's always been like, hey, you don't know? Go figure it out. Don't be afraid of failing. It's a lot of fun.
Speaker 2 (26:37):
That was actually one of our values at GE: fail fast. You know, if you figure out that's not the right way, you move on and do it differently. Well, thanks. Appreciate your time. I'm sure you have sparked questions with some of the folks listening right now. Is LinkedIn the best way to connect with you, if anyone wants to follow up on this conversation?
Speaker 3 (26:59):
Yeah, please. LinkedIn, and probably the best way to connect is looking me up as Jonathan Kvarfordt, aka Coach K. I think it's JMKMBA on LinkedIn, that's my handle. I respond to all messages as long as you don't pitch-slap me. So if you do that, I may or may not respond.
Speaker 2 (27:15):
Yeah, and if you respond, they probably won't like it.
Speaker 3 (27:20):
They won't. I'll be, like, coaching them: let me tell you what you should have said. Yeah.
Speaker 2 (27:24):
Well, what I do is, I just disconnect. If I've chosen... I accepted the connection request, they hit me with that, I just... I'm busy. I'd just love to give them coaching, but yeah, all right. Well, thank you again. Appreciate the time, especially, you know, a week into your brand new job. So you've got a lot going on. And thank you to everybody that's invested 30 minutes of your time with us. Stay safe, come back in two weeks.
(27:44):
We'll have a new guest and new content.
Thank you, thank you.
Speaker 1 (27:47):
Thanks for joining this episode of Stories from the Trenches. For more sales enablement resources, be sure to join the Sales Enablement Society at sesociety.org. That's s-e-s-o-c-i-e-t-y dot org.