Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hey crafters. Just a reminder, this podcast is for informational and entertainment purposes only and should not be considered advice. The views and opinions expressed are our own and do not represent those of any company or business we currently work with, are associated with, or have worked with in the past. Thanks for tuning in to the FutureCraft podcast. Let's get it started.
(00:22): Hey there. Welcome to the FutureCraft GTM podcast, where we're exploring how AI is changing all things go-to-market, from awareness to loyalty and everything in between. I'm Ken Rodin, one of your guides on this exciting new journey.

I am Erin Mills, your other co-host, and together we're here to unpack the future of AI and go-to-market. We're going to share some best practices and how-tos, and interview industry leaders and pioneers who are paving the way in AI and go-to-market.
(00:50): All right, Ken, so how are you paving the way this week?

I am a big baker, and Sundays we bake in our house. I am looking for new recipes all the time, but one of the challenges I have is that sometimes those recipes don't have every detail you need to make the recipe. It might be a temperature thing, it might be the tools I have. Anyway, I always stumble into something. So what I've done is I've created a custom GPT called Idiot Proof Bake Buddy. What it does is I put the recipe in, and it will lay out the ingredients by the aisle of the grocery store where I walk through, so I can pick up the things I need, and then it will tell me the things I need to prep. So it's like: before you go to the grocery store you can do these things, and when you come back, make sure you preheat the oven and make sure you've got your butter set out. What this does is it prevents me from being an idiot and messing up a recipe, or not catching some subtle point that was maybe made in the instructions. It has been a huge lifesaver for me, and my partner's very happy with the results, because he gets to have baked goods that actually taste good.

And those baked goods are pretty phenomenal. I didn't know that you were using a GPT, but now I know why that baking has improved so much. So that's exciting.
(02:10): I owe GPTs a lot, and baking's one of them. Speaking of GPTs, my tip this week is maybe controversial, but I think it'll help our listeners quite a bit. Let's say I really like your GPT and I wanna build something similar, but maybe just a little different, for my own personal use. Obviously you wanna make sure to give any creator credit if you're doing external stuff. I may want to do something more about grilling, 'cause I'm not as good at the grill. So I can ask your GPT to give me the instructions on creating that GPT, and it will. You can also get the links and things like that that are feeding its knowledge base. It's another example of going a little meta with GPT and asking it to lay something out for you that you can then recreate for yourself.

I really like that. Do you have any tips on how you went about setting that up? If I wanna steal that idea from you, how would I do it?

Just ask it: give me the instructions for how I'd recreate you. You can go deeper and ask for the knowledge base, but it depends on what you're interested in creating. If the GPT is exactly what I want, I'll use the original creator's version that's available on the GPT Store. But a lot of times I want something a little more custom to what I'm doing, and so being able to get those instructions and edit them to create my own GPT is what I do. I just start by asking.
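For anyone who wants to try this meta-prompting trick programmatically rather than inside the ChatGPT interface, here is a minimal sketch using the OpenAI Python SDK. The model name and the Bake Buddy-style system prompt are illustrative stand-ins, not the hosts' actual GPT configuration.

```python
# Hypothetical sketch: approximating the "ask a GPT for its own setup" trick
# with the OpenAI Python SDK. The instructions string stands in for a custom
# GPT like Idiot Proof Bake Buddy; swap in your own.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

bake_buddy_instructions = (
    "You are a baking assistant. Given a recipe, list ingredients grouped by "
    "grocery-store aisle, then list prep steps to do before and after shopping."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name
    messages=[
        {"role": "system", "content": bake_buddy_instructions},
        {
            "role": "user",
            "content": (
                "Give me the instructions I would need to recreate a GPT like you, "
                "but focused on grilling instead of baking."
            ),
        },
    ],
)

print(response.choices[0].message.content)
```

In the ChatGPT UI the equivalent is simply pasting that last user message into the GPT's chat, as described above.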
(03:25): Yeah, it's a great call. It is a community and we are all able to draw from things. And you're right, if I find a GPT that I really like from someone, I keep using it. I use quite a few of yours. But sometimes it is more about personalizing it, like my grocery and baking guide, because I gotta idiot-proof it. And not only that, I might not have the same store that you have on the East Coast.

So yeah, different locations.
(03:47): All right. Speaking of being creative with ways to use AI, who do we have today?

We have one of the most creative people I know using AI. His name's Jonathan Kvarfordt. Some people call him Coach K. He is a university instructor, and he is the Head of Growth for Momentum.io as well as CEO of GTM AI Academy. So he's probably pretty well known to a lot of our listeners, and he's going to come talk to us about all of those things in our next session.

Great. Can't wait to talk to him.

(04:18): All right, and we're back. Today we've got Jonathan Kvarfordt. You've done so many different things.
(04:31): Can you introduce yourself?

My name is Jonathan Kvarfordt; some people call me Coach. Long story short, I was at a startup where there were 20 of us in a small office and 10 of us were named John. They got tired of all of us responding to John, and so the guy was like, hey, you're the coach of the team. I was a sales coach, and he was like, okay, you're going to be Coach K, 'cause no one can say your last name. And ever since then, it's stuck.

I'm currently the Head of GTM Growth at Momentum, a Series A startup and revenue orchestration platform. I've been running the GTM AI Academy for three years now, and I recently co-founded the AI Business Network with a friend of mine, which is broader, for leaders in business across finance and legal. Besides that, I advise people, I talk to people all the time about AI, and I teach a beginners class at Bryan University at night. I live, breathe, eat, sleep AI all the time, every day. So I love it, and I'm really a big fan of applying it to go-to-market.
(05:19): Thinking about the breadth of the work that you've done, you've spent the last few years helping go-to-market teams use AI in practical ways. So from your perspective, what's actually changing inside go-to-market teams right now? At a tech level, but also in how work's getting done.

Oh man. That's a large question. I think right now people in general are feeling pressure from two different directions. One is you see things like the Shopify CEO saying they won't hire unless they can find out for sure whether AI can do the job or not, which is a fascinating thing. Then I have a friend who's in another large organization, comparable to Shopify, and her boss said the same thing, except they also said: you need to learn how to use AI, but we're not giving you tools, we're not giving you time, we're not giving you training, we're not giving you a budget, but you gotta learn AI. And by the way, you can't use it on our stuff. So, like, how are we supposed to do this? There's this disparity going on right now where I think individuals are ready for AI and excited, but they either don't have the ability to use it internally, or the time or money, or whatever the case might be. But the higher-ups are saying, hey, let's get going with AI. They don't know how, they have no plan, and that kind of stuff. So it's fascinating to see that difference and change.
(06:28): The other side of it, I would say, is that there's just sheer overwhelm, with tools launching like crazy all over the place. With the models, like Gemini or Claude or ChatGPT, one of them is having an update literally every single week, so keeping up with it is extremely overwhelming. Between the pressure of work, people saying you gotta get better faster, and AI changing, it's becoming insanely overwhelming. That's why I always say start with where you are. Don't worry about trying to become the AI expert overnight. You're not going to, and even if you tried, you're not going to make it, 'cause it just moves too fast. But focus where you are to build out AI workflows or automations, and you'll see big differences quickly.

Yeah, it's funny. We often talk to people about this stuff and we're like, you're okay. Wherever you're at right now, you're fine and we'll figure it out, right? One thing that's really interesting to me, though, is your report. If you guys haven't checked it out, it's the Go-to-Market AI 2025 report. Really interesting. 84% of go-to-market leaders are saying that AI is a strategic priority, and I think that's coming from the top too. The CEOs are really pushing that it's a strategic priority, but only 24% are using it daily. So if you took away the folks that are not allowed to use it, where do you think the disconnect is for the folks that can?
(07:43): I think people are overconfident in what they think they can do with it, and we don't have a lot of examples of people using it really well. There's a big separation: you'll see people on LinkedIn posting crazy automation flows, and if you're a beginner just used to prompting ChatGPT, it's a different world. I have people come to me and say, I use AI every day. That's good. You also drive every day, but you're not a NASCAR driver. There's nothing wrong with that; it just means you haven't gone through the training. People think that because they use it every day, that's enough and they're good at it. And I'm like, there's a whole world that you gotta really dedicate yourself to, but most people can't, 'cause you're busy doing your sales job or your customer success job. That's why I'm always telling enablement people or leaders that it's incumbent upon them to make sure the enablement is such that someone can actually apply it and get good benefit out of it. It's not realistic to expect everyone to become a prompt engineer, so either the tech's gotta make that happen or the leader's gotta give the space for it to happen. At Momentum, one of the things we talk about a lot is that we don't want people to have to worry about the adoption cycle, 'cause that's the biggest pain in the neck with tech: adopting yet another tab or tool or something. AI in general should be the thing that least needs to be adopted, because it should work where you are versus you having to go to it. So there are some key differences and shifts that are changing.

You mentioned that when you talk to teams, you're meeting them where they are. But for teams that wanna start and are feeling overwhelmed, where would you typically recommend they focus first?
(09:01): Generally speaking, I would give them access to my GPT and say, go have a conversation with this. I made a GPT that answers this question. It asks, okay, what's your job? What's your task? It's trained to identify what's most important and impactful, and then to suggest, based on their level of comfort with AI or automations, okay, if you're brand new to AI, here are the things you should do right now. It takes their job tasks and says, you can do this with AI, and in the GPT's opinion, this is the number one thing you should do now. And you focus on that, and then you start going down the path and start knocking things off.
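As a rough illustration of that kind of setup (only a guess at it, not the actual GPT's configuration), the instructions for a "where do I start with AI" assistant might look something like the sketch below, expressed as a Python constant you could paste into a GPT builder or use as a system prompt.

```python
# Illustrative guess at system instructions for a "where do I start with AI" GPT,
# in the spirit of the one described above -- not its actual configuration.
WHERE_TO_START_INSTRUCTIONS = """
You help go-to-market professionals find their first AI use case.

1. Ask: what is your job, and which tasks take most of your week?
2. Ask: how comfortable are you with AI and automation today (beginner, intermediate, advanced)?
3. Rank their tasks by business impact and by how well current AI handles them.
4. Recommend exactly one task to start with, explain why, and give a first step
   sized to their comfort level.
5. Only after they report progress, suggest the next task.
"""

if __name__ == "__main__":
    # Paste this into a GPT builder's instructions field, or use it as a system prompt.
    print(WHERE_TO_START_INSTRUCTIONS)
```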
(09:37): I always tell people, and this probably comes from my life coaching days: if you're trying to lose 50 pounds, you don't focus on the 50 pounds, you focus on the first five, then you go to the next five. It's the same thing. You can't think, I'm going to automate everything in my business; that's insanely overwhelming. The thinking instead is: what is the thing that's going to help me the most right now? Then focus on that. Trying to eat the elephant in one swing doesn't work very well. A lot of people are getting this pressure, like the Shopify CEO saying apply AI to everything, and you're like, where do I start? Just saying, I'm going to start here, this is the most important, most impactful thing, I'm going to get this good, and then I'll move on to the next thing: that's what most people aren't doing. So just by doing that, you'll be farther ahead.

The other thing that folks really struggle with is that because there are so many tools, it almost feels like if I just use all these tools, I'm proficient in a way. As we all know, it's not about tools.
(10:27): You've mentioned in the past that you have a mental model: trust, education, and intent. Would you walk us through that?

So, trust, education, intent. When you say trust, I'm assuming you mean trust on all levels. Is that correct? Trust and intent, in general. I'll just talk about trust, 'cause it's a fascinating topic. We're talking to a lot of companies that are massive in size, and even smaller ones, and whether it's prompt engineering with beginners or massive companies trying to apply AI across their tech stack, the single most common thread is trust. And trust can take on many layers. It could be SOC 2 compliance and data security and governance. It could be the output of the AI. It could be trusting that the human understands enough to do things. There's trust on multiple factors.
(11:11): When I tell people about trust: when you're looking at tech, can you trust what you're putting into it? Number one, is it safe? Is it secure? Is there a zero data retention policy with OpenAI, as an example? That basically means there's absolutely no training on your data whatsoever, and that is a unique agreement that not everyone with an API has; you have to specially apply for it and get it. Momentum does; a few other people do. It's not common, and it's a question you need to ask vendors when you go into a technology: do you have a zero data retention policy with your AI vendors? Because if they don't, you need to know what is going on with the data. SOC 2 is another one. Governance: how do I know what's going on? What happens to the data going in?

And then the question is what's coming out. As a prompt engineer, I spend a lot of time on that: making sure the prompts can produce the results I'm looking for consistently, without issues or problems. That's not an easy thing to do, 'cause it usually means you have to have monster prompts that guide the AI in how to respond. And again, most people don't know that. They don't know the nuanced rules around prompting, so the results they get back are not always consistent or the same, and therefore they don't trust the output, because they don't know how to prompt.

Lastly, everyone has a different level of trust in the humans. There are people on your team who you may or may not trust a lot. Then you add AI onto it and you're like, I don't know if this person can really understand how to push the AI to get what we need. So all these things need to be mitigated. It takes time, and hopefully the goal is that you can trust the governance and security side of the AI tech, you can trust the output of it, and hopefully it's something where the human does not have to do a whole lot with it. It's more human in the loop versus human at the steering wheel.

Education makes a lot of sense.
(12:51): We'd love to get the intent piece.

Actually, I'm going to say one thing about education first, which is fascinating: in the report, we found that a signal for an AI tech purchase is education. You can tell if someone's about to buy an AI tech because they're looking into education, which I thought was a fascinating nugget.

On intent, this goes into a lot of different nuances, but AI is getting really good at understanding the intent of different things. You could talk about it from a marketing point of view. If someone puts a mediocre sentence into Perplexity or ChatGPT or Gemini or any tech that has access to the internet, the AI is trying to understand the intent of what the human is trying to communicate, then it goes out to the marketplace, to the internet, and tries to find something whose writing matches the intent of the person asking. So in marketing, instead of just talking SEO keywords, I talk about what intention I am trying to match with this article, which is different from keywords. I have to match intent versus match keywords. Just that alone: understanding what is my intent with this AI. The more clear you can be with your intent, and the more overly communicative you are, the better results you get back.
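A small sketch of the intent-versus-keyword idea, assuming the OpenAI embeddings endpoint; the model name and the two example sentences are made up. Two strings that share almost no keywords can still come back as close in meaning.

```python
# Sketch of "match intent, not keywords": two sentences with little keyword
# overlap can still score as semantically similar via embeddings.
# Assumes the OpenAI Python SDK; model name and sentences are illustrative.
import math
from openai import OpenAI

client = OpenAI()

query = "our reps waste hours retyping call notes into the CRM"
article = "Stop manual data entry: let AI log sales activity for you"

vectors = client.embeddings.create(
    model="text-embedding-3-small",  # assumed embedding model
    input=[query, article],
).data

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Higher cosine similarity = closer intent, even with different keywords.
print(round(cosine(vectors[0].embedding, vectors[1].embedding), 3))
```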
(13:55): Jonathan, there's a quote of yours that we loved: the best teams don't use the most AI, they use it the most intentionally. I'm curious how you've seen that in practice.

I feel like it's really easy to feel pressure, for all the reasons we've already said. I have a lot of people who say, what AI tech should I go get? I think that's the worst question to start with. It should be: what are we trying to accomplish? What's the prioritized outcome we're going for, and what is our gap, or the pains we're experiencing, while trying to get there? Teams that are able to identify the prioritized problem that is aligned to a business outcome, and based off of that are very intentional about making sure they leverage tech in a way that hits their goals, are the ones that usually hit their goals. The other way around is that people feel this pressure of, everyone's using AI, I should go get AI. And even if the tech is absolutely awesome, they don't get the results they want, 'cause they don't know what they're trying to impact in the first place.
(14:53): Another example is AI SDRs. It's a very hot topic right now, and I've seen a lot of teams roll out an AI SDR and fail. Not because the tech didn't work, but because they didn't have the information the tech needed to operate correctly. What's my go-to-market motion? What's my sales process? What's my ICP? What are their pains, and where do I usually find them? All this stuff that the AI needs to be trained on, they don't have identified, and they think the rollout is going to break some fundamental system. It's not; it's just going to reveal it. Then they blame the AI and say, this is crappy, we gotta get something better. It's not the AI. It's the same thing with prompting: people say the stuff I'm getting back from my prompts is horrible. It's not the AI, it's you. AI is the best self-reflective tool out there, because if you're not getting what you want, most of the time it's your fault. It's my fault as a person; it's not the AI. So going back to intentional: it's understanding the limitations of AI, but also being very prescriptive about what it can or cannot do, and then letting it do its thing and letting go a little bit.
(15:53): Yeah, I think that's right. The other thing, kind of going back to the tooling, is that there can be a lot of AI bloat. It leads to some of what you talked about with the AI SDR: if you don't have the mechanisms on the backend, it can expose some gaps in your go-to-market strategy. But I'm curious: a lot of folks just add these AI tools to try to solve different problems. How do the savviest leaders prevent that and focus on the outcomes?

It is a good question. When you say leaders, do you mean leaders trying to enable frontline people with ChatGPT, or do you mean an AI tech that goes across the team?

I think it's a little of both. There are leaders who are finding point solutions, and then there are those workflow-type tools that can accomplish the latter. I'm curious how you think about it.

There's two approaches. One is bottom up, and I think it's very powerful to give people access to a team's version of ChatGPT or Gemini. I think you should be enabling your team with something; even if they're mediocre, being able to do something faster is better. I would also enable them with education around it. If you can get someone even 10% better with AI now, it makes a big impact on their workflows, and that's if you don't automate anything, if all they do is just prompt. Having a little bit of nuance to help people would make a big difference. That's number one.
(17:13): But the real impact comes from the tech that can make a bigger impact on the bottom line of the business: profit, costs, revenue, customer happiness, that kind of stuff. Leaders who understand that drive that. The company that said, learn AI, but we have no tooling, no training, no process, no time, no nothing? That's bad leadership. People who know where this thing is going... like for me, I'll give you an example. I could take a team of 10 SDRs and reduce them down to two and have them do the same amount of work that the 10 used to do. I would rather keep all 10 and 10x them, so instead of having two doing the work of 10, I have 10 doing the work of a hundred. Part of it is a mindset shift of understanding what is actually important and thinking, yes, there is expense here, but what can I do to really amplify the human talent I have, versus trying to just reduce headcount and costs. I think we should be looking at dollars last, but there's all this pressure, and I think people need to have a longer-term view of what's actually going to help the business, which is the people. If you push your people to be the best they can be, you'll have more than enough outcomes to mitigate any cost.

I think you're seeing that with some companies who had an AI-first strategy and announced it early on, and now you see that they're hiring back.

Some people did that, yeah. Customer support: they got rid of a bunch of customer support people, and a month and a half ago they started hiring people back. It's a classic example.

What is a common "oh no" moment when go-to-market teams start experimenting with AI?
(18:27): Oh my gosh. I'll give an example. I'll go into a team at a Series B, Series C, or Series D company that has not yet really figured out what their sales process or exit criteria should be, and then they wanna automate stuff, and it creates just a mess. It creates a lot of problems. If you don't have your fundamentals in place, AI is just going to speed up the chaos you already have, and it makes it worse. Then you're going, oh shit, we have all this data that's wrong because of our own stuff. That's a big one.

AI can thrive when it's able to communicate across the board. So now you have MCP and agent-type stuff; there are all these things coming out. But our tech infrastructure was never meant to let every technology talk to each other. It does to an extent, but not really. So a lot of the "oh no" moments are coming from: oh crap, we have all this data in all these different places, but there's no easy way to connect it and let the AI thrive. That's another moment that pretty much everyone, I think, is going through right now. I'm pretty sure everyone has that.
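As a hedged illustration of the kind of connector that starts to close this gap, here is a tiny Model Context Protocol server sketch using the FastMCP helper from the Python MCP SDK. The tool name and the in-memory "CRM" data are hypothetical stand-ins for a real system of record.

```python
# Hypothetical sketch of an MCP server exposing one internal data source to an
# AI assistant. Uses the FastMCP helper from the Python MCP SDK; the tool and
# the hard-coded "CRM" rows are stand-ins for a real integration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("gtm-data")

FAKE_CRM = {
    "Acme Corp": {"stage": "Evaluation", "owner": "J. Smith", "arr": 48000},
}

@mcp.tool()
def get_account(name: str) -> str:
    """Return basic CRM fields for an account so an agent can reason over them."""
    record = FAKE_CRM.get(name)
    if record is None:
        return f"No account named {name}."
    return f"{name}: stage={record['stage']}, owner={record['owner']}, ARR={record['arr']}"

if __name__ == "__main__":
    mcp.run()  # serves over stdio so an MCP-capable client can connect
```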
(19:21): The other one is when people put a little too much trust into humans with AI prompting. You need to be very careful with this, because people think they know what they're doing, and then they prompt horribly and get outputs where you're like, what in the world did you just send to the CEO? I get messages myself where I'm like, okay, I know this is AI. I live and breathe AI. It's fine if you use AI, but be a little bit creative, come on. I can only imagine how many teams are going through that.

Yeah. I think along those lines, if you think about folks that are just starting to dip their toes in the water, some of the early adopters that are out there, and maybe the hesitant teams: what's one story where a team has really shifted their mindset?

Yeah, I think that mindset comes first. The funny thing is that too many times the tools come in and the mindset doesn't.
(20:02): I call 'em the four or five C's. A couple of mindset things are super critical. One is creative: you gotta let your team be creative in how they apply things; you can find some really cool, fun stuff. A lot of the things I do in AI have come from me just asking the question, I wonder if I can do this. But it means I have to be creative and have the space to fail, 'cause sometimes it doesn't work. And that's okay, because I'm able to be critical enough. So two things go hand in hand: creative and critical. I'm not critical of myself or others, I'm critical of the AI; I wanna push it to be the best it can be. And then collaborative, like being able to mix your minds. This all takes mindset sharing and giving people the space to play together with AI, see what results they get, and experiment. So: creative, critical, collaborative. There's probably one more C; it'll come back to me in a second. Oh, coachable. That's just being willing to learn, because I know AI experts, and even the experts are saying it's so hard to keep up, and you have to be moldable. If you're not willing to be coachable and to learn, well, what you learn this week is going to expire by next week anyway. So you gotta be in this constant state of learning and changing, and humans as a whole aren't used to that. I've been in a mode where I'm changing things all the time, even prompting. I'm like, this used to work, but now it doesn't, 'cause I'm in o3 or o4, or I'm in a different technology. So if you have those four things from a mindset perspective, you'll be great, but if you don't, you're going to struggle a lot. And I'll say that double for leaders: they've gotta be willing to let go of some of that control, otherwise that creative, critical, collaborative, and coachable can't come in, including for me as a leader.
(21:37): I saw back in October that Deloitte released a report on board members, from massive companies all the way down to Series B. The biggest thing was that they weren't even talking about AI, and not only were they not talking about AI, none of them had even used AI. So these people who are making the decisions about bringing AI into these massive companies aren't even touching it. My opinion is, if you've never touched ChatGPT, don't tell me what I should be doing with AI. You're not the person to be talking to. So I think all those mindset skills go across the board for everybody, 'cause otherwise the dinosaurs are going to go extinct very quickly.

Yeah, I think that's right. I always think about curiosity as being the thing I look for, and hire for even. You wanna have that thirst, and I think that applies to the AI piece. So I like the four C's you have; I'll take yours and tag on number five, curiosity.

I love it.

Yeah, it's funny. I've been talking to some leaders whose board is saying, you need to worry about the risks of AI, I talked to ChatGPT and it told me this. And that's not right. I was like, congratulations, here's your prize, Mr. Board Member. But you're right: people who aren't using it and being curious, their perspective is going to date them and leave them behind at some point.

I very much agree.
(22:53): I wanna talk about another tension: the relationship between sales and marketing. As you think about helping teams move forward and implement AI from a go-to-market perspective, how do you connect those two teams from a working perspective, when sales wants speed and pipeline, marketing has other priorities, and AI is kind of caught in the middle?

My honest opinion is: I'm a marketer, and my number one priority is pipeline and revenue. So I think if some marketer is saying it's not that, then they're not the right marketer. That's just my opinion. Now, with that being said, I think there are two different states to talk about: the current state and the future state. If everyone's not aligned to the main business outcomes, then they shouldn't be in go-to-market leadership, personally. When I was in enablement, I wasn't even in marketing, and the number I got bonused on was revenue and pipeline. I think everyone across the board, I don't care what your position is, if you're working with the revenue team, you should be tagged to the revenue and pipeline number. With that being said, does marketing have other priorities, like SEO? Yes, but it's all for the end goal of pipeline.
(23:55): I also think I'm a little bit different, 'cause I'm not a historical marketer. I'm like a rev ops slash enablement slash marketer all at one place with Momentum, so I dive into the numbers all the time. I'm looking at how each lead source is converting through each stage, how the actual customer is closing, and how they're behaving after the close. So my marketing is more about the lifetime value and lifetime journey of the customer versus just the MQL. I think that's one of the problems with marketing we gotta get over: it's about the entire cycle and lifetime of a customer. It's not just getting opps in the door; it's how are they converting, are they selling? Because if I bring you a ton of MQLs and none of them convert, what the crap am I doing?

On the sales side, they also need to hold up their end, and it's a delicate balance, because I've seen a lot of people at a lot of different companies, and I've only been on one team where I've had maybe two people be really good at discovery; most of 'em are horrible. So I could bring the warmest, best lead into it, and most teams, unless the buyer is ready to buy, do a mediocre job of selling. They can get it through, but most of 'em are just not great at it. My job is to bring in people who are warm and aware of the company. Their job is to capture that and move it forward. It's not always going to be a perfect setup; sometimes the lead can be super warm, sometimes it's not. So salespeople absolutely have to be ready to take the easy ones, obviously, but then be willing to work and show value where it can be shown. The nice thing is that AI helps both sides very easily.
(25:10): Now, with that said, I'm going to shift to the future state, and this is where it's going to get interesting. Sam Altman and a bunch of other people have talked about how at some point there's going to be some sort of one-person, billion-dollar-valuation company. We're not there yet. I have a friend who owns Swan AI; his name's Amos. They have three founders, and their goal is to get to $30 million with three people and only AI agents. In that world, you have a tech person, a sales and marketing person, and then a customer person. That's it. But I don't know if every technology, every team, is going to get there, 'cause going from a team of 10,000 to three is not possible. I think in general you will start to see this shrinkage of particular roles. That doesn't necessarily mean the person can't stay in the company, but it'll shift what they do. It could be that an SDR goes from being an SDR to an AI agent manager, or the sales and marketing team goes from 10 on the marketing team and 50 on the sales team to five in marketing and 20 in sales. It doesn't mean they're gone from the team, it just means the role itself is not as needed. I think it's just going to shift where that goes, and if things get closer and smaller, I think leadership will then shrink as well, 'cause you don't need as much leadership to manage all the different people. So then marketing and sales won't have the conflict as much, 'cause it'll be one person instead of two with the same number, which is revenue and pipeline. That's it.
You hear about Microsoft eliminating manager positions. They're looking for something different from managers: they're looking for them to be player-coaches who understand AI and its limitations, but also what's possible creatively. They're not just looking for people to make sure that employees are doing their job and giving them a performance review once a year.

Yeah.
(26:48): Speaking of that, one of the things I've picked up when you and I have talked is that some leaders say AI is going to replace the human element, but when I talk to you, I actually feel the opposite: that it's going to create more opportunity for people to show up and do work differently. What does that look like in a high-functioning go-to-market organization?

This is my opinion in general, and I don't work with companies unless they share something similar: I believe humans are the source of relationship and excellence, not of data entry, not of data sourcing, not of manual data-typing work. They're here for relationships, for connection, and for their brain power. If any company, whether buying or selling AI tech, does not align with that, they're not my kind of people. I do think we're in a world where we haven't yet experienced what AI could or could not be doing with us as buyers and as sellers. An example would be a house. Imagine a world where AI is like Jarvis. Jarvis can take me on a guided tour of this million-dollar house in Southern California while I'm in New York City, going through a hologram; it shows me all the stuff, shows me all the data. Do I then need a real estate agent to do that for me? No. Would I be opposed to having that happen? No. I'm not so attached to real estate agents that I'd say I have to work with one. That's me personally; I have good friends who are real estate agents, but I don't really care as long as I get a good house and a good deal. I think we're going to go through this time where people experiment with what they like with AI and what they don't like with AI as buyers, and you'll see the pendulum swing. People will push things too much, kind of like Klarna and customer support, where they were going to put AI everywhere, and then they realized humans don't like that, and it swings back to more humans. Or with real estate agents, maybe someone launches a holographic, Jarvis-level real estate agent seller thing and it goes great; they save money and it goes awesome. I don't think we know yet. I do know people will start to realize, and you'll start to have people who say, I will only use the human for this, or I'll only use AI for this, because it's something like buying strawberries: I don't care if I talk to a strawberry expert. And some people do.

This is such a fascinating topic.
(28:43): I love this topic so much, I could go on about it for a while. I'll try to give you an example and hopefully make this more real. I can use HeyGen to mimic my avatar, my voice, my video. I can use ElevenLabs to copy my voice. I can use ChatGPT or Claude to copy my writing. I can mimic everything with AI. So that means the value of me becomes more important and more impactful if I can create an avatar that people can talk to. There are people offering AI girlfriend options, getting paid lots of money for people to talk to their AI girlfriend instead of the actual person, which is insane to me, but it's happening. So think about that. Like Michael Jordan: if he had an AI avatar you could ask any question you wanted, I think that'd be cool. Honestly, I'd access that. But what's the value then of the actual Michael Jordan? Higher than his AI avatar. So the value of us as people, my own brain, my own thoughts, is supremely important, because all AI does is amplify it. I see a world where I'll have this agentic AI infrastructure team that is like my Iron Man suit. I can go into any workplace and plug in to the tech, and the only way they can get what I can do is through my AI tech that's trained on me. Which means my stuff becomes really important, because it's trained on my thoughts, my original thinking. So if you as a salesperson, you as a leader, you as a marketer have unique things, this is why you gotta be working with AI. There's going to come a world where you have this AI agent bot army you can plug into existing technology and have all this cool stuff happen that only you as an expert can have, because of your experience, your thoughts, your IP. So I think things are going to get exciting.
(30:11): Same with human-made anything, like human music. I'm a musician. I don't see a world where an AI robot could give me the same feeling live that I get from some of my favorite concert performers. Maybe it will, but even then it's going to be a different experience listening to Jarvis play his music versus Paul McCartney; it's not going to be the same. So the value of humans is going to go up. The question is, what do you have that you're passionate about, that you're knowledgeable about, that you geek out on all the time? Use AI as the amplifier, instead of thinking, I'm a good writer, so AI's going to replace me. It could, theoretically, but think about all the greatest writers in the world. If I had access to an AI agent that could write like Oscar Wilde, would I pay for that? Yes. So where's the value? I think it's going to amplify us more than replace us.

Yep, I agree. I think it elevates what we can focus on. Right now we're spending all this time thinking about, or logging, interactions in Salesforce. But to your point, I love the Iron Man suit image.
(31:08): How close do you think we are to that Iron Man plugin for most organizations? And what do you think could change to accelerate it?

MCP is just starting to catch on. I'm seeing more and more people use it, and it's going to be fascinating to see what happens before the end of the year. Then agent-to-agent just launched a month or two ago, so most people haven't gotten to that, but it's going to spread. And I would guess that most technologies, Microsoft, Amazon, most of them, are going to adopt it; if they don't, they're going to get eradicated. Most companies will try to do the same thing. I would hope that in the next year, probably two years max for some slower-moving orgs, you'll have something like that in place. Also, probably within the next year, maybe a year and a half, you'll have the ability for something like a personalized AI tech: if I had my GPT team, there's going to be a world where I can plug that GPT team into a technology and amplify myself. Instead of me working for one company, I contract with five, and they all have access to my IP, and I check in an hour a day with five companies. We're not there yet; it's going to require a lot of shifts and changes. I say a year, maybe two, because so many companies are so far behind. I can't tell you how many times I've done a consulting gig with someone and they said, yeah, we just rolled out Gemini or ChatGPT, and this is the first time our team has had something like this. This year, in '25. And I'm like, where have you been? Why is this the first time you're doing this? It happens all the time. I have to keep remembering I'm a little bit farther ahead; I push the envelope and most people don't. A lot of people are just so far behind. But with the ease of more technology becoming real, it'll be easier to do this, which means bigger organizations can do it faster, and it'll just cycle on top of itself.

Yep.
(32:43): I'm super curious to see what happens, because there's some recent data worth diving into. It says about 70% of the proofs of concept that businesses do with their own models fail, so if they're not looking at Claude or Gemini or ChatGPT, they're struggling. And then you're also seeing organizations who implemented Copilot or ChatGPT taking it away from their employees, 'cause they're not seeing enough of a business impact. So I'm curious to see: some organizations are going to keep moving forward, and I think some are actually going to stall a little bit and regroup on their strategy. Johnson just announced a couple weeks ago that they're pulling back.

I know why: Copilot, at least the current version, is a horrible version of AI, and no one's getting benefit out of it. It drives me crazy, but keep going.

No, yeah, I think that's a really valid point.
(33:26): So to wrap things up, we always love to do a rapid-fire round. Are you ready for a few quick questions? What's one AI tool you love that doesn't get enough credit?

Oh man. Okay.

You gotta type that one in for us, 'cause we'll put it in the notes.

A music creator. I'm a music nut. I love it.

Okay, that last one's awesome. What's something you used to believe about AI that you've completely changed your mind about?

To pay for the AI, man. But probably, like I told you before: I used to think that if AI couldn't do something right, it was the AI. I've switched; now I think it's me. I'm the person, I'm the one who's wrong.

What's one prompt or trick you find yourself using over and over again with AI?

I'll give you three really quick ones. One is structure and format: making sure the prompt is formatted correctly, with titles. Two is using hashtags to help the AI know the hierarchy of importance. Three is words. Words have power, so if I say something like rules versus laws, that means something to the AI. Understanding those three things alone will get you a lot farther than most people with prompting.
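Putting those three tips together, a prompt skeleton might look like the sketch below; the sections, the wording, and the rules-versus-laws split are illustrative, not a template Jonathan shared.

```python
# Illustrative prompt skeleton applying the three tips above: explicit structure
# and format, "#" headers to signal hierarchy, and deliberate word choice
# ("laws" that must never be broken vs. softer "rules"). Details are made up.
PROMPT_TEMPLATE = """
# Role
You are a GTM analyst preparing a one-page account brief.

# Laws (never break these)
- Use only the context provided below; if something is missing, say so.
- Output valid Markdown with the exact section headers listed under Format.

# Rules (strong preferences)
- Keep each section under 80 words.
- Prefer concrete numbers over adjectives.

# Format
## Account snapshot
## Risks
## Recommended next step

# Context
{account_notes}
"""

if __name__ == "__main__":
    print(PROMPT_TEMPLATE.format(account_notes="<paste raw notes here>"))
```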
(34:27): What's one trend in go-to-market that's overrated? And what's one that's underrated?

Oh man. I think overrated is writing emails with AI, or automating anything with emails; I think it's just overdone. Underrated, I think, is just how awesome it is to do some basic stuff. For example, if I had to go train a team on MEDDPICC and apply it to their company, just throwing it into ChatGPT and getting this cool mixture of things together is just so awesome. It still blows my mind how good it is. So I think that's underrated and people don't talk about it enough.

Thank you. Yeah.
(35:00): That wraps us up for today. Jonathan, thank you so much for joining us. We learned a lot.

Thank you. Thanks, appreciate you guys.

Awesome.

(35:13): What'd you think of our conversation with Coach K?

I enjoyed it. I think the thing I learned the most from Coach K was the idea of creating this Iron Man suit to plug into these go-to-market motions. That really resonated with me, because with the way we're building these agents to talk to each other, and with MCP and all this new stuff coming out that's going to help with automation, having that human at the helm who can augment these plug-and-play solutions is going to create a lot more value for folks that have really good ideas. So that's what I learned. What about you, Ken?

Yeah, so I was really impressed by what he was talking about with Relevance AI. I've been working to build agents and struggling through the journey a little bit, but making some progress in key areas.

Awesome. I wanted to thank our guest, Coach K, for coming on the show. Really interesting insights. And I wanted to thank all of you listening or watching, however you're consuming our content. Please send us feedback: what do you wanna hear, what do you like, what don't you like, and any recommendations of people you want us to talk to.

Yeah, and don't forget to subscribe and give us a review. That really helps us get our voice out there. And we really appreciate all of the feedback we've already been getting from listeners. It's really helped us map out the second season so far, and we're really excited to show you some other things based on what we're hearing from you guys. Thanks for watching, and let's keep crafting the future of go-to-market together.

Bye.