Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
I've always said, as AI democratizes IQ, EQ becomes increasingly important. Critical thinking, empathy, and all those things are important, but
(00:09):
I believe as marketers, balance is actually more important. Hey crafters!
3
00:00:14,9.509151414 --> 00:00:20,759.509151414
Just a reminder, this podcast is for informational and entertainment purposes only and should not be considered advice.
4
00:00:21,109.509151415 --> 00:00:30,789.510151415
The views and opinions expressed are our own and do not represent those of any company or business we currently work with, are associated with, or have worked with in the past.
5
00:00:31,389.510151415 --> 00:00:33,309.510151415
Thanks for tuning in to the FutureCraft podcast.
6
00:00:33,579.510151415 --> 00:00:34,549.510151415
Let's get it started.
7
00:00:34,549.510151415 --> 00:00:42,172.319675224
Hey there.
8
00:00:42,172.319675224 --> 00:00:50,632.319675224
Welcome to the FutureCraft Go-to-Market podcast, where we're exploring how AI is changing all things go-to-market, from awareness to loyalty and everything in between.
9
00:00:51,142.319675224 --> 00:00:54,292.319675224
I'm Ken Roddin, one of your guides on this exciting new journey.
10
00:00:54,982.319675224 --> 00:01:00,862.319675224
And I'm Erin Mills, your co-host, and together we're gonna unpack the future of AI and go to market.
11
00:01:01,217.319675224 --> 00:01:06,467.319675224
We're gonna share some best practices, how-tos and interview industry pioneers and leaders.
12
00:01:07,307.319675224 --> 00:01:20,417.319675224
So Ken, how are you paving the way with AI and go to market this week? I don't know if I'm paving the way, but I am leveraging some things that we've learned from some of our guests so far this season.
13
00:01:21,77.319675224 --> 00:01:27,677.319675224
A couple weeks ago, Tani talked about how she approaches building a prompt and getting an answer by having a conversation with AI.
14
00:01:29,602.319675224 --> 00:01:39,542.319675224
A few weeks ago, we talked to Chris Penn, and he said the more information you can put in one single prompt or input, the more successful you're going to be.
15
00:01:39,592.319675224 --> 00:01:47,282.31967522
So what I did was I had a full conversation with ChatGPT about a topic, copied and pasted it, exported it, and put it into a Word doc.
16
00:01:47,657.31967522 --> 00:01:53,867.31967522
Then I used it to make a prompt, uploaded that information, and got a really solid output.
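(In outline, the play Ken is describing here, combining Tani's conversational approach with Chris Penn's one-big-input advice:)

Step 1: Have a full back-and-forth conversation with ChatGPT about the topic until the thinking is fleshed out.
Step 2: Export or copy-and-paste that entire conversation into a Word doc.
Step 3: Write one comprehensive prompt and upload the doc as context, so the model gets all the information in a single input.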
17
00:01:53,957.31967522 --> 00:02:02,747.31967522
I used it to do some market analysis on an emerging industry for a client I'm working with, and what I found was
18
00:02:03,77.31967522 --> 00:02:07,37.31967522
it was much more comprehensive and I had way fewer edits to make.
19
00:02:07,287.31967522 --> 00:02:14,217.31967522
And it even addressed some of the nuance in the conversation, where it was like, you were originally talking about this, but did you want it like that? And I was like, wow.
20
00:02:14,217.31967522 --> 00:02:15,297.31967522
So it really does pay attention.
21
00:02:15,457.31967522 --> 00:02:16,807.31967522
It's a big time saver.
22
00:02:16,967.31967522 --> 00:02:19,37.31967522
And also a frustration saver.
23
00:02:19,237.31967522 --> 00:02:23,607.31967522
I don't feel like I got as frustrated working on this prompt as I might have doing something this complex.
24
00:02:23,757.31967522 --> 00:02:24,387.31967522
So it was great.
25
00:02:25,17.31967522 --> 00:02:25,887.31967522
I love it.
26
00:02:26,57.31967522 --> 00:02:37,232.31967522
How about you? I did something similar in terms of taking one of the plays outta the playbook that our guest, Chase Hannigan, offered a couple weeks back.
27
00:02:37,312.31967522 --> 00:02:44,912.31967522
And being able to connect Tali with some of the other APIs and build these, essentially,
28
00:02:45,517.31967522 --> 00:02:51,457.31967522
entire systems that continue to talk to each other and have different roles is something I've been really interested in working on.
29
00:02:51,507.31967522 --> 00:02:58,767.31967522
I had already created the assistant that he had been talking about in the episode, but I added a lot of bells and whistles to what it can do.
30
00:02:59,7.31967522 --> 00:03:06,147.31967522
And so now it does a lot more searching for me and it scrapes different websites that I'm interested in.
31
00:03:06,177.31967522 --> 00:03:08,667.31967522
And I've been really happy with the results so far.
32
00:03:09,547.31967522 --> 00:03:15,137.31967522
Do you feel like you're getting better with n8n now that you've been using it? I do.
33
00:03:15,137.31967522 --> 00:03:22,397.31967522
I think more than getting better with n8n, it's understanding how these whole agentic systems work together.
34
00:03:22,667.31967522 --> 00:03:33,947.31967522
And so when I go over to Relevance, or I'm using another tool, or building on Lovable and using Supabase, I feel like n8n really
35
00:03:34,922.31967522 --> 00:03:46,262.31967522
helps you to visualize how all these things work together, so that when you're working in these different systems, it actually ties the story together, which might make sense or might not, but in my head it makes sense.
36
00:03:46,262.31967522 --> 00:04:03,662.31967522
It's a tip that could really help people who are moving from using ChatGPT and thinking about moving to more of an agentic workflow: you gotta break it down into steps, right? You don't wanna have one agent doing your whole social media strategy, your content calendar, and the LinkedIn posts; that's just too much for it to do.
37
00:04:03,662.31967522 --> 00:04:07,752.31967522
So you've gotta break it down into different sections or different nodes, something like the sketch below. Wow.
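(A hypothetical sketch of that node-by-node breakdown, ours rather than Ken's, using the social media example:)

Node 1, strategy agent: takes your positioning doc and proposes this month's content themes.
Node 2, calendar agent: turns the approved themes into a dated content calendar.
Node 3, drafting agent: writes one LinkedIn post per calendar entry.
Node 4, human review: someone approves or edits before anything goes out.

Each node does one job and hands its output to the next.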
38
00:04:09,492.31967522 --> 00:04:10,692.31967522
I think that's exactly it, Ken.
39
00:04:10,697.31967522 --> 00:04:20,866.31967522
And especially for someone like me, I'm a super visual learner, and so just being able to see how things come together helps me to really grasp the concept.
40
00:04:21,76.31967522 --> 00:04:30,496.31967522
Even with Relevance, even though it's easier, and I think for most people it's probably an entry point, when things go wrong I have a harder time navigating how to fix it.
41
00:04:30,546.31967522 --> 00:04:43,676.31967522
With n8n and some of these other tools, where you can really visually understand what's connecting with each other, it can give you a lot more comfort and a lot more of an understanding of how things are going.
42
00:04:44,201.31967522 --> 00:04:56,51.31967522
Hey, who are we talking to today, by the way? Oh, we're talking to one of our favorites from last season, Liza Adams, who we all know has great frameworks and is a prolific LinkedIn poster.
43
00:04:56,411.31967522 --> 00:04:58,241.31967522
I can't wait to talk to her.
44
00:04:58,241.31967522 --> 00:05:00,961.31967522
She's a good friend of the pod, and we'll be chatting with her coming up.
45
00:05:01,531.31967522 --> 00:05:03,661.31967522
I'm so excited.
46
00:05:03,661.31967522 --> 00:05:07,501.31967522
I've been waiting for this for so long, so actually I can't wait any longer.
47
00:05:07,501.31967522 --> 00:05:17,517.00538951
We've gotta go over and talk to her. Today we're reuniting with one of our very favorites, Liza Adams.
48
00:05:17,517.00538951 --> 00:05:23,457.00538951
Welcome back to the AI MarketBlazer who showed us how pilot wins can transform go-to-market.
49
00:05:23,727.00538951 --> 00:05:25,887.00538951
Liza, really excited to hear what you have for us today.
50
00:05:26,817.00538951 --> 00:05:28,407.00538951
I couldn't wait to come back.
51
00:05:29,127.00538951 --> 00:05:35,547.00538951
I was waiting for Erin and Ken to call me and say, Hey, would you come back? So super thrilled to be here.
52
00:05:36,162.00538951 --> 00:05:37,152.00538951
Thanks for joining us.
53
00:05:38,202.00538951 --> 00:06:08,282.00538951
So it's been about a year since we last spoke with you, and since the sort of AI Marketing Masterclass episode aired. What surprised you the most about how AI really hit or missed for marketers in 2024 and leading into 2025? You know what surprised me is the fact that we are still having very similar conversations as a year ago when it comes to
54
00:06:09,782.00538951 --> 00:06:11,102.00538951
human change management.
55
00:06:12,102.00538951 --> 00:06:27,722.00538951
I think we tried to land that last year, that this isn't about tools or AI innovation or technology, that we can't just simply give people ChatGPT, Claude, Gemini, and Perplexity and expect them to get the most out of it.
56
00:06:27,812.00538951 --> 00:06:32,12.00538951
I am still seeing marketing teams getting handed
57
00:06:32,37.00538951 --> 00:06:38,167.00538951
tools, and the leaders are expecting the teams to figure it out themselves.
58
00:06:39,387.00538951 --> 00:06:42,247.00538951
People have to see themselves in it.
59
00:06:42,992.00538951 --> 00:06:45,542.00538951
They need use cases that apply to their jobs.
60
00:06:46,32.00538951 --> 00:06:51,712.00538951
Have joint learning sessions, sharing what works and what doesn't work.
61
00:06:52,112.00538951 --> 00:06:56,102.00538951
I think we have over-pivoted on what's the right tool.
62
00:06:56,102.00538951 --> 00:07:05,602.00538951
There's this tool and that tool and the other, right? When the pivot should be more on how do we help people better understand, leverage, and truly make an impact with AI.
63
00:07:08,62.00538951 --> 00:07:32,472.00538951
Yeah, I really find that we are still in this same conversation that we were having a year ago in many areas, something I'm calling the AI adoption plateau, where the people who are gonna adopt AI have, and we now have this big group of people saying, we are trying this stuff, but why isn't it working? You hit on the exact point: we've got this change management issue, and one of the reasons I am seeing
64
00:07:33,237.00538951 --> 00:07:40,167.00538951
people struggle with it is their company was like, let's go fully agentic, or let's just redo all of our workflows in AI.
65
00:07:40,167.00538951 --> 00:07:40,977.00538951
And that's too much.
66
00:07:41,257.00538951 --> 00:07:45,207.00538951
It reminded me that you recommend founders start with tiny AI wins.
67
00:07:45,237.00538951 --> 00:08:13,297.00538951
Could you share a success story that really illustrates that approach, what worked, and maybe highlight some pitfalls for people to avoid? Yeah, so I have been talking about one of the key use cases of a success, where we now have a marketing team that is 45 members strong, and 25 of them are humans, and then 20 of 'em are AI teammates that the human beings actually built, trained, and are now managing and maintaining.
68
00:08:13,367.00538951 --> 00:08:14,807.00538951
We didn't get there overnight.
69
00:08:16,577.00538951 --> 00:08:20,87.00538951
One of the key successes was we started with
70
00:08:20,417.00538951 --> 00:08:28,727.00538951
super simple custom GPTs, and one of those super simple custom GPTs was a digital twin.
71
00:08:28,727.00538951 --> 00:08:34,17.00538951
So, you know, a Liza GPT as an example, right? In Dice,
72
00:08:34,17.00538951 --> 00:08:37,137.00538951
that's the company that's in transformation,
73
00:08:37,497.00538951 --> 00:08:46,917.00538951
the CMO, the head of marketing, and the head of demand generation built digital twins, not just of themselves, but of the executive team.
74
00:08:48,117.00538951 --> 00:08:49,137.00538951
So basically, they
75
00:08:49,497.00538951 --> 00:09:03,97.00538951
trained custom GPTs on their frameworks, their thinking, anything that's publicly available that they've written, their Myers-Briggs, all sorts of things.
76
00:09:03,97.00538951 --> 00:09:14,502.00538951
So this GPT basically knows them super well, right? The reason why I'm calling it simple is we know ourselves better than anybody else, right? We can check its work.
77
00:09:14,562.00538951 --> 00:09:17,232.00538951
We know when it doesn't sound like us.
78
00:09:17,472.00538951 --> 00:09:20,982.00538951
We know when it doesn't sound like a strategy we've done.
79
00:09:21,492.00538951 --> 00:09:26,802.00538951
And what I love about it is it's designed not to mimic us.
80
00:09:27,192.00538951 --> 00:09:32,112.00538951
It's designed to learn about us so that it can find our blind spots.
81
00:09:32,812.00538951 --> 00:09:36,412.00538951
So that it can challenge our thinking because we get into our rut.
82
00:09:36,412.00538951 --> 00:09:37,462.00538951
We always think this way.
83
00:09:37,462.00538951 --> 00:09:40,972.00538951
We also have unconscious bias, all sorts of things.
84
00:09:40,972.00538951 --> 00:09:47,782.00538951
So the GPTs were designed to make ourselves better, right? To actually complement us and overcome our weaknesses.
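(As a rough sketch of what instructions for a digital twin like this might look like; the wording is hypothetical, not the actual Dice GPTs:)

You are a digital twin of [executive name]. You have been trained on their frameworks, their published writing, and their Myers-Briggs profile.
Do not simply mimic them. When I share work or thinking, respond the way they would: surface blind spots, challenge assumptions, and flag where the reasoning conflicts with their stated frameworks.
When something doesn't sound like them, or doesn't sound like a strategy they would run, say so and explain why.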
85
00:09:48,352.00538951 --> 00:09:53,812.00538951
On the other side, we're replicating somebody else, like the executive team.
86
00:09:54,792.00538951 --> 00:09:59,892.00538951
We use that to prepare for any kind of conversation that we're going to have with them.
87
00:09:59,952.00538951 --> 00:10:08,82.00538951
So Megan, in her case, when she has to present a campaign to the CMO and the CFO, she says, here's my campaign deck.
88
00:10:08,617.00538951 --> 00:10:14,127.00538951
Help me understand what questions they might ask, fill in some gaps.
89
00:10:14,127.00538951 --> 00:10:26,217.00538951
What did I not address? So she either fills in those gaps based on what the GPT says, or mentally she's thinking, all right, I know that they're gonna ask me these questions based on what the GPT said.
90
00:10:26,367.00538951 --> 00:10:29,337.00538951
I'm going to prepare a response to that question.
91
00:10:29,707.00538951 --> 00:10:32,217.00538951
I think it's just such a brilliant starting point.
92
00:10:32,517.00538951 --> 00:10:36,597.00538951
And from there you could see how it got
93
00:10:37,137.00538951 --> 00:10:41,157.00538951
more complex, more tied to workflows.
94
00:10:41,157.00538951 --> 00:10:53,157.00538951
They began using GPTs to do very specific tasks: they have a pitch deck builder, a content topic ideator, a campaign performance analyzer.
95
00:10:53,587.00538951 --> 00:11:08,297.00538951
Now Megan, who's the head of demand generation, has actually strung these GPTs together and chained them so that it works not just within marketing, but also begins to work with sales and CS.
96
00:11:08,687.00538951 --> 00:11:14,27.00538951
So if you can imagine, you've got a campaign launch GPT chained to a
97
00:11:15,292.00538951 --> 00:11:24,472.00538951
sales enabler GPT, chained into a question-and-answer GPT to help CS answer customer questions.
98
00:11:24,832.00538951 --> 00:11:32,932.00538951
Now she's no longer just working in marketing; she's working on a workflow that cuts across multiple organizations.
99
00:11:33,142.00538951 --> 00:11:36,182.00538951
So anyway, from a simple digital twin,
100
00:11:36,602.00538951 --> 00:11:44,722.00538951
to something more complex by function, to now something that chains across multiple organizations in one workflow.
101
00:11:46,352.00538951 --> 00:11:52,982.00538951
I think that's such a great example and something that people can get started with today without having to have a lot of technical expertise.
102
00:11:53,532.00538951 --> 00:12:22,482.00538951
If you think about when you work with a lot of clients and talk to a lot of folks, thinking about AI magnifying some of the challenges, like identifying some of those gaps, 'cause you talked about uploading a PowerPoint, how have you seen AI amplify some of those, whether they're pockets in go-to-market strategy, and how can teams avoid the trap of making existing challenges worse? Yeah, so there's many examples, and I think there's this
103
00:12:23,217.00538951 --> 00:12:31,837.00538951
preconceived notion that when we follow a prompt framework, the answer is going to be really good.
104
00:12:32,527.00538951 --> 00:12:32,737.00538951
Yeah.
105
00:12:32,767.00538951 --> 00:12:39,47.00538951
But with prompting, like they say prompt engineering, you need to be good at prompting and all sorts of things.
106
00:12:39,617.00538951 --> 00:12:47,117.00538951
But AI is a magnifier, to your point, Erin, right? It's a magnifier of anything good, bad, or indifferent.
107
00:12:47,477.00538951 --> 00:12:49,127.00538951
And I'll come back to the prompting piece.
108
00:12:50,267.00538951 --> 00:12:54,137.00538951
So let's just say that you follow a prompt framework.
109
00:12:54,137.00538951 --> 00:12:56,747.00538951
I don't care if it's OpenAI's or Gemini's.
110
00:12:57,267.00538951 --> 00:12:59,337.00538951
Microsoft has a prompting framework.
111
00:12:59,437.00538951 --> 00:13:25,497.00538951
Claude too. Christopher Penn has a prompting framework called RACE, and I've actually amended his prompting framework and turned it into GRACE: the G stands for goals, R for role, A for action, C for context, and E for examples. If you prompt the AI with that framework and you give it all that information, then your expectation is that it's gonna give you good results.
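(A minimal example of what a GRACE-structured prompt could look like; the wording here is ours, borrowing the churn scenario Liza turns to next:)

Goal: Reduce churn for our travel management software.
Role: You are a B2B SaaS retention strategist.
Action: Analyze the data below, propose three retention plays, and challenge my assumption that this is actually a churn problem.
Context: Mid-market customers on annual contracts; churn has risen for two straight quarters; data attached.
Examples: Here are two past win-back campaigns and their results.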
112
00:13:26,127.00538951 --> 00:13:27,687.00538951
Here's what's happening.
113
00:13:27,717.00538951 --> 00:13:31,17.00538951
People follow the prompt framework, doesn't matter what it is.
114
00:13:31,17.00538951 --> 00:13:43,227.00538951
And there are lots of good ones out there, and they're still disappointed with the results, right? And they're disappointed because it's magnifying what's there, and what's there is something very basic.
115
00:13:43,717.00538951 --> 00:13:45,247.00538951
It's basic thinking.
116
00:13:45,517.00538951 --> 00:13:48,727.00538951
It follows the framework, but the thinking is still basic.
117
00:13:49,57.00538951 --> 00:13:52,637.00538951
So for example, let's just say you're having a churn problem.
118
00:13:52,677.00538951 --> 00:13:55,107.00538951
You have goals and all sorts of things.
119
00:13:55,107.00538951 --> 00:13:59,117.00538951
You give it context that this is for travel management software.
120
00:13:59,117.00538951 --> 00:14:01,847.00538951
We're experiencing this churn and here's some data.
121
00:14:02,257.00538951 --> 00:14:05,227.00538951
And here's some examples of what we've experienced.
122
00:14:05,617.00538951 --> 00:14:09,367.00538951
Help me figure out how to reduce churn, okay.
123
00:14:09,772.00538951 --> 00:14:10,642.00538951
Awesome prompt.
124
00:14:10,852.00538951 --> 00:14:11,332.00538951
Great.
125
00:14:12,292.00538951 --> 00:14:18,142.00538951
However, we are simply asking it a very basic question of how to reduce churn.
126
00:14:18,142.00538951 --> 00:14:50,682.00538951
We're not forcing it to think, right? If we said things like, hey, can you challenge me and think about this differently, 'cause I'm assuming that it's a churn problem, but could it perhaps be a different kind of problem? Can you look at this data and determine whether it's a churn problem, or are there potentially other opportunities that this is telling us about? Because, as an example I recently posted about, a churn problem could actually be an opportunity for an upsell.
127
00:14:52,57.00538951 --> 00:14:55,862.00538951
A hundred percent, right? Like we're seeing a lot of churn.
128
00:14:55,862.00538951 --> 00:15:03,492.00538951
Customers are leaving. Why are they leaving? A traditional response to that question is, hey, you need a loyalty program.
129
00:15:03,492.00538951 --> 00:15:06,762.00538951
You need better CS, or all sorts of things.
130
00:15:07,122.00538951 --> 00:15:12,192.00538951
But if you actually see it as an indicator that it's an upsell opportunity.
131
00:15:12,552.00538951 --> 00:15:18,882.00538951
Then it's a very different motion or a different response from the AI that says, oh, it's an upsell opportunity.
132
00:15:18,882.00538951 --> 00:15:32,832.00538951
Let's think about potential product offerings or potential packaging that would include new features, new functionality, or we probably need a campaign and let them know that it's coming in the next three months or something like that.
133
00:15:33,192.00538951 --> 00:15:37,422.00538951
And now it's no longer a churn problem; it is now an upsell
134
00:15:37,637.00538951 --> 00:15:38,537.00538951
opportunity.
135
00:15:38,747.00538951 --> 00:15:41,497.00538951
So it's not necessarily prompt engineering.
136
00:15:41,617.00538951 --> 00:15:42,937.00538951
It's prompt strategy.
137
00:15:43,417.00538951 --> 00:15:43,687.00538951
Yeah.
138
00:15:44,27.00538951 --> 00:16:04,842.00538951
It is, what is the strategy? What is the thinking we want to elevate in this conversation? I totally agree with that. The critical thinking piece is something we've been talking a lot about. You think about kids growing up now and what they need to learn: it's not necessarily just how to prompt, but that critical thinking, what's the problem, that diagnosis.
139
00:16:05,212.00538951 --> 00:16:11,692.00538951
One of the highlights we talked about last year, which I think ties in well, is Aspire, Align, and Implement.
140
00:16:12,152.00538951 --> 00:16:12,932.00538951
I'm curious.
141
00:16:12,942.00538951 --> 00:16:32,27.00538951
Having implemented that with more teams over the last year, what has been the most critical for teams, and why? Yeah, and I touched on this earlier: we can't underestimate the amount of work required to meet people where they're at and bring them along.
142
00:16:33,47.00538951 --> 00:16:34,367.00538951
Cannot underestimate it.
143
00:16:34,467.00538951 --> 00:16:38,347.00538951
We can have the most awesome MarTech stack out there.
144
00:16:39,307.00538951 --> 00:16:41,497.0053895
We can have four AIs,
145
00:16:41,677.0053895 --> 00:16:53,492.0053895
all sorts of support from vendors, but when we don't understand the fears, the motivations, the challenges of people, and we don't give them the space to learn, it's not going to work.
146
00:16:54,622.0053895 --> 00:17:00,682.0053895
We're all pretty oversubscribed, right? Like anywhere from 120 to 150%, maybe beyond.
147
00:17:00,772.0053895 --> 00:17:03,172.0053895
There's never enough hours in the day.
148
00:17:03,652.0053895 --> 00:17:10,62.0053895
And then we're expected to learn AI on our own, and no one's teaching us. Oh my gosh.
149
00:17:10,62.0053895 --> 00:17:15,42.0053895
The pressure of that is immense, right? Especially when there are people saying,
150
00:17:15,582.0053895 --> 00:17:18,462.0053895
hey, AI is gonna take your job, you're way behind.
151
00:17:18,462.0053895 --> 00:17:29,2.0053895
I also don't like that; it's fear-based, right? Not only are we now oversubscribed and under pressure, we now get paralyzed by fear.
152
00:17:29,542.0053895 --> 00:17:38,932.0053895
So this whole notion of if we want to drive change, we can't underestimate the amount of work that it would require to bring people along.
153
00:17:39,622.0053895 --> 00:17:48,792.0053895
And you know those little Matchbox cars, where you pull back, and when you pull back it gains momentum and energy, and then when you let go, it goes?
154
00:17:49,242.0053895 --> 00:17:56,952.0053895
That's the analogy I'm putting in my head: let's give people the space to pull back, give them the space to learn and understand
155
00:17:56,952.0053895 --> 00:17:57,492.0053895
and use AI.
156
00:17:58,617.0053895 --> 00:18:03,897.0053895
And then when they're inspired and we show them what's possible, let it go and then it will move forward.
157
00:18:03,947.0053895 --> 00:18:13,698.0053895
And I think right now what's happening is we're pushing the car along very manually when we can just take a little bit of time to pull back and let it go.
158
00:18:16,52.0053895 --> 00:18:30,792.0053895
Yeah, that analogy really stands out to me as something that deals with the relationship between employees and their manager, right? How much their manager is pushing them, based on whatever pressure they're getting and whatever the leadership is experiencing.
159
00:18:30,792.0053895 --> 00:18:44,22.0053895
And I recently did some research talking to white-collar professionals and found that something like 93% of white-collar professionals were willing to adopt AI at work, but only 50% of them trusted their
160
00:18:44,407.0053895 --> 00:18:55,527.0053895
leaders' decision making and having an AI strategy. One of the obstacles is the change management around, I need time to learn this thing, but it's also, I don't really know how you're going to use it, or if you know what you're doing.
161
00:18:55,717.0053895 --> 00:18:59,617.0053895
It circles around a lot of the dynamics of change management.
162
00:19:00,337.0053895 --> 00:19:35,612.0053895
Another thing that people look at is this idea of how do we measure whether this is working. When you've been working with marketing teams and getting them on the path to implementing AI and integrating it into their workflows, is there a metric that you can use as a leading indicator? And then are there some longer-term metrics that people can look for when they're starting to look for the impact and the outcomes? I get this question a lot, because people are always saying, measure, what's the ROI, and all sorts of things, right? We can get super complex with that, and our heads will explode when we start thinking about all the different possibilities.
163
00:19:36,302.0053895 --> 00:19:50,127.0053895
I always guide marketing leaders to do one of three things, or in combination, right? One is to align jobs to be done to specific strategic initiatives that the company already has.
164
00:19:50,907.0053895 --> 00:20:03,267.0053895
Those jobs to be done could be a combination of use cases using AI, right? When we align it to strategic initiatives, we have a better shot at measurement, because those strategic initiatives have goals.
165
00:20:03,267.0053895 --> 00:20:04,497.0053895
They have KPIs.
166
00:20:04,907.0053895 --> 00:20:12,997.0053895
And we have a better shot at adoption because it has budget, it has resources, it has the eyes of the executive team, right? You have no choice.
167
00:20:13,807.0053895 --> 00:20:15,487.0053895
You align to the jobs to be done.
168
00:20:15,487.0053895 --> 00:20:17,347.0053895
That's aligned to the strategic initiative.
169
00:20:17,347.0053895 --> 00:20:23,377.0053895
You're gonna make this thing work and you're gonna be accountable for measuring and reporting on impact.
170
00:20:23,977.0053895 --> 00:20:30,857.0053895
So that's one of the ways to force the issue, right? The second thing that I guide marketing leaders towards is
171
00:20:31,962.0053895 --> 00:20:41,382.0053895
pick the area of biggest pain, right? Where you have the biggest pain, you will put the resources towards it.
172
00:20:41,672.0053895 --> 00:20:56,662.0053895
You will use AI to help overcome that pain, and your goal is to alleviate the pain, right? Whether that pain is you're spending too much money with an agency, or the pain is, my team is working
173
00:20:56,952.0053895 --> 00:20:59,82.0053895
80 hours a week, each person.
174
00:20:59,82.0053895 --> 00:21:00,282.0053895
Whatever it might be.
175
00:21:00,462.0053895 --> 00:21:06,402.0053895
Or the quality is so bad because we're doing this in a not-so-good way; there's not a process for it.
176
00:21:06,922.0053895 --> 00:21:08,812.0053895
I've used this example a number of times.
177
00:21:08,862.0053895 --> 00:21:11,252.0053895
One of the CMOs I worked with, her and her team's
178
00:21:11,252.0053895 --> 00:21:18,452.0053895
biggest pain was translation and localization, because they were having to do it for eight different languages.
179
00:21:19,907.0053895 --> 00:21:22,157.0053895
Every single customer-facing document.
180
00:21:22,857.0053895 --> 00:21:29,857.0053895
Tens of thousands of dollars in agency fees to translate and localize these documents.
181
00:21:30,247.0053895 --> 00:21:33,627.0053895
So guess what the team did: custom GPTs, eight of 'em.
182
00:21:34,407.0053895 --> 00:21:35,697.0053895
One per language.
183
00:21:35,697.0053895 --> 00:21:38,287.0053895
Built by field marketers that are native speakers.
184
00:21:38,827.0053895 --> 00:21:41,937.0053895
It doesn't mean that the custom GPTs were perfect.
185
00:21:41,937.0053895 --> 00:21:44,877.0053895
Like, you put in a document and out comes, like, a German
186
00:21:45,152.0053895 --> 00:21:47,42.0053895
translation, a document all perfect?
187
00:21:47,282.0053895 --> 00:22:12,272.0053895
No, but it's like anywhere from 80 to 95% there; to get that extra 5%, it's just human oversight, versus the weeks of time and the dollars that you would need to invest with an outside agency. That's super quick, right? And in that company and in that marketing team, when they said, over two weeks when we built this custom GPT, we saved tens of thousands of dollars a month.
188
00:22:12,512.0053895 --> 00:22:25,642.0053895
They're like, heck, if we could do that with localization and translation, let's think about all the other GPTs, because it was such a big ROI to them: 20 bucks a month, three people doing it.
189
00:22:26,632.0053895 --> 00:22:29,272.0053895
And then you're saving tens of thousands of dollars a month.
190
00:22:29,572.0053895 --> 00:22:34,672.0053895
You can now begin to think about all of these other use cases where you have areas of biggest pain.
191
00:22:34,822.0053895 --> 00:22:39,802.0053895
And then the last one is really around leaning into your trailblazers.
192
00:22:40,162.0053895 --> 00:22:48,967.0053895
Those who are intensely curious, because they're already building the AI tools, the GPTs, the Gems, or whatnot,
193
00:22:49,432.0053895 --> 00:22:55,782.0053895
to help themselves with efficiency, with effectiveness, and starting to reimagine the work.
194
00:22:56,382.0053895 --> 00:23:02,142.0053895
See how they're performing already, because I bet you that they're already achieving some of these benefits.
195
00:23:02,322.0053895 --> 00:23:10,207.0053895
So those are just some really simple ways to get started and get some numbers on the board and say, here's what we're seeing with AI.
196
00:23:11,392.0053895 --> 00:23:17,142.0053895
I think that's such a good point, Liza. The folks that are the trailblazers, I think for a while, were just a little bit
197
00:23:17,997.0053895 --> 00:23:25,857.0053895
flying under the radar, right? How can I maximize my output without maximizing the effort I have to put in? I know a couple of those.
198
00:23:25,947.0053895 --> 00:23:33,797.0053895
One thing that people keep talking about is some early struggles with AI, around hallucinations and quality control.
199
00:23:34,47.0053895 --> 00:23:42,777.0053895
What practical steps do you have to ensure that AI-generated content aligns with brand standards, and how can you limit or
200
00:23:42,787.0053895 --> 00:23:45,577.0053895
protect yourself from horrible hallucinations?
201
00:23:46,337.0053895 --> 00:23:49,567.0053895
What was that report? Maybe Erin, you can remember.
202
00:23:49,777.0053895 --> 00:23:56,277.0053895
There's some report, from OpenAI I believe, that says AI models hallucinate anywhere from 30 to 80% of the time.
203
00:23:56,757.0053895 --> 00:23:57,897.0053895
And I was like, holy cow.
204
00:23:57,897.0053895 --> 00:23:58,407.0053895
80%.
205
00:23:59,337.0053895 --> 00:24:01,137.0053895
Yeah, that's a lot.
206
00:24:01,137.0053895 --> 00:24:01,587.0053895
That's right.
207
00:24:01,827.0053895 --> 00:24:02,517.0053895
But here's the thing.
208
00:24:03,827.0053895 --> 00:24:30,377.0053895
You'll observe that kind of hallucination if you are using it like a question-and-answer machine or like a Jeopardy partner, because it's not like a search engine where it's got a database and you just go into it, right? It's not a RAG model, right? So if you're asking it, what is the population of Nigeria, when did Elvis die, all sorts of trivia questions,
209
00:24:31,97.0053895 --> 00:24:47,207.0053895
the probability of a hallucination is higher. But if you have more nuanced questions, your strategic questions, what-if scenario analysis, that's what it's designed to do, right? It's designed to brainstorm with you and improve your thinking.
210
00:24:47,507.0053895 --> 00:24:52,102.0053895
It's not designed to have exacting answers for very specific questions.
211
00:24:52,722.0053895 --> 00:24:57,997.0053895
So if you're seeing a lot of hallucinations, then just start thinking about what you're using it for.
212
00:24:58,672.0053895 --> 00:25:02,302.0053895
Because more than likely, you're probably using it as a question-and-answer machine.
213
00:25:02,992.0053895 --> 00:25:14,92.0053895
But to your point, Ken, there are other areas where hallucinations do occur, right? There are several things I try to do to prevent it.
214
00:25:14,92.0053895 --> 00:25:18,832.0053895
So the first one is check your conversations and your data set.
215
00:25:19,252.0053895 --> 00:25:36,932.0053895
If it's getting super big, like super huge data sets or very long conversations, what happens is we run into the context window limitation, which is just a fancy way of saying the amount of information AI can remember in any given conversation.
216
00:25:37,172.0053895 --> 00:25:40,142.0053895
So the longer the conversation, the more data it has to ingest.
217
00:25:40,502.0053895 --> 00:25:44,942.0053895
Pretty soon you reach that threshold of what it can remember,
218
00:25:45,407.0053895 --> 00:25:56,987.0053895
and it starts making up stuff confidently. To the extent that we can, limit the dataset, limit your conversations, make them shorter, and then the probability of hallucinations
219
00:25:57,57.0053895 --> 00:26:01,667.0053895
goes down. The second thing is use multiple AIs.
220
00:26:03,42.0053895 --> 00:26:06,672.0053895
I think I've said this before: I have a conversation with ChatGPT,
221
00:26:07,12.0053895 --> 00:26:29,52.0053895
cut and paste that conversation, put it into Claude, ask Claude what it thinks, cut and paste, put it into Gemini, and they essentially check each other, right? God forbid that all three of 'em hallucinate at the same time, but when I've got three of them, I have a higher probability of knowing who's hallucinating, right? So that's the second thing.
222
00:26:29,742.0053895 --> 00:26:35,622.0053895
Then one of the things that I have done now very consistently is this: when it gives me an answer,
223
00:26:35,682.0053895 --> 00:26:40,892.0053895
let's say it's doing competitive analysis, and I say, give me your top three insights,
224
00:26:41,822.0053895 --> 00:26:42,722.0053895
I don't stop there.
225
00:26:42,992.0053895 --> 00:26:49,322.0053895
I say, for each of these insights, tell me your level of confidence, low, medium, high.
226
00:26:50,232.0053895 --> 00:26:55,902.0053895
Tell me what assumptions you made and give me your rationale for why you rated
227
00:26:56,592.0053895 --> 00:27:08,362.0053895
your confidence level the way you did. For the insights where you have low to medium confidence, tell me what other information you might need to increase your confidence.
228
00:27:09,562.0053895 --> 00:27:11,752.0053895
That's the prompt, right? Oh, I like that.
229
00:27:11,932.0053895 --> 00:27:11,992.0053895
Yeah.
230
00:27:13,402.0053895 --> 00:27:14,782.0053895
So I just posted that today.
231
00:27:14,907.0053895 --> 00:27:16,617.0053895
And we could put that in the show notes.
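(Consolidated from Liza's description above, that prompt reads roughly like this:)

For each of these insights, tell me your level of confidence: low, medium, or high.
Tell me what assumptions you made, and give me your rationale for why you rated your confidence level the way you did.
For the insights where you have low to medium confidence, tell me what other information you might need to increase your confidence.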
232
00:27:17,787.0053895 --> 00:27:21,357.0053895
You would be amazed how many times, let's say there's five insights.
233
00:27:22,152.0053895 --> 00:27:32,952.0053895
There are some low and medium confidence responses, and it says, I have low confidence in this because I only evaluated your data.
234
00:27:33,52.0053895 --> 00:27:34,762.0053895
I did not do any web search.
235
00:27:34,762.0053895 --> 00:27:36,682.0053895
We've only looked at three competitors.
236
00:27:36,862.0053895 --> 00:27:39,622.0053895
Like, it tells you, I would be more confident if
237
00:27:40,447.0053895 --> 00:27:54,597.0053895
we had some case studies, all sorts of things, right? And I'm like, dang, this is exactly what I needed, because it helps me as a human being validate the ones that are low to medium confidence, versus trying to figure out everything.
238
00:27:55,47.0053895 --> 00:27:56,607.0053895
All right, something new for this year.
239
00:27:56,637.0053895 --> 00:27:59,847.0053895
Liza, is the Gladiator segment, which I'm sure
240
00:27:59,847.0053895 --> 00:28:01,707.0053895
you've heard on other episodes.
241
00:28:01,707.0053895 --> 00:28:03,327.0053895
And so now you are up.
242
00:28:03,607.0053895 --> 00:28:09,467.0053895
You actually turned me on to Claude Artifacts, I love them, and remixing pieces to make something new.
243
00:28:09,617.0053895 --> 00:28:11,57.0053895
So I'm gonna put you on the spot.
244
00:28:11,777.0053895 --> 00:28:16,707.0053895
Can you show our audience how you do it live? Yeah, I will try.
245
00:28:16,807.0053895 --> 00:28:19,177.0053895
And this one's gonna be a fun one.
246
00:28:20,197.0053895 --> 00:28:22,657.0053895
And I will share my screen.
247
00:28:24,127.0053895 --> 00:28:29,167.0053895
You guys can't make fun of my tabs, okay? Because I have way too many tabs.
248
00:28:29,827.0053895 --> 00:28:32,257.0053895
So do you see my screen? Yeah.
249
00:28:32,317.0053895 --> 00:28:37,122.0053895
So it's not gonna be exactly live, but you'll see some fun things in here.
250
00:28:37,752.0053895 --> 00:28:45,662.0053895
There is this brand new report from the Marketing AI Institute called the 2025 State of Marketing AI report.
251
00:28:45,662.0053895 --> 00:28:47,882.0053895
It's got some really good insights.
252
00:28:47,932.0053895 --> 00:28:53,972.0053895
They surveyed a number of marketers across industries, across different company sizes.
253
00:28:54,332.0053895 --> 00:28:55,712.0053895
It's a very long report.
254
00:28:56,762.0053895 --> 00:29:07,167.0053895
Normally what I do is just put it into NotebookLM, and I have it in my ear, and it just basically tells me the key insights and things like that.
255
00:29:07,467.0053895 --> 00:29:12,367.0053895
But I wanted to do something more fun with the state of marketing AI report.
256
00:29:12,367.0053895 --> 00:29:20,747.0053895
And I was thinking about a use case: when we have offsites and we wanna do a really cool icebreaker, but at the same time
257
00:29:20,847.0053895 --> 00:29:26,927.0053895
make it educational, or we wanna make sure that we have interaction between the team.
258
00:29:26,927.0053895 --> 00:29:30,97.0053895
So I wanted to see what Claude could help me do
259
00:29:31,207.0053895 --> 00:29:36,247.0053895
to turn this into something interactive that can be used by the team and make it educational.
260
00:29:36,597.0053895 --> 00:29:42,567.0053895
What you see here is the State of Marketing AI report PDF. I simply uploaded that into Claude.
261
00:29:42,567.0053895 --> 00:29:44,617.0053895
This is Claude Sonnet 4.
262
00:29:45,157.0053895 --> 00:29:46,297.0053895
And here's my prompt.
263
00:29:46,607.0053895 --> 00:29:52,907.0053895
I say, please turn this report into a highly engaging and visually appealing interactive jeopardy game.
264
00:29:53,737.0053895 --> 00:29:55,987.0053895
For colors: blue, orange, gray, and white.
265
00:29:55,987.0053895 --> 00:29:58,477.0053895
Those just happen to be my brand colors.
266
00:29:58,817.0053895 --> 00:30:01,127.0053895
Then I said, please output the app itself.
267
00:30:01,247.0053895 --> 00:30:13,47.0053895
You can see that it gives some messaging around what it's doing in terms of the game features and the game structure, the different categories in the Jeopardy game, and so on and so forth.
268
00:30:13,137.0053895 --> 00:30:14,637.0053895
Scoring system.
269
00:30:15,507.0053895 --> 00:30:20,487.0053895
Some data points covered, and these are from the Marketing AI Institute. But
270
00:30:21,712.0053895 --> 00:30:23,67.0053895
I will show you what it output.
271
00:30:23,277.0053895 --> 00:30:25,737.0053895
So here's the Jeopardy game.
272
00:30:26,457.0053895 --> 00:30:27,597.0053895
So fun.
273
00:30:28,377.0053895 --> 00:30:32,367.0053895
So your conversation is still on the left-hand side.
274
00:30:32,367.0053895 --> 00:30:34,287.0053895
The Jeopardy game is on the right-hand side.
275
00:30:34,327.0053895 --> 00:30:41,887.0053895
It doesn't just output this; what you actually see is, while I'm waiting for the Jeopardy game, it codes.
276
00:30:41,987.0053895 --> 00:30:44,87.0053895
It codes all of these things.
277
00:30:44,377.0053895 --> 00:30:47,127.0053895
I didn't code this; I'm not sitting there typing this thing.
278
00:30:47,157.0053895 --> 00:30:56,772.0053895
I typed this thing right here on the left side, the prompt that I just read to you, and uploaded the PDF; it coded all of this, and then it gives you a preview
279
00:30:57,652.0053895 --> 00:30:58,582.0053895
of the game itself.
280
00:30:59,242.0053895 --> 00:31:07,12.0053895
And then once you are happy with the game, you publish it, right? But I was looking at this thing and I forgot to actually cite the source.
281
00:31:07,252.0053895 --> 00:31:15,742.0053895
So what I did was I prompted it a little more and said, please add a source citation at the bottom 2025 State of Marketing AI report.
282
00:31:16,552.0053895 --> 00:31:18,652.0053895
And then it recoded it.
283
00:31:18,732.0053895 --> 00:31:19,992.0053895
The source is indicated.
284
00:31:19,992.0053895 --> 00:31:21,492.0053895
So now I'm really happy with this.
285
00:31:21,492.0053895 --> 00:31:26,242.0053895
So I published this, and then I copied the link and put it in here.
286
00:31:26,242.0053895 --> 00:31:29,242.0053895
And now here is the Jeopardy game.
287
00:31:29,512.0053895 --> 00:31:33,262.0053895
So you can choose any one of these and let's see if we can.
288
00:31:33,712.0053895 --> 00:31:38,452.0053895
What percentage of respondents are in the experimentation phase? I have no idea.
289
00:31:38,452.0053895 --> 00:31:40,372.0053895
So I'm going to say 45.
290
00:31:40,492.0053895 --> 00:31:41,332.0053895
Oops, I'm wrong.
291
00:31:41,332.0053895 --> 00:31:44,282.0053895
It's 40, right? Company policies.
292
00:31:44,822.0053895 --> 00:31:49,812.0053895
What percentage of companies lack an AI roadmap strategy? I think that's gonna be pretty high.
293
00:31:49,872.0053895 --> 00:31:50,472.0053895
78.
294
00:31:50,532.0053895 --> 00:31:51,102.0053895
I'm still wrong.
295
00:31:51,102.0053895 --> 00:31:51,372.0053895
75.
296
00:31:54,462.0053895 --> 00:31:59,682.0053895
Like, at what company size does ChatGPT usage peak before declining? Oh, I don't know.
297
00:31:59,732.0053895 --> 00:32:00,512.0053895
Interesting.
298
00:32:00,902.0053895 --> 00:32:01,142.0053895
I know.
299
00:32:01,922.0053895 --> 00:32:02,732.0053895
Oh, I'm wrong too.
300
00:32:04,382.0053895 --> 00:32:06,122.0053895
I'm doing bad at this game.
301
00:32:06,272.0053895 --> 00:32:13,142.0053895
But anyway, you can see that my score is minus seven for the 25 questions.
302
00:32:13,142.0053895 --> 00:32:15,342.0053895
So it's just something fun to do.
303
00:32:15,342.0053895 --> 00:32:17,322.0053895
And the point of this is,
304
00:32:17,742.0053895 --> 00:32:19,182.0053895
I didn't have to code.
305
00:32:19,542.0053895 --> 00:32:27,132.0053895
This is what some are calling vibe coding, which is essentially a code word for: describe what you want, and AI builds it for you.
306
00:32:27,947.0053895 --> 00:32:30,792.0053895
So Ken Jennings would be proud.
307
00:32:31,152.0053895 --> 00:32:32,82.0053895
That was really cool.
308
00:32:32,82.0053895 --> 00:32:37,182.0053895
I appreciate you walking us through what the output is because that's what really matters.
309
00:32:37,232.0053895 --> 00:32:41,292.0053895
For most people, whoever is building it, it's great, but it's the output that matters. That's so cool.
310
00:32:41,292.0053895 --> 00:32:45,642.0053895
And that's something that someone could use in marketing, in sales enablement, in training.
311
00:32:45,642.0053895 --> 00:32:46,722.0053895
There's so many options.
312
00:32:46,812.0053895 --> 00:32:47,112.0053895
Yeah.
313
00:32:47,117.0053895 --> 00:32:56,172.0053895
I use it quite a bit for ROI calculators, right? Instead of using a spreadsheet, build an ROI calculator. I use it for dashboards.
314
00:32:56,877.0053895 --> 00:32:58,947.0053895
Marketing ops or CMOs,
315
00:32:58,947.0053895 --> 00:33:02,127.0053895
we have so many spreadsheets; turn it into a dashboard.
316
00:33:02,467.0053895 --> 00:33:11,517.0053895
Event planning: put your event plan into an interactive infographic or webpage, right? The possibilities are endless.
317
00:33:11,517.0053895 --> 00:33:17,877.0053895
And I think the beauty of this is we no longer have to sit on ideas.
318
00:33:17,907.0053895 --> 00:33:23,907.0053895
We can make it come to life, right? Like I have a lot of ideas in my head, but it's too hard to explain.
319
00:33:23,907.0053895 --> 00:33:28,317.0053895
But if I can show it in an interactive way, in a more tangible way,
320
00:33:28,352.0053895 --> 00:33:34,172.0053895
create a Jeopardy game out of it, then it's more engaging and you get to align people faster.
321
00:33:34,172.0053895 --> 00:33:38,12.0053895
You get your points across, hopefully generate some revenue more quickly too.
322
00:33:38,912.0053895 --> 00:33:45,782.0053895
I wanna switch over to the human aspect of work. Beyond the ability to prompt, what human skills,
323
00:33:46,737.0053895 --> 00:33:53,397.0053895
emotional intelligence, curiosity, critical thinking, will become even more valuable as we mature in our AI usage?
324
00:33:53,787.0053895 --> 00:33:54,117.0053895
Yeah.
325
00:33:54,117.0053895 --> 00:34:06,682.0053895
So all those, like what you just mentioned, right? Like I've always said, as AI democratizes IQ, EQ becomes increasingly important, right? To your point, the critical thinking, empathy, and all those things are important.
326
00:34:06,742.0053895 --> 00:34:10,972.0053895
But I believe as marketers, balance is actually more important.
327
00:34:11,617.0053895 --> 00:34:18,337.0053895
Because what we are going to need to do is we need to balance innovation with ethics.
328
00:34:19,252.0053895 --> 00:34:26,822.0053895
Automation with the human touch, personalization with transparency, we have to look at both sides.
329
00:34:26,852.0053895 --> 00:34:32,82.0053895
We can't be all human, because we'd forget about all the benefits that we get out of using AI.
330
00:34:32,82.0053895 --> 00:34:34,992.0053895
And we can't be all AI, because we'd forget about the human.
331
00:34:35,232.0053895 --> 00:34:41,507.0053895
So this whole balance thing, I actually don't know how well AI would do with balance.
332
00:34:41,867.0053895 --> 00:34:51,917.0053895
We talk about the human aspects, but I'm like, okay, AI, can you actually balance? That's another dimension that's a lot harder for an AI to do.
333
00:34:51,917.0053895 --> 00:34:53,417.0053895
In my opinion, at least today.
334
00:34:53,657.0053895 --> 00:34:55,277.0053895
I never know what's gonna happen in the future.
335
00:34:55,277.0053895 --> 00:34:56,687.0053895
It's advancing so quickly.
336
00:34:56,687.0053895 --> 00:35:00,717.0053895
So I was gonna say, let's keep squinting into the future a little bit, and
337
00:35:01,92.0053895 --> 00:35:14,582.0053895
of all the go-to-market functions that exist, what new role is gonna be headlining the LinkedIn job boards, and which roles are gonna go extinct? There is such a good article in the New York Times.
338
00:35:14,582.0053895 --> 00:35:19,12.0053895
It came out last week, and I just listened to the NotebookLM on that one.
339
00:35:19,222.0053895 --> 00:35:25,542.0053895
So I would recommend reading that article, but here's my 2 cents: jobs that guide AI.
340
00:35:27,562.0053895 --> 00:35:52,112.0053895
Because now we're going to have AI teammates and who is going to orchestrate, who is going to direct them? Who is going to guide them with a moral compass? Who is going to say that the work is good or not good? Who is going to put their stamp of approval so that it holds up in court, right? Like those jobs are going to rise.
341
00:35:52,472.0053895 --> 00:35:59,792.0053895
As for the ones that will go away: I've always said 60% of the jobs that we have today did not exist in 1940.
342
00:36:00,42.0053895 --> 00:36:04,992.0053895
There were no software developers, social media managers or web designers back then.
343
00:36:05,242.0053895 --> 00:36:09,682.0053895
At the same time, we no longer have elevator operators, and we no longer have
344
00:36:09,952.0053895 --> 00:36:11,2.0053895
stenographers.
345
00:36:11,102.0053895 --> 00:36:25,702.0053895
There will be job replacements for the ones that are repeatable. One of my good friends who's a language editor, her job is gonna go away; she basically said, I'm out.
346
00:36:26,112.0053895 --> 00:36:31,142.0053895
And guess what she's doing now? She's coaching people with English as a second language.
347
00:36:31,742.0053895 --> 00:36:31,832.0053895
Boom.
348
00:36:33,347.0053895 --> 00:36:38,697.0053895
That is gonna be a lot harder for AI to replace, because it's about understanding the culture.
349
00:36:38,697.0053895 --> 00:36:44,937.0053895
It's the nuances of language and culture, rather than just editing
350
00:36:46,17.0053895 --> 00:36:58,737.0053895
English words. Being able to transition from something that is fairly mechanical to something that has emotional, cultural, and political value is something that people will need to think about.
351
00:36:58,797.0053895 --> 00:37:03,742.0053895
Alright, Liza, you probably remember from last year, we close out with some rapid-fire questions.
352
00:37:03,867.0053895 --> 00:37:06,207.0053895
So you ready? Fill in the blank.
353
00:37:06,267.0053895 --> 00:37:11,187.0053895
AI is powerful, but it can't feel.
354
00:37:12,87.0053895 --> 00:37:18,477.0053895
What is one AI tool you can't live without? Liza GPT.
355
00:37:19,197.0053895 --> 00:37:19,677.0053895
Perfect.
356
00:37:19,767.0053895 --> 00:37:21,267.0053895
One prompt, a digital twin.
357
00:37:21,627.0053895 --> 00:37:22,77.0053895
I love it.
358
00:37:22,77.0053895 --> 00:37:22,647.0053895
That's perfect.
359
00:37:23,247.0053895 --> 00:37:25,587.0053895
What role will go extinct? We just talked about that.
360
00:37:26,397.0053895 --> 00:37:33,57.0053895
If you could instantly upgrade every go-to-market professional on one skill, what would it be? Balance.
361
00:37:34,347.0053895 --> 00:37:34,737.0053895
Love it.
362
00:37:35,377.0053895 --> 00:37:36,277.0053895
That was awesome.
363
00:37:36,447.0053895 --> 00:37:38,517.0053895
Thank you so much, Liza, for joining us.
364
00:37:38,517.0053895 --> 00:37:40,167.0053895
It's always great to have you.
365
00:37:40,267.0053895 --> 00:37:42,517.0053895
We really appreciate it and I know our listeners do too.
366
00:37:42,522.0053895 --> 00:37:42,912.0053895
Thank you.
367
00:37:43,362.0053895 --> 00:37:45,392.0053895
Thank you so much for having me.
368
00:37:45,572.0053895 --> 00:37:46,472.0053895
You guys are awesome.
369
00:37:46,962.0053895 --> 00:37:47,432.0053895
Thank you.
370
00:37:47,437.0053895 --> 00:37:48,537.0053895
Thanks, Liza.
371
00:37:49,237.0053895 --> 00:37:49,897.0053895
We'll be right back.
372
00:37:50,737.0053895 --> 00:37:55,232.6911038
Bye.
373
00:37:55,232.6911038 --> 00:37:56,932.6911038
Another fun conversation with Liza.
374
00:37:57,242.6911038 --> 00:38:05,132.6911038
Ken, what was your biggest takeaway? The biggest takeaway from that conversation was at the very end, when she was talking about what skill
375
00:38:05,882.6911038 --> 00:38:09,812.6911038
she wished every go-to-market professional had right now, and she said balance.
376
00:38:10,112.6911038 --> 00:38:19,412.6911038
And with AI moving so quickly, you gotta keep focused on the human aspect and a little bit of self-regulation to not overwhelm yourself.
377
00:38:19,562.6911038 --> 00:38:23,547.6911038
So I do think balance is actually going to be a skill that people
378
00:38:25,817.6911038 --> 00:38:28,907.6911038
explicitly plan to develop over the next couple years.
379
00:38:29,197.6911038 --> 00:38:32,647.6911038
And I also think balance from a perspective of how you lead a business.
380
00:38:32,647.6911038 --> 00:38:41,157.6911038
We've talked about this before, but the idea of Jacks and Jills of all trades, this is maybe their era to thrive, because they have a skill set that's very balanced.
381
00:38:41,277.6911038 --> 00:38:45,937.6911038
So I think balance is going to be the theme of the next five years.
382
00:38:45,937.6911038 --> 00:38:46,837.6911038
That's my mindset.
383
00:38:47,437.6911038 --> 00:38:48,127.6911038
Oh, I like that.
384
00:38:48,367.6911038 --> 00:38:48,577.6911038
Yeah.
385
00:38:48,787.6911038 --> 00:38:55,777.6911038
What about you? What's your takeaway? I think the Gladiator is one of my favorite segments that we've launched this season.
386
00:38:55,857.6911038 --> 00:39:09,277.6911038
And I think making it fun, the Jeopardy game was a prime example of something that could be dense to read through, a really dense research report that you want people to be aware of.
387
00:39:09,517.6911038 --> 00:39:14,957.6911038
But being able to make it fun, and being able to do that without an engineer and create a little game, is
388
00:39:15,247.6911038 --> 00:39:15,787.6911038
pretty neat.
389
00:39:15,787.6911038 --> 00:39:22,857.6911038
I'm a big fan of the creativity. As I mentioned, Liza was the one that turned me onto Claude Artifacts, and I've had a lot of fun playing with them.
390
00:39:22,857.6911038 --> 00:39:25,367.6911038
But I think that there's so much more potential there.
391
00:39:26,777.6911038 --> 00:39:27,647.6911038
Yeah, I agree.
392
00:39:27,647.6911038 --> 00:39:33,827.6911038
I think making AI fun is also how people will stay balanced and sane when doing all of this.
393
00:39:34,7.6911038 --> 00:39:35,767.6911038
And I'm gonna try to make like a game.
394
00:39:35,767.6911038 --> 00:39:37,627.6911038
I don't know for what use yet, but
395
00:39:38,252.6911038 --> 00:39:39,92.6911038
I'll figure something out.
396
00:39:39,92.6911038 --> 00:39:39,422.6911038
I'm sure.
397
00:39:39,602.6911038 --> 00:39:40,52.6911038
Hey, I like it.
398
00:39:40,52.6911038 --> 00:39:42,542.6911038
Let's put balance on our Bingo card for this year.
399
00:39:43,502.6911038 --> 00:39:44,72.6911038
I love that.
400
00:39:44,562.6911038 --> 00:39:46,392.6911038
It's been another great episode.
401
00:39:46,392.6911038 --> 00:39:52,992.6911038
Special thanks to our friend of the pod, Liza Adams, for sharing all of her great insight.
402
00:39:53,92.6911038 --> 00:39:55,402.6911038
And thank you for tuning into FutureCraft
403
00:39:55,402.6911038 --> 00:39:58,912.6911038
Go-to-Market. If you wanna stay ahead of what's going on in AI and go-to-market,
404
00:39:59,602.6911038 --> 00:40:05,2.6911038
give us a follow. If you really like us, we'd love it if you could give us a review, leave a comment, and share with a friend.
405
00:40:05,212.6911038 --> 00:40:07,102.6911038
This is how we get our voice out there.
406
00:40:07,522.6911038 --> 00:40:09,292.6911038
And thanks for listening.
407
00:40:10,12.6911038 --> 00:40:10,402.6911038
Yeah.
408
00:40:10,672.6911038 --> 00:40:15,472.6911038
And until next time, let's keep crafting the future of AI and go to market together.
409
00:40:15,772.6911038 --> 00:40:16,312.6911038
Thanks.