All Episodes

May 21, 2025 47 mins

As AI redefines how products are built and customers are understood, what are the core strategies engineering leaders use to drive innovation and create lasting value?

Join Conor Bronsdon as he welcomes Wade Chambers, Chief Engineering Officer at Amplitude, to explore these critical questions. Wade shares how Amplitude is leveraging AI to deepen customer understanding and enhance product experiences, transforming raw data into actionable insights across their platform. He also discusses their approach to navigating constant change while building an adaptable, high-performing engineering culture that thrives in the current AI landscape.

The conversation explores Amplitude's strategy for building a sustainable AI advantage through proprietary data, deep domain expertise, and robust feedback loops, moving beyond superficial AI applications. Wade offers insights on fostering an AI-ready engineering culture through empowerment and clear alignment, alongside exploring the exciting potential of agentic AI to create proactive, intelligent copilots for product teams. He then details Amplitude’s successful approach to integrating specialized AI talent, drawing key lessons from their acquisition of Command AI.


Chapters

00:00 Introduction and Guest Welcome

01:55 Understanding and Acting on Data with AI

06:42 Amplitude's Unique Position in the Market

08:36 Differentiation and Competitive Advantage

09:58 Incorporating Customer Feedback

12:48 Evaluating AI Outcomes

17:21 Agentic AI and Future Prospects

21:38 Acquiring and Integrating AI Talent

28:44 Building a Culture of Innovation

37:21 Advice for Leaders and Individual Contributors

43:26 The Future of AI in the Workplace

45:38 Closing Thoughts


Follow the hosts

Follow Atin

Follow Conor

Follow Vikram

Follow Yash


Follow Today's Guest(s)

LinkedIn: Wade Chambers

Website: amplitude.com


Check out Galileo

Try Galileo

Agent Leaderboard


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
When you get those three things right, clear alignment, cultural
onboarding, empowerment, that doesn't mean that it's going to
be perfect, but I've seen a significantly better outcome
when those things are all true. Well, welcome back to Chain of
Thought, everyone. I am your host, Conor Bronsdon.

(00:21):
Today, we have Wade Chambers, Chief Engineering Officer at
Amplitude joining us. Wade, welcome to the show.
Thanks for having me. It's our pleasure.
In preparation for this episode,I had the chance to chat with
you, and honestly, it got me so fired up for our conversation
because you said something I really enjoyed among multiple
things you'd covered, which was the pace of innovation, how fast

(00:44):
AI is moving. And you said that anybody that
thinks they've got it figured out is probably wildly off.
And honestly, I think that's just such a healthy perspective,
especially in the context of what Amplitude does.
You're not just providing a tool, you're helping companies
navigate this evolving landscape to understand their users, to
make better decisions. And Wade, you're in a unique

(01:06):
position at Amplitude helping to lead engineering through this AI
revolution, something we've talked a lot about on the show
with recent guests like Charity Majors.
So let's jump right in. How is Amplitude leveraging AI
to solve its core mission? Well, I mean, first and
foremost, Amplitude, for those of you who don't know, is this

(01:26):
digital analytics platform. We, we help companies understand
their customers so they can build better product
experiences. And, and you have to think about
that and all the various aspects of it, right?
Like you have to understand what's going on, you have to
make decisions, then you have to be able to act on the other side
of it. Each one of those is an entire
thing we could talk about. But if you think about AI,

(01:50):
consider the heart of each one of those things, whether it's
understanding, deciding or acting on it.
We can use AI based insights to reduce guesswork.
And I mean, like this brilliant person probably can use 20% of
the data, but 100% of their brain.
Well, maybe not even 100% of their brain.
But imagine if you had somethingthat could use 100% of the data

(02:14):
to give you more accurate insights that accelerate the
decision making process and could even make recommendations
associated with that. And then could allow you to act
on that in a way that I may want to try something or I need to
change the wording, or even that I need to put a guide or a

(02:34):
survey or some other experience in place.
We leverage AI in a lot of different ways that helps
accelerate across all of those various areas for thousands of
customers. Is there a particular area where
you're seeing the most promise so far?
I mean, if you just think about history, right?
Like there has been so much emphasis over the last 10 years

(02:55):
or so on gathering up the data and trying to process it.
And then comes the herculean task of I need to make sense of
this. I need to understand
what it's telling me because it could be conflicting and there
could be dirty data and there, there could be a lot of
different things. And so I think that the ability
to accelerate the understanding of what's going on is really one

(03:19):
of the key value propositions that early on Amplitude solved.
And I think we've expanded beyond that.
But I still go back to that core.
I think anything that helps me understand what's going on then
allows me to move forward at a faster pace.
And regardless of whether it's an insight or a chart that helps

(03:39):
you understand what's going on, you may even need a lot of
different things stitched together, but that ability to
accelerate understanding, then the rest of it becomes much
more manageable. I think this perspective speaks
to your experience and the knowledge you've developed over
the years, whether that's VP of Engineering at Twitter, Proof

(04:02):
point, Yahoo, all these other places where you have helped
develop the engineering teams and their approach.
And now this new AI enabled age has dawned.
And you've obviously been involved with Amplitude since
before then with, you know, eight years now as an advisor
before becoming the chief engineering officer.

(04:23):
How have you seen the industry shift around you and how has it
shifted your perspective on the work being done by engineering
teams as AI tools become almost table stakes?
I think we're still mid-transition. Matter of fact,
not even mid. I'd say we're very, very early
stage on a lot of that. I think that there's still a lot

(04:48):
of organizations that understand the value of data to accelerate
that understanding, deciding and acting, but they haven't figured
out how to really tap into that. And now they're feeling a lot
more pressure to take and leverage AI in a real way.
And so struggling with both of those things at the
same time is pretty hard.

(05:10):
I have seen a lot of my peers, you know, really struggle.
And same with me. I'm not saying that I'm, I'm
somehow not experiencing the same thing at the same time, but
it's like, how do you help your teams both work from a

(05:31):
historical point of view where they built up all these skills,
competencies, capabilities, and know that that's changing in
real time, right when your source code base can talk back
to you and tell you what's going on, or you might want to
accelerate your understanding of a problem and potential

(05:51):
solutions. And how quickly can I get to an
answer leveraging AI? That's a lot to sort of take
into account. And I think that there's a lot
of companies going through that same process right now.
And the amount of data that is going to be available to them is
only just going to multiply in the years in front of us.

(06:13):
And yet the pace is going to be moving faster and faster.
How do you take all of those very skilled engineers that
you've got in your organization and use them to their highest
potential while all of this is going on?
It is a very big challenge in today's market.
It's exciting, it's fun, but it's challenging all at the

(06:37):
same time. This is also where I see
Amplitude as really well positioned because of the
data that you provide to your users and the ability that
Amplitude has to help identify friction points in funnels.
So this feels like a, a great opportunity given where AI is

(06:58):
going to say, hey, look, we have this already-built data funnel,
this analytics platform, we're helping you unlock your data
already. Let's just take this next step.
That's exactly right.
We feel very fortunate. I mean, a lot of it is,
what are people doing on their site?
They are voting with their actions.
But the more that you can understand where there's

(07:20):
frustration in your platform, like who's rage clicking or
who's having a lot of frustration on a single page,
where are there multiple mistakes being made over and
over and over again. You know, Amplitude has put
together a lot of products to complete an entire platform that
allows you to engage these users where they're at, from session

(07:43):
replays to experimentation to even activating things outside
of the platform that I want to go and market to users moving
forward. There's a lot of different
places that we can help accelerate the understanding and
then the action on the other side of that by using data.
And we've got years worth of understanding the value of both

(08:05):
qualitative and quantitative data in all of this and and how
it can help you make better product decisions.
Do you see unlocking that customer data and the years of
experience Amplitude has around building that data
infrastructure as a competitive advantage compared to other
companies that are more so building thin wrappers on top of

(08:26):
LLMs? How are you going to position in
this new era? I mean, I, I think you really
have to think about differentiation in what sort of
sustainable advantage do you have and what proprietary data
do you have. I mean, we understand a lot of
how users are engaging with the product.

(08:48):
So those usage insights and we've established feedback loops
that we can take advantage of and so that you understand the
behavioral data inside of your system, not just the stats on
one end or the other, but like what active
cohorts do you have and how are they behaving differently and
where do they have friction and what can you do about it?

(09:10):
We have the ability to take all of that data and unlock it for
you. And the advantage of doing
it for a long period of time is you find the low-hanging
fruit and then you start to look beyond that and you start to
discover deeper patterns that you can use to unlock it.
And so as we can leverage that domain expertise and we have

(09:32):
those data sets and we continue to improve and build upon that.
And we've used that iterative learning on top of that, we've
built this deeper intelligence that I think is
pretty hard to replicate. Tell me more about this.
How does Amplitude incorporate feedback to refine its own AI and
agentic AI offerings? And then how are you doing that

(09:54):
for customers? We do it in, in a bunch of
different ways. There is no substitute for
customer interaction. And so we spend a lot of time
making sure that we understand our customer pain or
friction, or where they're trying to go.
It's commonplace that you will find Amplitude engineers deeply

(10:14):
embedded in an account making sure that they understand it's
not an abstraction for them. They actually can talk
about what Jill is doing on the customer side and the challenges
that they're having. And so therefore they can
kind of work through that. We dogfood a lot internally.
So we're using our own products where we have a bunch of metrics

(10:38):
that we set quarterly goals against, annual goals against,
and we're tracking on a weekly basis just to see
how we're moving against those goals and objectives.
We go out and we talk on a weekly basis around the
theses that we're seeing in the market and like how we should be

(11:00):
responding to them. And then we set up customer
panels to go and talk to them about those same things.
And so we'll have executive advisory boards, we'll have
customer boards, all to make sure that it's not just an
inside-of-Amplitude echo chamber, but it's actually accurately
reflecting what we're seeing in the market.
This speaks to that qualitative and quantitative

(11:21):
combination that you spoke to earlier, but there's still this
challenge around as you push the boundaries of AI, as you try to
leverage your data more, how do you ensure responsible
implementations and effective implementations in production
and not just in testing? How are you addressing that?
There's a few different things that come to mind just as
we're talking through that. One is we've tried to establish

(11:44):
guidelines for ourselves and so we've said we need to maintain
transparency just around model decisions and data usage and
allow our customers to say what we can or cannot do with data.
Then we tried to put guardrails in place to prevent harmful or
unintended outcomes. And we're constantly measuring to

(12:06):
make sure that the thesis matched the results on the other
side of it. And then we do a lot to
make sure that we have long-lived relationships
with our customers to be able to do this.
And so we need to demonstrate ethical standards for the long
term around what we do with users and make sure that they're

(12:29):
completely on board with that.
So we've tried to make sure that we're setting guidelines, we're
behaving responsibly associated with that,
and just acting with full transparency as we move forward.
How are you evaluating these outcomes and
ensuring that there's not this challenge that comes in with the

(12:52):
non-determinism from AI tooling, of hallucinations,
mistakes that are made? Of course,
I should add, we also do have mistakes from humans
as well, but we already have mechanisms in place to address
those. What do you do with AI?
Inspection, a lot of inspection internally.
So we have a thesis on what should come out on the other

(13:12):
side and we're constantly internally using the product to
make sure that we're seeing what was intended associated with
that. We've got an AI agent
internally that we can use, Ask Amplitude.
And so you can ask it a lot of different things and we're
constantly monitoring to see what it's being asked and the
sort of responses that it's creating to make sure that we're

(13:34):
not seeing hallucinations do. You have an evaluation platform
in place? Or how are you approaching this?
We do and we have tried to measure against synthetic
results as well. And so we'll go through
and generate synthetic results and then compare and
contrast to that as well as previous norms and what we've

(13:56):
seen coming out of that. And so between the two, you've
kind of got a historical view and one that's been generated on
the other side. And so then you can compare the two
models. I think this speaks to something
we discussed a couple minutes ago, which is this incorporation
of qualitative and quantitative,where you both have, in this
case, automated evaluations happening through LLM

(14:19):
judges with synthetic data sets and then continuous learning
from human feedback. Talk to me more about how you're
incorporating these approaches into your evaluation and testing
process. It's a variety of different ways
that we do that, but the way that we've built the
organization is to make sure that teams feel like

(14:41):
they have ownership around long term goals.
And so then they are able to sort of build inside of that
domain expectations of what goodlooks like or where you might be
off base. And that can go both from an
output perspective, but also from a cost or performance or
latency perspective as well. And you want to make sure that

(15:04):
you've got the right frameworks that can measure all of those
things for each and every rev that you've got going
out. I wouldn't say that we're golden
across the board on every dimension.
I don't know any company who's absolutely perfect in all of it,
but we are using more and more frameworks to make sure that we
know the delta from one version of the product from the

(15:27):
previous. So if I'm understanding
correctly, depending on the organization within Amplitude
and the goals of that part of the product, you're actually
setting different KPIs around what you're looking for,
any from your evals, from your observability, from your AI
tooling, whether that is, you know, speed and low latency as

(15:48):
you brought up earlier, reducing cost or hitting actual outcomes
against a ground truth data set that you've kind of established
from the start. That's exactly right. And again,
you can imagine infrastructure teams having very different
goals and ways of measuring it like uptime, availability, even
down to like disk utilization and are we getting ready to run

(16:11):
out; to data teams: throughput, performance, what's the latency
from receiving a record to it being live on the site?
You want that as fast as possible, to the cost of
processing all of that, to platform capabilities, to even
end-user features. Each and every one is going to

(16:33):
have a slightly different way of looking at it, with different
frameworks in place to test for it and monitor against it.
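Wade's description of each team defining its own dimensions and tracking the delta from one rev to the next can be sketched in code. This is only an illustration of the idea; the dimension names, metric types, and numbers below are hypothetical, not Amplitude's actual framework:

```python
# Hypothetical sketch of a per-team evaluation framework: a team
# defines its own dimensions (quality, latency, cost, ...) and
# computes the delta between two product revisions per dimension.

from dataclasses import dataclass


@dataclass
class RevMetrics:
    """Metrics gathered for one revision of a product area."""
    quality: float            # e.g. agreement rate with a ground-truth set, 0..1
    p95_latency_ms: float     # 95th-percentile latency in milliseconds
    cost_per_1k_calls_usd: float


def delta_report(previous: RevMetrics, current: RevMetrics) -> dict:
    """Compare two revisions dimension by dimension.

    A positive quality delta is an improvement; a positive latency
    or cost delta is a regression.
    """
    return {
        "quality_delta": round(current.quality - previous.quality, 4),
        "latency_delta_ms": round(
            current.p95_latency_ms - previous.p95_latency_ms, 2
        ),
        "cost_delta_usd": round(
            current.cost_per_1k_calls_usd - previous.cost_per_1k_calls_usd, 4
        ),
    }


# Example: a feature team comparing rev N-1 and rev N.
prev = RevMetrics(quality=0.82, p95_latency_ms=900.0, cost_per_1k_calls_usd=1.20)
curr = RevMetrics(quality=0.86, p95_latency_ms=950.0, cost_per_1k_calls_usd=1.10)
print(delta_report(prev, curr))
```

An infrastructure team would swap in uptime or disk-utilization dimensions instead; the shape of the report stays the same, which is what makes the version-to-version delta comparable.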
This is a really interesting example because I think there's
still a lot of organizations that are largely building
internal AI use cases and Amplitude's been aggressively
saying, no, we're going to be one of the
leaders in externalizing these use cases for our customers so

(16:56):
that we can actually deliver additional value to them and not
just speed up internal processes.
And we're going to take the time to ensure they're accurate,
ensure that we've evaluated them, and ensure that we understand
ground truth and are tracking how these can deliver
for our customer base. What's Amplitude's perspective
on continuing to grow this and building more with agentic AI as

(17:17):
in the Ask Amplitude agent that you mentioned earlier?
Yeah, I love the question and I love where agentic is going
and I think we're big fans of its possibilities.
I mean, if you think of it, it turns it from a passive or
reactive to a very proactive capability inside of this.

(17:38):
If you can take and break down your product into all of its
various pieces and it's almost like having a PhD or a group of
PhDs working on each isolated part of your problem.
Agentic AI, I think, applied to this space gives our customers
the ability to accelerate their understanding, deciding and

(18:01):
acting and actually have an active participant in so
doing. And so any time that we can
provide our customers that sort of active partner for their
product teams or marketing teams, you've got a dynamic
system that can not only flag friction, but can also suggest,
test, and refine solutions in real time.

(18:23):
So instead of simply waiting
for analysts to sort of dig
through these dashboards, these intelligent copilots, for lack
and tailor next steps based on both qualitative data
conversations, experiments, things like that you've seen in

(18:44):
the system as well as qualitative feedback inside of a
customer reactions, product usage patterns, etcetera.
And we feel like if you've
got those agents doing those
sort of things on our customers' behalf and you've got the humans

(19:05):
We preserve the judgement and strategic thinking that only
people can bring to this, while letting AI handle a lot of the
heavy lifting of pattern detection and rapid iteration.
I think that creates an awful lot of real power in the future
that emerges, you know, from this domain expertise
that Amplitude has built up over time.

(19:27):
Our years of analytical experience should fuel agentic
models that understand where to look for friction and why it
matters. And I think that result is a
platform that grows smarter witheach and every interaction and
evolves from a reactive data tool into a trusted ally for
shipping better product experiences faster.

(19:48):
I love the excitement I'm hearing from you about this idea
of an agentic future where humans are managing teams of
agents to extend their capabilities and let them be
more strategic. What's convinced you that this
is where we're going as I guess a tech industry, but just
broadly within the way we work? I mean, so, one, listening,

(20:12):
hearing, seeing. But I would say that largely I
was on the cynical side of things for a while.
And yet I can look out there and experience entire companies that
have remade themselves. And then as you dig

(20:32):
into those companies, I mean, one of the benefits of
being around a while is you've got a pretty wide network
that you can talk to. Hey, what are you doing and how
are you seeing it? And it's not a toy anymore,
right? Like people are actually able to
do all the things that we're talking about.
And so for me, it's not a leap of faith, it's more of a

(20:54):
recognition of a truth. And so when you can see it
applied in action and as you dig into it, you can see lots of
people being successful with it. Then the real question is
like, how can I better understand it and harness its
abilities in much the same way? So I think that over the last
year, especially, right, like the speed and the clarity and

(21:19):
transparency of decision making,doing research and able to build
a genetic solutions on top of it.
Health has broken through from anovel idea to no it It's ready
to do some things in production.And I know part of amplitude
strategy to succeed in this new era has been to.

(21:41):
acquire AI talent, both through hiring and through
some acquisitions. How has Amplitude's experience been with
acquiring AI talent, particularly given the Command
AI acquisition, and how has it influenced Amplitude's AI
initiatives internally? Anytime you can infuse different

(22:01):
thinking into a company, right, I think you get to benefit from that.
And if you can get enough of the AI genetics together in a group,
like I think it multiplies. The Command AI team is, you know,
just amazing, like their leadership down to every single
individual inside of the group, great talent inside of there,

(22:25):
highly motivated. Now, when you can take that and
overlay it with amplitudes mission and where we're going,
when there's high alignment and there's high ownership inside of
that, you find that like the, the glue forms and like they're
able to come in and do great things.

(22:45):
Then it's really, you know, can we help set them up for success?
Can we give early wins? Can we get them true ownership
in the company moving forward? And so they feel like part of it
and they feel like they're not subordinate, but they've
actually we're Better Together and we're winning because we
have a lot of benefits to provide to the team and they

(23:07):
have a lot of benefits to provide to the greater
organization. And we've seen that be very true
even in a short period of time. The Command AI team has already both
embraced and extended the culture internally.
They've had a real influence on the road map.
And those leaders are, you know,future leaders in the

(23:28):
organization and already showingup that way.
I feel like the topic of avoiding the, or call it organ
rejection, when you're making these acquisitions, to follow
your metaphor here of accelerating the company's AI
genetics. When you are making this
adaptation, often you'll see people that thrash out after

(23:48):
time they're saying, you know, this company, I, I joined this
much larger company. It's not what I expect or it's
not the, the work style I want anymore.
That can be a challenge that it's hard to avoid.
You've obviously had to do this at multiple points throughout
your career. What strategies are you applying
today to ensure alignment and make sure that these incredible

(24:09):
talents that you've brought in the company, these amazing
engineers are able to do their best work and are excited about
Amplitude? Yeah, I mean, I feel like it's a
truism that, you know, belief systems drive behaviors, and
behaviors help develop competencies, whether it's

(24:29):
cognitive, personal or social capabilities in there.
So I always pay attention a lot to the belief systems.
And so step one for me is: is there clear alignment?
Are there shared objectives and success metrics that
everybody believes in, whether you're on the acquiring side or

(24:50):
on the being acquired side? First and foremost, do you
believe in and align with those shared objectives and goals?
Secondly, then you have to like almost do a cultural onboarding.
You have to have clear lines of communication, clear ownership,
clear wins in the early stages, ensure new teams

(25:13):
understand existing processes,
system where everybody can execute as fast as possible on
shared code bases, doing those sorts of things.
And then really it's around empowerment.
It's like, can I grant them autonomy to execute on the
vision that they were brought in to advance and also give them

(25:36):
the resources to be able to do that?
When you get those three things right, clear alignment, cultural
onboarding, empowerment, that doesn't mean that it's
going to be perfect, but I've seen a significantly better
outcome when those things are all true over my career.
Often it seems like the most challenging part of this

(25:57):
three-step recipe you provided is that empowerment piece of
making sure leaders and individuals who come into an
organization actually have responsibility and ownership and
are truly empowered to go out there and make the results
happen. How are you
fostering an environment where not just folks who are coming in

(26:21):
from acquired companies but any new talent are able to
thrive and contribute to Amplitude?
I think you're going to find similar themes.
It's like you're constantly looking to set people up for
success, give them a chance to win, but then give them the
ownership. Like there's a difference

(26:41):
between ownership, accountability and
responsibility. And I think you just need to be
clear. It's a cheeky way of saying
it, but I oftentimes talk
internally about like a point
problem. It's important, right?
Like if you don't know about it, you can't do anything about it,
and so the absence of that information means it will go

(27:03):
unnoticed. But if you get one point for
flagging a problem, you get 10 points for getting to the root
cause of a problem. You get 100 points if you can
accurately, from first principles, define a solution
that maps best, and then you get 1,000 points for actually
delivering the impact associated with that solution.

(27:27):
The reason for saying that out loud is that the further you get
into that stack, right? Like you just want to continue
to bet over and over. So first is just giving people
access to get as high up in that chart, and then coach and train,
provide context, help make sure that they're talking to the
right people, and also help make sure that other people know that

(27:49):
you're holding somebody accountable or you've given them
ownership of a specific area.
And I think that when you can also act in accordance with
that, it no longer feels performative.
You know, it doesn't feel like, oh, this was a thing.
It feels like, no, we are actually trying to onboard
people, give them opportunities to excel and then remove

(28:11):
roadblocks from them. And when that works out, new
people that have just joined see that happening and like, well,
if it can happen for them, it can also happen for me.
And once they've done it a time or two, they're going to tell
their friends, hey, like this is a place that actually believes
in empowerment and ownership. Come do something cool.

(28:32):
And slowly but surely, that's the culture that you're
building internally. And that's the kind of culture at
Amplitude. I'm a big
believer in the idea that this social system has to be
integrated tightly into the technical systems, this
sociotechnical system theory, in order to be truly successful.

(28:55):
And I think you've done a great job throughout our conversation
here talking about both the data and metric side of like, OK,
we're setting goals, we're finding places of impact.
We're creating this feedback loop and then now bringing in
here's how we're building a culture where people feel enabled
to actually go out and solve problems and encouraged to do so.
How do you meld those together so that

(29:18):
everything actually works in concert and so that leaders and
individuals feel enabled to actually go out and solve
problems. Yeah, it's not one thing,
probably a system of things. But first and foremost, it
always starts with the customer. You have to be clear on
where the pain or the friction or the gain is.

(29:42):
And it can't be about ego, right?
It has to be that the customer is greater than the team, which is
greater than the individual, and all of your behaviors, reward
systems, everything needs to kind of start along those same
lines. As you get into it, then you can
start decomposing that into where's there the biggest
opportunities internally. And I like the concept of NCTs,

(30:06):
narratives, commitments, and tasks.
But on the narrative part, three key sentences are very important
for me. Number one: today, comma, visceral painting of problem.
Number two: when we deliver this, comma, visceral painting of solution
that highly contrasts from the problem, so that anyone can look
at it and go, like, I get why we're doing this.

(30:28):
And they get what comes on the other side.
And then the third sentence is to get there, comma, you must
crawl, then walk, then run. And if you can paint that that's
not so baked that it's beyond the ability to debate and
discuss. And so then you actively engage
in that debate to say, where are we off?

(30:49):
Who has the con on this being the most visceral problem
or this being the right solution?
There's a thing that happens as you go through that debate.
People start to feel ownership in this.
And when they feel ownership, they'll use discretionary effort
to go figure out new ideas and they'll be thinking about it in
the shower the next day, and the best idea should win.

(31:11):
And so as they come back and you start thinking about it, right,
you'll see more and more people's DNA show up in the
results. And as you get to that, right,
like you have this sense of, you know, I have ownership
in this solution. And so the cultural sort of
blends the technical and the human aspects of this in a way

(31:34):
that everybody's able to show up as their best self and contribute.
But it's assumed that like we're going to debate and inspect
ideas. No one gets a free pass on this.
It's got to be best idea wins. And so we've done a lot to try
and build systems that encourage that and have people show up and
celebrate when people do that. The data, qualitative and

(31:58):
quantitative, can help you figure out where there's an error in
judgement or the debate is not set up well, as well as on the
other side. Let's go, let's go move fast.
Let's ship something quickly to figure out whether it's working
or not. And the data should should also
help show you that you were right as well.

(32:19):
And so it really is building that sonar inside of
all of this that allows you to call your shot, but also
debate it and test it in increasingly shorter periods of
time. So this is an interesting part
of the conversation, because I think there is this idea that

(32:40):
the best idea should always win, as you bring up, but very often
a good enough idea with great execution wins. And I strongly
believe we're about to enter a new era with
these agentic systems that you're mentioning, where
execution is going to become less of a differentiator.
And actually having the great

(33:02):
idea, and then being able to have strong enough execution because
you are enabled by these agentic systems and you're familiar
enough with them to basically execute the technical side of a
sociotechnical system, will hopefully lead to better ideas
rising to the top. I don't think we're quite there
yet. There's still this big gap

(33:22):
maybe even a larger gap right now,
while some people are leveraging agents and getting
compounding execution advantages, whereas others
aren't quite yet. But hopefully we're going to get
to a point where this is a broader suite that people are
able to apply, that they're trained to use.
How do we actually kind of move from crawling and some people

(33:43):
running because they figured it out, to everyone running and
beginning to leverage this suite of technical tools?
And how do we retrain or teach the next generation of
team members to take this on? I wish there was an easy answer
to that question. I was hoping you'd have it, Wade.
Yeah, no, I found that it's, again, kind of a system

(34:05):
of things. But I strongly agree with you, right?
How quickly you can test a thesis and iterate on
it is going to determine the quality of the solution that
comes out the other side. From a belief systems
perspective, right, if you can change the way
that you work, you have a much better chance of it, just

(34:27):
going through it much, much more quickly.
And so I think that you need to build that at the top, right?
With all leaders of the organization, you need to build
ownership behind the idea; they need to buy into it.
They need to practice it. They need to actively engage in
it. And so as a leader at the top of
the organization, like you have the responsibility of creating a

(34:51):
structure like that, one that exists and that you can test against,
and you can see who is truly aligned versus kind of going
through the motions. And you want to coach those
people and you want to help them move through that as quickly as
possible or, you know, change them out if you have to.
But I'd prefer to grow people if at all possible.

(35:13):
So if you do that, what you'll find is that it kind of layers
through the organization more and more. As you build a belief
system, you will then want to coach others on the same and get
them to buy into the same, especially if it's
working and there's benefit in doing things along
those lines. I also think that you can attack
it from the bottom up, right? Like if you're an engineer,

(35:37):
technically minded, you want the best tools, you want to
be on the leading edge of things.
And so unlocking them is genuinely just giving them
access, giving them a little bit of training, and then a
chance to practice on it. I love the model of, you
know, you go from unconscious incompetence to conscious

(35:58):
incompetence, to then getting to declarative knowledge, and then
it has to become practice. Well, that's really just an
opportunity to practice, hopefully with a believable,
credible coach that can whisper in your ear and help you
understand how it actually works. Twenty hours of practice, you
get good at something. Maybe it takes 10,000 hours to
be an expert or a master at something.

(36:20):
But just being conscious about working people through
that cycle in the shortest period of time generally works.
And if you've got a belief system coming from the top down, and
then you've got skills that are being built up, that state of
flow, right, is how big is the challenge versus the
preparation. If you're constantly
preparing people and the leadership is believing more and

(36:43):
more, there will be more and more challenge associated with
that. Then you see the dry kindling
catch fire. And OK, that's probably a bad
metaphor given, you know, what we've experienced in California
in the last few years. But right, it does catch.
And I think that as with most anything, anytime you can remove

(37:04):
friction or increase the speed, generally there is an embracing
of the idea and the concept, as long as it's ethical
to do so. So try and build systems that do
that. What other advice
would you give to leaders who are listening right now as they
think about how to grow their org's capabilities and grow other

(37:29):
leaders within the organization? And then conversely, for folks who
are not looking to be people leaders but are looking to have
this upward influence, what would you advise them?
Let's start with the leaders first.
The unpopular answer is I think you have to look in the mirror
first. You have to put your own oxygen
mask on first. And so to that degree, right,

(37:53):
you know, everybody's talking about vibe coding.
Have you done it? Have you tried it?
Which LLM did you like the best, and why?
As you start to work through some of these things and feel
friction associated with it and have to work through it, you'll
have empathy for other leaders and what they might

(38:14):
experience and, and how to help coach them through their growth
cycles as well. And then I would say, you know,
as you start to think about eachleader on, on your team, where
are they truly out on this? Can you have an open
conversation with them that feels safe?
Are they worried about their job?
Are they worried about where this is all going?

(38:34):
Or is it that they don't know that much about it yet,
and so they haven't learned how to harness it, right?
All of these things can be overcome, but you have to
be able to have those conversations with them.
Everything's learnable. Everything's teachable, right?
It's just being clear on what the gap is between current state

(38:55):
and desired state, and building up that criteria.
And then pushing as many people through it at the same time
helps, because you have a pit crew, you have others that are
experiencing the same thing that you're experiencing.
So I think the shorter the window that you can go through with
lots of people, the better. Everybody will have shared

(39:19):
experience as they go through that.
And that will be a culture-defining moment in an
organization. I would just put your own oxygen
mask on first and then be very deliberate about the way you're
engaging your leadership team.
Yeah, I think there has to be intentionality around actually

(39:39):
creating space, influence, and resources within the org for
this up-leveling to occur. Because some folks are able to
find the time to do it naturally, or do it naturally through their
work. But many more are buried in
tasks and are having trouble, you know, getting their head
above water, or getting their mask on, as you put it, in order
to say, OK, how do I plan for this future?

(40:02):
How do I build this next layer? And I also think this is true of
a lot of individual contributors who are building incredible
things and are looking to take the next step of influence, but
may not be wanting to step up as people leaders.
What's your advice to them as they seek to

(40:22):
influence the sociotechnical circuitry of their organizations,
to be more enabled by AI tooling, or to have space to take
a course, or whatever else it may be they're looking to do?
I think there's overlap between the answer here and the
previous one: I think you look in the mirror first.

(40:43):
If you are an engineer, truly, there are a lot of people
who get to a certain escape velocity, so to speak, right?
They learn, they become very competent, they become very
good at something, and then want to leverage that competence for
as long as they possibly can. Which puts you at a disadvantage,

(41:09):
in that you are trying to maintain that thing that you
think is your worth, as opposed to: no, I can problem-solve, I
can learn things, I can apply new technologies based on new
challenges and things along those lines. I
think people get stuck and don't evolve as much as they need to.
I think there is a large portion of that.

(41:32):
And I mean, look at me, I'm in my late 50s, right?
And so this new technology that's in front
of us, that could be very scary, or it could be the most
exciting thing. And I think you have to choose
which one of those statements is true.

(41:53):
As an individual contributor, I think that you need to
understand this new tool, right? You need to test it, you
need to play with it. You need to see where its limits
are. And what you'll find is
that it's not everything; no one solution is
everything to you. You're going to figure out how
to use Windsurf, and you're going to use different LLMs

(42:16):
because different ones are going to be good at different things.
The context windows are increasing in size.
What does that mean and how are you going to leverage that?
As you get into it with a beginner's mindset, I think it
helps you a lot more. And you should expect that over
the next 5 to 10 years, you are increasingly going to be less

(42:40):
and less competent because there's going to be more and
more rapid change that you just need to catch up with.
And as you come to grips with that, for those that can
actually move through it fairly quickly, there's a whole
new world of capabilities and results that can be produced
because of that. And so it's

(43:03):
constantly trying to identify that gap between your current
state and the desired state, and how quickly can I learn
and grow through it? And it is the testing and
playing with it and trying to use it in production as much as
you possibly can that's going to teach you all those things.
What closing thoughts do you have about what an AI-enabled

(43:26):
future looks like? I am truly excited about what I
think this means. Up to this point,
as I've talked about, I've worked with
a bunch of intellectual bullies on the engineering side who,
you know, use technology as a shield for just doing the things

(43:47):
that they wanted to do. You know, they would
talk past people and make it sound intellectually too hard, or
things along those lines. And I think what is happening
more and more is that product, engineering, and design are all
coming closer and closer together.
And sort of the things that separated them over time

(44:10):
are losing their hold. And so that ability for you to
do market research, come up with an idea, be able to get to
base-level requirements, generate a design, build a working
prototype: you're still going to have to have problem-solving

(44:31):
skills inside of that. But I think that's going to be
more and more approachable over time by lots of different
disciplines that come into it. And so I'm really interested to
see whether we just get, you know, a ton of new lightweight
apps that are out there or whether we're truly going to get

(44:51):
new innovation that happens at a very deep level, where companies
or individuals are building SaaS solutions that are
industrial-strength and can do really great things.
My guess is it's all of the above, but the technical barrier
is going to be largely reduced as a result.
So anybody that's out there, if you've had a great idea, I think

(45:16):
your ability to bring that to market is going to get easier
and easier. I completely agree.
I think that's a really exciting feature of what is coming and
how the world is being transformed.
We're already seeing this happen for many folks, and it just
seems like that impact and that effect is accelerating.
Wade, thank you so much for this fantastic conversation and

(45:37):
coming on the show. It's been a distinct pleasure
having you with us today. Conor, I've really enjoyed
it. Thank you so much for inviting
me on. Hopefully we get a chance to do
it again sometime. I think we can make that happen,
absolutely. For the moment, though, where can
folks go if they want to learn more about you and follow your
work, and/or learn more about Amplitude?
I would still point people at LinkedIn for me

(46:00):
specifically. And then I would say that for
Amplitude, we're pretty active on all the socials.
And so if you want to know about culture or new products,
LinkedIn, X, others: pick your favorite social and we're
probably on it, and we're pretty active about it.
Excellent. And we will, as always, be

(46:20):
linking all of that in the show notes. To keep up with the latest
in AI and hear more from industry experts like Wade,
be sure to subscribe to the podcast on your favorite
platform, whether it's on YouTube, if you're watching us,
or on Spotify, Apple Podcasts, Stitcher, wherever you
are listening to this. We would love to have you with us for
future episodes, and check out our back catalog.

(46:42):
And if you want to go deeper into the world of AI and check
out how-to demos, webinars, and much more, don't forget to
check out the Galileo YouTube channel.
It has so much more than the podcast, and I think you'll enjoy
it. That's all for this week.
Thanks again, Wade. Thank you.