Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Om, have you been on LinkedIn recently?
(00:01):
Sadly, I have.
Have you noticed that there's a flood of
AI tool vendors preaching:
Build it first.
Yeah.
And it's not just the tool vendors.
There's a whole bunch of people that have jumped on the bandwagon.
Yeah.
Well, that's the subject of this podcast:
the build-first great AI token money grab of 2025.
Also, maybe this episode will serve as a nice little
(00:22):
time capsule, 'cause I'm sure that this episode is
gonna upset the apple cart.
It wouldn't be an Arguing Agile podcast if it didn't.
Did you like that euphemism?
I had to, I had to go back to 1725 to pull that.
Upsetting.
The apple cart.
Yes, that's right.
Upsetting the apple carts.
I feel like the real truth of this stuff is they've repackaged
(00:43):
something that was previously a big thing,
like, oh, just build fast, faster time to market, faster proof of concept, you know?
Faster time to market.
Always a good thing, right?
Because you're in a competitive environment, so of course you want to be ahead of the game compared to your competitors, all of that stuff.
But now what we've come to is:
let's just build it and they will come.
(01:04):
I think, I think... I'm gonna start arguing early in this podcast.
That's right.
I don't think it's fair to just say this is purely build it and they will come, just like I would say it's not fair that it's purely, like, a solution in search of a problem.
But also, like, skip doing user interviews, skip doing
(01:25):
research, skip doing all that.
Skip talking to humans and jump straight to paying me for tokens.
There you go.
Paying me for tokens.
I know we're gonna talk about this during the podcast, but you know, when you say, okay, it's not build it and they will come, it's build so many of them, and hopefully something will stick and somebody will come.
(01:46):
That's really more like what's going on now.
Well, I mean, that's probably the first topic to get into. This is topic number one, the great human bypass, which is that AI vendors are promoting build first, the culture of build first.
Skip user interviews, skip stakeholder conversations, skip subject matter expert validation, and any other kind of process that you have; like, process is bad.
(02:06):
Knock out a working prototype, throw it up there, get it in front of your customers, and then, magic, like, profit.
Sorry, I had to close the loop.
It was missing.
It's still a big stretch between those two last steps.
So these vendors, right, they're saying forget about everything else, including all the things that you talked about, right? Involving the users, figuring out what the problems are that you're trying to solve, all of that.
(02:28):
And the reason behind that is, they don't stand to gain anything if you're going off on a tangent doing all of those other things, validating the need and all of that.
They only stand to gain things when you are feeding their machine.
Yeah, right.
Their token machine.
Yeah.
So that's why they're advocating: build quickly.
Speed is of the essence.
(02:49):
Well, let's jump right into the best point I have in this category, which is: speed beats perfection in a competitive market.
Especially if your solution is, like, sexier than the other solutions, more well packaged than the others; so, like, your sales pitch is better than the other solutions. You know, more people at the customer sites than the other vendors, or whatever.
Speed trumps everything.
(03:11):
There was a, there was a saying there, I can't remember, actually.
You know, I might be remembering Boiler Room, or I might be remembering Taylor, I might be remembering Margin Call. Like, be first, be better, or cheat.
I think I'm actually remembering a movie.
I'm sorry.
Be first, be better, or cheat, or die trying.
I don't know.
I don't know.
So I, I think there's something to be said
(03:31):
for trying to accelerate your product to market.
Right?
But that said, there's an assumption here that you've built a product to satisfy a need.
What I'm seeing now, though, out there is forget about all that stuff.
Just make something and then immediately launch it, and
(03:52):
then immediately go make something else and launch it and make something else.
So it's basically akin to throwing mud at a wall and just praying that something sticks.
The odds aren't with you, but the cost of producing these multiple variants is much lower now, supposedly.
And so, again, this is a bit of a dichotomy.
I wanna talk about that as well in this podcast.
(04:14):
Yeah.
I mean, the immediate pushback right there is, like, the LLMs and, you know, whatever you're building this with, right? Theoretically it's Cursor, Claude Code, whatever, to demo quickly.
Like, the LLMs, if you feed them the right things... this is always what they're gonna say, is like, well, you didn't do it right.
Like, if you feed them the right context, they can understand patterns that maybe humans will miss, or because
(04:35):
of human bias or whatever.
Like, they'll get it right.
And you wanna be putting something that's functional, that people can click, in front of people way faster than anybody going through a more, quote, traditional process would be able to keep up with.
That's the underlying theory here, which... I'm gonna get behind that a little bit, 'cause I do believe in that to a point.
(04:57):
You know, I do believe in that to a point. The pushback right away in this category is, like, the normal stakeholder feedback cycles, especially in larger organizations.
I feel like that's where the pushback lands.
Yeah.
The loudest people here... Madu Guru from Google was one of the first people that kicked off this discussion of build first.
(05:17):
And of course he's selling AI tools for Google, right?
There's that.
But then a bunch of other people selling AI tools piled on.
But again, this is why we're looking at this.
Yeah.
Because we're not selling you any AI tools, you know?
The other points for this category, if I were to represent their side,
(05:38):
the Madu Guru side of this: first of all, humans might miss patterns, and because the LLM can just chew through so much data so quickly, it can get the patterns.
And also, I would say you could still do your interviews and everything you do, and just have the LLM read from that and be kind of an assist to help you.
But that's not what they're saying.
(05:58):
That's not the way they're presenting it.
I got one more piece of feedback before I turn it over to you, which is: user interviews. Like, the interpretation of what comes outta user interviews could make those interviews unreliable as data sources, especially in large organizations, where it's not just purely responding to what the user says; it's politics as well, you know?
(06:18):
Yeah.
And then there's other nuances there too, I have to say.
Like, which representation did you get from the user set, right?
Did you get, typically, across the board, or did you get just a specific type of users?
So there's that.
But all of that said: ignore the user's input that's actively solicited by you
(06:39):
at your peril, basically, right?
At your peril.
Now it's time to cast off the Arguing Agile "let's represent both sides" and talk about solving problems that don't exist.
Because I would've thought that we got our fill of this in the nineties and the early two thousands: knocking out code for months, holding code, not deploying it and not showing it to anyone, and then coming out like, look at
(06:59):
my nine month long release.
Yeah.
Big bang release.
Yeah, yeah, yeah.
You know, we had a whole movement about this. Sorry, we had a whole 20-year movement just on this subject, of like, you're solving problems that nobody ever asked for.
Here's one I know I've said on other podcasts before: as a product manager, no one will ever tell you.
(07:22):
I mean, unless you're getting feedback from a product manager, and then you're gonna get ripped to shreds.
Like, if you ask me, Brian, what's your true unfiltered feedback, I will give you feedback. But the typical user, like the B2C, B2B user of your software, especially the B2B user, the feedback they're gonna give you that lets you know they hate your product is:
It's, it's pretty good.
(07:44):
It's okay.
It's okay.
Yeah.
Yeah.
We often hear that it's okay.
Yeah.
It's good.
Yeah.
That's it.
That's, that's pretty tepid, right?
That's what you're gonna get.
Yeah.
That's what you're gonna get from them.
Yeah.
If you are solving problems that don't exist, you could be led down a real bad road with that kind of feedback.
Indeed.
So two things that come to mind here, right.
If you're not even validating what you're solving for
(08:05):
and whether that's a real issue that needs to be remediated in some way, yeah,
you could miss the market opportunity completely.
Your competitor who is doing some of this stuff is gonna overtake you.
Now, they're not necessarily the ones that are churning out a hundred prototypes a day.
Maybe they're the ones that are taking their time and getting it right, and they will overtake you.
(08:27):
The tortoise is going to win the race.
The other side of this one here is that the user interviews, they basically cost you zero.
I mean, I guess you could be in some companies where you give people, like, Amazon gift cards or something like that.
Yeah.
But it's a pittance.
Yeah.
It's minor compared to the learnings that you get and the connections that you make and stuff like that.
Like, you're not really getting a lot of stakeholder
(08:48):
buy-in in the world where you're punching out.
But, like, all those are minor to me.
The bigger one, the bigger one that I just kinda can't get over is, like, first of all, you're solving the wrong problems.
You could easily go down the road to solving the wrong problems.
Even that one I could put aside, because, well, in normal software development you could do that too, with politics and inexperience making a bad decision, not having subject matter experts involved, that kind of stuff.
(09:10):
But the user interviews, like, cost you near nothing.
And then you as a product manager, you vibe coded this tool and you're thinking, oh, this tool basically works, and my developers just gotta pick it up and throw it into AWS or whatever; I don't understand what the blocker is here.
Now you're building stuff, spending real money, and you're completely going down the wrong path.
(09:31):
You are really trying to hit a bullseye on a dartboard from 30,000 feet.
You'll be lucky to land in the same zip code.
It's basically like, now that the product manager can churn out barely functioning front-end code, their biases just get amplified to a million, because before, at least a development team would be there checking you, or your UX person or whatever.
Or the holy Marty Cagan trinity, the designer
(09:54):
and the... yeah, yeah.
Like, now it's just you getting prototypes to the customer and being like, but the customer told me this is what they love, they put their name on the waiting list for this, for Brian and Om's cool new website.
Forget Brian and Om's website that we have all the development on. Like, forget that; no one works on that anymore.
Everyone wants our cool new website.
It's just mind-numbing to me that people are actually
(10:16):
thinking about this as a viable kind of approach to doing business.
If it's got guardrails on it, I'll quit my complaining, but don't pitch it like it's free.
Don't pitch it like, oh, the development team costs so much money and this is completely free.
It's not free.
There are tokens attached to this, and that token usage could potentially bankrupt you if you're not careful.
(10:37):
Seriously.
It could be a lot of money.
Yeah.
Yeah.
It could be a lot of money.
And you got a few product owners doing this by themselves, it could really add up quick.
So feeding the token machine, that's where we're at.
Om, we're feeding the slot machine, AKA the token machine.
So build first is 10 outta 10 generated vendor tokens, I guess, or the royal flush of give us all your money
(10:59):
and we'll flush it down the, I dunno, whatever.
Flush it down the royal toilet.
I guess, yeah.
And then I guess one point for building things people actually want. I don't know how well we represented both sides on that one.
We tried.
Yeah, we tried.
I mean, I'm not saying we tried very well, but we did try, we did.
So, like, point two: the token economy trap. You already
(11:19):
heard me talking about tokens.
And I'm not gonna stop harping on this one.
But Brian, give it up.
Tokens are cheap.
Like, the token costs are only gonna come down.
This is what they always use when new technology comes out, to get their foot in the door.
Like, the cloud's only gonna get cheaper, Om. Right.
Okay.
There are several CTOs out there that would argue with you on that front and say, what about the total cost of ownership?
(11:40):
We just need to mass produce these CPUs and they'll get less expensive.
Maybe they will and maybe they won't.
So think about it from the perspective of the manufacturers of these things, be it hardware or the AI engines: why would they keep making things cheaper and cheaper?
Bananas.
It's absolutely bananas to even think that.
What I've also heard AI folks use,
(12:04):
this one is like, well, as better models get created, the tokens will get cheaper for the older models.
And it's a great arguing point, and it would be a great arguing point with me if it were true with any model that ever existed, in any instance.
Like, it's never true.
The new Claude Opus model is, like, a factor more expensive
(12:25):
than any of the other models.
And it's more expensive than the Opus 3 model; the new ones are even more expensive.
So, like, it's not true, this "oh, when you scale and you can produce more" thing. Like the old factory thinking of, well, if my factory can punch out a thousand widgets in a day where previously it was a hundred widgets, then my cost of manufacture is 10%.
(12:48):
So I can cut what I'm selling them for by a significant percentage and undercut all my competitors.
That's true in manufacturing, when you're punching out widgets. With tokens, it's... first of all, I'm not even bringing in my heaviest-hitting point in this argument, which is, like, companies, they're just greedy.
They just need more money.
So, like, they're never gonna give you a discount.
(13:09):
They're gonna make you pay maximum price.
That's the point.
You know, and if they're looking like they're doing something out of the goodness of their heart, like letting every government employee access their site for a single dollar, the entire government has access for $1,
but then for an unlimited amount of tokens, they charge
(13:29):
you at the token rate, like, under the hood.
Oh man.
It's so... the marketing is so good.
I think we should just... it should be Brian and Om's marketing company, I think.
Oh, yes.
I like that: Brian and Om's AI marketing company.
We haven't had a company since we sold our yachts.
Ooh, it's a sore subject.
Yes.
So, to your point about the newer models costing more, more tokens, more this, more that, yeah.
The way, the way the vendors justify it is:
(13:52):
These models are better.
They're bigger, they're more chocolatey, right?
Oh yeah.
It's now a 30 gazillion billion dollar, or not dollar, but you know, this model, it's so much bigger and better.
So that's why we're charging you more.
That's right.
It's got turbo engines on it, right?
It's like, hey, so this is how the selling point happens.
(14:13):
Bill Gates told me Windows 95 was gonna be better and faster with better access to the internet, and I'm still waiting.
I was lied to! Yeah.
So the bigger models are more expensive; that doesn't necessarily mean more for your purpose.
Maybe a small model is sufficient.
You never know.
Yeah.
But yeah, I, I think the whole ethos of trusting the vendors to make things available to you at a cheaper cost
(14:35):
because technology evolves,
I think that's a fallacy.
Just think about it from their perspective.
What do they have to gain?
Well, what would you say if I told you that, yeah, you are paying more for token use, but the return you're getting from that is your team is a factor of X more productive? Even though I don't have any stats about how productive your team is, there should be a straight-line extrapolation there by saying that's the payoff, right?
(14:58):
We're getting the speed, et cetera, so we're getting more efficient.
Again, there's not a whole lot of evidence to that, necessarily.
Well, it isn't true, because what's directly proportional to the result you get is the quality of your input into it.
Actually, I'll tell you what you get from more token use: you get more lines of code.
More lines of code.
Yeah.
And so, if that's your measure, software lines of code, SLOC...
(15:19):
For some people, it is their measure, exactly.
Yeah.
And those people are the ones that are gonna go around saying, this is great.
Okay.
While we're at the casino and we're paying these tokens, let's talk about vendor lock-in.
Because, like, vendor lock-in is something that's fallen out of the normal nomenclature that anyone talks about on podcasts or whatever.
But, like, once all your developers are online with Claude
(15:41):
Code or whatever they end up using, good luck changing that system. It sits in the heart of your system and has access; good luck changing that out once you're married to it.
It's huge to change that.
This is like the equivalent of going to a casino and having different chips at each of the tables, so you get different chips at the craps table versus roulette versus blackjack.
(16:02):
So the chips that you get at the roulette table, you can't just go walk over to the blackjack table and use them there.
Right.
You're stuck with the roulette table.
That's really what this is.
Yeah, yeah, yeah.
There's a lot of stuff that I could bring up in this category!
I think the main point against here is, like, you're gonna need a bunch of iteration, burning tokens, to get to where you want.
(16:25):
So, yeah, I would say, because the technology is new, you're probably not tracking token use to actually producing value.
Whereas traditionally, when people try to measure development productivity, they have a bunch of terrible, just terrible measures.
But, like, nobody has a pure number of hours that went into this feature, and then number of dollars this
(16:46):
feature got us. Maybe token use can be connected, like, this is the feature that brings us the most value; here's how many tokens went into producing the code around it, or something like that.
I don't know.
It's a simple equation there.
I think the complexity is the variables that go into it.
Like, for example, the token usage and the value you get back are directly a function of the quality of your input: how
(17:09):
are you prompting the thing?
If that's good, great.
If it's mediocre, not so good.
If it's poor...
So it's all over the map, right?
Yeah.
It just depends on you: what is your maturity on the prompting front?
So would you try to connect the accomplishment of your business outcomes, or maybe
(17:30):
some sort of user satisfaction?
The token-to-value ratio?
That's kind of what I'm talking about. I don't even think there is a way to connect this, but, like, whether you achieve the business outcome, like whether tokens were spent to achieve a business outcome, what that business outcome was, how much people like the feature, or how much revenue the feature brings in, you know what I mean?
Like, I'm trying to figure out a way to say, like, hey,
(17:50):
this AI code is a valuable tool to my developer team.
Again, that's not the way the people that are pushing this stuff would put it; they would hate it being portrayed that way.
Sure.
That, oh, it's a replacement for all your... no, it's not a replacement.
It's certainly not.
But, like, is there a metric there that I can use to be like, is this stuff being useful to me?
You know.
That's a great question.
(18:11):
Is it a feeling? I think for the most part it is.
So how is it different than just the cost of doing business?
Meaning, yeah, you know, the cost of electricity, utilities that you're using to try and get to that point.
Like, what is the direct correlation between token usage and value achieved?
It's very difficult to quantify that.
Yeah.
Yeah.
One thing you could do is to say, if you had done something
(18:34):
similar in the past, before using tokens and AI and all that, how long did it take you to get to a point where you had a release out, like the first release, for example, and maybe compare it to now and say, we got there three times quicker or two times quicker, or whatever it might be.
That'll give you some idea, right.
(18:55):
But I don't think you can derive a formula that you can apply to future endeavors and say, if we use AI in the future, the same formula will hold true.
For one thing, your experience curve is getting better, right?
With AI.
Mm-hmm.
So things will be expected to be different, and that's if the assumption holds true, because there is an assumption that token usage will become cheaper.
(19:16):
Right.
So you factor that in as well.
But like I said, it's not true.
It's not true, but I like to dream.
Yeah, yeah.
Yeah.
So it's difficult to have an equation that says this is how the return is, based on spending. It's very difficult.
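For what it's worth, the back-of-the-envelope arithmetic the hosts keep circling here can at least be sketched. Everything below is hypothetical: the function names, the $15-per-million-token rate, and the revenue and day counts are invented for illustration, not a real measurement framework from the episode.

```python
# A hypothetical token-to-value sketch: relate token spend on a feature
# to the revenue it brought in, plus the "two or three times quicker"
# before/after comparison mentioned in the episode. All numbers invented.

def token_cost(tokens: int, usd_per_million: float) -> float:
    """Dollar cost of a feature's token usage at a given per-million rate."""
    return tokens / 1_000_000 * usd_per_million

def token_to_value_ratio(feature_revenue: float, tokens: int,
                         usd_per_million: float) -> float:
    """Revenue earned per dollar of tokens burned on the feature."""
    cost = token_cost(tokens, usd_per_million)
    return feature_revenue / cost if cost else float("inf")

def speedup(baseline_days: float, ai_assisted_days: float) -> float:
    """How many times quicker the AI-assisted release shipped."""
    return baseline_days / ai_assisted_days

# Example: 4M tokens at $15/M on a feature that brought in $9,000,
# shipped in 10 days versus a 30-day pre-AI baseline.
print(token_cost(4_000_000, 15.0))                    # 60.0 dollars
print(token_to_value_ratio(9_000, 4_000_000, 15.0))   # 150.0
print(speedup(30, 10))                                # 3.0
```

As the hosts say, the division itself is trivial; the hard part is that the inputs (which tokens map to which feature, and what revenue the feature caused) are rarely tracked.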
Let's wrap this category.
I mean, I think it's up there with, like, developer productivity and these things that can't really be measured.
But if you armed your developers with these tools,
(19:37):
you probably have some kinda, like, customer satisfaction metrics that you could look at before and after, or speed to delivery or whatever.
Hopefully. Like, if you have these metrics before, it does make it a lot easier to measure afterward. Which is a terrible suggestion, because there's people out there right now that are like, well, we don't have anything.
I mean, if you have lines of code, you're living the life.
(19:59):
'Cause you can generate all the lines of code in the world.
It's so interesting that if you have lines of code, one of the things you could gravitate to is the number of defects per hundred lines of code, or a thousand, whatever.
Pick your number, right?
Pick your baseline based on the code being generated by AI.
Oh, no.
Versus the code being generated by human brains.
No.
See, we fired all our QA people and now we have no
(20:20):
defects to lines of code.
Yeah, no problem.
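The defect-density baseline Om floats above is a standard enough idea that it can be sketched in a few lines. A minimal sketch, assuming made-up defect and line counts; only the defects-per-KLOC normalization itself is a real, commonly used convention:

```python
# Hypothetical defects-per-KLOC baseline: normalize defect counts to
# 1,000 lines of code so AI-generated and human-written code can be
# compared despite very different volumes. Counts below are invented.

def defects_per_kloc(defects: int, lines_of_code: int) -> float:
    """Defect density normalized to 1,000 lines of code (KLOC)."""
    return defects * 1_000 / lines_of_code

# AI-generated code: far more lines, and here, more defects overall.
ai_density = defects_per_kloc(45, 30_000)      # 1.5 defects per KLOC
human_density = defects_per_kloc(12, 10_000)   # 1.2 defects per KLOC
print(ai_density, human_density)
```

The joke in the transcript is the catch: the metric only works if someone (QA, users, incident reports) is still finding and counting the defects.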
I'm gonna say the token optimists, the people that are like, hey, just pay for the tokens.
Like, just believe in it.
Like, it's gonna set you ahead.
Okay.
They get eight outta 10 for believing that the math works.
Eight outta 10 tokens.
They just walk into the casino and they just believe that they're gonna be lucky, you know? They're manifesting it.
They're the "if they dream it, they can make it happen" type
(20:42):
of people, you know what I mean?
Right.
They're that kind of people.
They're very woo-woo.
And they get eight outta 10 because they just wanna believe in it.
Three outta 10 for the people that understand probability.
You don't want to be near them at the casino.
Let's see.
Token skeptics.
They get seven outta 10 for recognizing the scam.
Like, they're down; they're ready to play the three-card monte along with the AI companies. And then, you know, four outta 10
(21:03):
for you and I for looking for alternatives in the mix.
Cool.
I'll take the four.
Yeah.
It's like, we're not gonna win.
We're not gonna make money at that.
But I mean, maybe we'll get free drinks and break even at the end of the night.
Always, always worth playing for.
The validation shortcut disaster is the next category.
The build-first culture encourages teams to skip the cheapest validation methods, which is just talking to
(21:24):
people, and in lieu of talking to people, doing surveys and gathering data from logs and stuff like that.
Usage and logs to back up, kind of, our bets.
You skip all that; you don't need that.
Just demo.
Don't need any feedback.
If they don't like it, we'll just give 'em something else, because it's so fast to produce something. Produce five demos.
It's asinine to think that you could dispense with feedback from people that actually will be using
(21:48):
or are using the product.
Right.
Or, or will be using the product.
To me, it's just a non-starter.
I can already hear the Brian CEO feedback, the Brian CEO who was a former salesperson, feedback, right now.
It's like, Om, you're wrong, and I'm gonna tell you why.
He'll tell you in that confident, slightly Southern accent: Om, you're wrong, and I'm gonna tell you why.
(22:08):
All right.
Because market timing requires speed over perfection.
Okay?
We can't be talking to all these people.
We need hands on keyboards, Om.
Boy, it's been a long time on a podcast.
Yeah, it's been a while.
Yeah.
Great.
If you are really hanging your hat on that one, I'm gonna keep my resume updated, because honestly, I look at the medium to long term.
(22:29):
I don't know how you survive by just simply throwing so many products out there in the marketplace, hoping that somebody somewhere is gonna like something.
What happens if you are correct, and let's say out of every 10 products that you put out there, 5, 6, 7 get some level of traction, maybe not the same level,
(22:49):
what are you gonna pursue now?
What are you chasing?
You have no idea.
I'll tell you, again, I'm gonna stay on the sales side of this one because it's way too fun and you have an expense account.
Listen, if you get out there and vibe code me something that brings revenue into the company, you're more valuable than any developer or any engineering manager or any solutions architect, any of these technical people I pay.
(23:10):
And the fact that you did it without knowing how to program, extra bonus; that's going straight to the top.
You're gonna be the engineering manager tomorrow, kid. Like, put you in, coach.
Never coded a day in his life.
That's not required, VP; you don't need that these days.
Yeah.
Everybody who's here, like, they're all part of the problem.
They can't think outside the box.
(23:30):
Like, you're out there contacting customers and throwing out prototypes and letting them log in and click stuff, and they love it, and they're signing up on waiting lists and signing up to give me money, hand-over-fist money.
Yeah.
Like, that's what I'm looking for.
Spoken like a salesperson who's gonna get their commission before the ink is dry on the SOW or contract.
Listen, long term, this is a hollow strategy. It will
(23:52):
implode on itself, just because when you put something out there that is vibe coded, can you stand behind that product and support it?
Is it going to stand the test of time when a user starts to click a little bit outside of what they see?
You don't have any QA, so nobody tested any boundary conditions, all of that stuff.
Nobody tested edge cases.
Yeah.
Great.
(24:13):
'Cause they don't happen in real life.
I mean, come on. So, yeah.
But you know when the pain comes? Later.
Yeah.
Not that much later.
You, Mr. Salesman, will be out there in Maui spending your commission.
So yeah, it depends on your motivation!
I mean, listen, you're like, oh, that's not wrong, but we're
(24:33):
getting paid. I'm sorry, I want to say that a better version of me had, like, well-thought-out pushback for that, but no, you're not wrong.
The organizational goals have been met; our salespeople got paid.
The real pushback here is, like, you can do all the user testing, user research and stuff that you want, but, like, you need to get me to revenue.
(24:53):
That's, like... I'm thinking about businesses that I worked at in the past where the business gets in trouble, and then the pressure's on.
Whereas the salespeople... I rail on this podcast all the time about how your product people and your salespeople should always be together.
You know, holding hands.
It's a little weird. Like, arm in arm, going to customers; you don't necessarily have to hold hands.
I mean, you can hold hands if you want, but the salespeople
(25:14):
and the product people, like, they should be one and the same.
You're both doing pitches, you're both trying to represent the product.
You're both trying to solve pain points.
There really shouldn't be a division of, like, hey, whatever gets us to revenue.
Like, that's where the real signal is.
Like, that's the best pushback that I have in this category, is like, oh, I'm getting us to revenue.
What we had in the past is dying off.
What we had in the past is not hitting with
(25:35):
customers or whatever.
And if I can vibe code something up that is maybe, like, still similar to our software, but people can use it and see the vision and be like, I absolutely need that, I would definitely sign on the bottom line, you let me know when it's live and I will give you the check...
If I can get to that kind of commitment, then this is a good thing.
Under those conditions, I would say... I think if you're
(25:56):
leading with doing the discovery, figuring out the right problems to solve,
and validating that there is a need in the market for the solution you're trying to create.
Mm-hmm.
And then using vibe coding, whatever it might be, as tools, I'm okay with that.
Yeah, these are just tools.
Before spreadsheets, people were doing things manually.
Yeah.
On calculators, and before calculators they
(26:17):
were using the abacus.
So these are just tools.
I'm okay with that approach of using tools to accelerate time to market.
But I'm not okay with dispensing with doing all that discovery upfront and just saying, just build it, 'cause it's cheap to build it.
If we get it wrong, doesn't matter, it was cheap, so we'll build something else.
That doesn't sit well with me.
(26:38):
Yeah.
I'm not gonna have an argument for in this category.
One of them is, like, if your product requires regulatory or any kind of compliance or anything like that, you know what I mean?
Like, not governance, the four-letter word that is governance that I normally, yeah, don't like.
But I mean, like, actual "you might kill people with this," you know what I mean? With this vibe coded solution.
Like, if your product needs actual regulation.
(27:00):
Obviously this kind of fails a little bit for that.
If you are segmenting a market, you can vibe code something up.
But, like, if anyone has ever launched a product where they were trying to segment an already crowded market, to be like, hey, our product does what everyone else's does, but ours does something special...
There's a certain amount of, like, being in the market, doing interviews and research and stuff like that, that is part
(27:22):
of your marketing effort to educate and try to divide and segment the market, mm-hmm,
to carve out a niche for your new product, or maybe your slightly redesigned product or relaunch of existing products.
This is the Coke Zero thing, right?
It is, yeah.
So you have, you have Diet Coke and you have regular Coke.
Yeah.
Coke Zero also has zero calories.
(27:43):
Exactly, exactly, yes.
But it appeals to those people that object to Diet Coke.
Yes.
So, yeah, absolutely agree with that.
So you're gonna vibe code up Coke Zero, I don't know.
Like, there's like a 50% marketing, 50% user-engagement type of activity that's going on.
Yeah.
But it's your real product team and your real developers, it's your real team that's working on the software, right?
It's not smoke and mirrors.
It's a real effort.
(28:04):
And the more people you get out there and talk to, and do, like, the long version of user research, the more opportunities you have to start segmenting the market, to be like, oh, this is the hot new term in the market, and everyone's gotta have it, and your product already has it. Zero sugar is the new, you know what I mean? If we're gonna stay in the beverage... Yeah. Like, zero sugar, zero calories is the new hot
(28:26):
marketing thing that everyone's gotta have, and then everyone starts copying you or whatever.
You could say, well, you can still do the vibe coding and stuff, and still do all the traditional research, and get to the findings of traditional research faster, with happier users, because they have stuff they can put their hands on with these vibe coded solutions.
(28:47):
I guess you could say that, but the issue with the nuance I just threw out... the issue is the way the message is being projected by all these people trying to sell you an AI tool. They're not projecting nuance, and they're not projecting what I just threw out, which would actually try to help you: vibe code your thing, but have strong UX research chops
(29:07):
Yeah.
and then move forward. It's not being projected that way. Maybe I'm reading too much into nuance.
No, I don't think you are. I think it's being projected as: buy my book.
Yeah, yeah, yeah, yeah.
Come to my TED talk. Get rid of all of this stuff that's distracting you from buying tokens from me.
No, no. That's what it is: keep vibe coding, because the more you vibe code, the more tokens you're gonna need.
(29:28):
And yeah, my stuff's going higher and higher and higher, my stock price and all of that.
It's just extremes really, I think, at the end of the day.
So, validation checkpoints: implement validation checkpoints before major token spending. Set dollar thresholds, like the same thing you do in AWS, where you set alarms based on spending. Stuff like that. Thresholds.
(29:48):
Yeah, exactly.
Yeah.
This is a good one. Like, be all in on AI tools, that's fine, but set some limits, set some thresholds. At least set some check-ins, to be like: when we're spending at this rate, what are we getting?
I mean, that's basically what this category is, like: hey, these lessons that you're getting from the market, especially when you're doing market segmentation. If I had my sales team out there,
(30:09):
just deployed around the globe. Or let's say I was just selling in the US, right? And I had my sales team deployed all around the US. Like, I hired a sales team per region in the US. Maybe I split the US into like eight regions, six regions, whatever, and I hire a sales team per region, so they can be within a two-hour flight of any client site that I have, or whatever. Some companies, this is a pretty typical thing.
(30:31):
Yeah, that's a very standard, yeah, pretty typical thing.
Like, the money that I pay to power my sales team, and then all their expense reports and all that kind of stuff: there's a budget there. There will be a norm that emerges of, like: if I want to engage prospects, it generally costs this much to engage a prospect. This is the same thing.
If I want to go do market research, punch out prototypes,
(30:54):
interview people, it already costs a certain amount. Like, if you're not tracking how much it costs to do the job without the AI tools, you should do that. First of all, to say: this is discovery work. Like, go read Inspired, read all the work of Teresa Torres, figure out what discovery work is, then figure out how to budget, to say: this is my budget for discovery work.
(31:15):
Not, like, only this many dollars can be spent on... you know what I mean? I'm trying to say, figure out what you spend on it, and when you figure out what you spend on it, your AI stuff will add onto that budget, okay? And then set up your alarms.
Yeah.
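The kind of spend alarm being described here, a known baseline for what discovery already costs, an AI token budget layered on top, and AWS-billing-style threshold alarms, might be sketched like this. All the numbers, names, and thresholds below are hypothetical, for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class DiscoveryBudget:
    """Track AI token spend on top of an existing discovery baseline.

    The shape is the point: a known baseline for discovery work, an AI
    budget layered on top (not replacing it), and alarms that fire at
    spend thresholds, like AWS billing alarms.
    """
    baseline_monthly: float                # what discovery already costs without AI
    ai_budget_monthly: float               # extra budget allotted for AI token spend
    thresholds: tuple = (0.5, 0.8, 1.0)    # alarm at 50%, 80%, 100% of the AI budget
    ai_spent: float = 0.0
    _fired: set = field(default_factory=set)

    def record_ai_spend(self, dollars: float) -> list:
        """Add token spend; return alarms for any thresholds just crossed."""
        self.ai_spent += dollars
        alarms = []
        for t in self.thresholds:
            if t not in self._fired and self.ai_spent >= t * self.ai_budget_monthly:
                self._fired.add(t)
                alarms.append(f"AI spend hit {t:.0%} of its budget")
        return alarms

    def total_discovery_cost(self) -> float:
        """AI adds onto the baseline; it doesn't replace it."""
        return self.baseline_monthly + self.ai_spent
```

So if discovery already runs $8,000 a month and you allot $1,000 for tokens, recording $600 of token spend fires the 50% alarm, and the real cost of discovery that month is the baseline plus the AI spend, not the AI spend alone.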
I think if you are in a company that is not even tracking this, and they're just giving their employees a
(31:37):
free hand to use these tools as they see fit, it could be a slippery slope. This could be the gambler's fallacy, right?
Yeah.
I mean, you just keep gambling 'cause you're gonna win the next time, and it never happens. So yeah, I think it's good advice to make sure that you're metering this in some way.
But also, when this product reaches maturity and you've already
(31:59):
got traction in the marketplace, you could easily say what component of that was AI spend, whether it's in doing the initial analysis of user requirements and things like that, the validation of need, or it's the use of tokens by developers to try and accelerate the product development, right? You can combine all of that, and now you have a revenue number
(32:20):
to go in the other column, and this informs what your investment should be in the future, because obviously there are other variables, right? You know, if the market potential is bigger, you might wanna up that, but yeah, it's better to have a starting point.
Yeah. That's a great point. The size of your market... like, you definitely should adjust the budget, because you're fishing in a much bigger pond.
Sure.
With bigger fish, and/or mammals, whales. If you're
(32:43):
not thinking this way... this is very business-y.
We got into a real business-y kinda...
I don't think you can avoid it. Just look at the money you're putting in versus how much it takes to bring a prospect in. This is the stuff that customer acquisition cost is made out of.
Yes.
The CAC is a certain slippery value of, like, oh, people onboard and do whatever. No, it's like, you can put a hard dollar
(33:04):
to all of the activity that went into getting the typical customer and get your real costs. Because, again, you're still paying all that, but the AI is adding onto the top of it. In a remote-first world, those costs are cheaper, 'cause you're not paying for hotels and stuff like that, depending on your industry.
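The CAC arithmetic being described, every hard dollar of acquisition activity, with AI token spend added on top of the baseline rather than replacing it, is simple enough to sketch. The cost categories and figures here are made up for illustration:

```python
# Hypothetical CAC sketch: every hard dollar of acquisition activity,
# with AI token spend added on top of the existing baseline costs
# rather than replacing them.

def customer_acquisition_cost(sales_costs: float,
                              travel_and_expenses: float,
                              research_and_discovery: float,
                              ai_token_spend: float,
                              customers_acquired: int) -> float:
    """Blended CAC over a period: total acquisition spend per customer won."""
    if customers_acquired <= 0:
        raise ValueError("need at least one acquired customer to compute CAC")
    total = (sales_costs + travel_and_expenses
             + research_and_discovery + ai_token_spend)
    return total / customers_acquired

# e.g. $60k sales, $15k travel, $10k discovery, $5k tokens, 30 customers:
# (60000 + 15000 + 10000 + 5000) / 30 = $3,000 per customer
```

The point of the sketch is the last argument in the numerator: if you already know the first three numbers, the token line item just gets added on top, and uncontrolled token use shows up immediately as CAC creep.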
Yeah.
But again, with the token use, like the uncontrolled token
(33:24):
use, it could still get... I mean, it can run rampant, actually.
It could run...
Yeah.
It can run wild.
Yeah, I wanna cut a Macho Man promo: runnin' wild, yeah.
Scoring, let's see. Scoring for this category. This was a good category, I think: the market validation through real use, versus learning these lessons in a way that's expensive and you don't have a good handle on them anyway.
(33:46):
We're gonna use the AI cooking show scale for this, to say that all of the people that are on the side of build first, they get a nine outta 10 for the presentation of their food. But because nobody is hungry, we're gonna reduce that down to two outta 10. So that's it, they fail the taste test. The people that are validating, they get a solid six outta 10. Nobody's, like, super happy, but also they're
(34:06):
turning in, like, a quality, consistent meal every time. So they're gonna take this category, I'm just saying. Circle gets the square.
Cool. We talked about individual teams burning tokens like crypto bros in 2021, but let's zoom out for a second to get the bigger picture. Let's talk about AI vendors and how they're systematically capturing entire organizations. Or are they? Like, that's the
(34:26):
organizational dependency scam: AI tool vendors creating organizational dependencies to make it increasingly expensive to operate without their platforms.
So I'm kind of on the fence about this one. This has, like, strong 2000s IT budgeting vibes, of, like, needing to buy certain tools. We talked about it a little bit earlier on the podcast, like vendor lock-in. Which, again, the pros in this
(34:47):
category of AI tools: they create a genuine competitive advantage, which I'll sign on to, I'll argue this one very easily. Because, personally, like, all my personal projects and stuff like that, that I code, I use them with AI, and the AI is generating the code. I have some guidelines that I've been very successful with, and it is a development productivity multiplier.
(35:08):
That's a very real thing. Like, you're shipping much faster than a team that was purely just writing everything the hard way.
Yeah, yeah. There's no doubt about that. The machines can really do things very, very quickly. So if you're harnessing the power of AI, but not blindly, you're doing that intently,
(35:29):
then definitely you have that competitive advantage. I think we're rapidly approaching a time when pretty much everybody's harnessing it. So now where's the advantage? So now you gotta look beyond just using AI. It's harder: you use it, but do you really use it in a way that is either saving money or making money? And so I think that's where the advantages will come from,
(35:50):
going forward.
Point taken. The vendor switching costs... like, if I'm gonna argue against my own point about vendor switching costs, I would say, yeah, if you're all in on Claude Code or something like that, or if you're all in on Claude as a model and it's in the middle of your programs, like, you do something with Claude, you pass off to Claude for a decision or something like that, and then you keep going.
(36:11):
And then you haven't made this a business decision, a person decision. Like, if you have an AI agent in the middle of your workflow somehow. Maybe I have some B2B SaaS application, like order processing, customer service, something like that, you know what I mean?
Sure.
I'm thinking something like that. Where it's like, they're the traffic cop of, like: oh, this ticket has all the right fields, I'll allow it to go through to the next person.
(36:32):
Or: this ticket doesn't have the right fields, I redirect it. Something we do with rules now, like real complicated business rules and stuff. But the AI could look at it and make a human judgment, you know what I mean? Or something near to a human judgment. Things like that.
Once you integrate things like that, with, like, LLM templates and temperatures and stuff like that, now you're kinda
(36:53):
locked into a model, and it would be difficult, just knowing the way that I've seen people adopt AI in business. Not a lot of people are building so that all of your prompts are completely componentized, where you could take them and flip, to be like: oh, OpenAI is our vendor today, and tomorrow we're gonna separate
(37:15):
from OpenAI 'cause we got a better deal with Claude, so we're gonna flip over. Like, if you did that, you'd have to revise your prompts, revise your inputs and outputs. Let alone the actual passing off to the right...
Yeah.
To the API, yeah. Vendor lock-in.
Maybe, like, five years from now, everyone will be like: well, why would you ever customize
(37:36):
your stuff to one model? Why would you not componentize it, to where you go to the box and it says, what model are you using? And depending on the model, it uses the proper prompt and models and all that kind of stuff. Most people don't do that. They build completely to the stack they're on.
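The componentized setup being described, where prompts live as data and providers sit behind one narrow interface so switching vendors is a config change rather than a rewrite, might look something like this sketch. The provider classes, prompt templates, and responses are all made up for illustration; in a real system each adapter would wrap that vendor's actual SDK:

```python
# Sketch of componentized prompts with swappable model providers.
# Provider names, templates, and responses are hypothetical; each
# adapter would wrap a real vendor SDK in practice.

from abc import ABC, abstractmethod

class ChatProvider(ABC):
    """One narrow seam between the application and any model vendor."""
    @abstractmethod
    def complete(self, prompt: str, temperature: float) -> str: ...

class FakeVendorA(ChatProvider):
    def complete(self, prompt: str, temperature: float) -> str:
        return f"[vendor-a t={temperature}] {prompt}"

class FakeVendorB(ChatProvider):
    def complete(self, prompt: str, temperature: float) -> str:
        return f"[vendor-b t={temperature}] {prompt}"

# Prompts live as data, keyed per provider when the wording must differ,
# so a vendor switch touches config, not application code.
PROMPTS = {
    "triage_ticket": {
        "default": "Decide if this support ticket has all required fields: {ticket}",
        "vendor-b": "Check required fields and route this ticket: {ticket}",
    }
}

def render_prompt(task: str, provider_name: str, **kwargs) -> str:
    variants = PROMPTS[task]
    template = variants.get(provider_name, variants["default"])
    return template.format(**kwargs)

def triage(provider: ChatProvider, provider_name: str, ticket: str) -> str:
    """The traffic-cop step from the example above, behind the seam."""
    prompt = render_prompt("triage_ticket", provider_name, ticket=ticket)
    return provider.complete(prompt, temperature=0.0)
```

Flipping from one vendor to the other is then one call-site change, `triage(FakeVendorB(), "vendor-b", ...)` instead of vendor A, which is exactly the flexibility that building straight against one stack gives up.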
That's true today.
Yes. And in the future, yeah, people could perhaps go down to that level of
(37:57):
componentizing the work, right? But I also think the other side of it is true, which is: some of these AI vendors will get smart and enable switching from other vendors to theirs.
Right.
You know, so they have to build specific tools that you can harness, right?
Yeah.
But they would make it easy on you as the user.
Is that like the Android
(38:18):
feature where, if you're moving from an iPhone to an Android, they pretty much give you a stack of tools that migrates all your stuff?
It's pretty much like that.
I was gonna say: welcome aboard.
You know, when you see that stuff emerge, you know that's the end, that there's no more ideas.
Right.
Yeah, that's the end of the product roadmap for me.
I think it's coming at some point. I welcome it. I think it would be a really cool feature.
(38:39):
If you are an AI vendor, you wouldn't think about all this day one. This is not exactly in your MVP. It's also not sexy, this kind of work.
It is not.
But the payoff is big, if you can get a certain number of users over, mm-hmm, like, poach them, whatever the actual term is.
Yeah, convert them.
Yeah, exactly. That's right. They may well do that in the very near future, I suspect.
The takeaway for this category says: maintain AI-free zones in
(39:01):
critical workflows; ensure the teams can deliver core functions if vendor relationships end or pricing becomes prohibitive. Which I think is a great ideal. It's a great goal. It's a great shiny ivory tower to shoot for. However, I just don't know anyone that designs this way or builds this way. Everyone basically selects one AI vendor and goes all
(39:24):
in, pushes all their chips. I dunno why the casino metaphor has just taken over this podcast. It's not been in any of the planning that we did. It's just taken over the whole casino.
It has organically emerged.
Yeah. It's so weird.
So switching cost is interesting to me, though, because, you know, if you are a smart AI vendor, you'd wanna reduce
(39:44):
the switching cost, right, from your competitors, and I haven't seen anything to that end yet, but I dare say you probably will.
I don't think they've figured the tech out.
Yeah. Yeah, that's true. This is moving so quick.
Yeah, I don't think so. I don't think they've figured the tech out, but also, like, keep that resume updated.
The other thing to talk about here is the death of institutional learning, which is the idea that your
(40:06):
build first culture, combined with AI tools, is creating organizations that can produce code quickly, but they can never learn from their mistakes, or talking to people, or... you know what I mean? Well, maybe it's not fair to say talking to people. They can never learn from their mistakes. Mistakes meaning, like: build something, and then correct and go in a different direction, look at metrics, kind of be adjusting constantly.
(40:27):
Traditional software development has it where you do learn a lot from your mistakes, every time a build doesn't compile. But with vibe coding, that doesn't happen anywhere near as often now.
Yeah, right.
And when it does happen, you simply do a little tweak and it fixes itself, and you don't really know what it did in order to remediate the defect that was there.
(40:48):
So yes, I agree with this. I feel like this is an issue that's going to plague our industry in the next two to three years.
Sure.
It's the dumbing down of, you know, corporate software development.
All right, let me, let me push back again. I want...
Sure. It's hot in here today.
The AI tools democratize development knowledge.
(41:09):
The pro side of this is, like, we're democratizing software development. Om, you're wrong. First of all, we're democratizing this... what a great phrase, right? Democratizing software development.
It is good marketing, isn't it?
It is great marketing, but it's also a bunch of BS.
Oh? Because it doesn't do any such thing.
Are you saying marketing is often not true?
(41:29):
Marketing spins up versions of reality they want you to see. So this idea about losing institutional knowledge, even domain knowledge, is valid. I think that's a real issue. And I think that's a big threat to the long-term survival of the industry as a whole, because you're gonna get people that all they know how to do is vibe code.
(41:51):
Well, let me, lemme try to put it into words. I'm gonna...
You're welcome, Julie.
I'm gonna try to repeat back to you what I think I heard. And what I think I heard was a couple points, actually, that you threw out, which was, yeah, in the rapid fire round: teams lose understanding of why things work. When you have the democratization of vibe coding solutions, where the people generate a function
(42:13):
that goes and does thing A, the business output A, but then the team doesn't really understand how the function works, you know what I mean? It doesn't really encapsulate all the business logic. They feed it back to the machine when the customer complains, and then the machine gives them the new version of the function, and they copy-paste that in. Institutional memory becomes externalized to the vendors, because nobody
(42:35):
really understands this code, and because nobody ever talks to the customers. And it's like, oh, it seems to be working this way, I don't really understand. And they put the clear-text English into the chat. Yeah, which, by the way, the chat is like a session that one person has. So again, assuming the people leave the team, or stay on the team, or leave the company or whatever,
(42:55):
that chat session is gone. So you never know what the thinking was that went into vibe coding up that solution.
So there's a couple things here. Number one, there's: the understanding of why things work was never explored in the first place. Number two: the institutional memory that you get, you basically offshored that to the AI companies. And then, if those two things are true (you didn't
(43:19):
say this, but I wrote a note quickly while we were talking), if those two things are true, you can't debug any of that code. Like, good luck maintaining it. You're just gonna send it off to an AI and be like: debug this, I don't understand, you know what I mean? And then all bets are off. Again, a casino metaphor, by the way. All bets are off if those people don't even work at the company anymore, 'cause now not only do you not know how the code was
(43:40):
created, you can't maintain it. But now you need to just throw it to the AI and be like: hey, AI, just figure this out. I have no idea.
So the scary part about all of this is, you're at the mercy of the AI, right? So let's say, by the time you actually have an issue that needs maintenance, and you don't really understand it anyway,
(44:01):
you pass it over to the AI. It's not the same AI that built it in the first place. It is now a different AI. Not a different vendor, a different model, perhaps.
The new, beautiful, improved model.
Yeah, maybe, but it works differently, logically speaking, so it's gonna come up with a solution that may not be the best solution to the problem at hand. That's a huge risk to me, that you're relying
(44:24):
on something external, much less that it's a machine. You're relying on something external for your core business, right? I mean, I don't think it gets scarier than that for me.
Right.
I mean, what would you say if I said this?
Listen, Om, the experimentation cost, the actual cost of developing a prototype we can just throw away, that cost is, like, super cheap. Yeah, maybe it costs 20, 30, 40 bucks, but that's super
(44:48):
cheap compared to our normal process that we go through. So if we score a home run, great, I'm willing to spend that money. And whatever your pushback to that is gonna be: this real-time feedback, with things that people can put their hands on, it's way faster to generate that stuff in the AI world than it is to go through the normal cycle with the retrospective analysis
(45:10):
and everything that we do. Like, the process is so much faster that I'm willing to pay for inefficiencies, because it's just so much faster.
So I'll switch analogies briefly here. You know, you said home run, so we'll go to the baseball analogy here. So basically what we're saying is we can still bet on baseball, right?
Yeah, we can. Sorry. Yeah. It's not illegal.
You stand at the plate and you're swinging away
(45:32):
with your eyes closed, as fast as you can, right?
Mm-hmm.
And you're just basically hoping you're gonna hit something. So for a home run, yeah, it's great. It's cheap. But for that home run, you've missed so many, right?
Sure.
There's been so many. Or there were balls, strikes, I don't know what they are in baseball. Well, basically, when you don't connect, or you hit it... Sure.
So that's not the best way to go, because you
(45:54):
don't have a sure bet. Now, do you have a sure bet if you're doing things the right way? Yeah, pretty much. It's a known thing, right? This has been a proven phenomenon: if you go figure out the right problems to solve, and then solve those, you have a much higher chance of success.
I was hoping that we could get outta this category... I don't know why I'm talking like Morgan Freeman right
(46:15):
now, like the voice of God. I was hoping that we could get out of this category without you bringing up the point that your strategic thinking is really atrophied in this environment, where you're just doing this tactical vibe coding of a prototype and whatnot, and, like, the big picture, the systems view of how everything fits together.
(46:39):
Boy, if we start pushing product managers into this box of, like, just tactically get in there and code stuff, kid. Like, if you can't code... and we're gonna have coding interviews now for product managers? It's like, okay, well, who's doing the systems thinking? Where's my systems thinking interview? Oh, we don't have any of those, because all the people interviewing to hire people, they're not systems
(47:00):
thinkers, because they're thinking this way of, like (correct), just gimme my next UI iteration or whatever. So, the strategic thinking. Now, on the last podcast we put up, you were like: communication with the customer. Like, don't offload that as a product manager, 'cause that's the main thing you should be doing.
Correct.
I would say, if that's the main thing you should be doing,
(47:21):
the fast follower to that, the very second thing behind that, is the strategic systems thinking aspect. That should be the very next thing on your radar, when you're done with all your customer outreach, communication, strategic talking points and whatnot: how does this fit into a larger picture? Especially the higher up you
(47:43):
get, the senior product manager, director of product, stuff like that, the more of that kind of stuff you're gonna do. But if you're just focused on, like, what is the AI telling me? I mean, the myopic view you're getting is so close that I'm real worried that this skill is gonna atrophy, 'cause you're gonna do less and less of it.
Yeah. And I agree.
(48:03):
And I think not doing these is extremely dangerous. So think about what all of that entails. We're not gonna break it down to everything, but strategic thinking: you are thinking about competitive analysis.
Yeah.
You're thinking about compliance with regulatory requirements. You're thinking about pricing strategies, you know, differential pricing based on different
(48:24):
segments of the market.
Yeah.
These sorts of things. If you're not doing that, and you're simply churning out code, you're swinging away at the plate as fast as you can, like a propeller. It doesn't really help you. And actually, I would go further and say this could harm you. Because, yeah, you can have a different product every eight hours or whatever it is, but you're slinging mud at the
(48:46):
wall here. You know, you can't dispense with the other stuff that is absolutely critical.
The strategic thinking you speak of, yeah, I agree. The AI explanation sessions: when you have your takeaway, and AI is an assistant to you, an additional tool in your toolbox, that just becomes part of your demo, to be like: hey, AI helped us with this part.
(49:06):
You know what I mean? It's like an additional tool that you deploy. I don't see it as, like, well, just offload the whole thing to it. It can't be a replacement. And, like, yeah, you can vibe code up stuff, but if you're gonna say, I don't need UX researchers, or I don't need... you know what I mean? Whatever it is that we don't need this week. And again, people might listen to this and be like: well, Brian, nobody's saying you need to get rid of your UX people. Yeah.
(49:26):
First of all, I understand what you're saying, and that you've not necessarily seen someone get laid off because AI is replacing their job. That's also a flashpoint online, if you were to go on online forums and stuff like that, or on Reddit or whatever, and talk about it. Like, nobody's directly lost their job because they've been replaced with a chatbot or whatever like that.
(49:47):
That's not necessarily what's happening. Companies are laying off because it's a good time to lay off, and they're using AI as an excuse.
Sure.
But it's not really AI. Yeah, they're doing some baseline automation, but they're really just doing this to make themselves more lean. It's a profit motive at the end of it, and that's really what's happening under the hood.
But if you were to supplement your team members with
(50:09):
AI, to help generate solutions or to help generate ideas or whatever, it would be helpful in your demos to highlight what AI was assisting with, so at least you can get the perception out there that, like, this is what we're actually using it for.
I think that goes both ways, right? So if you're doing that in your demo, it builds trust with your customer, because they know what you're using. But
(50:32):
inwardly, to your team as well, it's spreading awareness within your team of how you use the tool selectively and where it's helping. I think that's a great thing. It builds transparency, it builds trust. I'm all for it.
Yeah.
I'm gonna stick with the casino metaphor and give this nine outta 10, nine outta 10 doggy coins, for reaching their destination successfully.
(50:53):
The problem is, we're gonna downgrade them to two outta 10 doggy coins, because their navigation skills are useless when the GPS is not working. So that's it, they're outta coins.
The AI GPS. Yeah, sadly, sadly.
And then the traditional teams, of course, are still maintaining that between five and six outta 10, because they're still doing the discovery. They've been trained to do that.
(51:14):
That works every time. Apparently talking to people is a good, solid practice. I would've never guessed that would be the takeaway of this.
Who knew, right? Who knew?
Yeah, I know. I would've never guessed.
The last point that I wanna make is less of a point that I want to talk about, and more a wrap-up of everything we talked about so far. The ultimate irony of this build first culture that's emerging is that it's marketed as innovation acceleration.
(51:37):
But, like, by sending this stuff to the AI tools, it's like novel problem solving, when what you really are looking for is a big breakthrough. Like, it can get you the iteration of the idea that you fed it, but it can't get you the big breakthrough, you know? And I'm willing to get pushback to say, that's just, like, your opinion, man. But I feel like
(51:58):
"the AI tools, if prompted right and leveraged right..." That's what the people will say, the people trying to sell you. By the way, I've got an AI tool here in my jacket I'm willing to sell you. The token sellers, they'll say: well, you're just not being creative enough with the prompt, right? And, you know, it's a very low barrier to experimentation for anyone inside your business. They don't need to know how to code. They can just pick up the tool and start asking it to do things.
(52:21):
And, you know, anybody can be a developer. That's what they'll say. And that's one side of it. I mean, the other side of it is: the actual innovation, the leaps that put your business ahead, that requires human insight. It requires talking to people, and that stuff's not going away, no matter how many prototypes you can bust out. Maybe the prototype becomes the catalyst that helps the conversation move along.
(52:42):
And that's where we wanna go. While we were planning for this podcast, Marty Cagan wrote a blog post, in between our planning sessions, that touches on a lot of this kind of nuance. But again, like, I don't feel this stuff is being projected with nuance. It's very much one side or the other.
I think, you know, people out there are saying: do or die, jump on the bandwagon or perish.
(53:04):
Yeah.
Yeah. You know, and that's a shame, because that's not reality.
Well, if I had to cut straight to a takeaway right now, like, before you even really talked about this whole category, if I had to cut straight to a takeaway, I would say: your takeaway as a product manager, or honestly even a team member, developer, QA, whoever you are on the team, is you should be reserving a certain part of your time at work to experiment
(53:27):
with different solutions. And, like, the AI is one particularly different solution. It's like the old... again, my background's in QA, so I can speak intelligently, mm-hmm, about testing frameworks, because I built a lot of my testing frameworks by coding with Selenium in Java, and hating life every moment of the way. Now Playwright's a thing. I can write in Python,
(53:49):
I can write in whatever, and life is a lot better. But there's other technologies out there. Like, if you take AI outta the mix, I always should have a percentage of my time to probe the market, figure out what the trends are, try new technologies, try different things, you know what I mean? Learn new skills, that kind of stuff.
Yeah.
AI is the same way. Like, the real innovation is when you
(54:09):
have... like, you're not packed, packed, absolutely packed, every little bit of space completely taken up, and there's no room for experimentation. "We just need you to punch out the next widget, kid." That kills innovation so fast.
You're right.
Yeah, I understand it's not necessarily an AI thing, but you're adding an AI tool on top of all this, to say, well, it makes you faster at doing your job, where
(54:31):
you're completely packed to the tippy top, over your head.
Yeah.
And now maybe you can get a little breathing room. I'm like: well, yeah, maybe.
But also, maybe if your organization understood a little bit about... you need a little wiggle room to invest in new ideas. Whether those ideas are AI or something else, you should
(54:51):
continuously be experimenting, and that's your team members maintaining your products.
I think the companies that are adopting that kind of modus operandi and policies and whatnot, you're gonna see those people overtake companies that simply emphasize speed and just churn out vibe coded products en masse. I think that's
(55:12):
gonna happen pretty quickly.
Yeah.
I mean, the other one, if I'm gonna be on your side for a second: I'll say motion is not progress. And a lot of people don't understand that. The larger the organization that you are in, I would say, the murkier it is to decode that motion is not progress. The size of the organization is like the size of the
(55:33):
container in the chemistry lab when you are learning about Brownian motion. You still have bits flying around everywhere. So yeah, we've used another metaphor now, so I'm gonna stop. I like the casino one better.
Here's the uncomfortable truth about the build first culture that I think we talked about in this podcast: it's not about better software development. It's about better vendor revenue streams.
(55:56):
They're gamifying development in this pay-to-play, token-consumption, casino-floor kind of metaphor that we've settled on. Which, again, is very weird, that we settled on this. This is a very apt one, though, because it fits so well. Like, is it build first? I mean, they're trying to compare build first versus plan first, to say, like: oh,
(56:18):
we're building something, and you get something from it, whereas when you plan, you get nothing but a plan, and the plan doesn't survive contact. That's the marketing, what they're saying. But what they're really doing is they're using that message to turn development budgets into subscription revenue, basically.
Yes, exactly that.
I think that's a bit of an extremist view, to say if you're
(56:39):
planning, you get nothing.
Yeah, it's just enough planning, maybe, right? So, yeah.
I mean, nuance doesn't, uh, put butts in seats, Om.
No, that's very true. It doesn't do that.
So, just like the casino metaphor here: the house always wins. So it's the token sellers that are winning here. I'm a big fan of the house on the old Arguing Agile podcast
(57:01):
is what I'm saying there.
Well, listen: if you're in an environment where you're just simply asked to vibe code all day long, or if you're in one of the environments where you're going with intent, let us know what you think about this podcast down below. And don't forget to like and subscribe.
Oh, and keep that resume updated.