Episode Transcript
Speaker 1 (00:01):
Welcome back to
another episode of Lean by
Design Podcast.
I'm your host, Oscar Gonzalez. Alongside me is my co-host and friend, Lawrence Wong. There's a couple of things that we've talked about quite routinely in the work that we do, and that is really pulling these frameworks and these theories from adjacent
(00:21):
industries and domains that are really at the essence and the core of operational excellence and process improvement. You know, one of the main questions that we want to talk about today really comes, I think, in a few anecdotes or phrases: building the plane as we're flying it, or laying down the track while we're driving the train.
(00:43):
And what we're talking about is, regardless of whether you are developing an internal platform or scaling R&D operations, there are just too many teams that are building without designing the workflows that are going to lead to that solution, and what you're doing is leaving out the essence of what is going to generate sustainable success.
(01:04):
So I'm excited today. We have David Hirschfeld, a veteran software designer and founder of Tekyz, with over 35 years of experience in tech innovation, AI-driven workflow design and startup strategy. His perspective is really on how to use AI, smart workflow design and lean methods, reducing your overall risk and increasing the
(01:25):
clarity from the beginning, and launching your efforts and your initiative with confidence. So I think this is a great episode, whether you're a startup founder or an enterprise leader planning on building the next thing.
David, thank you for joining us today.
Speaker 2 (01:43):
Thank you guys for having me. I'm really looking forward to the conversation.
Speaker 1 (01:47):
So I guess, as I alluded to when I first started talking, building the plane as we're flying it, laying down track while we're driving the train: why are we finding so many teams that jump into building before designing, before drawing or sketching out a map of the way
(02:09):
to get there? What do we actually see when people are trying to launch products or assets?
Speaker 2 (02:16):
So I think there's a few reasons why that happens, why people rush into the development too quickly. One of them is just the familiarity with automation internally. Another is that they're excited to get a product built, and tools have become so consumable. You can build stuff so much more quickly. Even two or three
(02:38):
or four years ago you could, and today it's like orders of magnitude faster with AI, and so they think, okay, I can just get it out there and have something and start building what I believe is going to be successful. And the third is missing the philosophy that says go slow to speed up: that you have all the pieces in place first and you
(03:02):
have a methodology for validating that you've got the right pieces in the right places, and a way of testing that, and the difference that will make in terms of your level of success and the speed at which you can actually grow and build. People that don't have that philosophy don't understand why it's so important.
(03:23):
There's so much enticing us to move quickly now because of the speed that the tools work in.
Speaker 1 (03:28):
That's a great point. I think even in our own work, Lawrence and I, just the things that we're able to do now would not have been possible three, four years ago, you know, in just this short time span. You know, your description of it really makes me think of a street race of vehicles, when you're trying to lay on the gas.
(03:50):
If you hit the gas too hard right away, you're just spinning your wheels, and I think that's something that we end up seeing: sort of this false start where we have everything, we think we have a vision, but we didn't really paint the road for anyone within a given team. It was, again, whether it's that familiarity or that excitement, which is great, understanding that there's still a journey
(04:14):
here, no matter how short, to providing the right product for the right market space. When we talk about these teams that are moving too fast, in the beginning usually things seem very good. Everyone has tasks, everyone has the things that they need to be doing. When do we start seeing those signs that they might be moving
(04:34):
too fast, or that they don't have the right structure or clarity about where they're going? What are some of those things that we start to recognize?
Speaker 2 (04:43):
Oh, that's an interesting question, because we can break that down from a couple different perspectives. One is, let's say you've got an immature team in terms of their experience level and skill set for developing anything. Teams like that just don't know what they don't know, and the evidence that they're moving too quickly
(05:03):
in developing things is pretty obvious, right from the beginning. Not necessarily obvious to them, because they don't have anything to compare it to, but obvious to anybody with a decent level of experience. Now let's say you're dealing with a team that has a lot of individual experience, not tribal experience in terms of the team working together, but individual experience: I've been developing for 10 or 15 years as an individual contributor.
(05:26):
A team like that moves too quickly when there isn't the political will from the people that are driving the project to make sure that they understand what it is they need to build, and for whom they're building it, before they launch into building it. And usually that comes from a philosophical problem related to belief in what you're doing. So when people come to me and they want to build a product and
(05:48):
they say, I believe I've hit on something really big, I know that it's going to be wildly successful, it's very likely this could become a unicorn. Those are like code words for "I'm going to fail." Red flag, right, exactly. And if your development team is being driven by that non-technical visionary, for example, then they're just
(06:11):
trying to satisfy the requirements that they're given, which is to deliver something quickly based on what this person believes. They may even put internal automation in place for how they deliver the functionality, but then they deliver something nobody wants to buy. So that's the other side of that equation. Teams that are really mature, that understand the planning
(06:32):
part, put all the systems and automations together in terms of how they deliver, and I can talk about that from a lot of different perspectives and we probably don't have enough time, but you can tell the teams that work really well in that way, because they'll do many builds a week. Very often they get pushed to production, and they usually
(06:53):
They push Production being,while it's in development, the
testing phase.
Once it's in production, theycan add new features and
capabilities daily if it'srequired, without risking the
architecture, and that's amature team with a very
automated pipeline for delivery,right?
So, because testing happensautomatically, as well as code
(07:15):
review, vulnerability tests, and everything happens as soon as you do a check-in of the code, and then you can just push it to production when somebody touches it and says, yeah, this is the feature I wanted. Now, on the other side, they may just be pushing the wrong product into production, because nobody spent the time to do all of that slow-down-to-speed-up stuff in the beginning.
(07:36):
So I always say to new founders, if they don't have this perspective, which most don't, probably 90 or 95 out of 100 don't just naturally do this, where they step back and they say, what is it I'm assuming in the delivery of building this new business or this new product? And if they're really honest with themselves, they're probably assuming everything.
(07:57):
So now, how do I take the black robe off and put on a white coat and say, I'm a clinician now and I have a hypothesis? I have a way of testing all these assumptions, and some of them are going to prove out wrong. And then I have to have a way of pivoting and understanding what that means in the context of a business that I may be wanting to build. And some of them may be right, and then so I
(08:20):
can pursue those and put a plan in place to implement that, if you're able to test it. And there are different levels of testing your assumptions. When you're talking about building a business, what you're ultimately going after is product-market fit in the earliest stage of your company. And product-market fit, for anybody that doesn't know what that means, simply means that I've got a product, and it's at a price that people
(08:43):
will buy in a high enough volume that whatever it costs me to acquire a customer is, assuming this is a classic SaaS model, roughly one third of what the lifetime value of that customer is. And that one third may vary depending on how we're calculating lifetime value. But that's product-market fit, which you don't get until you
(09:05):
actually start to generate revenue from sales. Everything prior to that is trying to prove that you've probably got product-market fit, but probably is not definitely, which is why Launch First. The whole concept is to do pre-launch sales as a way of validating your market, validating the early adopter, because it's not just product-market fit in general, but it's
(09:28):
who is the early adopter? What are their top two or three problems that they are really suffering with, that you're attempting to solve? And, out of those problems, how much do those problems personally impact them? You need to know that, because you can't get their attention if they don't feel personally impacted by those problems at a high level. And how much does it actually cost them?
(09:49):
Because it needs to cost them enough so that you can charge enough to reach that three-to-one ratio. And so how do you find that early adopter? And then how do you validate them? So these are the things that almost no startup does, which is why the failure rate for SaaS startups is so crazy high.
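A minimal sketch in Python of the three-to-one rule of thumb David describes; the price, margin, churn, and acquisition-cost figures are hypothetical assumptions for illustration, not numbers from the conversation, and real lifetime-value calculations vary.

```python
def lifetime_value(monthly_price: float, gross_margin: float, monthly_churn: float) -> float:
    """Simple SaaS lifetime value: margin-adjusted monthly revenue divided by monthly churn."""
    return monthly_price * gross_margin / monthly_churn


def ltv_to_cac(ltv: float, cac: float) -> float:
    """The ratio David references: LTV should be roughly 3x the cost to acquire a customer."""
    return ltv / cac


# Hypothetical numbers, for illustration only.
ltv = lifetime_value(monthly_price=99.0, gross_margin=0.80, monthly_churn=0.03)  # about $2,640
cac = 850.0  # assumed blended sales and marketing cost per new customer
ratio = ltv_to_cac(ltv, cac)

print(f"LTV = ${ltv:,.0f}, CAC = ${cac:,.0f}, ratio = {ratio:.1f}:1")
print("Rough product-market-fit signal" if ratio >= 3 else "CAC is too high relative to LTV")
```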
Speaker 1 (10:08):
There are so many
threads that we could pull from
there.
Lawrence, please go ahead.
Speaker 3 (10:12):
Your point about product-market fit is spot on, because I think a lot of times, even beyond just companies developing a software solution, internally in the company, when we start to do these improvement projects to help other departments, there's not enough discovery done to really figure out what is the actual problem that you're building the solution for, understanding what they're struggling with, and there needs to be more of an
(10:35):
emphasis on digging out what those mechanics are around the problem that we're looking to solve. And then you mentioned initially this whole idea of team structure and dynamics, which is, I think, a really underrated point, because you can have a really good problem to solve, but if your team isn't really structured in a way to debate or work towards a certain goal, the whole thing's going
(10:57):
to fall apart. And you really need to have a mixture. It may help to have people that are not as experienced, but you benefit from having a mixture of experience levels on the team, to just see things differently. And sometimes, when you have a bunch of people that are too experienced, you run into that, like you said, political situation where, oh, I know best because I've done this X number of times, but they have not been exposed to some of these newer,
(11:19):
maybe, methods of creating a product or service, right?
Speaker 2 (11:22):
Yeah, and when I said a lot of experience, I meant a lot of broad experience, not just in one direction. Because you're right, that's exactly what happens. People get entrenched. Why do you do it that way? Because we've always done it that way. Right, you know, that's like the kiss of death: I've always done it that way and it's always worked. And I say, okay, fine, you know, I've got stories that
(11:43):
really bring that out when somebody says that to me, how to get them off that position very quickly. But what you need to know is, who's responsible for what in this project? Right, the RACI matrix or whatever you use in terms of mapping out levels of responsibility. And you need to have that. You need to have a general understanding, a repositioning,
(12:03):
that your focus is about the problem; your focus isn't about the solution. What happens very often is, as soon as somebody identifies a problem and they think, I know how to solve that, they get all excited about the solution, forgetting all about the problem, and those kinds of founders fail at very high numbers. I'll ask a founder, so what problem does this solve?
(12:25):
And they'll say, oh, this makes it much easier for a real estate investor to da-da-da-da. I say, oh, that sounds like a benefit to me, but what's the problem that's solving? Oh, well, we're building this really cool search engine that gives you all of this capability around real estate property attributes and things. I say, okay, that's a nice feature, but I don't hear a
(12:45):
problem statement. And then they start scratching their head, because they're like, what is the problem? Because they forgot all about it. They got all excited about the solution, and that's a surefire way to build the wrong product. Founders that love the problem and want to spend their time talking to customers about their problems, those types of founders often, if not most of the time, find a
(13:08):
path to success, where the solution is nothing more than just a natural conclusion of the mitigating process to solve that problem. Right, not the thing to be in love with. Be in love with the problem, because that problem will shift and change and have nuance, and you only get that because you're spending all your time talking about the problem. And the way you do that is:
(13:29):
What is that problem? Are there other problems related to that? How has that problem affected you in the past? What kind of personal impact does it create for you? What's the actual cost of that problem? Have you found ways of solving it in the past? Why didn't you stick with that? Have you tried other software solutions? Never talk about the solution, because the solution doesn't matter. The only thing that matters is the problem, and if you really
(13:53):
understand it at that level, a 360-degree, fourth-dimensional kind of understanding of the problem and its nuances and the people it affects and the impacts it has and things like that, the solution is just sort of the natural consequence. It's really simple, and the solution has very tangible value
(14:17):
that's easy to articulate, because you can articulate it from the standpoint of articulating the problem.
Speaker 1 (14:19):
I think that's a really great point, David, where, you know, having to focus on that core problem that you're trying to solve really brings alignment across everyone that's a part of the team, because now you're not thinking about what is that special thing that we can create. You're focusing every move that you make on resolving this
(14:39):
issue that has this level of pain, that costs this much annually, that has this psychological effect on people, you know, et cetera, and the list goes on. It's really a way to have an empathetic design to your product, because you're now starting to understand that potential client or customer at such a deep level.
(15:02):
You know, this takes me back to my design thinking days, where you talk about human-centered design: what is at the center of this problem and who is impacted, and really using that as a driver. When we talk about designing the end-to-end and recognizing differences between immature and experienced teams, and that
(15:23):
challenge that a lot of folks might have where, you know, we have all that philosophy, and then what does that actually mean? What are those major milestones that would take us to this philosophical view of the problem that we're solving for the patients and the clients, the customer at the end?
Speaker 2 (15:41):
In the process of doing this, you very often, not always, but very often, find out that the problem that they articulated is not the problem they need to solve. A good example: I was working with a guy who had a really successful business network, thousands of people and millions of dollars of revenue, and he had been doing this for 10 years, running
(16:01):
this huge network, and it just was building. And so he wanted a mobile app that was going to help people be able to connect with people in the network better. And so we were talking about it and came up with a prototype based on our original conversation, and he said, okay, that's fine, but it cannot add any more time to our Monday prep time. And I said, well, this won't. And he wouldn't let go of that.
(16:23):
He kept saying, yes, but if it does, there's no way I can consider moving forward on this now. And then I just pushed the mock-ups aside and said, tell me about your Monday prep time, because that was the big problem. The whole project ended up being a workflow automation for that Monday prep time, and it changed the Monday prep time from an eight-hour exercise to 15 minutes by doing this workflow
(16:46):
automation. It took a few months to tease out every nuance. Imagine, every single week, getting back seven hours, and you're the guy that runs the organization. Listening to the problem and understanding the problem does not always end up driving you into a solution for that problem. Very often you end up uncovering way more important problems that really need to be addressed earlier, because they
(17:08):
have bigger value and bigger impact. That's right.
Speaker 1 (17:12):
And I think those are some of the challenges. You know, when we start digging into solving a challenge or a problem, it just blooms into this sort of unwieldy beast, and then you have to go back again to that core problem and ask yourself: is this the right problem that we're asking? Is this the right question that we're asking? The more time that we've spent in this consulting organization,
(17:34):
it's been eye-opening to recognize the power of the right question, and it's not just about asking questions. It's about asking the right question: the context, the scope, who's involved, what it really means to them, how painful these things are. Because that's the only thing that you're going to be able to focus with, and I think you pointed that out, where this is
(17:56):
the thing that we're trying to solve, or this is what we want to do. In some cases, you lead with a solution: I need reports and a dashboard. Wow, that was very direct. What's going on here? And it could be something that is a simple workflow change that, to your point, could change something that takes a full day
(18:17):
into just a couple of moments. So I think that there's a combination of humility that we have to have, especially when we're doing this, even for ourselves, where our job is to solve problems. We have to have the humility to go in and continue to ask very naive questions, because we don't know their history, we don't know the intricacies of this company.
(18:37):
Sure, it is R&D, it is a product, it's a SaaS tool, whatever the case is, but you have to really embed yourself into the problem to truly find possible solutions that will solve that.
Speaker 2 (18:53):
Well, and very often, like you said, that was a good example. So why do you need those reports? How are you going to use those reports? Well, my CEO asked me for the reports because he needs them.
Speaker 3 (19:05):
Do you know how he's going to use them?
Speaker 2 (19:07):
Well, I don't really know. Is there any chance we can have your CEO as part of this conversation? Because it may not be reports he needs, right. It may be something completely different. If they ask you for something and you just deliver it, you can bill for that, they'll pay you for it. And then they're not happy, not because of anything you did wrong, but because the thing you built for them didn't actually create any value, because they didn't understand
(19:29):
the problem they were trying to solve.
Speaker 1 (19:30):
Absolutely.
Speaker 3 (19:31):
And the other thing I'll point out, and both of you guys have been talking about it, is not every problem needs to be solved, not every nail needs to be hammered. Sometimes I think leaders are too in front of the tree, they can't see the forest, and so they end up solving for something that is really minor in comparison to all the other things that are happening. I'm interested, David, when you come across founders that are in
(19:52):
that situation, or come across enterprise teams that are developing some sort of solution, how do you step back and kind of pull them away from that thinking to really look at the bigger picture? Okay, is this the right problem to solve? You know, how do you prioritize all the different features that you might want to consider for a particular product? How do you guide them through that thinking?
Speaker 2 (20:14):
That one is a little tricky, depending on who you're dealing with. But you try to build a matrix of all the various things that they believe impact their business in a negative way, that need some kind of automation or workflow or whatever, right? And then try to understand how much each one costs them and who's impacted by each of the problems.
(20:34):
The cost may be because of market opportunity, maybe because of physical costs, right, because people have to manually do something. Or we end up paying a lot more shipping because it takes us longer to get this out. Well, how much more shipping do you pay? Then you can start to get an idea. And then, how much will it cost to mitigate this problem? How much are people impacted, and how many?
(20:56):
How much is it actually costing your organization? How much will it cost you to implement this change? From a cost perspective, what's the value of this thing? And from an opportunity perspective, what's the value of this thing? And when you start to build a metric around all this, from a matrix and priority perspective, it's much easier to start to prioritize what's more important and what's less important.
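A minimal sketch of the kind of problem matrix David describes; the column choices, example entries, and scoring formula below are illustrative assumptions, not his actual method.

```python
from dataclasses import dataclass


@dataclass
class Problem:
    name: str
    annual_cost: float      # what the problem costs the organization per year
    people_impacted: int    # how many people feel it
    personal_impact: int    # 1-5: how painful it is to the people who feel it
    mitigation_cost: float  # estimated cost to fix it

    def score(self) -> float:
        # Illustrative priority score: value at stake, scaled by felt pain, per dollar of fix.
        value_at_stake = self.annual_cost * (1 + 0.1 * self.people_impacted)
        return value_at_stake * self.personal_impact / max(self.mitigation_cost, 1.0)


# Hypothetical entries, for illustration only.
backlog = [
    Problem("Manual Monday prep", annual_cost=90_000, people_impacted=3, personal_impact=5, mitigation_cost=40_000),
    Problem("Slow shipping handoff", annual_cost=150_000, people_impacted=12, personal_impact=2, mitigation_cost=120_000),
    Problem("Report formatting", annual_cost=8_000, people_impacted=2, personal_impact=4, mitigation_cost=5_000),
]

for problem in sorted(backlog, key=lambda p: p.score(), reverse=True):
    print(f"{problem.name:24s} score = {problem.score():8.1f}")
```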
Speaker 1 (21:20):
I think one of the things that you're also pointing out is that this process is semi-qualitative. There are some discrete numbers that you can pull out, but there are also costs associated with some of these things. You know, if you're going to try to implement something internally, is there a cost to the slowdown? Because now your teams have to learn a new framework, they have
(21:43):
to learn a new system, they have to learn a new technology. If you're not able to use the same workflows that we've been using, which have been really stored at the top of people's minds, how is that going to impact our feeling of unity, because this is all new to us? You know, recognizing that some of these solutions may not be as expensive, quote unquote.
(22:04):
You know, that comes on your balance sheet, but when you look at the impact on the people on the teams that are going to be executing these projects, then you can really understand the value. And one of the things that Lawrence and I have constantly been revisiting is the idea of value, and also the recognition that you can't really make someone else
(22:25):
understand the value that you put into there. It's a dialogue. You need to understand what becomes valuable to the patient, the client or, in some cases, your leadership. What is actually valuable to them? What do they really want? Is it time back that they want? Is it things that are more streamlined, rather than having them give you their prescription?
(22:46):
Hey, I need you to fill this for me. Because then they get it filled and they go, nah, this is crap, that was a bad engagement. Well, you know, it just happened to go that way.
Speaker 2 (22:56):
Right. You should have told me there was something more important that I should focus on, right? Not that you didn't do it right; it's that you didn't tell them to do the right thing.
Speaker 1 (23:04):
Exactly.
I didn't get the happy feelings that, you know, I was expecting to at the end of this.
Speaker 2 (23:10):
Right, yeah. So that example is kind of like, okay, we're a hospital and we want to replace Cerner with Epic. What's the risk to your staff? What's the risk to the patients? That's part of that matrix, right, because there's always a risk factor to this whole thing. But very often these changes can be implemented, maybe not that one, but many of these changes can be
(23:32):
implemented in a staged fashion so that it doesn't have a huge impact on the organization. If they have to learn something new, it's something small, some small piece of the bigger picture that they have to learn that's going to have a big impact. And so that's why we have to have you learn that first, because it'll de-risk the
(23:53):
bigger implementation later on, because you'll already be up to date on a piece of it that's an important step in learning the bigger thing. So it really depends; this is very contextual in terms of where the company is. Michael Dell, I think, said it best in a Forbes interview, I think 20 years ago or so. The interviewer asked him, what's the secret to your success?
(24:14):
He says, well, it's not that big a secret. I just focus on whatever the biggest bottleneck is, because as soon as we fix that bottleneck, I know that everything will flow much more freely, which will illuminate the next bottleneck, and I just keep following the bottlenecks. That's a very oversimplified way of saying how you prioritize that list, but it does point out the fact that that priority
(24:34):
list, the priorities, will shift and change as you start to implement automation, because some things that seemed like a bigger impact and were higher on the list, once you've solved for a couple of problems, all of a sudden they're not that big a problem anymore, because other things start to address or relieve the impact of a particular issue.
Speaker 1 (24:52):
And I think that touches on, you know, the outcomes that this sort of effort allows for. It really enables not just the development of the people, but the outcomes on the other end are cleaner, they are right on track with what's happening. It could sound like a lot of anecdotes, like, oh yeah, you know, the team is working well cross-functionally, etc. But you really do end up with a different product at the end
(25:16):
that is more thoughtful, that has considerations outside of, to your point, just its features. You know, it's very easy to say, well, it does these things. That's not what we're talking about. What are you actually solving? Is it peace of mind? Is it something physical? How are you changing the work that we do? It's going to touch somebody's life. It could touch their life from a work perspective.
(25:37):
That now I come into work and everybody knows what they're doing, and we just go pow, pow, pow, pow. We have a discussion, boom, we execute. It can have an impact on people's lives at the end, where you are designing the right product because it answered the right question. I think that challenge really is what's at the forefront of
(26:00):
your framework is really putting you in a position to put pen to pad, or digitally, however you want to put it, but to really express: what is it that you care about? What is the thing that is really bogging down the potential for your development?
Speaker 2 (26:17):
And not just what you care about, but why do you care about that? Because sometimes they care about something just because they don't really understand what really matters. Not that you're trying to school them, but you just draw out the things that really matter. They come to the natural conclusion of what the most important things are when they understand them better. What you were saying about solving the right problem:
(26:42):
if you're focused on the problem, and that's all you do is think through the problem, then when you're trying to solve the problem, when you're working on software that mitigates the problem, you're still focused on the problem, not the feature. Is the problem solved yet? Do we need to take this further? No, it's not solved. Is it solved enough? Yeah, probably solved enough for now, because we have other
fish to fry.
Okay, just focus on the problem.
And then the solution isn't aset of features that you're
trying to sell people, just likeyou said.
The solution is a way you'vemitigated the problem with just
not a problem anymore, becausenow you've got something in
place that has eliminated theproblem.
It's not about selling thesolution.
(27:26):
People only buy things, whether it's internally or externally, because they have a problem. They want that problem to go away. There's no other reason that motivates people to spend money on something or do something. I know it sounds weird and it sounds very pessimistic, and I fought this idea for years. Even if you want to go on a vacation, the reason you do that is because you're afraid you're going to miss out and going to
(27:48):
live a crummy life, and you want to mitigate that life where I'm not living the best life I could live, and so I need to create memories or do something, and that's what vacations go after. If you felt great and everything was perfect, why would you want to leave and change it?
Speaker 1 (28:02):
There's a lot in here that I'm sort of pulling from, and I'm inclined to agree that in some cases there's a selfish tendency that we have to recognize as well. Like, I'm looking around now: my computer, the light, the camera, the hub I have to connect to monitors. All of these things are
(28:23):
to solve specific problems I had as we were trying to build a podcast, trying to continue our marketing outreach and things like that. So I'm living proof here. But you know, I think, what is the outcome of these workflows and really understanding that beginning part and becoming empathetic to that end goal? What is that North Star?
(28:43):
What does that end look like? You hear it sometimes: beginning with the end in mind, human-centered design, putting somebody else at the center of that, regardless of your skill set, of your potential. I see it often in sales, where there's a sell to a client, and I might be on the phone call with a partner, with a client at that time, representing them or helping
(29:05):
them through this conversation, and it's "we can also do all these things," and there's features just getting splashed in your face. Those things are great long-term, but you're not listening to the conversation. You're already creating an upsell without understanding what's going on here with the client. So when we see that these things are working and we do take that
(29:25):
time in the beginning, there's just that alignment. To your point, it starts to make things more boring, because they're not really interesting. It's not a fire in a trash can ready to burst into flames. It is moving as expected. And it's funny, because you talk to your leadership and they go, okay, any problems? Well, why do we have to have a conversation just to talk about
problems when things are movingsmoothly?
These are those opportunitieswhere you can start to elevate
your work, you can start toupscale, you can start to take
on additional projects, becausethese once burdensome
initiatives that were the resultof a lack of clarity and
understanding of what the realproblems are is gone and now you
can focus on the actual workand the delivery.
Speaker 2 (30:09):
So this is why it's so critical to distinguish between what's the personal impact that somebody perceives that problem causes them and what's the actual cost, because those are very distinct numbers. Right, you can have a really high personal impact, because this just drives me crazy, I have to do it every day at two o'clock.
(30:29):
If I forget, somebody's going to yell at me. What does it cost as a result of doing that? Yeah, it costs me, maybe, personally, but it's really not costing the company anything. It's just a small thing, but it drives me nuts. So I would do almost anything other than pay money for it, because it doesn't really cost anything, and so you're not going to be able to sell it for very much to that person, right?
The opposite is true too, whichyou just talked about.
Everything is running smoothly.
If we automated this workflow,we could save a gargantuan
amount of money, but nobodycares, because we're trying to
grow our business, not just makeit more efficient.
That's where management isright now.
Yeah, it would cost a lot, buteverybody struggles with this in
our industry.
Nobody's going to get allexcited about making this thing
(31:14):
happen more efficiently, even though it'll save us a lot of money. We could charge a lot for it, but we can't get anybody's attention when we go out to start to market it, internally or externally. So you have to understand both of those factors: the personal impact, and who is personally impacted, and what's the actual
(31:34):
cost associated with that problem.
Speaker 3 (31:37):
The other side of it: you know, we've been talking a lot about designing the right solution and solving specific problems, and I think one of the things that gets lost sometimes is, once you solve that particular problem where you've improved the process, there's a bunch of people that come in that have to manage that process. And so if you build a beautiful app or piece of software for a
(31:58):
team, there's a group of people that have to come in and start managing any future changes to it. And sometimes that isn't the most beautifully designed user experience, and you have to really understand all the intricacies within that configuration to be able to, you know, build an improvement in, and so, right, you know, there's a lot of benefit that could come from selecting the right tools for the automation and then the workflow design.
Speaker 2 (32:22):
And then the other thing is just considering, how are you documenting all this if somebody's going to pick it up and go to fix it? Yeah, documentation's huge, and that's a philosophical thing. Do you document things in a detailed way, and is it part of your DNA to do this, right? The other one is just part of the cost of the solution.
(32:44):
When you talk about building something that has to be maintained, when you're factoring what's the cost to mitigate this problem, that's part of the cost that you have to factor in, right: the maintenance cost, if there's going to be maintenance.
Speaker 3 (33:02):
Yeah, I know. For us, we usually have one of the extra columns, in addition to this, be the cost of doing nothing, because usually that, being able to measure the cost of doing nothing, the pain, really is a multiplier to any of the prioritization that you have for the table that you come up with.
Speaker 2 (33:19):
Yeah, I completely agree. The cost of doing nothing is part of that. Let's say it doesn't have a huge personal impact, but it's very expensive. So what's the cost if we just don't do anything, and you extrapolate that out over time and it ends up being significant, and even though it doesn't really change your business in any way, nobody cares. But then, other than just from a cost perspective, what would that financially do? How could we change our business as a result of having
(33:40):
that saved cost? Is that going to delay us doing other things? Do we have to make a choice of working on this thing that nobody cares about, even though it's really expensive, or working on this other thing that people really want to see happen, for whatever reason? Right, personal impact is high, personal impact is low. Can't we do both? If we can do both, then let's keep the conversation going on
(34:02):
the one that nobody cares about, because it's going to give us a lot more money to invest in other areas.
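A small sketch of the "cost of doing nothing" column Lawrence mentions, extrapolated over time as David suggests; the horizon, growth rate, and dollar figures are hypothetical assumptions, not figures from the conversation.

```python
def cost_of_doing_nothing(annual_cost: float, years: int, annual_growth: float = 0.0) -> float:
    """Extrapolate what a problem costs if left alone, optionally growing each year."""
    total, yearly = 0.0, annual_cost
    for _ in range(years):
        total += yearly
        yearly *= 1 + annual_growth
    return total


# Hypothetical: a $50k/year inefficiency that grows 10% a year as the business scales.
inaction = cost_of_doing_nothing(annual_cost=50_000, years=3, annual_growth=0.10)
mitigation = 60_000  # assumed one-time cost to fix it

print(f"Cost of doing nothing over 3 years: ${inaction:,.0f}")
print("Worth fixing on cost alone" if inaction > mitigation else "Hard to justify on cost alone")
```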
Speaker 1 (34:06):
Absolutely. There's so many little nuggets in here, and the good thing is that now we have systems like AI that can help us pull out key features of this conversation. There's so many pieces to think about, and considerations that are outside of whatever product or asset you're preparing, and I think that's really what we're trying to hit home here.
(34:27):
When it comes to doing things the right way, when it comes to doing things efficiently and in ways that allow you to scale, either in the size of your portfolio and your projects or scaling into new initiatives, there's a really key moment that happens at the very beginning, and it's not building right away.
(34:48):
It's really taking these considerations, you know, looking at programs like Launch First, to really get down to the first principles of what you're trying to do and who you're trying to help here. So, as we wrap up, what is one piece of advice that you would give to somebody who's looking to launch, to scale a new initiative, so that the build has a mind, has a path, and isn't just
(35:13):
this overwhelming, unwieldy beast of people just doing work and doing things?
Speaker 2 (35:19):
The one piece of advice is: your chance, if you're going to launch a SaaS company, is that you're probably going to fail. And if you approach it like that, with the idea, okay, all of us founders have to have vision, right, we have to have vision and have a belief system that this is worth pursuing. And then it needs to stop there, and you turn it into a clinical
(35:40):
process. So the goal is to turn every opportunity into a science experiment and try to make it fail as fast and as cheap as possible. And if you're unable to make it fail fast and cheap, then you probably have proven product-market fit, and now you know you have a business, and you can continue to generate revenue, which will help you fund the development, and you can build
(36:02):
the product actually after you've created your revenue engine. That's why it's called Launch First. You literally launch the sales and marketing engine before you build the product, forcing you to get out there really quickly to ask the hardest question of your customers, which is: will you buy this now? And most of us founders are afraid to do that, especially when we don't even have anything yet. I see founders come out with their MVPs after nine or 10
(36:25):
months and they're still not charging. And they go through three or four beta releases and extend it two years, two and a half years, or they build way too big a product. I've even had some founders that never went to market, one that went almost three years and never put it out to test the market and went bankrupt. It's the rare case that does that, right.
(36:49):
But this failure-to-pull-the-trigger syndrome is because you're afraid that it's not perfect and it's going to give the wrong impression. You don't want it to get out early, and these are all backwards. Get it out as quick as you can. Being perfect is not important. What's important is that you find out who the customer is, and are they willing to pay enough for it, and can you sell it at a high enough closing rate that you can get to that three-to-one
(37:11):
ratio. And if you do it right, you can do that before you ever build the product. I know it sounds backwards, but there are lots of examples. I'll give you a very famous example of this. It's not software, and regardless of what you think of the guy, Elon Musk, when he first came out with Tesla, had a prototype sports car.
(37:32):
It didn't run, it wasn't real. But he sold thousands of them to prove that there was a market for something cool that was electric and a sports car, and he generated a lot of money from that prototype. So that's an example of doing pre-launch sales. You have to have something to show people; they're not going to buy just because you have a pretty face. But it doesn't have to be a product that works.
(37:52):
It can be a design prototype, as long as it's realistic enough. And you're not going to sell if you haven't really correctly identified the early adopter, because you have to find that person who is willing to spend money early on a product because it's a big enough problem. It impacts them personally enough. They know they have to get this when it comes out to mitigate
(38:12):
this problem. So, if they're given a big enough value opportunity up front, they'll buy now, in enough numbers that you can prove you've got product-market fit. And then raising money, if you decide you're going to raise money later, is a whole different animal, because you're showing traction, you're showing revenue for a product you haven't even built yet, and investors are going to really
(38:33):
care about that. The idea of launching first just makes so much sense from so many perspectives. And if you can't find the path, even after many pivots, which can be done very quickly, then you can say, okay, I failed fast and cheap. It's only been five or six months, I can move on with my life. I haven't spent that much money yet. And I've got another great idea I want to pursue, instead of
(38:54):
three or four years later, you know, a failed marriage, bankrupt, 401(k) gone. Yeah, just saying, I like one better than the other.
Speaker 1 (39:02):
I have to agree with you there, David. You know, that's something core to what we've done: really just experiment as quickly as we can, get as much information as we can on the problem. We find ourselves asking other people about the problems that they're having, asking people about the real situations that they're dealing with at their current places of work or in their recent experience, and so this is right in line with the
(39:24):
philosophies that we hold at Sigma Lab Consulting. David, thank you so much for talking to us about Launch First. Tell us, how can folks learn more about Tekyz and the Launch First program?
Speaker 2 (39:36):
For anybody that made it to the end of the show, I'll give you my email. It won't be in the show notes, but it's david at tekyz dot com. Tekyz is spelled T-E-K-Y-Z, so you can reach me at david at T-E-K-Y-Z dot com. You can find me on LinkedIn, or just go to the contact us form at tekyz dot com, my website, and reach out to me that way.
Speaker 1 (39:59):
Awesome. David, Lawrence, thanks. Great having you guys on the show, and we'll catch you next time.
Speaker 2 (40:04):
Great and thank you
guys.
Really fun conversation.
Speaker 3 (40:07):
Yeah, great
conversation.