Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
In some ways, it's like, it's the most clear example of where construction has an arbitrage
(00:07):
opportunity.
Like, if you're in that beginning stage of creating an estimate,
whether it's like a design-build where you're going back and forth with the owner, or
whether it's a hard-bid situation where somebody has said, these are the plans, this is
what it's gonna be, and keeps issuing addenda to that.
(00:28):
And you're trying to keep up with that addenda and keep up with that price up until the day that
you have to submit the bid, the hour, the minute that you have to submit the bid.
There are opportunities for AI in all of that to win because you're utilizing a tool that
gives you an information advantage.
(00:55):
Guidance on how to get more growth, wealth, and freedom from your AEC company?
Well, then you're in luck.
Hi.
I'm Will Forret.
And I'm Justin Nagel, and we're your podcast hosts.
We interview successful AEC business leaders to learn how they use people, process, and
technology to scale their businesses.
So sit back and get ready to learn from the industry's best.
(01:17):
This is
Building Scale.
Hey, listeners.
It's Will here.
Our mission is to help the AEC industry protect itself by making technology easy.
If you've ever listened to our show, then you know that the three pillars of scaling a
business are people, process, and technology.
(01:37):
So if you suspect technology is your weak link, then book a call with us to see where we can
help maximize your company's IT and cybersecurity strategy.
Just go to buildingscale.net/health.
Today's guest is Doctor Adam Kraub.
Doctor Kraub is a seasoned executive with over 20 years of experience at the intersection of
(01:59):
technology and business strategy.
He currently serves as the IT director for Boh Brothers Construction, a large heavy civil
contractor based in New Orleans, Louisiana. NOLA, as they say when you live
there.
I don't think I get to say that yet though.
You can if you want.
You know, I won't question you, but that's okay.
His focus lies in applying emerging technologies like AI, machine learning, spot
(02:23):
solutions, and streamlining construction operations.
Adam has led major process improvements in trade partner billing, material management, and
operational cycle time reduction.
He champions data-driven decision making, achieving significant ROI through strategic
technology pilot programs.
Known for his hands-on, proactive leadership style, he is reshaping how construction
(02:43):
companies approach tech and innovation, and Adam is passionate about turning complex
business problems into sustainable, tech-enabled solutions that drive real value.
And with all that said, Adam, welcome to the show.
Thanks, Justin.
Thanks, Will.
Happy to be here.
I have to say that it sounds like AI wrote that, but it really didn't.
It's just what happens when you're in the AI world so long.
(03:07):
You just, you know, you start sounding like computers.
Uh-oh, is that how they get you?
We're gonna go pretty deep in technology here, but, is that how they get you?
Is this how the beginning of Terminator starts?
You start writing like that?
If we all start sounding like Arnold Schwarzenegger, and it means I'll look like him back in the
day, I might trade that for it.
So that's just something to think about.
(03:28):
Okay.
So before we get completely off track, tell us a little bit about you.
How did you get into the construction space?
And then tell us about Boh Bros.
So my father was a general contractor from the, you know, from the time I was little.
And so I worked on construction sites, obviously, like many of us, in the summertimes
to make a little bit of extra money.
I did all of the things that were exciting to do, but really started in technology in the mid
(03:52):
eighties and had fun with Novell to Windows networking to Unix, Linux.
Still sad that Solaris is gone, but really got into construction almost ten years ago
now, nine years, nine years next week.
Before that I had been in the ed tech space and was working in universities and working for
(04:16):
startups, and really found construction to be an interesting place to land because in a lot of
ways it has opportunities that many of us don't see in other industries.
There's a famous Deloitte article that says that construction is the second worst industry
at adopting technology, only beaten out by nonfarm agriculture, and that means hunting.
(04:37):
So only the hunters use technology less than we do.
And I'm not sure about that sometimes.
I read this book where this guy went to Alaska to hunt a buffalo or bison, American buffalo.
And they don't let them use a lot of technologies, but the small amounts that they
can use are real high end, you know, sat phones and night vision goggles and, like, those
(05:02):
things.
I don't necessarily always chalk up those types of technologies to what a business uses, perhaps,
is my opinion.
Before we go deep into the tech side, tell us about the company though.
Like, what are we looking at size-wise, things like that?
So listeners have a good understanding of what you're doing and what you're
implementing.
(05:22):
So we're what we call a solidly middle-market company.
We do about half a billion a year pretty consistently.
We're more than a hundred years old and coming up on the fourth generation of family
ownership.
And so we do a lot of different things.
We have about 700 full-time employees, and about 400 or so of those are what we call technology
(05:46):
users.
We have a lot of people who are crane operators and carpenters and, you know, are actually
doing a lot of that work, you know, concrete finishers.
So we do a variety of things.
We do a lot of work on the Mississippi River and on the Gulf of Mexico because we're
here.
So a lot of marine work, but we also do some really cool projects in sort of more of a
(06:09):
traditional general contractor space.
And that's fun, like, you know, doing a renovation on the Louisiana Superdome.
That's really cool stuff right before the latest Super Bowl.
Wow.
That's super cool.
Yes.
So, you know, that's just on what the company does.
Now, why were you brought in?
So I was brought in about nine years ago because the original organization was led by an
(06:36):
accountant who had no IT background, and they really wanted to establish, like, a formal IT
organization.
And that's pretty true of a lot of construction companies at the time, is that they had sort of
grown up more organically and they didn't have a professional IT organization.
So I came in and the first thing that I did is I said, how many problems do we have?
(07:00):
What are those problems?
How can we identify those problems?
So it sort of started with tracking, obviously, which is one of the key things that we've got
to do, put a CRM system in place so that we're actually tracking tickets, but also identifying
what are the areas that were holding us back.
And we called those our four boat anchors.
(07:21):
And those were legacy systems that were keeping us from moving forward as an organization.
So one of the things that I would provide my boss, who's the president of the company, every
week was, where are we at with the boat anchors?
And they were old imaging systems.
They were an AS/400 that was still glowing brightly.
(07:42):
They were all sorts of things like that that were keeping us from putting our resources in
the spaces where we really need to put a resource.
Can I ask why, you know, why keep these legacy systems around?
I mean, that's old.
Like, even ten years ago, those systems were already old.
Yes.
So why weren't they, like, decommissioned, you know, to move on to new technology even back
(08:04):
then?
So part of it was we had not really taken on or started
our data governance journey.
We had one memo that was several years old from a former general counsel about how long we
might wanna keep some of this information.
And the result was that we didn't really have a sense, and people in their own areas had
(08:28):
different ideas of how long they needed to keep their data.
So someone in one area would violently react whenever I said, I'm turning off the old
imaging system from twelve years ago because we really ought not to have that data.
And so the result was, in some ways, one of the things that helped us move forward was
(08:49):
establishing a really good data governance group.
And it's certainly an evolving process, and getting closer to a more consistent approach
has taken us years and years.
But now we have a consistent approach.
We know how long we're supposed to keep data.
That makes it much easier to say, no, you may not have that anymore because it's really a
(09:11):
risk to the organization to keep that.
Can I pause you there for a second?
Sure.
Okay.
For construction leaders out there that may have never heard the word governance, IT
governance or data governance, can you explain to someone that really doesn't know what this
is, what that actually is and what that means, and why have it?
(09:32):
So, I mean, in my mind, data governance is all about, you know, who has access to what and for
how long.
And what are the drivers of those decisions from, like,
both a risk perspective as well as, like, a profitability and analytics perspective?
And that's been an interesting journey too.
(09:54):
You know?
Like, not to go off on a tangent, but one of the things that we've had pitched battles
about is how long do we keep geotech reports that are, you know, 50 years old?
Because probably, you know, we live in Louisiana, the land slips around a bit and
it's probably not valid anymore.
So that fifty-year-old geotech report is probably something we would
(10:17):
need to redo anyway.
But the people who had the geotech reports in their closets still held onto them until they
retired.
And so some of those battles were just like, we don't wanna fight.
But, you know, well, I think that what really helped us understand it was
getting our general counsel involved.
Our general counsel really helped drive people's understanding of the risk of
(10:44):
having available that information, but also the overhead of having that, not only the
technology overhead, but also every time we had a subpoena, it would take quite some
time to gather the records properly.
There's an old saying that the last phase of construction is litigation.
And so there is not an insubstantial number of times where you get a subpoena and then you
(11:10):
have to identify those records.
And part of my role in information governance has been pushing us to a single platform where
we can have all of that information and then manage it much more easily and much more
rationally and say, okay, it's our ten-plus-one, time to go.
That's the delete button, please.
And we just move forward.
(11:31):
So I feel like we had really this nice sort of coming together of the general counsel, who was
very concerned about litigation, and me, who was very concerned about getting rid of these
systems.
And we were able to work together and really get past them.
Took a couple of years, but we got past them.
And so the result is now we have, you know, modern architecture that I'm relatively happy
(11:54):
with.
So real quick, just to summarize.
So data governance is to reduce risks around litigation because keeping data too long can
actually be used against you.
Right?
Would that be correct?
Okay.
As well as, keeping that data, it's stale.
You then have to, even if you're part of litigation, but it's not necessarily
(12:16):
you specifically, but you're called on.
Yes.
It's time and resources that you have to spend sifting through that data.
And essentially, the longer you keep it, the more data you have to sift through.
So if you go with essentially the bare minimum, so ten plus one, as you call it.
So ten years.
Right?
That's pretty standard.
Then you keep that, and then you know what you should be deleting.
(12:41):
And you could even create policies, not just, you know, written policies, but then also
digital policies that automate the process.
That's correct.
Okay.
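As a concrete illustration of the "digital policies that automate the process" idea, here is a minimal sketch in Python. It assumes a simple file-age rule and a hypothetical archive path; a real implementation would use the retention features of the document platform itself, and no deletion would run without legal sign-off.

```python
from datetime import datetime, timedelta
from pathlib import Path

# Hypothetical retention rule: the "ten plus one" years discussed above.
RETENTION_YEARS = 11
ARCHIVE_ROOT = Path("/data/project-archive")  # hypothetical location

def expired(path: Path, cutoff: datetime) -> bool:
    """A file is past retention if it was last modified before the cutoff."""
    return datetime.fromtimestamp(path.stat().st_mtime) < cutoff

def retention_sweep(dry_run: bool = True) -> list[Path]:
    """Flag (or, with dry_run=False, delete) files older than the retention window."""
    cutoff = datetime.now() - timedelta(days=365 * RETENTION_YEARS)
    flagged = [p for p in ARCHIVE_ROOT.rglob("*") if p.is_file() and expired(p, cutoff)]
    for p in flagged:
        if dry_run:
            print(f"Would delete (past {RETENTION_YEARS}-year retention): {p}")
        else:
            p.unlink()
    return flagged

if __name__ == "__main__":
    retention_sweep(dry_run=True)  # review the list with counsel before actually deleting
```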
Are there any other reasons?
Because you talked about risks.
Were there any other risks besides litigation?
Well, I mean, I think there were access risks.
There were security risks certainly as well.
I mean, one of the things that we talk a lot about at data governance is, you
(13:04):
know, consistent reviews of access privileges and what is stored where and what
ought to be stored where.
One of the pitched battles that we fought was over our shared, you know, sort
of Active Directory drives, that they became this absolute morass.
(13:28):
And, you know, we had people who were storing, you know, relatively confidential information
on there that was discoverable.
You know, one person had his bonus sheet where you fill out everybody's
salary and how much bonus they're gonna get.
And we found that and said, no.
You really ought not to be keeping that on the shared drive.
One person had all of his pictures from Disney World from many, many years stored on the
(13:53):
shared drive.
And the result was that we had to come up with a real policy of what belongs on the
shared drive and what belongs where.
This is
Pictures aren't a part of the game plan for the company?
That's weird.
I thought they would have been.
So listeners, keep in mind this specific story because this is super relevant to AI.
(14:16):
We will have an AI conversation in a little bit, and this is very, very, very relevant, a
very real problem that you need to then consider.
So listen to this story.
Yes.
Because yes.
And we will talk about that in a minute.
I know where you're going, Will.
But it is really interesting to think about how people treated it as sort of a dumping ground.
(14:38):
Or somebody said, oh, I'm getting a new computer.
I'm gonna copy all of my files onto the shared drive, and I'll just grab it back when I'm
done.
And rather than cut and paste, they did copy and paste, and you know what you get.
You get every single one of their private documents forever.
(14:59):
And it's also interesting, like, one of the things that kind of shocked me, and maybe it's
because I came in when I did, is how few people had email addresses that were not their
corporate email.
And the number of things that they're subscribed to that are not corporate kinds of
(15:19):
things that they should be subscribed to.
You know, it's fascinating to see as we move forward and say, look, you absolutely cannot
have your corporate email address be your only email address.
You need to have a personal email address and you need to have your Walmart shopping on your
(15:40):
personal email address and not on your corporate email address.
And I don't care that when you leave the company, your multifactor authentication is through
your corporate email address for your Target, you know, Red Card or whatever it is.
And that happens actually a surprising amount.
So you talked about, obviously, the litigation when you talk about legacy software governance.
(16:03):
Hey, like, that to me seems like the easy explanation to leadership, saying like, hey, we
gotta evolve.
We gotta move. What are other reasons why, hey, like, getting your tech right, getting your
infrastructure right, getting the stack that you're using right has massive
implications across the entire company.
(16:25):
But I feel like it's very difficult, specifically in the construction space, for
leaders that are non-technical to necessarily see that.
How did you explain some of these, or how did you get them to buy into the concept of, like,
hey, we need to make our tech better because that will make our company more
scalable and make our company better across the board?
So for me, it happened.
(16:46):
It happened with specific alliances, with specific people that are great drivers of
change in the organization.
One of the best drivers of change in the construction industry is safety.
It sounds ironic because safety has the reputation of being kind of behind, but our
safety team has been very technology forward.
(17:09):
They were some of our first adopters of Power BI.
They were some of our first adopters of Procore, which is our construction management
software.
By utilizing them, that is like the biggest stick.
It's not always a great carrot, but it's a really good stick.
Every construction company that I know of has safety as number one.
(17:33):
It's the number one priority, which makes sense.
Our intention is to get everyone home okay every single day.
And if that's the intention and that's the baseline, then everything driven with a safety
requirement gets extra attention.
So safety and quality also helped us drive our digitization of plans.
So when I started, we were still running plans out to job sites that were printed.
(17:58):
We in fact had plan bins, which have since been removed from all of our offices, where the
plans for the day would get shot out.
But the real-life challenges really helped show that. We had a situation where we were pouring
concrete for a precast member that we were going to be taking out to a job.
(18:22):
We poured it in our yard.
Someone was using the previous measurements from the previous shop drawings. And the result
was we had to re-pour all of that concrete.
And that kind of thing, those stories really help show how important that is.
(18:42):
And, you know, an odd one that is actually an area that I'm kind of
excited about.
I don't know, supply chain, I think, is really huge and super important for us,
and demonstrating,
like, provenance of materials and demonstrating quality of materials as part of the supply
chain.
Our safety senior VP tells a story that we were one of the contractors who worked on one of the
(19:08):
flood walls that collapsed during Hurricane Katrina, which is, you know, like, one of the
defining moments in the lives of everyone who lived in New Orleans.
And there were lots of news reports that the sheet piles, the vertical pilings that we'd put
in, were not to spec and that we didn't put the sheet piles in properly.
(19:31):
Well, we had the delivery ticket that was signed with the measurement.
And so he stood there and watched and got out the measuring tape and knew what the results
were ahead of time, that we were actually beyond spec.
And then we had provided materials that were actually beyond what was required.
(19:53):
And the result was that all of that went away and we were all like, okay, now let's get back
to business.
Let's get back to work.
In this era where we're talking more and more about compliance, and particularly with the
Buy America and Buy American requirements, we have to show provenance of
(20:14):
critical materials for us.
A lot of which is steel, but even things like aggregate and sand, we have to know what that
provenance is.
And we have to be able to track that.
That is an area that I'm very interested in.
We don't have really great solutions that are holistic yet in construction, but we're moving
in the right direction.
Okay.
So you have a whole bunch of data.
(20:37):
Yes, we do.
You have been moving, you know, you've moved the company towards governance architecture.
And this is partially because of litigation, partially also just for your own sanity.
But there's a third part, which a lot of people don't necessarily talk about unless they start
(20:58):
their AI journey and then realize how far they are behind.
You actually helped fast-forward Boh Bros, so that you could start your journey way earlier
towards AI.
Right?
And so can you talk about, first, you know, why AI?
And then how did the conversation first start?
(21:22):
And then what are the steps that you needed to take to eventually get there?
And then we'll talk some more after that.
Sure.
So let me start with the beginning of our journey, which is around 2019, when we felt very
strongly that we needed to be engaged in machine learning and we needed to understand
what it can do for our company.
(21:42):
And so I brought in someone to look at our financial data and run it against some models
that were trained only on our data.
You know, at the time, relative to today's world, they were relatively unsophisticated, sort of
random tree models, where we were trying to figure out, okay, what is it that will show us
that we're really going to have a profitable job here or not?
(22:05):
And what we found is that even with ten years of data for about a hundred jobs a year, it
wasn't nearly enough.
It was just not enough to really understand and to be predictive more than a
coin flip.
And the result was we said, okay, look, it's not that it's not there, but it's not now.
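As a rough illustration of the kind of experiment described here, and not the team's actual code, a tree-ensemble classifier cross-validated on roughly a thousand historical jobs can be checked against a coin flip like this; the file name and column names are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical export: ~10 years x ~100 jobs/year of financial history.
jobs = pd.read_csv("job_history.csv")  # assumed file and columns
features = jobs[["contract_value", "duration_days", "change_order_count", "self_perform_pct"]]
target = (jobs["margin_pct"] >= jobs["estimated_margin_pct"]).astype(int)  # profitable vs. plan

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, features, target, cv=5, scoring="accuracy")

print(f"Mean accuracy: {scores.mean():.2f} (+/- {scores.std():.2f})")
# With only ~1,000 rows, accuracy hovering near 0.5 is the "coin flip" result described
# above: the signal may be real, but the dataset is too small to learn it reliably.
```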
(22:27):
And so we continued to look at it and continued to investigate.
And about a year and a half ago, we said the world has changed.
Like, the emergence of large language models as a consumer thing became something that was
relevant to us.
We started to see what I think somebody calls shadow IT utilization.
(22:51):
And I saw something that said, like, 70% of people don't tell
their boss that they're using an LLM at work, even though they are, which is a little bit
scary.
And that's, come on, guys, a risk. Any governance risk like that scares the crap out of me.
I'd rather know about it.
(23:12):
And so we first crafted a policy and we said, we've got to have guardrails.
We have to have, this is what you can do.
We have a specific policy that outlines, we don't outline the specific models, but we
outline what models need to do.
And if you have any questions at all, you should talk to me.
And the result is that we've had a lot of really good conversations about that.
(23:35):
And then from that emerged an AI steering committee, not an IT steering committee, but
an AI steering committee, where we are actively talking about what direction do we need to go.
And we've made real, substantial progress in thinking about and understanding where AI fits
and where it does not yet.
(23:56):
I think the key word is yet.
And particularly, I know Will was going to talk a little bit
about training data, but I also want to talk about one of the things that concerns us pretty
substantially, which is the degree to which we find the AI, the LLM results, trustworthy or not.
(24:19):
In a safety-sensitive industry, 97% just ain't good enough.
You know, like, we cannot have models that are suggesting activities where we only feel
97% safe.
It's not sufficient.
And the result is that we've specifically started in areas around project management,
(24:44):
where it's much more around the financials, where we do have a little bit more of a sense
that we can interpret, and it's not going to cause a crane to crash into a building.
But we've also been working in specific areas to look for the spots where it's appropriate.
The other thing that we did, and we've done this, there are sort of two different
(25:08):
divisions in my company that do very different things.
One is this sort of civil construction, you know, utilities and roads and bridges and
pile driving.
And the other is this vertical construction.
And they're very different in terms of what they do and what they accomplish.
And so I think one of the things that's been helpful is focusing on each one of the areas
(25:31):
and saying, that's going to be different, and getting together a group of people who are in
the weeds and are applying AI to what they do every day.
We call these AI innovators, and we meet once a month and we talk about what it is that we're
doing.
And then I meet with my AI steering committee every other week to talk higher level, to talk
(25:54):
about platform, which I'm still not 100% sure of, and directions forward.
And that's ever evolving.
I listened to a podcast this morning about an article about the single-founder company where
he's training AI copies of himself to do all of the work, which seems wacky in some ways, but
(26:17):
also has some interesting potential.
I don't know.
So, and the result has been really positive.
We have some things that are like AI that are built into tools.
There are tools that are well documented and using very well-proven models around schedule,
(26:39):
around this process called takeoff, where we basically have to
measure all of the interior space of a building to see how much paint we need to use, for
example.
And that tends to be a very laborious task and done by very expensive people.
And there are opportunities for us to get real value out of that, not just for us as a
(27:02):
company, but for our owners as well.
And this is where I think, as construction companies, we have to expand our understanding
of value, is the value that we're going to drive is going to have potential implications, not
just for us, but for our subcontractors and for our owners.
And this is where this takeoff example is really interesting, I think, is that
(27:27):
for owners, if you had the opportunity to do a what-if analysis very quickly of moving a wall,
for example, it moves two feet this way or two feet that way.
If I can move it two feet this way and save you two weeks potentially because of structural
things that I don't have to do, that might be hundreds of thousands of dollars in carrying
(27:50):
costs of your construction loan.
And all of a sudden I can make a better decision, a better-informed decision, because I
have the speed and velocity that's provided to me only through AI.
And I'm still getting it through the lens of that really smart person who's doing the final
review and saying, okay, by the way, I think it can save you, you know, two weeks on the
(28:11):
schedule.
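To make the carrying-cost point concrete, here is a back-of-the-envelope version of that what-if with placeholder numbers; the loan size and interest rate are assumptions, not project figures.

```python
# Rough carrying cost for the "move the wall, save two weeks" what-if above.
loan_balance = 100_000_000   # hypothetical outstanding construction loan ($)
annual_rate = 0.08           # hypothetical interest rate
weeks_saved = 2

weekly_interest = loan_balance * annual_rate / 52
print(f"Interest carried per week: ${weekly_interest:,.0f}")
print(f"Saved by finishing {weeks_saved} weeks early: ${weekly_interest * weeks_saved:,.0f}")
# At these assumed figures the savings land in the hundreds of thousands of dollars,
# which is the order of magnitude described in the conversation.
```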
Okay.
There's a lot here to take in.
There was a lot.
Sorry.
It's okay.
It's okay.
So I wanna break this down for the listeners for a bit.
So takeoff, that's done by estimators.
Right?
And estimators are super expensive.
Why?
They require a lot of experience or they require a lot of oversight
And there are not a lot of them.
(28:32):
And there's, yeah.
Exactly.
Not a lot of them out there that have the experience to be able to do that correctly.
Because if you get the estimation wrong, everything goes downhill from there.
Right?
And if you get the estimation wrong sort of the other way, then you don't win
the job either.
Right.
Right.
Exactly.
You gotta hit the sweet spot.
Yep.
(28:53):
So your margins really start from there.
Okay?
You're able to shorten the amount of time that it takes to estimate.
Can you, just for that one example, the painting side.
Right?
Which is a very simple example, but I think a lot of people can understand.
How long would it normally take?
(29:14):
Right?
So give a normal, as like an estimation of time, how long that process would take.
Now, you have someone, and AI isn't doing all of this.
It's AI assisted.
Right?
Correct.
Okay.
Just to be clear.
So AI is just assisting.
So can you give the example of how long it takes in all?
You know, how many people and how long does it take normally?
(29:34):
And then AI assisted.
So for, you know, a building that might be three or four stories tall, if
we did the whole thing, it would probably take a couple of days for a smart estimator to do
it.
But with the AI assist, it goes to hours.
Oh.
And, you know, that is the kind of change that I think will really leverage some, you know,
(30:01):
incredible opportunities for us to deliver different kinds of value.
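Spelled out, the throughput math behind "a couple of days" going to "hours" looks roughly like this; the assisted-hours figure is an assumption, since only "hours" is quoted.

```python
# Rough capacity math for one estimator, using the figures quoted above.
HOURS_PER_WEEK = 40
MANUAL_HOURS_PER_TAKEOFF = 16      # "a couple of days"
ASSISTED_HOURS_PER_TAKEOFF = 3     # assumed value for "it goes to hours"

manual_per_week = HOURS_PER_WEEK / MANUAL_HOURS_PER_TAKEOFF      # 2.5 takeoffs
assisted_per_week = HOURS_PER_WEEK / ASSISTED_HOURS_PER_TAKEOFF  # ~13.3 takeoffs

print(f"Manual:      {manual_per_week:.1f} takeoffs per estimator-week")
print(f"AI-assisted: {assisted_per_week:.1f} takeoffs per estimator-week")
print(f"Capacity multiplier: {assisted_per_week / manual_per_week:.1f}x")
```

That multiplier is the point Justin makes next about the estimator labor shortage: the same person can now cover several times more bids.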
And that's just one piece of the estimation.
Right?
That's just painting.
That's not even
You draw it all and then it gives you the volumetrics and it gives you all the different
types of things.
So it's gonna tell you, you know, how much drywall as well.
Right.
You know, if you think about that and you think, okay, well, I need to make the change.
(30:25):
Oftentimes, like, you're going back to the model and you're like, okay, so I've got to move this
and got to move this, got to divide dependencies here and there.
If you get the AI assist to do it, it's so much faster.
Like, the teams that do it, like, this is
(30:48):
really an incredible, you know, super great example.
And then Wait.
Wait.
Before you go, Will.
I'm also pointing out the labor shortage of estimators.
Now when you cut days to hours, that one estimator now has more power to hit multiple
sites, multiple jobs, multiple things that before, you'd have to hire.
(31:08):
You'd have to find another person or have to wait and delay.
So, you know, everybody talks about how the labor shortage is so crucial in the construction
space.
And this is an example of how AI can aid in that.
How do we find more help when it comes to
And in some ways it's like it's the most clear example of where construction has an arbitrage
(31:31):
opportunity.
Like, if you're in that beginning stage of creating an estimate,
whether it's like a design-build where you're going back and forth with the owner, or
whether it's a hard-bid situation where someone has said, these are the plans, this is what
it's gonna be, and keeps issuing addenda to that.
(31:52):
And you're trying to keep up with that addenda and keep up with that price up until the day that
you have to submit the bid, the hour, the minute that you have to submit the bid.
There are opportunities for AI in all of that to win because you're utilizing a tool that
gives you an information advantage.
Time is really what we're talking about.
It's a speed advantage.
It's not necessarily that it's doing it on its own.
(32:14):
No.
It is not.
Correct.
So, sort of, you call it the what-if analysis.
So let's say a change order comes in, right? You said something about the
construction loan.
Can you connect the dots there between the construction loan and the change order and how
AI helps that?
(32:35):
So the dots that connect there are that when you're doing a change order, you often
have opportunities to do it in one way or another.
In the initial phase, there's this value engineering process where you basically have,
this is what I really wanna build.
And then you get back to, this is what I really can afford to build.
(32:56):
So the value engineering conversation is a long conversation at the very beginning.
But you can imagine that in a change order situation, that value engineering conversation
is sort of forestalled because of the speed at which you need to respond to the change.
Sometimes that speed is because it's a critical-path item and you have to get to it
immediately.
(33:17):
But if you could have the opportunity to do a what-if analysis during a change order
conversation with an owner, then you could conceivably drive a shortened duration to
either removing a delay or actually shortening a project.
And I think that there are opportunities for us to think about how we utilize that data.
(33:42):
Then the owner is saying, okay, I can close earlier.
I can either have that construction loan, you know, if I'm financing it through a
sort of a construction loan process, or if I'm a hotel and I can open two weeks earlier and I
can fill rooms two weeks earlier, I get revenue faster.
So, I mean, it's both of those sides of the coin.
(34:04):
One of the things that is not specifically AI, but is technology related, is a lean scheduling
technique called takt.
That's this waterfall-like technique where you bring in different people.
It's very, you know, very much, you know, Japanese management style, but we have some
technology tools that support that.
And it's incredible the degree to which you can have this flow of the construction process and
(34:28):
flow of information with just really simple tools.
They don't have to be complicated.
They don't have to be super expensive, but they get that process moving and moving more
effectively.
And also reducing the time it takes to use those kinds of tools.
That takt schedule, getting that from a traditional schedule that says, you know,
(34:49):
these are all of the different milestones.
These are all of the different activities.
You know, a lot of those schedules are hundreds of pages long when you do a multi-year project,
and saying, okay, what's the next thirty-day look-ahead for my subcontractor?
You know what?
An LLM can review that and can do the first cut at what that look-ahead is supposed to be like,
(35:11):
and we can feed that into the takt schedule.
That's pretty cool.
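As a sketch of that look-ahead idea, and not a description of the actual tooling discussed, handing a schedule export to an LLM for a first cut might look roughly like this; the OpenAI client, model name, and CSV layout are all assumptions, and a scheduler still reviews the draft before it feeds the takt plan.

```python
from openai import OpenAI  # assumes the OpenAI Python SDK; any chat-capable LLM would do

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_lookahead(schedule_csv: str, subcontractor: str, days: int = 30) -> str:
    """Ask an LLM for a first-cut look-ahead; a human scheduler reviews the output."""
    prompt = (
        f"Here is a project schedule export (CSV):\n{schedule_csv}\n\n"
        f"Draft a {days}-day look-ahead for {subcontractor}: list their upcoming "
        "activities, dates, predecessors, and any likely conflicts, as bullet points."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical choice; use whatever model your AI policy allows
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Hypothetical usage: the draft gets reviewed, then fed into the takt schedule.
# print(draft_lookahead(open("schedule_export.csv").read(), "ABC Mechanical"))
```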
Oh, we talked about all this work.
Why don't we talk for a quick second?
Because work on the AI side is represented by a value.
Yes.
Can you talk a little bit about that?
How does that translate?
(35:32):
And then why is that important?
See, I think this is where we're all trying to zero in, is what's
a meaningful impact for AI in the construction industry?
I think that, you know, like, there's a really good sense of, you know, what's the value of AI
labor, and thinking about it rather than as a tool, thinking about it as a unit of labor.
(35:56):
That's a really great way for us to think about it and then translate it into value.
So effectively it's what value is produced, times some sort of risk reduction.
So, you know, like, we're not getting all the value.
We're getting some of the value, over the amount of money that it costs to basically buy the
(36:18):
token plus, you know, have a human being figure out what the heck AI is talking about.
And just, what is the token?
The token is that particular AI process, whether it is a conversation with a
large language model or whether it's an activity that's created by an AI agent.
(36:41):
You know, it's, you know, what's that sort of unit that
we're basically paying for from one of the providers.
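One hedged way to put numbers on that framing, value produced discounted for risk, divided by token cost plus the human review time, is sketched below; every input is a placeholder rather than a real figure.

```python
def ai_labor_roi(hours_saved: float, loaded_rate: float, risk_discount: float,
                 token_cost: float, review_hours: float) -> float:
    """Return the ratio of risk-adjusted value produced to what the AI work costs.

    value = hours of human labor displaced x loaded hourly rate x (1 - risk_discount)
    cost  = token/subscription spend + the human time spent checking the output
    """
    value = hours_saved * loaded_rate * (1 - risk_discount)
    cost = token_cost + review_hours * loaded_rate
    return value / cost

# Placeholder example: meeting transcription frees a 2-hour note-taking slot.
print(ai_labor_roi(hours_saved=2.0, loaded_rate=85.0, risk_discount=0.2,
                   token_cost=1.5, review_hours=0.25))  # roughly a 6x return on the spend
```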
And one of my gripes right now is all of these tokens being used by companies that are
not really delivering a lot of value with them, but that's a whole other story and I don't wanna
(37:02):
get down that rabbit hole.
But one of the things that we've looked at is we need to have it be a meaningful amount
of change.
And, like, I think the best example where we've really found value that translates and people
understand it is probably one of the least complicated implementations of AI, which is AI
(37:25):
transcriptions of meetings.
And I think that this is probably where people get the moment more quickly than spending time
typing into ChatGPT what they should get their wife for her birthday or
not.
What we found is we have different categories of value that it delivers.
(37:49):
And one of the biggest values that we found is that we had people who were just in
meetings to take notes.
So we had a note taker who was allocated just to take notes.
So that note taker could do one of two other things.
One, if it's an APM, an assistant project manager, who's just sitting in on this
(38:11):
meeting because they need to learn how this goes,
then that's great.
But, you know, then they can not only sit in on the meeting, but they can
actually contribute to the meeting.
And that's a value in and of itself.
Or if it's an assistant project manager, or maybe it's even, like, a field engineer who
wouldn't necessarily sit in on this meeting, we can have them do something else.
(38:36):
We can have them do another, you know, whole set of jobs, whether that's, you know, an
additional safety inspection, whether that's an additional, you know, meeting with a
subcontractor, or perhaps even just doing a remeasure of a concrete pour before we get it
done so that we can get a better placement of our embeds when we put the walls up.
(38:58):
Those kinds of things are really substantially higher value.
I think that that substantially higher value makes a huge difference.
That when we start talking about people delivering value for the organization as
humans, they need to deliver that value in a very direct way.
(39:18):
And we need to, let me put this differently.
They need to deliver that value in a way that is meaningful in chunks of time, because, like,
one of the things that will drive you insane is talking to a software vendor who says, I can
save you five minutes a day.
Are you going to use that five minutes a day to, like, write the next great novel or, you know,
(39:41):
implement some sort of crazy new idea on a job?
You are not.
You're going to talk about what the score was in the Saints game last night or whether
they're going to start this person at quarterback or that person at quarterback,
because we have no idea who's gonna start at quarterback for the Saints this year.
No idea whatsoever.
You should ask ChatGPT what it thinks.
(40:02):
Maybe it has a better insight.
I think they have
a better shot at predicting that than the water cooler does.
But I do feel like we have to measure this in some sort of meaningful numbers.
I apologize.
My watch rang on me.
All good.
No worries.
One of the stupid things about having these, you know, we're multi-connected.
I think that that's one of the best explanations of how we can start uncovering
(40:27):
that value.
But we have to think also more broadly than just our organization, which is why I think the
takeoff example is really very helpful.
You know, as we start talking about our organizations, we need to get beyond seeing the
value that we deliver in a very limited way.
(40:47):
We need to think about the ecosystem, and as technologists, we need to talk about the value
that we're delivering beyond just our organization.
So I think it's really easy to start talking about it from the perspective of us
individually, but we need to be able to say, okay, how does this help our subcontractors?
(41:07):
How does this help our owners?
How does this help our architects or designers to do a better job and to deliver a better
product?
You know, one of the things that's really wonderful about working for an
organization like the one I'm in is that we build things that are really important and that
are really important to be built right.
We're building roads, we are building bridges, we're building things that we have to do in the
(41:32):
right way.
I think it's very important for us to think about holistically how we are benefiting the
community and think about, okay, how can we utilize our technology to drive that community
contribution in a better way.
I assume that's probably a piece of the platform, right?
Like, you're not solidified on the platform.
(41:54):
So obviously I'm not going to ask, well, what are we going with?
Cause you don't know.
However, you've obviously done some discovery there, right?
So what are some of the variables that go through this?
I actually noticed this the other day as we take our AI journey, and I historically have
used ChatGPT, but then also Copilot and Gemini in there.
Seeing the differences, if you ask the same question to each of them, is different, which
(42:16):
is very intriguing to me because I thought, oh, well, it's gonna be a similar concept.
But Copilot seems to be more cautious at giving you creative ideas, as in, they try to be more,
and I don't know.
Maybe you can talk to this.
They seem to be more accurate or try to be more accurate, or it won't feed you just an idea,
where ChatGPT is just gonna give you, well, if you just rode a unicorn into work, like, the
(42:38):
Saints will probably win the Super Bowl.
Right?
Like, that seems like a thing that you
Which is probably true.
That's a thing that you would hear there.
And Copilot would be like, I don't know what to do with that.
What do you mean?
I'm just gonna ask you the question back because this doesn't make any sense to me.
So when you're going through these platforms, what are the differences that you've seen that
you say, well, here is a perk.
Here's something that we're not interested in, and kinda going through those steps.
(43:00):
So first off, I agree with you about Copilot.
I think it's much more conservative.
I think whoever is engineering the sort of hidden prompt behind all of your chats with
The man behind the curtain?
Yeah.
It's really an important thing.
And we saw that last week where OpenAI released their new changes and the hidden prompt was
(43:22):
much, much too agreeable.
It was telling everybody that their ideas were really great ideas when maybe they
weren't.
It was just trying to stimulate business in the country.
It was doing something.
Everybody has
a great idea.
Everybody becomes an entrepreneur.
Which, just as a quick aside, is scary as heck to me because AI's value in my mind, at least one of
(43:47):
the high points of AI for me, is perspectives.
Like, I really feel like AI drives different perspectives for me, and those perspectives
should not all be ones that agree with me.
If I wanted something agreeable, I would get into my social media feed somewhere deep and,
like, relish in all the people saying that I've got the same opinions that
(44:11):
they do, or I'd go to Reddit and just have all of my feelings stomped on for many, many hours.
You know, I feel like ChatGPT, the people who are on my AI innovators group,
they love ChatGPT.
They love the interface.
They love the speed.
(44:32):
They love the options that they have.
They really like it.
We do not love ChatGPT Teams.
It is very difficult and complicated.
Sharing is really not as easy as it should be.
Everything is not easy.
I'm very curious.
I have not used Claude a lot, but the more I get into thinking about this as a coding, you
know, exercise, you know, like really developing code for some of the Python scripts
(44:57):
that I'm trying to utilize in the background, Claude's really good at that.
And it's really surprising to me, and I need to spend more time on Claude for sure.
But I also think that we're looking at providers that are giving us options for
different models and then suggesting which models work right for which kind of situations.
(45:21):
So that's sort of next level.
But there are emerging platforms where you're basically paying by token, you're paying by
time that you're addressing the LLM.
And that's where I think we're going to find some real opportunities for us as we scale to
the entire organization.
Because the all-you-can-eat price that is being established by OpenAI or by Microsoft or by any
(45:48):
number of these different providers, it's too high for most people.
I mean, they're definitely making lots of money off of my daughter, who's a college student
and uses AI, but not frequently.
And I pay for her ChatGPT Pro license because I think she needs to be using it.
But I also think we need to be able to have tools where we can have an all-you-can-eat
(46:10):
price for our people who are really using the crap out of it.
And again, I don't really care whether, you know, XYZ company that came to me this last
week and said, I've got an AI in my interface now.
You can search everything.
No, I don't need that.
I don't want it.
Stop it.
You're just driving up the price of tokens.
(46:31):
Stop it.
I'm not happy with you right now.
Like, I actually have thought about writing on LinkedIn this week, like, how unhappy I am
about software companies that are just AI washing and, you know, throwing up AI as, you
know, oh, we all have to be using AI, when in fact we really don't.
Like, if your product is not core AI, maybe you don't need it.
(46:53):
If you post that, let me know.
We'll put it in the show notes.
I will 100% write it this week because I'm so not happy with this, Justin.
Okay.
So there was a lot to take in there.
Again, well, I tend to be verbose.
Oh, yeah.
No.
Thorough.
I love it.
It's very thorough.
I love There's a lot of thought behind what you're saying.
(47:15):
Some people are not as far in their AI journey.
You've given,
I think, a lot of
people a lot to talk about.
Before I pivot, let's do a little bit of the connection between governance and AI, and the
problems of, if you don't have governance and structure in place, what happens if you connect
(47:39):
AI into that?
Because AI needs data.
Right?
Yeah.
You have the large language models.
So, with large language models, there's a whole bunch of them.
Right?
ChatGPT is one of them. Copilot, which is really ChatGPT on the back end.
And then there's also Gemini, Claude.
I mean, there's DeepMind, whatever it is.
(47:59):
Right?
There's a whole bunch of them.
However, you're going to get generic responses because the data is, as it's just a large
language model.
Right?
So it's generic in the sense that it is not specific to your company.
For AI to be super valuable, you need to be able to ingest the data that is
(48:19):
specific to your company to then, sort of, give it context.
That's the piece that it's missing, the context of specifically your company, to be
able to do the things that would be valuable in sort of interacting or interfacing with that
AI agent.
Okay.
So can you talk about governance, specifically when you introduce your company data
(48:45):
to it, why is that so important?
And can you give some specific examples, even tied to the ones from before?
So the way that LLMs are created is that they have this initial training data,
which is the big, expensive part.
They train on basically everything that they can find on the Internet, and many
(49:07):
things they probably shouldn't be training on, and there are lots of conversations about copyright
in that regard.
But as we all know, most of the data that's held in the world is behind corporate
structures, behind firewalls, in, you know, repositories like shared drives on Active
Directory servers, in SharePoint, and, you know, things like that, you know, data repositories
(49:32):
there.
And these large language models don't see any of that.
So they don't know it; they're not particularly good at it.
Well, you're absolutely right, for it to get better and provide you with better specific
context, you need to feed it.
If you feed it in a way using one of these commercial tools that's just out there for
(49:56):
everybody, or even more like end-user kinds of tools, not even commercial,
ChatGPT says, oh, well, that's great training data.
I can train myself to get better with that.
Well, here's my bid sheet for this particular bid.
And here was my margin that I got on that.
And by the way, that's now available to everybody.
(50:16):
Wait.
So I'm gonna pause there.
So if you're using one of the personal-based models, whether, so I love this quote.
I don't remember who said it first, but I do like using it, which is, if the product or
service is free, you are the
product.
Yeah.
Yeah.
You are the product.
Yes.
That's correct.
Yes.
So essentially, by interacting with one of these large language models, if it's free or if
(50:41):
it's a personal version, it is training off of you and your data.
So if you, like you just said, put a bid sheet with margin in there, and there are many examples in
big companies, like I think Sony or Samsung did this kind of recently.
And then their competitors were able to look up essentially some of the competitive
(51:05):
engineering and some of the competitive pricing that was out there.
Other competitors were able to look that stuff up.
So you don't want to use one of the sort of personal ones, because then privacy is an
issue.
So I'm just going to use ChatGPT because many people know it, or Copilot, where they have
sort of company or enterprise versions, which allow you to access the large language model,
(51:30):
but it doesn't learn off of you.
All that data that it sort of learns off of you stays within your tenant or stays within the
purview of your organization; essentially, it doesn't leak out.
So someone else cannot get at that data.
Right?
And can't get at whatever it learns from there.
Right?
(51:50):
So if you want to kind of continue with what you're talking about.
And so there are different models that sort of have that governance in mind, which is one of
the reasons why some people have been more willing to go with Copilot, because I think
Microsoft had that in mind, whereas, you know, ChatGPT, even if you buy the Pro version, the
(52:16):
default is let it train.
You have to go in and turn that off until you get to, like, the full
enterprise version or the Teams version.
Even if you're just paying for your own license, at least that was the case when I set
it up about a year and a half or two years ago.
And Copilot has kind of gone the other direction and has decided to make it sort of by
(52:39):
default that it's not shared.
In fact, if you have your Copilot app open on your phone, I use Android personally, but same
thing on an iPhone.
There are two different toggles.
There's web versus your Office information.
And so you have a nice dividing line that tells you this is inside versus this is
(52:59):
outside.
And so, like, when you're searching outside, it has no idea what's on my calendar.
It doesn't know what's in my email.
When you search inside, it does the kinds of things that you really like a copilot to do,
which is, oh gosh, what does my day look like tomorrow?
What should I be preparing?
What should I be thinking about?
Which is one of the prompts that I use almost every day. You know, like, I look at the next day, what
(53:23):
should we be preparing for tomorrow?
You know, I really ought to do that for a week.
If I could just get through the next day, I'm doing well.
And it usually says, you look really busy tomorrow.
Like, yes, I am very busy tomorrow.
But I do feel like that's something that, you know, and I've been talking to more enterprise-grade
(53:43):
companies that are offering that level of governance sort of by default with multiple
models.
That is eventually where we're going to probably land.
And it's going to have to be a platform that does it.
But right now I'm building kind of a tiered model in my head of, you know, different
(54:04):
utilization patterns with different people in my company and starting to think about roles
and how they apply.
I have a 60/30/10 model now that I'm playing with, with 60% being really very
much more of, like, just an AI-assistant kind of model, data access that gets to
(54:25):
specific kinds of data.
Then 30% of the people who are starting to explore and starting to do some agent
activities.
And then the 10% of people who are doing wild and wacky things.
And they're the ones that I really am excited about.
I wish I had time to be wild and wacky like they do, but unfortunately I do not.
Yeah.
And I think in the end that 60% is gonna have to be some sort of a tool that does
(54:54):
that by default.
That's purpose-built with the idea of governance in mind and security and all of the
different pieces that, you know, my security admin always tells me, you know, AI will never
be secure.
But then, you know, when you talk about the AI behind the firewall, he's like, yeah,
(55:15):
you know, that's important.
We gotta apply machine-level speed to the machine-level speed that the bad guys are
using.
You know, like, it's gonna be, that represents a challenge, but that's that 30%
where I'm really struggling.
I'm thinking that we're probably gonna stick with Microsoft's Copilot for
(55:39):
now.
But that's just for now, for sure.
And the great thing is I can change that once a year when my Office 365 stuff
re-ups.
Yep.
So, well, to connect the dots, governance.
Right?
So, the initial set of governance policies, were they written by you?
(56:00):
It was me along with a member of my legal team and the CFO; we got together and we talked it
through.
It was good.
And then we looked at what other people who were a little bit ahead of us in other
industries were doing, pulled those together, and built a policy.
So that's a written policy.
Yeah.
Right?
Okay.
(56:20):
So written policy to start.
But then, the eventual goal, I don't know if you have this or not, but the eventual goal is
to have digital enforcement
That's correct.
Of the written policy.
Well, because we know, the whole notion of shadow AI, it exists there.
(56:40):
We know people are using it.
We know people are trying tools that they haven't been authorized to use.
Shadow IT is alive and well in the twenty-first century.
And AI is a big part of that.
All systems are starting to incorporate tools for us to identify and filter those kinds of
(57:05):
conversations.
And we are working on that right now.
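As one hedged example of the "identify and filter" idea, and not the specific product being evaluated, a simple outbound-prompt screen might flag obviously sensitive content before it reaches an unapproved AI tool; the patterns below are illustrative only.

```python
import re

# Hypothetical patterns for data that should never leave the tenant.
SENSITIVE_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "bid margin": re.compile(r"\bmargin\b.*\b\d{1,2}(\.\d+)?\s*%", re.IGNORECASE),
    "salary/bonus": re.compile(r"\b(salary|bonus)\b.*\$\s?\d[\d,]*", re.IGNORECASE),
}

def screen_prompt(prompt: str) -> list[str]:
    """Return the names of any sensitive patterns found in an outbound prompt."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(prompt)]

hits = screen_prompt("Our bid margin is 8.5% and the PM bonus pool is $40,000")
if hits:
    print(f"Blocked: prompt appears to contain {', '.join(hits)}")  # route to review instead
```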
So for any listener out there that's listening to this and doesn't understand the term shadow IT,
in your own words, what is shadow IT?
Shadow IT is the people who think that IT moves way too slow and is way too boring.
(57:26):
And they wanna do what they wanna do, and they go out and they do it anyway.
They've got a credit card.
They're just gonna charge it.
And then, you know, we don't know until a month later when the credit card statement comes
through that they're using a system.
And why is that a problem?
Because there's no review of governance.
There's no review of security.
(57:47):
And we basically are putting our data and our systems at risk.
Right.
As well as, oftentimes my biggest challenge is, I think this is really great for you, but it
doesn't scale to the size of the company that we need to scale to. You saying, oh, it's super
easy for me.
Well, that's great for you individually.
(58:08):
But when we try to scale that to the size of the company, it just doesn't work.
It breaks down under that strain.
It also creates a nightmare for the IT team or IT department.
Their support.
Meaning, if something breaks with it, if something happens with it, who are you supposed
to call to solve that problem?
Typically, that's the IT department.
(58:30):
But if you have shadow IT, you bought it on your own or didn't involve IT in the first
place in strategically planning for that.
Yeah.
This is the too-slow part.
Right?
Yes.
And if
you didn't plan for it, then all of a sudden, you're asking the IT team to solve a
problem that they didn't even know existed until you brought it in at that moment, and
(58:50):
that problem started a month or months ago. IT is not all-encompassing and all-knowing.
Right?
So, I just wanted to kinda go down that bit because the governance side, right, is the data
side.
But if your data is in multiple platforms that you don't even have control over
(59:13):
Yes.
Then you expose yourself to risk.
Absolutely.
So with shadow IT, don't call the IT department.
Call Ghostbusters.
That's what I got
Call Ghostbusters.
That would be great.
Although, in my job, I was told when I came in that anything with a plug is in my domain.
(59:34):
So that includes the coffee maker.
When I worked at a firm before, our IT director had IT director on one card, and he had a
separate card that said anything that plugs into a wall.
Yes, exactly.
That's exactly my job.
Yes.
This has been amazing.
This has been tons of fun.
I feel like we could go for another hour, to be completely honest.
(59:55):
But, you know, what we find is listeners don't wanna listen that long.
So maybe we'll have to do a second episode.
But we like to ask this question to all of our guests.
So we'll ask it to you.
If you could go back twenty years, that's 2005, what advice would you give yourself?
Yourself?
I think that I would give myself the advice that, like, pay attention to the wobbles
(01:00:18):
back and forth, is that it's gonna come back.
Like, every time, you know, if you're making investments in huge systems that
have good power and open APIs and give you flexibility, that's where you should be
(01:00:39):
investing, because eventually people are gonna come back to you and say, wow, you know, all
these spreadsheets and Access databases and everything that I built for myself, I really
don't wanna do that.
I want to just be focused on my job. And, like, I've got an enterprise system for you.
And I've got tools where I can recreate that bit that you did yourself and
(01:01:03):
make it in a way that's supportable, that's relatively inexpensive for us to produce, and
that we have control over.
That's very interesting.
Mhmm.
This is an answer from someone that's been in the construction industry, but that's more on
the tech side.
Very different answer from what we've gotten previously.
(01:01:24):
From the construction side?
Like No.
Buckle up.
Yeah.
It's gonna change.
Buckle up.
Because even in the nine years that I've been in the industry, like, the pace of change after
the first round of cash that SoftBank put in, like, seven, almost eight years ago, it has gone
(01:01:45):
from zero to 150.
It's awesome, but also crazy.
Wow.
We're going to put all your social and all that kind of stuff in the show notes, as well
as the post that you post this week, sometime.
I'm going to do it.
Throw that in. But if somebody wanted to get ahold of you, what's the best way for them to
do that?
You can find me on LinkedIn, I am very responsive on LinkedIn and much more so than on
(01:02:08):
email, honestly.
My email is just insane.
But LinkedIn, if you find me there, follow me for pretty consistent conversations about AI,
but also complaints about people who are doing it the way that I don't want them to do
it.
That's super fair.
Before we say our goodbyes, is there anything else you'd like to tell our listeners?
No, I think if you're starting your AI journey, just start slow and start at the edges, because
(01:02:33):
the edges are going to be where you're going to find the best use cases, and people will start
adopting better from the edges than they will from the center.
Well said.
All right.
This has been a ton of fun.
I hope everybody's gotten a little bit of AI education today.
I know I have.
I love Adam's brain and how he sees it, specifically in the construction space.
(01:02:55):
And until next time, adios.
Adios.
Thank you.
Thanks for listening to Building Scale.
To help us reach even more people, please share this episode with a friend, colleague, or on
social media.
Remember, the three pillars of scaling a business are people, process, and technology.
And our mission is to help the AEC industry protect itself by making technology easy.
(01:03:20):
So if you think your company's technology pillar could use some improvement, book a call
with us to see how we can help maximize your IT and cybersecurity strategy.
Just go to buildingscale.net/help.
And until next time.
Keep building scale.