Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Sam Gerdt (00:08):
Welcome everybody to Road Work Ahead, a podcast that explores the unmapped future of business and technology. My name is Sam Gerdt and I am your host. Last episode we talked a lot about the importance of retrospection in the process of growth and innovation. Thinking back through all the talks I've had over the past several months on this podcast, this idea is the one that stood
(00:29):
out to me as being the most helpful for anyone who's feeling uncertain about the future. In looking back, we can revisit those core values and principles that should be guiding our decision making. We can find and help the people around us who've been falling behind. We can organize and optimize our business processes and our thought processes. So today we're doing something a little bit different.
(00:53):
I invited my friend and colleague, Erin Durham, into the studio to look back over the past seven episodes and talk about the ideas that made the biggest impact for us in our own work and in our own thought processes. We're calling this episode a retrospective. It was an excellent conversation. I hope you enjoy it. All right, Erin, we're seven episodes in, seven interviews,
(01:17):
something like eight or nine hours of content. What's the big takeaway?
Erin Durham (01:23):
I think the big takeaway is that nothing matters like people matter. People come first. It's the most important thing. Technology is always there as a tool to support your people, whether those people are your employees or your clients or
(01:44):
your customers. People are the most important thing, full stop.
Sam Gerdt (01:49):
Yeah, it's interesting. Especially, I noticed this with Will, I noticed it with Mitch, I noticed it with the Strants. The Strants. I noticed it with Mark, and probably everybody at some point. Yeah, AI is a tool.
(02:10):
Yes, then let's talk about people.
Erin Durham (02:12):
We could spend all of our time talking about people.
Sam Gerdt (02:14):
It's really interesting. I was thinking about this this morning. Actually, I remember how I was thinking when we first started the podcast, because I remember writing interview questions and preparing for some of our first shows. I remember the mindset.
(02:35):
I remember how I was thinking about AI. I remember how I was thinking about all of this disruptive technology. I remember how I was thinking about it from that business and marketing and sales perspective. Now I look at how I'm thinking about it. It's completely different. It's evolved and changed in so many ways. Part of that is because two months have passed and this is a
(02:57):
really, really fast moving space. We're really just learning at hyperspeed. That's a big part of it, but the other part of it is getting outside of your own head with it and talking to people who are coming from vastly different places. None of our interviews have been with people who we know very well. I think, with maybe the exception of Will, who I've
(03:19):
had a pretty good relationship with over the last several years, I was not terribly familiar with any of these people. I wasn't terribly familiar with their work. To sit down and talk with them about things and not really know what they were going to say, because we do some preparations, but not a lot. It was really interesting to hear the responses, because often
(03:42):
, especially early on, the response was not what I thought, because it's not what you're hearing out in the world. It's not what you're hearing from these leaders in industry, like Sam Altman, Mark Zuckerberg, and even some of the podcast hosts that I think people are getting
(04:03):
some of this information from. Lex Fridman is one that comes to mind. You listen to that content. It's good content, but you get the impression that AI is one thing. Then you talk to the people that we've talked to. That is not the impression that you get at all. Okay, how so?
There's a mantra that I think has come up multiple times: AI is
(04:29):
a tool. Yes, AI is a tool.
Erin Durham (04:32):
Yes.
Sam Gerdt (04:35):
The way that people would say it, it turned into... it was obvious that it was more than just a throwaway comment. Well, you know, AI is a tool, so XYZ. They would say something of impact, and then they would say, AI is a tool. You could see that there's a deeper meaning behind what
(04:56):
they're saying. In thinking about that, what I realized is, if you're encountering a lot of disruption, uncertainty, in business and life, whatever, what do you run to? You run to the things that you know are always going to be true. Okay, we all have this desire to have absolutes in our life,
(05:17):
anchors, truths that we cling to and run to. Sure. In this particular time, one of the truths that people are uncovering, and, in a sense, making true by doing so, because AI could be something else, we could make it something else.
Erin Durham (05:37):
I think many people
are trying to make it something
else.
Sam Gerdt (05:38):
Many people are, but the people that we're talking to are, in a sense, manifesting this truth. AI is a tool, and that's the mindset that they're fostering in their thinking about their own businesses and their thinking about what's coming in the future. I think that's incredibly important and I think that it's incredibly good.
(06:00):
I think that what they're doing is so good. That reminder, that constant reminder, AI is a tool, takes you to the next step, which is to say, okay, well then, obviously, what's more important?
Erin Durham (06:13):
What is more important?
Sam Gerdt (06:15):
Because when I'm out in my wood shop and I'm making something, I'm not hyper-focused on the tools. I'm hyper-focused on the work that the tools are doing. That's right. I'm focused on the goal of being out there and doing any of this, and that might not even be the work. That might be the giving of a gift, or that might be the use of a piece of furniture for years and years.
Erin Durham (06:37):
Whatever it is, which still comes back to people. Yeah, exactly.
Sam Gerdt (06:40):
Exactly. I feel like that's the big takeaway for me from all of this: just that sense of how my thinking has changed from being so focused on the technology, because we've had to learn so quickly and so much, to now having those
(07:03):
voices rattling around in my brain saying, okay, yes, you have the technology, but what's way more important is, what are you going to do with it? How are you going to apply it? How are you going to improve people's lives with it? Super interesting.
Erin Durham (07:19):
Yeah, because all of this AI comes necessarily with a lot of hype and a lot of fear. I think so much of what we see, at least in headlines and in various forms of content that we're consuming, feeds into that fear-based reaction.
(07:39):
How then, as business leaders, how then, as marketers, do we avoid making fear-based, reactive decisions about the directions that our companies are going to grow? What sorts of mindsets do we need to be adopting? What are the consequences that you can foresee for those
(08:02):
fear-based, reactive decisions?
Erin Durham (08:05):
Can you talk about that a little bit?
Sam Gerdt (08:06):
Yeah, absolutely.
That's something that I've been actually thinking more about, in separate things that I'm doing: this idea of, what is the value of speculation? Where does it fall? How useful is it? I lean towards it not being useful. What I keep telling myself is, speculation is not productive.
(08:30):
Where that really helps is, sometimes when you're working through a problem, you'll end up getting hyper-focused on something that isn't true yet. Maybe it will be true one day, but you can get hyper-focused on an unknown future. When you leave the realm of what's applicable and dwell too
(08:52):
long in that realm of speculation, you cease to be productive. You cease to think productively about the problem that you're working on. I've been trying to make a lot of analogies recently to try to understand where we're at. One of the analogies that comes up in my mind is, back in the
(09:12):
50s, when Sputnik went up. All of a sudden you had this strange thing, this strange new technology, that caused a ton of fear. It caused a ton of uncertainty about the future, very reminiscent of what we're dealing with with artificial intelligence. Sure, but the entire culture changed with that event.
(09:36):
One of the changes that happened was, you had all the science fiction all of a sudden just flood into the market and capture the imagination of children and adults. Mostly children. You had comic books, you had all the superheroes that all really flourished in that environment. I'm thinking of all of the Ward and June Cleavers of the world back then who were telling their kids to get their head out of
(09:58):
the clouds and focus on what's right in front of them. There is so much good value in both of those things, both letting your imagination run wild with head-in-the-clouds thinking, that science fiction thinking, and saying, okay, yeah, but now it's time to go into the real world and deal with
(10:20):
real world problems. And I think now what I'm seeing, what I want to see more of, is, yeah, let's have the head-in-the-clouds thinking, but let's label it as such. This is a comic book, this is science fiction. Let's clearly demarcate what is real and what's not, because that's not happening.
Erin Durham (10:40):
So keep that healthy perspective on speculation. I like what you said. You used the word hyperfocus, and I think that's important to pull out, because, as business leaders, it is our job to keep an eye on the horizon and anticipate what's coming and how we're going to navigate through, because we have people relying
(11:03):
on us. Our employees rely on us, our customers and clients rely on us, to be able to navigate that with skill and with accuracy. But that word hyperfocus is so important because, like you're saying, it is easy to get pulled off track and live in a fictitious future.
Sam Gerdt (11:23):
Yeah, you need that balance. You need the Ward and June Cleaver on your shoulder saying, this is awesome. Take it to the real world, though. Like, pay attention to what's in front of you. And we get this a lot, sticking with AI, thinking about the coming of AGI, artificial general
(11:45):
intelligence, or, even beyond that, superintelligences. A lot of people are spending a lot of time there. And what they need to understand is, we're not there. Like, that's not a real thing yet. It doesn't exist yet. We're not even 100% sure that it can exist. And even if it does exist, let's not forget that first word, the artificial.
(12:07):
Whatever it is, it's fake, it's not human. It is fake.
Erin Durham (12:13):
Yeah, that's an
important one.
Sam Gerdt (12:15):
It might be super capable, but it is artificial, it's not real. And there are a lot of people who are going to yell at me over thinking that, but I think that's super important. And so that head-in-the-clouds thinking, it's helpful, because we are going to get more and more capable
(12:35):
intelligences that have the appearance, more and more, of human intelligence, even human superintelligence. And so when you're thinking about that, though, taking it and applying it to the real world that actually exists, and saying, okay, well, we have an opportunity. Now, knowing that this might be possible in the future, knowing
(12:57):
that this is the direction we're headed, how will we apply it? And that's where those mantras, AI is a tool, are so important, because what that is, that mantra, that's Ward and June sitting on your shoulder saying, okay, how are you going to use it
Erin Durham (13:12):
to do good, right? Let's keep this healthy perspective, yes.
Sam Gerdt (13:16):
And so there's so much that you could unpack there. It's so rich, especially when you're talking about reaching people and making human connections, marketing, sales, whatever it is, even just living as people. And that's where our conversations, I think, have shined the most.
Erin Durham (13:33):
Because marketing and sales should be about making those real human connections, and we want for them to be about that. And so the use of the technology comes down to how. How can we make technology serve that mission? Which Will talked about, you've all talked about, you know, how do we use
(13:57):
these tools that are emerging to serve the mission? How can we focus on that mission, regardless of what technology comes out, right?
Sam Gerdt (14:05):
Yeah, think about it in the context of your brand. We talked a lot about brand. Think about it in the context of your brand, your brand. Like Will said, your brand is a proxy for who you are, right? An analogy, I love analogies, an analogy that I use is: if you were going to write a letter from the CEO, would you hire an intern to write it?
Erin Durham (14:31):
Right?
Probably not.
Sam Gerdt (14:32):
Absolutely not. Much less should you outsource that kind of branded voice to an AI, to an LLM, which has even less ability to think in a human way and to make human connections and to know you than that intern does.
Erin Durham (14:50):
Right, because, as you've said many times, it can't actually think, it just does a really good... It's artificial. Exactly, it's artificial.
Sam Gerdt (14:58):
It's artificial. It's a really good faking of thinking.
Erin Durham (15:01):
It can be kind of astounding sometimes, to experience it and to play around with it, and it certainly appears like it's thinking, but it's not, and it doesn't have a background of human experience and brand development. And brand, even brand maintenance, I think, is an art, and art still belongs to humans. For sure.
(15:21):
Regardless of how much that artificial intelligence can fake it.
Sam Gerdt (15:25):
If you think your brand is meant to do something other than convey an emotional response, like to elicit an emotional response from a person, then you don't understand brand.
Erin Durham (15:34):
Oh, that's so powerful. Can you say that again?
Sam Gerdt (15:38):
Yeah, okay. I mean, just let's dwell on this for a second.
Yeah, the purpose of brand, the singular purpose of brand, is to elicit an emotional response from your consumer, right, from your customer. And the fundamental flaw with any artificial intelligence is always going to be, not just now,
(16:00):
but always, no matter how far this goes, is always going to be the absence of emotion. Right. And so your brand is always and forever going to be a distinctly human endeavor. Yes, it can't be anything different. And so, even speculative thinking, we're
(16:24):
never to a point to where your brand should be developed by an artificial intelligence. There's just no way forward to it.
Erin Durham (16:34):
I would even add to that and say, your brand shouldn't even be maintained, your relationships with your customers should never be maintained, by any sort of artificial intelligence. Because, I mean, think about it: if you've put in the legwork to develop those relationships human to human in the first place.
(16:54):
Can you imagine the breach of trust that's going to be experienced by your audience if a time came where they found out that the humans in that brand couldn't even be bothered to maintain those relationships themselves, and they ceded that control, yeah, over to something that is still a thing? It's artificial, it can't even, it doesn't think.
(17:16):
It's not a true representation of a human, and trust is precious. It should be treated sacredly. Mm-hmm. And AI cannot be trusted.
Sam Gerdt (17:27):
Yes, to do that. Yeah, there's a distinction to be made between augmentation and replacement. Sure. So the expectation, and this is a healthy expectation, the expectation is that artificial intelligence is going to greatly augment the work that we do. Yes. Like, every single person in our industry is going to have
(17:51):
AI underpinning the work that we do, right, and that's going to take a lot of different forms, and some people are going to do it poorly and some people are going to do it really well. And then you talk about the loss of jobs. You know, Paul Roetzer talks a lot about this. He's the guy who's talking about artificial intelligence
(18:11):
in the marketing realm, right. There are others, but his is certainly one of the loudest voices. He talks about this idea that AI is not going to replace you. Someone who can use AI better than you is going to replace you.
That's what we're talking about when we're talking about augmentation. AI is going to be this incredible, incredible tool that,
(18:35):
when wielded by a capable person, the exponential increase in output and quality, yes, is going to be... It's going to mean that one person can do the job of... First it's going to be one person can do the job of two people, and then it's going to be one person can do the job of four people. Then maybe eight people.
(18:56):
That's where the loss of jobs conversation, I think, is more realistic. It's fear-based. But then you have to ask yourself, okay, well, what's my job now? Well, your job is to learn. Your job is to know. Your job is to be the guy who can wield it, and there'll be degrees of that, you know. There'll be the person who, you know, can do the work of two people, and then there'll be the person over here who's doing
(19:16):
the work of 16, because they wield it with different skill. So I think that's a line that we need to draw between, you know, what is augmentation and what is replacement. There are all kinds of those things that come up when we talk about AI. I want to leave AI and talk about some of the other
(19:36):
stuff, but I'll say this: originality is not creativity. People need to separate those two. Okay, so the outputs of an AI are original, but they aren't creative, right? So people are kind of conflating the two. It's like, oh well, AI can be very creative because it's putting out this original stuff. Originality and creativity: two different things.
(19:57):
The other big one that I want to mention is, intelligence doesn't equal dominance. That's where a lot of fear comes from. Okay, well, if AI is more intelligent than humans, then it will be more dominant than humans. Absolutely not true. Intelligence and dominance don't equate to one another, and this is the difficulty, because historically that's always been
(20:21):
the case.
Erin Durham (20:21):
Well, I would say
that the history of science
fiction gives us that message.
Sam Gerdt (20:25):
Yeah, the history of science fiction gives us that message. Historically, we look at, you know, biologically, in our world, the more intelligent species are top of the food chain. That is how our world works. The difference is, now, all of a sudden, we are making the intelligence, and it's a fake intelligence, an artificial
(20:46):
intelligence.
Erin Durham (20:47):
Well, it's not a species. Yeah, it's not a species.
Sam Gerdt (20:49):
It's not. Biologically, it's not competing. You know, there's no process of competition in the ecosystem to determine who's on top and who's not. We're literally giving it the power that we want it to have, and then there's some fear that, because it's more intelligent than us, it will be able to take power to itself.
(21:10):
Right, fears largely unfounded. So, you know, separating out that idea of intelligence and dominance, super important. All right.
Erin Durham (21:18):
Well, let me ask you this, because we're talking about fear-based reactions. We're talking about, we've touched on, the loss of jobs, and I've heard you say personally that jobs are not going to disappear. They're going to evolve. There will be different jobs. So help me understand, and let's dig into a little bit, the responsibility that those of us who are sort of
(21:42):
understanding, learning this emerging technology right now, what responsibility do we have to those around us, whether that's co-workers who have some competence but have some hesitance about all of this new stuff that's coming, or they are afraid that they're gonna lose their jobs. How can we help alleviate that fear?
(22:03):
And also, how can we help those around us? How can we give them a leg up? How can we help them up the ladder behind us so that they aren't left behind and losing out on jobs?
Sam Gerdt (22:14):
Yeah, so let's relate this back to what we talked about with the Strants, with Megan and Lorian Strant. We were talking about it in the context of the diversity of a workforce, you know, the humanness of a diverse workforce, and in that discussion, what we highlighted was, a lot of
(22:36):
those fears, a lot of those insecurities, a lot of those concerns are unvoiced, right, and a lot of the limitations that people have, that we have labels for, whether it's ADHD or ASD or some other neurodiversity, are
(22:56):
either undisclosed or undiagnosed. And so the bottom line is, we are not doing enough to uncover the diversity of our teams, to uncover the diversity of our workforce, and we're not doing enough to create an environment that allows for people to work in a way that feeds their strengths, yes, and
(23:17):
that allows them the freedom to talk about strengths and weaknesses. People can tend to be fearful when really simple things come up, like, okay, so you weren't paying attention in a meeting, or you got distracted in a meeting and you missed the next action step. Mm-hmm. Are you gonna be the one who speaks up and says, I'm sorry, I
(23:37):
got distracted, can you repeat that, what's expected of me? Or are you gonna be the one who goes back to your desk and just says, okay, now I've got a job to do, but I don't fully understand what it is? Or, I'm missing a key piece of information that I'm just gonna hope all works out for the best. You know, that shouldn't be necessary. We should be the kind of people who say, you know what, people lose attention, or, you know, the kind of people who say, yeah,
(24:02):
sure, it was XYZ, this is your job. If we were to do that, we would uncover truths about our organizations and our teams that I don't think we fully understand, right, and those truths will walk us back to the point to where we can move forward together and be much
(24:24):
more cohesive, much more capable, mm-hmm, individuals and organizations. Yeah. And, you know, we can unpack this in multiple different ways too. So you think about it just from an interpersonal standpoint. Like, you know, people have needs, people have things that bother them, people have bad days and good days. You could unpack it there, mm-hmm.
You could also unpack it with regards to, like, straight technology. Like, what's the point of applying a technology to a situation that you don't fully understand? It just adds to chaos, right? We're seeing more and more of that. People are like, oh, AI's ability to process our data and surface all of these insights is going to be revolutionary for us, and
all of these insights is goingto be revolutionary for us, and
(25:07):
the truth is, it's probably notbecause your data is a mess,
because you've got you've got 15years of data that you've just
been throwing in a closet andit's not organized and you've
got, do you know?
Duplicate data.
You've got confusing datawithout context.
It's a mess.
There's a huge opportunityright now for, I Think, agencies
(25:27):
like us to to start workingwith Clients on, you know,
getting ready for that AI oracleyou know, I've talked about
that with Chris, chris Leonie.
He really wants that for hiscompany.
His is a, his is an agency.
I think that will be ready forit.
He's he's a very data centric,mindful guy.
But there are a lot ofcompanies out there who are like
(25:52):
oh yeah, I would love that, butthere's so much legwork that
has to happen before that's evenfeasible because these
intelligences they can't.
They can't.
They can't necessarily clean upyour data right and then make
it actionable.
They're there to make cleandata actionable right.
Erin Durham (26:07):
They can't make human decisions about what's valuable and what's not valuable. Yeah, and, Lorian was talking about this, what's the most valuable version of a document that's been saved?
Sam Gerdt (26:18):
Yeah, and that's super important. So that's another way that we could unpack this idea of really capitalizing on the diversity of a team. Really, you know, really working to understand what we have in our corner and then aligning everybody in the right direction and getting everybody working together. But, you know, I think the big takeaway there is, we have to
(26:43):
look back. This show, mm-hmm, this episode, is a retrospective, right? We are looking back on what we're learning so as to move forward more thoughtfully. Right? That's the whole point of this. Yeah, organizations absolutely have to do that. Yes, you need to look back and take stock of where you've been, what you've done, what's working and what's not.
(27:03):
How are your people doing, all of that stuff, before you jump into the next thing? And there's a whole wave of next things coming, sure, that you're gonna want to jump into. Yeah, and they're valuable, they'll be good, but they won't be good if you jump into them prematurely.
Erin Durham (27:18):
Right. I would even, if I can, just bring it back to the people element for just a moment, because you talked about this so much with the Strants, with the neurodiversity conversation. And I would even challenge business leaders to ask themselves really honestly: are you tapping into the strengths that your people have as individuals and accommodating
(27:38):
their working styles in order to squeeze the best out of them? And are you making provisions for the weaknesses that people have? And that's not to suggest that people shouldn't improve themselves, yeah, but, I mean, we've done a lot of work around this in our own company. How do we balance each other out, and how do we tap into that?
(27:59):
And I found it really personally meaningful, Lorian mentioned this, and I think you talked about this too, when you received your ASD diagnosis, that you experienced a grief period. And I think Lorian's words were that he came in and broke down in tears and said, I have to be
(28:21):
medicated for the rest of my life just to exist in this world, and I found that so sad and vulnerable. But also I think it's really beautiful, because when you receive these diagnoses, and, I mean, to even get to the point where you are chasing a diagnosis, you've had to develop so much self-awareness, and you've had to
(28:44):
develop skills around speaking up for yourself, advocating for yourself, and shining a light on things that are vulnerable and may have caused you shame, right? And maybe there's a lack of that in your teammates and your co-workers, or maybe there's fear.
(29:05):
I mean, we know from all of the work that Brené Brown has done over the last decade that humans are hardwired for love and belonging, and we have this inherent fear of being found unworthy of belonging with our human connections. And then, when you throw someone's livelihood on the line, yeah, to where they feel like they can't speak up about things
(29:26):
that they don't understand, or they feel like they're gonna be judged for getting distracted or for not working the way that they perceive that their co-workers are working. That can create a lot of toxicity in a team, and I think it's really beautiful for neurodivergent people to bring
(29:47):
that self-awareness and bring those language skills of, hey, I have these needs, can we please find ways to accommodate them? And even Megan said this, so many times she'll speak up and somebody will go, I'm so glad that you said that. And so there's a challenge on individuals to develop that self-awareness and to step out in that sort of courage to speak up for
(30:09):
themselves, and there's also a challenge to business leaders to make space for that and provide the kind of working environment where people aren't going to get punished for saying, I'm sorry, I was distracted. Can we please clarify this, right?
(30:30):
Yeah, it's funny.
Sam Gerdt (30:32):
I think, as evolved as we like to think that we are, and as streamlined as we like to think we are with, you know, managing teams, and even with our knowledge of things like neurodiversity, I think we're
(30:53):
still really early days. I agree with that. We're still not having the right conversations or mindsets around these issues.
Because there's an analogy that I saw recently, and I loved it. It made so much sense; it was such a healthy way of
(31:15):
understanding the conversation around neurodivergence, and it was this woman. I'm gonna forget it; if I find it again, I'll put it in the notes for the show. Just imagine a bin full of Legos, you know, and not any crazy space Legos or Harry Potter Legos, but just normal, like 1980s Legos that we grew up with, right? A bin full of
(31:37):
Legos, and you take it and you dump it out on the floor, right in one central spot. The majority of the Legos are gonna land in a pile, and it's gonna be, you know, all, like, there's like five colors, right? There are gonna be all the different colors of the Legos right there. But then there are gonna be the Legos that scatter, yeah. And if you look at all of those Legos that are
(31:57):
scattered around that central thing, that's what we mean when we talk about neurodiversity. We're talking about people who fall much further outside of the middle, in the way that they relate to one another, in the way that they think, in their abilities. Their abilities to, you know, do things like time management, task management, all of the executive functions, all of the,
(32:18):
you know, empathy, and all of these other things that we can talk about. They're all kind of in this halo around that central space. But then you look at the colors of the Legos and you see, you know, there's a yellow one over here, and the colors are kind of like the labels that we attach to these neurodivergences. Maybe it's ADHD, maybe it's ASD, whatever it is.
(32:40):
And so, you know, you say, well, you've got a yellow one over here, but there's also a yellow one up here, and those are two people with ADHD, and they're miles apart from one another, but they both have this label that works in helping them work through their struggles, right.
Erin Durham (32:57):
Yeah, I love that quote that you mentioned in that conversation, and you can tell me who it's attributed to, but it's: if you've met one person with autism, that just means you've met one person with autism.
Sam Gerdt (33:08):
Yeah, Tony Attwood. Tony Attwood, he's one of the leading guys on autism and ASD, very helpful early on for me, and for Lorian too. He's the one who mentioned him. So, yeah, that's the idea: you can't make a box out of ADHD.
(33:29):
You can't make a box out of autism spectrum disorders or dyslexia or any of these things. These people are all vastly different from one another, right. But if you think about humanity, if you think about your workforce, your organization, as that dumped-out bin of Legos, you have to understand that there's no point
(33:51):
on this, there's no point on this spectrum, and there's no Lego that doesn't have strengths, right, that can help your organization, right? I mean, as long as you're hiring well. Obviously, there are just bad people out there. Or bad fits. Bad fits, yeah. But generally, when we're talking about neurodiversity, everybody's got their strengths
(34:16):
and everybody's got theirweaknesses, and if your focus is
well, I'm just going to focuson the ones that are right in
the middle.
They're the most predictable,they're the easiest to meet.
If your focus is there, you aremissing out on some incredibly
talented people.
Erin Durham (34:28):
Great.
Sam Gerdt (34:29):
I mean, you can do that.
There's nothing wrong with doing that.
That's one approach, sure, and there are organizations that will do that, yeah.
But they are, they're penalizing themselves in a major way.
I agree with that.
Erin Durham (34:44):
I think humans are wired that way.
The human brain wants to put labels on things and it wants to sort things into boxes, and we see that every single place you ever look.
But I recently ran across a quote that was attributed to Dan Levy.
I don't know for sure that he said it, I didn't hear him say it,
(35:05):
but I tend to think he probably did.
And it was that whenever you think of any of these traits as being a spectrum, that opens the door to empathy.
And I think that's the call that needs to go out everywhere: we all need to show each other a whole lot more empathy.
And, like you just said, if you can do that and you can
(35:27):
accept people for how they are and give them the space to be their best selves, you are going to have some amazingly talented people doing incredible work for your organization.
Sam Gerdt (35:38):
Yeah, I'm really pleased with the amount of work that's happened in the last 10 years.
Particularly, I think ADHD has gotten a lot of attention very recently, and that attention is sometimes negative.
So there are those who are just jumping on a bandwagon and
(35:58):
trying to take advantage of a perceived trend.
I'm not a fan of that.
I am a huge fan of the awareness, because what it's doing is it's highlighting some of these things that we're talking about.
It's highlighting the fact that ADHD is not a box.
There's no neurodiversity, there's no condition, there's no disorder,
(36:18):
there's nothing in any of the books, that is a box, right?
And the number of people who are neurodiverse in one way or another, it's a much greater number than anybody realizes.
And nobody that I talk to who has a label, who has a disorder or a diagnosis, whatever it is,
(36:42):
none of us are necessarily interested in being treated a certain way, or having any kind of special treatment applied to us, or special rules applied to us.
Nobody's saying, well, you know, we need to have diversity hiring for neurodiversity, right?
Like, nobody's saying that.
(37:04):
What we are saying is, please use me as an example of the simple truth that everyone is different, everyone works differently, and that's good.
And if you capitalize on that, you're going to get more value out of us, you're going to get more loyalty from us, and you're going to make the world and your
(37:28):
workplace a better place, a better place to be, to exist.
I think that's super important, I agree.
So that's where I think we're evolving to in our thinking: this idea that, okay, rather than focusing on labels and boxes and diagnoses and, you know, let's have a program for this person and a program for this, let's just take a step back and
(37:48):
just embrace everybody and learn how to identify strengths and weaknesses in people of all types, right, and then use those effectively.
And that requires a great level of adaptability, understanding, empathy.
There's all kinds of stuff that goes into that.
Erin Durham (38:05):
Yeah, I would love to see the language change, and maybe it will, from a diagnosis to just an understanding.
Does that make sense?
Because I feel like there are some negative connotations to the word diagnosis, and I don't mean to cheapen anything that anyone's experienced.
Everything is a spectrum.
(38:25):
Everyone has different needs, different working styles, everyone's brain is wired a little bit differently.
And so I would love to see more of an embracing of all of that, like you just said, even to the point where, in years to come, we're changing our language around it from the place where we're at right now.
Sam Gerdt (38:41):
I actually like the idea of diagnosis.
Okay, as somebody who was diagnosed with something, I found it incredibly helpful.
Sure, I would say that most people with a diagnosis will say that the diagnosis is not a finish line, it's a starting line.
Yeah.
A diagnosis doesn't describe you
(39:02):
to a T, right, it never will, but it gives you the right lane to start in to discover so much about yourself, and people love discovering truth about themselves.
Erin Durham (39:15):
It's so helpful.
Sam Gerdt (39:17):
The grief that comes with a diagnosis, a lot of that is rooted in missed expectations.
But the freeing part of a diagnosis is, now, all of a sudden, you have an accurate understanding, a more accurate understanding, of yourself and of so much that you struggle with.
There are times in my life before a diagnosis where the
(39:45):
pain of not understanding myself, the confusion that comes from that, and probably not even being able to communicate it.
It can feel like your brain's on fire.
Yeah, it can feel like your brain's on fire.
It can feel like you're just locked inside yourself at times.
The diagnosis kind of gives you a key.
Sure, it gives you a key, and you unlock the first door, and
(40:09):
there's a journey, I use that term loosely, but there's this process that starts to happen.
You start reading all these books, and you start meeting people who have similar experiences to yourself, and things just start to make a lot more sense.
Erin Durham (40:23):
And suddenly you're not alone and you have some language to communicate with others.
Sam Gerdt (40:30):
It gives you a vocabulary.
It does, it gives you a vocabulary.
Erin Durham (40:34):
And it gives you a set of tools that you can experiment with.
Yeah, absolutely.
Sam Gerdt (40:38):
And yeah, it gives you, like, there are, like, everybody's different.
But you know, I think there are certain areas where you can relate to one another very greatly.
You know, like, for example, I had an Asperger's diagnosis.
This was back when Asperger's was still a thing.
They've kind of redefined it since.
(40:59):
But a big part of an Asperger's diagnosis is, you know, they want to measure how empathetic you are.
Erin Durham (41:08):
Yes.
Sam Gerdt (41:10):
I am not very
empathetic.
Erin Durham (41:11):
Yes.
Sam Gerdt (41:13):
I'm not very empathetic, but there are other people with Asperger's who are incredibly empathetic.
That's not, like, a make-it-or-break-it thing when you're dealing with Asperger's.
Same with, like, OCD, obsessive-compulsive behaviors.
That's a big part of an Asperger's diagnosis.
I should say it was a big part of an Asperger's diagnosis.
(41:35):
I have very little OCD in my own life, but I know other people with Asperger's for whom OCD is a huge thing.
It's actually like a barrier for them.
It's something they have to deal with.
So there are all of these huge differences within these individual diagnoses, but there's still so much that you
(41:57):
can relate to one another on.
There's so much where you're like, oh yeah, that's what that is.
That's why people responded that way.
I get it now.
And that diagnosis can be so freeing in so many ways, and I do still think that it's good to have it.
Erin Durham (42:15):
Yes, I wasn't
suggesting that people not find
those answers.
Sam Gerdt (42:22):
You just have to be
careful with what you do with
the label.
Yes, and I like the label.
Be careful what you do with it.
Yes, that's good.
We talked a lot, too, about responsibility.
Yes, and we talked a lot about personal responsibility, and
(42:42):
this is an interesting one, because I think the average person in any business or organization is looking at cybersecurity.
They're looking at artificial intelligence, data privacy and protection, data collection, all of these things that we're talking about on the show, and they're saying that's somebody else's
(43:03):
problem.
I'm not the IT guy, it's not my problem.
Erin Durham (43:06):
Well, as someone who can be overwhelmed by technology, who can be overwhelmed by features and benefits with new platforms, I mean, I would even confess to maybe being a little cavalier about who was getting my data and what they were doing with it, and just kind of assuming, as I think Christy was the one who
(43:26):
mentioned, yeah, we talked a lot about that, we can kind of assume that, well, this is a reputable company, I'm sure they're doing what they're supposed to be doing.
And with the emergence of this AI technology that can crunch so much data, suddenly we can't really lie to ourselves that we're part of aggregate data and nobody can really identify us
(43:48):
anymore, and we can't really lie to ourselves that we're just part of the white noise anymore.
So what do we need to be doing about that now?
How can we protect ourselves?
How can we help to protect those who are more vulnerable, who don't understand it even as well as we do?
Sam Gerdt (44:06):
Yeah, that was such a good conversation.
There have been several interviews that I've done where the intended line of questions followed a certain path, and then what people actually heard was something completely different.
That's perfectly fine.
I love that.
Christy's was one of those.
(44:28):
I think Christy's was one of those interviews where I wanted to talk about how small businesses protect themselves, and what we ended up talking about was that people need to be more thoughtful in all of these things: individually taking responsibility, being more thoughtful in taking care of
(44:50):
themselves and their own data, and making sure that they are doing right by themselves, their families, their children, their employers, and then looking out for others to help them do the same.
Erin Durham (45:04):
Yeah, I know you personally feel like we have a responsibility to protect the vulnerable in this specific space.
Can you talk about that some?
Sam Gerdt (45:12):
Sure.
I have a soft spot for children and elderly people in this, when we're having this discussion, and the reasons for those two demographics are different.
So for elderly people, I'm gonna use a personal example.
(45:35):
So I have grandparents who are still alive, and they're in their late 90s.
They were in their 70s when home internet became a thing.
Just let that sink in.
They were in their 70s when AOL, when you went to the grocery store and got the AOL CD, and a personal computer cost $2,500
(45:57):
and it was, you had your Gateway 2000 and, like.
Erin Durham (46:00):
Can you explain, for the young people, what a CD is?
Sam Gerdt (46:03):
Yeah, so they were in their 70s when that happened, and I think, at that point, I was probably eight or nine years old, and I think I was the one, or my brother was the one, who set up home internet for them.
It was that kind of thing.
Like, there was this disconnect.
They were in their 80s when social media came along.
Erin Durham (46:27):
Yeah.
Sam Gerdt (46:30):
And there's so much time back there, and now they're in their 90s, and their attitude now is the exact same as their attitude back in the early-to-mid 90s, right?
They're not interested in this technology.
Erin Durham (46:47):
They're interested
in it to a certain degree.
Sam Gerdt (46:50):
So, for example, with social media, this irked me then and irks me now.
My grandparents, they're having great-grandchildren at that point in time, and they love photos, and they want photos of kids, and they want to fill their house with these printed photos, and
(47:10):
you're saying, no, Grandma, you gotta get on Facebook now to get the photos of the family.
That is such an injustice to that group of people, because Facebook was not built for them.
Facebook is not a healthy place for really anybody to be, but it's not a healthy place for that to happen, and it's not
(47:33):
better.
It wasn't an improvement.
Erin Durham (47:36):
Yeah, Facebook is another one of those things.
Just to circle back to the example you gave earlier, Facebook is a tool, and nobody seems to keep that perspective.
Sam Gerdt (47:46):
Well, there's a huge conversation right now that's happening, where we're saying, how do we keep what happened with social media from happening with artificial intelligence?
Because social media was just given to us, and there were no rules and there was no regulation, and it was bad.
Erin Durham (48:03):
And your language about it being a place to be.
Facebook is not a place to be, but people treat it like it is.
Facebook is a tool for communication, but it's not been treated that way.
So back to it being an injustice.
Yeah, let's get back to that.
Sam Gerdt (48:21):
So I look at protecting the vulnerable, when it comes to things like cybersecurity or even data privacy, as meaning that we need to recognize the strengths and weaknesses of any demographic.
We need to recognize the implications of the usage of these technologies.
I mean, I can't tell you the number of elderly people who are being scammed on Facebook and other social media.
(48:45):
Like, it's not a safe place for them to be.
They're not familiar with the technology, and obviously I'm not talking about every elderly person in the world.
I wanna make that clear.
But generally, we need to be very aware that there's a whole host of people who are being preyed upon, they're being preyed upon, and they're also being unjustly
(49:10):
treated by people who should not be treating them unjustly, and there needs to be more awareness around it.
I don't think people are necessarily doing it on purpose.
We all get wrapped up in our own lives and we leave people behind, and we do it inadvertently.
Yep, that's an area where we've left people behind.
Sure, and I've seen that.
I don't like that.
It's not, I mean, as hard as it is.
(49:30):
You gotta go buy the little photo printer and you gotta connect it to your cell phone and you gotta print your pictures.
Yeah, this doesn't have to be hard, and you can do that online.
You gotta put them in an envelope, and you have to go to the grocery store and buy a book of stamps, and you have to write a letter, and you have to put it in the mail, and you have to send it to people.
Erin Durham (49:52):
The post office still exists, folks.
Sam Gerdt (49:54):
It's hard.
I can't believe that people did this every day not that long ago.
It's hard, but you have to do it.
It doesn't have to be, I know.
You can make a habit out of it.
It can be enjoyable.
Erin Durham (50:06):
It's about being
intentional about the people who
matter in your life.
Yep.
Sam Gerdt (50:11):
And then there's children, and the issue with children is that we're not teaching them, we're not diligent enough in teaching them the implications of their actions, and we're not looking out for their best interests.
Again, this is not a problem that everyone has, right, but it's something that we need to be aware of.
Erin Durham (50:29):
Yeah, it's
something we're seeing a trend
in.
Sam Gerdt (50:31):
You can't give a child a device, whether it's a phone or a tablet or whatever.
You can't give a child a device and, even with parental locks and parental protections, just let them use that device without an education to go along with it.
There are a number of different directions from which you
(50:52):
could look at this.
There's the mental health impact of giving a child a device in the first place, oh, for sure.
There's the mental health impact of the things that are on the device, like social media, the bullying that can happen there, the interactions, the social interactions that can happen there, but also, like, the addictive algorithms.
And you wanna talk about issues with dopamine and attention and
(51:16):
all of that.
Erin Durham (51:16):
Oh yeah, talk about
that for just a moment.
Sam Gerdt (51:18):
So there are a whole host of people in the world who have the appearance of a neurodiversity, whether it's ADHD or whatever, who don't actually have that.
They're just addicted to their phones.
They're just addicted to the dopamine hit of TikTok or YouTube or whatever.
Erin Durham (51:33):
Which are designed
specifically to keep you engaged
on that platform, unable tostop scrolling to the next thing
.
Sam Gerdt (51:42):
This is an area where AI could hurt us even more if we let it, because the algorithms now for social media are incredibly primitive.
Erin Durham (51:53):
Which is scary to
think about, because they're
pretty good.
Sam Gerdt (51:56):
Yeah, so you plug in an algorithm that is, that isn't, like, it is intelligent, and then you give it the ability not only to feed content, but also to produce the content that it feeds, based on its perfect understanding of your individual psychology, because of all of the data collection that happens, and you
(52:17):
get into this situation where doom scrolling becomes that much more addictive.
Erin Durham (52:25):
Which is, not to get too much into doom and gloom, but it is important for people to be aware that this could be coming, and it makes keeping the tools in perspective as tools, and being mindful about your usage of them, that much more important.
Yeah, I think I've said this on the show.
Sam Gerdt (52:42):
If AI is gonna destroy the world, that's how it destroys the world.
It's not nuclear winter, it's not any of that other stuff, it's by just-
Erin Durham (52:50):
Sucking us into our
phones and never letting us go.
Sam Gerdt (52:52):
Entertaining us to death, absolutely, entertaining us to death.
Yeah, so children are especially vulnerable because they don't have knowledge or foresight of how these apps work, what they're doing, what these devices are doing.
They're not thinking, necessarily, about themselves as adults, and parents aren't even really doing that either.
(53:14):
I mean, parents are putting their kids on social media left, right, and center.
They need to be more aware of that too.
But the ability to exploit even just the image of a child and absolutely ruin that child's life, your family's life, you
(53:36):
don't wanna think about it, but that reality already exists.
This is not science fiction.
That reality already exists.
I would be very cautious, and I have children.
My children are not on social media.
They don't have access to devices outside of a very limited window with extreme parental supervision.
They're getting an education at home about what these devices
(54:00):
are meant to be used for and what they are capable of.
We keep them off it.
We have some social media presence; our children are not there.
You won't see them there, that kind of stuff.
I think that needs to be much more common, because these kids are gonna grow up and they're gonna want jobs.
(54:20):
They're gonna grow up and they're going to apply to schools.
They're gonna grow up and they're going to want to have relationships and get married and have partners and all of that.
And when all of the things that have ever happened to a person can be looked up, and those things can be
(54:41):
exploited by bad actors, it's just, it's not at all healthy when you project it out into any kind of future.
It's not healthy to have that, and children need to be more aware of it.
They need to be more protected, and they need to be more educated so that they can protect themselves.
Erin Durham (54:58):
So, before we get too far into parent guilt, do you have any practical suggestions for someone who is a single parent and doesn't have the capacity, or even two working parents who just don't have the capacity that you have in your family, to keep that kind of control?
Sam Gerdt (55:20):
Yeah, my family is my family and that's my experience, so that's always gonna be hard for me.
It seems to me like one of the biggest problems is that we assume the necessity of certain technologies.
Sure, we can relate this back to business.
(55:42):
We assume the necessity of the next greatest tool and we lose sight of our purpose.
And so if you're gonna apply this in a family scenario, regardless of the family, families have to have a brand.
You know what you're about.
You have a brand and that brand has a purpose.
(56:03):
You have your core values.
You have all of that.
You have to relate any technology back to that, and I think there's not enough thoughtfulness there.
But generally, I think it starts when kids are young: educating them on who you are as a family, what your values are, and equipping them for thinking about outside influences coming
(56:25):
in.
And what are those influences going to do for you, to you?
How are you gonna use them?
Technology is one of those influences.
So it's not just cell phones and tablets or whatever.
It's the internet, it's television, it's all of it.
It's a form of gatekeeping.
(56:46):
You're not building a fortified shelter around your kids as much as you are going in a particular order.
First comes an education, first comes an established set of core principles.
Then we can begin to introduce and plug in new technology, new processes, into these little lives and turn them into
(57:11):
citizens who are going to be productive adults, who are going to be positive, kind, valuable additions to the world.
Erin Durham (57:21):
So yeah, within the framework of that set of values.
Right, yes.
Yeah, ideally, I mean, that's your job, is introducing those things.
But that framework has to be there to begin with.
Sam Gerdt (57:33):
Every parent has a set of core values that they're instilling into their children, and every parent wants their children to retain that.
They want to see that carried on.
Erin Durham (57:44):
It doesn't always
happen.
It doesn't always happen.
Sam Gerdt (57:46):
That's a sore spot for a lot of parents, and you say, oh, my kids turned out so different than me.
Erin Durham (57:51):
It's like, well, first of all, they're growing up in a very different environment.
But, second of all, you can't control the outcome.
All you can control is how you participate in the outcome, right?
And I think the most important values are gonna be.
Sam Gerdt (58:06):
they're going to want to carry them forward.
Like, if you, as a parent, can't communicate the value of a principle, then why would they want to?
Then they're not necessarily going to keep it.
But if you, as a parent, can articulate a well-reasoned argument for why this principle matters, and can demonstrate it without hypocrisy
(58:28):
in your own life, and can demonstrate its application in a variety of different circumstances in the real world, which is a hard job to do, but that's the parent's job, then the child is going to not only carry it forward, they're going to carry it forward with zeal.
They're gonna carry it forward with enthusiasm, because you'll have done a good job of reinforcing it.
(58:50):
Boy, we're way off track.
That's okay.
I don't think so.
Erin Durham (58:54):
I think that all of this is applicable to businesses too.
I mean, you do the same thing as a business leader, or at least I hope that you do.
You're creating a culture, a culture code.
You're living your brand.
You're infusing your brand.
Based on the same set of principles, you have developed a set of core values.
You can articulate why they're important.
(59:16):
You can inspire your people to also work within that framework and be ambassadors of your brand.
It's exactly the same set of principles, concepts, that can be applied.
Sam Gerdt (59:32):
Yeah, and we started this conversation, this topic, by saying, how do we protect the vulnerable?
Well, that, but we also said at the same time, how does an employee protect?
What's the individual responsibility?
Erin Durham (59:45):
Yes, what's the
personal?
Sam Gerdt (59:45):
responsibility.
How do we protect the vulnerable, but what's the individual responsibility?
And so organizations can, in some sense, view themselves as family.
I don't think that that's particularly helpful all the time.
No.
But if you look at all of the people who work in your organization as being individuals with individual
(01:00:08):
responsibilities, you have incentive to instill your company's values into them, absolutely, and see them apply those not just to the job description, but apply them to the culture of the organization, apply them to interpersonal relationships within the
(01:00:28):
organization, which contributes to culture, apply them to interactions with customers.
And a huge part of your brand development becomes your people and how they're working, what kind of work they're producing, how they're interacting with each other, with customers.
And so there's this big correlation, I think, that we could draw.
(01:00:48):
And then the safety aspect of that is, you can't have people working for you without entrusting them with a part, an important part, of your organization.
Erin Durham (01:00:58):
And what are they gonna do with it?
I love that.
Sam Gerdt (01:01:01):
So, like, they are an ambassador, just like our children have our last name, and so if they go do something stupid and end up in the newspaper, that's our name.
That's kind of an old mindset, something my dad would say to me: you have my name, so don't go do something stupid.
But it's true.
It's absolutely true.
You don't want a rogue employee to go off and say something
(01:01:22):
stupid on social media, because you can guarantee your company's gonna get dragged into it, yes, or your representatives.
So, then, how do you foster this mindset of the individual responsibility to protect the assets of the company?
We talk about cybersecurity, having individual security
(01:01:43):
standards that are up to par, but you can also talk about brand development, and what are those core values?
Does your employee know your voice?
Are they speaking with your voice?
And then you can also talk about, like, are they looking out for the customer's data, for one another?
Are they looking out for threats?
Because a lot of threats come not from, you know,
(01:02:07):
a big brute-force attack, top down.
A lot of the threats are coming from beneath, you know.
We can use a personal example.
In the last month, two months, we've had two or three separate attempts to infiltrate our business through
(01:02:27):
technology.
Those attempts didn't come top down.
They weren't trying to break into our servers.
They weren't trying to break into our email.
They were trying to find vulnerabilities with employees.
Erin Durham (01:02:44):
Can you explain that for the audience?
Sam Gerdt (01:02:46):
Yeah, yeah, I was going to.
So an employee gets a text message on their phone, yes, not a company phone, on their personal phone, and it says, hey, this is, you know, the boss's name.
Right.
So the boss: can you, I need you to do something for me, are you available?
Erin Durham (01:03:05):
Yeah, and there's no link.
So it was four of us who received this on the same day and around the same time.
Whoever did this had access to our personal phone numbers, which most of us don't give out, and knew where we worked and knew our company president's name.
Sam Gerdt (01:03:21):
Yeah, and so the goal
is to get a response.
Erin Durham (01:03:24):
Yes, and there was
no link in the text.
There was nothing to click.
Sam Gerdt (01:03:27):
The goal is to get a response, and then from there, I don't know the exact game plan, but from there you're building out this little tiny tunnel.
It starts as just a little tiny tunnel into the company, but eventually, once they get in, they're in, and there's all kinds of damage that these bad actors can do.
But, to emphasize the point, they're going after
(01:03:53):
the new employee.
Like, they're going after the intern.
They're going after people who've only been here for a year.
And so this emphasis on awareness and training from day one: super important.
Individual responsibility.
You have to have organization-wide systems that are secure,
(01:04:15):
and we do that.
That's a huge part of what we do.
That's a huge part of my job, even just advising clients on stuff like that.
But then, if you don't have everybody on the same page with security practices, and knowing what a scam looks like, knowing what's okay and what's not okay, what can I do
(01:04:36):
with this piece of information, what can't I do with this piece of information, from day one, you have to have that nailed down.
Super important.
And this is how individual responsibility plays into making a safe organization.
It's making sure that you're training your people and that you're being constantly vigilant, both in keeping them up to
(01:04:59):
date, but also in just watchfulness, every little thing.
Erin Durham (01:05:04):
If something doesn't seem right, it probably isn't.
Yeah.
Sam Gerdt (01:05:08):
So individual responsibility is going to only become that much more important.
I don't think AI is going to alleviate any of that responsibility.
I think it's going to add to it, because that's the trend with every new technology that we've had, forever.
Technology does not alleviate personal responsibility, it only
(01:05:33):
adds to it, because you're not giving somebody an out, you're giving them another tool to use.
Erin Durham (01:05:39):
Right.
Sam Gerdt (01:05:40):
You're not.
With more powerful tools comes greater responsibility.
That's what we need to understand.
AI is an incredibly complex, capable, and powerful tool that can be applied in a million different ways, but it's a tool that we're in charge of wielding, and so that comes with individual responsibility.
Erin Durham (01:06:02):
So, would you say that makes it even more imperative to be selective about the tools that you choose to use, and to only partner with tools that are actually going to support your mission?
Rather than, we see a lot of people who go, oh, this new technology is out, I'm going to adopt it, whether they truly need it or not, or is that a
(01:06:23):
different issue?
Sam Gerdt (01:06:24):
It's a different thing, but I'll answer it.
There are actually two thoughts that I have.
The first has more to do with individual responsibility.
The second has to do with choosing tools.
First, individual responsibility.
You've given somebody this super powerful LLM now, and that LLM can assist them in a lot of their tasks.
(01:06:47):
And maybe it's not an LLM, maybe it's a data analytics tool, whatever.
Along with that, you have to give them training on what can and can't go into that system, because there are all kinds of things that can happen inside these AI tools to extract data that's been put into them.
And these tools aren't necessarily isolated or
(01:07:08):
insulated in a way where a bad actor can't get at them.
And so, for example, we advise companies who are using ChatGPT not to use ChatGPT for sensitive information.
Don't let ChatGPT process data that's sensitive.
And that means, let's say that you want to have it organize an
(01:07:30):
Excel spreadsheet in a different format, or give you insights from it, and it can do all of that.
But if that spreadsheet contains sensitive customer data and that's what you're asking it to process, that's a huge mistake.
That's a big mistake.
Similarly, I talked about this with Mitch: if you're having it assist you in product development and you're giving it source code,
(01:07:50):
that's a big mistake.
Your source code is your product, that's yours, and you're storing it in a place where other people can see it, and you need to understand that.
So that's the individual responsibility piece.
You need to train people on how these systems work and what is good and what's not good to do with them.
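The "what can and can't go into the system" rule above can be sketched as a redaction step that masks sensitive fields before any row of a spreadsheet is pasted into a prompt or sent to an outside tool. This is a minimal illustration of the principle, not a compliance solution; the field names and records are hypothetical, and real policies would define their own sensitive-field list.

```python
# Minimal sketch of the rule discussed above: mask sensitive fields
# before any row leaves your environment for an outside LLM or
# analytics tool. Field names and records here are hypothetical.

SENSITIVE_FIELDS = {"name", "email", "ssn", "phone"}

def redact_row(row: dict) -> dict:
    """Replace sensitive values with placeholders; keep everything else."""
    return {
        key: "[REDACTED]" if key.lower() in SENSITIVE_FIELDS else value
        for key, value in row.items()
    }

rows = [
    {"name": "Jane Doe", "email": "jane@example.com", "region": "NE", "total": 1290},
    {"name": "John Roe", "email": "john@example.com", "region": "SW", "total": 410},
]

# Only the redacted copy would ever go into a prompt or API call.
safe_rows = [redact_row(r) for r in rows]
print(safe_rows[0])  # → {'name': '[REDACTED]', 'email': '[REDACTED]', 'region': 'NE', 'total': 1290}
```

The non-sensitive columns still support the kind of formatting and insight questions mentioned in the conversation, while the fields a bad actor could exploit never leave the building.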
(01:08:12):
Separately, choosing technologies.
I mean, we've said it already ten times: you need to support the mission and vision of your company with any technology that you choose. And so a technology that's going to complicate your workflows or add complexity to them is probably not the best
(01:08:32):
idea right off the bat.
A technology that's going to distract you from your ability to provide that human-to-human connection, that becomes, in a sense, an excuse to do less? That's probably not a good idea.
There's a huge amount of auditing, I think, that needs to happen before any tool is added, and this has been true for a decade.
(01:08:52):
This isn't just AI.
Before you add a tool, you need to go back and revisit all of your tools.
We do this with clients who, let's say, need a CRM.
Well, we're not going to just throw a CRM on top of whatever you're running.
We're going to go back and look at every single thing that you're running, and we're going to thoughtfully integrate a CRM
(01:09:14):
into that stack in a way that doesn't add complexity; it adds capability to your existing processes. Exactly. And then we're going to go back, and we're going to train everybody on it, and make sure that everybody's in the same place as far as how data is organized, whether it's stored safely, whether the right people have the right permissions, all of that.
So there's this huge auditing process that happens.
(01:09:35):
That needs to happen on the individual responsibility side, where, when you add a tool, you're asking: how does this affect our risk?
How does this affect our vulnerability?
Are we taking an unnecessary risk by introducing a technology that maybe is new to the market, or where we're not sure how the data is stored or who has access to it?
(01:09:57):
And with AI tools I would be especially wary, because people are hungry for data to train models right now.
And so, buried in those terms of service, you'll see them reserving the right to use data from the platform to train future versions of the model.
Erin Durham (01:10:15):
Interesting.
Sam Gerdt (01:10:16):
So that would be... Zoom got into trouble for this.
We talked about it with Christie.
Zoom got into trouble for this very thing, because it's like, okay, wait, we're using Zoom to have private, closed-door conversations, and you're telling me you want to use those closed-door conversations to train a model that's gonna be released to the public? And as much as you wanna say, well, that data's
(01:10:40):
not necessarily gonna be surfaceable, you would be amazed at how people have been able to surface specific training data in platforms like ChatGPT, where you can bring to the surface actual information that was used, years ago at this point, in the training of that model. It exists somewhere.
(01:11:02):
I would liken it to when we had hard disk drives. People still use them.
Erin Durham (01:11:09):
Can you explain that?
Sam Gerdt (01:11:10):
Yeah, people still use them, but most of the time now we're on solid-state media, flash media, all that new stuff, right?
But back when we had hard drives, you had to emphasize: just because you deleted it off your computer, just because you removed it from the recycle bin, doesn't mean that it's gone off of that hard drive.
Erin Durham (01:11:32):
And that's even more true today, when we keep everything in the cloud.
Just because you deleted it off your Facebook account, kids, doesn't mean it went away.
Sam Gerdt (01:11:38):
So the forensic ability of someone who knows what they're doing to go in and recover, extrapolate, and surface data when you think it's locked down is incredible.
And the same is true here; you can make that same comparison.
Erin Durham (01:11:54):
It's only gonna get worse, because people can already do that and have done that, certainly in the legal industry.
Sam Gerdt (01:12:02):
It's super important to have good policy around this stuff now, and to start training people now to think properly about it.
Well, that was a roller coaster.
Erin Durham (01:12:13):
Yeah, so just to sort of bookend the conversation.
Sure.
What is your big takeaway, if you haven't mentioned it already, from the first seven episodes of your show?
Sam Gerdt (01:12:24):
Big takeaway is, I kind of mentioned it at the beginning, it's this idea that your thinking is going to change so rapidly when you're presented with so much all at once.
Disruption, yeah. Business owners are gonna deal with this a lot.
You're gonna think you have it nailed down, and then the next day something's gonna change, and then you're gonna have to think
(01:12:46):
differently about it.
Those fundamental truths about who you are as a company, about how you want to connect with your customers, those can't change.
Those have to be non-negotiable.
Mm-hmm.
In some senses you're making a bet.
On some things it's a safe bet, you know, like human interaction.
Prioritizing human interaction, that's a safe bet.
(01:13:07):
Prioritizing the idea that people want to deal with people, that people want a human connection with their brands, yeah, safe bet.
You know, there's not too much skin in the game when you bet on that.
But there are some truths that are still being discovered, and you have to put some skin in the game and then stick with it.
(01:13:28):
Every company does this with their brand purpose.
Yeah, it's unique to you.
You're putting all your eggs in that basket, that brand purpose, and there's no rule that says it can't change and adapt over time.
It's an evolutionary process, but you have to have it.
(01:13:49):
You have to articulate it, you have to train your people on it, and then you have to stick with it.
That's your guideline.
So if you're going into the darkness, that's your super-narrow flashlight beam that's just guiding you every step of the way.
Erin Durham (01:14:04):
I love that.
It's a really great analogy.
Sam Gerdt (01:14:06):
If you can do that... you're not always gonna end up at this super-successful place.
Sometimes the initial truth that you've latched onto will turn out to not be profitable.
You know, there are a number of companies who have really well-articulated purposes, but they just don't resonate with enough
(01:14:28):
people for the company to survive.
They didn't target a purpose that was going to pan out.
But you have a much better chance of finding that deep purpose, that human connection.
I made the analogy, I think it was with Mark, with the film
(01:14:54):
industry, Kodak, and some of these small film-development mom-and-pop shops.
There were people whose purpose was too wrapped up in their product, and they weren't willing to adapt with the technology, whereas if they had drilled down just another layer and said,
(01:15:17):
okay, our brand purpose is to preserve your memories, then the product doesn't matter.
The technology doesn't matter.
As long as you're staying true to that deeper purpose, you can add technologies, but you're just building this brand that says: we're here to help you keep those memories safe.
And so we see that all the time.
(01:15:40):
There are companies that were established 100 years ago, and what they do now is completely different than what they did back then.
But you trace it through and you're like, wow, their product is so completely different, but their purpose somehow is exactly the same.
It's crazy.
I was actually just reading about the founder of Lego.
(01:16:01):
I won't get into that.
Go look up the founder of Lego and read his story.
It is incredible, the amount of challenges, the trials that he had to face in order to keep his company going, and how he bucked every single conventional piece of advice
(01:16:23):
that he ever got. It's almost to the point where he didn't even see Lego become what we know it as, but he is absolutely the one that made it what it is, and he did it not by focusing on product as much as focusing on that deeper purpose.
So I think that's the big takeaway for me: all this
(01:16:48):
technology stuff, we don't start with that.
Start with those fundamental truths that define you as a person or as an organization, and then apply technology carefully and thoughtfully as you go along. Let it evolve, and just keep an eye on that purpose.
I love that.
What about you?
Erin Durham (01:17:07):
The people thing that we talked about in the beginning, yeah.
And I would even add to that, to the people discussion, and put out just a kind of general call for empathy, because even if maybe you don't score highly on an empathy test, it is a skill that you can build.
Sam Gerdt (01:17:28):
It is.
Erin Durham (01:17:29):
And people matter, especially in a post-COVID world, in a world of knowledge work, where you do have the opportunity to leverage the strengths of really talented individuals, if you can just have enough empathy.
And that's not to say you shouldn't expect personal responsibility, that you shouldn't hold people accountable.
(01:17:52):
You absolutely should do that, but in my opinion you should do that with a healthy dose of empathy too.
Yeah, yeah.
Sam Gerdt (01:18:01):
Yeah, it's a superpower.
It's one that I wish I had more of.
You're right that you can develop it.
Erin Durham (01:18:08):
I agree that it's a superpower.
It's not often recognized as one.
Sam Gerdt (01:18:11):
Yeah, it's the ability to put yourself in a situation that maybe you don't even agree with, and see it from a perspective that maybe you don't even like, but then to fight from that perspective.
It's something that I don't understand.
Erin Durham (01:18:32):
It's not a gift that I have.
Or even to continue to see the humanity in a person, and refuse to treat them as something other than you or less than you, which we see a lot of, especially on a political stage: we want to dehumanize
(01:18:53):
people that we disagree with. And Brené Brown talks about this in her book Braving the Wilderness.
She talks about how, if we can hang on to each other's humanity in the face of disagreement, we will have much more productive conversations.
Sam Gerdt (01:19:12):
Sure. I mean, bringing it to that team level, building a strong, creative, effective, productive team: empathy is the gift that belongs to the person who can, when somebody is not aligned, when somebody is contributing to a decrease in all of those good things, the empathetic person
(01:19:34):
is the one who can see it, understand it, and correct it as quickly as possible. And that's why it's a superpower: because what you're building, then, is this self-correcting, self-aligning, highly functioning team, and without that empathy, what happens is the second law of thermodynamics,
(01:19:58):
entropy.
You might have it at one point, but then, immediately, you start to lose it.
As soon as you have it, you're starting to lose it, and so empathy is that glue that can keep it together for longer periods of time, to greater degrees of effectiveness.
Very good. Thank you, Erin.
Erin Durham (01:20:18):
My pleasure.
Thank you for having me.
Sam Gerdt (01:20:20):
And looking back at the last seven episodes, seven interviews that we've done: this show is gonna go in all kinds of directions, but I really hope that in our coming interviews, which I'm really excited about, we have some really cool people already lined up. Awesome. In those interviews,
(01:20:43):
I'm really hoping to hear more from them about this, but I'm also hoping to go even deeper into some of the things that we've already talked about.
Yeah, I think you will.
Yep. And I'm gonna go ahead and put a plug in here: as we look forward to the next set of interviews, if there's something that we need to talk about more, email
(01:21:05):
us and let us know, because this show is not just us saying what we wanna say.
This is us seeking the perspectives that you find valuable.
Erin Durham (01:21:14):
Yes.
Sam Gerdt (01:21:17):
And so the email address is roadwork@waypoststudio.com, and if you send an email there, I will receive it.
I will make sure that our producers see it, and we will make every effort to incorporate your feedback into the show moving forward.
Erin Durham (01:21:36):
And you're also super active on LinkedIn now.
Sam Gerdt (01:21:39):
Oh, yeah, yeah.
Erin Durham (01:21:40):
So definitely connect with Sam, yeah.
Sam Gerdt (01:21:42):
Thank you for doing this, Erin.
We're gonna do it again after another batch of episodes.
Erin Durham (01:21:46):
I think it was a valuable exercise.
Sam Gerdt (01:21:49):
So, yeah, I'm looking forward to it.
Erin Durham (01:21:51):
Me too, thank you.