Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Andrea Goulet (00:00):
When we have a high empathy environment where it is optimized, what we observe is less friction in our conversations. I send it in a high enough fidelity way that you understand what I'm talking about.
Galen Low (00:11):
Empathy can be the missing piece of the puzzle when building innovative, high performing teams in the age of AI.
Andrea Goulet (00:17):
They had a chance to assess the helpful model before the other aspects had been added to it. It was 30% more likely to cause harm and to lie or be disingenuous.
Galen Low (00:31):
Tell me how empathy
fits into the world of AI.
Andrea Goulet (00:34):
Where in your organization can you leverage this as a tool to help you run experiments? The purpose of AI is to help humans human better.
Galen Low (00:47):
Welcome to The Digital Project Manager Podcast — the show that helps delivery leaders work smarter, deliver faster, and lead better in the age of AI. I'm Galen, and every week we dive into real world strategies, new tools, proven frameworks, and the occasional war story from the project front lines. Whether you're steering massive transformation projects, wrangling AI workflows, or just trying to keep the chaos under control, you're in the right place.
(01:10):
Let's get into it.
Today we're talking about the idea of empathy as a core part of an organization's communication infrastructure, and how cultivating and measuring empathy can be the missing piece of the puzzle when building innovative, high performing teams in the age of AI. My guest today is Andrea Goulet, a communication systems architect who is on a mission to translate the
(01:32):
complexities of human dynamics into practical tools and frameworks, so smart people can collaborate more effectively.
Andrea has over 25 years of professional experience that includes building a $4 million software consultancy from the ground up, teaching over a hundred thousand students, and growing a massive global audience as an in-demand keynote speaker. Andrea, thanks for being here with me today.
Andrea Goulet (01:54):
You
are so welcome.
I'm very excited to be here.
Thank you so much for inviting me.
Galen Low (01:58):
I am really
excited for this conversation
because you and I, we met at the Agile 2025 conference. You filled my brain with so many thoughts around empathy, and I'm really excited to get into it. I know from experience now that you and I are the kind of people who can probably zig and zag into some pretty interesting tangents, but just in case, here's the roadmap that I've sketched out for us today.
To start us off, I wanted to just follow my tradition
(02:21):
and just get one big burning question outta the way. It's that uncomfortable, but maybe pressing question that everyone wants to know the answer to, in my opinion. And then I'd like to zoom out and maybe just talk about three things. First, I wanted to talk about what empathy actually is a measure of and what it indicates about a team or an organization. Then I thought maybe we can get into a few examples of how empathy can be used to create and reinforce a
(02:42):
culture of adaptability, to facilitate knowledge transfer, and ultimately change the way a business measures performance. And then lastly, I'd like to maybe just talk about the role of empathy in the age of AI, and whether human intelligence and empathy will fall by the wayside, or maybe will it become more important than ever.
How does that sound to you?
Andrea Goulet (03:01):
Yeah,
that sounds great.
Galen Low (03:02):
So as I was
mentioning up top, you gave a
keynote speech at Agile 2025 in Denver a few months back. The title of your talk was Empathy as Infrastructure, and I had the distinct pleasure of missing it completely, but you gave me the Coles Notes version over breakfast. Could you just tell us what you mean when you say empathy as infrastructure, and why is it maybe more than just a sort
(03:25):
of woke snake oil buzzword?
Andrea Goulet (03:27):
Yes, I love that.
The highest compliment I ever got, one of my most popular keynotes is called Empathy is a Technical Skill. I give it at a lot of software engineering conferences. The highest compliment I ever received was somebody came to the talk, came up to me afterwards and said, the only reason I came here was to tell you how wrong you were afterwards, and I can't do that now.
(03:51):
You've changed the way I think. And I was like, yes, because that's crazy high praise from that type of audience.
To answer your question, I'm gonna give a little bit of context into kind of how I got into this topic. So I started my career in sales and strategic communications where essentially, like, part of the training is
(04:13):
how to understand people. You know, there's consumer psychology, like you get a pretty technical understanding of essentially what empathy is, like how to predict another person's perspective and how to respond.
So my business partner, who I built the software company with, he and I went to high school together. We grow the company together.
(04:33):
A few years later we get married, which is fun. And so what that ended up doing was create a really deep motivation to understand each other in a way that, kind of, you don't typically get. There were many times where we had the very traditional sales slash engineering dynamic, and I felt like I wanted to rage quit, truly, but I didn't because there was an extra incentive.
(04:57):
And so just like he had told me, like, no, you're actually good with people. Like, why are you calling yourself non-technical? You can understand this stuff. Like, you pick up on it. You might not know it now and you might not have a computer science degree, but you are absolutely learning it. But I, it was like an identity thing, and I noticed that he had one on the opposite side. Which was, I'm good with machines, but I'm
(05:18):
not good with people. And that was something that he had been told. And honestly, a lot of the reasons that I wanted to rage quit were miscommunications. And I was expecting him to respond in a certain way. He didn't respond in a way that I had seen in the general population. And it was frustrating to me.
(05:38):
And at the time I had taken on this identity of like, oh, I'm non-technical. I'm good with people, I'm an empath. But what I was really doing was hubris, because I had the false belief that I could walk into a room and I knew what people thought better than they did. And it's only been after about, like, a decade or so of diving
(06:00):
deep into the research that I've had that realization, and now I'm like, I just want to go back to my past self and apologize profusely. I think this is a dynamic that is very common, right? We have people who identify as people people, and it's like, oh, I'm an empath, right? I understand what people are doing. But then what I wasn't doing when I would misunderstand Scott was I didn't take the time to
(06:23):
listen to his perspective. I was very judgy and I was like, why can't you blah, blah, blah, right? That's a horrible way to treat somebody, but I was frustrated. So an example to make it a little bit more concrete. An example: I would say, here's my big idea, here's my big strategy, blah, blah, blah.
And you know, halfway through he would interrupt and he
(06:43):
would pick one specific thing. He's like, that's not gonna work. I'm like, and he was like, I know, I've been called like a professional balloon popper my whole life. So there's real frustration there, and it's like, why can't you see, like, I'm excited, and not interrupt me and stuff. But when we dug into it. Because we had that level of compassion, like we truly cared about each other in a way where we wanted to stick it out, and
(07:06):
we didn't want to make each other's lives more miserable. So that was the extra incentive to really figure it out. And so he told me, he's like, well, I have a lot of compassion for what you're building, and I don't want you to go down a direction that I know is going to hurt. And so it was like, well, okay, when you do that, it takes me outta this flow and then I can't remember.
(07:27):
And so this is the great thing, is that when you can have empathic conversations like this, instead of just these friction points where people pull away, especially when you have different perspectives, this is where innovation comes from. So the innovative thing is that we developed a lot of communication frameworks, and some of them, honestly, we ported over
(07:50):
from couples counseling and just directly implemented them into the business.
Galen Low (07:51):
Amazing.
Andrea Goulet (07:52):
But the framework
that came out of this specific
interaction was essentially like, Scott, when you have a detail that you want to tell me about, do a find and replace and say, may I pause for a clarification, right?
Galen Low (08:08):
Before you pop in.
Andrea Goulet (08:09):
'Cause that gives me a little sense. And then if I'm on a roll, I'm like, no, hold on. Right. Like, let me finish my thought. So then I'm like, okay, what's up? And then I said, the next step is, what kind of feedback would you like? Macro feedback, micro feedback, or no feedback? Sometimes it's like, I just want you to listen to the big picture and like hear the whole thing and think, is this the right strategic direction to start putting some
(08:32):
resources towards the details, towards doing more research, or towards more implementation.
There are absolutely times where I'm like, okay, this is close to ready to ship. I want you to use your genius and I want you to tear it apart. And those teeny tiny things that we co-created together changed
(08:53):
the way we work together, because then it made it frictionless even though we had completely different perspectives.
And Scott Page has some really great work on this. He has a book called The Diversity Bonus, and for those of you who are just like, wow, that's woke. This is based in math. When people who have really different perspectives can find
(09:14):
ways to influence each other and work together to create something, you literally get, in an organizational sense,
one plus one equals three.
That was our experience, and so to me, having the background of empathy, having the background of working in software and learning about how complex software systems work, and it's like, whoa.
(09:34):
This is the same as social systems. Like there's math behind this. There's infrastructure, there's architecture here, there are
principles that we can lean on.
Then I got really excited, and so my purpose was to basically help Scott understand what empathy was in a way that made sense to him. Because when I would try to describe why this was important or how to do things, he was like, yeah, you're just telling
(09:57):
me to, like, have empathy. So in the effort to try to explain things to him where it made sense, that ended up being, kind of, you know, blogging about it in public and just saying, here's what I'm finding and thinking, and it really resonated. So that's where it's like all these different things come from, and I mean, it's transformed the way
(10:18):
we work together, the way we live together, the way we parent together. We did sell our business last year, so there's a little bit of a different dynamic now.
So the reason it's not just woke BS is two reasons. The biggest overarching reason is that there is a difference between culturally understood empathy and computationally
(10:39):
understood empathy. And so, you know, me describing myself as an empath, right? And saying, oh, I know what it means. Or like thinking that it's only being kind all the time and never setting boundaries, or thinking that it is letting people walk all over you, or thinking that it's like, oh, I'm this psychic superhero, right? Like there are a lot of different kind of cultural signals and identity things that play into this, which I think is
(11:01):
what people are responding to.
So my goal is to really understand the computational side and the analytical side and the neurological side, and just, what does the science say around what this is, and how can we just really understand it as a mechanism, as infrastructure, so that we can architect and design organizational systems. So the last point, in terms of like what is empathy, going
(11:22):
all the way, like if we're abstracting all the way up to what it is: empathy is the evolutionary mechanism, and there are a lot of different systems in our brains and our bodies that contribute in a dynamic way to what
gets expressed as empathy.
And I think a little bit later I'll go into like the four components that neurobiologists have figured out.
(11:45):
But the reason that we have this is that we are a hyper social species. And so empathy enables us to communicate, collaborate, and solve complex problems more effectively for species survival, literally. And when we look at it from that perspective, it's like, why would we work against our nature?
(12:06):
And when we understand what's at play, then we notice that there's a lot of different kinds of levers we can pull. We can change our environment, we can change our own perspectives, we can change the way we communicate. Like there's a lot of different things that we can do to optimize empathy. So I think of empathy much more in a systemic way than in, you know... there's definitely interpersonal dynamics.
(12:27):
Not in a, I am an empath, because that's like saying I am a circulatory system.
Galen Low (12:35):
Okay.
Yes.
I'm really glad youtook it there because
that resonates with me.
Right.
We are a social species and evolutionarily we have the survival skill called empathy, which is about predicting somebody else's perspective. It's not about reading minds, it's not about knowing people's thoughts better than they know their own thoughts. It's not about not communicating. Where we kind of think, oh yeah, you're
(12:55):
empathetic, you understand. You don't even have to say the thing because you know the thing. So we don't communicate. It's actually the opposite. It's not making assumptions about how people are feeling or what they're thinking and their POV. It's developing the infrastructural language to communicate around that, because it's like core to our being as
humans and how we've evolved.
(13:16):
And I think, like you said, I would love to trace it back to figure out where this word went wrong. I'm paraphrasing, but that's how I interpreted it. I'm like, that is fascinating, because language fascinates me. That specifically is, you know, one of those words that I think is imbued with so much meaning that was never really intended. And we've used it in certain ways, and you and I, we've talked about the superhero empath and like how that
(13:38):
character is, you know, what character traits and what superhero skill that would be. It's mind reading. It's like, well, feeling reading, but it's not actually what empathy is or was. It's just the social fabric that helps us collaborate. And when you put it like that, I can't argue with that. I'm that person that you said, right, like came to your keynote speech to be like a naysayer, to shoot you
(14:00):
down, to tell you you're wrong. But actually when you break it down that way, that makes tons of sense, and that's what teams need to do. Like that's something we've been saying all along, but for whatever reason, the word empathy gets painted with a certain brush. Whereas what we mean is, let's have some kind of computational empathy. Let's have the infrastructure, let's have the sort of framework
(14:21):
to communicate effectively, knowing that we don't all think the same things and see things the same way.
Andrea Goulet (14:26):
And
that's a good thing.
Galen Low (14:27):
Well, I mean, at the top, he's Scott. He just, I mean, Scott likes to fix bugs and refactor code. He is not among the general population.
Andrea Goulet (14:35):
No, he's not.
Galen Low (14:35):
That is a different,
it's a different human.
Andrea Goulet (14:37):
And what ended up, like, the way that we built our culture, my number one thing was like, okay, I've seen empathy work in all these other industries. We're building our entire culture around empathy, and people, literally, this is like 2009, 2010, we had consultants and they literally said, you can't do that. If you say the words empathy and software in the same sentence, you'll be laughed out of the industry, quote.
(14:58):
And I'm the type of person, I'm like, oh, hold my tea. That's actually where I'm gonna double down.
Galen Low (15:03):
Yes,
challenge accepted.
Andrea Goulet (15:05):
But what we ended
up doing was by learning about
Scott's frustration and just, you know, he kept describing to me how he kind of felt like, I'm probably dating myself, but Milton from Office Space, where he's like sent into the basement, and it's like he's got his little stapler and like no one sees his value. I think there are a lot of people in organizations who contribute a lot of value, but it's not always visible,
(15:27):
and so they get overlooked. So Scott was sharing kind of how he felt, and we were in a Barnes and Noble and they had a table for books, and it was like, Maker Fest, Make. And he was like, where's my table? And like it's that kind of thing that then generated all the different chemicals and biological things where
I all of a sudden understood.
(15:48):
So now we'll get into the four different pieces of empathy, and I'll share how this experience demonstrates it. So before, it was, oh, okay. I have an analogy. I kind of, on an abstract level, understand what you're talking about. But when I, who identify as a maker, was like, oh my gosh, this is so amazing to see something that's just for me.
(16:11):
Like I want to dive in and buy all of those books. I wanna go to all of the hackathons. And then hearing in Scott's voice just the frustration and the feeling like he couldn't find a place to belong. And like I had a place to belong and he was like excited for that. But just like he had been really struggling to find other people
(16:31):
who had the same passion as him. And not only did he struggle numbers-wise, he also struggled because people like me at the time were saying, well, you're, you know, why would anybody do that?
But in that moment, so, there's four different aspects of empathy that kind of go into an empathy system. There's lots of different places in the brain
(16:52):
and chemicals that get passed around in our bodies that we don't need to necessarily go into. But Jean Decety is a cognitive neuroscientist out of the University of Chicago. I really like his model, which is the functional architecture of human empathy. When I was reading his papers, this was where I was like, oh, there's something that really clicked, because like I do
(17:12):
this in some of my keynotes. If you read his papers and take out the word empathy, you can replace it with software.
Galen Low (17:18):
Interesting.
Andrea Goulet (17:19):
Yeah, so it's
things like blank needs to be
broken down from a monolith into more discrete concepts. And it's like, that's a fundamental principle for like creating a healthy system architecture. So it was like, this is interesting. But back to the pieces about empathy. You've got compassion, which is sensing suffering and
(17:41):
wanting to do something about it and wanting to alleviate it. Self-compassion is doing that for yourself. Super important. Kristin Neff has a bunch of great work on self-compassion, but also somebody else. So I care about Scott, right? We're doing this. When I sensed his suffering, compassion kicked in, and when compassion kicked in, that's when a lot of
(18:02):
other things happened too. The other one that was like really surprising for me, but once I heard it, it made a ton of sense, is regulation.
Galen Low (18:11):
Right.
Yes.
Andrea Goulet (18:12):
So being
able to identify our own
bodily states and emotions and intentionally shift them. Which is why, like, if I wanna rage quit, I can't listen to somebody else's perspective, because I'm self-centered at that moment. And not because I'm an egotistical maniac, but because I'm human. It's just the way humans work when we're, we've
(18:33):
got really heightened emotions, we can't listen. So in that moment, I was regulated, like we were hanging out at Barnes and Noble. I wasn't like super angry. I was in a joyful state, but able to shift that. The next is affect, which is emotion sharing. This is what we tend to think of as empathy. Alone, it's necessary, not sufficient. So being able to attune to somebody, like, okay, I
(18:55):
sense you're frustrated. I have been frustrated. I know what frustration feels like, and it doesn't have to be forever. It's like a pang of like, okay, you're feeling this. You don't even have to name it. Scientists essentially now plot emotions on a grid, where you've got how it impacts you, positively or negatively, like the sensation, and its intensity.
(19:17):
So it's either calm or like really intense, right? So something like joy is like a really intense emotion and we would label it as positive, you know? So I just need to plot that somewhere and feel it. And a lot of times it's like, this is a reflex, not for everybody. Some people have to, like, put a little intention
(19:38):
to it, and that's okay.
The last one is cognitive empathy, and this one gets talked a lot about in product design and user experience and things like that. What this is, is being able to apply rational thinking and reasoning so that we can understand another person's perspective and predict how we should respond. And this one is critical because if we only rely on the other ones, then we don't take in the information,
(20:00):
we're not able to influence. Also, we're not able to consider the costs and benefits. So a big one is, when compassion and affect are really heightened, we can get so drawn into one person's perspective that we can't see the impact on a group. An example here of why this is important is, like, doctors.
(20:24):
They're in an emergency room and there's been a horrible accident; they have to triage. If they spend all of their time in one room for an hour, then more people are going to suffer. And there are plenty of examples like this, but so that's just one, one example. This is called Compassion Fade. So if you wanna geek out about the actual literature, and
(20:45):
this is the thing, it's like there are very well researched terms to describe these things. And so when we look at those kind of four different aspects, that's how we can break down, oh, Scott was sad, but then the innovation happens when we have different
perspectives, how do we fix it?
How do we get you that sense?
(21:06):
How do we build that table?
So that's what we ended up doing, was we went back to, like, there were lots and lots of conversations, and a lot of 'em revolved around identity, and, you know, we were modeling things out about how we did our business. I mean, it was a lot of deep dives, which was fun, because we're learning about each other. It helped us do our business better, because I'm learning
(21:27):
more about legacy code less from, just like, a book, and like why truly these are important. What we ended up coming up with is kind of a counter
where you've got makers.
Yay.
If you're a maker, celebrate.
Scott's a Mender.
Galen Low (21:43):
Okay.
Andrea Goulet (21:43):
So it's like they're complementary, because sometimes, like, if you're operating in white space and you need to do something really quick, you want rapid prototypers. If you are working on a mission or safety critical system and you need to change out, like replatform it, because the
(22:04):
hardware is now out of date, you want somebody who is like super diligent and isn't gonna just like rush through the job. What Scott was saying is like the second type of project is the one where he's like, give me, it's like candy. And so when you can operate with empathy, when you can interact with different perspectives, when you can appreciate the strengths that people bring. And that's what enabled us to build the company too.
Galen Low (22:27):
I love that.
I wonder, could we transpose that onto another business? Because I love these four things, right? I'd love to be able to take that. The last one I think is super interesting to me. The sort of cognitive empathy; obviously affect, you know, we're talking about the sort of popular understanding of what empathy is. Plotting something on the magic quadrant of emotional intensity and then using that to guide your conversation.
(22:50):
We have the compassion bit.
Including self-compassion, the regulation bit. I wanna tie all these things together, because I'm glad you brought in community. I'm glad you brought in the sort of like evolutionary social behavior and then this framework of these four elements of empathy. How does a business then take that and benefit from it? And then how do they know if it's working?
(23:11):
'cause it seems like there's a lot of conversation, you know, like deep dives. This is groundwork. A lot of organizations in growth mode are just like, go fast, fast.
Produce.
How can they use this?
Andrea Goulet (23:22):
Yeah.
Which is great.
If you have more empathy, then you can grow, go, and produce better.
Galen Low (23:26):
Tell me
more about this.
Yes.
Andrea Goulet (23:27):
So, okay.
Empathy is the engine, communication is the car. Okay. If you have just a shell of a car with no engine, you're not going anywhere. I mean, I guess you can push it. You can get a tow truck or something, but it's like you're not gonna go very fast. If you have a high performance engine, you can go really fast. So that's the way you think about it. An engine that's just sitting there in a yard, like that's
(23:48):
not gonna help you get where you wanna go either. So you need both of these things. So communication is the external expression of empathy. And when we have a high empathy environment where it is optimized, what we observe is less friction in our conversations. Because it's a predictive measure, I can predict with a relatively high accuracy, and I understand the message
(24:09):
that I'm trying to say, and the first time that I send it in a high enough fidelity way that you understand what I'm talking about. So think about this as the three-way handshake that operates the internet. So the three-way handshake, on the internet this is the HTTP protocol. The S, we bring in security. So I'm the server. I wanna send information, right? I'm like, hey, here are my packets.
(24:30):
This is what I want, and I wanna send 'em in this order. Galen, you know, you're the computer, right? And you're like, oh yeah, okay. I got these packets and I got them in this order. You're sending that back to me. I'm like, yep, those are the packets that I sent, and they're in that order. We're good. That takes a lot of time and it takes a lot of resources, but it's accurate.
(24:51):
So there are, this is like active listening. A lot of times you need that back and forth, especially like if you don't have a lot of overlap in terms of your perspectives, or if emotions are heightened, right? When there's added complexity or uncertainty in the system, you need to slow down and double check this accuracy. But there is not, like, this doesn't apply
(25:11):
to every situation.
So there's another type of internet protocol called UDP, which is for streaming. So like UDP is what you're using to, like, we're not checking with every person who's listening to this and verifying that the packets of our audio and video are coming in. We're just going, here's our content. Andrea's just spewing conversations.
(25:31):
Right?
And we're trusting that it will get where it needs to go, but we don't need to spend the resources on that. So, in a human world, this is playing charades. This is, in an agile situation, throwing sticky notes up against the wall. Just get them on the wall. We'll observe later. Because if I pause every time somebody says something in a game of charades, I'm like, what do you really mean by that?
(25:53):
What we're trying to do in the UDP protocol is get as many pieces of information out as we can. But at the same time, why are there so many miscommunications?
Galen Low (26:02):
Right.
Yes.
Andrea Goulet (26:03):
And I think that
it's like we're understanding
the car, but we are not understanding the engine. And so if we start getting good, you need both. Because if you just have empathy but never say anything and never interact with anybody, that doesn't generate innovation.
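As a rough sketch of the two conversation protocols described above, here is a toy Python model. It is not real networking code and not anything from the episode; the names handshake_exchange, stream_exchange, and Receiver are invented purely for illustration. The handshake path confirms the message was heard back correctly before moving on, while the stream path just fires ideas out without acknowledgement.

    class Receiver:
        def __init__(self):
            self.notes = []
        def restate(self, message):
            # An attentive listener plays the message back accurately.
            return message
        def note(self, message):
            # Stream mode: just capture it, no confirmation sent back.
            self.notes.append(message)

    def handshake_exchange(message, receiver):
        # Handshake-style: send, have the receiver restate it, confirm it matches.
        # Slower and resource-hungry, but high fidelity (active listening).
        playback = receiver.restate(message)
        return playback == message

    def stream_exchange(messages, receiver):
        # Stream-style: fire ideas out with no per-message acknowledgement.
        # Fast and cheap (brainstorming, sticky notes), but some messages can
        # land garbled, which is where miscommunications creep in.
        for message in messages:
            receiver.note(message)

    listener = Receiver()
    print(handshake_exchange("Ship the beta on Friday", listener))   # high stakes: confirm first
    stream_exchange(["idea one", "idea two", "idea three"], listener)  # brainstorm: just get it out
    print(len(listener.notes))

The point, as in the conversation, is picking the protocol that matches the stakes and the uncertainty, not treating one as universally better.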
Galen Low (26:15):
Here's what I thought
you were going to say, and
I'm glad you didn't go there.
I thought you were like, oh, UDP, like that's broadcast. Don't do that. That's just like sending the memo, the corporate memo that says, come back to the office. Do AI, do two people's jobs. Thanks, bye. Don't come to me. And, but this is not a dialogue. But also, where you took it was, sometimes you need it to get the ideas out. And coming back to the conversations
(26:36):
that you shared with me, that you had with Scott, right? About like, let's communicate about, like, what protocol are we using? And I know that for a lot of people that sounds very, like, wooden in a social context, but it's like, listen, is our goal to get a whole bunch of stickies on the board and then talk about it? Or is our goal to put a sticky on the board and talk about it, make sure we're on the same page, and have that fidelity? Like, what fidelity are we going for?
(26:56):
And that is something that then you can, like, it translates into more efficient communication. Like if you were to look at it from that business lens. In a slightly more, I guess, data oriented way. I was gonna say sterile, but that's just not quite what I mean. But what I mean is like, okay, why am I doing this? Why am I gonna invest in people having empathy and having an empathetic communication framework?
(27:18):
Oh, because there will be fewer miscommunications. That's great. Because that happens all the time. We've just accepted that communication is imperfect, so we just bake it in, and now, okay, we can have higher fidelity in our communications and we can be more efficient. Then there should be less communications. And where my head goes is like, okay, that makes sense. How do I measure that as a business? How do I go, you know what?
(27:39):
All that time we spent figuring out what our protocols are, what our infrastructure is, and it's an ongoing thing. It's not just a, you know, start and finish and we're good. This is ongoing time where we're investing.
How do I know ifit's paying off?
Andrea Goulet (27:52):
Yeah.
Looking at two different things: lagging indicators and leading indicators. Leading indicators are what we want. These are the levers that we can pull. So these are things like psychological safety. You might think it's woke; there's math. It backs it up. Like when people... and I think too, it's like, let's take the virtue signaling out of it. What is it? Like, Amy Edmondson coined the term in 1999, and
(28:16):
it's that people in a group feel comfortable in a social setting speaking up without the fear of being seen as ignorant, irresponsible, or shamed. So we need this. Especially to catch bugs, or to say, oh no, that's not that, but what if we did this, to be able to build on each other's ideas?
(28:36):
So it's things like that.
So there are leading indicators that we can measure, like, do we have engagement? So one is, how effective are meetings? We did this where it was like, so in the Agile manifesto, the one principle that I'm like, eh, not quite, is, you know, it talks about communication. It says that face-to-face communication is
(28:57):
always the best. It's not. That's okay. They're not communication experts. Get it. Standups. How many people have been in, like, a standup or a sprint planning meeting that's supposed to last for 15 minutes and it consistently is an hour? That's because it's a mismatch, a status update. So what we did, you know, was like, oh, this is a mismatch. So we're gonna have a Slack channel that says standups.
(29:19):
And everybody's just gonna post it asynchronously and say, here's what I'm working on. Tag somebody if you're blocked, read what people are doing. And then it takes everybody five minutes instead of 15. But then it's like, okay, save that for the prioritization meeting, save that for the retrospective, because that's where we want to, again, have that more three-way
(29:41):
handshake type of dialogue, especially in a retrospective. So, are your meetings running effectively, as an example? Like, is there spite on your team? Are people, like, getting along? I could go into spite.
Galen Low (29:55):
That's fascinating.
I love that.
Andrea Goulet (29:57):
It's
incredibly contagious and
one of the biggest factors that will take down a team
dynamic incredibly quickly.
Exponentially quickly.
Galen Low (30:04):
I believe that.
Andrea Goulet (30:05):
Yeah.
So there are metrics that we can do and levers that we can pull. You know, it's like, how many miscommunications? Like, you can measure it quantitatively. This is like a survey, qualitatively, kind of thing? If you do that, people tend to be a little burnt out on surveys. So, you know, there are other ways to do that. But then there's also the lagging indicators, which
(30:27):
is, what's our throughput? How many new ideas are we coming up with? What's our retention rate? Are people quitting?
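As a loose sketch of how one leading and one lagging indicator from this part of the conversation could be tracked, here is a tiny Python example. The numbers and variable names are invented for illustration, not data from the episode.

    scheduled_minutes = 15
    standup_durations = [55, 60, 48, 62, 15]  # observed lengths of the "15-minute" standup

    # Leading indicator: how often the standup overruns its timebox.
    overrun_rate = sum(d > scheduled_minutes for d in standup_durations) / len(standup_durations)

    # Lagging indicator: retention over the same period.
    headcount_start, departures = 12, 3
    retention_rate = (headcount_start - departures) / headcount_start

    print(f"Standup overrun rate: {overrun_rate:.0%}")  # a high rate suggests a protocol mismatch
    print(f"Retention rate: {retention_rate:.0%}")

The same pattern extends to the other signals mentioned here, like counting logged miscommunications or survey-based psychological safety scores.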
Galen Low (30:34):
Interesting.
Yes.
Fair.
Andrea Goulet (30:36):
So I think the
challenge is that pretty much
any metric, when it comes to organizational health, I think could come back to empathy in some way, shape, or form. So because it's so foundational, if we spend time optimizing that, I just feel like it's a huge leverage point, because we get exponential benefits from a lower investment.
(30:58):
But if we want to get those benefits, we have to be very clear about what it is, and instead of telling people, oh, have more empathy, or doing the virtue signaling where it's like, I'm an empath and you're not.
No, that actuallymakes things worse.
Right.
So, getting really discrete and really technical around empathy. Thinking about it as infrastructure, thinking about
(31:18):
it as a communication system. This is just where the math and where the science points. So to me, this is just a fun, interesting rabbit warren that I wanna keep exploring for the rest of my career. And yeah, I mean, when we do this, then we can do what humans do best: we can solve problems together, we can communicate effectively.
(31:41):
We can create a sense of belonging and community. Which are things that we need to survive.
Galen Low (31:46):
I wonder if I
can kind of bring you back
because there is something you said... well, there's several things you said that I know my listeners can relate to. Standups and meetings that just aren't efficient or effective. Miscommunications, spite on the team, morale, you know, a lack of safety in the sense of being able to sort of voice concerns. If I was going to, like, take this practically, and someone
(32:08):
listening has a team where they're like, yes, there's a lot of spite on my team. Andrea, what do I do? What do they do? What are some things they can do?
Andrea Goulet (32:16):
Run
small experiments.
So the reason that we ended up doing the status meetings, right, and if you listen to all of the stories of breakthrough innovations and how we used empathy, the reason that we did that was because we paused and said, this thing isn't working. So we observed something: what's going on here? And then it was like, what if we blah, blah, blah?
(32:39):
So then it's like, well, okay, well let's test that for a couple weeks. And there were plenty of things where it was like, whoa, that doesn't work at all. Right? So when you run a test, it's like, keep it, modify it, or discard it. So the standups one. It was like, okay, y'all, this is like annoying. Does anybody like having a 15 minute meeting? And then it constantly runs over an hour and we all feel like it's a waste of time.
(33:00):
Raise your hand if that brings you joy. Right? Crickets. So it's like, okay, we're all unhappy. How do we make ourselves happy? So it's not going out and looking for this perfect thing that somebody else has created. Because nobody can create the perfect thing for your context, in your present moment, with your complexities
(33:21):
and your team dynamics. And this is what empathy allows us to do. It allows us to collect data on different perspectives and then innovate and say, well, what if we did this? So the standups one, it was like, well, what if we just posted it in a Slack channel? I was like, that sounds good. I could do that. We ran it and then we kept it for over a decade, 'cause it was like, this is great.
Then we experimented and it was like, well, what if we do
(33:42):
this with the retrospectives? And it was like, ooh, no. We're actually losing a lot of important information. Let's actually, instead, like, let's take the time that we were spending in the standup and let's add that to the retrospective. So both meetings were like 30 minutes. So we deleted, essentially, the standup meeting, and then we extended the retrospective meeting.
(34:05):
Now, computationally, like, we can understand why.
Galen Low (34:07):
Right.
It's like, we needed a different protocol for this. Standup is a different protocol than retrospectives. We ran some experiments and yes, this is more effective.
Andrea Goulet (34:15):
But that
was not because I went out
and read, like, this is the perfect thing you should do. In fact, all of the things that said you should do this said I shouldn't. And I was like, I think you're wrong.
Galen Low (34:24):
Hold my tea.
Andrea Goulet (34:25):
Yeah,
and, I think this is where people... Yeah. Another thing to look for is how much slack and redundancy do you have in your system? Is your calendar so full that you have no time to just have these kind of more spontaneous and asynchronous sessions to dive into a problem?
(34:47):
And there's some more here if people wanna nerd out. Elinor Ostrom, she's an economist. She did work on the tragedy of the commons. And so, how do people best manage a fixed resource?
Galen Low (34:59):
Interesting.
Okay.
Andrea Goulet (35:00):
So they
don't all deplete it.
So it's things like, okay, you've got a bunch of fishermen, there's a limited amount of fish stock.
Galen Low (35:04):
Right.
Andrea Goulet (35:05):
Self-interest
will kick in and then
everybody loses because the, all the fish are gone, right? So how do we coordinate our activities when everything's really high and we all have to sacrifice? So she, that's her thing. So she came up with a bunch of different stuff. One of the most important things was the
opportunity for chitchat.
Galen Low (35:22):
Okay.
Interesting.
Andrea Goulet (35:23):
Yep.
What this is, is hyperlocal knowledge transfer and coordinating.
Galen Low (35:29):
Ah.
Andrea Goulet (35:29):
So if I say
that, then you're like, oh
yeah, that's really important.
But if I say you need to have time to let people chit chat, it's like...
No.
We need to get things done.
We need to be productive here.
Galen Low (35:39):
Yes, but
chitchat is productive.
It is coordinating.
Andrea Goulet (35:41):
Right.
It's not forced fun. It's not like, you need to come to this team bowling thing. Or axe throwing. Yeah. And if you don't eat the chicken fingers and have fun, then you are not... It's not that. And you can do things as easy as, like, one of the policies that we ended up experimenting with, it worked really well, was we gave a five minute grace period to every internal meeting.
Galen Low (36:01):
Yes.
Okay, I see.
For chitchat, or just left it open?
Andrea Goulet (36:04):
Left it open.
If you need a bio break because your last meeting ran over, get yourself a... like, use the loo, do what you need to do, because it's important. You should be taking care of your body. We need healthy people to create healthy systems. Most people showed up, and it was like, the purpose was chitchat. Okay. And so a lot of people showed up. It's like, it's unstructured, just... and that was where
(36:26):
a lot of exchange happens. Like, hey, what are you working on? It's like, oh, I'm doing this thing. Oh, I actually did some research on that for my other client. I can share what I learned. That's a hyperlocal thing, 'cause it's a highly specific problem. Those two people probably wouldn't have had the opportunity if we had only stuck to the agenda the entire time.
And I'm not saying don't.
Build these opportunities for effective slack in your system
(36:49):
and let the team experiment. You need a little bit of a sense of autonomy. That's really healthy for a team too. That doesn't mean letting go of accountability. But create the culture of having these experiments. So that's what I dive into a lot in the remote teams course.
Galen Low (37:06):
I love that.
You know what I like about it is it sometimes just all comes back to something a lot more innate in our humanity than business stuff. We're like, oh, chitchat. That sounds unproductive. Guess what? We built our entire species on chitchat. Look, like, hyper localized coordination, cross training, information sharing, not because it said so on the agenda.
(37:29):
That's how we innovated. We're like, oh, you're working on that thing? That reminds me of this other thing that has nothing to do with that. Let's chitchat about that, and you're gonna get an idea. Maybe that is going to take you further than we would if we followed the agenda and used the full 30 minutes for the meeting.
Andrea Goulet (37:43):
So here's
another way to think about it.
Humans aren't the only hyper social species. Ants. It starts raining. Ants all send pheromones out. They send chemical signals from one ant to another, and then the entire colony can relocate. But not necessarily from, like, the queen saying, okay, I'm going to dictate, I'm gonna create a hierarchy.
(38:03):
Here's the business plan. Here are all the KPIs. Which can be important, especially if you're thinking about long-term things. But it's like everybody interacts, like all the ants send chemical signals to the, like, to the other ants around them. This is what's called a complex dynamic adaptive system. So again, a term that you can go research, but there's interesting math behind it.
Another example is bees.
(38:24):
Bees go out, they're scouts, they look for flowers. When they have, like, whoa, y'all, there is this, like, amazing field over here. We can get some pollen. They come back and then they do a dance. They wiggle their butts, and there's different specific dances to say, like, what direction and how much and stuff. So ants send chemical signals.
(38:45):
Bees do a butt dance. Humans use language and empathy. It is that, like, mechanism, and this is where the system infrastructure comes in, because we can then use those infrastructure patterns and knowledge to do that better. Because this is how we interact. This is how we transfer knowledge.
(39:06):
And so if we do things like cut off chitchat completely... and I'm not saying the organization should be only chitchat. Nobody gets anything done that way. Like, you... It's about having the right balance of all of these different things.
Galen Low (39:18):
It's these
protocols, right?
That's what makes the infrastructure. It's interesting, 'cause it ties it all the way back to something you said at the beginning, right? This idea of self, this idea of community, and this idea of otherness, right? If you don't think you're an ant, then you're not gonna send the pheromones. If you don't think you're a bee, you're not gonna wiggle your butt when you come back from a great flower, and it's not gonna benefit the whole.
Andrea Goulet (39:38):
Yeah.
This is not a term, at least that I have seen or researched. I think of it in terms as, we have miscommunication and we have discommunication. So we have, like, okay, I attempted to interact with you, but it didn't quite go as we wanted to. Like, the information wasn't exchanged effectively. That's a miscommunication.
(39:58):
Discommunication is, I am isolated and I'm not interacting with anyone. That's how I think of it. And so we've got a lot of discommunication. These are echo chambers, right? These are social media algorithm islands, right, that we're getting split into.
Galen Low (40:12):
Even the conference
we're at, you know, I noticed
there was a movement of, no, I'm not gonna be going to this conference, 'cause actually I'm better working in isolation. Like, I'd rather not sit and talk about values. I'd rather just get my work done. Thank you very much. Bye. I've been thinking about that the whole time we've been talking. 'Cause I'm like, we are social creatures. Not to say that people who prefer to be isolated are doing
(40:32):
anything wrong, but it just, the more we talk about it, the less sort of natural it seems. We've evolved as social creatures.
It's a hard one for me.
Andrea Goulet (40:40):
Well, and it's
difficult if you don't have the
right environmental factors and the right resources internally. Like, if I can't regulate, if I'm so emotional about a particular policy or injustice that I feel, or if I feel like I've been wronged in some way, I'm naturally going to disconnect myself from people
(41:02):
that I feel angry with. So it takes a lot of wherewithal, you know. I think this is... sometimes empathy is described as a muscle. This is where it is, like, going out and being like, I'm not comfortable talking to somebody who doesn't say yes all the time... Getting to your thing on AI, I think that might be a good segue.
Galen Low (41:20):
Yeah, let's round out there, because AI is a big elephant in the room. I think some people would argue that AI means we don't need to focus as much on empathy and being social and human, and some might argue that actually it brought us all to the same table, the vendors, the makers, and everyone in between. Now, feeling like we can collaborate, should collaborate,
(41:40):
but not necessarily being on the same page yet. Tell me how empathy fits into the world of AI, in your mind.
Andrea Goulet (41:48):
Yeah, so I
think some of this is, like, I can't give answers that are as confident as I can with others, because this is just a very new field, and so, a huge caveat. These are my instincts, not facts. So take everything I say here with a grain of salt. This is my futurist best-knowledge self.
(42:10):
I think there are fields like artificial empathy, like we're trying to train models to be more empathic. The challenge there, I think, is again understanding what is empathy and what looks like an empathic response. Because in AI, what sometimes looks like an empathic response, the technical term for it in the field is sycophancy.
(42:34):
So the AI is acting like a sycophant and just saying everything you wanted to say, and then you're like, oh my God, I'm this brilliant person, right, in the world. Thank you. And it's even hard to temper it down. Like, I've got process files and stuff where I'm like, do not tell me that I'm amazing. And even then, it's hard.
(42:54):
And a lot of that, I think, is because we're trained... My hypothesis... there was a great article in the New York Times today about prompts and, like, how different AI tools and the way that we use AI. You know, there was one study that was cited. It was from a team that studies harm that comes from AI, and
(43:14):
so again, lagging indicator. So what they found was that the people who are building the models, they want it to be truthful, honest, and to not cause harm.
So the challenge, like, this is a programming human thing. The programmers who are creating these large models,
(43:35):
they want to be helpful. They have really good intentions, so they're focusing on the helpful parts of the model first. And so the earliest seeds of it are to train it to be helpful, and then to be harmless, you know, later in the process. So it's kinda like QA, I mean, but this is like, this is how
(43:56):
we have developed software, and we wanna make sure that the feature meets its requirements and then we'll check for bugs later down the line.
Right?
Very natural.
And I don't think anybody would fault somebody. I think we would actually say, look at that really compassionate developer. Like, they want to be helpful. They are putting in the work. Right. But here's the problem. This research team found that
(44:18):
they had a chance to assess the helpful model before the other aspects had been added to it. It was 30% more likely to cause harm.
Galen Low (44:30):
Wow.
Andrea Goulet (44:30):
And to lie or be disingenuous, like, and that is incredibly problematic. So I think when we're talking about empathy in the age of AI, there's just, there's so much that goes into it. Because there are the humans who are creating the models. There is the training data that the models are trained on,
(44:52):
you know, how they're being used. There is the policy and the governance and the regulations. There is the existential dread or utopia. So the same article, it cited, like, the two, like, most cited researchers in AI. One of them is a total, like, we are all going to kill ourselves.
(45:14):
The other is like, no, this is gonna bring in the best. So essentially, when you're trying to predict, you're just playing a big game of Plinko. When you've got such a big discrepancy, like, I have no idea where this is gonna land. But I do know that, to me, understanding the mechanisms that enable effective communication, collaboration on a human level, can't make things worse.
(45:38):
I don't think so.
To me, I think that's the best defense that I have for trying to skew the system towards a more positive outcome. And if that's not clear, it's the age of prosperity instead of we all kill ourselves.
Galen Low (45:51):
Right, right.
Andrea Goulet (45:53):
Yeah.
So I think then, if you understand empathy on a more technical level... here's the other thing, a lot of it is counterintuitive, right? If we follow the science, then we follow the science. If we say, oh no, we need to have these people do this helpful thing first, what we're doing then is actually, like, doing something that is a vanity metric, and we're doing it because it feels good.
(46:13):
It feels like we're putting people first, but we are actually causing systemic harm. So that's where it's like, okay, having that cognitive empathy is important, because we need to be able to evaluate, like, okay, how do these measures work? And yeah, I think that's the biggest thing, is that there has been so much around this stuff that has literally just, like, broken my brain in the best way.
(46:35):
And I think that is kind of what we need a little bit right now. So, yeah, and the other thing I think that is really important is to remember that AI is a tool, not a teammate. It can talk to you and it very clearly passes the Turing test. It sounds human, like, you know, when I use it sometimes, like when I've trained a model to sound like me, it can write better than I can sometimes. And I'm a professional communicator and I've written
(46:57):
professionally for 25 years. Like, it's scary, right? How do we use that as a tool? You know, now it's like I'm using it to capture more nuance, and I'm actually, like, having more dialogue to refine my thinking rather than just trying to focus on speed. So I'm actually using tools to go deeper.
(47:18):
Like, help me think about this in a more skeptical way. I do tend to be a little bit more of a... I have an optimism bias, so by asking it, like, ask me critical questions to help me refine my thinking, changing the way I'm prompting it. So then that way I'm focusing on depth more than just, like, generating a bunch of gibberish.
Galen Low (47:37):
Speaking of AI as a
tool, something you said to me
as we were prepping for this, you said the purpose of AI is to help humans human better.
Andrea Goulet (47:44):
I did say that.
Yeah.
I think that's the mission.
I think like.
And I think, like, for the developers who are building these models, like, okay, I wanna be helpful. Why do I wanna be helpful? It's 'cause I want humans to human better. And so on a team, this can look like, you know, Genworth. I'm in Richmond, Virginia; they're a company that specializes in insurance
(48:04):
and stuff, and finance.
The CIO is on a panel.
I really liked the way that he shared things. He's like, you know, AI is an empathy enabler. Because the worst thing we wanna do is have a chat bot interact with somebody at, like, a major crisis in their life. That is the opposite of everything we stand for. And it's like, organizationally, and I like... that's
(48:25):
gonna lose us money. How can we use AI to eliminate the things that are getting in the way of having our agents be there and extend even more time and more compassion with each of our customers?
Galen Low (48:38):
I love that.
And that translatesinto teamwork.
Andrea Goulet (48:41):
Yep.
That is a great example of using AI to help humans human better. And so I think that where you can use this is, you know, AI very much is the wild west. A lot of it is like, oh, it's used for vibe coding, it's used for this and that. Where in your organization can you leverage this as a tool to help you run experiments, or to help you brainstorm, or to help you... and so I think AI literacy, in terms of what
(49:04):
is it, and thinking about it as a tool and not replacing an agent, as in just gonna have an army of chat bots run our company. There are some use cases, I guess, where that can be the most efficient, but I would argue that most business models or organizations don't fall under that bucket.
(49:25):
So that's the way I think about it, is the purpose of AI is to help humans human better. And the way that we do that isn't necessarily by following our instincts. Like, that can be a really good place to start, but it's by following the science around how do we effectively communicate with each other.
(49:46):
And again, going that layer deeper, that ends up becoming a question of, how do we effectively empathize with each other?
Galen Low (49:53):
I love that.
It actually makes space for empathy.
Andrea Goulet (49:55):
Yeah.
Well, I mean, empathy exists.
It's like, that's like saying make space for
your heart beating.
Galen Low (49:59):
I mean, and that's the kind of the thing that, like... that is the sort of business argument, I guess, in a way, right? It's actually, it's like, we didn't have enough time for empathy, to have these conversations about values and the way we're communicating and whether we're communicating effectively. And yet that is the equivalency of not leaving enough space for your heart to beat.
Andrea Goulet (50:15):
I would actually
argue, so that made me think
about, you've got different functions, like breathing, your respiratory system. It's both voluntary and involuntary, so for the majority of the day I'm going around, I'm not thinking about, like, air going in and out my nostrils, right? But when I sit down and I regulate, like meditation, creating a condition where my heart rate is going to slow, or
(50:40):
going on a run, like knowing how to have these different physical systems and how to optimize those, I think there is then the case of, like, no, you do need to make space to listen to your heartbeat and to regulate that. You do need to make space to listen to your breath. Calm yourself down or get more air in, or whatever it is.
(51:02):
And so, you know, empathy in a similar way, like, we have it, it's a predictive tool. We're using it, we're using it in different ways. It gets expressed in different ways, but it's there. But are we taking the time to really observe it and to think about how we can do it better for our organizations, or for whatever mission that we want to get a group of people together and achieve something.
Galen Low (51:24):
I like it.
Andrea, thank you so much for spending the time with me today. It's been so much fun. I love the ideas you bring to the table. I will try and gather some of the names and some of the research and put them in the show notes, because I find talking to you is almost like you're this hub and the spokes just shoot out from it. I'm like, okay, now I need to go and look at this and this and this.
This has been really great.
Thank you again.
Andrea Goulet (51:44):
There's
also, you had mentioned the
history of the word empathy.
I have the very first article that I put up on my LinkedIn newsletter. It was defining empathy as, like, nailing jello to the wall.
Galen Low (51:55):
Yeah.
Andrea Goulet (51:56):
Because
that I was like, okay, I'm
interested in learning about how to describe empathy. I should probably just start with, like, what is empathy? And it's like, the literature is, there's not a lot of consensus, and we're getting all this new data just in the past decade or so, because the neuroscience has gotten better. So before, we were only operating on behavioral outputs of people saying how... We didn't even get into, like, all the stuff we talked about, empathic accuracy and stuff like that.
(52:20):
But yeah, there's so much here that I think it's endlessly fascinating. We have something that's endlessly fascinating and can also have a positive impact. I just, that sounds very exciting to me.
Galen Low (52:32):
I love that.
I love that.
Andrea, where can people learn more about you?
Andrea Goulet (52:35):
So you can go to
my website, andreagoulet.com.
I got a contact form on there.
So I'm an independent; I call myself a communications systems architect, so I work with organizations to help humans human better so that they can achieve their business goals or organizational mission. And then I'm on LinkedIn, so you can follow me there. I've got a newsletter there as well.
(52:56):
I am not on any other social platform. I have a Bluesky account, and I think you'll see like six posts on there. So LinkedIn and my website are definitely the best places to follow me and get in touch with me.
Galen Low (53:09):
I'll include those
links in the show notes.
Andrea, thank you again.
I really appreciate it.
Andrea Goulet (53:12):
Yeah, thank you.
Galen Low (53:15):
That's it for
today's episode of The Digital
Project Manager Podcast.
If you enjoyed this conversation, make sure to subscribe wherever you're listening. And if you want more tactical insights, case studies, and playbooks, head on over to thedigitalprojectmanager.com. Until next time, thanks for listening.