Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_02 (00:00):
So we are still in a situation where there is a productivity crisis. The difference is that these AI tools will mask a lot of the productivity issues.
SPEAKER_00 (00:16):
Hi, and welcome to another episode of Sell Me This Podcast. This week we're joined by Tim Chen, who is one of the co-founders of Untapped Energy. We covered a lot of ground in this episode, talking about everything from AI literacy and AI fluency to the importance of data as a foundation for getting your AI strategy right. I can't wait for you to have a listen.
(00:36):
Welcome, Tim. We're super excited to have you on another episode of Sell Me This Podcast. We are going to dive right into the conversation today. Why don't you kick things off by introducing yourself and telling us a little bit about who you are?
SPEAKER_02 (00:48):
Absolutely. First off, Keith, thanks for having me here. I am so delighted. My name is Tim Chan, and I was born in Toronto but moved to Calgary in 2005, so for the most part I do consider myself a Calgarian. In the daytime, I work as an assistant professor at the Bissett School of Business at Mount Royal University.
(01:09):
I'm also a co-founder and director of Untapped Energy.
SPEAKER_00 (01:13):
Oh my goodness. Okay, so there's a lot to unpack there already. And I know that we're hot off the heels of a really successful Untapped Energy Education Summit, which I think we can dive into. But both of those things seem complementary but separate. How did you find yourself doing the work of both of those things?
SPEAKER_02 (01:31):
Actually, it was all by accident. So maybe I'll share a little bit of the origin story of Untapped Energy. Back in 2018, which really felt like such a long time ago, there was a lot of chatter and excitement about this new topic, big data, and the idea that somehow big data would be the mechanism by which organizations would be able to
(01:53):
create more value, particularly in the oil and gas sector where I was working at the time. And at the time, there was also a lot of growing concern, as is typically seen whenever there is any disruption to business processes or technology. And so there were a number of oil and gas professionals, primarily geologists, geophysicists, and petroleum
(02:16):
engineers, who were starting to wrestle with this idea of, well, if all of a sudden the organizations were going to lean heavily into this big data concept, did they have the skills and the competencies to continue to thrive in this particular sector? And as a result of that, many of them became concerned. And so that's really how Untapped Energy started.
(02:37):
So there were a number of geologists, geophysicists, and petroleum engineers in oil and gas who got together and thought, wouldn't it be great if we could create a community of professionals in oil and gas, and organize events that could showcase what it would be like to collaborate, to come together, and to solve problems using data?
(02:58):
And so the idea was born to create the very first oil and gas datathon. Now, they were, I guess, going through their networks to try to see who they could get representation from among some of the large oil and gas companies. And then they came to my company. Now, I was third on the list. They had actually two other data scientists that they had wanted to get involved.
(03:19):
As fate would have it, those two were not able to attend the meeting. So they're like, okay, fine, we know this accountant from this organization, let's give him a ring. And it turns out I attended, and the very first question I asked was, well, what is a datathon? Well, I would learn that it is similar to a hackathon, except you gather people, you provide them with data sets and with
(03:40):
some data tools, and give them three days over the course of the weekend to try to solve some problems using these particular tools and techniques.
SPEAKER_00 (03:50):
Very interesting. And so, yeah, I think there are a lot of people that are very familiar with the hackathon. And with the datathon, you almost start from the other direction, where you say, okay, we have this large set of either structured, I'm guessing, or unstructured data, and what can you find out of it? What does it actually feel like to be part of that kind of experience?
SPEAKER_02 (04:10):
Well, it's amazing. And first off, when you were talking about whether the data was structured or unstructured, I would say that it was all messy data.
SPEAKER_00 (04:18):
Yeah, it wasn't tied up neatly; there was no bow across it.
SPEAKER_02 (04:22):
We wanted to take a look at some of the biggest challenges facing the oil and gas sector at that time. There were a lot of headwinds facing the industry. You know, you had takeaway constraints, so there just weren't enough pipelines to move some of this crude product. There were also government regulations that weren't very favorable to the particular industry, and it was an industry
(04:45):
that was very sensitive to market fluctuations in commodity pricing. So all these things were creating challenges for a thriving sector. And so the datathon essentially went out to look for publicly available data sets related to oil and gas. Now, little did we know that that was such a wide net to
(05:05):
cast, and what we eventually got back was just messy, messy data that required a lot of cleansing, a lot of structuring, and ultimately it led to five specific questions that we were trying to solve. Everything from corporate social responsibility to the repurposing of existing oil and gas assets to clean water
(05:30):
usage.
SPEAKER_00 (05:31):
So, as the individual that, you know, either through the lucky straw or the short straw, attended that first meeting, did you have a lot of knowledge in big data at that time, or was this something that was more of a passion project that you were just excited to be a part of?
SPEAKER_02 (05:45):
I just naively stepped into something that I thought was cool to be part of. It turns out that I was surrounded by a number of data scientists, mathematicians, computer scientists, those that you would typically think would be in this space. And so I had the greatest form of imposter syndrome that entire weekend. But I pushed on, just thinking this would be great to build a
(06:08):
community around just wanting to solve complex problems. Well, halfway through the event, as I was having some serious doubts about my involvement with this event, someone came up to me, my friend Sheldon, and he said, Tim, you need to put aside this doubt that you have. You're an accountant, which means that every month you're going into the organization's enterprise resource planning
(06:31):
system and extracting the data. It turns out that the data you pull out, you really can't use, so now you're having to transform that particular data. And then after that, you're doing some analysis, drawing some insights, and then you're loading that data back into some sort of management system. He said, Tim, that's ETL.
(06:51):
As an accountant, for the last decade, you've been doing that every single month. And it was that aha moment that really set aside all that worry, and I'm like, actually, I do belong here.
SPEAKER_00 (07:02):
I find it really interesting. We talk to a lot of different professionals that aren't in technology, and they believe that technology is somewhere else, off on the horizon. It's something that supports what they're doing. But there's so much technology practice that actually comes into play. I love that example that you shared, where you were doing that work already, you were
(07:23):
building those processes, you were really embodying exactly what that group needed, but you just didn't have the right label for it.
SPEAKER_02 (07:33):
Absolutely, Keith. And sometimes maybe that's all that's needed in order to make a difference. You just need to have a little bit of curiosity that allows you to extend yourself beyond your zone of comfort, and always be willing to learn from others. So there is a little bit of that humility that really does open up different doors.
SPEAKER_00 (07:52):
So speaking of doors opening up, how did you then transition from this really inspiring weekend datathon to the founding of Untapped Energy?
SPEAKER_02 (08:02):
Well, after the weekend was concluded, we sat back and we're like, wow, this is so amazing. Over 125 people just showed up out of the blue, voluntarily. Nobody was paid to be there. This was when Bow Valley College opened up their downtown campus, so they graciously offered that space to us. And we thought, we need to do this again, but we can't wait
(08:25):
until the next year's datathon before bringing back together such a compassionate, impassioned community of people that just want to do good and solve problems using data. So that's when we said, well, why don't we start meeting on a more frequent basis? And that was the start of our monthly meetups. And so since 2018, Untapped Energy has been meeting on a
(08:46):
monthly basis, where we invite speakers to come in to share a little bit about their challenges in using big data, but also to highlight how it is that they've overcome some of these problems. And we have found that that type of gathering has been so inspirational, because people could be wrestling with the exact same problems at that moment, hearing how someone has
(09:07):
solved it for them.
SPEAKER_00 (09:09):
And who are the types of people that you're finding at these meetups? Is it big data experts who are diving into all of the data, or who would be attending these types of meetups?
SPEAKER_02 (09:21):
Well, initially we thought it was that type of persona, right? The person who's spent many years doing computer science or is into machine learning. And certainly we had many of those participants show up to these meetups, but then we started to see others in business showing up: the accountants, the lawyers, those that are in HR. Because it turns out that in this day and age, everybody is
(09:44):
dealing with data in one form or another. If you send an email, you're dealing with data. If you have to update a spreadsheet, you're dealing with data. And it also turns out that a lot of these users of data wrestle with a lot of the same challenges, and are motivated and curious about how they can either make their own workflows a little bit more efficient or solve some of
(10:07):
the more complex problems.
SPEAKER_00 (10:09):
Very interesting. And so when you were first starting these meetups back in 2018, was AI part of the conversation, or was it very much around the data itself?
SPEAKER_02 (10:18):
The wonderful thing about the city of Calgary is that there is a very vibrant grassroots movement, where if there's any type of interest, there's probably a meetup group for that. So very quickly we started to see that members of a data community were also part of coding communities, or machine learning
(10:38):
communities, or an artificial intelligence community. And the generous nature of this particular city is such that everybody is always looking for ways to collaborate, to partner, and to help each other up. It's this idea that if you can lift the entire tide, then everyone gets the benefit. And so, yes, even in 2018, there was, I'd say, kind of a smaller
(11:02):
movement around artificial intelligence. It wasn't yet at the stage where it had captured the culture's imagination. But still, there were those that knew all about neural networks and these transformers, all of these things that, you know, maybe some five years later we would eventually hear more about.
SPEAKER_00 (11:19):
Well, on the AI topic as well, I believe that's something a lot of people don't fully understand: it isn't just the release of ChatGPT that has kicked everything off. This is a conversation that's been happening not for years, but for decades. It might have had different labels, it might have had different names, but this is a conversation
(11:40):
that has been happening for a very, very long time, and data is an incredible part of that, and I think people are starting to see that more and more now, too. So, how has the conversation around data and Untapped Energy changed with the momentum around AI right now?
SPEAKER_02 (11:59):
Yeah, certainly I'd say that machine learning, deep learning, and artificial intelligence have always been a feature of a data community. And I would say that with the release of ChatGPT in November of 2022, that became kind of a groundswell that has just exploded, and now everybody is
(12:19):
aware of it. But what it has also done is remind our community of the importance of data and data literacy and data fluency. Because all of these algorithms that are running a lot of these generative AI tools that we've now become familiar with, it all starts with data. We keep hearing about how all of these large language models
(12:40):
need to be trained on data. Where do you get the data? And so someone who is fluent in data has the skills and capabilities of knowing where the data is, knowing how to create the necessary pipelines to extract that data in an efficient manner, and then doing something with that data so that you can put it in a form that's a little
(13:02):
bit more structured, so that you can actually start running some work on that data.
SPEAKER_00 (13:07):
And so I don't want to get too technical today, but I think you have a really interesting perspective around the importance of that foundation. If you're talking to a listener, to a business owner who knows that they have to walk down this path, they know they need AI, they know they need all of these things that everyone's talking about, but they haven't the
(13:28):
faintest clue where to start, and they also hear this narrative that your data needs to be in order, your house needs to be in order. What are some of the things that they should be considering or contemplating to start walking down that path? Or is there any low-hanging fruit that exists for them, as someone that might not be as intimate with this as you are?
SPEAKER_02 (13:46):
So there have been a number of AI pilot projects that have been completed at this point. Unfortunately, many of them have not moved beyond the pilot phase because of some of the results that were observed. And one of the main reasons why these particular pilot projects
(14:07):
are not achieving the expected or desired outcomes is down to data. The data is just not in a good enough state to enable a lot of these tools to really leverage what's in the data, to create some amazing insights, which would then enable the organization to make some quick decisions or to
(14:28):
optimize some of its processes. And so for a leader of an organization, knowing that this has been the trend, that data has been one of the downfalls of successful AI implementation, what they can do now is be really proactive about ensuring that even before any type of AI initiative is proposed, the data is in order.
(14:53):
Now, one of the challenges for a lot of organizations is that they've inherited data from various legacy systems or legacy approaches. All done while, let's be honest, people didn't know what data to keep. People didn't know how they should label their data, or store it in a manner that followed some sort
(15:13):
of structure. And so you end up having these piles of random Lego bricks all over the organization. You know that at some point you might be able to draw some sort of insight out of those Lego bricks, but it's in such a mess. So by the time an AI project comes along, you know, they have to put a lot of effort into that, and
(15:34):
that can sometimes eat up a lot of the actual time and effort of an AI project.
SPEAKER_00 (15:40):
And so if I were to try and unpack that a little bit, as someone that might not have that same technical depth, are there things that I can do to start to put those Lego pieces together in a way where they make sense?
SPEAKER_02 (15:52):
Well, I guess the first is just to know where all the Lego bricks are. So even if there's a way to come up with some sort of mapping or documentation of where the data resides. They're typically in databases, so that shouldn't be too hard of an exercise to complete. But once you've got a good sense of where the data is,
(16:12):
then it is coming up with a particular framework for how the data needs to be categorized. Let's put all of the red bricks together, let's put all of the green bricks together. How about the ones that are maybe two by two, and the ones that are, you know, two by eight? So having some sort of structured approach as a framework is really the beginning of what's known as
(16:32):
data governance, where you have guidance on how your data should be stored, not unlike how organizations treat some of their merchandising inventory, or some of their hydrocarbon inventory in the case of an oil and gas organization.
SPEAKER_00 (16:51):
So that makes sense
to me.
So kind of what you'resuggesting is saying we need to
almost define the rules to playby for data, understanding how
it traverses our organization,where it lives, what street
address it's on, and and reallykind of creating that
infrastructure so that we don'thave to make an individual
decision each time.
SPEAKER_02 (17:08):
Absolutely right.
Because once you know where yourdata is and in what state it is,
that's when you can startactually using it.
SPEAKER_00 (17:17):
Can I ask a little bit of a candid question?

Sure.

I believe that everyone knows that this is super important, or at least maybe the people that we talk to, in the circles that we talk to. But this seems to be a part where everyone falls down, and maybe it's because it's not the shiny object and the exciting work, potentially. But why don't more organizations invest in this
(17:38):
foundation? Because I feel like this is exactly why a lot of them struggle with the actual ROI on their projects.
SPEAKER_02 (17:46):
Yeah, it's such a great observation. And I think it comes down to courage. It's the same type of courage that causes someone to go into the basement of an old building, not knowing what's there, but knowing that there might be a swamp of messiness. And so a lot of leaders who have the motivation may not have
(18:10):
the full courage or support to wander into the depths and the dungeons of where all this data is. So part of data fluency is arming yourself with enough tools and a mindset so that you can put on the right equipment, get the right lighting gear, and then wander into this dark dungeon to see exactly what state your
(18:34):
data is in. And oftentimes, once that happens, that's really the tipping point where organizations feel more confident about how they can pull the data and be able to make it more current and more relevant and more useful.
SPEAKER_00 (18:52):
So you said something very interesting there, which was data fluency. How would you describe data fluency in today's business environment?
SPEAKER_02 (19:00):
Yeah, oftentimes we hear about data literacy as an important upskilling capability that employees should have, that workers should have. I call it data fluency because, just like if you and I were to go to a country where we did not speak the language, you can jump on Duolingo and it'll give you a number of
(19:22):
activities and exercises to help with your literacy. So you may learn the characters or the alphabet, you may learn some of the rules of grammar in order to put those things together. But that doesn't make you fluent, because the moment that you land in that country, you'll realize very quickly that you cannot communicate with someone who has that fluency. And so there is a big difference there.
(19:42):
There's certainly lots of content and information out there to help people on their data literacy journey. But fluency comes from mastering a lot of what you've learned in literacy and being able to put it all together so that you can communicate in a very compelling and inspirational way. So, this is where data fluency is sort of the aspirational
(20:06):
point for someone on their data journey. It will start with data literacy, learning about databases, maybe even learning some of the coding languages that help you clean and organize your data. But really, where the value comes from on that journey is being data fluent, where you can now interpret the data, pull out
(20:28):
meaningful insights, and be able to communicate that in such a way to key decision makers within the organization.
SPEAKER_00 (20:36):
So does this work go away with the increase in AI tools? A question that I've received before is, can I just dump all my data into the AI, and I'm doing some air quotes for anyone that's just listening, and then it'll just clean it up for me? Is that how it works?
SPEAKER_02 (20:55):
Well, certainly, as we're seeing now, there are a lot of AI tools that can help with some of the monotonous and repetitive tasks in a particular workflow. We've seen this before in prior iterations, before it had this AI label, in the form of automation or RPA, robotic process automation.
(21:17):
AI can certainly provide some of that lifting of the tedious and monotonous tasks, which you can find in a lot of the data cleanup aspects and processes. But still, in order to get to the insights that come from that data analytics work, you still
(21:38):
need to have that human intuition, that critical thinking that is needed. There's recently been a report that came out that said that 40% of employees surveyed are now indicating that they're seeing something called AI work slop showing up. This is work that is clearly generated by AI, that looks and
(22:03):
sounds like it's good, but what it's causing is that human workers now have to stop what they're doing and review and inspect this other work to determine whether or not it's actually accurate. And so I'm not sure if a lot of people would have predicted the prevalence of work slop now showing up, and how that's
(22:25):
actually working counter to some of these productivity aspirations for organizations. So, similar to being able to determine what is good work and bad work, those that have data fluency would have the competencies to be able to determine what is good data analytics work and what is not.
SPEAKER_00 (22:45):
That makes a lot of
sense.
And you know, uh to build onthat, I was out for a coffee
with someone the other day, andwe were talking about one of the
vibe coding tools, one of thenew coding tools, and it's
incredible what you can do.
But under having a foundation ofactually how platforms work, you
can take those actual vibecoding tools and take them way
(23:06):
further.
And so, an example, I you know,I had a small use case for
Deliver Digital, I builtsomething out, and I was able to
make something a fairlyfunctional MVP.
In the same amount of time, thegentleman that I was talking to,
he runs a software developmentcompany.
He built something that youcould probably put on uh, you
know, start going for fundingrounds tomorrow.
(23:26):
Like it was a very functionalSaaS application that blew mine
out of the water.
And just because he knew exactlythe right questions to ask, the
framing, the context, datastructure, and having that
foundational knowledge is is amust.
SPEAKER_02 (23:41):
Absolutely. This is not at all discounting the fact that for someone to have valuable AI skills, they should learn prompt engineering, they should learn context engineering, they should learn vibe coding. But I believe where humans will continue to have the advantage is in how you act as a leader in AI.
(24:05):
What does it mean to implement AI responsibly and ethically? How do these AI tools impact change management, which has always been something that leaders have been wrestling with?
SPEAKER_00 (24:18):
Definitely, and I think, to your earlier point as well, how are you managing these AI agents or workflows? Because input and output are two very different things. And if you put in a crappy prompt, or if you put in a crappy request, you're going to get crappy information back.

Yeah.

And if you continue passing that down the line, eventually you're
(24:40):
just going to have a pile of garbage.
SPEAKER_02 (24:42):
Absolutely. So I believe that there is a crisis we're facing right now with AI, and it is called a productivity crisis. And it's not a crisis that we haven't seen before. Actually, Keith, if you think about it, most of the reason organizations go through a digital transformation is because someone has identified a productivity crisis.
(25:02):
We need to use our data better in order to generate more value without incurring more costs. So we are still in a situation where there is a productivity crisis. The difference is that these AI tools will mask a lot of the productivity issues. A lot of incentive structures, and even the way that
(25:24):
organizations are structured right now, are based more on input. How much can I demonstrate that I'm putting into the process? And so this leads to the creation of miles and miles of PowerPoint slides or reports, where you can create entire cottage industries around just doing stuff, or appearing to do stuff.
(25:45):
And don't get me wrong, these AI tools are great at making you look great when you're just doing stuff. But the crisis is this: organizations who aren't focused on the output are going to be caught in a bit of a loop, where workers become better at using these tools that generate slop, or create things that demonstrate that they're
(26:06):
productive from an input perspective, but it does nothing to show that they're actually successful from an output perspective. And so I think the moment that we can address that crisis, through AI fluency, through appropriate leadership mindsets and sensibilities, this is when the needle will start moving on
(26:26):
actually seeing the true value of AI show up in organizations.
SPEAKER_00 (26:31):
So how much do you feel like leaders need to shift their perspective, then, in terms of even what their goalposts are? Because, you know, I remember early on in my career, when I was starting to manage people, I was told over and over again: you can't throw more people at a bad process. Occasionally you can, you know, you can move the needle
(26:53):
forward, but are you moving the needle in the wrong direction in an effective way, masking, to your point, the problem by just having more people? And AI just seems like a cheat code to say, okay, well, we can times that by infinity now, because the people aren't the constraint, and so we can just continually double down, show progress in a way, but it might, to your point, not be
(27:14):
actually leading us anywhere.
SPEAKER_02 (27:16):
You're absolutely right. So I'm suggesting it's not even shifting the goalposts. We need a whole new game altogether. We need a whole new goal, and this is where AI fluency will help to highlight that, so that you're not just using these tools to exacerbate an existing crisis.
SPEAKER_00 (27:37):
And so, how does someone even start to wrap their head around that? If you're talking to me and I own a, you know, $30 million a year construction company, and you say, okay, great news, we need to change the goalposts completely, how do I even start to contemplate that? What are the things that I need to be doing to get myself into a headspace to be able to see how I can change myself, my
(28:00):
business, my goals, et cetera?
SPEAKER_02 (28:02):
And this is not at all to suggest that you get out of that business altogether. You're still in construction, and so you know that at the end of the day, the key thing is that you're going to be building things that are of value to whomever your customers or your clients are, and you need to do it in a safe manner, right, so that the people that are supporting you get to go home
(28:25):
every night. AI fluency just causes you to think about the problem in a different way. Traditionally, we are so used to doing things in a very linear fashion. You've got to do this before the next task can be started, and that needs to be done before the next task can be
(28:46):
completed. What AI fluency can do is say, okay, well, what if we were to iterate on all the possible ways that we can complete this process? And we're no longer constrained to maybe just your five best scenarios. Maybe you can have this technology iterate an infinite number of different outcomes for you.
(29:07):
Call it the Doctor Strange multiverse of possible outcomes. So if you now know that you have a tool that can give you a glance into this multiverse of infinite possibilities, that is how you can start getting some insights on, well, maybe there's a different way that we can sequence the work, maybe
(29:28):
there's a different way that we can utilize our human capital, doing things that allow us to get on that most optimal path to the outcome that we're looking for. So it is a bit of a vague way for me to describe it, but this is maybe describing more of a mindset, not necessarily, well, here are three courses that you can take, two podcasts that you
(29:49):
can listen to, and one university course. Those will still be there, because there still is the pragmatic, tactical approach to learning the tools and capabilities. But what's going to be harder to learn, particularly for leaders, is that sensibility: if you had the opportunity to look and see an infinite number of outcomes for your particular
(30:10):
business, how is it that you can get onto that particular path?
SPEAKER_00 (30:13):
This sounds like how the Avengers movie happened. But it seems almost like a cheat code, if you can start to unlock all of the potential scenarios and probabilities and really start to model out some of these things without an incredible amount of cost. You know, some of the big consulting companies have
(30:34):
built their entire empires around parachuting in teams of people to analyze these things. Does that mean that you think people are thinking too small when it comes to AI, and when it comes to some of the iterations that are possible and how that would change their businesses?
SPEAKER_02 (30:50):
I actually think they're thinking too big.

Okay.

I think a lot of times we get pulled into a lot of the hype of what these tools can do. We get enamored by these large valuations, these eye-watering valuations of what companies are worth and what they're able to raise in this space.
(31:11):
All this talk about data centers. But really, what organizations need to start thinking about is what we talked about previously. Where's your data? So it comes back to some fundamentals. And if you spend too much time thinking, well, we just need to use these tools and that will solve all of our
(31:31):
problems, that's a little bit short-sighted, because by the time you practically go and try an AI prototype or pilot project, you'll very quickly realize that your data is the issue.
SPEAKER_00 (31:44):
Interesting.
So it still goes back to those foundations: what is your
data integrity, what is your data fluency, and what is your
data governance process, to be able to then contemplate
these really interesting things down the line.
SPEAKER_02 (31:58):
Absolutely.
And so the relevance of a community of practice of
professionals seeking to upskill themselves in data analytics
and data science is more important than ever.
And so I know some people worried that, oh, they haven't
jumped onto the AI bandwagon soon enough.
(32:20):
Well, it is just a bandwagon.
Like a lot of other bandwagons, a lot of it is propelled and
fueled by hype.
But if we peel back a lot of that and we just see what the
foundational pieces are, well, it is already taking what you
know about data in your existing roles and just being able to
amplify that a bit more, learning some of the techniques so
that you
(32:43):
can have a better line of sight to all of your data, uh,
learning some practices that allow you to keep your data
clean going forward so it doesn't all of a sudden mutate
into something that you can't use anymore.
And then finally just having some leadership by being able to
pull insights from your data and translate that into
insightful, compelling calls to action for the decision makers
in your
(33:07):
organization.
SPEAKER_00 (33:08):
I love it.
I'm gonna have to write all those down here.
You've obviously been working alongside a lot of leaders that
have gone through this process, whether it be through your
community or through your career.
Are there often surprises that they come across when they're at
the tail end of these projects, and they come out the other
side either achieving something
(33:30):
they didn't realize or something that was bigger than what they
thought they were going to do?
Like, are there surprises that they come across?
SPEAKER_02 (33:36):
I think there's
always surprises when you go on
a particular journey or adventure, because you can try to
plan as much as you want, but there's always these unknown
variables that end up showing up.
Uh, certainly I think that there's no shortage of bad
surprises, and often that is really due to the fact
(33:59):
that, well, one, your data just wasn't in the right state that
it needed to be in.
But also, we're still constrained to a very
conventional way of running projects and how these projects
are sponsored.
So you may have a champion for a particular initiative that is
really passionate about this and was able to secure the funding
(34:22):
for it.
Okay, well great.
The project progresses, but as it is in a lot of organizations,
leaders move around, people move around.
And so perhaps the new leader who is now in that position
doesn't have the same vision or the same passion.
And so by the time the project ends, well, guess what?
There's no more funding for it, and then it just dies on the
(34:44):
vine.
The other consideration is that sometimes there are great
surprises, and so a thoughtful leader would anticipate: if
one of these great surprises comes up, what can I do to
ensure that there is enough support and inertia to progress
the project?
(35:05):
And so this then becomes sort of a CapEx/OpEx conversation.
Um, so a lot of funding is secured just for the project
itself, but there is no consideration of, well, how do
we then scale this so that it's more enterprise-wide and
becomes more of an operational expense rather than a capital
expense?
SPEAKER_00 (35:22):
Interesting.
And so you brought up as well the organizational design
side of things.
Um, you shared some really interesting thoughts the other
day around how data, how AI, how some of these emerging
technologies are really gonna start to reshape organizations
themselves.
Are you comfortable sharing kind of some of your thoughts around
(35:44):
what changes organizations are gonna have to make in order to
really ingest some of these technology changes?
SPEAKER_02 (35:52):
You and I were the
beneficiaries of many decades of
business studies of what an optimum organization structure
looks like.
Uh, and all of it would have started in sort of this command
and control type of structure where you have a lot of
employees or a lot of workers trying to achieve certain
(36:14):
outcomes, but you want to make sure that you have the right
layers of control so that, one, leadership knows what's
happening across a broad swath of uh locations of where they're
operating.
Um, but that there's also a standardized way of
communications and of executing on the particular work that's
done.
And you and I can agree that probably for the last five, six,
(36:38):
eight, ten decades this approach has worked.
So, what happens if all of a sudden the structure itself
consists of workers that are human, so flesh-bearing humans,
but also digital employees, so GPU-bearing workers as well.
(36:59):
Because we are now getting to the point where AI is falling
under this term agentic, meaning that you have these particular
AI tools that are really good at doing specific things.
And so if you can somehow group them together so that each AI
tool is responsible for one particular part of the process
as an agent, then you can actually have a whole team of
(37:22):
agents working on a particular outcome.
And so we're probably not that far from a future where a
manager or a supervisor will have to manage not only these
flesh-bearing workers, but also these GPU-bearing workers as
well.
And that would be a whole new leadership skill set.
(37:43):
If you were to enroll in an HR course, they don't teach
you, well, how do you manage a team of agents?
And there could be two agents, there could be 2,000 agents.
And so some thought needs to be put into: is there also a
disruption not only in the tools that are affecting the
workspace, but in the way that the workspace is being managed
and
(38:06):
performance-managed?
SPEAKER_00 (38:07):
It's really
interesting.
And you know, by the time this releases, it'll actually
probably be out in the world.
But I have an article that's penned right now that talks
about management as the new leadership.
Because the idea of motivation, the idea of incentives are
very different when you think about a human
flesh-bearing person versus a GPU that, um, you know, just wants
(38:29):
essentially compute and energy as its inputs.
And it's really going to change those, um, you know,
meta-skills and functional skills of what it means
to be a manager.
SPEAKER_02 (38:43):
Absolutely.
And uh it's exciting, um, maybe a little bit concerning as
well, because I think also we've seen humans are good
at really messing things up, right?
We kind of again become the victim of our own success, or we
let our own broken nature get in the way of things.
SPEAKER_00 (39:03):
Oh, definitely.
And so speaking of our broken nature getting in the way of
things, are there some pitfalls that we should be watching out
for?
And I recognize we could probably have an entire episode
based on, you know, the scary things and the T9000 uh future
in front of us potentially.
But if we think short term and really tactical, what are some
of the things that we should be really deliberately talking
(39:25):
about or thinking about when it comes to how we're responsibly
integrating AI into our businesses?
SPEAKER_02 (39:31):
We're in the time of
human history where we have
access to so much information and knowledge.
Uh, another very conventional structure that we're used to is
our academic institution.
Um, and particularly, how do I signal to you, Keith, that I've
got certain skills?
Well, I have perhaps a piece of paper that says that I've
(39:54):
graduated from a certain school.
But nowadays, people are able to upskill and learn new things
really at the instant of a mouse click.
So we have all these massive open online courses, we have
microlearning that is out there.
But yet there isn't a framework or a schema that both an
(40:15):
employee and an employer can use to say, well, this is actually
what I'm capable of doing, because I was able to upskill
myself in these certain ways.
All I've got to show for you is just that piece of paper that
some institution somewhere has decided to write my name on
to say that, okay, I've got a bachelor's degree, or I've got a
master's degree, or I've got a doctorate.
(40:37):
So I think the pitfall is, if we still allow traditional
frameworks to try to signal where there is capability, we
may end up missing out on human capital and talent that's right
under our noses.
SPEAKER_00 (40:53):
100%.
And I think you bring up a very interesting point too in terms
of the educational institutions,
because a lot of our educational system is built around
knowledge synthesis, remembering and being able to
re-articulate, um, facts, figures, dates, theories,
calculations.
(41:16):
How does your role, and you probably have a very interesting
perspective on this, as an educator start to change when
the ability to recall and synthesize information is no
longer the prerequisite, but there's a huge responsibility
and opportunity still for what we do with that information?
SPEAKER_02 (41:34):
And this goes back to, I think, the
distinction between literacy and fluency.
It is more efficient to evaluate someone on their literacy
skills.
Were you able to read up on these concepts?
Are you able to memorize these things, and are you able to
successfully answer a multiple-choice question?
(41:56):
That demonstrates literacy.
Fluency is the ability to take all that knowledge, synthesize
it, and communicate in such a way that it is compelling.
Compelling in such a way that maybe it then supports the
changing of policy, that it can instigate new regulations, uh,
(42:16):
or even just cause someone to pause and maybe change their
perspectives a little bit.
So to me, the mastery of literacy is what leads to
fluency.
SPEAKER_00 (42:26):
I love that
topic, and I might steal it,
the idea of that literacy versus fluency, and it's really
starting to become clear to me around kind of where the big
difference lies there.
So as someone that is, um, you know, actively an assistant
professor, does that change the way that you're teaching?
SPEAKER_02 (42:43):
What it does is
remind me that the students are
currently in this really odd position.
Yeah.
They are still having to adhere to some of the traditional code
of conduct that the institution has around plagiarism, around
how it is that they do their work.
At the same time, these students are about to graduate into a
(43:05):
work world where employers are expecting them to know how to
use these tools.
So I actually feel really bad for the students for having been
pinned into such a tough situation.
But I also see that my role there then, particularly as more
of a practitioner-focused instructor and educator, is to
(43:26):
help them to start building some of these AI leadership skills.
I know very well that they can go and learn how to prompt
appropriately in some of these tools.
But what they're not being exposed to is, well, how do they
deal with change management, you know, changing org structures
and being able to implement AI in a responsible and ethical
(43:49):
manner.
So I actually think that they have the opportunity to gain a
competitive advantage versus those that are already in the
workforce.
SPEAKER_00 (43:58):
Interesting.
And so this might be a little bit outside of your expertise,
but I'd be curious about your perspective on it.
You know, I was at a future of work summit, um, last week, and
there was a lot of talk around youth unemployment, around kind
of the changing demographics of the workforce.
What's the narrative that you hear your students talking
(44:18):
about, as well as some of the employers that you work with, in
terms of what's going to be happening to that Gen Z
demographic as we're building up, um, future leaders, but also
as they're getting their super valuable experience on the front
end of their careers?
SPEAKER_02 (44:35):
It is tough.
Like there are uh economic reports that will suggest that
those between the ages of 15 and 24 are in a very, very
challenged uh employment situation right now.
And we are already seeing that a lot of the initial use cases
for a lot of these generative AI tools are tasks that would
typically be taken on by new graduates and interns.
(44:59):
So there's a couple of headwinds facing these students.
The jobs that would normally be there for them to graduate into
are being taken away by some of these tools.
In addition to that, they're coming up against a sensibility
that doesn't really allow younger workers to be able to
(45:19):
explore some of these tools.
A lot of the tools are still pretty much all constrained, to
the point that they're not really that useful.
So then the question is (45:26):
well,
what is the place for these
graduates?
They have a lot of new knowledge, very recent
knowledge, but yet they're not provided with the opportunity to
be able to showcase that.
So I'm really delighted to be in a city where the focus on
technology and innovation is so high, with organizations like
(45:50):
Platform Calgary providing essentially a lot of support
structures for those that are wanting to consider
entrepreneurship or starting up their own companies.
This is really where I believe that students and new graduates
will thrive.
Not trying to find their way into a traditional organization
(46:12):
that's been around for many decades and is already codified
in its specific ways, but to be part of an initiative or
organization where everything is very agile and very, very
fresh.
SPEAKER_00 (46:27):
I think that's a
really interesting place for us
to start to wrap up here.
Before we leave, I do want to give you a tiny bit of airtime
to just share Untapped Energy, what you're all about, and maybe
some of the things that you're working on, um, so that our
listeners can connect with you if they're interested.
So if you wanted to give the 30-second elevator pitch on
Untapped Energy and some of the priorities you have ahead, um, I
(46:48):
would love for you to be able to share that.
SPEAKER_02 (46:51):
Thank you for that
opportunity, Keith.
Untapped Energy is a federally registered nonprofit upskilling
professionals in data analytics and data science.
We do this through our monthly meetups, we put together tech
social events, and then we have our annual datathons that we
run.
Our focus right now is actually on microlearning.
(47:12):
Realizing that professionals typically don't have a lot of
time to enroll in even a multi-week course, they don't have a
lot of opportunity to access some of the latest and greatest
knowledge.
And so a microlearning course is essentially a 60-to-90-minute,
very targeted learning experience for professionals on
(47:36):
various topics in the data and artificial intelligence space.
So for 2025 and 2026, our focus will be on a number of these
microlearning courses.
Everyone is welcome to join; because we're a nonprofit,
they're very accessible.
Most courses are between $15 and $20.
And it will only take about an hour or two of your time.
SPEAKER_00 (48:00):
Fantastic.
Well, thank you so much for all the work you put into Untapped
Energy, Tim.
I know that it's a great service for our city, for a lot of the
people that take part in it.
And, um, you know, I really appreciate all the work you do
there.
It's amazing to see and very, very needed.
Um, thank you as well for coming onto the podcast today.
It's been a phenomenal discussion.
I feel like we could probably continue to talk for
(48:23):
many, many more hours.
But if someone wanted to connect with you and learn more, if
they wanted to pick up the conversation or maybe they
wanted to pick your brain on something we haven't fully
unpacked today, what's the best way for them to connect with
you?
SPEAKER_02 (48:36):
Yeah, absolutely.
I think LinkedIn is the best way to connect with me.
I can provide you with my LinkedIn.
I'll put it in the show notes.
Yeah.
Untappedenergy.ca is the website for uh the organization,
if uh folks want a sense of the things that we've done in the
past and some of the things that we're working on.
And absolutely, you're right.
(48:57):
Uh, I think we've just scratched the surface on something that
continually moves at breakneck speeds.
Um, but this is where I believe the power of conversation is so
important.
It allows us to be able to pick each other's brains, share
perspectives, and kind of have those mini aha moments, like, oh
okay, never really thought of it that way.
(49:17):
Uh, and this is how I think we will continue to embrace
something that is very uncertain to us in a way that does
result in something that is good for us overall.
SPEAKER_00 (49:27):
Phenomenal.
Thank you so much for coming on today, Tim, and can't wait to
keep talking.
SPEAKER_02 (49:32):
Okay, sounds great.
SPEAKER_00 (49:34):
If you've made it
this far, like and subscribe on
YouTube or follow and leave a review on your favorite
podcasting platform so you don't miss any future episodes.