
November 18, 2025 · 31 mins

Imagine a classroom where students don’t just use AI—they question it, improve it, and build with it. That’s the world we explore with Indra Kubicek, CEO of Digital Moment, a national charity bringing digital literacy, coding education, and social innovation labs to young people across Canada. We talk frankly about what it takes to raise confident learners in a world where AI is everywhere: not more hype or bans, but durable skills like persistence, resilience, and critical thinking that make kids active creators instead of passive consumers.

Indra shares how a nontraditional path—from accounting to entrepreneurship to scaling Code Club with Raspberry Pi—shaped her view of modern literacy. We unpack what “using AI well” actually looks like: crafting clearer prompts, verifying claims against credible sources, and noticing when a confident answer is thin or wrong. Rather than treating AI as a cheat machine, we outline classroom strategies that turn it into a lab for judgment and curiosity. On the home front, we explore simple steps families can take to talk about privacy, bias, and algorithmic feeds without needing any specific app in hand.

We also zoom out to the policy level. With education governed provincially, access to AI education risks becoming uneven. Indra makes a compelling case for a national AI framework that supports teacher training, safe tools, and baseline equity from rural communities to big cities. The goal isn’t disruption for its own sake; it’s positive, people-first change—like the volunteer-driven coding clubs that kicked off a movement. Give teens real problems, room to experiment, and mentors who listen, and they’ll surprise us with ideas that are both bold and responsible.

If you care about raising thoughtful, capable kids who can navigate technology with confidence, this conversation offers a practical roadmap. Subscribe, share with an educator or parent, and leave a review with the one skill you think every student should master next.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Chris Colley (00:13):
Welcome everyone. Here we are in another episode of the Shift ED Podcast coming to you. I'm out of Laval, reaching down south into the great city of Montreal to bring in the CEO of Digital Moment, a national charity, Indra Kubicek. And Indra's been the CEO there for quite some time.

(00:36):
And we're going to dive in all about coding for our kids. Well, Indra will tell you a little bit about the services and the amazing programs they offer at Digital Moment. And we're obviously going to talk a lot about AI as well as we go through.

(00:56):
Indra, thanks so much for taking some time to join this podcast.

Indra Kubicek (01:03):
Awesome. Thank you for having me. Really excited to be here. Yeah, I can share a little bit about Digital Moment. So we're a nationwide digital literacy charity, bilingual, focused on ensuring that every young Canadian has equal access and opportunity to important digital skills. We have reached over 900,000 young people and trained over

(01:25):
32,000 educators. We work in the school system, and a large part of our services is really supporting educators. Without educators having the confidence, the tools, and the capabilities to be working day in and day out with their students, we're not going to create the sustainable change we need. So we work within the school systems, but we also work in

(01:48):
other informal learning environments where young people love to play and learn and thrive. And so from libraries to community centers, we work through volunteers. We have a large-scale network of volunteers that are teaching coding clubs with younger kids. We are the Canadian partner to the Raspberry Pi Foundation for that Code Club initiative.

(02:10):
And our more recent addition to our services is a teen program called the Social Innovation Lab. So we are working with teens in more of a deep-dive, accelerator-type program where they learn about entrepreneurship, social impact, and how to actually build and create technology that hopefully serves our communities, our

(02:33):
wider cities, our country, and people that we care about. So we're trying to make them really think about why they would want to build technology and what they want to put into the world. And so yeah, very excited to be here and have a chat about, as you said, the hot topic of the moment: AI.

Chris Colley (02:52):
Totally. And maybe before we hop into that, I always love a little bit of backstory about how you came to where you are today. And I mean, we know technology accelerates everything, it seems. Even with AI, they're saying the turnover now is about six or seven months.

(03:12):
It's so fast. We are not fast; humans grow slowly, we develop over time. So what are some of those moments that brought you to Digital Moment that you could share with us?

Indra Kubicek (03:26):
Yeah, so how did I end up here? Because I definitely do not come from a digital, computer science, software engineering background at all. So I actually, by accident, as many people might say, fell into entrepreneurship in my early 30s by deciding I wanted to build a solution to a problem I saw in the world, which is

(03:47):
usually how entrepreneurs get started. So my career is totally different. I am a chartered accountant. I went to business school, I moved from Vancouver to London, UK (not London, Ontario, for the Canadians), and worked in the finance world, and was quite disillusioned with climbing the corporate ladder and all of that. And so I, on a whim, quit my job.

(04:09):
I had volunteered with some education organizations in Africa and was really looking at more grassroots social change movements, did a public policy master's, and then decided to build something completely different, which was a marketplace for healthy yoga holidays. So I'm passionate about yoga. I trained as a yoga teacher. This was in 2012.

(04:30):
And if you wanted to actually build a marketplace website at that time, you actually needed to know how to program and code. There was no off-the-shelf Wix or any of this kind of new technology that exists now. You really needed to know what you were doing, or you needed to have someone who did. So off I went on this journey in my early 30s, and very quickly felt completely out of my depth. I realized, here I am with all my

(04:53):
whatever degrees, education, multinational, international career, and I can't build this thing I want to build, and I don't feel confident at all to really do this. And how is this the case? And why didn't I learn this at school? And even more so, looking at the education system, realizing, gosh, it still hasn't changed 15 years later, and here

(05:14):
we are. And this is now almost essential building-block literacy for any business, regardless of what you're doing. And so I found some young women in London at the time who had just started an organization called Code Club, which was the volunteer coding clubs, and they were scaling like crazy. It was pretty new, an amazing mission.

(05:35):
I joined them and then helped bring us into the fold of merging with the Raspberry Pi Foundation, met Kids Code Jeunesse, which was here in Canada, as part of building an international network, and eventually came back here to Montreal. So it's a bit of a long story to get there, but it was really fascinating.

Chris Colley (05:52):
I love it too. Yeah, totally. Well, you know, I think it's a great example, too. I mean, all the education in the world, and I'm not bashing education, I think it's important, but oftentimes education overlooks that skill development, right? And when you're faced with, I need a website, I don't know how to do that, you know, it requires a skill

(06:16):
base, you know, even just to find somebody. Like, who do I know? How could I collaborate? All the critical thinking, all the stuff that comes in that we so desperately need our kids to embody as they head out into this craziness. I guess, what are the skills that Digital Moment

(06:38):
focuses on, that you guys feel are almost non-negotiable, that kids need once they get into that marketplace? I mean, maybe reflecting a little bit on your own journey, it sounds like you found a place where you want to help students gain those skills that

(07:03):
maybe weren't available to you at the time in your career.

Indra Kubicek (07:07):
Yeah, yeah, absolutely. And I think, you know, there are obviously hard skills that we talk about, the technical side of it, but it's really much broader than that. And when we're talking about digital literacy today, I think it's a lot of these more intangible things. So even before asking what skills, it's sort of: what do we want people to feel? And we want people to feel empowered.

(07:28):
We want people to feel empowered to be able to build, to create, to challenge, to feel that they're, quote unquote, in the driver's seat of their life. And now our lives are so integrated and intertwined with technology, from the way we work, live, and play. And now, I would say with AI, it's everywhere, it's everywhere

(07:50):
all at once. And how are we making sure that people do feel empowered and not afraid, and don't feel the things that I felt, which was, you know, vulnerable? Like you said, you had to find someone who could do this. If you don't really know the mechanics of it, how can you evaluate who's building for you? How can you feel that you're not helpless in all of this? So we focus a lot on building sort of durable skills, I guess

(08:15):
we would call them, that are quite important, around persistence. So one of the things that we look at when we evaluate is the willingness to persist. How can we help build confidence in young people so that they're willing to continue? Because you're going to fail over and over and over again. And at least in very structured education environments, and certainly in the setting I came from, where you know you're a

(08:35):
chartered accountant, you shouldn't get it wrong, numbers are a certain way, you're not supposed to make errors over and over again. But every entrepreneur will tell you they just fail forward and fail forward and fail forward, and you're constantly trying to figure out how to pivot and change. And that takes immense amounts of confidence. And so we're trying to build that up, particularly for

(08:56):
marginalized, underrepresented groups, for young women and girls who don't see someone like themselves, who don't have networks and people who come from those backgrounds. How do we build those skills? Critical thinking is at the core. I think everyone's talking about this risk of losing our critical thinking skills as we start to depend more on answers

(09:18):
from AI systems. So, how are young people actually thinking critically about the information they're reading and receiving? Why are they being shown certain things? Are they controlling their environment, or is it in control of them? Resilience, adaptability. We all know, as you said, quite honestly, I don't think many people can predict where this is all going to end up at this

(09:38):
point. We just know we're on this jet. And like you said at the beginning, there's no other option. It's not going to stop, it's not going to disappear. We can't just say, wait for us to catch up. It's moving on its own, and it's an organic entity now. And so how do we really prepare people for that uncertainty

(09:59):
and still keep a level of optimism and fun and play, and make them want to see the opportunities that we hope this major transformation will enable.

Chris Colley (10:11):
Absolutely, absolutely. And I was talking with educators and, you know, some thought leaders as well about where we start this process, you know, from a school, an in-class foundation. And it's funny that there's always this idea that we're not thinking, like if we

(10:36):
introduce AI, thinking stops, you know, and the machine will just tell me what to do, right? And I kind of question that a little bit, because if you're using AI, I have to think when I use it, right? Because your prompts and what you want at the end, you've got to know, right? It's not like I can just say, okay, write me an essay about

(10:56):
Macbeth. Go ahead, I need 500 words. I mean, it will do that, but that's not really the orientation, I don't think. And that critical thinking has to be embedded within AI to get what you want in the end, or it's just, you know, somebody else's stuff. Can you maybe drill down a little bit more on

(11:19):
critical thinking? Because I know that's probably one of the key skills, and critical thinking also embeds a whole bunch of other skills within it. Can you talk to us a little bit about that?

Indra Kubicek (11:32):
Yeah, and you know, I know, for the education systems, I don't have to be the teacher in the classroom, and I really empathize with where they are and what they're facing. And not even with AI; I mean, starting with a smartphone, a student can challenge you and Google something very easily. And so I think the role that educators play is shifting and

(11:55):
changing, whether that's a teacher in a classroom or an educator in a different kind of environment, and moving toward facilitating and mentoring. And so when it comes to critical thinking, it's exactly what you're saying. If we just say to young people, don't use ChatGPT, don't cheat, we don't want to talk about it, and all we want to do is find every way to verify

(12:17):
that you didn't cheat on your homework, we're really avoiding something that we can't avoid. We're trying to avoid something that is real, and we're actually not preparing people. So if we're not educating people about that world and the risks of not thinking critically about what's being shared back by those tools, we're not actually doing them any service. They're going to be going out past, you know, high school,

(12:40):
even, never mind secondary or post-secondary, they're going to be going out into the world and into jobs where they're going to be asked to use these tools. They're going to be asked how they would use these tools, how they would evaluate them, how they would determine whether to trust them or not. And they're going to need to optimize them and be efficient with them. So if we're not trialing that out at a younger age, we're not doing our job.

(13:00):
And so, coming to your point, those of us who are very old now probably remember writing essays by getting books out of the library and actually forming our thoughts while we were reading all of those books. Having to write an essay that way feels like very ancient times now. But one of the things I say is that while you're
(13:21):
doing that, while you're goingthrough what felt like very
mundane at times, you'reactually forming your thoughts,
right?
You're you're you're actuallythinking through what you feel
about a book, whether you agreeor disagree with that author,
and now you've gone and readanother source and heard another
side of the story, perhaps, andyou're weighing out all those
options and you're looking atwho the credibility of these

(13:44):
people, and you're actuallyforming where you sit.
And I think the worry for me is whether we're having young people look at what comes out of the system and compare it to verifiable information. I have sat with educators at a UNESCO conference where one of the assignments was: read the book, then ask ChatGPT to tell you the answer.

(14:04):
And we want you to compare the difference. And I think there are smart and unique ways that you can get them to use the tools to be able to do that. And then, to your point, I don't think any of us can say we aren't using these tools to start brainstorming about something. It's about prompting, it's about verifying the information, it's about getting better at your questions.

(14:25):
And this is a skill and a muscle that will develop. And so I think we should really encourage young people to go and look at a subject they know really well. Lots of kids play sports, you know; go and start asking ChatGPT to tell you about something you know really well. And through that experience, you're going to start to see the holes and the gaps and realize that this isn't just, you know,

(14:48):
to be taken at face value, because sometimes the system responds very confidently. I think you often hear an answer that can't have too much substance behind it, but man, does it sound like it really knows what it's talking about. And so young people will start to see that for themselves as they have access to try and learn and discuss,

(15:09):
in their classrooms, in these environments, with teachers who are trying to support them.

Chris Colley (15:16):
Yeah, totally. I heard this really cool stat, too. ChatGPT put out a report recently, and they said that close to 75% of searches now are non-work-related, right? So it's not really about using it as an academic crutch.

(15:36):
It's starting to shift dramatically towards non-work-related searches, which I found pretty fascinating. We're harping on this idea that it's a cheat machine, but it's not being used that way, I guess, is what I'm trying to say.

Indra Kubicek (15:53):
Yeah, yeah. So, I mean, I don't doubt that stat. What I have at least heard, being over 40, is that those of us over 40 are using it more like Google, for information, to kind of synthesize and summarize. And younger people, and I do think we need to think about this, and this is why we can't just be assuming it's a cheating tool. That's not what it's for.

(16:13):
They're using it to have conversations, to do things in their personal lives; they're using it in all sorts of ways. That's society, you know; it's like we're all on the bus and we're flying down the road, and young people are on it with us. And I think, as opposed to previous times, you know, when

(16:33):
you compare it to other industrial revolutions or big technological advances, young people weren't necessarily in the mix as closely as they are right now. And so I think that's where we actually should be putting our attention: actually having conversations at the kitchen table with our kids to learn more about what they're doing

(16:54):
with these tools, what they're relying on them for, and how they're using them, because it is, I think, much more outside of a work-type goal than people realize. And there are positives and consequences to that as well.

Chris Colley (17:10):
Totally. How do we get home and school at the table, listening to those conversations? Because right now, here in Quebec, we're kind of keeping AI away from kids directly in classrooms, you know, age restrictions, privacy, all of those issues that

(17:33):
are there. How do we start having those conversations with kids from the home perspective and from the school perspective, and start listening to what their ideas are?

Indra Kubicek (17:50):
Yeah, I think this is exactly what we're trying to do. I mean, as an organization, we are really trying to support the school system so they're willing and open to have those conversations. And yes, you know, there's a risk in just bringing anything into the classroom, and there's privacy and all the ethical concerns, but at the same time, you can

(18:13):
be having conversations and discussions about technology without even touching technology as a starting point. And parents are also feeling overwhelmed in all of this. And so, how do we start to create more inclusive environments where those kinds of conversations and discussions can happen, and we can actually hear directly from young people

(18:35):
what they think? And, you know, we're quite used to systems taking the creative side out of us. As you go through school, you're told to stay in the box, focus on this, don't go outside the rules, and over time all of our creative and innovative ideas get pushed aside because we're supposed to just do what

(18:56):
we're told to do, whether that's at work or at school, to some extent. So I always like to think of young people as entrepreneurs and innovators by nature, and we're kind of pushing them into a box as they get older. And so, through, for example, our Social Innovation Lab program, we want it to feel as unlike school as possible.

(19:17):
So we want them to feel like they can build the tool they want to build. We're there to mentor them and sort of share with them how they could integrate technology into the problem they're trying to solve, but center it around something that they care about, give them the autonomy to say, this is where we want to focus, and listen to what they think as well.

(19:37):
And I think there are so many creative ways that the school systems and the government could come in and support that infrastructure, so that parents can also be involved in that process. So we're also talking right now about, you know, a family guide or some sort of thing to just

(19:58):
start conversations going. For example, for us, one of the big things is the polarization that's being created through social media, something we're all really much more aware of now. We're all living in these bubbles and these echo chambers, and we never get to meet people on the other side, with the other viewpoint, and you dehumanize them. And so, how do we easily create ways to have those
And so, how do we easily createways to have those

(20:21):
conversations?
And you don't even need to bebringing a tool into the room.
Everybody knows what what thatthey can imagine what that
scenario is like and have anhonest conversation about it.

Chris Colley (20:32):
Absolutely. I'm totally with you on that social media part. I sometimes just feel we kind of missed the boat on that, and we're seeing the repercussions now in the kids that sit in front of us in class: they're more anxious, they're just feeling the weight of the world.

(20:52):
And I'm not saying that the world doesn't have a lot of weight to it right now, it does, but we kind of missed that boat on social media and the power it was going to have on our youngest learners, and we're kind of seeing those repercussions. I'm just hoping that we don't make that same mistake again with AI, because like you said, it's

(21:13):
another tipping point, right? I mean, social media was a tipping point for our youth, and now we have another tipping point really close behind it. And are we having those conversations? I know that you're very involved with the conversations happening at many different levels, but where are we at here in Quebec with those conversations

(21:36):
about, you know, ethical use, letting kids explore AI but with a little bit of safeguarding around it, so that we're not doing what we did last time, where we just gave them a car and said, go ahead, drive. No, you don't need a seatbelt, just go ahead, drive. Are we thinking about how we're going to make sure

(21:59):
that we put those safeguards in place to protect our future generations?

Indra Kubicek (22:04):
Yeah, I mean, I think the analogy and the comparison are completely on point. And this is what's on the minds of all of us who are talking to different levels of government, institutions, and civil society about what not to do this time. And in our learnings from social media, I think the real mistake was calling a

(22:28):
smartphone a phone, because it's not a phone; it's complete access to anything on the web, the dark web, you name it, whatever apps. And I don't truly think anyone had any idea what they were introducing young people to. So I feel for people who had kids during that period of time,

(22:49):
and absolutely for young people who grew up with, and are now very deeply ingrained in, these social media systems, and, as you said, the repercussions and outcomes of that. So if there's ever a moment to be hyper-aware that we cannot let that happen again, it is now. And so what we're talking about, you know, with different

(23:11):
levels of government, and obviously we're national, and in Canada, it's a complex environment. When I was in the UK in 2012, the national Ministry of Education introduced computer science from K to 12 across the country. And at least from a policy perspective, they could do that overnight for every child in the country.

(23:31):
And here, we're trying to get to a place, and for us right now, we're really pushing for a national AI framework agenda that every province would sign up to, to say, look, it shouldn't matter. As a parent and a citizen of Canada, it shouldn't matter if I'm in rural Quebec, downtown Vancouver, Prince Edward

(23:52):
Island, or Edmonton. We should assume that all of our children are getting the same access to opportunity and being safely protected in the same way, because it's no different from reading and writing. Now, obviously, those kinds of skills have been around a long time, so they're already being taught across the country at the same level. And so this is where it all lies, in this divide of what's

(24:14):
going to happen. And you can imagine what's going to happen. Private schools are going to get it: companies are going to go in, private schools are going to pay, and schools in the public system are going to be strained and suffer unless government steps in. And so it's really up to government, in my opinion, and not to leave it up to the market. This is a public

(24:36):
human rights issue, to be honest. Every young person needs to have the same opportunity here and the same education to protect themselves, and their families need to know this. The government needs to step up and really fund what it's going to take to give our educators time off and support

(24:56):
to go to PD trainings like we provide. Right now we have a federal grant from the Ministry of Innovation through a program called CanCode, which I know you're quite familiar with. But even with those programs, we need partnership and collaboration with the ministries of education in every province and territory who want to say, let's do this together.

(25:18):
I think it's far too much to ask teachers to keep up on their own. Even for us who are running these organizations, we can barely keep up; it takes a lot. And we collaborate with all of our other CanCode partners, and many of us across the country and internationally, to bring together ideas and thoughts and content and all of that. So, you know, I think it is government's job to do

(25:42):
something. Today is the last day of a public call from the office of the Minister of AI, Minister Solomon, asking the public what should be done. So I think efforts are being made, but it's up to all of us who can speak on behalf of citizens to not take our foot off the gas in pressuring them to own this.

Chris Colley (26:03):
Absolutely. I'm in that car with you, Indra. I mean, I think there has to be some kind of framework that covers all of it. And Canada's a weird place, right, where education is allocated provincially, the provinces control it, but we're all in the same boat here, as you said. So there has to be some kind of

(26:24):
framework that we all agree with. And I was at a conference yesterday, and we were talking about that exact thing: that we're all kind of doing our own thing instead of looking at it globally. Kids are kids, students are students, schools are schools. I mean, this is a universal thing, as you said. And if we don't address it that way, I think it's going to

(26:47):
be this patchwork again, which never tends to work very well.

Indra Kubicek (26:52):
Yeah, and at the end of the day, you know, we're all humans who care about the young people in our lives and our communities and our country. So we all want what's best for them. So I think it's actually about coming together and saying, as you said, there are ways to do this where we're not all out there in silos, creating and recreating our own thing.

(27:12):
And, you know, I also think about efficiency and the use of funds in all of this as well. So let's put our brains together across the country, and let the people who are in the decision-making seats say, yes, I want to meet with all of you, and here is a starting point. You know, we just have to start somewhere, and we

(27:33):
don't have time on our side. As you said, education moves very slowly. Human beings, as you rightly mentioned at the beginning, I don't think we're meant to move this fast, but we have to do something to not let it overtake us. And so I think education needs to get a bit uncomfortable in saying we're going to do things differently than we have in the

(27:55):
past.

Chris Colley (27:56):
Totally. I love that. Well, this has been fascinating. I think the thoughtfulness of your ideas and how you present them is something we need to start talking about. I know it's uncomfortable, but I think right now is a time for disruption, in a way, because we don't really know, but we

(28:20):
see what history has taught us, and we're kind of heading in that same direction again. And it's a bit nerve-wracking. And I think you're right: slow change just won't do it. And for change to happen, you have to be disruptive. You've got to kind of break it all down and rebuild a little bit. I mean, I think that's kind of where we're at right now.

Indra Kubicek (28:43):
Yeah, I absolutely agree. And I think, just because of, you know, when people think about tech and big tech and some of the failures that have happened, the word disruption now sounds really negative, but it doesn't have to be. And so when we look at the Code Club model that started in the UK, and I will speak to that beautiful story and the

(29:06):
women who were behind it, the founders of Code Club, you know, they saw a problem. They said, we are young tech professionals, we have people who have these skills, and they're not being taught in schools. We're going to create an easy way to match schools and volunteers, and we're just going to send people into those schools and they're going to start teaching it for free. It was all around Scratch, which had first come out of MIT.

(29:27):
And that's when the movement really began, about 15 years ago now, almost. And that was disruptive, right? That was not a thing that existed. And it was social innovation and social disruption for the positive. So I think we do need to think about what things we can do to disrupt how things have been done, but for good.

(29:49):
And I think that's what we're teaching young people when we take them into our labs. We're like, you know, this problem's existed for 50 years, food insecurity, whatever it is they care about. Clearly, the things we've been doing aren't meeting the mark. We're not solving it with the same sort of ideas. So these are beautiful young people who can think outside the box, and if you give them a chance, they're going to come

(30:11):
up with wild and crazy ideas. And usually wild and crazy ideas lead to disruptive change that can be positive. So I think that's really important.

Chris Colley (30:19):
I love that. We're going to leave it there. I want to thank you. Man, so much to think about. So many more conversations to be had. I really hope that we can hop on here again sometime and continue this conversation, because these conversations need to be had.

(30:40):
And I hope the listeners out there can appreciate some of what we're talking about. We're not just thinking about kids, we're thinking about their future and empowering them for the future. And I think that Digital Moment is a big key to helping here in Quebec and across Canada. So, Indra, thanks so much for your beautiful

(31:03):
insights.
I think that they were so well explained and illustrated. I really appreciate you joining me today.

Indra Kubicek (31:12):
Thank you so much, and happy to come again. Hopefully we'll have made good progress by then.

Chris Colley (31:17):
We got a date.

Indra Kubicek (31:20):
Thank you.