
December 3, 2025 49 mins
TrulySignificant.com presents Kate O'Neill, renowned Tech Humanist, author of What Matters Next?, and a Thinkers50-ranked thinker.

Kate has spent her career exploring how technology can serve humanity, not the other way around. From pioneering the first intranet for Toshiba America to becoming Netflix's first Content Manager, Kate has seen firsthand how emerging technologies can either empower people or overwhelm them.

Today, Kate advises organizations around the world on creating sustainable growth, designing equitably, and building long-term value through human-centered innovation. She is the author of What Matters Next? A Leader’s Guide to Making Human-Friendly Tech Decisions in a World That’s Moving Too Fast, where Kate examines how thoughtful technology choices shape the future.

On this episode, we dive into regenerative growth, practical applications of machine learning, and how data can reveal insights that genuinely improve the customer experience. You’ll hear about digital transformation in the film industry, strategies for investing in sustainability, and maximizing energy and processing power through thoughtful system design.

Ultimately, this conversation is about setting a new precedent: using technology with stewardship, staying sensitive to humanity, and always being mindful of what truly matters next. Visit www.koinsights.com

Become a supporter of this podcast: https://www.spreaker.com/podcast/success-made-to-last-legends--4302039/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
And welcome back to TrulySignificant.com presents Success Made to Last Legends. I'm Rick Tocquigny, along with our LA correspondent Jennifer Tocquigny, who leads Many Moons Entertainment. We have a great guest on today, Kate O'Neill, and she

(00:30):
is the renowned tech humanist. What attracted us to Kate were many things, but she's got this futuristic mind that sets her apart. She asks questions about design in ecosystems and how people are affected by the ongoing innovation

(00:52):
and of course the surge of AI. She was an
early Internet pioneer, building the first intranet for Toshiba America
and became Netflix's first content manager. Today she advises some
of the great global companies on using emerging technology for

(01:12):
sustained growth. And I absolutely love the way you approach alignment, Kate. I am a student of alignment for growth, because one of my mentors is doctor Fullet Boisseu, who worked for Steve Jobs, and all they talk about is alignment.

(01:32):
So I mean, you were just a wonderful guest. Welcome today.

Speaker 2 (01:35):
Thank you very much for having me on here. Boy, it's such a great setup, and I love that opening too, of the alignment focus, because that's a huge part of what I bring to my thinking around all of this. Well, let me give you a bit of background, because I'm a linguist by education, and so

(01:56):
I was a German major, a Russian and Linguistics double minor, with a concentration in international studies in my undergrad, so a terrific background for technology. But it was through just a miracle of timing: I was in college when the web came about, and I was supervising the language laboratory, as

(02:17):
one does, and the web came about, and I found out I could build a website for the department, for the language lab. So I built it, and it turned out to be one of the first websites at the university, the first departmental website at the university. And it was at a time when people could make manually curated lists of what websites were new that day, which is

(02:39):
so mind-boggling to even think of. And that got me noticed by a guy at Toshiba. We started a correspondence. That's how I got recruited to Toshiba. So this is all backstory, but all of this is to say that the linguistic background informs everything: I really think about language as the architecture of so much of communication. It is the

(03:01):
original human technology, and it also gives us a good hint about what meaning is. So meaning is about this relationship between what it is you're trying to communicate, or what it is you have in your mind. Like, when we think about linguistic meaning in communication: you're trying to communicate something; the other person on the other side hears something; and then there's the message that's actually in between. And anywhere there's shared overlap between these three components, you've actually gotten meaning, but only where there's shared overlap. So that alignment piece is huge when you think about language as a form of communication, when you think about language as a metaphor, or meaning as a metaphor, for anything that's significant or anything that matters in the world at all.

(03:46):
So that has to do with marketing, that has to
do with strategy, It has to do with how we
think about organizing the world in general. So there you go.
There's my sort of opening ramble about some of these
concepts and how they fit together for me.

Speaker 1 (04:00):
Okay, so take it away.

Speaker 2 (04:02):
Where should we go from linguistics?

Speaker 1 (04:04):
Expert guru, tell us: where are you from originally, and a little bit more about your education?

Speaker 2 (04:10):
Yes, sure, so I'm from the Chicago area originally, and
I was always interested in languages and technology. In fact,
there's a story it sounds apocryphal, but it's actually true
that in first grade I won two statewide competitions. One
was a young Author competition and one was a young
programming competition. So it was sort of a pretty good

(04:34):
indicator early on that I was going to always be
interested in technology and the written word, but more broadly,
like how we communicate with one another, how to create
connections with one another, And so that story sort of
resonates throughout my career in a lot of different ways.
I then went from like I mentioned, I was in

(04:56):
college when the web came about, went on to Toshiba,
lived in the Silicon Valley for a decade during the nineties,
which of course is just an incredibly formative time in
Silicon Valley as the place we know it now to be,
and that's when I went through a series of startups. After Toshiba, I landed at Netflix as one of the first

(05:17):
hundred employees, as you mentioned, I think. And from there, there was another series of great jumps into different enterprise organizations: Hospital Corporation of America, once I moved to Nashville, which is the largest for-profit healthcare company. There's just been a whole lot of really interesting dots in this pixelated

(05:37):
picture of my career, and the fun of it is
connecting them and figuring out where these patterns are that
would otherwise seem a little bit random. But there's a
lot of real interesting synergies and synthesis to day.

Speaker 1 (05:52):
I love it, okay. And as aforementioned, Jennifer, representing the entertainment industry, LA-based, lives and dies by California in her industry. I'm going to let her jump in and ask you a few questions from her sector, and then

(06:12):
I'll guide this thing back over to the general world. So, fire away, Jennifer.

Speaker 3 (06:16):
As the tech humanist, how do you believe the
leading streamers are shaping the modern human experience? Do you
think it's positive or negative?

Speaker 2 (06:27):
I think it's a mix, right. I think a lot of people these days are much more aware of how algorithmic experience plays into our lives. At the time when I was there, I helped develop one of the first algorithmic website designs. One of the projects I was involved in was redesigning the homepage so that

(06:50):
it used more algorithmic kinds of clues and cues to give you more personalized information: we've got this movie and you'll love it. And that was great at the time, and I think what we've seen in the twenty-five years since then is we've gone all the way that way, and now what I hear more and more,

(07:11):
especially from younger people, but across generations, is people kind of pushing back a little bit and going: I don't know if I always want algorithmic recommendations. I don't know if I always want to be in the filter bubble. You know, maybe I do want to just randomly encounter something. That's an interesting byproduct of it. And I think also the phenomenon of

(07:32):
binging and streaming has been, you know, maybe a mixed blessing for society. I don't know that we all benefit from the watch-next button automatically deploying on our behalf and keeping us engaged in hours' worth of programming. But I think there's also been some

(07:53):
benefit to the disruption of how many decision makers are involved, in getting some diverse voices out there, getting some new content, shaking up the sort of old players in the market, and making it possible for, you know, some new formats to be discovered and some

(08:13):
new approaches to be seen. So, a mix. What would you say, Jen?

Speaker 3 (08:20):
It seems like there are just infinite choices, and you end up going in and scrolling for the amount of time that it would have taken to watch a TV show, essentially.
But I think for me personally, I like that it
identifies specific things for me to watch in different genres
and things like that. But I know it can be overwhelming,

(08:41):
but I do think that it can be helpful. So
I think technology is helpful and it's catering to different people,
but it can definitely be overwhelming.

Speaker 2 (08:50):
So one of my favorite stories from landing at Netflix, as one of the first hundred employees and the first content manager, was that I took over the movie database. So this is the database of all of the movie listings
this is the database of all of the movie listings
and all of their metadata and everything. What I saw
right away is that every movie could only have one genre.

(09:10):
And I think if you know movies at all, you know immediately that that's a big problem; that's going to be a limiting factor. So I got approval to, you know, re-engineer this, and I worked with pretty much everybody in the company. Although, as I've said, it was one hundred people, so it's not really that many. But it was engineering, it was product, it was marketing, it was everybody. And, you know, we recoded the database relationship so that movies

(09:32):
could have more than one genre, and so that was a really big undertaking. It was huge, ugly work. But what it meant is that in the twenty-five years since we did that, Netflix has used machine learning really effectively to discover these really interesting patterns between movies, and between your viewing habits and my viewing habits,

(09:55):
and to be able to kind of connect these dots in ways that would not have been possible prior to machine learning, but also would not have been possible without cleaning up that genre relationship. Because now you can get these surfaced, quasi-genre categories that show up in your home feed; mine says things like Oscar-winning

(10:16):
twentieth-century period pieces, or, you know, award-winning movies about friendship, or things like that, which are not genres in any classical sense but use that same architecture. So I think it's a really interesting combination story of how, you know, you can do digital transformation to make a company more ready

(10:39):
for the future, and how machine learning slash AI can create richer human experiences that we can all enjoy and benefit from.
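The database change Kate describes, moving each movie from exactly one genre to many, is in data-modeling terms a switch from a single-genre column to a many-to-many tag relationship. As a rough illustrative sketch (the titles and tag names here are invented for illustration, not Netflix's actual schema), a subset test over tag sets is what makes composite categories like "award-winning movies about friendship" expressible at all:

```python
# Illustrative sketch only: hypothetical titles/tags, not Netflix's actual schema.
# With one genre per movie, a category is a single lookup; with many tags per
# movie, categories become intersections of tags.

movie_tags = {
    "The King's Speech": {"drama", "period-piece", "oscar-winner", "20th-century"},
    "Green Book":        {"drama", "friendship", "oscar-winner", "road-movie"},
    "Stand by Me":       {"drama", "friendship", "coming-of-age"},
}

def titles_matching(required_tags):
    """Return every movie carrying all of the required tags, sorted by title."""
    return sorted(title for title, tags in movie_tags.items()
                  if required_tags <= tags)  # set subset test

# Composite, non-classical "genres" of the kind Kate describes:
print(titles_matching({"oscar-winner", "friendship"}))  # ['Green Book']
print(titles_matching({"drama", "friendship"}))         # ['Green Book', 'Stand by Me']
```

In a relational database the same idea would typically be a join table of (movie_id, genre_id) pairs; the subset test above plays the role of that join.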

Speaker 1 (10:50):
Right, it's fascinating. You've got my mind thinking. You frame innovation around regenerative value, and Procter people are all about creating value. And

(11:12):
if everything were said and done to describe our careers there, it was to improve lives and change lives. And so that's what value means to us. So I would love for you to share an example of a system, either past or present, that you've successfully grown, restored from depletion,

(11:36):
or done something for the human resource.

Speaker 2 (11:40):
Yeah, that's a really beautiful question. It's one that has my mind going in a lot of directions. I think these days, when we think about regeneration... I did a whole talk actually about regeneration for Thinkers50, in the lead-up to their gala in November, at which I became a ranked thinker. So

(12:02):
that's really exciting, thank you. And in the couple of months prior to that, they did a virtual event that was the lead-in to the gala, and I did the closing remarks there; I think they're still available online. A ten-minute talk on regeneration as a strategic theme. And the thing is that there are so many ways to parse it.

(12:24):
You can think about regeneration in this sort of climate
ecology sense, of course, and I think it's really important
that we do in terms of sustainability, but you can
also think about it in terms of technology and in
terms of human experience. And one of the things that
I tried to do in those remarks was tie some

(12:44):
of those thoughts together, in the sense that regeneration, ultimately... well, sustainability really is about doing more with less on some level, right? And regeneration is the promise that, once you have to do more with less, by reinvesting the gains from what

(13:06):
you've saved into a sustainable system, you theoretically are building something that should be self-perpetuatingly sustainable, right? Like, you're growing something that's increasingly sustainable. And I think when we look at technology, we can see there have been, you know, these kinds of rules and laws, like Moore's Law and the series of laws that have come

(13:29):
along after that that have shown us how to how
we can think about chipsets and energy in sustainable ways,
Like we're going to get more out of each chip
because it's going to be able to do more as
we go, We're going to be able to put more
circuits on each chip, you know, two and a half
times every two years or whatever the More's lawn ratio is.

(13:50):
And then all of these other laws that have shown us we can do more with energy, we can do more with processing power. And yet what we're also seeing is there's a corollary that suggests we can't keep up: the more sustainable we get, or the more energy efficient a system becomes, the greater the demand for

(14:12):
energy becomes. And so we have a human problem, really. We have a demand problem. We have a greed problem. We have a system design problem. And I think that's where, you know, when you think about the P&G folks, and folks who are listening, who are thinking strategically, who are thinking organizationally, who are thinking holistically,

(14:37):
there's this incredible opportunity, I think, for leadership in the
organizational sense to set a new model, to set a
new precedent for how to integrate these ways of thinking
about regeneration, about climate responsibility, about human centricity, you know,
creating meaningful human experiences, and about using technology in a

(15:02):
responsible stewardship sort of way, so that we are moving into a really exciting future. The technology that's here, and that we see on the horizon, is exciting, crazy exciting. And also plenty of people are wicked scared of what they're seeing coming out of this technology and out of

(15:24):
the decisions that surround the technology. Let's be clear, because it's not the technology that's scary; it's the decisions that people are making with the technology. It's the decision to cut jobs and use AI instead. It's the decision to make deepfakes and try to manipulate people in ways that undermine democracy. It's the decision to do, you know, manipulative or destructive

(15:47):
things with technology. That's the problem. The technology itself is not the problem, just as the technology is not itself good or bad. We made the technology. Technology is neutral... I'm sorry, what I mean to say is, technology does not itself carry the values, but

(16:08):
it is not inherently a neutral thing either, because it is imbued with the actions and the values that we imbue it with.

Speaker 1 (16:16):
Jennifer, and perhaps her industry, feels the greater threat of AI, versus me coming from CPG, consumer packaged goods. I'd love for you, Jennifer, to add any color commentary to that, because you are in the trenches with your entertainment company, and you know what's happening

(16:38):
out there.

Speaker 3 (16:41):
Because a lot of Hollywood is now tech companies, how do you keep grounded, with these new tech companies coming in, taking over? Is there a way to kind of stay grounded with all the change?

Speaker 2 (17:01):
Yeah. I think, if I interpret where you're going with that, or what you're thinking behind it: there's a level at which the entertainment industry is predicated on human creativity, right? And there's an element of that that is legitimately subsumed into AI.

(17:26):
Right? Like, we understand that generative AI and other AI models have been trained on human creative output. Writers like me: my authorship has been subsumed into data sets that train AI, and so now generative AI tools can write like me. This should probably not

(17:47):
be something that I advertise: you can just ask AI to write like Kate O'Neill the tech humanist, and you will get a pretty basic, passable sort of representation. The more important thing... and so I've actually participated in United Nations AI Advisory Body discussions about IP

(18:07):
protection, and how we're going to validate certain kinds of, you know, creative assets as being authentic, or valid, or, you know, signed, official, or whatever the case may be. That solution does not exist yet; there are many possible

(18:28):
solutions in the meantime. I think that the music industry, which I was adjacent to... I was in Nashville for ten years. I moved there as an aspiring songwriter alongside my tech work, so I certainly had some connectivity to the creative industries there, and my background with

(18:52):
the movies and as a writer. I have a lot of sympathy for the fact that what we're asking the entertainment industry to do, to pivot and figure this out, is incredibly challenging. Like, you know, we were just disrupted by streaming in all of these industries, and we've just tried to figure

(19:12):
out a reallocation of value there. Now we've got to figure out a reallocation of value when it's not humans that are actually making the creative product. Like, what does this even look like? I think my hopeful take is that we are figuring it out a little bit. Like, with streaming, I think there have been some progressive models

(19:35):
that have rewarded humans, even though that may be fractional compared to what the ownership model used to be. Like, when you bought an album, the songwriters were better rewarded than when you stream a song, certainly. Meanwhile, someone's pocketing most of the money that's transacted, and it isn't the

(19:55):
people who are making the creative product. I would like to think that we are ripe for a disruption that favors creators: some kind of model that can look to put some protections in place. Not meaning that we don't use AI tools to create,

(20:16):
because I think there's incredible product that can come from
using AI tools to make really inventive types of creative output.
But I think there's got to be a place for
human creative output, and we have to figure out a
way to value it or else we're going to make
it impossible for people to pursue that field and with

(20:39):
it the entire entertainment scaffolding that goes around it. So, how do you stay grounded? That, I think, is a hard question, although strategic optimism is my model. I believe in the idea of naming the threat, talking about risk and harm, and really managing that risk and harm, and at the same time identifying what it is you

(21:01):
hope for and what it is you plan to work toward.
So it's a both-and situation. In this case, with entertainment, I think there are legitimate threats, and also: what could we do? How could we create new models? Could that be an LA initiative, where creators and

(21:22):
tech-savvy folks who are in the entrepreneurial space, entertainment-wise, can get together and figure out new strategies for capturing some of that value? Rick, back to your point about the P&G folks, and all of those sort of synthesized ideas that are together in this. Thank you for the discussion.

Speaker 1 (21:42):
We're going to take a quick commercial break. I'm going to throw it to you first, okay. Tell our listeners where they can purchase all of your books.

Speaker 2 (21:52):
You can find it at whatmattersnextbook dot com, but that will just redirect you to my website, which is koinsights dot com, in the books section there specifically. So, yes, I would love to have your listeners pick up a copy of the book, of course, but also reach out and let me know, you know, what it is that resonates with them.

Speaker 1 (22:12):
And I'd love to add for our alumni networks that
listen to this, how can they contact you to be
a keynote speaker at their next event?

Speaker 2 (22:27):
Well, funnily enough, also at koinsights dot com. They can find the speaking header on that page and reach out there.

Speaker 4 (22:37):
This is Marcus Aurelius, reappearing to proclaim that Truly Significant: Conversations with Big-Hearted People is a rare piece of literature. This book reminds me of one of my more stirring quotes: waste no more time arguing what a good man should

Speaker 5 (22:56):
Be be one. If you're step into your next life
chapter of your career and questioning what lies beyond success,
this book is for you.

Speaker 6 (23:09):
Dive into forty soul-stirring stories from luminaries like doctor Jane Goodall, Ed Asner, and Emily Chang. Stories that urge you to pursue purpose, serve others, and build a

Speaker 5 (23:24):
legacy that outlasts you.

Speaker 7 (23:27):
Authored by Rick Tocquigny, Truly Significant will challenge your view of success and ignite a life of impact. Order now at TinyURL dot com slash truly significant, and begin living intentionally. Maybe your epitaph will

Speaker 5 (23:48):
read: she gave outrageously, extended grace unceasingly, and lived to help others so that death found her empty. Visit trulysignificant dot com and celebrate the most truly significant people in your life with the Truly Significant community. How bold

(24:12):
of you to make your next chapter matter and be
truly significant.

Speaker 1 (24:18):
We are back with the one and only Kate O'Neill, the tech humanist, and she is a futurist. And for you listeners out there, you know that we love talking to futurists, because they think way outside the box. They see things far before we see them. And we're also

(24:38):
joined by Jennifer Tocquigny, representing the entertainment industry, and Jennifer's got one more question before she runs.

Speaker 3 (24:46):
I think, as a final question: how can companies ensure that technology serves humanity, and not the other way around?

Speaker 2 (24:57):
Yes, well asked. I think that there are a number of practices here that are really helpful. One of the things that's very interesting to me is that people assume that leaders of companies have absolutely no interest in doing the right thing or creating meaningful human experiences

(25:19):
or whatever. A lot of times I will hear from people, like: you think leaders care about, you know, ethics or responsible tech? And the truth is yes. I meet leaders all the time, all the time, after my keynotes, who come up to me one on one, ask me questions, and will say: we needed this vocabulary, we needed these frameworks, and it's really hard to have this

(25:41):
conversation with my board; I need there to be more ammo, so to speak (although I'm a non-violent person, so ammo is a terrible metaphor), more fuel for the fire, to make sure that we can make this transformation. I think that part of how we make, you know, more human-friendly tech decisions, or, you know,

(26:04):
create business decisions that favor humanity, is we have to be honest about what it is we want to accomplish. And we do want to accomplish that. Like, nobody doesn't want to live in a world that is set up better for humans; we're all humans, we would love to have that. Yeah, I think it's just a question of

(26:24):
reconciling these kinds of classic tensions: we want more profit, we want more growth, but we also understand we need to do the right thing. But do we really? Like, do we need to do the right thing?

Speaker 3 (26:39):
Is that?

Speaker 2 (26:40):
Is that going to get us our quarterly margins? And so I think we have just a lot of
opportunity to have much more emotionally intelligent and intellectually honest
conversations about the richness of what we could create in
the world, and that I think requires integrating some of

(27:03):
these viewpoints to be able to say alignment right, Like,
we need a lot between what the business is trying
to do, what people outside of the business want to
have happened, and what the technology, technology and other resources
can do, what the capabilities are, so we can make
sure that all those are aligned.

Speaker 1 (27:22):
Okay, as Jennifer departs: if Jennifer's company got to do a documentary on you...

Speaker 2 (27:29):
Yeah. Nice.

Speaker 1 (27:30):
What would be the title of that documentary?

Speaker 2 (27:40):
Oh gosh, that is a fun, provocative question. I don't even know, but I'm going to play with that one for a while. Hmm, what should it be? Jen, what do you think it should be? You're the expert.

Speaker 3 (27:57):
Oh, I don't know. I'm not the expert on you, but you're the expert on titles. Maybe a line that you say frequently? It doesn't necessarily have to evoke a strong feeling. It could just be something that you... because the thing with titles is, it has to be intriguing

(28:18):
enough for someone to want to watch it without giving
too much information.

Speaker 2 (28:25):
I could go totally silly. When I used to be asked about, like, what it is that I would tweet about all day: robots, cats, and beer.

Speaker 3 (28:33):
That's so that's pretty Robots, cats and beer coming to
a theater in twenty twenty seven.

Speaker 1 (28:43):
Let's enter the space of true significance now. When we wrote this first book, we had the opportunity of interviewing Jane Goodall, and she is the final chapter. So glad that she allowed us to talk with her. She was the founder of the Jane Goodall Institute and

(29:06):
a United Nations Messenger of Peace. And beyond that, though, she was just a great person who really cared about what she did and the message of hope that she brought to the rest of the world. And then
we talked to a lot of other people, and what we learned along the way was that it wasn't about the

(29:29):
fame as much as it was about how the condition of their heart changed somewhere along their journey. And when I
got information from your publicist about you, I just read
it and reread it, and I said, there's a story

(29:49):
inside Kate's story that we need to uncover and let the rest of the world know about it. And so, if I had a journalistic mission today, it would be to get to the heart of you. And I think we're getting close. When we talk about "aspiring songwriter,"
there's some music or sound going on in your head,

(30:12):
some reel of film that's rolling, and it's manifesting itself in what you're doing today as a futurist and as an author. But there's something else inside there that made you switch from success to significance. Tell me about it, Kate.

Speaker 2 (30:36):
Oh, that's such a great provocation. I think that success is significance. I think significance is success. I am all about significance, because significance is meaning, right? What something signifies is what it means, and meaning, as I've already alluded to

(31:00):
in this conversation, is to my mind, the most central
human condition. It is what we uniquely care about. Like
no other animal that we know of cares about meaning
the way we do, and machines don't yet care about
meaning the way we do. It is uniquely ours, and

(31:23):
it scaffolds from this baseline of semantic understanding, you know,
communicating something in words and what those words mean when
we say them to one another, all the way up
through layers of purpose and patterns and truth and significance,
all the way out to what the most big picture,

(31:45):
macro, existentialist, cosmic questions of meaning: what's it all about, and why are we here? Those are all meaning, but when you try to distill them down to a central idea, it is about significance. The question is what matters, and that's what What Matters Next is:

(32:09):
even though it is a book that's written for leaders, helping leaders make better decisions, and that sounds very corporate, and there is an element of being of service in the corporate space, it is a very personal construct to me. Because "what matters" is the heart of the question that I've been asking my whole life around meaning,

(32:31):
and how can I help people get better in touch with truly what matters, and how they can align themselves, align their work, to that; how they can, you know, be more in concordance with what they value and what they're doing in the world. I think that

(32:55):
misalignment of meaning is one of the most tragic missing pieces in so many lives and in so many businesses, and the businesses that thrive are the ones where you can really see the concordance all the way through. You know, one of the examples I use about why insights are so powerful

(33:16):
in ethical acceleration and tech decision making is that you can think of an example like Apple, and you can think about how clear it is that Apple is a design-centric company, right? That's an insight about Apple. Apple has known since its formation under Steve Jobs, right, that they're very much about design. And so you can imagine a

(33:39):
scenario where someone brings a leader at Apple a product that they would like to ship, or a feature they would like to ship, and yet they're saying, well, it's not truly designed well yet; you know, we don't have it ready. It's an easy decision. It's a no: we're not ready to ship that. Not for Apple. Maybe another company would ship it, but Apple knows that it's not

(33:59):
ready to ship, because that is a truly intrinsic thing about Apple. And so what I always do when I illustrate that story is to say: what is it that's just as true for you, for your organization? What's an insight about who you are, or who you are as an organization, and what you value, that you

(34:22):
can accelerate the decision making that you're doing in a truly aligned and ethical, value-centric way, because you know this? That is meaning.

Speaker 1 (34:36):
And so, to get more meat on that: when you go to present to an organization like the world of Walmart... I wasn't there, obviously, but I can tell you from past experience of hearing world-class speakers and

(34:56):
thinkers like yourself, the sad thing is: people will come in with a book, we will go through classes, and then at the end of the day, or the fiscal year, or five years after, where's the imprint? Why aren't we thinking like

(35:20):
Kate O'Neill? Why aren't we acting like Stephen Covey taught us? And it's like: please, don't think of Kate as a one-off that we're bringing in to speak and sell a book. This is a velocity that needs to have stickiness to it.

Speaker 2 (35:45):
Yeah. You know, there's a new model that I'm launching
with my company and I'm really excited about it. We
have been trying to think about what it means to
grow in a meaningful way for KO Insights, and the
model that we're launching is called ten thousand boardrooms for
one billion lives. So the idea, as I hope it's clear,

(36:08):
is that we would very much like to and we're
measuring this. Of course, get into metaphorically or physically ten
thousand actual boardrooms of companies, organizations, or even cities. You
can use it as a metaphor for the leadership of
a city. I do a lot of work with cities
and help steer the decision making that's happening there such

(36:31):
that it benefits one billion downstream lives. And it's easy to approximate, you know, we're gonna have to use all sorts of approximations, but you think about a company
like P and G or like Walmart, it's very easy
to imagine the many, many people who are downstream of
the decisions that are made in those boardrooms or near

(36:52):
the boardrooms, right like second and third sort of order
level down from boardrooms, there are still very potent decisions being
made about technology, about strategy, about customer experience, human experience.
And I would very much like to do exactly what
you're saying, to form to create more of a presence

(37:15):
that's remembered in those organizations, so not so that I
have to be there, but so that there is a
sense that we're part of this movement that you know,
Walmart or Ikea or whoever is part of the ten
thousand boardrooms for one billion lives movement, and that helps

(37:38):
us remember to make those decisions in a way that's
aligned with what we're saying our values are.

Speaker 1 (37:44):
Because that folks, and you know, the people that are
about to hear this have heard it one hundred times
from me. Core values have to be strategic, which lead
to authentic purpose and mission. Stop posting mission statements and
core values on the walls if you're not going to

(38:06):
walk the talk.

Speaker 2 (38:13):
One hundred percent, Rick. Yeah. And I actually think mission, vision, and all these other statements are oftentimes a couple of sentences of quasi-language that no one can remember,
no one can recite. And what I am a much
bigger advocate of is strategic organizational purpose that is articulatable

(38:34):
in sort of three to five words and that everybody knows.
And so the great classic example for me of this
is like Disney theme parks. You could say, create magical experiences,
and everybody across the organization understands that to be their purpose. Right,

(38:55):
So if you are working in a call center and
taking a booking call from guests-to-be, or if you're
working as a janitor on main Street in a theme park,
you all understand that creating magical experiences is your purpose
and so you feel empowered, assuming that the company has
done a really good job, which Disney Theme parks have

(39:17):
done a very good job of letting people know that
they are empowered within you know, within reason to make
that happen to you know, cause create refunds or set
things up special for people or go out of their
way to make sure people feel seen and valued and
that they're having the time of their lives. That is

(39:39):
so clear at a place like that. And I'm not
even a Disney fan, but I am a fan of
purpose well articulated, that is well executed upon, and they
certainly do it. So they're a great example of that,
and I think other organizations can easily see how they
could articulate their three to five words and bring a

(40:00):
lot of their brand and culture and experience, design and
data and technology into better concordance with that. Well, because the piece I even forgot to mention about Disney theme
parks is then you can think about how you can
deploy a multi-billion-dollar project like the MyMagic+ MagicBand wearables and have them be integrated in this Internet of

(40:22):
Things system, a beacons-and-sensors kind of program where
that information is something people are just carrying around with
them and magically it seems to unlock doors and magically
it seems to carry payment information and reservation information for
restaurants and get you onto rides. The fact that you

(40:44):
can bring technology into concordance with those values and that
purpose is an even greater clarification of why this matters
and why it's so important that companies do it.

Speaker 1 (40:55):
Part of what you're saying when you're out speaking to
these companies is to find your two- or three-word mantra, like an Apple.

Speaker 2 (41:08):
Three to five words strategic organizational purpose.

Speaker 1 (41:12):
Okay. So again, I let your career and your books and everything you've written wash over me, and before the show, I was thinking, what can I ask Kate
to make sure that she understands what we would love
to accomplish? And I think it's this. I think that

(41:35):
you are, you could be the medicine that certain sectors need to take right now, and I think you could provide a practical, calm voice about what are you going to do with your future, and stop being so freaking scared and intimidated by AI. So let's start

(42:02):
with a couple of weird little questions. Is AI supposed
to be an extinction filter?

Speaker 2 (42:13):
An extinction filter? I'm not sure I know what you mean.

Speaker 1 (42:16):
If you don't use AI, will you become extinct? Hm?

Speaker 2 (42:23):
I see what you mean. On a personal professional level,
I don't know if it's meant to be an extinction filter.
I certainly see how the way that we currently operate
and the way that AI is likely to roll into
more and more work roles, it certainly will function that way.

(42:48):
I think one of the things, the distinction I make when we talk about the future of work, is that we conflate a lot of the time the
future of work, the future of jobs, the future of
the workplace, the future of the workflow, the future of productivity,
the future of tasks. Like all of these are different futures,
they happen to fall into relationship with one another, But

(43:11):
it is really important from a future of jobs perspective
that individuals understand the technology that's emergent all the time.
And right now I use this term minimum viable skilling.
So we talk a lot about upskilling and reskilling, and
the term I've started using is minimum viable skilling. And

(43:31):
the minimum viable skilling I think for people is prompt skilling.
Because we understand that generative AI, large language models, and
even the agentic AI tools that are starting to pop
up more and more are largely driven by prompts. And
the thing about prompts that's so interesting is once you

(43:52):
have, if you write a good prompt, if you articulate well what it is you're looking for, it is really not that different from delegating to a person. So in
either case, you have to be clear in your own
head about what it is you want and what the
success criteria are, and you know sort of the boundary

(44:12):
conditions of like, don't do that, do do this, Like
I definitely want you to include that, format it this way,
and go, and once you do that, you can get
great work product out of generative AI. You can also
get great work product out of other people generally speaking.
So I think that's one of the things that people

(44:34):
should really lean into, is the minimum viable skilling, which
is prompt skilling and getting good at that.

Speaker 1 (44:41):
Allow me to elevate you to Jane Goodall. Maybe she was a linguistics specialist as well, because she just said the right things at the right time, and you remind me so much of her. It's possible that you

(45:06):
could be the calming voice.

Speaker 2 (45:13):
Well, that's really kind. I think it's partly my resting
bliss face. I don't know why, but I'm just always smiling.

Speaker 1 (45:20):
One thing, let me read this, it's wonderful. It's about the courage to follow your curiosity, the humility, Rick, to recognize the interconnectedness of all life, and the determination to leave this world a better place than you found it. It's like, be still, my heart. She told me that personally, absolutely,

(45:47):
I want, I'm looking for, you. Yeah, I mean, I see something in you. I hear something in you. I've read enough about you to be dangerous. What do you want to say in this? If there's a quintessential chapter about significance, where you can calm the sea and help people look forward, that's what

(46:13):
Maybe we should end the show today on that, and
then I'll get back to you with other questions. How
about that?

Speaker 2 (46:21):
Okay, I will say this. It's so funny that that
is one of the greatest compliments that I've ever been
paid in my life. And the other was when someone
compared me to Peter Drucker. And I would like to
think that maybe at my best I'm a synthesis of

(46:41):
the two, or I aspire to be, right? Peter Drucker meets Jane Goodall. Please, that is what on my best day I am hoping to be. But I
think that it does come down to sort of the
synthesis of Peter Drucker's absolute clarity about strategy and Dame
Jane Goodall's absolute clarity about hope and purpose and humanity. Right,

(47:08):
there's this middle path, this third way, that those two ideals set for us. And I think
that middle path is really defined by a clear and
insistent focus on human meaning, meaningful experiences for humans, on

(47:32):
alignment between what is good for people, what is good
for the planet, and what's good for business, because ultimately
we know that business is the engine that sees these
things forward. So you know, we can't be anti-capitalist. I'm not anti-capitalist. I was asked that
on a podcast at one point, like are you a socialist?
And no, I'm not a socialist. I can see

(47:53):
how you'd arrive at that, but no, I'm not. I
am a capitalist, but I am very much a compact capitalist. I am very much about what
could we do to remake capitalism. It's not doctrine, it
doesn't have to be sacred. What could we do to
remake capitalism so that it is actually achieving more of
the goals we would like to set for ourselves. How

(48:16):
could it make the world better? How could it make
humanity better? How could it create the conditions for flourishing
for humanity now and in the future that would be
worth aspiring to. What could we do to make the
world better for more people more of the time. I mean,

(48:36):
it's such a non ambition, it's like such a giveaway
sort of line. But I think it's a really really
important question to be asking ourselves as often.

Speaker 1 (48:47):
As you can ask it. And with that we will
close today's chapter and thank you Kate for being on
and her book is What Matters Next? A Leader's Guide to Making Human-Friendly Tech Decisions in a World That's Moving Too Fast,
And hopefully she will be appearing in our next truly

(49:08):
significant book before she spins out her next brilliant book.
But I think we caught it. I think we caught
lightning in a bottle today. Thank you so much. And
as we always say, folks, thank you so much. Hope
that you can get your arms around the concept of
success while being significant in this world.