Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Good Company is a production of iHeartRadio. Everybody rallies around
a mission. Whether you're building a company, you've got a startup,
you've got a publicly traded entity, or you're going into
a war zone. You rally around the mission. And the
only thing that happens to get the mission accomplished is
people have to do the work, and people have to
(00:21):
feel safe, they have to feel empowered, they have to
have the tools, they have to have the resources. There's
this notion in the military of mission first, people always,
and frankly, it's exactly the same inside of a company
as well.
Speaker 2 (00:38):
I'm Michael Kassan and this is Good Company. Together we'll
explore the dynamic intersection of media, marketing, entertainment, sports and technology.
I'll be joined by visionaries, pioneers, and yes, even a
couple of disruptors for candid conversations as we break down
how these masters of ingenuity are shaping the future of business,
culture and everything in between. My bet is you'll pick
(01:03):
up a lesson or two along the way. As I
like to say, it's all good. Welcome back to Good Company.
Today's guest is a true defender of the digital world,
someone standing at the front lines of technology and trust.
Stu Solomon didn't take a straight line to the CEO's office.
(01:26):
He served as an Air Force officer, practiced law (another reformed lawyer like myself), led threat intelligence firms, and now
is the CEO of Human. He's taking on the defining
challenge of our time, ensuring that the digital world remains safe, authentic,
and fundamentally human amid the rise of AI agents and
(01:47):
automated decision making. Human has become a guardian of digital integrity,
an organization built to protect not just data, but the
very fabric of how business, media and society connect online.
Every week, Human analyzes more than twenty trillion interactions across the Internet, quite an extraordinary number, separating legitimate engagement from
(02:10):
manipulation and transforming data into real time intelligence that helps
their partners stay a step ahead of evolving threats. But
Stu's story isn't only about cybersecurity. It's about leadership grounded
in service, discipline, values, and the belief that technology should
serve humanity, not the other way around. This conversation is
(02:32):
about how we build trust in a world where machines
now act on our behalf, and how leaders can defend
what matters most in the next era of digital transformation,
because in the next chapter of the Internet, the real
question isn't who's online, it's who's authentic. And I will
tell you I believe there's no one better equipped to
answer that question than Stu Solomon.
Speaker 1 (02:53):
Welcome, Stu. Michael, thank you very much for the introduction, and I absolutely love the way you just laid that out.
I would add trust. I think in the next wave
of the future, the Internet is going to ultimately boil
down to authenticity and trust.
Speaker 2 (03:07):
Well, you know, I'm going to start with something I don't usually start with. I used it a lot for a long time, and I haven't mentioned it on Good Company in a while. But you know, I've talked over the last several years about words, and words matter, and I came up with something a few years ago. It boiled down to something that I've traditionally called the T's and C's,
(03:28):
and I do that because the words either start with the letter T or the letter C. What I said was, these are as good a set of words as I've heard in years as thought starters in a conversation. So I have to do it with you because of the nature of what we're going to talk about here today. So to those who've heard this before, I apologize; Stu, it's fresh for you.
The T words are trust, transparency, talent, technology, and transformation.
(03:54):
And I always begin with trust and transparency and that's
the hallmark of what human is about as much as anything.
And if I throw in technology, transformation, and talent, there
you go. The C words are content, commerce, culture, creativity, community, and curation. So the C words apply as well. But
(04:17):
in your case, the T words tell the story: trust, transparency, technology.
So your background, as I said, Stu, coming right back
to it, doesn't follow the traditional tech CEO playbook. That's
what kind of makes your story more interesting than most.
Can you give us a little bit of that journey?
Speaker 1 (04:36):
Absolutely, and thanks for asking about it, because it is
exactly what defines me. But ultimately it all boils down
to an inherent sense of mission. Whether you're serving your country,
your community, your company, your team, or in this case, industry,
it's always circling around a sense of mission that rallies
people towards a common cause and a common goal. My
(04:59):
journey is simply one of honing in on a particular
mission and really sticking to it and growing over time.
I was fortunate enough to have been selected to attend
the United States Air Force Academy coming out of high school.
I went on my Air Force journey. I stayed as
an active duty officer in the United States Air Force
for a number of years and then transitioned into the
(05:19):
Air Force Reserve and National Guard. In fact, I was
just recently confirmed by the United States Senate to become a general in the United States Air Force Reserve. My assignment is pending here, actually getting announced in the next couple of days, but suffice it to say, I'll be focused on the same kinds of things that we're going to be talking about today, which is
(05:39):
how do you safeguard the future of our technical environment
and our digital future now. Along the way, I learned
not just what it means to rally around a mission
and to serve others, but also how to hone the
techniques necessary to effectively lead individuals and teams. After I
left active duty in the military, I moved into
(06:01):
the civilian workforce. As you said, I'm a reformed attorney,
much like yourself. As an attorney, I learned very quickly
it wasn't necessarily about binary decisions of good and bad,
right and wrong, But it was more about efficacy of
your advocacy. And I found that you could take the
way that you think and structure thought as an attorney, that push towards advocacy, and it became a wonderful foundation for business.
(06:25):
And so I started to apply that in different ways
to establish a foundation for building and leading business teams
and businesses. And then over time I ended up at
Bank of America, where I got very heavy into cybersecurity,
risk and compliance, and learning what it meant to safeguard trust
(06:46):
and brand in one of the largest financial institutions in
the world, the implications of good and bad decisions could
quite readily be seen on the stock market on a
daily basis, and so it was just a remarkably important
and formative time in the early days right before
the mortgage crisis in two thousand and eight and in
that time right after where you had to earn back
(07:06):
the trust of the public by standing for something.
Speaker 2 (07:09):
So Stu, two things. I'll start with: yes, we both
are lawyers by trade. I started my career as a
tax lawyer, of all things. I guess what I learned
in the ten years that I practiced law and what
I learned in law school, which I think readied me
and prepared me for my career in business, was the
art of understanding the call of the question and learning
(07:31):
the ability to analyze the situation. Finding the answer comes
through experience, not the education from my perspective, but the
idea of issue spotting, as you know, as we learn
in law school, it's spotting the issue, understanding the call
of the question. I think that's one of the great
lessons that I learned as a lawyer and how that
drove my career in business and how I approach it.
(07:52):
But the military side of it in your case the
Air Force, and I'm having a senior moment. I can't
remember the gentleman's name, but it will come to me
as we're chatting. I worked with a gentleman who started
a company in publishing, and he took his Navy seal
training and put that at the core of what he
did in business, because he said, when he was given
(08:13):
an assignment as a Navy seal, go take that bridge,
take that hill, whatever the charge of the call of
the question was, or the charge from the commanding officer was,
and he had to understand the purpose to be able
to do the job well. So that analytical thinking paid
off there as well, because if the idea of taking
(08:34):
the bridge in a military context is to disable it
so no one can use it, then it's one strategy.
If it's to just block it so the other guys
can't use it, then you don't want to blow it up,
you don't want to scorch the earth. Then you know,
it's that logic that I think has to be so
in your case, having both the legal training and the
(08:55):
Air Force training, the combination is just amazing. Right.
Speaker 1 (08:58):
Context matters, and when you talk about a mission orientation,
it's not just about getting the mission done, it's about
getting the mission done correctly. And I think that's what
you just described perfectly with that analogy. I think the
other piece, when you look at a number of different
kind of scenarios that translate well from military life to
business life, it's also about extreme ownership. Absolutely, It's about
(09:20):
understanding what your responsibilities are and what you should be
held accountable to be able to complete and do effectively.
Speaker 2 (09:25):
Exactly. So tell me what got your juices flowing in terms of Human. What attracted you, based on everything
we've just talked about.
Speaker 1 (09:36):
Yeah, absolutely, so. Human represents a remarkable opportunity and it
also represents a remarkable history, and they come together so perfectly,
and that's what attracted me. Human, at its very core, has been making decisions on these very basic questions of bot or not, human or not, ultimately good or bad,
(10:00):
with this general presumption, a very binary decision, that if you were a human, historically you were good; if you were a bot, historically you were bad. Now that
obviously just completely breaks down in a modern digital society
at the very core of those questions. Over time, it
turns out that you could apply that logic to so
many different areas of our digital interactions. You could look
(10:23):
at that from an advertising perspective, and the way that
bot activity is used in a scaled fashion to interface
with digital ad content and the placement of digital ads,
the consumption of digital ads, and the measurement of digital ads. Well,
in a very similar way, that same logic of bot or not, good or bad, starts to break down at
a massive scale when you start to look at what
(10:46):
comes next after you get attracted to digital ad content,
and that's you go to a website, you go shopping,
you build brand affinity, you have some logic behind your
relationship with the brand, and suddenly it starts to get
manipulated again during a transactional behavior, during an authentication to
a website or signing up for a loyalty program, and
(11:09):
the very same questions and the very same manipulations and
the very same threats start to occur when you interface
as a human or an authorized machine with the digital
ad content that takes you to that next step in the journey. Now the next piece: suddenly, what do you want to do? Ultimately, do you want to transact? Well, if
you go to transact, there's all kinds of bot based
(11:31):
or human based manipulation of that transactional relationship. Every single
one of those steps happens trillions and trillions of times
a day across our world, and we take them for granted.
But ultimately there isn't necessarily a single set of safeguards to
pull them all together to look at that end to
end value stream or client relationship.
Speaker 2 (11:52):
You know, I'm staring at a plug in a wall
right now, on a socket, and I remember somebody in
a very large tech company in Seattle, not Amazon, starts
with an m who said to me, you know when
you see that cord plugged into the wall, as a consumer,
as a person, as a human, I just say, oh,
it's plugged into the wall. What I have no clue
(12:13):
about is what's happening behind that socket. Okay, there's a
lot of wires that are making that work. To me,
it's just you plug it in. You know, I don't know.
Speaker 1 (12:22):
And that's the exciting part about Human. It's kind of
the ability to go and put this giant wrapper around
all the things that happen in that entire process that
you either take for granted or simply don't, at any point in time,
have a singular view as to all the different places
that potentially malicious activity or fraudulent activity occurs. That creates signal,
(12:43):
and that signal needs to be harnessed and analyzed. Once
it's analyzed, you have to make decisions very quickly, in tenths of a millisecond, potentially trillions and trillions of times a day,
so that you allow for good positive brand relationships, good
positive brand affinity, and good transactional behaviors to occur, and
you add friction to the bad ones so that you
(13:06):
get inherently more efficient.
Speaker 2 (13:08):
We're going to hit pause for a moment, but stay
with us after the break, we've got more insights to share.
Stu, switching for a moment, I'd submit to you that
(13:31):
the arc of my career over the past fifty years,
I'm going to start lying about my age shortly. I'm
just telling you, but right now I'm being honest for today,
and no, my age is not fifty. The career span
is fifty. So you can try to do the math,
but don't try too hard. But I've said that the
three most important advances that I've experienced now would be
(13:53):
the introduction of mobile telephony, the Internet, and AI. Lots
of in between, lots of other things, lots of shiny toys,
dot com, dot com, the importance of blockchain. I don't
diminish that. But NFT and Web three and all these
things that were going on in the background, none of
them have the impact in my opinion, of mobile telephony,
(14:16):
the Internet, and AI. And when we're dealing with authenticity
and trust at the same time as colliding with the
rise of agentic AI, I wonder how that reconciliation is
going on in terms of you know who's on first
here in the famous Abbott and Costello lines, is it
(14:37):
agentic you.
Speaker 1 (14:39):
No, you're exactly right. It's interesting you say that. So look,
I think at the end of the day, this is
something the entire industry is struggling with. We know for
certain that agentic behaviors are the next logical extension of
the AI movement. It started out with let's call it
machine learning and basic data science. It moved into things
(15:00):
around bots and automation and autonomous activity, and now it's
moving into this idea of agentics. And from an agentic perspective,
I would start by defining it a little bit, because
I think everybody has a slightly different definition. But I
would say that when I think about agentic activity, I
think that it's autonomy and authority to act on behalf
(15:20):
of a human with a constant self learning component to it,
so that it starts to anticipate activity.
Speaker 2 (15:28):
I put a prompt in the other day that said,
what have you been telling me that I haven't been
listening to. I'd save two hundred and fifty dollars going
to a shrink.
Speaker 1 (15:36):
Or maybe it just funneled you to a different shrink.
Speaker 2 (15:38):
Yeah, and by the way, two hundred and fifty dollars
is probably just for parking these days indeed.
Speaker 1 (15:44):
But so as you start to think about those notions
of autonomy and authority to act and a self learning
element associated to it, and you start to apply that
to behavioral patterns of things like buying and purchasing on
my behalf, that's where things get really interesting and that's
where you have to build this element of trust. There
has to be that trust that the agent is acting
(16:06):
on your behalf appropriately. There has to be trust that
agent is the right agent, and there has to be
trust that that agent works over time on your behalf.
So not just an initial trust, but a persistent trust,
and then ultimately you have to trust the outcome.
Speaker 2 (16:19):
Yeah, but you see, in my case, it's like that
Broadway show tune. I'm just a girl who can't say no,
that's me. So I need agentic help, or I need somebody to say, no, no, you don't really
need that, please don't buy it.
Speaker 1 (16:31):
So it's interesting you say that. So when you go
through that whole paradigm and you think about that trust
factor that has to be established. Google is working with
a consortium across industry right now. You know, people like MasterCard and Coinbase and PayPal are starting to work with Google
to build out their agentic payment protocol. This is just
one protocol across many, but they've got a very interesting
(16:52):
set of taglines across the thing. The relevant one for this conversation is this notion of: is the transactional behavior authorized, is it authentic, and who's accountable for it? And this goes
back to your point. Just building that trust layer is
step one. Do you trust the agent that's working, do
you trust the outcomes? And then the second piece is ultimately,
(17:13):
how do you have an authorized, authentic and accountable transaction
that occurs on the other side of it? And I
think that ultimately the accountable one is the most important point.
It's that idea that somebody's got to pay. There can't
be a fraudulent activity, it can't be a disputed activity.
There has to be some kind of thing that says
I wanted you to pay. I'll let my bank and
(17:34):
my credit card company settle the transaction because they know
that I'm on the other side of it saying yes.
And so it's that entire behavior that I think completely
changes the way that we as an industry think about things,
because now it's about how do I attract you to
my brand, how do I build a brand affinity? And
how do I create a transaction by a machine that
anticipated my needs that the human being can stand behind
(17:57):
once it's over.
Speaker 2 (17:59):
One of the real differentiators from a Human perspective is
your data story. I mean the homework that I've done.
You know that's been over the time that we've been
fortunate to work together and know one another. Your data
footprint is basically unlike anyone else's. I mean, the numbers, I just can't count that high, and I set the bar really high, but the numbers I'm hearing are in the
(18:21):
trillions of digital interactions on a weekly basis. The key
is, our friend, I think you know, Rishad Tobaccowala. I
always give him credit for this. Many years ago we
were working on something together and he famously said, data
is like oil. When it's in the ground, it's interesting,
but it's not worth anything until you take it out
(18:42):
and refine it. Well, you've got an oil field that
is unmatched in terms of the data that you have.
And you talk about second and third order analytics, I mean,
could you shed some light on that for us?
Speaker 1 (18:55):
Yeah, absolutely. And this coincides nicely with the question you asked at the beginning about why. Human has the most remarkable and unique data set when it comes to understanding where fraud occurs across that digital spectrum of e-commerce in the world, to be able to solve for this and see over four trillion transactions a day, or interactions
(19:19):
a day. Now, the question is, how do you harness
that knowledge to be able to more effectively identify fraud
earlier in the transactional life cycle, and how do you
know when to put mitigating controls and friction in place
so that good traffic gets to the end and transacts
and bad traffic or malicious traffic gets stopped before it
(19:42):
creates not just fraudulent transactional behaviors, but also, frankly, operational
inefficiencies and or a lack of knowledge or understanding of
what your audience actually is and how they're performing. This
is all about creating, as you articulated, a second or
third order analytical effect to take seemingly disparate observations and
(20:02):
turn them into core knowledge. So you know, potentially, if
you were bad when you checked out at Home Depot today, you're probably bad when you click on that ad for McDonald's tomorrow. And how do you start to look for
and identify those trends to be able to help protect
all brands by the knowledge that machines and human beings
(20:23):
have certain activities of behavior that help you to look
for and identify fraudulent patterns much earlier in their life cycles.
Speaker 2 (20:31):
Well, let me now translate this into the world that
we inhabit at Three C. I always describe the intersection
that we live at as the intersection of marketing, media, advertising, entertainment, sports,
and technology. So I would say to you, let's zoom in,
if you will, on what this means for our audience
(20:51):
leaders in marketing, media, advertising, entertainment. How do you articulate
the value proposition? You're talking about brand signals, you're talking about the things we've talked about, where CMOs and chief security officers share ownership of AI governance. I mean,
how are you playing on that team?
Speaker 1 (21:11):
So, first of all, I think it's really important to
think that it's actually all one team coming together. And
I think when you get into larger enterprises that's just not a reality, just based on the way that you have to build functional expertise and siloed budgets and siloed perspectives.
But at the end of the day, a cybersecurity professional
and I was just talking about this actually at a
(21:31):
conference in Monaco last week to a room full of
chief information security officers from France, and the discussion was
every cybersecurity professional wakes up and starts out, on day one of their profession, being told that your job is ultimately to protect the availability, integrity, and confidentiality
of your networks, your applications and your data. Well, the
(21:53):
reality is that those very things are the same things
that marketing professionals actually have to worry about. They're the
same things that business executives writ large from an operational
risk and security perspective need to think about. They need
to work together to be able to do something actually
that's more important than all of that, which is protect
your brand equities and your ability for your business to
(22:14):
operate the way you expect it to act. So, if
you expect your business to attract people to your website
and go shopping effectively, the marketer's and the security professional's responsibility is the same.
Speaker 3 (22:30):
Good company will be right back after the break.
Speaker 2 (22:50):
We looked at this on the back of work we
originally did back in the day when Marc Benioff and Salesforce were launching the Marketing Cloud early on, when they made those early purchases of Radian6 and ExactTarget and
Buddy Media and on and on and on, and you know,
Mark was transitioning a cloud computing company to a broader
(23:12):
based company with a marketing cloud, and one of the
things he asked our help with was re-educating the sales force of Salesforce to understand how to pitch. No longer
were they pitching just to the CIO or the CTO.
They were now pitching to the CMO. And at that moment,
we looked at it and we said, there's actually a
(23:34):
coming together, and that coming together is the chief marketing
officer is now tasked with making more technology decisions than
ever before, and correspondingly, the chief technology or chief information
officer suite is tasked with making more marketing decisions than
they have ever made before. So we used to kid
around and say, the person with the pocket square is
(23:57):
now one and the same with the person with the
pocket protector. I tend to be more in the pocket
square crowd. But yeah, I know, I know. I'm saying
that with a smile. For our listeners. But the truth
is that party is no longer just the CMO and
the CTO. There's a third c suite member at that
(24:19):
party now, and that's the CFO because procurement slash CFO
is paying more attention now than ever to marketing expenditures
number one, because of the technological aspects of all of
this and the AI transformational aspects of all of this.
You've got the CMO, the CTO, and the CFO. So
(24:41):
I guess my question to you, Stu: when Human is pitching,
I guess you're, in a funny way, pitching to all
three of them.
Speaker 1 (24:49):
We are pitching to all three of them. And again
I think that the converging factor is brand integrity, brand trust. Right,
That's ultimately where it resides. And then, you know,
the CFO can weigh in and say, well, what are
the things that are going to risk my ability to
convert that brand integrity and brand trust into money? And
so the three come together. Where it gets really really complicated, Michael,
(25:12):
is when you start to look at agentics. So everybody
gets excited about them. They make a lot of sense.
It adds all kinds of efficiency and productivity to our scenarios.
But when you ask a room of those three cohorts
that you just mentioned who owns the budget and who's
in charge of implementation? Nobody can see me, but they're
going to go like this, They're going to point in
every direction.
Speaker 2 (25:33):
By the way, I was doing exactly what you did
for our audience because I kind of read where you
were going. They're going to be pointing in the other direction.
Speaker 1 (25:39):
So nobody's taking ownership of it. They just want the
outcome and the efficiencies and the benefits associated with it.
So if I'm in one of those three seats, I
would challenge my other two counterparts to figure out who's
actually going to have the authority and the accountability
to implement.
Speaker 2 (25:56):
Well, and Stu, you know, it's this question, and I guess I'm going to push you on this. But what if the
real threat isn't a malicious bot, but rather a well
meaning AI agent making a bad decision. I mean, of course,
you know, the dog ate my homework. You know, who
is it? Shaggy? It wasn't me, It wasn't me. Okay,
(26:18):
I'll let everybody fill in the lyrics on that. But
how are we going to deal with that one?
Speaker 1 (26:22):
No, you're exactly right. And I was telling this
anecdote the other day as well. It's very similar to
kind of the insider threat scenario that a lot of
security and fraud professionals look at. The definition of insider
fraud is using appropriate or approved authorized access in either
a malicious or unintentional fashion to create the wrong outcome.
(26:45):
This is the same thing here. You know, if you
don't have you know, the accountability and the authorization associated
with an agentic behavior, it's going to create an outcome
that you don't want. And so I think we have
a scenario here where there's going to be just as
much misuse in its use as there are unintentional consequences
that are created by not putting the right trust factor
in place, by not creating that persistent and ongoing trust
(27:08):
factor and not having a trust factor at the end.
And that's a technical solution, but it's also a process solution.
Speaker 2 (27:14):
Stu. Speaking of process, you know, I always tell people
you've got to read the room. You got to know
who you're pitching to, or I'd say pitching, but who
you're trying to convince, who you're persuading in an argument.
You've got to understand the room. Is it possible? And
I think it is, but I'd love your input on
this that brands are really going to start being more
(27:36):
focused on marketing to an agentic future than to a human.
I guess your marketing has to be different if I'm selling.
Maybe I don't know the answer. I'm asking the question.
Speaker 1 (27:48):
No, I think that's the right question to ask. But
I would say that unbeknownst to most people, that inflection
point is already occurring in our environment, and we're just
not optimized to be able to deal with it. So
in twenty twenty-five we've pretty much hit this inflection point where more than fifty percent of Internet traffic is actually bot based rather than human based right now. Even
(28:09):
more importantly, there's many projections by large companies, large studies
from Gartner, a market analyst, Cisco, a ubiquitous technology company,
that are saying, by twenty thirty, we should expect to
see upwards of nearly eighty percent of all traffic actually
being bot based, with a large subset of that being
agentic based traffic versus human traffic on the Internet. So
(28:29):
if you focus on the fact that that's happening big picture,
and then you start to see what human has seen
just over the last few months is a rather precipitous
increase in agentic traffic starting to creep into just
our install base and what we're seeing around the globe,
to the point where it's not just traffic that's occurring,
but that agentic traffic is starting to do things like logging in,
(28:52):
like transacting, like acting with an autonomous activity on your behalf.
You're starting to see a massive amount of traffic assigned not just to the ChatGPTs and the Perplexities of
the world that are doing more generative AI things, but
you're starting to see a significant amount of agentic traffic
starting to occur. This is happening today. If this is
(29:13):
happening today, the reality is we have to optimize not
for search engine optimization, but for discovery in an authorized
and appropriate fashion by agents that are crawling and scraping
and indexing to discover our brands, discover our pricing, discover
our processing, so that ultimately we're included in those generative
(29:35):
AI outcomes and that we're now discoverable and we're useful.
When there's no human at all in the middle.
Speaker 2 (29:41):
Yeah, there's nobody on the sidelines saying coach, put me in, Coach,
put me in.
Speaker 3 (29:45):
No.
Speaker 1 (29:46):
So increasingly there's going to be an agent acting on
behalf of the consumer, and an agent acting on behalf
of a merchant, and the agents will talk to each
other and there'll be no human in the middle of
that loop.
Speaker 2 (29:57):
You know, in Hollywood, for years we've said my people
call your people. I guess we're bringing that to life, Stu.
Let me take us back to the beginning. Yes, leadership.
I don't get to say this to many guests on
Good Company, but I do in this case. You've led
teams in war zones and boardrooms. Is there a common
denominator in leadership across both? Is there some common denominator?
Speaker 1 (30:20):
Two common denominators. One. Everybody rallies around a mission. Whether
you're building a company, you've got a startup, you've
got a publicly traded entity, or you're going into a
war zone, you rally around the mission. And the only
thing that happens to get the mission accomplished is people
have to do the work, and people have to feel safe,
(30:41):
they have to feel empowered, they have to have the tools,
they have to have the resources. So you know, there's
this notion in the military of mission first people always,
and frankly, it's exactly the same inside of a company as well.
Speaker 2 (30:55):
Stu, I get to do my most fun part of Good Company now, which is the lightning round. And I'm going to throw some things at you that we haven't talked about. Besides this podcast, what would I
find teed up in your audio player right now? What
would you be looking forward to listening to? Is it
a book? Is it another podcast? What's your fancy?
Speaker 1 (31:16):
So normally I listen to really, really bad spy fiction,
but I took a divergence and now I'm listening to
really really bad mafia fiction right now.
Speaker 2 (31:26):
Oh I love that. Well, I'm going to give you
my book tip of the month, Born to be Wired. It's John Malone's biography. It's a brilliant walk through the
history of cable.
Speaker 1 (31:37):
Love it.
Speaker 2 (31:39):
Stu, what's your greatest professional fear?
Speaker 1 (31:41):
My greatest professional fear is letting down the people who
trust me.
Speaker 2 (31:45):
That's a strong statement. It goes back to trust, and trust underlies everything, you know. Years ago, somebody asked me why
I didn't invite them to something or whatever. It was
kind of like a high school question for an adult,
but I said, well, this was something that I limited to my friends. And this person looked at me
and said, well, I thought I was your friend. I said, well, actually,
you were my friend, but you went the wrong
(32:07):
way with me. He said, what do you mean. I said, well,
you went from being a friend to becoming an acquaintance.
And he was startled. He said, I don't understand. What does
that mean? I said, well, you see, friends are people
I trust. Acquaintances are people I know. I know you,
I'll say hello to you. We can be cordial to
one another, but you can't really be my friend because
(32:27):
I don't trust you, and therefore, to me, there's a limit.
So it goes right to it. Speaking about people, Stu, was there a mentor in your career early on? Or, you know, it could be more than one, but is there one particular
mentor that gave you a piece of advice that you've
never lost?
Speaker 1 (32:46):
Oh? Boy, that is a great question. I reference mentors
all the time. I think at the end of the day.
One of the most important mentors I had was this
old Chief Master Sergeant, the highest rank that you could be as an enlisted individual in the military, and he took
me under his wing when I was a young second lieutenant.
(33:07):
Just by virtue of being a lieutenant, I was his boss,
but I was anything but his boss. He was mine.
And he taught me this very basic notion of kind
of shut up and listen. The first rule of leadership
is to stop, to listen, to look around, and to
learn from those that are doing the job around you,
and then make decisions later. And this gentleman taught me
(33:27):
that literally my first day on the job.
Speaker 2 (33:30):
Well, I'll modify what Aaron Burr told Alexander Hamilton in Hamilton.
Talk less, smile more. My grandmother used to say, you
have two ears and one mouth for a reason. Listen
more than you speak. Just do the math. Is there
one habit in your daily routine that brings you unexpected joy?
Speaker 1 (33:47):
Yeah. My favorite thing to do on a daily basis
is take my dogs out into the woods behind my house.
Just the act of walking past the staircase that leads down to the outdoor exit: the dogs, every time, think that's where we're going, out on the walk. Whether I make it down there once
a day or not, they're excited every time I even
walk past that staircase. So when I actually get out
(34:09):
the door with them, that's the best moment of the day.
That's a great moment.
Speaker 2 (34:13):
Final question, Stu, is there any one particular industry buzzword
that you would wish could disappear forever?
Speaker 1 (34:22):
AI.
Speaker 2 (34:27):
I figured as much. Stu Solomon, this has been a great pleasure to have
you on Good Company. I've been looking forward to this
conversation and I want to thank you for taking the time.
And I'm certain our audience has learned a couple of
important lessons today about trust and transparency as well
as some good life lessons. So Stu Solomon, thank you.
Speaker 1 (34:46):
Michael, thank you so much. I truly appreciate it.
Speaker 2 (34:53):
I'm Michael Kassan. Thanks for listening to Good Company.
Speaker 4 (34:57):
Good Company is brought to you by Three C Ventures
and iHeart Podcasts. Special thanks to Alexis Borger Purdeou, our
executive producer and head of Content and Talent, and to
Carl Catle, executive producer at iHeart Podcasts. Episodes are produced
and edited by Mary Doo. Thanks for joining us. We'll
see you next time.