Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Hello, hello. This is Smart Talks with IBM, a podcast from Pushkin Industries, iHeartMedia, and IBM about what it means to look at today's most challenging problems in a new way. I'm Malcolm Gladwell. Today I'll be discussing the innovations around hybrid cloud with Lumen's David Shacochis and IBM's
(00:27):
Howard Boville. David is Vice President for Enterprise Technology and Field CTO at Lumen, where he's helped clients across industries create new business opportunities through unique digital interactions. David has been immersed in cloud computing since long before his time with Lumen, working with companies such as Unit, Digital Island, and Fusepoint.
(00:51):
You're putting computing capacity in places that weren't thought of as data centers before. There's an element of novel challenge, and so inherently there's more complexity.
Howard is the head of IBM Cloud Platform. In this role,
Howard has focused on driving digital transformation for enterprises, especially
(01:11):
in highly regulated industries. Before joining IBM, Howard was Chief
Technology Officer for Bank of America, where he led the
transformation of the Bank's infrastructure and developed one of the
largest internal private clouds. At times you have to kind of be a technology evangelist in terms of what the art of the possible is against the problems. In
(01:31):
this episode, we'll explore working and living in a world
of cloud technology. We'll show you how new innovations in
cloud computing have reimagined a world where computing can happen
anywhere and businesses can use data to accelerate innovation to
improve service and performance. Let's dive in. Welcome everyone, Howard
(02:07):
and David. Thank you for joining us today. Let's jump in. David,
I'm gonna ask you to define some terms and that
will be easy for you but useful for the rest
of us. First of all, the Fourth Industrial Revolution? What
is it? Yeah, good one. So we can really look back,
(02:27):
I think, on history, at these periods of technology advancement, right? You know, the period of industry that was defined by steam power, and the industrial period of history that was defined by electrical distribution. We commonly think of the third one as really this information age, the information revolution: the digitization of process creation and the online
(02:52):
connectivity of data. That's the third industrial revolution of the digital age: information technology systems communicating with each other, and the advent of all that you can do in industry with those technologies. And the Fourth Industrial Revolution is really this reflection of the explosion of data that gets
(03:12):
created by all that connectivity. Taking data and being able
to acquire it, analyze it, and take action upon it
is opening up a wide range of new industries and
new business opportunities and new regulatory challenges. And that's
what we mean when we say the Fourth Industrial Revolution.
How did Lumen and IBM come together, and what's the logic
(03:35):
behind your collaboration in this field? You can take the heritage of both companies. So Lumen are a world-class global networking company. They connect things together at the highest level of quality, lowest latency, and so on. And through all the actual transformations that IBM has been through, we're a compute company on which the software runs, and we also write the software
(03:56):
in certain contexts as well. So the combination of the
two capabilities solves for the problem. We've been working together
for years. I think the advent of what we've
been focused on with IBM Cloud Satellite has really been
initiated by Lumen's investment in making our network a place
where you can run software workloads more readily and easily.
(04:17):
And IBM Cloud Satellite is a great modality that just snaps right into that network. Yeah, you work for Lumen. Is the simplest way to describe Lumen that Lumen is a Fourth Industrial Revolution company? We're a Fourth Industrial
Revolution company because we believe at the core of all
of it is connectivity. All that data and all those
sources of data and all the ways that you need
(04:39):
to interact with that data requires a substantial amount of
aggregate networking capacity. We're now kind of hitting this tipping
point in the Fourth Industrial Revolution where the amount of
data coming inbound from cameras and from sensors and from
devices and gaming consoles and a variety of input sources like that is actually exceeding capacity in the
(04:59):
other direction. So that's really why, for the Fourth Industrial Revolution to work, you need massive amounts of network connectivity. And that's what Lumen does. So this brings up the second word I want you to define, and that's edge computing, which I'm assuming is a technological response to the phenomenon you've just described.
(05:20):
It is. That's one way to think about edge computing. The way we talk about it a lot is that it's moving workloads, software workloads, closer to digital interactions, and a digital interaction could be between things and people and business models. Yes, I mean, just to add to some of David's points with some kind of practical use cases that we're involved in. So, first and foremost, the
(05:41):
edge computing piece actually is joining the analog world to the digital world. Whereas until this point you would look at the digital world through the screens that we all spend so much time looking at, on the edge it's actually looking at physical locations: like retail branches, like shipping containers, like welds on an automobile. And there are two practical examples. So there are thermal imaging
(06:05):
techniques that we now use to look at the quality
of a weld all the way through a production process
in an automotive plant, which wasn't possible before. That connects in that local location, gathers that data, and determines that the weld is at the right quality. Or, on a shipping container basis, it's the combination of RFID tags connecting to networks that can track with that level
(06:29):
of accuracy and giving you that experience. In terms of how this has come about, it's because, as we've become more familiar with the amount of data that we can capture through a digital interaction through a screen, whether that's a mobile phone or a computer, and all of the analytics that you can then do on kind of human behavior, the same questions get posed to the
(06:53):
physical locations or physical assets, the physical interactions with those physical assets, and it's the wedding of those two things that creates this IT problem that companies like Lumen and IBM solve for at the edge, so that you can actually tie together the digital world and the physical world in the same way as you were capturing the data purely from the digital world. And it's then human curiosity
(07:16):
that says, okay, well, we've got these questions answered from the kind of Third Industrial Revolution era we went through. How do we apply that, through the Fourth Industrial Revolution, to the analog world? Yeah, yeah. You know what this makes me think of? And stop me if this is too speculative. If I was a basketball coach, I would love to have
(07:38):
an edge computing system which picked up data from my
players on the court in real time and told me
who was getting tired, told me whose performance was subpar,
told me how quickly someone was responding on defense. And
I mean, that's kind of, that's an example of what you're talking about, Howard, isn't it?
(07:59):
It's like, all that had previously been entirely analog, perhaps banging on a trash can when they see something. But that, in part, to the point you're making, is in reality because there are tracking devices now on athletes in practically all disciplines, tracking how many kilometers or how many
(08:20):
miles they're running, their average pace. And that's being tracked, and that will be analyzed at the halftime break or the quarter-time break, depending upon the actual sport that's being followed, or the period break, I guess, if it's ice hockey. So that is being tracked. What isn't is the physiological elements that you talked about. But I guess kind of that will be at some point, because human curiosity will drive into that element to say, okay, what
(08:41):
level of fatigue are we at, and therefore, what was the optimal moment to actually make a substitution of a different player onto the pitch? Yeah, yeah. Or if I'm a hospital, I want to monitor the performance of my surgeons. I mean, in hour four of a complex operation, I would love to be able to, in real time, crunch a whole series of data that tells me, you know,
(09:02):
who's working well and who's flagging. Another kind of hallmark of edge computing is when you really need to correlate things locally. You know, a public safety use case where a gunshot rings out and an audio sensor picks that up. Well, correlating that with all the stoplights in the area, all the lights in the area, you know, any other public safety device that
(09:25):
is within a particular geographic boundary. That intense correlation of events to other outcomes may need to happen within split seconds, you know, for a public safety outcome to be achieved. So it's not just the fact that, you know, we're tracking, we're analyzing data, and then we're getting lessons learned at halftime about, you know, which one of our players ran around. The more fine-
(09:47):
grained, milliseconds-matter kind of use cases are another place where edge computing really shines.
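To make that local-correlation idea concrete, here is a minimal sketch in Python of how an edge node might match an incoming sensor event against nearby devices while staying inside a split-second budget. The device names, coordinates, and thresholds are hypothetical and illustrative only, not drawn from any Lumen or IBM product.

from dataclasses import dataclass
import time

@dataclass
class SensorEvent:
    kind: str      # e.g. "gunshot_audio"
    lat: float
    lon: float

# Hypothetical registry of nearby public-safety devices (IDs and positions are invented).
NEARBY_DEVICES = [
    {"id": "traffic-cam-17", "lat": 44.9780, "lon": -93.2650},
    {"id": "streetlight-204", "lat": 44.9790, "lon": -93.2660},
    {"id": "transit-cam-88", "lat": 45.0500, "lon": -93.3000},
]

def within_radius_km(ev: SensorEvent, dev: dict, radius_km: float = 0.5) -> bool:
    # Crude planar distance check; good enough for "same few city blocks" in a sketch.
    dlat_km = (ev.lat - dev["lat"]) * 111.0
    dlon_km = (ev.lon - dev["lon"]) * 78.0  # rough km per degree of longitude at mid latitudes
    return (dlat_km ** 2 + dlon_km ** 2) ** 0.5 <= radius_km

def correlate(ev: SensorEvent, budget_ms: float = 200.0) -> list:
    """Pick nearby devices to task (e.g. pull their last few seconds of video),
    while staying inside a split-second processing budget."""
    start_ms = time.monotonic() * 1000.0
    tasked = []
    for dev in NEARBY_DEVICES:
        if within_radius_km(ev, dev):
            tasked.append(dev["id"])
        if time.monotonic() * 1000.0 - start_ms > budget_ms:
            break  # degrade gracefully rather than miss the reaction window
    return tasked

print(correlate(SensorEvent("gunshot_audio", 44.9782, -93.2655)))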
In step one, you analyze that kind of data, say for the basketball player or the surgeon, after the fact. So you have the meeting the next day and you say, you didn't perform
(10:07):
very well yesterday, Malcolm, on the court. But if I can do it in real time, then I can actually affect the outcome of the game as it's happening. And that shift between being able to make those judgments immediately and making those judgments after the fact is huge. I could win the game, yeah, that I'd otherwise lose.
(10:30):
And I'm echoing, I'm capturing your excitement. You are kind of echoing that position where we've kind of gone from the digital perspective, where people are playing online games, sporting games, and making judgments based upon what they can see from the analytics they get in that digital realm, and then translating that into ideas that could be extended into the analog world, and therefore that desire. You can
(10:51):
imagine there are people, as we speak now, putting together innovative solutions that can address that very problem. Yeah. The other thing too is that we're talking about all the whiz-bang use cases, and there's sort of a subtext to everything we just said, which is that there's got to be good software, designed at scale, able to run and achieve
(11:12):
those outcomes: better basketball performance, public safety use cases. There's software that needs to go and collect all that data and take action against it. And the other real dimension, and certainly a big dimension of the IBM and Lumen relationship, is being able to enable great software development anywhere that the network can reach. None of these use cases happen unless there's software that goes
(11:35):
and runs that business logic, or runs that analysis, or processes those inputs into actionable outputs and responds to an event stream. Yeah, yeah. Talk a little bit about the cloud piece of this. Why does hybrid cloud, how does
(11:56):
it fit into this puzzle that you've been describing? So the hybrid cloud space essentially encapsulates all the points that David has gone through. So a cloud essentially is a building with computers in it that run applications. And the paradigm, until probably about ten or fifteen years ago, was that a large corporate would have a big data center, have
(12:16):
its own computers in it, and would have that capability. And then what created a huge innovation was the actual ability for a developer to come up with an idea and not need to go build a big data center full of computers; they could actually rent the space and then turn that idea into software, and turn that software into a Facebook or a Netflix or whatever it may be. So
(12:37):
it reduced barriers to entry, and that was the first phase. The next phase that we're now in is this kind of synthesis between the digital and the analog at the edge, and that's hybrid cloud computing, where we can actually create many data centers specific to particular needs all around the world, not just within the assets that IBM has or the assets that other cloud service
(12:58):
providers have. And it's these partnerships. There's also kind of
new economic models in the marketplace where companies can operate
with humility to recognize, Okay, we may be large companies,
but actually we can see the assets in another company
and the brilliant people that exist there, and if we
could partner with them, we could create something valuable for
the marketplace. What's the challenge if I'm a
company and I want to do something sophisticated with all
(13:20):
of this data? What's going to keep me up at night? As part of this puzzle, when you have this explosion of data and it can be at the edge, the key thing that we need to be very mindful of is cybersecurity risk: that that data gets in the hands of the wrong people, who then can actually use it to their own gain, or for whatever purpose they want to use it.
(13:42):
So every solution that gets built has to be built to a very high grade of cybersecurity, ensuring that we protect our customers' data and also that we protect them with respect to the laws, rules, and regs that they're obligated to follow. Broadly speaking, you're putting computing capacity in places that didn't used to be thought of as data centers before. Right, there's an element of newness,
(14:03):
there's an element of novel challenge that you may be running into, and so inherently there's more complexity. The other thing that keeps a lot of IT leaders up at night is whether they are going to be able to write software and deliver it at a pace of change that is actually going to be able to take advantage of, or solve, the problem they're trying
(14:25):
to solve. So I want to go back. I want to do a 'for example' here, because it seems to be a really interesting and important point. When I raise that example of the surgeon, and we want to gather data from the surgical suite, we want to make sense of it in real time, we want to inform the surgery itself. But then you also want to share that
(14:46):
data with lots of other hospitals and use that to
build some kind of system that can improve surgery generally.
So what you're saying is, in order to do that
last piece, which is arguably the most important of the pieces,
everyone's going to be reading from the same book, right? And the key around that is there's a level of complexity, as reading from the same book also means that
(15:08):
the actual format is the same, the language is the same, the typeface, to carry that analogy on. So getting consistency in terms of the data models, as it's known, is super important, as is the provenance, so that you know that the actual quality of the data is at the highest level of integrity. And the reason why that's important is you would take all of that insight, all of those lessons that are turned into data,
(15:32):
and put them into an artificial intelligence model, what's known as training that model, so that it actually can come up with hypotheses that are continually, iteratively improved based upon the amount of data. But if there's any issue, any corruption in that data, it will compromise the actual outcomes. And because the volumes of data can
(15:52):
be so large, it is actually difficult to ensure that the outcomes are trained correctly. So there's a huge amount of work that has to go into ensuring the integrity of the data, and that the provenance of the data is correct, so the AI doesn't get trained in the wrong way.
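As a rough illustration of the kind of gate Howard is describing, here is a minimal Python sketch that checks a record's shape and its provenance (here, a simple content hash against a publisher's manifest) before admitting it to a training set. The field names, the manifest format, and the sample records are all hypothetical.

import hashlib
import json

REQUIRED_FIELDS = {"record_id", "site", "timestamp", "measurements"}

def fingerprint(payload: dict) -> str:
    """Stable content hash used here as a stand-in for a provenance check."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

# Hypothetical manifest published by the data owner: record id -> approved fingerprint.
good_record = {
    "record_id": "case-0001",
    "site": "hospital-a",
    "timestamp": "2022-03-01T10:15:00Z",
    "measurements": [72, 74, 71],
}
MANIFEST = {"case-0001": fingerprint(good_record)}

def admit_for_training(payload: dict) -> bool:
    """Admit a record only if both its data model and its provenance check out."""
    if not REQUIRED_FIELDS.issubset(payload):        # wrong shape / missing fields
        return False
    expected = MANIFEST.get(payload.get("record_id"))
    return expected is not None and fingerprint(payload) == expected

tampered = dict(good_record, measurements=[72, 74, 200])  # corrupted copy of the same record
print(admit_for_training(good_record))  # True  -> allowed into the training set
print(admit_for_training(tampered))     # False -> never reaches the model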
Yeah, that idea of software distribution: in our data analytics practice, one of the industries we work
(16:13):
with extensively is manufacturing. And one of the things that
we see organizations challenged by, and a phrase that one of our data scientists uses all the time,
is that it's actually kind of easy to go and
collect a lot of data locally on the shop floor,
and it's kind of easy to get all of the
data historically that you've ever had once it's available in
your data center to go have a data scientist analyze
(16:34):
it and come up with, you know, widely held best practices, and sort of what should be the most efficient way to do things, and what should be the most efficient data model that can analyze all the sensors in the factory. The challenge is getting it from the top floor to the shop floor. It's fine to get that lesson in your core data center. It's fine. It's
(16:56):
fine to go collect a lot of data. The challenge is connecting them together, and that's really where this idea of consistent delivery of new software comes in. When the lesson you learn on the top floor says this is the way it ought to be, how do you get that code out into your built environment so that that software is actually taking effect? It's not just a theory that is a model in a data center, but it's a model that can make a difference.
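To make that "top floor to shop floor" delivery step concrete, here is a minimal Python sketch of rolling one versioned, checksummed model artifact out to a list of edge sites so they all end up running the same thing. The site names and the in-memory "deployment" are hypothetical placeholders, not an IBM Cloud Satellite or Lumen API.

import hashlib

# Hypothetical edge sites; in practice these would be satellite locations or site APIs.
EDGE_SITES = ["plant-detroit", "plant-austin", "plant-monterrey"]

def package_model(model_bytes: bytes, version: str) -> dict:
    """Bundle the trained artifact with a version and checksum for verifiable delivery."""
    return {
        "version": version,
        "checksum": hashlib.sha256(model_bytes).hexdigest(),
        "artifact": model_bytes,
    }

def deploy(site: str, bundle: dict, installed: dict) -> None:
    """Simulate one rollout step: verify integrity on arrival, then record what the site runs."""
    if hashlib.sha256(bundle["artifact"]).hexdigest() != bundle["checksum"]:
        raise ValueError(f"{site}: artifact corrupted in transit")
    installed[site] = bundle["version"]

installed_versions: dict = {}
bundle = package_model(b"fake-weld-quality-model-weights", "2023.06.1")
for site in EDGE_SITES:
    deploy(site, bundle, installed_versions)

# Every site should now report the same version -- the consistent delivery David describes.
print(installed_versions)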
(17:20):
Tell me about how this collaboration between your two companies addresses that problem. Can you give me an example? Well, yeah, I think, to what Howard was alluding to, one of the customers we're working with right now is in the financial services industry. But this is a digital interaction between the financial services business model of banking and the people that walk up to it. And
(17:40):
there's a security risk out there in the world whereby
bad actors will target ATM machines, and it's called skimming, where they'll go and walk up to an ATM, put a device that looks the same color, same fitting, over the credit card slot, and surreptitiously scan the credit card as it's being inserted into the machine.
(18:01):
The user doesn't know that it happened, and the bank
doesn't necessarily know that it happened. The point at which they can take the most effective action against that bad actor is the point at which they're walking up to the machine, which has a video camera inside of it, and inserting that device. And so there are certain patterns you can be looking for. Are they
(18:23):
walking up to it with a bag, are they reaching
into the bag, are they taking on a certain posture against that ATM interface, to know whether maybe there's further correlation we need to take against this person? But financial companies would look at that and say, you know, that could be a needle-in-a-haystack kind of analysis problem. And if you get better and better at getting closer to figuring out who is skimming
(18:44):
off your ATM machines and who isn't, once you get good at building that model and then deploying that software to all your ATMs, you're in a situation where, for your overall risk to your customers and your brand, the payoff becomes immeasurable. So that's one of the things that we're working on with IBM and some of the great video analytics software they have that we can put out closer to some of these
(19:04):
financial institutions, to acquire and analyze, but then act upon, the data that's involved. Oh, I see. So, to your point, insight number one is this particular ATM has been compromised. But the much more useful bit of information is that it's been compromised by such and such a person,
(19:27):
and we're observing that person compromising it in real time. Right, right, right. So whether that ATM learns what a bad actor looks like walking up to it in Minneapolis, well, that's good. But the key is then learning, updating the model, getting that new software tested, and then getting it deployed consistently to all the other places that can benefit.
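As a rough sketch of the ATM-side half of that workflow, here is a small Python loop that runs a skimming detector locally and only sends alert events upstream, rather than the raw video. The camera reader, the scoring function, and the threshold are stand-ins, not IBM's actual video-analytics software.

import random

def read_frame() -> list:
    """Stand-in for the ATM camera feed; returns fake frame features."""
    return [random.random() for _ in range(16)]

def skimming_score(frame: list) -> float:
    """Stand-in for a video-analytics model scoring one frame (0 = normal, 1 = suspicious)."""
    return sum(frame) / len(frame)

def monitor(n_frames: int = 500, threshold: float = 0.68):
    """Run inference at the edge; yield only the alerts that are worth sending upstream."""
    for i in range(n_frames):
        score = skimming_score(read_frame())
        if score >= threshold:
            # Only this small event (not the video itself) needs to leave the ATM.
            yield {"frame": i, "score": round(score, 3), "action": "flag_for_review"}

for alert in monitor():
    print(alert)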
(19:50):
I want to go back to this partnership between Lumen and IBM. You said you guys have been working together for some time. How long? When did it first start? We've had relationships with IBM and some of its affiliated companies in one way, shape, or form for a few decades. The other thing to remember is Lumen is a service provider, right, so we contract with our customers to go deliver services for them.
(20:10):
In a lot of cases, those services have always involved
IBM software, IBM data capabilities, working with the IBM cloud,
and so IBM as a technology entity has been connected
to the endpoints of Lumen networks, you know, for all that time. Yeah. From a customer standpoint, what does the partnership between Lumen and IBM look like?
(20:33):
I mean, if I'm that financial services company trying to stop my ATMs from being hacked, am I dealing with a kind of task force made up of Lumen and IBM folks? So the solution that we're putting together there is precisely that. So as technology companies continue to evolve,
(20:53):
they have these kinds of task forces that you talk about, that actually work on problems and then apply the latest technology innovations to those problems. We then create new go-to-market offerings. As I mentioned earlier, kind of the business models that really work now are where you actually get and understand, with humility, the assets that you have as a company and combine them with the assets of other companies. And the thing that really
(21:14):
makes it come alive is in getting two very smart groups of people together to actually face off against those business problems. So for the problem that Dave was going through, there was a conversation in the meeting room which is, we have this problem, how would you think about this? And then we combine our engineers and the various components that we have, and work up what we call proofs of concept to kind of work through whether there are,
(21:35):
in essence, solutions that we can put together, and then increasingly that becomes something that we would call a production offering, which actually becomes more generally available in the marketplace. What's the hardest problem? It's always, I think, latency. Latency is always the hardest thing, and it's in both domains, but probably primarily in the
(21:57):
Lumen domain, and that's where you're kind of forever pushing physics to actually get as close to the speed of light in terms of how quickly you're transmitting data. And it's a tough, tough problem to solve for, because of the huge volumes of data and because of, increasingly, human nature's desire for instant gratification: we want
(22:17):
everything now, we want it immediately. And what's hard about that is latency is a particularly hard problem from a technical standpoint, because, you know, latency is the amount of time it takes, usually measured in milliseconds, which are less than the blink of an eye, for a
(22:38):
packet to traverse between two particular endpoints in a network. But those all add up, right? You can sort of think of it, in a computer or a brain context, as processing speed: how fast can I react to things? Well, if it takes a while for the packets to travel through their neurons, to use a brain analogy for a network,
(22:59):
the longer it takes for the packets to process through, the longer it takes for an outcome to occur. And if an outcome takes too long to process, then it becomes fairly useless.
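To illustrate how those per-hop delays add up against a real-time budget, here is a minimal Python sketch comparing a path back to a distant regional cloud with a path to a nearby edge node. The hop names and the millisecond values are invented for illustration, not measurements of any particular network.

# Hypothetical one-way hop latencies in milliseconds (illustrative numbers only).
PATH_TO_REGIONAL_CLOUD = {
    "device_to_tower": 8,
    "tower_to_metro": 12,
    "metro_to_region": 25,
    "region_to_cloud": 30,
}
PATH_TO_EDGE_NODE = {
    "device_to_tower": 8,
    "tower_to_edge": 6,
}

BUDGET_MS = 100  # the decision is useless if it arrives later than roughly a tenth of a second

def round_trip_ms(path: dict, processing_ms: float) -> float:
    """Packets pay every hop twice (request and response), plus the compute time."""
    return 2 * sum(path.values()) + processing_ms

for name, path in [("regional cloud", PATH_TO_REGIONAL_CLOUD), ("edge node", PATH_TO_EDGE_NODE)]:
    total = round_trip_ms(path, processing_ms=20)
    verdict = "within" if total <= BUDGET_MS else "over"
    print(f"{name}: {total:.0f} ms round trip ({verdict} the {BUDGET_MS} ms budget)")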
Yeah, yeah. My first question is: do customers always realize what the potential of all of these different pieces is, or is part of your job helping people, opening people's eyes to what's possible? You know,
(23:23):
at times you have to kind of be a technology evangelist in terms of what the art of the possible is against the problems. And it's not because customers don't have the same ability to see that. It's just very often they don't see the breadth of things that we see when we're working with lots of different industries, and we can apply solutions from one place to another. The other element in terms of the
(23:44):
pace of adoption in organizations is less about the actual people within them, and more about the technology decisions that were made in the past. Large investments will have already been made to actually build the technology environments that they have. They're known as legacy environments, and it's getting from a legacy environment to the new environment. And that's a tricky problem, in the sense that you have to
(24:04):
look at your balance sheet, you have to look at the amount of work that would be necessary to do that. You've got to change everything from infrastructure to lines of application code to data sets and so on. So it's a very complex environment for our customers to be thinking about, and therefore, what do they prioritize as their next area of innovation relative to the actual value that they would get for their customers, or
(24:26):
for their shareholders, or whatever those drivers are. It's really interesting, that word. I get a kick out of it. Enterprise IT is the only context in which legacy is an epithet, right? Like, you say legacy to an IT person, they roll their eyes and, you know, their blood pressure goes up. It's like
(24:47):
nails on a chalkboard. But to most individuals, like, what is your legacy? The word legacy means it's something to be honored, right? It's something. In an enterprise context, legacy just means you've made a lot of decisions already, you've made, let's say, a lot of implementations, you're bringing a lot behind you. That should be a good thing. But in an enterprise IT context and a technology domain, it's really challenging. Yeah,
(25:08):
I mean, what I've heard played back to me is kind of, yeah, God may have created the earth in seven days, but he didn't have to deal with legacy. Yeah, so it kind of gives you a sense as to the differences in an IT context. Yeah. One last question: I want you guys to jump ahead ten years from now. I've gathered the two of you
(25:28):
ten years from now; tell me what's top of mind. I think what's really a huge challenge in business, and in the ways that businesses and organizations collaborate, is this concept of composability. And I think composability,
(25:51):
the ability to go break things down into simple functions and have them be intercombined, we're still even at the outset of that. You're starting to see that a lot in the cloud, but as we get out closer to edge computing, in some of these Fourth Industrial Revolution use cases, the ability to take and compose different capabilities from an IBM, from another software company,
(26:14):
from a real estate company that's selling you access to run computing capacity at the end of a physical link. The ability to compose services together, whether it's through multiple parties, or the ways organizations even present themselves to the world, take advantage of us in any way, in any slice that you so choose. Composability is
(26:35):
going to open up a massive amount of possibilities. It's maybe a little rooted in the here and now, but it's something that I'm excited about over the next five to ten years. Yeah, the thing I'm interested in is kind of, so, we're in the midst of artificial intelligence that is increasingly starting to tax the inventors of it, which is human beings. So the prefrontal cortex only has so much energy it can burn in a day,
(26:55):
and it is being burnt out at the end of every day through the actual amount of data that's bombarding it. So intelligence augmentation, flipping the two letters from artificial intelligence, AI, to intelligence augmentation, IA, so that we can actually work within these environments in a far more cumulative style relative to what we can biologically do, is going to be where there's a lot of advancements.
(27:16):
And I talked about the partnerships between two technology companies, so Lumen and ourselves, but there will be increasing partnerships between health and bio companies as well, as it relates to technology. Yeah, wonderful. Well, thank you so much. This has been really fun. Thank you very much, it's been a pleasure. Thanks again
(27:38):
to David Shacochis and Howard Boville for talking with me. It's fascinating to consider how quickly data analysis can change performance in real time, and the endless possibilities of hybrid cloud and edge computing. I look forward to witnessing their evolution.
(28:01):
Smart Talks with IBM is produced by Emily Rostek with Carly Migliori and Catherine Girardeau. Edited by Karen Shakerdge. Engineering by Martin Gonzalez. Mixed and mastered by Jason Gambrell and Ben Tolliday. Music by Gramoscope. Special thanks to Molly Socha, Andy Kelly, Mia Lobel, Jacob Weisberg, Heather Fain, Eric Sandler,
(28:24):
and Maggie Taylor, and the teams at 8 Bar and IBM. Smart Talks with IBM is a production of Pushkin Industries and iHeartMedia. You can find more Pushkin podcasts on the iHeartRadio app, Apple Podcasts, or wherever you like to listen. I'm Malcolm Gladwell. See you next time.