Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Extension two twenty four. Give her a call. That number
again is nine oh nine eight A nine eight three
seven seven extension two twenty four.
Speaker 2 (00:13):
You're listening to the Inland Talk Express ten fifty AM
and one oh six point five FM, KCAA, Loma Linda.
Speaker 3 (00:20):
The information economy has arrived. The world is teeming
with innovation as new business models reinvent every industry.
Inside Analysis is your source of information and insight about
how to make the most of this exciting new era.
Learn more at inside analysis dot com. And
now here's your host, Eric Kavanaugh.
Speaker 4 (00:46):
All Right, ladies and gentlemen, Hello and welcome back once
again to the only coast to coast radio show that's
all about the information economy. It's time for Inside Analysis.
Yours truly Eric Kavanaugh here, and I'm so excited to
be talking about engineering these days, engineering for success. And
what's one of the hot topics that we have, especially
with all these electric vehicles and hybrid vehicles these days,
(01:08):
are the batteries. And batteries of course are expensive, they're heavy.
We want them to run as long as humanly possible.
No one likes charging their batteries up; people like using
their batteries, and certainly in the electric vehicle world, they
are crucially important. I mean, with a hybrid, you got gas.
With an electric vehicle, all you have is the electricity
and that's in the battery. It's very heavy, it's very expensive.
(01:30):
So what can we do to improve the performance of
these batteries. Well, guess what, we've got Giovanni Rossi calling
in all the way from Milan, Italy, my favorite country
in the world. I love Italy, and he is with
a company called Electra. They are leaders in applied AI
for battery packs. I'm like, what explain this? So, Giovanni,
welcome to the show. Tell us a bit about what
(01:50):
you folks are doing and how you're able to use
a digital twin of batteries to improve the performance, and
not just in the design phase, like meaning for the
next set of batteries to come on the market, but
for batteries that are already out there being used. Tell
us what Electra is and how you're doing this stuff.
Speaker 5 (02:07):
Thank you, Eric.
Speaker 6 (02:08):
So, Electra is an AI cleantech and B to B
software company that is, like, focused to unlock the full
potential of battery technology. So basically, as you were mentioning, we
have two different types of products. The first one is
like a digital twin application, so we study the batteries,
we understand what is happening, you know, with the battery,
by having a virtual replica of the battery where you
(02:30):
can, thanks to our software capability and AI models, test
and innovate with the battery. So you can insert your
different parameters, you can test out new chemistries or new
models, what you want to do, what is necessary to
make a new battery, and so you can get faster
to your results. Basically, you know, when you need to
(02:51):
produce a new battery, it can take up to ten
years and cost like up to one billion dollars, and
so of course you need technology such as ours to decrease,
of course, your time and also your cost in making
new batteries.
Speaker 5 (03:05):
On the other side, we also have a set of
software for.
Speaker 6 (03:07):
Like, you know, managing, optimizing and controlling battery capabilities, and
so improving again the performance of the batteries and providing
valuable insights for the different types of users and for
the businesses managing these batteries, so to make the battery
last longer and perform better, and so maximizing, of course,
the return on investment for the battery assets.
Speaker 4 (03:29):
Yeah, that's really interesting. So let's talk first about
the digital twin and that environment where you have built
out in a computing environment. Obviously the battery itself, what
goes into it, and then you're able to allow your
engineers to play around with different settings. Is it getting
(03:50):
into the material science? Like how much detail can you
give us on what's done in that environment and how
you're enabling more efficient battery designs.
Speaker 6 (03:59):
Yes, you can go into more than one hundred parameters, you know,
into the battery specs. So it can be either at
a cell level, and so, like, you know, into the
small, granular stuff in that sense, up until, like, the
battery pack. So it depends, of course, on what we
want to do. But we have different applications, different modules
in the platform, and so you can go from cell
(04:20):
to pack levels so that you can test and
innovate whatever is, like, necessary for you. Of course you
can change parameters. There are already some pre-built AI
models that, you know, we have already designed and tested
with different customers, understanding, of course, the trends in
the market. But then, of course, we can also create
some additional customized AI models to do some very specific
(04:44):
operations that a customer may be doing.
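To make the digital twin idea concrete, here is a minimal sketch of a virtual design-of-experiments sweep. The CellDesign fields, the simulate_cycle_life surrogate and its coefficients are illustrative assumptions for this example only, not Electra's actual models or parameter names.

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class CellDesign:
    cathode: str                     # e.g. "NMC811" or "LFP"
    electrode_thickness_um: float
    electrolyte_additive_pct: float

def simulate_cycle_life(design: CellDesign) -> float:
    """Toy surrogate model returning an estimated cycle count.
    A real digital twin would replace this with physics-based and
    AI models fitted to measured cell data."""
    base = 2500 if design.cathode == "LFP" else 1500
    thickness_penalty = (design.electrode_thickness_um - 60) * 5
    additive_bonus = design.electrolyte_additive_pct * 40
    return max(base - thickness_penalty + additive_bonus, 0)

# Sweep parameter combinations virtually instead of building and
# cycling dozens of physical prototype cells.
candidates = [
    CellDesign(c, t, a)
    for c, t, a in product(["NMC811", "LFP"], [50, 60, 70, 80], [0.0, 1.0, 2.0])
]
best = max(candidates, key=simulate_cycle_life)
print(f"Best virtual candidate: {best} -> {simulate_cycle_life(best):.0f} cycles")
```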
Speaker 4 (04:48):
That's interesting. So with all these parameters, you're allowing the
scientists basically to get in there and just play around
with things and see about whether it's the kind of
materials that are used, how much of them are used,
all these different aspects of the actual design process. And
then in this digital twin world you can kind of
play around. Can you talk a bit about the accuracy,
(05:11):
like what, you know, what's the range of efficacy with
these tests when you do them in the digital twin
world and then you see what happens in the real world.
Is that something that you're capturing and managing over time
the data around that.
Speaker 6 (05:26):
Yeah, I mean, like, you can play around with whatever
is, like, you know, within the platform, of course, and
also adding some additional modules and pieces when it is necessary.
Speaker 5 (05:35):
And I think that the main benefit.
Speaker 6 (05:37):
Here is, like, you know, when you need to test
out some batteries, you know, and you can have this
digital replica and you can play around with it, you
can decrease the testing time, so you know that that
seems like a real deal. And the game-changer perspective
is that, like, you know, you don't need to have,
like, a lot of, you know, physical testing, but you
can do everything virtually, and this can go up to,
(06:00):
you know, ninety percent of the testing can be reduced,
of course in the best-case scenario. But we have
seen that on average the reduction in
testing is like sixty to seventy percent, so meaning that
of course you're saving up a lot of time and
also a lot of money, and so the scientists can
also focus on some more interesting stuff, let's say it this
(06:20):
way, than just replicating the test over and over again.
Speaker 4 (06:26):
You know, I remember talking to a gentleman, a very
very interesting guy. He is an electric car racer, and
we were over in Belgium for an electric car race
a couple of years ago, and he was talking about
this effect where, when you coast, you can actually slowly
increase the energy in the battery a
(06:46):
little bit. Do you know about this? Like, when you
brake or coast in an EV, the electric motor acts as
kind of a generator, converting the energy back into electrical energy.
Is that right?
Speaker 5 (06:56):
Yeah, and you can save up some energy and refill
that into the battery.
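As a rough worked example of the regenerative-braking effect being described here, the back-of-the-envelope numbers below (vehicle mass, speed, round-trip efficiency) are assumptions for illustration only.

```python
# Kinetic energy recoverable when braking from cruising speed,
# times an assumed motor/inverter/battery round-trip efficiency.
mass_kg = 2000            # assumed mid-size EV
speed_kmh = 100
regen_efficiency = 0.65   # assumed round-trip efficiency

speed_ms = speed_kmh / 3.6
kinetic_energy_j = 0.5 * mass_kg * speed_ms ** 2
recovered_kwh = kinetic_energy_j * regen_efficiency / 3.6e6

print(f"Braking from {speed_kmh} km/h returns roughly {recovered_kwh:.2f} kWh to the battery")
```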
Speaker 4 (07:00):
Yeah, that's pretty cool. That's pretty cool stuff. And then
talk about the battery management systems. I'm getting spanked by
the sunlight here in my head, but try not to
worry about that. Talk about how you can also work
with the management batteries, I'm sorry, the battery management system,
to improve performance there. What does that look like?
Speaker 6 (07:21):
So basically, let's take an electric vehicle as an example.
You know, what you do right now, it's like you
have different information that is coming from the battery, coming
from the driving style, coming from the environment, and of
course all of this is, like, impacting the performance of
the battery.
Speaker 5 (07:37):
So what we do is like we take all of.
Speaker 6 (07:39):
This information together, so battery, environment, driving style, and we
put everything together, and thanks to our AI, you know, algorithms,
we can provide an output to the users, to the
drivers, in the sense of, like, how you can optimize,
of course, your range. You can so extend your range.
In this sense we have been able to estimate that,
(08:05):
of course, the reduction of, sorry, the average error in,
like, you know, estimating the range is like twenty percent,
fifteen to twenty percent, depending on the vehicle. And with our
model you can go down to less than one percent
error in estimating range. So this means that of course
you have much more precise information in that space, and
(08:28):
we also, you know, with additional insights and suggestions
that we can provide, you can also extend the range
and the capabilities. We have actually been testing this very
recently in a cross-country, you know, trip starting
from Boston to Vegas, because we were presenting our technology
at CES twenty twenty-five, and we did it with,
(08:51):
like, a Tesla Cybertruck. So our founder, our co-founder
and CEO Fabrizio Martini, and one co-pilot drove all
the way down from Boston to Vegas to demonstrate the
capability of all our algorithms.
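To illustrate the kind of inputs being combined here (battery state, environment, driving style), a deliberately simplified range estimator is sketched below; the adjustment factors are invented for the example and are not Electra's algorithm.

```python
from dataclasses import dataclass

@dataclass
class TripContext:
    usable_kwh: float            # current usable battery energy
    ambient_temp_c: float        # environment
    mean_speed_kmh: float        # driving-style signals
    accel_aggressiveness: float  # 0 (gentle) .. 1 (aggressive)

def estimate_range_km(ctx: TripContext) -> float:
    """Start from a nominal consumption and adjust it for temperature
    and driving style, the same classes of inputs described above."""
    kwh_per_100km = 16.0
    if ctx.ambient_temp_c < 10:          # cold weather raises consumption
        kwh_per_100km *= 1.0 + (10 - ctx.ambient_temp_c) * 0.02
    kwh_per_100km *= 1.0 + max(ctx.mean_speed_kmh - 90, 0) * 0.005
    kwh_per_100km *= 1.0 + ctx.accel_aggressiveness * 0.15
    return ctx.usable_kwh / kwh_per_100km * 100

print(f"{estimate_range_km(TripContext(60, -2, 110, 0.7)):.0f} km estimated")
```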
Speaker 4 (09:04):
That's wild.
Speaker 5 (09:05):
Yeah, it's a very long drive.
Speaker 6 (09:06):
It took like seven days, so that's really intense, let's say it
this way, and we have been able to, you know,
showcase and demonstrate that we can extend the range by
twenty percent and reduce the charging by around thirty-three percent.
Speaker 5 (09:22):
That is pretty it's a pretty good number, we say.
Speaker 6 (09:25):
And in all of this, you know, also managing and
understanding all the data coming from the batteries, the environment and the driving styles,
and of course we also have the capability to predict failures
in advance, so we can have, like, a product
that is, like, one hundred percent safe for, of course,
the driver and for, of course, the vehicle producer.
In that sense, that is also very, very interesting, and
(09:47):
all of this can be scaled when you think not
only, you know, individual drivers, but you have like fleets
of vehicles or fleets of electric vehicles, you know, adding
all of this information and understanding what is happening within
them. So, of course, what is, like, the state of
health of your battery, what is, like, the degradation of
your batteries, and so how much your battery can stay,
(10:07):
can last, you know, for what we define as the
remaining useful life of a battery. You can actually
increase the value of your asset up to forty percent.
So that is, like, a way for both the drivers,
the OEM or the battery producer, and then, you know,
the fleet operators to have more information, more insights, and
(10:28):
so have a much better capability to manage the operations
and so, of course, get the value out of it.
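A minimal sketch of the state-of-health and remaining-useful-life idea mentioned above: fit the capacity-fade trend and extrapolate to an end-of-life threshold. The cycle data, the linear fit and the eighty percent threshold are illustrative assumptions, not a production fleet-analytics model.

```python
# Fit a linear capacity-fade trend and extrapolate to end of life.
cycles   = [0,    100,   200,  300,   400,   500]
capacity = [1.00, 0.985, 0.97, 0.958, 0.944, 0.93]   # fraction of nominal

n = len(cycles)
mean_x = sum(cycles) / n
mean_y = sum(capacity) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(cycles, capacity)) / \
        sum((x - mean_x) ** 2 for x in cycles)
intercept = mean_y - slope * mean_x

soh_now = capacity[-1]
eol_threshold = 0.80                       # common end-of-life definition
eol_cycle = (eol_threshold - intercept) / slope
remaining_cycles = eol_cycle - cycles[-1]

print(f"State of health: {soh_now:.1%}")
print(f"Estimated remaining useful life: {remaining_cycles:.0f} cycles")
```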
Speaker 4 (10:37):
Yeah, well for sure. So I'm guessing what you're doing
here is in terms of being able to optimize the
battery life is you're you're analyzing the driving behavior. So
does this person accelerate very quickly, do they drive at very
high speeds on a regular basis, do they go up
and down often in terms of speed. These sorts of
(10:58):
variations will have an impact on the battery life, and
so what you can probably do is detect those algorithmically,
understand what's really happening, and then give some recommendations to
the user to say, hey, maybe if you were to
not speed up and slow down so quickly, you would
extend... I'm just guessing here that, you know, if you
have an even-keel drive, you're going to get better use
(11:21):
of the battery.
Speaker 6 (11:21):
Right, yes, and your battery can last longer, because it's
not only about range and performance, but it's like, you know,
if you drive better, in that sense, so if you use
the battery better, your battery can last for longer. Of course,
we have some algorithms, like, optimizing how much the battery
can last. But of course then in this case, of
course, also the user is, like, responsible for that. So, you know,
(11:44):
the two things combined, that is, like, what can make
your battery last longer. And if your battery lasts longer,
you know, it means that you are keeping your car
longer in a way, and that's also saving money in
a sense, but it also reduces the environmental impact, which
is definitely important in this sense. And of course you
can scale this not only to electric vehicles, but if
(12:06):
you think also stationary storage, that is, like, another application,
usually much more for the B to B side, let's say
it this way, like for businesses, but it's also becoming
quite popular for homes. Especially, you know, mentioning, you know,
the B to B side, or like businesses, when you
have some solar farms or wind farms and you want
to store the excess of energy, because, you know, renewable
(12:28):
energy is very cool, but, you know, the sun,
sorry, is not always shining and the
wind is not always blowing, so you need to, you know,
store the energy when you have the excess
of energy and then release it into the grid. And
so having technological capabilities and AI-driven capability to
(12:49):
understand what is happening in these batteries, to manage them
better and to optimize them, and also to predict faults,
because, you know, having all of this information, it
is one of the main game-changing capabilities of our
technology that we can predict faults up to three
months in advance. So this means that we can understand
what is actually happening, and, you know, how your battery
(13:09):
is performing, and if you're going to have some critical faults.
And when I mean critical faults, it's like, you know,
thermal runaway or, like, fires or some other issue
with a battery. That's very, very important to maintain safety
and reliability of that asset.
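The fault-prediction idea can be sketched very simply as watching for cells that drift away from the pack baseline; real systems learn these thresholds from fleet data, and the telemetry values and limits below are made up for illustration.

```python
# Toy fault-precursor check over cell telemetry: flag cells whose
# temperature or voltage drifts away from the pack baseline.
# Thresholds are illustrative; production systems learn them from data.
from statistics import mean, pstdev

def flag_risky_cells(cell_temps_c, cell_voltages_v,
                     temp_sigma=2.0, volt_delta=0.15):
    avg_t, std_t = mean(cell_temps_c), pstdev(cell_temps_c) or 1e-6
    avg_v = mean(cell_voltages_v)
    risky = []
    for idx, (t, v) in enumerate(zip(cell_temps_c, cell_voltages_v)):
        running_hot = (t - avg_t) / std_t > temp_sigma
        voltage_sag = (avg_v - v) > volt_delta
        if running_hot or voltage_sag:
            risky.append((idx, t, v))
    return risky

temps = [31.2, 31.5, 30.9, 31.1, 38.4, 31.3]   # cell 4 is running hot
volts = [3.67, 3.66, 3.68, 3.67, 3.45, 3.66]   # and its voltage is sagging
for idx, t, v in flag_risky_cells(temps, volts):
    print(f"Cell {idx}: {t:.1f} C, {v:.2f} V -> inspect before it becomes a fault")
```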
Speaker 4 (13:23):
Yeah, no, that that's very interesting. And there are also
these power walls, right, doesn't Tesla don't they have a
power wall? So basically you can buy a car, but
you also buy this wall that is like in your house,
and it is long-term battery storage,
is what it boils down to.
Speaker 5 (13:37):
Right.
Speaker 4 (13:38):
It allows you to I'm guessing to charge the car
faster and do things of this nature. Yeah, you also
just have a power source, like you mentioned, because I
looked into solar panels years ago when we were in Texas,
and you know, some of the key things you have
to keep in mind are being able to manage that
power that comes in and storage such that at a
certain point you can then pump it back into the grid, right,
(13:59):
which is you know, you have to be very delicate
about that obviously, because it's electrical energy. You don't want
to get too much. So what are you able
to do there in terms of understanding the sort of
dynamics of how battery usage works in these walls, and like,
can you I presume you can extend their life too, right.
Speaker 5 (14:17):
Correct, we can extend the life.
Speaker 6 (14:18):
You can monitor them, of course, better, and especially understand
how they are performing, and increase, again, their performance and
optimize them so that you can have, like, more energy
to use in that sense, and charge your car faster
if it is an application for a car,
or, like, you know, manage better the energy for your home,
(14:39):
optimizing, of course, also the flow of energy, because
usually you have PV panels, and so you want to
have everything working in a very optimized way.
Speaker 4 (14:48):
That's very interesting. So do you get much traction in
that space, or maybe talk a bit about your clients
who's using this right now? To what effect?
Speaker 5 (14:57):
I mean?
Speaker 4 (14:57):
Obviously you're just extending the life of this asset,
and you're improving the use of this asset, which is
incredibly valuable and saves money and also saves you on trouble,
especially this part about being able to predict fires, because
that is the one real big downside, it seems to
me of this whole paradigm is that these things can
(15:17):
get too hot and just blow up, and that's
no bueno, right? So what can you do? Yeah, so.
Speaker 6 (15:26):
We, I mean, like, thanks to the technology, there
is, like, a new layer, I would say, of understanding
these assets better and using them better. I think that,
you know, right now there are the right technologies and
also the artificial intelligence and machine learning capabilities to better
understand how to, again, manage and optimize these assets.
Speaker 5 (15:46):
That will be also.
Speaker 6 (15:48):
You know, when we'll have so much
broader adoption of these assets. And so this will
also change the paradigm of how we think about energy
in general and also the whole energy transition, and
everything is based on this, because of course it's not
only, again, the renewables, but you also need some other
components to make this happen and also to sustain a
(16:10):
much lower environmental impact. And so all of this will
be much more necessary, especially on the technological side, for, like, making
this transition, you know, a reality. And if you think
about how the grid right now is, like, you know, configured,
for instance, you know, you have, like, you know,
the traditional system in the Western countries, like you have
(16:32):
big plants where you produce the energy and then it's
distributed, like, you know, to the grid, to different facilities
that are, like, off-taking the energy. Right now, with
having, you know, solar farms, wind farms and also, you
know, PV panels in, like, domestic and business facilities, so,
like, for instance on your rooftop or, like, you know,
in some other situations, you basically are changing this perspective
(16:55):
and changing the paradigm of how the energy flows, are
actually, you know, pumping energy into the grid. And
so you need, again, capabilities, technological capabilities, to keep everything
working and running smoothly and also to have additional benefits
for all the users involved.
Speaker 4 (17:13):
What's next? What are you guys working on now that
we can expect in like a year or so. What's
the next big push for you? Oh?
Speaker 6 (17:19):
I cannot share that, I would like to, but that's
a good... Yeah, that's pretty funny.
Speaker 5 (17:25):
It is like yeah, yeah, I mean I can share.
Speaker 6 (17:28):
It's like, you know, we are implementing day after day
our models and you know, understanding what are the new
needs of the customer. And I think the most important
stuff this is also part of my job is like
getting ahead of the curve. So it's like what trends
are going to be there in the next six, twelve,
eighteen months and what we can do to address those
(17:49):
in advance. So one of the things that we've always
been able to do is, like, understanding the trends before they
happen and, you know, getting something out before that moment
so that we can be leading.
Speaker 4 (18:01):
Well, this is fantastic. And the company is Electra. Are
you based in Italy, or are you just in Italy?
Is the company based in Italy?
Speaker 6 (18:08):
No, the company is based in Boston, Massachusetts, so the
main headquarters is there, and then we also have a second
European headquarters in Italy.
Speaker 4 (18:17):
Excellent. Well, that's going to be great. Great, great,
great information, great content. I love what you guys are doing.
Very cool. Look these guys up, Giovanni Rossi of Electra.
We'll be right back. You're listening to Inside Analysis.
Speaker 3 (18:37):
Welcome back to Inside Analysis. Here's your host, Eric Kavanaugh.
Speaker 4 (18:45):
All right, folks back here on Inside Analysis. Your host here,
Eric Kavanaugh, and I'm very pleased to have another guest
with us today. Timothy Baines is the founder of a
company called Compatio, doing really interesting stuff they've built a
whole platform for you could say configure price quote. Really
it's a platform for managing compatibility hence the name of
(19:07):
different products. So think electrical engineering or building materials, or
anytime you need to bring a bunch of big pieces
and parts together to create some whole like I think
a car, for example, or any kind of complex machinery.
You need to know are these parts compatible, do they
work together? And then understand pricing and be able to
(19:28):
figure out what to charge for these things. And they
have a whole platform that does that, and they're focused
on a couple of markets, but they can do all
kinds of different stuff. But Timothy, what really struck my
interest here is you mentioned this taxonomy, and if you would,
can you tell us what you mean by a taxonomy
and how does that fit into what Compatio does?
Speaker 7 (19:48):
Yeah, sure, Eric, So, first of all, I think it's
important to understand that at the core, at the heart
of our system is essentially a knowledge graph, and that
knowledge graph sits on top of a decision taxonomy that
within a given industry, let's say electrical switch gear for example,
or industrial automation components, we have a very precise set
(20:11):
of categories as far as what components are what, along
with very high quality and very carefully engineered attributes on
the products as well as product catalogs for different manufacturers.
So together the taxonomy which is the categories and the attributes,
(20:33):
as well as the rules on what the attributes are
allowed to be, what data type, what is the range
or even what is the picklist of values, that comprises
the taxonomy, and then that gets hooked up to catalogs
of products, and together those feed into our larger sort
of layers of logic that is essentially a knowledge graph.
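To make the taxonomy idea concrete, here is a minimal sketch of one category with attribute rules (data type, picklist, range) validating a catalog item. The category, attribute names and values are invented for illustration and are not Compatio's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class AttributeRule:
    name: str
    dtype: type
    allowed: set = field(default_factory=set)   # optional picklist of values
    value_range: tuple | None = None            # optional (min, max)

    def validate(self, value) -> bool:
        if not isinstance(value, self.dtype):
            return False
        if self.allowed and value not in self.allowed:
            return False
        if self.value_range and not (self.value_range[0] <= value <= self.value_range[1]):
            return False
        return True

# One category of the taxonomy: contactors within electrical switchgear.
contactor_rules = [
    AttributeRule("coil_voltage_v", int, allowed={24, 120, 240}),
    AttributeRule("rated_current_a", int, value_range=(9, 800)),
    AttributeRule("poles", int, allowed={3, 4}),
]

# A manufacturer catalog item is checked against the category's rules.
catalog_item = {"coil_voltage_v": 120, "rated_current_a": 40, "poles": 3}
errors = [r.name for r in contactor_rules if not r.validate(catalog_item.get(r.name))]
print("valid" if not errors else f"invalid attributes: {errors}")
```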
Speaker 5 (20:53):
Yeah.
Speaker 4 (20:54):
And the reason why that's important for our audience out
there listening is that when you have this graph and
you have this ontology, you are cataloging all these myriad
parts of which there can be millions of different ones,
in such a way as to facilitate the alignment over time.
(21:14):
Because if you were to try to manually put those
into just a traditional relational database or something, first of all,
it would take forever and a day, and second of all,
it would not be performant at all, it would be
very very slow, very painful. So by going this route
of having a graph, a knowledge graph substrate with ontologies
for particular industries or disciplines, if you will, you're really
(21:37):
facilitating the end use of this whole system to allow
people to compare and contrast and figure out do these
parts work together or not. That's very very clever. And
you built the graph yourself, you populated this graph. Can
you talk about some examples of how the graph and
the ontologies facilitate what your end users want?
Speaker 7 (21:58):
Yeah, absolutely, so if you think about well, if you
think about any type of product that's available at scale,
sort of on a large e commerce site, let's say
all products need to have effective attribution and descriptions to
power search. So that's kind of a fundamental use case.
(22:18):
But when we get into more complex verticals, more complex
product domains, where the attributes the specifications of the products
need to be very very precise, you need an entirely
different level of quality and accuracy on those attributes. And
the use cases that that powers are things like search.
(22:40):
But you know search, including parametric search, right, so you
can filter down and find exactly what you want, so
any kind of discovery. But then there's the higher-order logic
that that also enables. In the domains where we
operate, the products are complicated. Finding the right product for
(23:03):
your specific application. Even just a single component can be
non trivial, and oftentimes you may need to talk with
an expert, let's say, that can help you figure out
what the right product is. And then if you think
about a product that is part of a larger system,
an industrial automation system with a motor and a motor
controller and a relay and so forth, that all need
(23:25):
to go together. To be able to identify those components,
number one, that fit your application and, number two,
go together, that's non-trivial, and historically that task has
been handled by experts, folks with ten twenty thirty forty
years of experience in that industry, and you'd walk to
(23:46):
the counter and a distributor, a dealer, a store, or
they'd come to your plant and they'd have all the
expertise that's needed to help you speck it out. But
things are going online. Obviously, digital commerce is expanding, and
we also are seeing some fairly significant turnover in the
industry in terms of the labor force, where a lot
(24:06):
of the folks that came in twenty, thirty, forty years ago,
they're retiring, they're baby boomers and they're rolling out, and
what the companies that we're working with are
telling us is it's hard to find the right people, and
it's hard to train them, and it's hard to retain them.
So we see really an emerging and massive knowledge
gap coming, and as digital commerce becomes more and more prominent,
(24:32):
Combined with this problem of expertise, it makes it very
difficult to do these kinds of things: to find the
right products to fit your application, make sure they go together,
and then price them and quote them and get them
out the door. So we're addressing all of those use cases.
Speaker 4 (24:48):
Yeah, it's really impressive. And when you talk about industrial
engineering the machines that make machines, basically you have years
on these things. Different models and things change year over year.
I mean, I'll use the metaphor of an automobile just
because most people have an automobile or at least know
someone who has an automobile, and depending upon the year
of that make, you'll get different parts for different functions,
(25:12):
like whether it's to run something in the engine or
in the electrical system or whatever, and it really matters
to get the right thing. I've been one of those
do-it-yourselfers; I've bought some products I thought
were going to fix it, and they didn't fit the car.
I was like, oh man, because it was the wrong year.
So it's really important to have the year right and the
make and the model. And then all those attributes you talk
about are basically edges on the graph, right, So you're
(25:36):
able to capture all that and then maintain some history
and thus facilitate finding that one specific part that you
need to fix this machine.
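To illustrate the point about attributes and compatibility living as edges on the graph, here is a tiny sketch where declared fits-with relationships are edges and the lookup is a neighbour query; the products and relationships are invented for the example, not from Compatio's catalog.

```python
# Tiny compatibility graph: nodes are products, edges say "fits with".
# Products and relationships below are invented for illustration.
from collections import defaultdict

declared_fits = [
    ("motor_M300_2021", "controller_C7"),
    ("motor_M300_2022", "controller_C8"),
    ("controller_C7", "relay_R12"),
    ("controller_C8", "relay_R12"),
]

graph = defaultdict(set)
for a, b in declared_fits:
    graph[a].add(b)
    graph[b].add(a)          # compatibility is declared both ways

def fits_with(product: str) -> set[str]:
    """Direct compatibility lookup: compatibility is an edge, not a
    transitive property, so only declared neighbours are returned."""
    return graph[product]

print(fits_with("motor_M300_2022"))   # {'controller_C8'} -- not the 2021 controller
print(fits_with("relay_R12"))         # declared to work with both controllers
```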
Speaker 5 (25:43):
Right, Yeah, that's right.
Speaker 7 (25:45):
And you know, taking a car, an automobile, as
an example, I mean that data is generally very carefully
maintained and curated by the manufacturer of that car. I
mean, they're even sometimes legally required to maintain
that data for many types of things, for example liability
(26:06):
reasons and so forth. But where it gets really interesting
is where you, let's say, again keeping our focus on
a car, you get out to what goes on the
car or around the car, or attaches to the car,
or accessorizes the car. That data is not maintained by
the manufacturer. So when you get into these multi brand situations,
something that goes on or with something else, you're in
(26:30):
a space that really nobody owns. Right now, where nobody
owns that data what goes with what? Across brands, and
in the industries where we operate electrical automation, building materials,
and so forth, those problems are everywhere. Everywhere, folks are
trying to figure out what goes with what, especially when
it's cross brand, but even within a brand, it's still challenging. So, yeah,
(26:53):
the problem is huge, and the markets are huge. The
companies in the industries that have these kinds of issues
are huge. We're talking trillions of dollars worth of annual
revenues in these industries and they all suffer from these problems.
Speaker 4 (27:07):
Yeah, well you can just imagine. I mean, I've studied
supply chain through SAP and some other companies, and it
gets incredibly complicated because some of these companies will have
hundreds of thousands, even millions of SKUs and so you know,
just from a data management and storage perspective, that's difficult.
Then enabling search is difficult. Then enabling search for your
(27:29):
partners and for your clients, which is also very important
because you want them to be able to find the
bits and pieces, the products, the component parts they need,
and all of that is facilitated by this compatible platform right.
Speaker 5 (27:43):
That's right.
Speaker 7 (27:43):
Yeah, Search and discovery, search discovery, compatibility and configuration.
Speaker 4 (27:49):
And then right once you can search and align and
determine compatibility, then you get into things like pricing and
configure price quote, which is another huge part of the industry.
Knowing what you can charge for something, Knowing what the
margins are that your competitors are putting out in the marketplace,
that you're putting out there, finding the balance amongst all
(28:12):
those things. I mean, that's a full time job for people.
And you still have to take guesses, right, I mean,
you're going to figure out all your prices, You're going
to configure and figure out Okay, I think I can
get this price, and you send it out there and
hopefully they bite, and then you go ahead and go
through this, and you have to be able to go
back and audit that and report on it and figure
out where's the margin because you've got to make money, right,
Everyone has to make money at the end of the day.
(28:34):
So you're helping facilitate these transactions. You're helping make sure
that companies are getting the parts that they need to get,
and you're also I think enabling more of that sharpening
of the pencil basically, because once you know what the
margins are, and once you know what the reliability of
the providers are as well, then you can make these
(28:56):
informed decisions. Then you can kind of understand where the
margin is and that's kind of how you grow your business.
But folks, so Compatio, C O M P A T
I O, is it Compatio dot com? Is that right? Dot ai, Compatio
dot ai. Well, Timothy, congratulations on building this platform. This
is a pretty seriously big deal and you're really greasing
(29:17):
the tracks. I love that you've got a taxonomy
or taxonomies ontologies basically and a knowledge graph. That's the
way to do it. That's what our audience loves to
hear about. So thanks for your time today. Look these
folks up online, ladies and gentlemen: Compatio dot ai and
Timothy Baines. We'll talk to you next time. You've been
listening to Inside Analysis.
Speaker 3 (29:44):
Welcome back to Inside Analysis. Here's your host, Eric Kavanaugh.
Speaker 4 (29:51):
All right, folks back here on Inside Analysis. Your host here,
Eric Kavanaugh. And today I'm with Abby Ross of a
company called Hydrolix. It's H y d r o l i x, and Hydrolix is a
very interesting company that is filling a significant gap that
is enabling really powerful cloud content delivery networks and other
(30:12):
companies to do real time analysis of observability at scale.
And why does that matter. It matters for a lot
of different reasons. It matters because these systems are very
complex these days. You can imagine Netflix, for example, the
amount of data that's flying out all these various sources,
all these various data centers to get to your TV
(30:33):
so you can watch all these movies. That's a lot
of data. So when you're trying to maintain these systems,
you're looking for all kinds of things. And when you're
trying to do that sort of industrial grade analytics on
systems performance and management, well there are lots and lots
of metrics that you can look for these days. And
obviously the Kubernetes movement has played deep into this whole
(30:55):
space, now spinning up amazing amounts of compute, federating compute basically,
so really enabling this tremendous scalability of information systems, of
enterprise-grade information systems. But also it opened all sorts
of Pandora's boxes on data and on what's actually happening
under the covers, so to monitor that and to watch
(31:15):
not just for bad guys, but also for performance issues,
anything that stumbles and starts causing trouble to users. You
need to absorb tons and tons of data from a
lot of different systems and be able to coalesce that
and analyze it in fairly real time. And that's
the kind of stuff that Hydrolix does. So they call
themselves a bit of a streaming data lake company, but
(31:37):
also headless observability. And I love this headless concept. I'm
hearing this more and more and I really like it
because what it kind of implies is, look, we're giving
you an engine and you can put whatever covering you
want on it that suits your particular needs or your
environment to use this powerful engine to get stuff done.
Speaker 8 (31:56):
Right.
Speaker 4 (31:56):
What do you think?
Speaker 9 (31:58):
Yeah, I mean it's like mister potato head.
Speaker 10 (32:00):
You can flip them, you can, you know, put on
different hats, put on different eyes, it's whatever you want
on the top. But you know the bottom, right? But no,
it's essentially the same thing.
Speaker 5 (32:10):
Right.
Speaker 10 (32:10):
So with Hydrolix, our software, anyone can plug in any
type of dashboard.
Speaker 9 (32:15):
It comes with pre-built Grafana dashboards.
Speaker 10 (32:18):
However, if somebody wants to keep their Splunk instance, if
somebody wants to keep any kind of front end dashboard
that they have, that's totally fine. We have connectors so
you can put any front end analytics dashboard on top
of our back end and still get the benefits of
less money, especially for storage, longer data retention, always hot data,
for rapid querying and ingesting massive amounts of data at
(32:39):
least a terabyte a month in real time.
Speaker 4 (32:42):
Well yeah, and that real time nature is crucial. You
were telling us a story before we hit the record
button about a particular use case where, on Black Friday, a
real retailer was able to discover a problem as it
was happening. Some kind of a hack was going on;
they found it, discovered it, nullified it. Nobody even noticed on
the front end, right?
Speaker 10 (33:04):
Yeah, this was a large retailer global, It was during
Black Friday week and there was a massive DDoS attack
and it spanned eighty countries and three thousand IP addresses.
Speaker 4 (33:14):
Wow.
Speaker 10 (33:15):
TrafficPeak, the managed observability service that Akamai built
using our software at Hydrolix, pinpointed this attack instantly, so
the company was able to mitigate it in real time
before any of their customers even noticed anything. Wow, So
it really was instrumental for them to maintain business.
Speaker 4 (33:36):
Yeah, that's amazing. And you think about what it takes
to be able to ingest all that data, and that's
a big key to what you're offering, right is number one,
hyper scale ingestion of data. So you can tap into
any number of systems, industrial grade systems, cloud infrastructure basically,
and you are very good at pulling in data very quickly,
(33:57):
but then also enabling the analysis of that data. And
that's all in Hydrolix, right? Though you do also, as
you suggest, you're headless, so you can work in conjunction
with any number of analytics tools.
Speaker 9 (34:08):
Right, that's absolutely true.
Speaker 10 (34:10):
We actually just ran one of the largest sporting events
in the country and at the peak viewing time, we
were ingesting more than a petabyte of data an hour.
Speaker 9 (34:24):
Wow, I'm sorry, not an hour, more than a petabyte.
Speaker 10 (34:27):
Let me repeat that: at peak viewing time, we
were ingesting more than a petabyte of data a day, okay,
which is enormous. I mean, that's monster data, right,
and a lot of times these legacy systems,
because they were not built for the modern architecture of
today will crash or slow.
Speaker 9 (34:45):
Down when ingesting that amount of data.
Speaker 10 (34:48):
But for us a petabyte a day, nothing crashed, nothing
slowed down.
Speaker 9 (34:53):
We take it in in real time. We alert on.
Speaker 10 (34:56):
Ingests, so we can show incidents in real time. You
can query in three seconds, six seconds, super super fast,
and then you can go fix the problem before anyone notices.
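A toy sketch of the alert-on-ingest idea described above: evaluate an alert rule over a sliding window as each event arrives, instead of batch-loading first and querying later. This is an illustration of the concept, not Hydrolix's actual pipeline or API; the window size and threshold are assumptions.

```python
from collections import deque
from typing import Iterable

WINDOW = 1000                 # events per sliding window
ERROR_RATE_THRESHOLD = 0.05   # alert when >5% of recent requests are 5xx

def ingest(events: Iterable[dict]) -> None:
    """Evaluate the alert rule on every arriving event."""
    window = deque(maxlen=WINDOW)
    alerting = False
    for event in events:
        window.append(event)              # stored compressed in a real system
        if len(window) < WINDOW:
            continue
        rate = sum(e["status"] >= 500 for e in window) / WINDOW
        if rate > ERROR_RATE_THRESHOLD and not alerting:
            alerting = True
            print(f"ALERT: {rate:.1%} error rate at event {event['ts']}")
        elif rate <= ERROR_RATE_THRESHOLD:
            alerting = False

# Synthetic stream: roughly 8% of requests fail, which trips the alert.
ingest({"ts": i, "status": 500 if i % 12 == 0 else 200} for i in range(3000))
```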
Speaker 4 (35:08):
Yeah. Well, and you also you do some clever things
on the storage side and some clever things on data reduction. Basically,
so you really were purpose built for this kind of
use case, right, I mean your executive suite they saw
this coming, they saw a need for this, and then
purpose built this engine to be able to fill this
particular use case.
Speaker 5 (35:29):
Right.
Speaker 10 (35:29):
Yes, I mean it's two problems, right. It's the fact
that the amount of data is growing so much each
year, it's supposed to increase by forty percent year over year,
that's what an analyst recently told us. And also
it's the cost. I mean, keeping data like this.
Speaker 9 (35:45):
Costs a fortune. So we wanted to solve both of
those problems.
Speaker 10 (35:49):
The ability to keep all your data no matter how
large the data set, and also make it affordable.
Speaker 9 (35:55):
So that's what we're doing.
Speaker 10 (35:56):
And the reason why, one of the biggest reasons, especially with
data retention, that we can make it affordable,
is that we have a twenty-five to fifty times compression
ratio, so we make that huge data set very, very
small and easy to keep and manage.
Speaker 4 (36:09):
Yeah, now that's clever. What's interesting is there are lots
of different things that you can do from an architectural
perspective to optimize for certain goals, for certain objectives, and
this compression capability you have, that's why you're able to
enable a much more cost effective storage layer, right.
Speaker 10 (36:28):
Correct, And we have decoupled architecture, so compute and storage
can be scaled up or scaled down separately.
Speaker 4 (36:35):
Right, which is exactly what Snowflake did, right. I mean,
I think they were the poster children for separating compute
from storage. And everyone's like, oh, that's a pretty good idea.
Let's go ahead and run with that, right.
Speaker 5 (36:45):
Yeah.
Speaker 10 (36:45):
In fact, in that largest football game, we were able
to save the customer compute power because they didn't you know,
when the peak viewing time dropped, they didn't need all
that infrastructure, right, It scaled down really easily while still
keeping the.
Speaker 4 (36:57):
storage. Well, you know, note to myself here, you talk about
TrafficPeak, which is a cool name, by the way,
and that's the solution which Akamai built using your technology, right,
which they're now selling to their clients as well. Is
that correct?
Speaker 10 (37:10):
Yes, we have... so I lead partner marketing for Hydrolix,
and we have very strong partnerships with Akamai and AWS
and others, but our software can run on any cloud platform.
Speaker 9 (37:21):
Akamai is a huge.
Speaker 10 (37:22):
Success story for us because they have been so supportive
of the partnership from every level, from product sales to marketing.
So when we launched TrafficPeak in twenty twenty three,
which is the managed observability service that Akamai built using
our software, we had thirty three customers in the beginning.
Speaker 9 (37:39):
In one year, that number shot up to more than
three hundred and seventy customers.
Speaker 10 (37:43):
That's great, and you know, it's a lot because the
product speaks for itself, but also because we have such
a collaborative relationship with Akamai and they're so supportive on
all ends, and we're so supportive of them on all ends.
So it's really been a successful partnership. And they just
named us the North America Qualified Compute Partner of
the Year, which is great.
Speaker 4 (38:04):
And the thing to understand for from a broader business
audience perspective is that we are seeing tremendous advances in
all sorts of different spaces, and companies can come along
like Hydrolix and build this engine, purpose-build this engine
for massive ingest, for real-time analysis. And when
you do that, you're head and shoulders above some of
(38:26):
the older legacy systems that just weren't designed to do
that kind of thing. And you know there are a lot,
you know, so years ago I had doctor Michael Stonebreaker
on the show. I don't know if you recognize the name,
but he's the godfather of the modern database. Basically, he
wrote the Postgress database like fifty years ago and has
spun up several companies, including Vertica, which in its time
in two thousand and five was all about real time analytics,
(38:48):
was all about column oriented analytics. It was a new
crazy thing back then. But he had this one fun
quote on a show one day where he said, the
funny thing is that people should realize eighty percent of
the code that's been written in the world should just
be thrown away at this point, because it was
written for older systems. And so if you think about
(39:09):
spinning disc for example, versus solid state drive, well, it's
a very different world if you're using SSDs in terms
of that architecture and how fast you can move things.
And now there are all sorts of companies using the
NVMe protocol to move data much faster than it's
ever been done before. And so I'm saying all this
Abby, as a way to kind of tee you up,
to explain to the audience, like this is you can't
(39:32):
just upgrade your existing SQL server instance to get this
kind of performance. I mean, this is a whole new
engine that's been built for that specific purpose.
Speaker 10 (39:40):
Right yeah, I mean that would be like creating mister
potato Head out of wood.
Speaker 4 (39:45):
I mean it's like mister potato head.
Speaker 9 (39:50):
I forgot, is this cancel culture?
Speaker 10 (39:53):
But anyway, you know, these legacy systems were built in
a certain way to handle data in a certain way,
and what they didn't expect was for that data to
get so big that it's literally, like, bursting at the seams.
Speaker 9 (40:06):
Right right of these legacy systems.
Speaker 10 (40:08):
And so that's why we need new observability services that
can handle this modern architecture of all these micro services dispersed.
Speaker 9 (40:16):
Everywhere that produce so much data.
Speaker 10 (40:19):
I mean, just one application produces so much data because
all of those.
Speaker 9 (40:22):
Components that go into that app.
Speaker 10 (40:24):
And so that's exactly why Hydrolix exists today to solve
for that.
Speaker 4 (40:29):
Well, look these folks up online, folks, Abby Ross of Hydrolix.
Very cool technology. You are listening to Inside Analysis.
Speaker 1 (40:36):
We expected.
Speaker 4 (40:39):
All right, folks back on the show here and now
we're talking to Kenneth Stott. He is the field CTO
from a company called Hasura, and they have a very
interesting play for reporting, specifically for regulatory reporting, and they
have a graph-like approach to things. They use GraphQL,
and it's sort of the last mile of data.
And I really liked that concept that you
(41:00):
threw out there that it's the last mile of data.
And you can understand why that's important, especially for regulatory reporting,
because you've got all these different information systems that people
are using, whether you're in financial services or healthcare, and
at the end of the day, you have to deliver
these reports to people and if the numbers don't line up,
that doesn't look good. That's when the auditors say
no bueno, you have to go back to the drawing board.
(41:22):
So the question is, where do you
sort of reconcile all these different reports? And that's what
Hasura does in this sort of supergraph layer. Right? Tell
us about that and how it works.
Speaker 2 (41:33):
Sure, so I would describe it as a semantic layer.
What I mean by that is the
individual teams that create the data and own that
data publish it in semantic terms into a data access layer,
(41:57):
and then that data access layer validates all of the
relationships that they've described within the information that
they're publishing, so that you can get across all the
teams one consistent view of your data in business terms.
And then that same data access layer has a
(42:17):
rich set of services to kind of discourage building new
capabilities downstream, because oftentimes those new capabilities that they build
downstream are where these problems start to arise, right? So
if they're cleaning data downstream, if they're making additional transformations,
this is often where it starts to go wrong. And
(42:40):
the other thing I want to say about this last
mile of data is this is the most consequential part, right,
This is where people in the c suite make key
decisions that have big impact on an organization, or where
regulators pick up really meaningful differences that can drive you know,
(43:00):
a lot of pain for an organization. Putting focus on
this really pays off.
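To illustrate the "publish in business terms, consume through one data access layer" idea, here is a minimal sketch of a downstream report querying a GraphQL-style supergraph instead of re-joining raw tables itself. The endpoint, types and field names are invented for the example and are not Hasura's actual schema.

```python
# A downstream regulatory report asks the semantic/supergraph layer for
# business entities and relationships, rather than cleaning and joining
# raw tables itself.  Endpoint and schema below are hypothetical.
import json
import urllib.request

QUERY = """
query RegulatoryPositions($asOf: date!) {
  counterparty {
    legalName
    riskRating
    positions(where: {valuationDate: {_eq: $asOf}}) {
      instrumentId
      notional
      currency
    }
  }
}
"""

def fetch_positions(endpoint: str, as_of: str) -> dict:
    payload = json.dumps({"query": QUERY, "variables": {"asOf": as_of}}).encode()
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example call against an internal endpoint (hypothetical URL):
# fetch_positions("https://example.internal/graphql", "2025-03-31")
```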
Speaker 4 (43:07):
Yeah, that makes a lot of sense. And you know,
I'm from the whole world of business intelligence and analytics
and reporting, and you know, we came up with the
data warehouse concept as a way of marshaling all the
key data points into one relational database which we can
then report from. So I get that. But to your point,
a data warehouse isn't the only system of record. You
have lots of other systems of record that these organizations
(43:30):
are focused on to shepherd through. And if you have
this last mile, this semantic layer, or you're tracking these things,
you can get a much better handle on it. And
I like the way you say it's got some built
in capability to discourage downstream modifications, right, Yeah.
Speaker 2 (43:47):
Yeah, absolutely, data warehouses, you know, absolutely have to be
part of the landscape going forward. There's a need to
take data in and develop different analytics from it. But
centralization has limits, and analytical data isn't the only thing
(44:10):
that's required for these sorts of dashboards and reports. In
this last mile of data, you need both operational data,
you need analytical data, and so all of these things
ultimately become the data products that flow into this semantic layer.
There are some amazing things you can do once you
(44:30):
have visibility to all your data as it's in motion
at this last mile. Another thing: I've studied
a lot of quantitative studies around where things go wrong.
Oftentimes anomalies have to do with people combining data across
(44:53):
disparate domains, and you can do things like anomaly detection
at that level, right, and you can say, this really
doesn't make sense the way you're combining this data to
produce some output.
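A toy version of the cross-domain check being described: the same business figure computed from two domains should agree within a tolerance before it reaches a report. The figures and the tolerance below are invented for illustration.

```python
# Toy reconciliation check in the spirit of "anomaly detection at the
# last mile": a figure computed from the operational domain and from
# the analytical warehouse should agree before it is published.
operational_totals = {"EMEA": 1_204_500.00, "APAC": 980_250.50, "AMER": 2_310_000.00}
warehouse_totals   = {"EMEA": 1_204_500.00, "APAC": 991_113.25, "AMER": 2_310_000.00}
TOLERANCE = 0.001   # 0.1% relative difference allowed

for region in operational_totals:
    op, dw = operational_totals[region], warehouse_totals[region]
    diff = abs(op - dw) / max(abs(op), 1e-9)
    status = "OK" if diff <= TOLERANCE else "RECONCILE BEFORE PUBLISHING"
    print(f"{region}: operational={op:,.2f} warehouse={dw:,.2f} ({diff:.2%}) {status}")
```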
Speaker 4 (45:07):
Yeah, that makes a whole heck of a lot of sense.
And it is the reconciliation layer. It's the final proof really,
and again you've sort of baked in here some of
the formulae to be able to identify I mean, I
won't say it's exactly like Master Data Management, but there
is a flavor of MDM. There's a flavor of reconciling
(45:29):
disparate information systems. And I like too that you're not
just looking at analytical data from a warehouse, you're actually
pulling in operational data. As a sort of reality check, right.
Speaker 2 (45:39):
Yeah, yeah, absolutely, there are also upstream benefits, right, because
what often happens in those data warehouses, those analytical environments
is they start trying to serve a lot of different
stakeholders and they start pulling in data from their peers,
and you end up with a lot of cross sharing
(46:01):
of data in ways that become problematic. So I've also
studied the cost of this sort of ball
of yarn that we generally create in these environments, and
my take on it is about two thirds of data
(46:21):
movement and data storage is wasted. If you could optimize,
you could get rid of two thirds of that cost.
And by the way, costs have been going up ten
percent per year for the last three years, they're projected
to go up again. And if you look at data
maturity statistics, they're flat. So we're spending all this money, right,
(46:43):
but it's not getting that much better.
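The cost claims above can be made concrete with a little arithmetic; the baseline spend below is an assumption, while the ten percent annual growth and two-thirds waste figures come from the conversation.

```python
# Worked arithmetic: 10% annual cost growth over three years, and the
# potential saving if two thirds of data movement/storage is avoidable.
baseline_spend = 1_000_000          # assumed annual data-platform spend
growth = 1.10

spend_now = baseline_spend * growth ** 3
wasted_share = 2 / 3
print(f"Spend after three years of 10% growth: ${spend_now:,.0f} "
      f"({spend_now / baseline_spend - 1:.0%} higher)")
print(f"Avoidable portion at two-thirds waste: ${spend_now * wasted_share:,.0f}")
```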
Speaker 4 (46:45):
You hit the nail on the head. And it's funny
you would say that because quite literally, DM Radio. Let's see,
we turned seventeen years old yesterday. That's how long we've
been doing these shows, and in the earliest days in
the first year, I was trying to wrap my head
around all the ETL that's being done to fill the
data warehouse and thinking to myself, this is crazy. You
(47:06):
guys are clearly moving the same data multiple times. You're
probably not even using eighty percent of the data that
you're moving. Why are you moving it at all? And it's
because of how we got here. It's because way back
thirty years ago, when we were beginning to really mature
data warehousing architectures, people were just catching on and you're like, oh, well,
I want this data in the warehouse. I want that
data in the warehouse. So you get these batch windows
(47:28):
that stack up and it's almost like an archaeological dig now, right,
you have just layers and layers and layers of data movement,
and the question becomes, what do we really need to do?
And what you're saying is that with this supergraph layer
that you've got managing semantics, if people work with you appropriately,
you can solve problems upstream, meaning you don't have to
move as much data now, and you can also solve
(47:49):
a lot of problems downstream. And that's what goes to
the executives to make the big decisions, right.
Speaker 2 (47:54):
Yeah. So the idea is that the upstreams focus, I
would say, on core data sets. This is
information I own, I own the definition, combining, you know,
both the tech and the business teams together. They don't
get into serving sort of ad hoc requests because somebody
(48:18):
randomly connected with them.
Speaker 9 (48:20):
Right.
Speaker 2 (48:21):
What happens then is the supergraph layer is sort of
the place where those ad hoc requests get satisfied. And
that's why the upstreams stop all this sort of duplicate,
you know, sharing of data because they're very focused on
just what they need to do and the downstreams. The
(48:41):
downstreams get their needs satisfied more quickly because they're able
to see the data, they're able to composite it into
their narrow use cases for their needs.
Speaker 5 (48:51):
Mm hmm.
Speaker 4 (48:52):
We got about two minutes, a little over two minutes
left here. You had mentioned before the call that you
are containerized, right, like, so you run in a Kubernetes environment.
Is that correct?
Speaker 2 (49:03):
Yeah? So our particular product works in all the major
cloud vendors of course, and you can do it as
a SaaS offering, but it also works in a Kubernetes cluster,
you can do it on prem and particularly think about
healthcare and finance. There's a lot of consternation around public
cloud and so forth, very important that you can do
(49:23):
on prem when people have serious security concerns.
Speaker 4 (49:28):
Yeah. Well, I mean I have to say I think
that this focus on the last mile is really crucial,
and I love that you're also helping upstream by satisfying
a lot of these ad hoc style queries. I mean,
the bottom line is that once you get analytics in
an organization, people are going to want to know things,
and they're going to want to run queries and they're
going to want to play with stuff. And the better
(49:48):
you can shepherd those processes, the more governance you get,
the more quality you get at the end of the day,
and the lower the cost, and you manage the cost as well.
Right? Closing thoughts from you? Go ahead.
Speaker 2 (50:00):
Yeah, absolutely, I mean I couldn't agree more. The benefits
are amazing. I'll say one last thing. I don't think
it's the last thing you do. You don't get your
house in order and then put this layer on top.
Put this layer on top for current states so you
can see what the heck is going on then fix
(50:21):
it surgically.
Speaker 4 (50:23):
Yeah, that's an excellent point as well. I mean it's
you're talking about data observability, right. Everyone's into observability these days,
into systems and you know, and tracers and logs and
all these things. And I think communities is a part
of that too, because it just opened up this whole
can of worms about how to distribute process and everyone's like, whoa, now,
hold on, how do we manage that? How do
(50:43):
we understand what it's doing because it's doing things very
very quickly. So but this is a great focus area
and it's a layer of abstraction, as you suggest, the
last mile, and I like what you say: understand what's
happening now and then come up with the plan to fix
things both upstream and downstream. But look this gentleman up
online, Kenneth Stott of Hasura, h A s U r
(51:06):
a, correct. All right, folks, you are listening to Inside Analysis.
Speaker 11 (51:11):
Now.
Speaker 12 (51:12):
You can listen to KCAA radio anytime on
your smartphone device. Call seven two oh A three five
three oh nine nine, seven two oh A three five
three oh nine nine. KCAA.
Speaker 11 (51:26):
Your life's much better, so download the app on your
smart device today. Listen everywhere and anywhere, whether you're in
Southern California, Texas, or sailing on the Gulf of Mexico.
Life's a breeze with KCAA. Download the app on your smart
device today.
Speaker 10 (51:43):
Ah B.
Speaker 13 (51:46):
Yesterday in the basic.
Speaker 10 (51:57):
KCAA.
Speaker 14 (52:01):
What is your plan for your beneficiaries to manage your
final expenses when you pass away? Life insurance, annuities, bank accounts,
investment accounts all require a death certificate, which takes ten days based
on the national average, which means no money is immediately available,
and this causes stress and arguments. Simple solution: the beneficiary
(52:26):
liquidity plan. Use money you already have, no need to
come up with additional funds. The funds grow tax deferred
and pass tax free to your named beneficiaries. The death
benefit is paid out in twenty four to forty eight
hours without a death certificate.
Speaker 6 (52:43):
Need the money without.
Speaker 14 (52:45):
A death certificate? Call us at one eight hundred three zero
six fifty eighty six.
Speaker 15 (52:51):
Taheebo Tea Club's original pure powdered Pau d'Arco super
tea comes from the only tree in the world that
fungus does not grow on. As a result, it naturally
has antifungal, anti-infection, antiviral, antibacterial, anti-inflammation, and
anti parasite properties. So the tea is great for healthy
people because it helps build the immune system, and it
(53:11):
can be truly miraculous for someone fighting a potentially life
threatening disease due to an infection, diabetes, or cancer. The
tea is also organic and naturally caffeine free. A one
pound package of tea is forty nine ninety five,
which includes shipping. To order, please visit Taheebo Tea Club
dot com. Taheebo is spelled T like Tom, A,
H, E, E, B like boy, O. Then continue with the
(53:34):
word tea and then the word club. The complete website
is Taheebo Tea Club dot com, or call us at
eight one eight six one zero eight zero eight eight
Monday through Saturday nine am to five pm California time.
That's eight one eight six one zero eight zero eight
eight. Taheebo Tea Club dot com.
Speaker 16 (53:53):
Tune into the Farran Dozier Show as you mark the
place in time, the soundtrack to life, Sunday nights at
eight pm on KCAA Radio, playing the hottest hits and
the coolest conversations. Sunday nights at eight pm on the Farran
Dozier Show with an array of music, talk, sports, community outreach,
and veteran resources. With the hits from the sixties, seventies, eighties, nineties,
(54:19):
and today's hits. The Farran Dozier Show on KCAA Radio
on all available streaming platforms and on one oh six point
five FM and ten fifty AM. The Farran Dozier Show
on KCAA Radio.
Speaker 11 (54:49):
This important, time sensitive message is brought to you by
this station's generous sponsor, George Letzfield Associates, who has
important Medicare information for all current and future Medicare recipients
about some big changes happening. Medicare Clarified is a
nonprofit consumer service organization.
Speaker 13 (55:09):
It's more important than ever to review your Medicare plan
for twenty twenty five from October fifteenth through December seventh
to find out if you're in the right plan for you.
People are calling nine five one seven six nine zero
zero zero five nine five one seven six nine zero
zero zero five. A popular and local Medicare plan is improving.
(55:32):
Others are raising copays and adding deductibles, biggest changes in
the Medicare drug program in fifteen years.
Speaker 11 (55:40):
We thank George Letzfield and Letzfield Insurance for their
generous support of this radio station.
Speaker 17 (55:55):
Here's the KCAA community calendar for March. On most Friday nights,
it's open mic and karaoke in Riverside at Euryale Brewing Company,
located on Chicago Avenue beginning at seven pm, So grab
the mic and have some fun. In downtown Redlands on
Saturday mornings from nine am to one pm, it's the Downtown
Morning Market, located between Sixth and Eighth Street. On Saturday mornings,
(56:19):
in the Riverside Main Library on Mission Avenue, it's Family
Storytime every Saturday morning at nine thirty. For car enthusiasts,
here are a couple of options to enjoy cars and
coffee in downtown Claremont on the first Saturday of the month,
the Claremont Car Guys and Gals. They meet in the
village for coffee and cars at six thirty am. I
(56:40):
hope you're an early riser. In Corona, it's more cars
and coffee on Saturday mornings from seven am to nine
am at the IHOP on Wardlow Road. All years,
makes and models are welcome. This weekly meetup allows you
to enjoy some beautiful cars. Looking to do some household
hazardous cleanup? In Rialto, they offer a hazardous
(57:01):
waste drop off on March first, March fourteenth, fifteenth, and
June thirteenth from eight am till noon. The drop off
site is located at two forty six South Willow And
for those of you who love Pokemon in Redlands, there
is a board game paradise. Every Friday night at six
thirty pm is their weekly Pokemon tournament. All skill levels
(57:24):
are welcome, but you do need to have a full
understanding of the rules and how to play. Located at
one oh nine East State Street in Suite A in Redlands,
and that's the latest for the KCAA community calendar. I'm
Lillian Vasquez on KCAA ten fifty am and one oh
six point five FM.
Speaker 1 (57:46):
Located in the heart of San Bernardino, California, the Teamsters
Local nineteen thirty two Training Center is designed to train
workers for high demand, good paying jobs and various industries
throughout the Inland Empire. If you want a pathway
to a high paying job and the respect that comes
with a union contract, visit nineteen thirty two Trainingcenter dot
(58:10):
org to enroll today. That's nineteen thirty two Trainingcenter dot Org.
Speaker 8 (58:21):
NBC News Radio. I'm Chris Ragio. House Republicans are unveiling
a short term funding bill to keep the government running
through September.
Speaker 4 (58:28):
Details now from Lisa Carton.
Speaker 18 (58:30):
The six month plan includes cuts to non defense programs
and the IRS. The ninety nine page bill is in
the form of a continuing resolution known as a CR,
which Democrats strongly oppose. House leaders say the bill has
White House support ahead of Friday shutdown deadline. Speaker Mike
Johnson said this week he thinks Republicans will be able
to pass it along party lines, with a floor vote
(58:52):
expected as soon as Tuesday.
Speaker 8 (58:54):
The Trump administration is cutting four hundred million dollars in
grants for Columbia University over pro-Palestinian protests. Education
Secretary Linda McMahon defended the move, citing anti semitism at
New York City's Ivy League University. She said Jewish students
have faced relentless violence, intimidation, and harassment following the deadly
Hamas October seventh, twenty twenty three attack on Israel. Since
(59:15):
last spring, the school has been a hotbed of pro
Palestinian protests, which have often turned violent. An unusual standoff
in London today as a man carrying a Palestinian flag
has been perched on a ledge on Big Ben's Elizabeth.
Speaker 1 (59:27):
Tower for over ten hours.
Speaker 8 (59:29):
Metropolitan Police say they've received a report about the unidentified
man this morning, shortly before seven thirty London time. The
BBC reports the man posted a video of his climb
up and said he's protesting against police repression and state violence.
Area roads and tours of Parliament have been canceled as
emergency crews and police negotiators try to talk him down.
It's unclear if the climber is directly connected with pro
(59:51):
Palestinian protesters who were in the vicinity for a planned rally.
The Vatican says Pope Francis has shown a gradual slight
improvement over the last few days. Over the weekend, the
Vatican said the Pope's oxygen exchange has improved, he doesn't
have a fever, and blood tests remained stable. The eighty
eight year old went into a Rome hospital with