Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
(upbeat music)
- Hello, and welcome to "insight.tech Talk,"
where we explore the latest IoT, AI, edge,
and network technology trends and innovations.
I'm your host, Christina Cardoza,
(00:21):
Editorial Director of insight.tech,
and today we're talking to Quanergy's Gerald Becker
about advancements in 3D LiDAR.
Hey, Gerald, thanks for joining us.
- Thanks for having us.
- Before we jump into the conversation,
if you could just give us a brief background of yourself
and the company, that would be great.
- So, a little bit about myself, Gerald Becker.
I've been in the security space for a couple of decades now,
(00:41):
a little over 20 years,
and I've been in every facet of the industry,
from being an end user to systems integrator,
and most recently
I've been on the manufacturing side,
which is where I really found my stride.
I've been with Quanergy just shy of four years now.
Quanergy is a manufacturer of 3D LiDAR sensing hardware,
(01:03):
and we're also a developer of software.
We were founded in 2012
and we were one of the first LiDAR companies
to actually come out in the space commercially,
originally targeting autonomous vehicles, right?
The holy grail.
So we were one of the first companies to come out commercially
and be able to offer a turnkey solution to various markets,
(01:24):
which we'll go into here in a bit.
So really looking forward to this discussion.
Thank you, Christina.
- Yeah, absolutely.
And excited to jump in.
You mentioned the company's been around since 2012,
so obviously LiDAR, 3D LiDAR, it's not a new technology,
but I feel like it's been gaining
a lot more interest lately.
And you said it started in automated driving,
(01:44):
but now it's spanning across different industries
and different businesses.
So I'm just curious if we could start off
talking about what is 3D LiDAR exactly,
how does it go beyond automated cars,
and what are the pain points that businesses
are trying to solve with it today?
- This is the fun stuff, right?
So there's a lot of applications for LiDAR.
(02:05):
Predominantly everybody knows LiDAR being used
for automotive and robotics and stuff like that.
Also terrestrial mapping, right?
So putting these on drones and mapping environments
to understand, is there a pyramid
hidden behind this rainforest,
and stuff like that, right?
A lot of cool applications that have been out there
for years and years.
So LiDAR is absolutely not a new technology.
(02:26):
It's been around for decades, a very, very long time.
It's not until, I would say, the past 10 years
that we've really started going beyond
the comfort zone of what LiDAR can do.
So in my role within the organization,
I head up the physical security, smart space,
and smart city market sectors.
(02:46):
And with that being said, there's so much applicability
as far as what you could do with 3D LiDAR
in those three markets,
because they've always been confined to a 2D space,
like what we're seeing on this camera.
In those spaces,
they've always predominantly used radar, cameras,
and other types of IoT sensors that have always been
1D or 2D technologies.
(03:07):
But now with the advent of 3D technologies
and the integration ecosystem
that we've developed in the past few years,
we now provide so much more flexibility
to see beyond the norm, see beyond two dimensions,
see beyond what's been the common custom
of sensing in this space.
So for security, we're doing some very, very big things.
(03:27):
In security, because they've predominantly been using
radar, cameras, and video analytics,
3D sensing is now able to provide
additional capabilities where we provide depth,
where we provide volume,
but even more so in 360, with centimeter-level accuracy.
Now, what that does for security applications
is that it increases the TCO advantage
(03:49):
compared to all legacy technologies,
and it decreases the amount of false alarms,
so you can actually track and see what is real
and what isn't, right?
So with these legacy technologies,
anytime there's movement
or an analytic tracks
a potential breach or something like that,
it automatically starts triggering events
and sends them to the alert office to say,
(04:09):
"Hey, there's an alarm, there's an alarm!"
That's a big problem
when there's thousands and thousands of alarms coming in,
because the AI or the analytic,
the intelligent video, doesn't understand how to decipher,
"Hey, that's just an animal walking by.
It's not a perpetrator coming up to the fence."
So with our sensors,
we're able to provide 98% detection, tracking,
and classification accuracy in 3D spaces.
(04:31):
So when we marry up with other technologies
such as a PTZ camera,
where the camera may be focused on a specific zone,
our 3D LiDAR sensor sees that whole space,
and we detect an object.
We tell the camera, "Hey, camera, move over here
and keep tracking this object in this space."
Once again, with centimeter-level accuracy,
we're able to slew-to-cue cameras
and provide that intelligence to the security operations.
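The slew-to-cue idea described here can be sketched in a few lines: a tracked 3D position from the LiDAR is converted into pan/tilt angles and handed to a PTZ camera. This is a minimal illustration, not Quanergy's actual API; the `LidarTrack` and `PtzCamera` names and the coordinate convention are assumptions.

```python
# Hypothetical "slew-to-cue" sketch: a 3D LiDAR track hands its position
# to a PTZ camera so the camera keeps following the object.
import math
from dataclasses import dataclass

@dataclass
class LidarTrack:
    track_id: int
    x: float  # meters east of the camera
    y: float  # meters north of the camera
    z: float  # meters above camera height

class PtzCamera:
    """Toy PTZ camera that accepts absolute pan/tilt angles in degrees."""
    def __init__(self):
        self.pan = 0.0
        self.tilt = 0.0

    def slew_to(self, pan_deg: float, tilt_deg: float):
        self.pan, self.tilt = pan_deg, tilt_deg

def cue_camera(cam: PtzCamera, track: LidarTrack):
    """Convert a tracked 3D position into pan/tilt and slew the camera."""
    pan = math.degrees(math.atan2(track.x, track.y))   # bearing from north
    ground = math.hypot(track.x, track.y)              # ground-plane range
    tilt = math.degrees(math.atan2(track.z, ground))   # elevation angle
    cam.slew_to(pan, tilt)
    return pan, tilt

cam = PtzCamera()
pan, tilt = cue_camera(cam, LidarTrack(track_id=7, x=10.0, y=10.0, z=0.0))
print(round(pan, 1), round(tilt, 1))  # 45.0 0.0
```

A real deployment would drive the camera through a protocol such as ONVIF rather than a toy class, but the geometry is the same.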
(04:54):
From the flow management side,
which is more on the business intelligence side,
we're able to provide a higher-level, deeper understanding
of what's going on within spaces such as retail.
We can understand where consumers
are going through their journey, what path they're taking,
what products they're touching,
the queue lines, how long are they?
Even more so, much easier than you could
(05:16):
with traditional ways of doing it,
like camera-based or stereoscopic ways,
we're able to eliminate up to seven cameras
with one sensor of ours, to be able to give you
quite a bit of coverage in that space,
once again in 3D.
So instead of sticking a camera here, here, here
and stitching them all together,
you put one LiDAR sensor that gives you full 360,
and you're able to see that whole space
(05:37):
and see how people interact in these spaces
as they're touching or experiencing different...
You know, if it's at a theme park, to understand
what a person's doing in line,
or at a museum, to see how they're interacting
with this digital space.
We're able to provide so many cool outcomes
that you've just never been able to do
with 2D sensing technology.
So when you asked me, like, what is new
and what to do with 3D,
(05:58):
we've barely started tapping into the capabilities
of what you can do with 3D.
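One of the flow-management outcomes mentioned above, queue-line length, reduces to a simple test once a LiDAR tracker is reporting people's positions: count the tracks inside a defined queue zone. The zone coordinates and the track format here are made up for illustration.

```python
# Illustrative queue-length estimate from LiDAR-tracked (x, y) positions.
# Zone coordinates and the track format are assumptions, not a real API.

def queue_count(tracks, zone):
    """tracks: list of (x, y) in meters; zone: (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = zone
    return sum(1 for x, y in tracks if xmin <= x <= xmax and ymin <= y <= ymax)

checkout_zone = (0.0, 0.0, 2.0, 10.0)  # a 2 m x 10 m checkout lane
people = [(1.0, 0.5), (1.2, 2.0), (0.8, 4.5), (5.0, 1.0)]  # last is outside
print(queue_count(people, checkout_zone))  # 3
```

The same pattern (point-in-zone tests over a live track list) covers dwell time and path analysis by adding timestamps to each track.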
- Yeah, and everything that you're saying,
obviously that depth of dimension, you know,
and that 360 view,
that is something that's going to benefit businesses
and that you really want to strive for
to get the complete picture.
But like you said, it's not something that we've seen
businesses really utilizing until now.
So what's been happening in this space?
(06:19):
Have there been any recent advancements or innovations
that make this a little bit more accessible
to these types of businesses?
- So I think the biggest barrier to adoption
was the ecosystem of technology integrations, right?
So as I stated, a lot of these companies
have predominantly been going after automotive,
the holy grail,
and that's typically been through OEMs,
(06:39):
people that take the sensor, develop custom integrations,
and stick it into the hood or fender of a vehicle.
Now that's not what we've done.
So we've pivoted and we've gone after a different market,
where we've aligned with the who's who
from physical security: integration management platforms,
video management software solutions, cameras,
business intelligence, physical access control systems,
(07:00):
where they've used our sensors
and they've integrated our sensors into their platforms
to provide all these event-to-action workflows,
all these different outcomes
that have just not been available in the past, right?
So this is opening a whole new level of understanding
and all-new capabilities to solve old problems,
but even more so, new problems, right?
So what we've seen is, now that we've got the integrations
(07:24):
to all these tier-one partners in these spaces,
that's giving end customers and end users
the ability to now explore
how to solve old problems in different ways
and get higher levels of accuracy
that they've never been able to achieve before.
- Now, you've mentioned a lot of different technologies,
and, you know, these companies
who have been doing 2D sensing with their cameras
(07:45):
and their other sensors,
how can they now leverage
the existing infrastructure that they have
and add 3D LiDAR on top of it,
or work with Quanergy
with their existing infrastructure?
Or does it take a little bit more investment
in hardware and tooling
to be able to integrate some of these and get the benefits?
- Yes and no.
So as I stated in the previous question,
(08:06):
we do have a large ecosystem of technology partners
that we've integrated with.
So, I would say that 9 times out of 10, off the shelf,
we can integrate with a lot of the stuff
that's already out there,
but we're very fluid
in how we're able to work with partners.
You know, you can integrate with us
directly through your camera, through a VMS platform,
through our open API, or through third-party GPIO boxes,
(08:29):
which is basically nothing more than an Ethernet box
where we could push a command directly to it
and then activate a siren, alarm, or whatever it may be.
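The Ethernet-box path described here is usually just a short command pushed over a socket to a relay that closes a contact. The sketch below is entirely hypothetical: the host, port, and `RELAY n ON` command syntax are invented for illustration, since every real relay box has its own protocol.

```python
# Hypothetical event-to-action push to a third-party Ethernet relay box.
# Host, port, and command syntax are illustrative, not a real product's protocol.
import socket

def relay_command(channel: int, on: bool) -> bytes:
    """Build a plain-text relay command (made-up syntax for illustration)."""
    return f"RELAY {channel} {'ON' if on else 'OFF'}\n".encode()

def trigger_relay(host: str, port: int, command: bytes) -> None:
    """Open a TCP connection to the relay box and push the command."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(command)

# e.g., on an intrusion event: trigger_relay("192.0.2.10", 5000, relay_command(1, True))
print(relay_command(1, True))  # b'RELAY 1 ON\n'
```

The same handler shape applies to the camera, VMS, and API paths; only the transport and payload differ.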
So the other side of it, too,
is that we're not trying to go completely greenfield, right?
So I'm not trying to discount
a lot of the technologies out there,
but I will say a layered approach to any deployment
(08:49):
is probably your best route,
'cause there's not a single technology in the world
that can solve all use cases.
If someone sells you on that,
please turn around and run, because it just can't be done.
But when you put together
the best-of-breed solutions in your ecosystem
or in your deployment,
you're going to get the best outcome from every sensor.
(09:10):
Case in point with cameras:
we don't see like cameras do, right?
So we don't capture any personally identifiable information.
When I explain what LiDAR sees,
I always revert back to my favorite movie of all time,
"The Matrix."
Remember when Neo saw the ones and zeros
dropping from the sky when he saw Agent Smith down the hall?
That's how we see.
We don't see like cameras do,
where I could tell, Christina, you have glasses
(09:32):
and a white blouse on,
or that I have a black, you know, polo shirt on.
We can't see that.
To us, everything looks like a 3D silhouette
with depth and volume in 360.
Now, that's where we then partner
with 2D imaging technologies such as a camera,
like your Boschs, your Axis, your FLIRs, your Hanwhas,
big companies that help us see.
So when we do need to identify,
(09:53):
"Hey, there's a bad actor with a black polo
that's potentially going to break through this fence,"
that camera helps us decipher that.
But when you need to actually detect, track, and classify,
when you marry those technologies,
that's when you open up new outcomes
that you can't do with just a camera.
So for instance, when you use,
let's say, sort of traditional
pan-tilt-zoom auto tracking that's embedded on a camera,
(10:16):
they'll put a bounding box around the person,
and they'll track that person in the scene.
The issue with traditional 2D technology
and auto tracking that's embedded on the camera
is that when that person goes behind an object
or into another area,
the camera doesn't know what's happening;
it doesn't see what's going on in that environment.
But if you have enough of our lasers
shooting throughout the space,
and we're seeing up and down aisles, halls, parking spaces,
(10:38):
whatever that obstruction may be,
we're able to accurately detect the object,
and we tell the camera,
"Hey, camera, stay focused on this wall,
because we know the person is behind the wall."
Then when the person comes from behind the wall
and into the view of the camera,
we're still telling the camera, keep tracking that person.
That's Mr. Bad Guy.
So we go from wall to guy with a black shirt on,
(10:59):
and we're tracking him all throughout.
That's the beautiful thing about our solution, too:
we provide a mesh architecture, right?
So unlike having to stitch multiple technologies together
and track from scene to scene to scene to tile,
if you have enough LiDARs in a space,
as long as the lasers overlap with one another,
it creates this massive digital twin.
So you could literally zoom in and pan around
(11:22):
and track all throughout,
up and down corridors, up and down hallways,
other sides of walls,
you know, around a tree, around whatever it may be.
That's the power of our mesh architecture:
it gives you the flexibility
that you've just never been able to get
with other technologies.
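The mesh idea above can be sketched minimally: each LiDAR reports detections in its own local frame, and transforming them into one shared site frame using each sensor's known mounting position lets overlapping sensors agree that they are seeing the same object, which is what enables hand-off across occlusions. The sensor offsets, fusion radius, and data format below are invented for illustration.

```python
# Rough sketch of fusing detections from overlapping LiDARs into one
# site-wide frame. Offsets, radius, and formats are illustrative only.
import math

SENSOR_OFFSETS = {"lidar_a": (0.0, 0.0), "lidar_b": (30.0, 0.0)}  # site frame, m

def to_site_frame(sensor_id, x, y):
    """Translate a sensor-local (x, y) into the shared site frame."""
    ox, oy = SENSOR_OFFSETS[sensor_id]
    return (x + ox, y + oy)

def merge_detections(detections, radius=0.5):
    """detections: list of (sensor_id, x, y); fuse points closer than radius."""
    merged = []
    for sid, x, y in detections:
        p = to_site_frame(sid, x, y)
        for i, q in enumerate(merged):
            if math.dist(p, q) < radius:
                merged[i] = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)  # average
                break
        else:
            merged.append(p)
    return merged

# The same person seen by both sensors in their overlap zone:
dets = [("lidar_a", 15.0, 5.0), ("lidar_b", -15.1, 5.0)]
print(merge_detections(dets))  # one fused point near (14.95, 5.0)
```

A production system would add real extrinsic calibration (rotation as well as translation) and a tracker with identity over time, but the shared-frame fusion step is the core of the "one big digital twin" behavior.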
- I love this whole idea of partnering with other, you know,
organizations and experts in this space
(11:43):
and being able to get the best outcomes from each sensor
and utilize all this technology.
But how do you make sure,
now that you have this all together,
that it's not information overload,
that you're getting data that makes sense
and that you can act on
and, you know, make decisions with?
- We're working with a global data center company
who came to us with a very specific problem.
They told us that,
(12:04):
within a 33-week period of testing
at one of their sites,
they were generating over 178,000 alarms, right?
To be exact.
Now, this is by definition needle-in-the-haystack
when I tell you only two of those were real alarms, right?
So when you think of the operation
(12:24):
to acknowledge an alarm within a security practice,
it's like: click, next, review.
That isn't it?
Delete.
Click, next, review, delete.
Try doing that 178,000 times to find that one time
when that disgruntled employee, who got fired for smoking
or doing something he shouldn't at the property,
(12:46):
comes with that USB drive,
plugs into the network,
and takes down a billion-dollar organization, right?
They knew they had a problem.
So in that respect,
they tested everything under the sun: you know,
AI, radar, fact-checking technology, underground cable,
everything under the sun.
So they finally landed on our solution.
They did a shootout:
one of their best sites versus our site,
(13:08):
same timeframe of testing.
Their best site came up with 22,000 alarms.
Our site generated five actual alarms.
And again, I'm getting goosebumps when I tell you this:
they told us that this saved them 3,600 hours
of pointless investigation work
that they can reallocate to other capital expense,
(13:29):
other operational expense.
"We're buying more solutions, more CPUs from you guys,"
or more LiDARs from us, right?
There's just so much that they're able to see.
Now, the idea is that we dramatically decreased
the operational impact of those legacy technologies
to make them only aware
of what was important to them, right?
So that was a key value proposition there.
(13:50):
But even more so,
by tying into all those other technologies,
it made it more effective.
So when we did track those five alarms,
we did actually cue the camera to decipher:
is that a good guy or a bad guy?
Is that a real alarm?
Absolutely.
So we're able to decrease the operational expense
as far as someone having to click, next, review,
click, next, review, click, next, review
thousands and thousands of times,
(14:11):
to actually only work on something that's important.
So there are so many different positive outcomes and effects
that I could go on and on about.
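A quick back-of-the-envelope check makes the 3,600-hour figure in this story plausible: the alarm counts come from the conversation above, while the per-alarm review time is an assumption introduced here.

```python
# Sanity check on the data center story: 178,000 alarms at an assumed
# review time of just over a minute each lands near the quoted 3,600 hours.
baseline_alarms = 178_000           # from the 33-week test described above
review_seconds_per_alarm = 73       # assumption: click, next, review, delete
hours_spent = baseline_alarms * review_seconds_per_alarm / 3600
print(round(hours_spent))  # 3609
```

In other words, the quoted savings correspond to roughly 73 seconds of handling per nuisance alarm, which is the kind of assumption a reader can adjust to their own operation.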
- That's great.
And I'm sure when you're looking at hundreds of thousands
of different reviews, you can make mistakes.
You're probably just going through the motions,
and something could be happening
while you're just like, all right, click next, click next,
(14:32):
I just want to get through all of these alarms and alerts.
So that's great that you guys are able to pinpoint
exactly what's happening.
You've talked a lot about
sort of infrastructure surveillance,
and you talked about customer behavior
within shopping aisles and things like that.
I'm curious if you could provide us with
any more customer use cases or examples
of how you've helped somebody,
(14:53):
what the problem was that the business was having,
and what the result was of using 3D LiDAR
and working with Quanergy?
- We deal with various markets.
In fact, one of our bigger markets
is the flow management, smart space, and smart city markets.
So we just did a webinar with one of our customers,
YVR, Vancouver International Airport, where, you know,
(15:13):
they talked about their application of LiDAR
and how LiDAR was able to give them the accuracy levels
that they needed.
How to better engage the guest journey,
that curb-to-gate experience,
from airside to landside operations,
but even more so, how to get the flow of people
in, through, and out to their final destination.
There's a lot of bottlenecks, a lot of choke points:
(15:34):
as you get dropped off by your family, by taxi, or by Uber;
as you go to check in and get your ticket;
as you go through CATSA or TSA to go through security;
then as you go to duty free
or a restaurant to get your food;
then finally, when you get to the boarding gates, right?
There's a lot of areas with choke points
that create friction as far as the experience
(15:55):
and the journey that one takes throughout that environment.
Now, as I mentioned earlier,
I don't want to talk down on other sensing technologies,
but let's just say in this environment,
we were able to replace up to seven cameras
with one LiDAR sensor.
And unlike cameras in that space,
which had to be overhead looking straight down,
(16:15):
giving them a limited field of view,
we gave them so much coverage, right?
So one of our long-range sensors alone
can do 140 meters in diameter of continuous detection,
tracking, and classification.
That's equivalent to about three US football fields
side by side, right?
So that's quite a bit of coverage.
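The football-field comparison above checks out roughly with simple geometry: the 140 m diameter comes from the conversation, and the standard US field dimensions (120 by 53.3 yards including end zones) are the only outside facts assumed.

```python
# Sanity-checking the coverage claim: a 140 m detection diameter vs.
# "about three US football fields." The comparison is approximate.
import math

coverage_m2 = math.pi * (140 / 2) ** 2         # circle, 140 m diameter
field_m2 = (120 * 0.9144) * (53.3 * 0.9144)    # one field, yards -> meters
print(round(coverage_m2), round(coverage_m2 / field_m2, 1))  # 15394 2.9
```

So one sensor's footprint is on the order of 15,000 square meters, close to three fields' worth of area.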
Now, when you look at it
from the TCO advantage that we provide
(16:36):
the airports, the data centers, the theme parks,
the casinos, the ports, I mean, the list goes on and on,
it's that we dramatically decrease the overall cost
of the deployment.
So when you look at it at a high level,
I always use this analogy
I used to hear when I was very young
from more senior sales guys:
that whole iceberg theory, right?
You can't look at just the tip of the iceberg
(16:58):
and compare sensor to sensor on what this costs.
You know, a camera may be only a few hundred
while a LiDAR may be a few thousand,
plus, you know, software, et cetera, et cetera.
But the underlying cost is beneath the iceberg, right?
What's it going to take
to install these seven to eight devices on this side
versus one device, right?
You look at labor, you look at the cost of conduit,
cable, licensing,
(17:19):
and the maintenance that's required to deploy that, right?
So that's when it really becomes cost effective:
when you understand
the complexity of installing legacy technology
versus new technology in that area.
Hence why Vancouver decided to start deploying.
They've got over 28 sensors in one terminal,
and they're expanding to other terminals now.
So there's quite a bit of growth there
(17:40):
that we're doing with that airport,
and we're currently deployed
in over 22 international airports.
Now, here's another interesting one as well.
So here in the States, in Florida,
there's a lot of drawbridges that go up and down,
up and down.
And they're susceptible to, you know, liability issues
where people may fall in,
(18:02):
vehicles may fall into the waterways,
and unfortunately, there have been fatalities,
which is a horrible thing.
So they did initial tests
with our LiDAR solutions,
using LiDAR on both sides of the bridges
to basically track if an object comes into the scene,
in this case a person or a vehicle.
And if that person or vehicle comes into the scene,
(18:22):
hold the bridge from going up
and notify the bridge tender in the kiosk, saying,
do not let the bridge up.
Which ultimately would bring down the liability concerns
that they had in that area.
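The interlock logic described here is simple in outline: the bridge may only raise when no tracked person or vehicle remains in the monitored zone, and otherwise the tender is told to hold. The class and message text below are illustrative, not the deployed system.

```python
# Tiny sketch of the drawbridge interlock described above (illustrative).
class BridgeInterlock:
    def __init__(self):
        self.objects_in_zone = set()  # track IDs currently on the span

    def on_enter(self, track_id):
        self.objects_in_zone.add(track_id)

    def on_exit(self, track_id):
        self.objects_in_zone.discard(track_id)

    def request_raise(self):
        """Return (allowed, message) for the bridge tender's kiosk."""
        if self.objects_in_zone:
            return False, f"HOLD: {len(self.objects_in_zone)} object(s) still on span"
        return True, "CLEAR: safe to raise"

bridge = BridgeInterlock()
bridge.on_enter("vehicle_12")
print(bridge.request_raise())  # (False, 'HOLD: 1 object(s) still on span')
bridge.on_exit("vehicle_12")
print(bridge.request_raise())  # (True, 'CLEAR: safe to raise')
```

The enter/exit events would come from the LiDAR zone tracker on each side of the bridge; the safety property is that the raise request fails closed while any track remains.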
So now, with the use of LiDAR,
and, you know, confidently coming out of that POC
with very high success,
they're now deploying these
across several bridges in Florida.
So when you look up at a drawbridge now in Florida,
(18:45):
you'll see our sensors deployed.
That's helping bring down the liability concerns
and the potential for fatalities occurring,
or, God forbid, a vehicle falling into the waterway,
which could happen quite a bit.
- Yeah, and I'm sure that not only benefits the operator
who's operating those drawbridges,
but also the comfort
of the people driving over those bridges.
(19:05):
My husband absolutely hates driving over bridges,
and that's one of his biggest fears.
So I'll have to let him know next time we're in Florida
that, you know, he has nothing to worry about.
There's 3D LiDAR, and I'll explain all that.
I'll have him listen to this podcast on the-
- For sure, for sure.- Drive over there.
But I'm curious, you know, 'cause you mentioned
this whole ecosystem of partners
that you're able to work with
to be able to do all of this stuff.
So when you're talking about some of these examples,
(19:27):
and I should mention, insight.tech Talk
and insight.tech as a whole, we are sponsored by Intel,
but I'm curious how partnerships,
especially Intel technology and that Intel partnership,
how does that help you be successful in these use cases
and in these customer examples?
- Spot on.
So let me start off by saying this:
unlike the herd of LiDAR companies that are heavily focused on,
(19:50):
you know, GPU processing,
with a ton of data that they need to process,
we're a little bit different, right?
So our sensors are purpose-built for flow management
and security applications.
You know, they don't need to go into a fender of a vehicle
and shoot tons of lasers all over the place and gather
and push a ton of data through the pipe
(20:10):
as far as throughput requirements for the sensor.
You know, our sensors are purpose-built,
which means that, you know,
we have the best angular resolution
as far as capturing objects within the space,
but ultimately, we have a CPU-based architecture,
which means it's more cost effective and highly scalable.
And even more so, as we align with Intel,
we provide, you know, the best-of-breed solution out there
(20:31):
for cost, accuracy,
and deployment capabilities in the space.
So, you know, that's where we stand apart
from a lot of the other Tom, Dick, and Harrys in LiDAR:
it really is a solution
you can take off the shelf now and deploy.
There's no custom integration
you're going to need to do for six months to a year
to get it to where you need it.
As I explained earlier, there's four ways to work with us:
(20:53):
at the camera level, at the VMS level, at our API,
or through a third-party GPIO or Ethernet box.
And then, with our partnership with Intel,
we come to find out new use cases on a daily basis.
I just finished a call with the retail team
literally 30 minutes ago
where we were exploring, you know, brick and mortar
(21:13):
and warehouse automation and stuff like that,
where we could provide, you know, 3D sensing
beyond the traditional way
of looking at those types of spaces
with other sensors, right?
So there's so much to unfold there,
but even more so, the partnership with Intel
makes it valuable for us
as we continue to scale and grow in this space.
- That's really exciting,
(21:33):
especially with all these different industries
you've been talking about.
We've been writing on insight.tech a lot
about how they're using computer vision, AI,
and other automated technologies
to be able to improve their operations
and their efficiencies and workflows,
but, you know, I'm excited to see
how 3D LiDAR is going to come into the fold
and how that's going to transform these industries
even further.
(21:54):
So I'm curious, since we talked about in the beginning
that we've really only hit the beginning of the use cases
or where we could go with this:
How do you anticipate this space will evolve?
Are there any, you know, emerging trends or technologies
that you see coming out that you're excited about?
- There's quite a few use cases already that we've tapped into,
but there's so much more
that's still yet to be explored, right?
(22:15):
So at the very beginning,
I talked a little bit about orchestration,
and we're able to marry with multiple sensors
to create different outcomes.
That's going to continue to grow and expand
with additional sensor integrations, right?
So we integrate with license plate recognition.
If there's a hit,
boom, we can then continue to track within a parking lot.
But then there's the advent of AI,
(22:36):
what's going on with large language models
and all the other stuff that's coming out.
And then cloud, right?
So there's just so much there
that just hasn't been touched, right?
From the AI side,
there's a ton of stuff being done right now
in computer vision
and understanding much more
as far as what's being captured within the scene,
to understand more generalities
(22:57):
that can create different outcomes
and tell a different story
that ultimately gets you to the end result.
Is it a good guy?
Is it a bad guy?
Is it a good workflow or is it not, right?
I think that there's so much more
that can be done with LiDAR
as we marry with other AI technologies
that will provide these additional outcomes
that are just not being realized yet, right?
So we're still in very early stages, I would say,
(23:19):
for LiDAR in the AI arena,
but as it pertains
to a lot of the physical security applications
and the BI side of things,
it's already been proven and deployed globally
with quite a few different customers around the world.
So definitely excited about that,
but there's just so much more to peel back
as far as what we do with cloud and with AI.
That's really just a massive opportunity in this space.
(23:41):
- Yeah, I'm excited to see where else this goes,
and I encourage all of our listeners to follow along
as Quanergy leads this space,
and see what else you guys come up with
and how else you guys are transforming our industries.
Before we go,
Gerald, is there anything else that you wanted to add?
You know, any final thoughts or key takeaways
you wanted to leave our listeners with?
- I've always been kind of the guy
(24:02):
who adopts new platforms, you know,
only once I hear from other people.
Like, I'll be the last one
to create a new social media account,
and I'll wait to see what everyone thinks and stuff like that.
But I think with LiDAR, similarly,
some people may be, you know, a little nervous
adopting new technology,
even more so going with something out of their comfort zone.
I think now, more so than any other time,
(24:24):
is the time to start testing.
We're past that early phase,
you know, the kick-the-tires phase.
There's so many deployments, so many reference accounts,
so many people that are now talking about the value:
how this has increased, you know, their workflows,
how it has provided additional value,
has decreased the false alarms
and improved operational effectiveness for the...
(24:44):
I think now, more so than ever, is the time to act
and start testing, start asking the questions:
What can LiDAR do for me
that I haven't been able to do before?
How can I use LiDAR in my current operations
or my current deployments
to see what I've just never been able to see
with these other technologies?
And look at your existing, you know, use cases
(25:05):
or your existing business cases
and see: if I had depth, if I had volume,
if I had centimeter-level accuracy,
how could that improve my day-to-day workflow, my job,
and provide more value to the organization as a whole?
So I would say, if that's where you're at now,
reach out to me.
You can find me on LinkedIn, Gerald Becker,
or reach out to me directly by email,
(25:27):
gerald.becker@quanergy.com.
I'd love to have a chat with you.
You know, even if it's a 10-, 15-minute conversation,
I'm sure it will lead
to a lot more fruitful discussion after that.
- Yeah, absolutely.
And we'll make sure to link out to your LinkedIn
and, you know, accounts for the company,
so that if anybody listening wants to get in touch,
wants to learn more about this 3D LiDAR space,
you know, we'll make it easy for you guys to access.
(25:48):
So, just want to thank you again, Gerald,
for joining us today,
and thank you to our listeners.
Until next time, this has been "insight.tech Talk."
(upbeat music)