Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:10):
Hey everyone.
Welcome back to the EdgeVerse Techcast, your go-to show for NXP's
software, tools, and enablement for our Processors and Microcontrollers.
As always, I'm one of your co-hosts, Kyle Dando.
And hi listeners, I'm your other co-host, Bridgette Stone.
In today's episode, we're thrilled to explore the new eIQ Time Series
(00:32):
Studio with our guest, Ted Kao.
Welcome to the show, Ted!
Hey, thanks Kyle.
Thanks Bridgette.
Thanks for having me on the show.
So I'm the AI Product Marketing Director at NXP, where I focus on
making AI and machine learning more accessible on Microcontrollers.
I lead the definition and marketing of our eIQ software,
(00:53):
as well as the Neutron NPU IP.
Both are designed to simplify AI and ML development while delivering
high performance and low power consumption.
Well, thanks for being on today, Ted.
I'm gonna kick us off and we're gonna jump right in.
So, what makes running AI on the edge such a game changer?
And why has time series-based AI, in particular, been so challenging
(01:17):
for developers until now?
So, yeah, we have all seen how AI is transforming our lives
in the past couple years.
And at NXP, uh, we see tremendous opportunity to bring AI from the
cloud to our edge processors.
And this enables fast and accurate decision making right at the
edge where data is generated.
(01:38):
This also brings benefits like lower latency, reduced power consumption,
and improved data privacy by not always sending the data to the cloud.
So while we have parallel development for computer vision,
voice, large language models, generative AI, all that exciting stuff,
Time Series AI is equally powerful, especially for microcontrollers.
(01:59):
And that's why I'm here and I'm excited to share today.
So, what's Time Series AI?
That's what most people would ask.
So think about tasks like Anomaly Detection, Classification, and Regression,
and how people leverage sensor data to make intelligent decisions and predictions
in dynamic, real-world environments.
(02:21):
However, what most people don't recognize is that building AI models
with time series data is not easy.
Based on the years of experience we have had with our customers, it usually
takes deep expertise in sensors, data, and machine learning, all together.
To make it easy for developers, we developed the eIQ Time
Series Studio, and that's included
(02:45):
as part of our broader eIQ toolkit.
Great introduction and overview.
Thanks Ted.
Are you able to go into additional detail on the eIQ Time Series Studio
tool, and how does it make this specialized AI accessible to developers?
Sure.
Bridgette, I'm happy to do that.
Time Series Studio is an end-to-end development tool that streamlines
(03:07):
the machine learning development workflow, from data curation, model
training, optimization, and emulation, all the way to deployment on NXP
devices, all in one place.
While it's important to provide an end-to-end workflow, what makes
Time Series Studio super useful is integrated Automatic Machine
(03:27):
Learning, or AutoML for short, which takes care of the heavy
lifting by intelligently generating and tuning models.
So you don't need to be an ML expert to get great results. It dramatically
speeds up development and lowers the barrier to entry for anyone
looking to build AI at the edge.
(03:48):
And the best part is, it's completely free.
Our goal is really to make edge AI more accessible and remove as many roadblocks
as possible for developers everywhere.
All right, Ted.
Anytime you say completely free, that perks up our listeners' ears.
I'm a pretty frugal guy, and when I have two options, I always tend to appreciate
(04:10):
and evaluate the free tool first.
I'm sure our customers are thinking the same thing.
I've had some experience looking into the Edge AI tools, and I think it's
really rare to see all the features that you've talked about in a zero-cost tool.
They may get something to evaluate the AI, but it sounds like what you're
describing is that developers get all the levels and they can jump in and
(04:30):
evaluate all the advantages of our tools.
So with that, why don't we dive in even a little bit deeper.
You introduced the topic of AutoML, which is at the heart
of NXP's Time Series Studio.
Can you walk us through exactly how AutoML transforms and improves
that developer experience?
(04:52):
Sure, sure.
I'm happy to do that.
So again, AutoML, or Automatic Machine Learning, really changes the game
for building machine learning models.
For developers, as you may know, it used to take a lot of trial and
error, manually testing different signal processing techniques or
model architectures.
And it was really slow and tedious.
(05:13):
It required pretty deep Machine Learning expertise.
Now with AutoML, or Automatic Machine Learning, integrated, that
process becomes much, much easier.
It automatically explores and fine-tunes a wide range of model options
and signal processing to find the best balance of accuracy, performance,
and memory efficiency, all without the need to write a single line of code.
(05:40):
Okay, so, so this AutoML is truly the lab assistant that all these AI
developers have been waiting for.
They finally have someone or something to help 'em out with all that
tedious analysis you talked about.
All those available models, I'm sure they come about all the time.
Uh, this AutoML will help steer them in the right direction and
(06:00):
help them evaluate the trade-offs.
Yes, yes.
So, I mean, based on our working experience with even experienced
Machine Learning developers, they are often surprised at how much time it
saves and how easy it is to use Time Series Studio.
And what used to take weeks for them to develop a model now can be done in hours.
And you still get visibility into how each model is built and how it performs.
(06:24):
It's, again, super helpful for embedded developers working with
microcontrollers with tight memory and compute budgets.
Wow.
So even seasoned ML pros are saving weeks here.
Absolutely incredible.
I bet that frees them to focus on fine-tuning and real innovation
(06:44):
rather than tedious trial and error.
So speaking of innovation, Ted, tell us more about the variety of
sensor inputs you can work with.
Sure, sure.
So, yeah, this tool is built to support a wide range of sensors.
Again, motion, temperature, vibration, pressure, sound, just to name a few.
And it even supports voltage and current, which may not require extra hardware,
(07:07):
and for some of our customers, that translates into no additional cost.
And it also supports Sensor Fusion, combining multiple inputs for more
advanced multimodal applications.
There are, again, three main use cases as I mentioned earlier: Anomaly Detection,
Classification, and Regression.
(07:27):
And they apply across industries like Industrial Automation, Medical,
Smart Home, and Automotive.
That's really a great overview of applications.
Are you able to zoom into some real-world use cases for us?
Of course, of course.
So let me just provide a couple examples.
Uh, take manufacturing as the first one.
(07:49):
As factories become more automated and robotic production lines are
integrated, catching abnormal machine behavior early is actually really critical.
Sensors today can play a key role to not only prevent costly breakdowns,
but also to monitor the health and lifespan of equipment over time.
(08:11):
So that's one.
And at home, uh, think about appliances like washers, dryers, and refrigerators.
Sometimes they'll make strange noises or vibrations.
Those are often early signs of trouble.
With AI built-in, those devices could detect issues early and
alert you before things get worse.
And one other example that, again, comes to mind is somebody
(08:35):
living in California, where wildfires are a constant threat.
Arc fault detection is one of the use cases that may seem niche to some
people, but we take it seriously.
If AI can detect or even predict electrical faults that could spark a
fire, it has great potential to play a real role in preventing
(08:56):
or minimizing natural disasters.
Ted, I love that power arc detection example.
Also being from California, those were tragic incidents; if our machines
and our products were a little bit more intelligent, they could possibly
save lives and save a lot of money.
It's amazing to see all this Edge AI directly contributing to these new,
(09:18):
safe and reliable systems being built.
The new products we've been discussing can identify these anomalies
that humans used to not be able to monitor, because it was
unrealistic or in some cases even impossible to monitor the system.
I think this is a huge improvement, and that right there, that's a
(09:39):
headline for any AI/ML consideration.
So now let's shift gears a little bit and look at how the Time Series Studio
tackles this messy real-world data.
Ted, can you share more on how the tool helps developers curate
this valuable input data from all their varying applications?
(10:01):
Yeah, that's a great question.
And indeed, it's really important to have a solid data curation tool when working
with Time Series AI and with sensor data.
So real-world sensor data can be messy, especially in real-world environments.
And developers often have to deal with different sampling rates, noise,
and inconsistencies when they work with multiple different sensors.
(10:24):
And Time Series Studio makes it easy by letting developers log,
visualize, label, clean, and align the data, all in one place.
And that saves a ton of time and frustration, no matter if
someone is working with single or multiple data streams.
Okay, but what can customers do to expedite this? I know getting data sets
is one of the most time-consuming parts of building a successful model.
(10:49):
So how can they import data sets into the Time Series Studio?
Does the tool support different ways of doing that?
Yeah, obviously you can log and capture your own data, or you can also
import the data from external sources.
And by the way, uh, aside from just importing and capturing
the data, we also incorporate what we call Data Intelligence.
(11:10):
And that's a utility that can be used to spot things like data imbalance and
redundant channels, and to optimize sampling settings.
All those utilities really help developers make informed decisions
on their data before processing and before the model training even begins.
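For readers following along, here is a rough sketch of what a data-imbalance check like that boils down to: count how many labeled recordings each class has and flag classes that are under-represented. This is only an illustration of the idea, not the Data Intelligence utility itself; the labels, class count, and threshold below are made up.

```c
/*
 * Illustrative sketch of a class-imbalance check on labeled sensor windows.
 * NOTE: this is NOT the eIQ Data Intelligence utility; the label values,
 * number of classes, and "under-represented" threshold are hypothetical.
 */
#include <stdio.h>

#define NUM_CLASSES 3

int main(void)
{
    /* Hypothetical labels for 12 recorded sensor windows (classes 0..2). */
    int labels[] = {0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 2};
    int num_samples = sizeof(labels) / sizeof(labels[0]);
    int counts[NUM_CLASSES] = {0};

    /* Count how many windows belong to each class. */
    for (int i = 0; i < num_samples; i++) {
        counts[labels[i]]++;
    }

    /* Flag any class with fewer than half the average number of samples. */
    double average = (double)num_samples / NUM_CLASSES;
    for (int c = 0; c < NUM_CLASSES; c++) {
        printf("class %d: %d samples%s\n", c, counts[c],
               counts[c] < average / 2.0 ? "  <-- under-represented" : "");
    }
    return 0;
}
```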
Those utilities must be a developer's best friend.
(11:31):
No more surprise data glitches.
And being able to spot noise and trends at a glance will save
hours of head scratching and accelerate root cause analysis.
Now, how does all this translate into seamless deployment with NXP hardware?
Once a model is ready, how do developers deploy it?
Yeah, that's another great question.
So deploying AI can also be difficult without the right tool.
(11:55):
So with eIQ Time Series Studio, models can now be easily deployed to NXP
Microcontrollers or Application Processors with Cortex-M and Cortex-A cores.
If the hardware includes an NPU, or neural processor, such as the eIQ
Neutron, complex models can also get accelerated for better performance.
And once the developer selects their preferred model, the tool
(12:19):
will compile and generate a custom library that's fully compatible
with GCC, Keil, IAR, and CodeWarrior, those types of compilers and environments.
And from there it's just two simple API calls: one to initialize the model
and the other to run the inference.
That's all it takes.
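To make that concrete, here is a minimal sketch of what an application built around such a generated library could look like. All of the names below (the header, ts_model_init, ts_model_run_inference, the buffer sizes, and the sensor read call) are hypothetical placeholders for illustration only; the real header and function signatures come from the library that Time Series Studio generates for your project.

```c
/*
 * Sketch of the "two API calls" deployment flow described above.
 * NOTE: ts_model_init, ts_model_run_inference, TS_MODEL_INPUT_LEN,
 * TS_MODEL_NUM_CLASSES, and read_sensor_window are hypothetical names;
 * consult the library generated by eIQ Time Series Studio for the real API.
 */
#include <stdio.h>

#define TS_MODEL_INPUT_LEN    128   /* samples per inference window       */
#define TS_MODEL_NUM_CLASSES  3     /* e.g. normal / warning / fault      */

/* Assumed interface of the generated library (illustration only). */
extern int ts_model_init(void);                        /* call once at boot */
extern int ts_model_run_inference(const float *input,  /* one data window   */
                                  float *scores);      /* per-class scores  */

/* Placeholder for whatever sensor-driver call the application uses. */
extern void read_sensor_window(float *buffer, int length);

int main(void)
{
    float window[TS_MODEL_INPUT_LEN];
    float scores[TS_MODEL_NUM_CLASSES];

    /* First API call: initialize the model once. */
    if (ts_model_init() != 0) {
        printf("model init failed\n");
        return -1;
    }

    for (;;) {
        /* Collect one window of time series data from the sensor. */
        read_sensor_window(window, TS_MODEL_INPUT_LEN);

        /* Second API call: run inference on the captured window. */
        if (ts_model_run_inference(window, scores) == 0) {
            /* Pick the class with the highest score. */
            int best = 0;
            for (int i = 1; i < TS_MODEL_NUM_CLASSES; i++) {
                if (scores[i] > scores[best]) {
                    best = i;
                }
            }
            printf("predicted class: %d\n", best);
        }
    }
}
```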
(12:41):
Only two API calls.
Well, that's pretty easy to do.
All right.
Now that you've simplified the deployment to just the initialization
and the inference calls, that removes a ton of the complexity that a lot of
developers have in their AI projects.
So it looks like developers can get models up and running in no time.
Before we wrap, Bridgette,
(13:02):
why don't you recap the highlights that you've heard from today's episode?
We've really covered a lot today.
We unpacked what eIQ Time Series Studio is.
Saw how AutoML accelerates model building.
Explored sensor inputs and use cases.
We dove into data curation and learned how to deploy models on NXP devices.
(13:24):
Thanks so much for being here with us today, Ted.
Yeah.
Thanks so much for having me.
It's a pleasure to join the techcast and share some of the exciting work we're
doing with enabling AI/ML at the edge.
And for listeners who are curious to explore the eIQ Time Series
Studio, you can download the full eIQ toolkit at nxp.com/eIQ, and we can't
(13:48):
wait to see what you will be building.
If you have any feedback or suggestions on how we can better
support your development journey, please don't hesitate to reach out.
All right.
Well again, thanks Ted.
What a great episode!
I learned a ton about the Time Series Studio.
I did notice, Ted, that in our prior episodes we always talk about our
(14:08):
Application Code Hub, and there is a Time Series Studio example in there.
I'll drop that in the episode link, as well as the direct access
to the eIQ toolkit, so that the listeners can get to that quickly.
Yep, yep.
That would be awesome.
All right.
Well, as always, if you enjoyed this conversation, please Like and Subscribe.
(14:29):
Make sure to tell all of your friends that you learned something new on
the EdgeVerse Techcast, and hit that Notification Bell so that you
never miss a time when Bridgette and Kyle have something new to share.
And we'd love to hear your feedback or topic requests.
So please reach out on social or drop us a line.
And until next time, keep innovating at the edge and we'll catch you
(14:51):
on the next EdgeVerse Techcast.