Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Andrew (00:02):
Hello everybody and
welcome back to Wrappy Hour.
This is episode 23 and I am here with two guests today.
Very exciting.
Today we're going to be jumping into a discussion about vision systems and how they impact the packaging world, both operationally, sustainability-wise and whatever else that might mean.
I've got two great experts here, one from Harpak-ULMA, and
(00:26):
I've got another guest, Lindsey Sullivan, who joins us from CCS America, and I'm really excited to have her here today.
So, without any further ado, Spencer, do you mind just reintroducing yourself real quick for the folks that might not remember you from your previous episode?
Spencer (00:41):
Absolutely.
Hello Andrew, hello Lindsey, glad to be on the show with you. I'm Spencer Thomas, I'm an application engineer with Harpak-ULMA, and I'm on our automation team.
Andrew (00:53):
Heck yeah.
And, Lindsey, I'm going to give you a little bit more room to dance. You've got a little bit more of a story to tell, I'm sure. So, Lindsey, why don't you tell us what it is that you do at CCS, and then maybe just a little bit about what CCS does as a company?
Lindsey (01:07):
Yeah, and thanks for
having me here today, Andrew and
Spencer.
So at CCS I am the technical marketing manager. I have to add technical in there, because I actually came from the background of doing application engineering and the technical side of engineering, and then I kind of wanted to do something different. So they let me be in a marketing role.
(01:28):
So now my focus is to take my technical expertise that I learned over the last, you know, eight years and start using it to help other people learn machine vision techniques, and, you know, kind of get the word of CCS out there, that we are a solutions provider.
(01:48):
So that's kind of my segue into what the company does. CCS America, we sell bar lights and ring lights to, you know, solve machine vision applications. But we really like to phrase ourselves as: we can provide the solution. Because a lot of times, if you can't get your lighting down, it's really hard to get the vision system working, and so we like to say, okay, how do I make this work? You need this bar light at this height, this working
(02:11):
distance, et cetera. You're good to go now.
Andrew (02:15):
Awesome.
Well, I'm sure we'll get into it a little bit more as we get through the episode and everything, but part of the reason why I wanted to have you both here on this episode today is, A, because I think vision systems are something that really slides under the radar when it comes to, like, automating and the processing side of everything, but also because I know that we've been heavily invested in upgrading our own
(02:36):
vision systems and services that we offer, and it just felt like there was a good match there. So I'd like to bridge the gap between both of you right now and just see how CCS approaches vision, and then maybe see how we at Harpak-ULMA are approaching, you know, vision systems and, I guess, Spencer, automation as a whole.
Lindsey (02:56):
Yeah, that's a good
question.
So I mean, CCS approaches vision as really required in manufacturing lines, because you can have an operator standing there looking at the same piece over and over again, but eventually you're going to get discrepancies between each operator, something like that. An operator is going to get tired, but for us, a vision system doesn't get tired.
(03:16):
It is, you know, as good as you can build it, and as good as it will be. Obviously, none of them are perfect, but it can be a lot more reliable. And there's the word I'm thinking of: consistent. And so when we approach it, we think of it as, everyone who does any type of manufacturing can use a vision system.
(03:37):
And then our next step is: the lighting is critical to the success. If you have a defect, the lighting in your vision system can hide the defect or it can show the defect. So if you don't have the right light, you're making it harder than it needs to be, or nearly impossible.
(03:57):
But if you just pick a different color, for example, then it's super easy, and you can make the time it takes you to program the system and get it working significantly reduced, and way less of a headache. So we very much think everybody can use it, and don't neglect the lighting. It can make or break it a lot of times, or just make it way easier.
Andrew (04:15):
Interesting. And it's my understanding that Harpak-ULMA kind of has a slightly different approach, Spencer, in that we're not using as much of, like, the reference imaging, and more using placement checks with graphics and barcodes and things like that.
Spencer (04:32):
Well, our systems are designed specifically to work with our thermoforming and tray sealing equipment, so we do use reference images and build recipes off of those.
Andrew (04:48):
And so much of what Lindsey said about it not getting tired, it being reliable and consistent, that's part of why our customers are coming to us wanting us to integrate them into their lines, the vision systems. And I think, to Lindsey's point too, at least
(05:08):
in today's day and age, with everything automating and needing to trim out the fat wherever you can operationally, it's so crucial to have some kind of vision system. So, while you might have a thermoformer or, you know, to Lindsey's point, like, some other kind of machine that you've had for a while, it's kind of getting to that point where you're really needing to reconsider that, I guess it'd be the end of the processing or, I guess,
(05:31):
post-processing end of production and everything, right?
Spencer (05:36):
Yeah, vision systems can be incorporated at multiple points within the line. Sometimes you're inspecting the actual raw product for something like fat content or, um, color of the product, but then other times, um, you use them to inspect the actual product in its packaging. And, uh, sometimes you're doing quality checks on whether or
(06:00):
not the product is, let's say, a bag that needs to be put into a tote: is the bag hanging out above the top edge of the tote? So anything that you can visually inspect with your own eye has the potential to be inspected by a vision inspection system.
Andrew (06:18):
Very interesting. So from a marketing perspective, I understand the benefit of it and the importance of it, for sure. But I think where I really lack the understanding is in the technical aspects of the solution, versus just a bunch of pretty
(06:44):
pictures of the product as it's going through. I imagine there's got to be some kind of, you know, uniform process that makes that vision system work so well and makes it so reliable, so that you can reduce errors as much as possible, or completely.
Lindsey (06:56):
Yeah, definitely. I would say the thing that makes the vision system more successful, from what I've seen, is how consistent it is every time. The more variation you have in your product, whether it's how it, you know, appears, if the bag is wrinkled in different ways, or the orientation, that's not an easy barrier to overcome. But the more control
(07:21):
you have, and the more every picture looks the same, the more successful your vision system can be at seeing what you want it to see. I'm not sure if you... sorry.
Andrew (07:33):
It looked like you might have a point there, Spencer.
Spencer (07:35):
Yeah, um, just with the consistency: we are integrating vision systems with our thermoformers, either standalone following the thermoformer or directly integrated into the thermoformer, and this is connected to what Lindsey's saying about having consistent inspection of your product. The vision systems that are integrated directly into the
(07:58):
thermoformer have the ability to scan the web with very controlled positioning of the product and very controlled lighting of that web. So it gives us a very detailed inspection, and we can even do seal inspections with that, as a result of how controlled and consistent we are with the presentation of the product to the camera.
Andrew (08:20):
Yeah, because with a thermoforming line, with that web, I mean, you're not just scanning one package, you're scanning, like, multiple at a time, right?
Spencer (08:25):
Yeah, you'll scan all packages across the width of the machine. It's a line scan camera, so it's doing a single line of typically 4,096 pixels, and then taking multiple lines of pixels and compiling an image based off of the encoder readout it gets from the driveline of the thermoformer.
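What Spencer describes, compiling a 2D image out of single encoder-triggered scan lines, can be sketched roughly like this (an illustrative Python sketch, not Harpak-ULMA's actual software; the camera and encoder stand-ins are hypothetical):

```python
LINE_WIDTH = 4096  # pixels per scan line, the figure Spencer mentions

def assemble_image(read_line, num_lines):
    """Stack single scan lines into a 2D image, one row per encoder tick."""
    image = []
    for _ in range(num_lines):      # each tick: the driveline encoder says the web advanced one step
        row = read_line()           # grab one line from the line-scan camera
        assert len(row) == LINE_WIDTH
        image.append(row)
    return image

# Toy stand-in for the camera: every "line" is a uniform gray row.
frame = assemble_image(lambda: [128] * LINE_WIDTH, 500)
print(len(frame), len(frame[0]))  # 500 4096
```

Keying each row to an encoder tick, rather than a fixed timer, is what keeps the compiled image geometrically consistent even if the web speed varies.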
Andrew (08:47):
So how do the cameras that get used in vision systems, and this is a question for both of you, how do those compare to the really new 4K cameras that everybody's walking around with these days? Is it the same thing, just kind of stripped back a little bit and wedged into a machine, or is it a really tailored camera solution for a package?
Lindsey (09:06):
You mean like the 4K cameras on our iPhones, on our cell phones, kind of thing? Yeah, I would say, I mean, on resolution, there's 5K, there's up to 21K, so you can get higher resolution cameras for different, either larger or smaller fields of view, or for how small of a defect you need to see.
(09:26):
But also, a lot of it is the software behind the camera being able to, you know, take that picture and then interpret the photo and say, okay, based on this photo and my reference photo, that's a defect, and then kick it off the line. So on the resolution itself, you can get a higher resolution camera in machine vision, but it's really the software that is the make or break of the difference.
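The reference-photo comparison Lindsey describes can be illustrated with a deliberately naive pixel-difference check (real vision software layers alignment, filtering, and region-specific tools on top of this idea; every threshold here is invented):

```python
def is_defect(photo, reference, pixel_tol=10, max_bad_fraction=0.01):
    """Flag a part if too many pixels deviate from the reference image."""
    bad = sum(1 for p, r in zip(photo, reference) if abs(p - r) > pixel_tol)
    return bad / len(reference) > max_bad_fraction

# Toy 1,000-pixel "images" as flat brightness lists:
reference = [100] * 1000
good_part = [102] * 1000               # small uniform brightness shift: passes
scratched = [100] * 950 + [255] * 50   # 5% of pixels way off: fails

print(is_defect(good_part, reference))  # False
print(is_defect(scratched, reference))  # True
```

The two thresholds mirror the tuning Lindsey alludes to: how different a pixel may be, and how much of the image may differ, before the part is kicked off the line.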
Andrew (09:51):
And do you think that having better cameras, like, obviously we didn't have 4K, I don't think we had 4K, 15 years ago, but do you think that the dawn of these higher resolution imaging formats is making it easier to implement a vision system, or at least have more control over one? And, Spencer, maybe you can talk on this more.

Spencer (10:07):
Yeah, with the higher resolution cameras, um, more recently, with the actual data transmission, when you have like a GigE, uh, camera or a 10 GigE camera, um, you can transmit the data from the image, and for a large resolution picture it's
(10:28):
a lot of data, you can transmit it faster and you can keep up with the throughput of your scan. So in recent years, with increased processing power, we're able to more quickly analyze larger images that have more detail to them, that we couldn't process as quickly in the past, or even get them off of the camera in the necessary
(10:50):
amount of time.
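Spencer's GigE versus 10 GigE point can be put in rough numbers. A back-of-envelope sketch (nominal link rates, 8-bit mono pixels, protocol overhead ignored, so all figures are illustrative upper bounds):

```python
LINE_PIXELS = 4096                 # one scan line
BYTES_PER_LINE = LINE_PIXELS * 1   # assuming 1 byte (8-bit mono) per pixel

# Nominal link rates in bits per second, overhead ignored.
GIGE = 1_000_000_000
TEN_GIGE = 10_000_000_000

def max_lines_per_second(link_bps):
    """How many scan lines per second the link can move, at best."""
    return link_bps / 8 / BYTES_PER_LINE

print(round(max_lines_per_second(GIGE)))      # 30518
print(round(max_lines_per_second(TEN_GIGE)))  # 305176
```

Roughly a 30x to 300x multiple of thousands of lines per second: the faster interface is what lets the large images come off the camera "in the necessary amount of time."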
Andrew (10:53):
Very cool.
Now does that mean that you're looking at more things in one scan, doing it faster?
Spencer (10:59):
It depends on your field of view. But if you are inspecting the same, doing, let's say, seal inspection, if you increase your resolution, you might have a more accurate inspection, because you have more pixels per square millimeter of inspection area.
(11:20):
So upping resolution can help with accuracy, but at the detriment of potentially having a larger processing time, which means you can't have as high of a throughput. But the cameras are such that you can get the pretty high resolutions that you need.
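The pixels-per-millimeter tradeoff Spencer mentions is easy to quantify. A sketch with made-up numbers (the 420 mm web width and the 3-pixel minimum are assumptions, a common rule of thumb rather than anyone's spec):

```python
def pixels_per_mm(sensor_pixels, field_of_view_mm):
    """Spatial resolution across the web: more pixels over the same
    field of view means more pixels per millimeter of inspection area."""
    return sensor_pixels / field_of_view_mm

FOV_MM = 420  # hypothetical web width covered by the camera

for pixels in (2048, 4096, 8192):
    res = pixels_per_mm(pixels, FOV_MM)
    # Assume a defect must span ~3 pixels to be detected reliably.
    smallest_defect_mm = 3 / res
    print(pixels, round(res, 1), round(smallest_defect_mm, 2))
    # e.g. 4096 px over 420 mm -> ~9.8 px/mm, ~0.31 mm smallest defect
```

Doubling resolution halves the smallest resolvable defect, but it also quadruples nothing for free: there are twice as many pixels per line to transmit and process, which is the throughput cost Spencer describes.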
Andrew (11:40):
That was actually one of my follow-up questions too, because I know in the marketing world, for me, and obviously it's less fast-paced than directly on a processing line or a production line, but there's a lot bigger files going through everything. When we record footage in 4K, 8K, even just
(12:04):
rendering a video or an image takes a long time. So I imagine, like, with this higher resolution also comes, you know, trying to balance. Is it a balancing act, really, between finding, like, that, you know, ultimate high-res thing that you can get, but also managing the throughput of what your operations are?
Spencer (12:20):
Absolutely. I think you definitely want to... There's no sense in wasting the energy, the power, wasting the time, rather, and we're talking, you know, fractions of a second, but it adds up. If you don't need that resolution, you don't need the higher quality equipment that can handle that kind of
(12:41):
resolution, the increased processing power that's required. So it's really important to spec out the system to work and handle the requirements, the performance criteria that you have for that system, without wasting capital, because it is expensive to purchase these systems. No sense in making it needlessly more expensive.
Andrew (13:07):
Any thoughts from you there, Lindsey?
Lindsey (13:11):
Yeah, I honestly just totally agree. I haven't had to deal with speccing out high-resolution cameras in a few years, but definitely the cost of the two systems can be pretty drastic, and the speed will, you know, definitely be slower with higher resolution. But sometimes, if it's what you need and you have to have that
(13:31):
high resolution camera to get the accuracy to see that tiny little defect, you know, it is what it is at certain points too. So finding that balance is definitely the tricky part when you're speccing the system.
Andrew (13:45):
Yeah, one of those that comes to mind is, like, medical packaging. I imagine that they have to be, like, super fine-tuned with everything, like, no room for error whatsoever. So I imagine they would sacrifice throughput for fewer errors or whatever. So I guess that kind of leads well into the next, um, you know,
(14:06):
discussion point that I wanted to ask both of you, which is that I'm curious what some of, like, the common pitfalls are of vision systems. And this is not me trying to bash vision systems, but unfortunately, um, in this space and in the world in general, things break, and they break often. Um, so I'm curious, you know, I know that CCS did a
(14:30):
webinar on the common pitfalls of vision systems. So I guess, Lindsey, I'd ask you first what it was like doing that webinar, and then maybe some of the insights that you gathered from there.
Lindsey (14:40):
Yeah, after I did the webinar and kind of presented it, I looked back and I was like, that was just me airing out, like, seven years of grievances, of talking to people and getting the same questions over and over and having to explain why, like, this won't work. So, um, it's actually been more useful than I expected it to be, and I think people took more information out of it
(15:01):
than I, you know, kind of thought they would. When I was first making it I thought it was just me complaining, but, um, there are so many ways that a vision system can go wrong, but so many ways it can work too. But I think the biggest one that kind of encompasses everything is the scope. If you start off with, like, okay, we want to do this and
(15:25):
this, and then you say, well, okay, if we're doing that, can we add this, can we add this too? And then what was a really simple, maybe a low-res application, not going to be too much of a project to take on, is now a much bigger issue.
You know, you're probably stepping up a resolution. Are we now going slower? Stuff like that. So making sure that you can rein in what you're actually
(15:46):
trying to do is kind of the one step of it. And then the other common pitfall is not factoring everything in at the right time. If you're building a machine from the beginning and you think you want to add a vision system to it, you know, start with: what light do I need, what camera do I need, what distances do I need?
(16:06):
What am I trying to inspect at this point in time? And then build that into the design of the machine. Because the worst case scenario is when you get to, you know, the machine's designed, it's built, you're building it, and then you're like, oh, let's add this, but I can't put my light there because you put a bar there. So, like, now I have to find a different solution that might not be as good but will work well enough, um, and you know.
(16:29):
So those are kind of the two that I think were the biggest takeaways: just think about it from the beginning, and make sure that you pick a project that is actually achievable and will be successful.
Andrew (16:44):
I appreciate that insight, Lindsey. Thank you.
Spencer (16:48):
I'll add to that, if I can. I mean, the reason why what Lindsey said is so important is because of some of the, I guess, acute pitfalls of the actual camera setup. You're trying to do photography, essentially, and you need your camera to be in focus, you need proper lighting, you need, in
(17:09):
this case, the right resolution camera to inspect what you need to see. And so if you don't do what Lindsey said and scope out the project and define your performance criteria ahead of time, one of those elements is gonna be forgotten or not accounted for, and then you can't fit the correct lighting
(17:29):
in the space, or you can't get your camera far enough away from your product. So that's why I totally agree with Lindsey's answer, and just wanted to add some of the parts that can fail in the actual photography of it.
Lindsey (17:48):
I will add that it's not that you can't retrofit a vision system into a machine. I think you guys even have the option to do it on your machines. It just isn't as seamless, maybe, is a good word for it.
Andrew (18:03):
Yeah, it wasn't built with it in mind. I mean, it makes sense. You know, you're trying to put, like, a Mercedes hood ornament on a Toyota Camry, in some ways. But I think that also, you know, leads to another question that I had, which is, in that initial discussion, when you're talking with a prospective client about, you know, adding a vision system, is it coming from a begrudging point of, wow,
(18:34):
I kind of have to do this, so how can I do this as cheap as possible? Or is it, like, I'm sick of having errors on my line, I need to hit these margins by X date, do whatever you can to make that happen? I'm sure that there is a spectrum of people that you talk to, but I'm curious what comes up more often.
Lindsey (18:53):
You can maybe talk on that, Spencer.
Spencer (18:54):
Yeah, I would say, um, everyone wants to add vision to their lines. I mean, it's very appealing, um, to be able to inspect every product, as opposed to just one product per batch, or just taking a statistically significant sampling of your population. So I mean, if you can sample the entire population, that's
(19:18):
awesome. So people aren't begrudging in their approach to applying it or obtaining a vision system. The capital costs can be a little bit of a sticker shock to some, but generally, I would say, most people that are doing it,
(19:40):
we get people that are either required to do it, because it's a medical project, or they want to do it because the repercussions of not doing it currently are costly to the point where it outweighs the capital investment. So that would be, you know, let's say, in the meat industry, if they're
(20:03):
sending products to their customer that don't pass QC, then they can get sent back to the plant or they can get charged for them. So it can be costly to not have a good handle on your quality control.
Andrew (20:21):
Yeah, and I mean, maybe, Lindsey, this is a better question for you. But, you know, from a marketing lens, obviously, like, there's a sales aspect and, like, a lost product, you know, calculation to be made, but there's also, like, the customer perception of your products. Like, if they see your product on shelves and something slipped through the cracks and
(20:41):
it looks bad, that's, like, you know, maybe one less customer that you have buying your stuff, or more, depending on how many people they tell or who they end up screaming at about it, you know.
Lindsey (20:52):
So that's funny. When I go grocery shopping, I, like, will see a bottle and the label's, like, messed up, and I've done a lot of label inspections over the years, so I was like, oh, like, you don't have a vision system. And then I know that what's inside is exactly the same, but I'll still grab the bottle that doesn't have the messed up label. Exactly. I don't know why. It's just funny, even, like, me knowing, yeah, this is how this
(21:18):
is, like, manufactured.
Andrew (21:18):
I still want the pretty one. Exactly. You know, it's an aesthetic part of your kitchen experience, or whatever it might be. I can't have my multipurpose...
Lindsey (21:24):
Cleaner looking silly in my kitchen.
Andrew (21:29):
Exactly.
So that actually makes me think of, I guess, the next one, we've kind of been circling around it, which is, I guess, the data aspect of it. So it seems like these are extremely data-driven systems that really rely on capturing as much as possible to report on
(21:52):
as much as possible. So I'm just curious what that data and what those insights look like, and how they can be utilized to further streamline your operations.
Spencer (22:03):
I can take this one. We're often wanting to track OEE of our systems, of our equipment, the overall equipment effectiveness, and one of the key aspects of that is the quality of the product, and it's one of the hardest parts of the equation to get data on. But the vision system solves that problem, and it can even go
(22:26):
further by classifying the quality by failure mode. So if I have a vision system integrated into a thermoforming machine and I notice that a certain part of the tooling is consistently giving a seal error, well, perhaps my tooling is failing, and it can give insight into predictive maintenance.
(22:49):
It also can let you know if there are other things happening upstream that are causing quality defects. And the key thing about it, about processing every package
(23:13):
and always having the data being tracked consistently, is when something does go wrong, you catch it almost immediately, depending on what your alarm structure is like, so you can react faster and not have as many defective products getting through.
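Spencer's failure-mode idea, spotting a tooling lane that keeps producing seal errors, can be sketched as a simple tally (lane numbers, result labels, and the log are all hypothetical):

```python
from collections import Counter

def seal_defects_by_lane(inspections):
    """Tally seal failures per tooling lane; a lane that dominates the
    count points at worn tooling rather than a random process upset."""
    return Counter(lane for lane, result in inspections if result == "seal_fail")

# A toy shift's worth of per-package results as (lane, result) pairs:
log = [(1, "pass"), (2, "pass"), (3, "seal_fail"),
       (3, "seal_fail"), (1, "pass"), (3, "seal_fail"), (2, "seal_fail")]

print(seal_defects_by_lane(log).most_common(1))  # [(3, 3)] -> inspect lane 3's tooling
```

Because every package is inspected, the skew toward one lane shows up after a handful of packages instead of at the end-of-shift QC review, which is the predictive-maintenance angle Spencer describes.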
Andrew (23:21):
Very cool.
I'm not sure if you had a point there, Lindsey. I was giving you a little...
Lindsey (23:27):
Oh yeah. You know, I've seen a lot of the data being important in medical devices as well, where they want to do track and trace of the same serial number throughout the whole line, and they want to see, you know, every single step along the way and have proof that, you know, when we sent this out, it was a good product and there's nothing bad. And so a lot of it is just them making
(23:51):
sure that, you know, for every product they send out, they can reinforce that: yes, we make good products, we send out good products.
Kind of on your point, Spencer, too. A lot of times I've seen people use vision systems as machine protection. So it's data where we're saying, okay, how much
(24:12):
glue is coming out? Is it enough glue? Is it not enough glue? And then they say, okay, actually, if there's not enough glue five times, we need to stop and add more glue, so we're not wasting more product. So it's not necessarily data that they care about on the product itself, but it is protecting the machine, or protecting against, you know, the wasted product that would have been
(24:34):
missed if the operator wasn't paying attention.
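The machine-protection pattern Lindsey describes, stop after five consecutive low-glue detections, boils down to a streak counter. A minimal sketch (the class name and threshold are illustrative, not any vendor's API):

```python
class GlueGuard:
    """Signal a machine stop after N consecutive low-glue detections."""

    def __init__(self, limit=5):
        self.limit = limit
        self.streak = 0

    def check(self, glue_ok):
        # A good reading resets the streak; a bad one extends it.
        self.streak = 0 if glue_ok else self.streak + 1
        return self.streak >= self.limit  # True -> stop and add glue

guard = GlueGuard()
readings = [True, False, False, False, False, False]
stops = [guard.check(ok) for ok in readings]
print(stops)  # [False, False, False, False, False, True]
```

Requiring a consecutive run, rather than stopping on the first bad reading, is what keeps one noisy measurement from halting the line.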
Andrew (24:37):
Yeah, that's a layer I didn't even think about. You know, like, a flow wrapping line. It's very common for, like, the film to kind of get ahead of itself a little bit and things get pinched in the wrong ways. So I could totally see a vision system all of a sudden be like, all right, the last 10 have been duds, let's, you know, shut her down, fix it real quick and then get things back up to speed. So that's a lens I didn't even think about.
(24:57):
Very cool. You know, I've talked with a lot of automation folks in the past. This is probably one of the more specific conversations I've had, because I would definitely say that this is a way to automate your line. One way or another, you're making things more efficient by adding some smart technology, and I think this one is incredibly granular, but I
(25:19):
find it really unique. But one of the things we always talk about when it comes to automation is labor. Right, labor is an expensive thing. When you implement a vision system, are you taking somebody, two people, ten people away from that line, or are they now switching to being operators of these vision systems? Or is that kind of a scary question to ask?
(25:41):
I don't know.
Lindsey (25:41):
I'm sorry, I kind of laugh, because I used to do sales in machine vision, and I would always call it: we can reallocate that resource to somewhere else in your production line. Because sometimes it does replace an operator standing there, but that doesn't mean that they can't do something else that's more efficient down the line. Or a lot of times it's just backup, checking the operators
(26:02):
as they're assembling things. They take a picture and make sure they didn't miss anything, and then they keep going. So it's not necessarily, that would be like a robot replacing the operator, but the vision system is just assisting them. So, you know, sometimes you do. You're never really replacing 10 people at a time, but, you know, the one or two people who are standing there,
(26:22):
you're either helping them out, or they can find, you know, become an operator in a different position.
Andrew (26:28):
Well, just to your point real quick, Lindsey. I mean, the turnover rates in those jobs are incredibly high, like, people don't stay there for very long. I mean, right in the beginning of the conversation you were talking about standing there looking at the same package all day. That sounds mind-numbing to me, just looking for small little errors here and there. So to then give that job to a smart tool, and then give, you
(26:55):
know, the person the ability to just operate that tool, just makes a lot more sense to me. And given you might only need, like, one operator of a vision system compared to maybe, like, five people manually inspecting, I'm not quite sure what the numbers are there, I'm literally just speculating, but I'm sure that's a big piece of the puzzle as well. I'm sure, you know, Spencer,
(27:16):
I mean, we talk about the labor and automation all the time.
Spencer (27:20):
Yeah, I think the jobs can be... There can be a lot of fatigue from inspecting something manually all day long. And I mean, I was just at a trade show and was passing a few packages through a vision inspection system, and I would pick up the package and try to see what the failure type was
(27:41):
without looking at the readout on the vision system, and it would take me a minute sometimes to figure out what was incorrect, because it's looking for the barcode, it's looking for the print inspection, the artwork, and I would have to look at it and just discern, you know, what's wrong on this. It's kind of like a Where's Waldo sometimes with label inspection, and you don't have to worry about that with the
(28:03):
vision system. So maybe it's better that we reallocate those resources to a different part of the plant, if it means that we're going to have more quality products getting through, and we're going to save on the waste of there being more rejects, and we'll no longer need to do a job that's very tedious, in a way.
Lindsey (28:28):
Yeah. Yeah, I also like to talk about the, you know, if we have three different operators looking at the same part, they're all going to call a good part or a bad part a little bit different. They're all going to have good training, and sometimes I've seen operators see things that I'm like, wow, I'm amazed you could, like, find that, because I couldn't. Um, but, you know, from person A to B to C, person A is going
(28:48):
to say this one's good, but person B might call it bad. So there's not quite the actual line that you can draw of knowing what's good and what's bad that you get from a vision system.
Spencer (28:58):
That's a great point. Just in the software, being able to tune the tolerance on your specific inspection that you're doing, to provide a window where we're saying this can be out two millimeters that way and two millimeters the other way, and anything greater than that, it's not good. Anything that's within that range, we'll allow it.
(29:21):
So you get complete control over what you consider to be quality. So I think that's a great point, Lindsey.
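The tolerance window Spencer describes, two millimeters either way from nominal, is just a symmetric pass/fail band. A minimal sketch (function name and measurements are illustrative):

```python
def within_tolerance(measured_mm, nominal_mm, tol_mm=2.0):
    """Pass/fail against a symmetric window: out two millimeters one
    way or the other still passes, anything greater is rejected."""
    return abs(measured_mm - nominal_mm) <= tol_mm

print(within_tolerance(101.5, 100.0))  # True  (1.5 mm off nominal)
print(within_tolerance(103.0, 100.0))  # False (3.0 mm off nominal)
```

Unlike three operators each drawing the line slightly differently, the threshold here is one explicit number, which is exactly the repeatable good/bad boundary Lindsey was getting at.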
Andrew (29:27):
Yeah, yeah, first thing that comes to mind is, like, on, like, the stretch wrap meat packages, they put the sticker on that thing wherever they want. You know, it's never in the same place, it's always somewhere different. So, like, whenever I'm at the grocery store, I'm, like, picking up the package, I'm, like, trying to see how much it is, and I'm like, oh, it's over here on this one. It's kind of funny. Uh, thank you both so much for sharing that.
(29:51):
I think that the labor aspect of everything is always really interesting to me, just as a marketing person and, I guess, myself in general.
But it makes me think of another kind of aspect, and what we were just talking about with, like, the stickers and the labels and everything, which is how much, like, a tolerance is, like, I guess, like, being adaptive.
(30:13):
But, like, how adaptive do these systems need to be? Like, I think of, you know, a bakery operation. They're running tons of different products all the time. So if you're ever running, like, different products through that system, like, through the same vision system, like, let's say, like, you're changing over product, is the vision system also able to change over that quickly as well?
(30:35):
That's a great question.
Spencer (30:38):
The more mix you have in your products, it definitely becomes more difficult to implement a vision system, just because it does need to adapt, and with the adaptability, it needs to recognize the change. And, like we said earlier, control and consistency is so important. So if you don't have control and consistency and you're just
(31:01):
changing the product on the fly, can it detect it? Well, if you've trained it to do so, it can. It's not natively intelligent, but a lot of the machine learning and AI allows you to pre-classify images of different products, so that when it sees that product, it recognizes that
(31:24):
product, and then it has its own recipe for that specific product to analyze it. But it definitely adds to the cost of the project when you have high mix, and to the difficulty of the implementation.
Andrew (31:39):
Thank you, Spencer, appreciate that. That was definitely a curveball, so I appreciate you taking a good swing at it. Lindsey, did you have something to say as well? I'm sorry. Oh no.
Lindsey (31:48):
I was just gonna say, I got out of, like, the sales and camera business before AI was a thing. So for us, when we answered that, it was: any new product you had, you had to make a new program, and then, when you put in, you know, product A is running, you put in product B, you change the program from A to B. And, you know, depending on how many
(32:12):
things you had, that was kind of part of the scope that I talked about with the pitfalls. Like, if you've got so many different products, you're having to build so many different programs, and some of them are easier than others. So yeah, I'm excited to see how AI can make this a more achievable thing, by just saying, here are all the good pictures, go from there.
(32:33):
In terms of lighting and stuff like that, it does make it harder when you do have a lot of product variability, because sometimes a white light will work on sample A, but, like, a blue light is what's needed for sample B. So that kind of becomes a challenge for us when people have a lot of different colors or sizes or something like that. We have to find one system that works for all of these things,
(32:53):
which isn't always easy to do. A lot of times, you know, you need two different lights, like, okay, we'll turn this one on for A and this one on for B, or two different stations, if it has to be that way. So, follow-up, arguably nitpicky question.
Andrew (33:10):
But, um, how do the lights actually run on the machine? Is it like a disco in there? Are they flashing every time they take a picture, or is it keeping a constant flash as the product runs through? Is it really trying to get a picture every time with, like, a flash, or is it just rolling it
(33:32):
through? I just asked the same question, like, three times.
Lindsey (33:33):
I'm sorry. Okay, um, it can be either one. You can have the light on continuously, so it's just on as the camera flashes. That kind of wears out your light a little bit faster, because we measure the lifetime of a light by how long it's on, and so a lot of times people will choose to synchronize the flash to the camera. So every time the camera, you know, there's a trigger from the
(33:55):
camera to take the picture, it also triggers the light to turn on, and you just have to make sure that those line up well so the image looks the same every time. So both: there's disco, and just a constant light.
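The timing Lindsey is describing, where the strobe is triggered off the camera so the pulse lines up with the exposure, boils down to a simple check: the light must be on for the whole time the shutter is open. A sketch, with purely illustrative numbers rather than settings from any real controller:

```python
# Sketch: check that a strobe pulse fully covers the camera's exposure
# window, so every image is lit identically. Timings are microseconds
# measured from the same trigger edge; the numbers below are made up.

def strobe_covers_exposure(strobe_delay_us, strobe_width_us,
                           exposure_delay_us, exposure_time_us):
    """True if the light is on for the entire time the shutter is open."""
    strobe_on = strobe_delay_us
    strobe_off = strobe_delay_us + strobe_width_us
    exposure_start = exposure_delay_us
    exposure_end = exposure_delay_us + exposure_time_us
    return strobe_on <= exposure_start and strobe_off >= exposure_end

# Strobe fires 5 us after the trigger for 50 us; shutter opens at 10 us for 30 us.
lined_up = strobe_covers_exposure(5, 50, 10, 30)     # covered
mismatched = strobe_covers_exposure(5, 20, 10, 30)   # light turns off mid-exposure
```

If the second case slipped into production, images would come out darker or unevenly lit depending on exactly when the light cut out, which is why Lindsey stresses making sure "those line up well."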
Andrew (34:09):
Nice. I guess I was also going to ask how long it takes for one of those to burn out, because obviously, like, I only use the light in my kitchen when I'm in there. If you're running a packaging machine or vision system all day, or, you know, weeks at a time, whatever it might be, I imagine you kind of burn through those pretty quick.
Lindsey (34:32):
For kitchen lights, sure. Machine vision lights, not as quick. So they measure the lifetime by how long it takes for your light, because, you know, all LEDs will eventually decrease in their intensity over time. No LED is just going to be at 100 and then turn off. So it's the amount of time it takes for your light to start
(34:54):
at 100% intensity and eventually degrade to about 80% intensity, and that's about 50 to 70,000 hours. Wow. Yeah, so factors in that are, like, how bright you're using it. Are you overdriving it? If you are overdriving it, are you giving it enough, like,
(35:15):
rest time between flashes? So there are ways that you can make it not last very long. But if you're using the light the way it's meant to be used, you can get a good 70,000 hours. And if you have it flash every time, and it's flashing at, you know, 30 microseconds, that's years: 30 microseconds at a time, every day, and you can get the lights to last for a really long time.
So, um, yeah, that's why we have two recommendations for people.
(35:38):
We do recommend strobing the light in synchronization with the exposure time of the camera, so it's just on less. But we also recommend starting your light at about 80% intensity. So, you know, we have our light that's at 100, right, at its peak performance, but we only have it set to 80. As the light degrades over time,
(36:01):
we just turn up the intensity of the light, so our image is still the same intensity, and we can extend how long it takes until we have to replace the light. So it's a really smart approach.
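Lindsey's 80% trick can be sketched as a small compensation loop: commission the light below full drive, then raise the drive as output fades so image brightness stays constant. The linear degradation model below is invented purely for illustration; real LED fade curves vary.

```python
# Sketch: keep image brightness constant as an LED ages by starting the
# drive level at 80% and raising it as output fades. The linear fade
# model here is made up for illustration, not a real LED curve.

def degradation(hours):
    """Relative output of the LED at full drive: fades to 80% at 70,000 h."""
    return max(0.8, 1.0 - 0.2 * hours / 70_000)

def led_output(drive_pct, hours):
    """Brightness the camera sees for a given drive level and light age."""
    return drive_pct * degradation(hours)

def compensated_drive(target_output, hours, max_drive=100.0):
    """Smallest drive level that still hits the commissioned brightness."""
    return min(max_drive, target_output / degradation(hours))

target = led_output(80.0, 0)                    # brightness at commissioning
drive_new = compensated_drive(target, 0)        # ~80: fresh light, headroom left
drive_old = compensated_drive(target, 70_000)   # ~100: headroom used up
```

The headroom is the whole point: a light commissioned at 100% drive has nowhere to go as it degrades, while one commissioned at 80% can be turned up gradually and keep delivering the same image.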
Andrew (36:13):
I was actually about to ask how that degradation of the light plays into the replacement of it, I guess, which is kind of to the point that you were making earlier about knowing when something might be wrong with the machine and fixing it right then and there. I'm sure once the pictures start getting, you know, darker, or whatever it might be, it's like, all right,
(36:34):
might be time to change the light. But that's, you know, that's almost like a MacGyver-esque fix right there. I like it. Yeah, good. Did you have a thought there, Spencer, at all? Sorry if I'm keeping you in the corner on this one.
Spencer (36:50):
No worries, that's definitely more Lindsay's area of expertise, the lighting. I know on our integrated lines the lights can be on consistently, and then for some of our standalone devices they strobe or trigger. So yes, we're using both, depending on the machine.
Andrew (37:14):
So I wanted to kind of get back into the labor realm a little bit, but talk less about the labor advantages that automation typically gives and more about that operator and operational training aspect. Or, you know, just the people using the machine. Like, how easy is it? Do you just walk in, and it's pretty much, hey, here's your vision system, you pat it on the hood, and
(37:34):
then you walk away? Or is there, like, a big training period that goes into teaching people how to use this equipment? And how long after it's installed on their lines would you say that they're, you know, effectively using it properly?
Spencer (37:50):
I can take this one. With our equipment that's designed for our thermoforming machines, the software is very intuitive. You can teach someone how to make new recipes, or adjust the recipes for different products, very quickly, with sliders and
(38:15):
adjustments made directly to the image that is captured. It comes back to what Lindsay said about the scope, though. If the system is designed to meet the scope, it has the built-in flexibility in the software to measure everything they're trying to measure, or capture what they're trying to capture,
(38:35):
and then the recipes can handle that. Sometimes, if I have a system that's designed to do label inspection, and all of a sudden someone's like, hey, can you do product inspection on that? Well, we can, but we're going to have to modify the software at a level that's not typically what an operator would be able
(38:56):
to handle. So it depends on what you're trying to do. But basic operation you can teach in a couple hours or less. The actual control of getting the system ready to run, understanding when there are failures, and understanding which pocket they correlate to; the actual operation of the machine
(39:19):
is very easy, very intuitive. It's customization that can get a little more difficult, but we like to make it easy to do in the recipe modification, and, depending on your user privileges, you can make those adjustments right at the HMI.
Andrew (39:45):
Very cool. Now, I also imagine that there's a layer of complexity in building the system so that it does have a friendly interface as well. Is that part of the challenge in building the vision system, kind of tailoring it to the client? Or is it really a very similar, standard interface that they use across all the vision systems?
Spencer (40:07):
It's definitely... That's why we have an engineering phase, and we design the software for their specific application, so we can handle anything they throw at it. The recipe adjustment, the tolerance adjustment, is able to be done on the fly, after the fact.
(40:28):
So definitely, defining scope is important. Cool.
Andrew (40:36):
And just a quick side note: I love that you call them recipes. I think that's very fun. I'm all about good naming conventions, and that's a good one. No offense to the engineering guys; they can't usually account for the creative, fun stuff there, but that's a good one.
(41:00):
Um, so before I ask you both, I guess, about what you're expecting for the future of vision systems, I wanted to talk just real quick about a little bit more on that ease of implementation. If you're coming to meet a customer for the first time, and they're, you know, framing that scope in the proper way, what does that process of going through and actually implementing that
(41:21):
vision system look like, from the time you meet the customer to that create phase, engineering phase, whatever it might be? I'm just curious what the timeline might look like. I imagine it differs by product and customer and scale and scope, like you were saying, Lindsay, but I'm just kind of curious what that process looks like.
Lindsey (41:41):
I think I'll let
Spencer take this one.
Spencer (41:46):
Okay.
So, I mean, we're engineering a system. So it comes down to defining performance criteria and having your constraints well defined ahead of time. So if the customer has a URS they can provide, that helps go a long way to say: what are you inspecting
(42:09):
for, and what are your tolerances on what you're inspecting? Because that's when we start speccing out the cameras for resolution. Because if they're just looking at label placement, in terms of orientation of the label on a larger package, well, you could, with a large degree of contrast, get pretty high accuracy with a lower-resolution camera.
(42:31):
If they want to look at a QR code that's only two centimeters by two centimeters in footprint, you're going to need a higher-resolution camera, just to have enough pixels correlating to the different little squares that make up the QR code. So it comes down to defining the problem, first of all.
(42:54):
From there, we spec out the equipment so that we are sure we can keep up with the throughput that is required, so that we can take the pictures fast enough and process them quickly enough. After that, once we're comfortable that we've specced a system that can handle their type of inspection, then we can
(43:16):
get a budgetary quote together, and we'll provide the budgetary quote or a formal quote, depending on how well we've defined the project. And if they decide to go with us, then we move into the engineering phase of checking to make sure that everything is correct. And then we'll release a dossier that defines, very specifically to the customer, what the
(43:40):
system is going to be, and then we start fabricating the system. And once the system is put together, we'll have a FAT, a factory acceptance test, where we prove that it can handle the quality inspections with the
(44:00):
proper performance that they've requested, and that the system works. And then from there we will deliver the system, and it'll get installed on site. Sometimes they'll do a site acceptance test as well. And the whole process, in terms of timing: the quote
(44:25):
process can have a quick turnaround if we can define the problem quickly enough, so that can happen in under a month, or a couple of weeks. Let's say, if it's a standalone system that's doing a very typical application or a typical type of inspection, we could have a quote in under a week. But the actual installation, if it's on a line that has a lot of
(44:47):
automation equipment, can take months because of the other equipment you're integrating it with.
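Spencer's resolution example, having enough pixels on each little square of the QR code, is easy to put rough numbers on. In the sketch below, the three-pixels-per-module floor is a common rule of thumb rather than a figure from the episode, and the field-of-view sizes are made up:

```python
# Sketch: pixels needed along one axis of the field of view so each
# QR-code module (one of the little squares) gets a minimum number of
# pixels. Rule-of-thumb math only; real speccing also weighs contrast,
# optics, and working distance.

def required_pixels(fov_mm, feature_mm, pixels_per_feature=3.0):
    """Camera pixels needed across the field of view for this feature size."""
    return (fov_mm / feature_mm) * pixels_per_feature

# A 2 cm QR code with 25 modules per side: each module is 0.8 mm wide.
module_mm = 20.0 / 25

small_fov = required_pixels(fov_mm=100.0, feature_mm=module_mm)  # ~375 px
wide_fov = required_pixels(fov_mm=400.0, feature_mm=module_mm)   # ~1500 px
```

The same 0.8 mm module that a modest camera resolves over a 10 cm field of view demands four times the pixel count once the field of view covers a whole large package, which is exactly why the small-QR-code case pushes you to a higher-resolution camera.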
Lindsey (45:00):
I also want to jump in there. When you talked about speccing the system, getting the right camera resolution: on our end, it's getting the right lighting as well. If you're just looking at a barcode, it's pretty simple with just a bar light. But if you want to see a specific defect, sometimes you might not have the exact light that it needs to solve that
(45:21):
defect, and then people come to us. That's when CCS gets involved. People say, hey, I have this product, I need to see this. Can you help me pick a light for it? So then we'll say, yep, you need this light at this working distance. Here's your quote. You'll toss it into your quote and then keep going on. If that doesn't happen, and you're at the engineering phase, and all of a sudden it's not working, maybe on site or something... I've gotten called in before where
(45:42):
the machine needed to be delivered, and they had everything, and it wasn't working the way it was supposed to work, which I brought up in the webinar. In that scenario, they had been testing with parts that had oxidized over time. So the contrast, when they were testing in the lab, was a lot higher than with the fresh, you know, freshly made pieces that were coming off the line, and so we had to kind of retrofit
(46:02):
and figure out a new solution, which usually doesn't happen. I think with what you guys do, it probably doesn't come up that often. But yeah, bringing in lighting early, during that speccing phase, and making sure you've got the right light there really simplifies it later on in the engineering phase.
Andrew (46:17):
Yeah, thank you both so much. I wanted to ask you, Lindsay: you know, we have the models of machines where it's like, hey, this is really good for meat packaging, or this is really good for bakery. Do you have different kinds of lights where it's like, hey, for dry goods this is exactly what you want, but for something that's got a little bit of a glisten to it, like a poultry or
(46:40):
a meat product, you want to use this kind of light instead? Is there really that kind of apples-to-apples comparison there?
Lindsey (46:56):
For a few. Like, a bar light is going to be kind of for anything. But we have this flat dome light, the LFXV, and it just does an amazing job at, like, flattening images. So if you're trying to see, like (I actually have one here, which is really silly), if you're trying to see the barcode on this can of food, it can be hard when you've got the different shadows. You know, you might, like, lose a letter at that height change. But the LFXV just, like, flattens it so well, and it does it perfect every single time, and it's, like, my favorite machine vision light, because the collimation is really cool.
(47:18):
I didn't think I'd ever have, like, a favorite machine vision light growing up, but here we are. Um, so, you know, there's some where, if you're trying to flatten things, you use, like, a dome light. If you're trying to see height changes, you use, like, a low-angle light. So there are principles to it, the physics of light and how it works. But the LFXV is just always really good at, like, can
(47:38):
inspections; that's what we go to. Or, like, looking at curved surfaces and making the light wrap around so you can see more of it. They do really well on that.
Andrew (47:47):
And I apologize if neither of you can answer this question effectively, but does using, like, a different camera often help sometimes? Is there an advantage to having a fisheye lens sometimes, versus a normal standard-definition one, maybe?
Spencer (48:07):
I know in our thermoformers, if we're doing a seal inspection, it's often very beneficial to use a telecentric lens, and otherwise we're just using other standard fixed-focal-length lenses. But the lens selection is important, just because of the telecentric lens.
(48:27):
I can't get into the details of exactly why it works, but it essentially captures the rays of the LEDs, the light, all parallel to one another. It doesn't have the distortion that you get from another lens.
Andrew (48:47):
We just had a friend join us.
Lindsey (48:51):
I tried to push him away. It didn't work.
Andrew (48:52):
You can't stop cats on a counter. They will knock over whatever is the most threatening object up there.
Lindsey (48:59):
Yeah, he knocked over a plant yesterday. That was super fun. Sorry, hopefully we can cut that.
Andrew (49:05):
No, it's all good.
It sounds like Spencer is aboutto break an NDA anyway, so I
think that was for the best.
Lindsey (49:13):
I can touch on some lens stuff, though. Yeah, sure. You know, you talked about telecentric and using other fixed focal lengths. Even within fixed focal lengths, like, an 8mm lens will kind of look down a cup a little bit more, while, like, a 50mm will be a little bit more telecentric and, like, look kind of more at the top of the part. So, you know, telecentric versus fixed focal length is important.
(49:36):
But also, an 8 millimeter versus a 50 millimeter lens will give you a totally different image as well. So the camera, not as much necessarily, but, uh, definitely the lens is an important factor as well.
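For what it's worth, the 8mm-versus-50mm difference Lindsey mentions falls out of the pinhole camera model: the angle of view depends on sensor width over focal length, so a short lens sees around the sides of a part while a long lens looks more squarely at its top. A quick sketch, assuming a roughly 7.2 mm wide sensor (a made-up but typical machine-vision size):

```python
import math

# Sketch: full horizontal angle of view under a pinhole approximation.
# The 7.2 mm sensor width is an assumed, typical machine-vision value,
# not a figure from the episode.

def angle_of_view_deg(sensor_width_mm, focal_length_mm):
    """Full horizontal angle of view, in degrees."""
    return 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_length_mm)))

wide = angle_of_view_deg(7.2, 8.0)     # ~48 degrees: sees down the sides of a cup
narrow = angle_of_view_deg(7.2, 50.0)  # ~8 degrees: closer to a telecentric view
```

Roughly a six-fold difference in angle of view from the same sensor, which is why the two lenses give "a totally different image" of the same part.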
Andrew (49:51):
Well, I really appreciate you both sharing that. The lens and lighting aspect is what really interests me about vision systems, and then, I guess, using that to leverage the smart data around it, which I find really interesting as well. But I'm curious, from both of you: what do you think the future of vision systems looks like? Whether that's future
(50:13):
developments within these applications, or cameras getting better, machines getting faster. What do you both see happening, from the lens that you each come from, and, I guess, what do those changes mean?
Lindsey (50:29):
You can start, Spencer,
if you want.
Spencer (50:32):
I definitely think it goes back to your question about the adaptability of the system. With advancements in AI and machine learning, I think the software will be able to adapt more readily, without having to do manual selection of recipes. So I think that's probably going to be the most significant
(50:54):
change. And then the other thing is potentially more 3D kinds of vision technologies that are actually understanding the depth of the product, depending on what type of camera it's using. So that's what I would say.
Lindsey (51:17):
All right, cool. I would definitely agree with that on cameras; I'm really excited to see what AI does. For lighting, I've kind of noticed people just want it to be easier and easier. You know, just let me plug my light in and have it do what it wants to do and, like, have it work. So, um, I think having, you know, maybe smarter lights that
(51:39):
have built-in drivers, stuff like that, um, will be a thing of the future. Just trying to make lighting as easy as possible for people is what they want out of it, which isn't surprising, to be honest. No, not at all.
Andrew (51:54):
I mean, I know one of the things that we talk about all the time at Harpak, and it's, like, one of our big selling points, is that you're not just buying, like, a machine; you're buying, like, a smart, connected ecosystem of equipment. And I think that as vision systems adapt (from my 50 minutes of knowledge that I have now), we'll probably start to see a
(52:15):
lot more of that, you know, connectedness between all of the lines, and not just within your own operations. But imagine if you have operations around the country, right, and you can take, you know, your thermoformer application that's out in California and start to compare imagery with a very similar solution from over there. And then, when you decide that you want to implement that same
(52:38):
vision system in a new location, it's way easier, because the bones are already there. So I'm sure that already exists somewhat now, but I am curious to see how that changes going forward.
Well, I think that takes us to the end of our episode, unless either of you have any, like, burning-hot topics in
(53:00):
vision or lighting that you wanted to share. You'd think, but I don't! Well, I'll make sure I get your Mount Rushmore of favorite lighting systems. I think you said it was the LX9 that was your favorite? LFXV. LFXV.
Lindsey (53:17):
Sorry, my bad. Yeah, you've got to get to play around with it, man. Move the lighting working distance; it'll change your whole image and your world.
Andrew (53:24):
I'll play around if I can get my hands on one. Well, I appreciate you both so much for jumping on this call today. This is definitely a little bit more outside of my wheelhouse than a normal episode, but I definitely learned a lot, and I appreciate you both dumbing it down for me, but also just, like, sharing your perspectives and giving such an honest take on
(53:46):
the industry, what to expect, and what some of those pitfalls are. So, yeah, I really appreciate you both so much.
Lindsey (53:53):
Thanks for having me too. It was great to be here and share, you know, information I've gathered over my whole career. So, yeah, thanks for letting me talk about it too.
Andrew (54:03):
For sure.
Spencer (54:04):
Thank you for having me on, Andrew. It was great talking with you, Lindsay and Andrew.
Andrew (54:09):
Yeah, don't worry, Spencer, you're not off the hook forever. I will get you on another episode eventually. Again, Lindsay's probably good until year five of Wrappy Hour. But if you have not tuned in, if you're not familiar with CCS, definitely check them out. Check out that webinar that they did about vision systems and the common pitfalls. It's really insightful and engaging, and it was part of what inspired me to want to do this episode.
(54:30):
So I really appreciate CCS for putting that on, and I hope that you learned something today. Remember that episodes are on Spotify, Apple Music, and all those streaming platforms, and also available on our website, right next to our newly launched webinar series. So feel free to check that out, and we hope to see you on the
(54:53):
next one. Thanks so much for listening, and have a good one.