
July 14, 2025 23 mins

What if you could track a plant’s health and growth every 15 minutes, all automatically and without ever touching it?

In this episode of Inside IALR, Dr. Scott Lowman, Vice President of Applied Research at the Institute for Advanced Learning and Research, explores the SMART Platform—IALR’s Spatially and Mechanically Accurate Robotic Table system. These high-tech tables combine robotics, precision imaging and automation to capture tens of thousands of data points per experiment, helping researchers analyze plant growth, stress response and even subtle movements in real time.

Learn how the SMART Platforms allow for entire plant life-cycle testing for beneficial microbes and enable real-time monitoring of plant health. You’ll hear about how interns have played a central role in coding and refining the system, how companies can contract research on the tables and how this technology is helping lay the groundwork for more sustainable agriculture.

Whether you’re into agtech, robotics, plant biology or data science, this episode connects it all. Plus, you’ll hear how this one-of-a-kind platform is opening doors for students and researchers alike.

🔍 Topics Covered:

  • What makes SMART Platforms unique
  • How 80,000+ images become meaningful plant health data
  • Intern-driven innovation in Python and computer vision
  • Industry collaboration and commercialization opportunities
  • The future of AI in agriculture and early stress detection

The Institute for Advanced Learning and Research serves as a regional catalyst for economic transformation in Southern Virginia. Our services, programs and offerings are diverse, impactful and far reaching.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Caleb Ayers (00:09):
Welcome to another episode of Inside IALR.
We've had a few weeks off, or I guess more than a month off at this point, just with the summer holidays and vacations and all of those sorts of things, but we are back and ready to keep telling the stories of what's going on here in Southern Virginia, what's going on here at the Institute for Advanced Learning and Research. Today we have Dr. Scott Lowman, who's

(00:30):
our Vice President of Applied Research, here. Dr. Lowman, thanks for being here.
Thank you. You are by far one of our most persistent podcast guests. I don't even know what number time this is for you, but you have been an excellent guest every time you have been here.
So the main thing I wanted to talk to you about today is our plant imaging platforms that you have helped run for, I

(00:51):
mean, I think, 10 years at this point. So tell us a little bit about them. I know they're called SMART platforms. We love acronyms here. Tell us what a SMART platform is.

Scott Lowman (01:01):
So our SMART tables, or SMART platforms: the acronym stands for Spatially and Mechanically Accurate Robotic Tables, and what that means is they are extremely precise in their location. So if we program it to go to a certain place (the tables are five foot by ten foot), just for instance, imagine that we have 20 plants

(01:25):
on the table, and we can program it to go to each one of those plants. And when you say "it," you mean a camera? Yeah, I'm sorry. Yes, the head is called a gantry robot. We can program that gantry head of the robot to go precisely to each of the 20 plants. The head has a camera mounted to it. It can be changed; it can be different types of cameras, and

(01:49):
it captures that image. The really defining factor about the tables is that they're accurate to within about a thousandth of an inch, so the head goes almost exactly to the same spot each time. What that gives us the ability to do is capture each image from the same spot, and eventually we can time-lapse those images.

(02:12):
It looks like there's a camera above the plant the whole time, so it seems like there are 20 plants and 20 cameras on that table.

Caleb Ayers (02:20):
But it's just the one that's going back and forth and taking the pictures of each one.

Scott Lowman (02:24):
Yep, and it does that. It can do it every 15 minutes. And what that does is it gives us lots and lots of data for plant science experiments. In the old days, you'd plant the plant, you'd water it, you'd maybe take some measurements with rulers or tape measures or something, and then, at the end of the experiment, you would pull the plant up and weigh it.

(02:45):
This gives us a lot more data in between. It really gives a lot of insight into how the plant grows and behaves under different conditions.

Caleb Ayers (02:55):
So where did this idea come from?

Scott Lowman (02:56):
The idea initially started at Virginia Tech with Dr. Jerzy Novak and Al Wicks from the Mechanical Engineering Department. Al Wicks specifically had worked with imaging for the military, and they used those tables to image different things that the military was working on.

(03:16):
So he and Jerzy Novak, who was a former head of the Department of Horticulture and also an important member of the original team that set up the facility here at the Institute, saw an opportunity with them in plant science. The former director here at the Institute was Barry Flynn. He was involved in that as well, and those tables were brought

(03:40):
down about the time I finished my PhD work here at the Institute. So I was fortunate enough to be able to develop those through my postdoc work and then continue to develop them. The initial tables and the initial concept were to put them into greenhouses, and we found out pretty quickly that they weren't really meant for high-humidity environments, and so

(04:05):
they started to rust and different things, and so about six years ago we completely redesigned them. We designed them to be mainly aluminum, rubber or stainless steel, so they aren't susceptible to those same challenges with humidity.

Caleb Ayers (04:24):
So you mentioned at the beginning that this camera's going through every 15 minutes, taking a picture of every plant. Tell me about what a typical experiment looks like. What kind of data is it collecting? How much data is it collecting? What's happening with that, and then what are you doing with that information?

Scott Lowman (04:38):
So many of our experiments are focused on plant biostimulants, looking at which different types of microorganisms can increase plant growth. So if you think about that type of experiment, it starts out as we start a lot of little plants. We pick plants that are uniform in size because we want to

(04:58):
start off pretty similar with all of them. So say it's 80 plants. We may start 160 plants and pick 80 that are almost exactly the same size.

Caleb Ayers (05:07):
And that's because the table can handle up to 80, right?

Scott Lowman (05:10):
Even more in reality, depending on the type
of systems.
Back when we first started, wewere using floating tobacco
trays which about a one foot bytwo foot tray could hold 200
plants.

Caleb Ayers (05:21):
Oh, wow.

Scott Lowman (05:21):
So at the time, we could get about 2,400 plants per table. But today we mainly focus on individual plants. Each plant gets an image of itself, and that makes each plant a sort of experimental unit, so we can take averages from five different plants and put them

(05:41):
together and do statistics on them. That's harder to do when you have populations, right? That makes sense.

Caleb Ayers (05:47):
So you're collecting that data over the experiment. So then, what data are you collecting? What's going on there?

Scott Lowman (05:55):
Yeah. So, I should have finished last time, I apologize. After we plant the plants on the table, we program the table to go above each plant. The neat thing about the tables is that when we start the table and it goes to the first plant, it creates a folder for that plant. So the image of that first plant creates a first folder, and

(06:19):
then that plant all alongduring the whole experiment has
its own folder on your desktopand the images go into it and
then the tables run, the plantsgrow, images captured of every
plant every 15 minutes, and thenat some point we can introduce
an experimental variable,whether it be a biostimulant, so

(06:39):
a microbe that may increaseplant growth, or maybe something
else that may increase plantgrowth, or maybe something else.
It may be taking water away, ormaybe introducing some other
nutrient or something thatwasn't there before, or taking
another nutrient away.
What happens is that makes itdynamic.
So the plants are growing,they're happy.
You've got a population of 80plants.

(07:01):
They're very happy and growing, and then you introduce something. You introduce something to about half of those plants, so 40 plants have something introduced to them, and then we're able to see precisely what happens with those 40 plants versus a plant that doesn't have that variable. That gives us a lot of power, because we can tell almost immediately what's

(07:22):
happening.
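
The capture loop Scott describes, where the gantry visits each plant and files each image into that plant's own folder, can be sketched roughly like this. This is a minimal illustration, not IALR's actual control code; the `capture_image` callback standing in for the gantry and camera hardware is a hypothetical placeholder:

```python
from datetime import datetime
from pathlib import Path

def capture_cycle(plant_positions, capture_image, base_dir="smart_table_run"):
    """One imaging pass: visit each plant and save its image into its own folder."""
    base = Path(base_dir)
    for plant_id, (x, y) in plant_positions.items():
        folder = base / f"plant_{plant_id:02d}"
        folder.mkdir(parents=True, exist_ok=True)  # first pass creates the folder
        image_bytes = capture_image(x, y)          # move the gantry head and shoot
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        (folder / f"{stamp}.jpg").write_bytes(image_bytes)
```

A scheduler would call a routine like this every 15 minutes for the length of the experiment, so each plant's folder accumulates its own time series of images.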

Caleb Ayers (07:22):
Yeah, and as you said, with traditional measurements you're more focused on what's happening at the end, whereas with this you're getting almost real-time data every 15 minutes to show when those effects are taking place. I imagine that's a lot of what you're looking at.

Scott Lowman (07:36):
It is, and a lot of valuable information can be contained in that data set. So, for example, we've had some biostimulants that really promote a lot of plant growth, and when they do that, as the plants get larger, they require more water. Well, typically we don't try to adjust for that water need.

(07:57):
We keep giving them the same amount of water. But imagine the larger plant starting to run out of water. It'll stop growing as fast. So when you go back and look at the data and the curves of the plants, the plotted-out data itself, you can see it right away: the plant gets really big really fast, but then it stops

(08:19):
growing and it comes down, and by the end of the experiment it's the same size as the rest of the plants. Now, what's important about that is, if you were just weighing it at the end, you would think nothing happened. But with all this data, we can see that something did happen, and we need to go back and focus on that component of it.

Caleb Ayers (08:36):
So you get data that before was missed, right. And you mentioned that each plant has its own folder on the computer, so if you're running 80 plants, that's 80 separate folders, and you're talking about a picture of each every 15 minutes. How long is a typical experiment? Several weeks, right? Most experiments are three or four weeks, okay. So if you're talking about an image every 15 minutes over the

(09:02):
course of four weeks, that's hundreds and hundreds of pictures for each plant.

Scott Lowman (09:03):
Yep, yeah. Some experiments generate about 80,000 images.

Caleb Ayers (09:07):
Wow, okay, so 80,000 images. I know you guys don't have time to manually sort through 80,000 images, so tell me what happens with all of those images, with that data, that makes it usable information.

Scott Lowman (09:20):
Yep. So we use Python and its computer vision libraries, and sometimes we also use an NIH platform called ImageJ to analyze the plants. What that software does (it was developed in-house, mainly by interns) is it goes into each folder, looks

(09:45):
at each image, separates the green plant from the background, counts the number of pixels that the plant takes up in the picture, and then puts that data into an Excel spreadsheet, so we can tell exactly, down to the pixel, how fast the plant is growing. So it's not a measurement like inches; it's the number of

(10:07):
pixels.
The more pixels, the bigger the plant is. And we've gone back and done experiments where we show and prove that, yes, plants with more pixels are bigger plants; they weigh more.
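
A stripped-down version of that green-pixel measurement might look like the following. This is a sketch under stated assumptions (RGB images as NumPy arrays, a crude green-vs-background threshold), not the in-house software itself:

```python
import numpy as np

def green_pixel_count(image):
    """Count pixels where green dominates, i.e. pixels likely belonging to the plant.

    image: H x W x 3 RGB array. The threshold below is an illustrative guess;
    real segmentation would be tuned to the table's background and lighting.
    """
    r = image[..., 0].astype(int)
    g = image[..., 1].astype(int)
    b = image[..., 2].astype(int)
    mask = (g > r) & (g > b) & (g > 60)
    return int(mask.sum())
```

Run over every image in a plant's folder, something like this yields one pixel count per timestamp, which is the growth curve Scott refers to.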

Caleb Ayers (10:20):
That baseline, you're saying, has already been proven: this technology works to measure plants this way.

Scott Lowman (10:25):
Yeah, and it depends on the specific plants that we look at; there are some plants that are not as appropriate. For plants like grasses that grow straight up and are slender, it probably doesn't catch data as well. But most other plants have leaves, and when plants grow they're looking for the sun, so their whole purpose in life is to expose as much

(10:50):
of their leaf surface to the sun as possible. So that's a good measure of plant size, right?

Caleb Ayers (10:55):
You mentioned interns. Tell me about the work that interns have done. I know you bring in interns every summer to tweak and improve and work on these things. So tell me about some of the enhancements, the improvements that interns have worked on on these platforms over the summers.

Scott Lowman (11:11):
I wish I could take a lot of credit for developing the software side and even the control side, but almost all of that credit should go to interns and other students we've had working on those tables. I'm an older person, you know; I grew up in the 70s. I'm not scared of computers. I had a computer back then.

(11:32):
But to be able to really dive into Python: Python is a language, and it's something that younger adults are oftentimes able to grasp and use more easily. So over the years, the control system, how to sort the images

(11:56):
into folders for each plant, the graphic user interface (what we call a GUI) that we've created to run the tables: all of that is straightforward now, and it's easy to use. This summer we have two interns working on it who are even making the analysis part very easy to use.

(12:17):
Before, you had to really program; you had to put it into the code itself. Now there's going to be sort of a graphic user interface where you just pick your folders and it does all the analysis itself. So practically all of the software side has been developed by either interns or students.

Caleb Ayers (12:36):
Yeah, that's really cool, because that's a meaningful experience for them, that they can walk away saying they contributed to a very cool project. I'm sure every intern that walks in there, when they learn about that, is excited about what they get to work on. So you mentioned that a lot of this goes with our biostimulant and biocontrol agent research, where we're, you

(12:56):
know, looking at how those microorganisms that we have in our Plant Endophyte Research Center impact plants, again not just at the end but over the whole process of their life. But I know we can also run experiments for companies when they have products that they want to test. So tell me about that side of it. If a company is interested in commissioning an experiment on our SMART platforms, what does that look

(13:18):
like?

Scott Lowman (13:19):
Well, first of all, I can tell you that almost all companies that come through the center and are interested in contracting us for plant science research are interested in those tables, because it's such a unique platform. Not only does it provide you specific data all through the growing process, it also gives you lots of great images to use

(13:40):
in marketing and in sales. So imagine being able to not only show a new customer that, of course, your product makes the plant larger, in data or in charts, but also being able to show them time-lapse images of the plants getting bigger versus the control plants.

Caleb Ayers (13:59):
It really is a rich platform for generating data, both on the solid data side and also images and material for marketing. And you mentioned a few variables, obviously: giving a different treatment, putting a different product on the plant, reducing or increasing water.

(14:27):
What other kinds of variables can you test with these?

Scott Lowman (14:29):
Mostly it's nutrients, and the different types of biostimulants we use. So we use some biostimulants that increase hormone levels in the plants to make the plants larger. Some of them fix atmospheric nitrogen. Some of them provide other nutrients to the plant, like phosphates. Those are the main ones so far. We haven't really gotten into lighting yet, so changing the different wavelengths of light, we could do that.

Caleb Ayers (14:51):
You could do that for, like, different sections of the table? You could change the lighting?

Scott Lowman (14:55):
No, what we would do is change the light: different tables would have different lights on them, and we'd look at how the plants grow. There are other things, too. The range of different things we could do is almost infinite, because you can think of the different types of cameras you can use. You can even think of experiments

(15:16):
like, people say plants grow better when they have music playing, so we could look at that if we wanted to. Most interestingly, though, we've been looking at plant movement. So as plants grow, they kind of move back and forth. We hypothesize that that's just a natural way for them to find light and shade.

(15:37):
So most plants, when they're young, are in shade of some type, so they're always looking for light. But we've been able to relate and connect that movement to plant health. So a healthy plant, one that's growing more, moves more. If we introduce something to that plant, like a drought or a pathogen, the movement stops almost immediately.

(15:59):
So it's a great indicator ofplant health.
Even before we can see theplant turning yellow or
shriveling up or whatever youwant to call it, we can see that
that plant, something's goingon and we need to fix it.

Caleb Ayers (16:12):
And if you're tracking plant movement as opposed to plant size, that's a whole different thing that the computer's looking for at that point, right? That's a whole different software set.

Scott Lowman (16:21):
It's a whole different code that's developed to be able to do that.
We've done that multiple different ways; there are lots of ways. Just looking at two subsequent images compared to each other, and how many pixels are different, is one way to look at it. Another way we're looking at it more recently is tracking the tips of the plant. So as the tips are moving around, we can quantify that.

(16:45):
And, of course, for plants that are moving less, their tips move less. Right.
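
The first approach Scott mentions, comparing two subsequent images and counting how many pixels differ, can be sketched like this (a minimal illustration assuming grayscale frames as NumPy arrays; the threshold is an arbitrary example value, not IALR's):

```python
import numpy as np

def movement_score(prev_frame, next_frame, threshold=30):
    """Fraction of pixels whose brightness changed noticeably between two frames."""
    diff = np.abs(prev_frame.astype(int) - next_frame.astype(int))
    return float((diff > threshold).mean())
```

Applied to each consecutive pair of a plant's images, a falling score would flag the drop in movement that Scott links to drought or pathogen stress.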

Caleb Ayers (16:49):
Yeah, that's fascinating. And we've already touched on this some, but as far as the SMART platforms themselves and the types of information they let you get: you said with traditional research you're generally looking at maybe a few hand measurements throughout the process and then the weight at the end. With this, you're looking at a whole life cycle's worth of data.

(17:10):
What other insights and analysis are you guys able to do once you have that data set? Obviously, you said the computers spit out, okay, here's how many pixels are in each image, but what are you able to do with that information?

Scott Lowman (17:23):
We're able to plot it. That's the main thing. We can plot it and see and determine at what point there are statistical differences. So again, in the old days you would just have the weight at the end, and you'd determine whether that weight was different, whether the plants were larger or smaller or whatever, from the control group.

(17:43):
In this way, we can look at it and tell when that happens within the plant's growth. So it's not just at the end; you can tell that on day 32 those plants are significantly different. So we can tell how fast the response is. Again, that's a dynamic type of measurement that's happening over time. We can start experiments with all plants the same and introduce something to them.

(18:04):
We can also start an experiment with all plants the same and take something away from them, whether it be water or nutrients or something like that. So it just gives a ton of data that's really nicely illustrated, and it's something that everybody we present it to says is unique. So it's not just us saying it's unique; it's others saying it's

(18:27):
unique.
Virginia Tech; we've got one being ordered now by a community college partner to use. We've been in talks with another institute of technology (I don't want to say which one specifically), but they are interested in the table as well. And I think, as we continue to refine the table to make it easier

(18:50):
to operate (not necessarily easier to grow), we're going to have more of these opportunities. And what that does is give us a network of collaborators that we can then leverage for grants and other things as well.
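
The day-by-day comparison Scott describes, finding the first day a treated group becomes statistically different from the control, can be sketched as below. This is a simplified illustration, not IALR's analysis code: it uses a rough Welch-style t statistic with a fixed cutoff of about 2 in place of a proper significance test:

```python
from statistics import mean, variance

def first_divergence_day(control, treated, t_cut=2.0):
    """Return the first day index where treated vs. control differ noticeably.

    control, treated: lists of growth curves, one list of daily pixel counts
    per plant. |t| > ~2 is used here as a crude significance cut.
    """
    for day in range(len(control[0])):
        c = [curve[day] for curve in control]
        t = [curve[day] for curve in treated]
        se = (variance(c) / len(c) + variance(t) / len(t)) ** 0.5
        if se > 0 and abs(mean(t) - mean(c)) / se > t_cut:
            return day
    return None
```

On growth curves like the pixel counts described earlier, this picks out the "day 32" moment Scott mentions: the first timepoint where the treated plants pull away from the controls.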

Caleb Ayers (19:06):
And I know you recently went to Georgia Tech to talk about this. When you go to these types of events to talk about this specifically, this plant imaging platform, what's your pitch? As far as what makes it unique, why would companies or educational institutions or technology institutions, why would these different organizations want to get in on this?

Scott Lowman (19:28):
Yep. So the main thing, and one of the unique things about the Institute and our research program, is that we're multifaceted. We have plant scientists, and we're also working with robotics and computer vision. Many of the researchers I interact with are either engineers, computer scientists or plant scientists. The plant scientists don't do a lot of robotics.

(19:49):
The engineers don't have the plant science expertise. So when I go to these types of conferences, I see engineers with just wonderful ways to image plants and gather data. It's really incredible. And then you mix in artificial intelligence, and now it really takes it to another level. However, they don't have good controlled plant science

(20:12):
experiments, with plants handled as a plant scientist would know how to do. So what we do, we're able to do both. We're able to have really good experiments that are controlled. We can use statistics like randomized complete block design.
We can really do a lot on the data side, and then we have all

(20:32):
those images to go with it. And what we find is that those computer scientists and engineers are very interested in those data sets. So it's something that we can use and leverage for collaboration, mutually beneficial collaboration.

Caleb Ayers (20:47):
Yeah, that's really cool. And as you were talking about the difference between the plant science side and the engineering side, and you all being able to bridge that gap and have both, I think that speaks to the Institute for Advanced Learning and Research as a whole, as we see ourselves as being able to bridge the gap between a lot of different types of organizations and a lot of different

(21:09):
industries, public and private sector. All of those things where we can bridge the middle and help connect things that usually are a little more separate. That's all the questions I have. Like I said, I think these things are very cool. Every time I go in one of the labs and watch the camera roll and look at those time lapses that you all put together, it's just a really, really cool piece of technology.

(21:30):
But I mean, is there anything else that you would want to add or think it's important that people know?

Scott Lowman (21:36):
It's always important to finish with the big picture, and the big picture is: as we're developing these tables, we're developing the computer vision, and of course we can eventually feed those into artificial intelligence models as our program continues to grow. Just think about, in the future, a robot being able to go through a field, sit still for a few moments and determine if a

(21:57):
plant is healthy or not, based on its movement or based on how fast it's growing per minute, something that, while farmers are terrific, the human eye can't really detect. We have to have a computer condense that data to get that information.

(22:18):
So it's really a pathway to early detection of plant stress, and if you can detect plant stress early, you can fix it, and then you don't lose yield. We have tremendous challenges, both here in our country and worldwide, with feeding the population that's coming. So by 2050, they estimate the world population is going to be

(22:40):
about 10 billion people.
To feed them, we need to almost double agricultural production per acre. So it's all hands on deck in agriculture, focusing on that, because it's only about 25 years away, and in the world of agriculture 25 years is not that long. So it's just another tool to help with sustainability and to increase yield to feed the global population.

Caleb Ayers (23:04):
Yeah, well, again, it's all about impact. I appreciate you bringing that in. So thanks for being here. Appreciate it. Thank you.