Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Tom Hagy (00:01):
Welcome to the Emerging Litigation Podcast. This is a group project driven by HB Litigation, now part of Critical Legal Content, and vLex Companies, Fastcase and Law Street Media. I'm your host, Tom Hagy, longtime litigation news editor and publisher and current litigation enthusiast. If you wish to reach me, please check the appropriate links in
(00:23):
the show notes.
This podcast is also a companion to the Journal of Emerging Issues in Litigation, for which I serve as editor-in-chief, published by Fastcase Full Court Press. Now here's today's episode. If you like what you hear, please give us a rating. Today we're going to talk about what happens litigation-wise
(00:45):
after you've violated biometric privacy laws. From fingerprints to face scans to DNA, an individual's biometric information is being increasingly collected and increasingly used in an increasing number of ways. It's used in security and authentication for smartphones and other devices through fingerprints, facial recognition
(01:07):
, iris scans. Did I say Irish scans? Excuse me, I meant iris scans, as in eyeballs. Nothing to do with our Irish friends. That would be discriminatory. Biometric information. It's used for access control to secure areas and buildings, like
(01:27):
offices, labs and data centers. It's used in financial services to secure customer information, for identification and to authorize transactions. It's used in law enforcement and public safety, like fingerprinting and facial recognition and DNA profiling to identify suspects and solve crimes. It's used in border patrol to verify the identities of
(01:51):
travelers through passports with embedded biometric chips, e-gates and visa applications. It's used in healthcare to identify patients and control access to sensitive medical records. It's used by employers to track time and attendance and for workplace safety, restricting access to spaces designated for
(02:12):
authorized personnel. Retail and e-commerce companies use it to secure online payments and point-of-sale transactions. Facial recognition is used to personalize customer experiences and streamline checkouts. In education, schools use biometrics for student identification, secure access to facilities and security during
(02:34):
exams. Government and civil services have biometric national ID programs to provide unique identification for citizens. Biometric info is used to verify identities for the proper distribution of welfare benefits. When traveling by air, biometrics streamline passenger check-in, boarding and security processes, enhancing efficiency
(02:57):
and security. Some hotels use biometrics for guest check-in, room access and personalized services. So it's no wonder more and more states are following Illinois in enacting biometric privacy laws. I just finished hosting a webinar on this subject with two presenters who talked about the litigation that follows biometric privacy law violations. They spoke extensively about the state of biometric privacy
(03:22):
litigation, the regulatory landscape and the insurance coverage considerations and rulings. They're both shareholders with the Anderson Kill law firm, and they both earned their law degrees from Fordham University School of Law. John M. Leonard is co-chair of the firm's biometric liability group. He's recovered millions of dollars for policyholders in a
(03:44):
full spectrum of insurance coverage matters, including disputes over business interruption, D&O and E&O, defense and indemnity, general liability losses and environmental liability. Cort T. Malone is chair of the firm's Biometric Liability Insurance Recovery Group. He's an experienced litigator focusing on insurance coverage litigation and dispute resolution, with an emphasis on
(04:07):
commercial general liability insurance and cyber insurance, employment practices insurance, advertising injury, D&O, E&O and property insurance issues. He's also a member of the firm's restaurant, retail and hospitality, environmental law, cyber insurance recovery and COVID task force groups. He's in a lot of groups and, because no one has probably ever
(04:30):
mentioned it, you know, his name is Cort, so that never comes up.
So, following the webinar, which is coming soon to the West Legal Ed Center (you can take it there for CLE), Cort and John stuck around to answer some of my questions, and we started off with their thoughts on a couple of recent
(04:51):
cases that I thought were illustrative, including the use of biometric info that enabled one teenager to gain access to a wildly popular theme park and, in another instance, to track employees at a popular burger chain. And more, because there's always more. And now here's my conversation with John Leonard and Cort Malone of Anderson Kill. I hope you enjoy it.
(05:12):
I thought Six Flags was sort of a good example of just some regular thing that people do that could lead to liability for a company. But then I thought the White Castle and the Facebook cases were interesting just from the enormity of them. But can you just talk a little bit about what the Six Flags
Cort Malone (05:37):
Go ahead, John. It's your favorite. It is my favorite.
John Leonard (05:40):
I love the Rosenbach case. Maybe it's because I think of myself as a 15-year-old kid and what I would have been like. Yeah, why, who cares? Take my fingerprints and save me some money. And then, like, I think of what my mom's reaction would have been when I, like, went home, proud, like, look, I'm saving us all this money. I mean, I don't know if Rosenbach got smacked, but I would have gotten smacked by my mom. So yeah, he was.
(06:01):
He was like a 15-year-old kid, I think, at the time, Tom, and I guess it must have been towards the beginning of the summer, because it revolved around getting a seasonal pass to Six Flags. So he shows up at Six Flags, he goes to pay his admission, and they say, hey, listen, we can give you a seasonal pass, save
(06:23):
you some money, come as often as you want, and all we need is for you to give us your fingerprints, and we'll take your fingerprints, give you the pass and you're good to go. I mean, there's also the issue, outside of BIPA, whether a 15-year-old could consent to that anyway. But under BIPA, you know, they hit the trifecta of BIPA
(06:46):
violations. They didn't do anything right. They didn't inform him in writing that they were going to take it. They just told him. I mean, it was like they stole it from him. They told him they were going to take it and he said, okay. But as the court clarified, it's got to be in writing. You've got to provide to the person you're collecting it from
(07:07):
in writing the purpose and the length of time that you're going to maintain this information, how long you're going to keep it for, how you're going to destroy it, and you have to have them, in writing, give you their consent. And they didn't do any of those things, probably among a host of
(07:28):
other things that they got wrong in terms of BIPA, but those are the three key threshold requirements. They didn't do any of them. So I mean, I guess, in theory, he probably got his pass, he probably got a seasonal pass, so he was probably happy, sure. The interesting thing to me was, because it's right in the
(07:49):
statute that you've got these liquidated damages, minimum damages of $1,000 for a negligent and $5,000 for a reckless violation of BIPA, and Six Flags still argued, look, there's no damage here. Nothing happened, nothing's wrong, there was no breach.
(08:09):
And the court, rightfully I think, clarified, well, there doesn't have to be. The statute's pretty clear that, one way or the other, you're getting $1,000 or $5,000 at a minimum. And God help you if there then is an actual breach and there are actual damages. You know, that's the baseline, and it's only going up from
(08:31):
there, from the $1,000 or the $5,000.
Tom Hagy (08:33):
Yeah. Well, a couple thousand dollars, that gets you a lot of passes to teen paradise. I mean, I think, you know, if you said, we need something from you to let you in here, I would have thought, Six Flags? Like, yeah, sure, I'll give you a kidney. Do I need them? How many do I have?
Cort Malone (08:50):
I'm in. We go the other way, Tom, when we talk about the White Castle case and their effort to throw themselves on the mercy of the court, when they said this would cost us $17 billion. You know, the running joke is, that's an awful lot of sliders.
Tom Hagy (09:05):
And they are delicious. Yeah. So what was that case about? How did they arrive at $17 billion?
Cort Malone (09:15):
And I think John went through this a little bit in our presentation, but the idea being, this tracks back over a lengthy time period. So you're White Castle. You've got, you know, several thousand stores throughout the country. You might have a million employees.
(09:37):
I mean, you've got, you know, tens of thousands of employees, and each and every day, each and every employee comes to the store and clocks in with a fingerprint and clocks out with a fingerprint. This is the multiplier of those thousand-dollar damages: every time they scan, every day, every employee, at every store. My goodness, the math gets up into the billions. It legitimately does.
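To give a sense of the multiplier Cort is describing, here's a minimal back-of-the-envelope sketch. The head count, workdays and per-day scan figures are hypothetical, not White Castle's actual numbers; only the $1,000 negligent-violation minimum comes from the statute.

```python
# Rough sketch of the per-scan damages theory discussed here
# (each clock-in and clock-out treated as a separate BIPA violation).
# All operational figures are hypothetical.

def bipa_exposure(employees: int, workdays: int, scans_per_day: int,
                  damages_per_violation: int) -> int:
    """Total statutory damages if every scan is a separate violation."""
    return employees * workdays * scans_per_day * damages_per_violation

# e.g. 9,500 employees, five years of ~250 workdays each, clocking in
# and out once a day, at the $1,000 minimum for a negligent violation:
total = bipa_exposure(9_500, 5 * 250, 2, 1_000)
print(f"${total:,}")  # $23,750,000,000
```

Even modest assumptions land in the tens of billions, which is how a fast-food chain's timekeeping system produced a headline $17 billion exposure figure.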
And you know what's funny? The real takeaway from that was, as John pointed out, the court, looking at the statute, said, hey, the statute is very clear, that's what it applies to. If the legislature did a crummy job writing the statute and wants to fix that, that's the legislature's job. There's a similar New York law, hopefully to be
(11:02):
passed sometime soon, that basically is a word-for-word version of Illinois's BIPA. So New York is a state that has seen the litigation that has arisen from BIPA, the level of liability put on companies, and even they haven't gone in and changed the wording of the law that's pending and potentially will be on the books in New York soon as well.
Tom Hagy (11:18):
Yeah, I just... Go ahead, John.
Were you going to say something?
John Leonard (11:22):
My first thought when I read the White Castle case, I for some reason figured White Castle must be public, maybe not standalone, but under, like, PepsiCo or Yum Foods or something. And to admit that you're facing a $17 billion loss in court, like, my God, what happens to your D&O too? But it's not.
(11:43):
It's a private company, which I mean doesn't--
Cort Malone (11:46):
Yeah.
John Leonard (11:47):
--get any better, I guess, for you, but at least you're not talking about tanking a stock.
Tom Hagy (11:52):
Yeah.
John Leonard (11:53):
Publicly traded stock.
Tom Hagy (11:54):
I was just wondering why security is so tight at White Castle, but then I figure, it is a castle. It is a castle, yeah.
Cort Malone (12:03):
Well, and again, they weren't even thinking of it from a security standpoint, Tom. That's just, they needed to know how much to pay the 15-year-old kid, and it's more reliable. That's the technological issue, right? You know, a time card, whatever, gets lost, or the kid can fake it.
(12:23):
His fingerprint, you know, shows he worked six and a half hours or eight and a half hours or whatever, and it's just interesting, you know.
Going forward, companies are aware of BIPA. Now, if you're using a fingerprint clock-in, clock-out system, you've had somebody sign an employment agreement on day
(12:43):
one that says, we are collecting your fingerprint to check how often you're working. We are collecting your fingerprint, we will store it until the week after you get fired or decide to move on, and we need your informed consent. Do you sign that?
So, interestingly, this litigation over BIPA claims might have a shelf life if companies all get smart enough
(13:07):
and do things the right way. That being said, that's like saying, hey, we thought asbestos claims were going to disappear in the 1990s, and, you know, here we are, 25 years later, still dealing with these. Things tend to have long tails.
Tom Hagy (13:22):
Unfortunately. Yeah, yeah, don't get me started on the asbestos thing. I mean, I get it, you know, I get it. But my God, some company buys a company that bought a company that made gaskets. Yes, yeah, it's a mom-and-pop shop.
Cort Malone (13:38):
I don't know. Well, when I started at Anderson Kill, Tom, the two biggest types of cases that I worked on were insurance recovery for asbestos claims and insurance recovery for historic environmental liabilities. Sure enough, 30 years later,
(14:04):
I'm doing a lot of cool cutting-edge things with respect to biometrics and AI and things that come up, COVID coverage, but sure enough, I've still got some asbestos and environmental claims I'm handling as well.
Tom Hagy (14:10):
Cort, those were my life from, like, the '80s to almost 2000. That's all I did was environmental and asbestos coverage. I mean, that was a big product at Mealey's, and day in and day out, I took great pride in being able to call people like Gene Anderson and tell them that they won or lost something.
(14:32):
But that was before everything was online, you know. So I was tied into every coverage policyholder lawyer, coverage lawyer. It was oddly fun. I mean, if you told me when I was a teenager, trying to get into Six Flags or whatever, that I'd be writing about insurance and having fun, I would not have believed you.
(14:53):
And then, we don't have to talk about the Facebook one. But, needless to say, Facebook is, gosh, facing, among many things, a lot of privacy-related, biometric-related things. And we talk about, you know, teenagers just giving things away. I mean, adults.
(15:15):
How many things have we just signed off on when we sign up for something, even on Google?
John Leonard (15:21):
Or Ancestry. Just because I said I would have done it as a 15-year-old doesn't mean I wouldn't still do it right now. My season pass discount to Six Flags.
Tom Hagy (15:30):
Yeah.
Cort Malone (15:32):
Well, that's what the technology brings, right, Tom? It brings this simplicity, you know. You know, the Amazon with the drone deliveries: I punch a button and I've got a product on my doorstep four hours later. Everyone loves that, but you've got to accept the fact that that drone is also taking a picture of your house.
(15:53):
It's got you opening the front door in your underwear to pick that package up. And those things we're signing away now, it's no longer just, you know, the right to sue somebody in court versus in arbitration. Now it's literally the rights for someone to take a picture of our face and use it, you know, however they want to use it.
(16:14):
And I'll tell you the best analogy of this, Tom. I was giving a similar presentation at a RIMS, you know, risk management, conference a year or so ago, and the woman who was covering it for their kind of in-house publication wrote an article. And the headline of the article, I'm going to butcher it a little bit, but it was basically, like, what do you do when a criminal
(16:36):
steals your face?
And, you know, as John said, the concept of, hey, worst-case scenario, you lose a driver's license or a credit card or, God forbid, a Social Security number. It might be a pain in the butt, but you can get a new one and get it replaced. If somebody's got your biometric info and somebody's using your face, you know, I mean, listen, deepfakes, whatever it
(16:59):
may be, if they can access your, you know, security features because of that, we're talking a whole nother world of criminal enterprise.
Tom Hagy (17:10):
You know, I had a college friend. She ended up being an anchor on NBC in the morning in the '80s. She was pretty popular. But she put something on Facebook about how, as we're getting older, she got up in the morning and said, my face looks so different, my phone didn't even recognize me. I don't have that problem.
(17:31):
I'm consistently ugly. For the coverage disputes, briefly, I know, because I'm going to steer people obviously to the webinar and to your articles and things that you've written about this, but can you briefly discuss, what are the arguments for coverage, and then some of the things against, like the exclusions and things like that? So what do you,
(17:51):
What can you tell me there?
John Leonard (17:53):
I'll tell you, just in the first instance, the arguments for coverage, I think, Tom, are what we were talking about before. When you're under these liability policies, which generally have these very broad coverage grants, the
(18:13):
Krishna Schaumburg case, which was the first real GL case, under Coverage B, the advertising and personal injury coverage. You're talking about a D&O that probably has a very broad coverage grant, things like that, to get in the door and to show, at least in the first instance, we as the policyholder are entitled, we've got a covered loss here that falls
It's not that it's easy to do,but it's certainly easier to do
than it is to then come back andsay wait a minute.
Well, no, this is excludedbecause it is the policyholder's
initial burden to get in thedoor and say this policy covers
(18:55):
this loss, but then it switches, to say that it's not covered. And the way that courts generally interpret exclusions, I always hesitate to say every court, but the vast majority of courts interpret exclusions strictly against the insurance company, because it makes sense: the insurance company drafts these policies. The policyholder has no say, really, particularly when
talking about like a GL policy.
The policy holder has no say inhow those policies are written.
The court's saying it's goingto be you, the insurance company
, now have to prove that thisexclusion applies.
And if you look back at those,some of those I go back to
Thermoflex because it's the onethat hits all three of the
exclusions that insurancecompanies have relied on for a
(19:38):
long time now, and on every one of them it said, this is ambiguous as to BIPA, for reasons X, Y and Z, as they applied to those three exclusions. And the theme of that is that we, the court, are not going to rewrite your policy for you, insurance company,
(20:01):
particularly not in a way that's going to restrict coverage. We're not going to let you use these exclusions in a way that's not plain on their face, not to restrict coverage for your policyholder. And if they're ambiguous, which the court found that they were in the Thermoflex case, then automatically we're going
(20:21):
to go back to the policyholder and say, because this is ambiguous, we have to side in favor of the policyholder here. It's the tie-goes-to-the-runner theory, where the policyholder is the runner.
Tom Hagy (20:32):
Yeah, ambiguity is the policyholder's friend. I think that was my takeaway from all those years of writing about pollution exclusions. Thanks. All right, so there are arguments against. I mean, I can go back through some of the exclusions and sort of repeat some of what you said, but there were various
(20:54):
exclusions that they had, that they have.
And then the one that was interesting to me, and you mentioned it, I think it was with regard to White Castle, or maybe it was Six Flags, doesn't matter, one of those, where it was like, what's the harm? Someone's got my face, someone's got my... And the point you were making was, there doesn't have to be harm. The
(21:15):
statute says no, it's illegal or whatever.
Cort Malone (21:18):
Yeah, no, that's right, Tom. Under the way BIPA is drafted, and most of these privacy law statutes are drafted, it's the equivalent of strict liability, and the only differentiation is, did you use somebody's information and collect somebody's information in a, quote-unquote, negligent manner, or did you use it in a, you know, reckless and
(21:41):
intentional manner?
And the difference there is just the quantum of damages. So, you know, and listen, fingerprinting employees and storing that information just so you can know how much to pay them, I mean, it's a victimless crime. It was going on for 50 years before there was a statute
(22:04):
protecting it, and you've got to assume most of these businesses and companies were doing nothing nefarious with that fingerprint information. They probably didn't even care that they had it, other than that it tracked employees coming and going. But lo and behold, BIPA's on the books.
(22:25):
Six Flags case gets decided that says people who are aggrieved do not need to show they've suffered any harm. And now we're seeing multimillion-dollar settlements right and left. And, interestingly, John and I had a client case we worked on, a national hotel chain, and they were using the fingerprinting for all the staff: the bellhops, the cleaning
(22:45):
service people, the guys who work the front desk. And so, you know, the class was large. The amount of potential damages after the White Castle and Tims cases meant you've got five years' worth of daily fingerprinting.
And what was the first thing that this company did when they got sued? They switched back to an old-school, you know, punch-card
(23:09):
system.
They said, listen, we're no dummies, we're not going to keep doing this system, because we read the statute and it says we're not allowed to do this. Now, likely, they will go back to a more technologically advanced system, but they will make sure they're meeting all the various requirements.
But this was a company that had to pay out multiple millions of dollars
(23:31):
in damages, and again, they didn't do anything wrong. Their employees didn't even really think they did anything wrong. But you get a few, you know, enterprising plaintiffs' bar attorneys out there who say, look, it's not our fault the legislature wrote this broad statute. We're going to go collect some people, we're going to put some money in their pockets, and, oh, by the way, we're going to take
(23:52):
30 or 40% of it off the top as our legal fees as well.
Tom Hagy (23:58):
Man, it just does remind me that there's a lot you could do with somebody's personal data, obviously. And then if there's a data breach... I guess I did another podcast based on an article on medical monitoring. So I wonder if there's not some, I suppose there is, there's probably some privacy monitoring or something to see whether
(24:18):
your stuff has been compromised, because I get alerts sometimes, you know, this has been your password, or something was compromised by a breach. I don't know if that comes up much in cases, but it just reminds me that, yeah, in and of itself, someone's got my fingerprint, big deal. But, you know, if I'm getting into government buildings with
(24:39):
my fingerprint or something, somebody could do a lot of damage with that stuff.
John Leonard (24:43):
Yeah, Cort said right at the top of the presentation that there's so much more risk with your biometrics being out there than anything else that you can give, and that's what they recognized in the statute. The harm is the violation of the statute. That's the harm.
Tom Hagy (25:01):
And you did hit on one thing, Cort, maybe more gleefully than is appropriate. But when you talked about it, because I was thinking, you know who's got a lot of private data? Insurance companies. And private data and biometric, my God, they've got everything on me. I mean, all your health data and everything. And I
(25:22):
did a quick scan while I was listening to you, and I was just finding all these cases where insurance companies, a life insurance company, health insurance companies, are getting sued, things like that. So you smiled a little too much when you were talking about that. But I just thought, I'm sure there are other cases where insurance companies are getting
(25:43):
hit for things that they're also insuring, like fire or whatever, but nothing like, I don't know, nothing like these computer crimes. Because going after a law firm is going to get you access to a lot of clients'
(26:04):
personal, you know, not just legal but financial, information.
Cort Malone (26:07):
Going after an insurance company is a treasure trove of, you know, people's personal information, both health, finances and just about everything. So we've seen that same thing. We've seen a lot of insurance companies be the target of cybercrime. We've seen a lot of law firms be subject to that.
(26:28):
So I try to not take too much joy in the insurance companies, because someday the shoe could be on the other foot.
Tom Hagy (26:35):
Yeah, yeah, they're
both sweet targets.
John Leonard (26:38):
I thought you were actually kind of understated in your glee. Or, you know, offline, maybe that would have been it.
Tom Hagy (26:45):
Maybe I was projecting. I know where he comes from. I don't mean Paramus. That's really all the stuff I wanted, unless, John, you wanted to talk about changing your Social Security number, because it just seemed like...
John Leonard (27:01):
If you want to hear it, I'll tell you. Maybe not on the podcast, though. Okay.
Tom Hagy (27:04):
Well, no, maybe another time. I thought about doing another podcast on that, because for a while I was doing a little project where I was collecting stories from attorneys on stupid things they did when they were young attorneys. And maybe you guys, maybe you're different, but most attorneys love telling these stories.
John Leonard (27:25):
I have so many of them. I won't speak for Cort, but I know I've got some good ones. I think you probably do too, Cort.
Cort Malone (27:32):
If it's anonymous, I'm happy to share mine. Hey, Tom, one last point. The interesting thing about the way the BIPA claims and the insurance coverage fights over those claims have played out over the last five years: it's absolutely a precursor to what we're going to see with AI-related claims, and the insurance industry is actually already
(27:56):
sort of a half step ahead on AI-based claims because of how they've seen the biometric privacy law claims play out. So the downside for people like me and John is that the insurance companies having a head start is never a good thing for us or for the policyholders. But we've got our own playbook, you know, in advance already as
(28:19):
well, and we are already handling, you know, certain claims and pursuing coverage for AI-related liabilities that companies have been hit with. So all this tech stuff, as you know, the insurance world is always going to be involved when you've got new technology that's resulting in new liabilities and new claims.
(28:40):
So we're going to continue to, you know, fight the good fight and represent the policyholder side of those things. But it's very cool to be involved in these cutting-edge, you know, technology aspects of that world as well.
Tom Hagy (28:53):
Yeah, you've got an interesting practice. And yeah, thank you for mentioning AI. As you said on the webinar, you can't do a presentation in 2024 without mentioning it. Well, thank you very much.
Cort Malone (29:03):
Thank you, Tom,
really appreciate it.
Tom, thank you for having us.
Tom Hagy (29:11):
That concludes this episode of the Emerging Litigation Podcast, a co-production of HB Litigation, Critical Legal Content, vLex, Fastcase and our friends at Law Street Media.
I'm Tom Hagy, your host, which would explain why I'm talking. Please feel free to reach out to me if you have ideas for a future episode, and don't hesitate to share this with clients, colleagues, friends, animals you may have left at home, teenagers you've irresponsibly left unsupervised,
(29:34):
and certain classifications of fruits and vegetables. And if you feel so moved, please give us a rating.
Those always help.
Thank you for listening.