All Episodes

December 10, 2025 65 mins

Send us a text

Autonomous vehicles promised safer roads, fewer accidents, and a seamless future — but the real-world footage tells a very different story. In this episode, Ken Lucci and James Blain react to a curated set of shocking, hilarious, and sometimes disturbing AV clips, including Tesla Autopilot mishaps, snowy-road confusion, near-miss lane changes, and even a tragic Cruise vehicle incident.

From the “cardboard-box test” debate to the ethics of the trolley problem, they break down what the technology actually sees, how it makes decisions, and why operators shouldn’t assume AI thinks anything like a human. They also explore real implications for operators:

  • Will AVs really lower insurance premiums?
  • How close are we to driverless buses?
  • What happens to safety drivers and workforce transition?
  • Are regulators and engineers moving fast enough — or too fast?

It’s equal parts education and entertainment as Ken and James take on robo-taxis, Tesla logic, data collection, human instinct, AV ethics, and the future of ground transportation — one wild video at a time.

Whether you’re curious, skeptical, or cautiously optimistic about AV tech, this episode will have you laughing, thinking, and maybe gripping your steering wheel a little tighter.

At Driving Transactions, Ken Lucci and his team offer financial analysis and KPI reviews for specific purposes like improving profitability, enhancing the value of the enterprise, business planning, and buying and selling companies. So if you have any of those needs, please give us a call or check us out at www.drivingtransactions.com.

Pax Training is your all-in-one solution designed to elevate your team's skills, boost passenger satisfaction, and keep your business ahead of the curve. Learn more at www.paxtraining.com/gtp

Connect with Kenneth Lucci, Principal Analyst at Driving Transactions:
https://www.drivingtransactions.com/

Connect with James Blain, President at PAX Training:
https://paxtraining.com/


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
James Blain (00:00):
there's a whole empty left lane here you can see

(00:03):
at the river.
That guy's coming.
Get the hell out the way.
I don't cuss real often, but like in my head, all I'm hearing is that Ludacris song from when I was in high school.
Move, bitch, get out the way.
Get out the, get out the way.

(00:45):
Hello everybody, and welcome back to another exciting episode of the Ground Transportation Podcast.
Now you know me, James Blain from PAX Training.
You know my wonderful co-host Ken Lucci of Driving Transactions, right?
But if you missed the last time we had him on, you might not know John.
John is our wizard behind the scenes.
He is the producer.
He is back by popular demand.

(01:07):
So kind of like Top Gear of old.
You have those challenges the producer gives you and you film them.
If you don't know about cars, you won't get that reference.
Don't worry, you'll figure it out later.
But here's the fun thing.
So we are once again handing the reins over to John.
So John is gonna be our main pilot on this episode.
Ken and I are along for the ride.
And John, what are we talking about today?

(01:27):
I think it's, I think it's autonomous vehicles, right?

Ken Lucci (01:29):
Unaccustomed as I am to talking about autonomous vehicles.

James Blain (01:32):
yeah.
It's.

John Tyreman (01:34):
the way that this is gonna work, guys, is I'm gonna play a couple videos that are going to be around a certain theme, and then we're gonna have a conversation around that.
So

James Blain (01:43):
to determine the theme or will you be giving us
the theme?

John Tyreman (01:46):
I'll be giving you the theme.

Ken Lucci (01:48):
I love

James Blain (01:48):
much more fun if we had to figure it out on our own.
See if we even got remotely close to

Ken Lucci (01:52):
It's, it's Friday.
I am not that creative.
Oh, here we go.
Here we go.
I like this.
I like where this is

James Blain (01:57):
Okay.
Oh, geez.
Elon out the gate, out the hole.

John Tyreman (02:01):
Elon out of the gate.
So this one is a, um, a Tesla fully autonomous driving mode in the snow.
Let's check it out.
Seems to take, have taken that turn pretty well.

James Blain (02:25):
I'm watching that hand.
I wanna, oh, there it,

Ken Lucci (02:28):
Oh, there you go.
There you go.
There you go.
What, it, it wasn't, wait a minute.
It just, for those people who are not, who, who, who are not looking at this, we're looking at a Tesla that was going straight down the road in a snowstorm, but then magically it started going off to the left hand side on its own, and

James Blain (02:49):
and it did the turn fine, like it turned onto the road fine.
But you saw the road and you're like, ugh.

Ken Lucci (02:55):
What do you think happened there?

James Blain (02:57):
I mean, so a couple things.
So one, Teslas are heavy, right?
They weigh a ton.
So my

Ken Lucci (03:03):
of the, because of the battery.

James Blain (03:04):
yeah, you've got kinda that skateboard.
You got a big battery pack underneath.
So I gotta tell you, I mean, just me, I'd be a little nervous driving a Tesla in the snow anyway, but I think what happened is it lost a little bit of traction.
It starts sliding sideways.
And I, I gotta tell you, if you were watching the wheel, like John, I don't know if you can turn the audio off and pull it back a little bit.

(03:25):
Like if you're watching the wheel coming outta the turn, it's a little twitchy.
Like I, I was watching 'cause this, this guy's got his hand on his, his thigh here trying to like, I'm not gonna touch the wheel.
Um, and, and it looks a little twitchy when he's doing that.
I gotta tell you, at least in my opinion, I think it was already struggling.

(03:46):
I think this was just asking for trouble.

Ken Lucci (03:49):
So, so let's back up a step.
I mean, you know, heretofore Waymos have been in Phoenix; you know, besides dust storms, really easy, uh, dust storms, they're really easy from a weather perspective.
It's been in San Francisco.
Besides the fog in the, in the window a little bit, it's been in

James Blain (04:08):
which is not by accident, right.
They,

Ken Lucci (04:10):
right?
Do you think?

James Blain (04:11):
choosing places that don't have

Ken Lucci (04:13):
right.
And, and the latest is Austin in Texas.
So do you think that this is, this is what we're going to come to, that when these autonomous vehicles are in a snowy weather condition, they're just not gonna be on the road, or what?

James Blain (04:30):
So, all right, so let's, let's get a couple things out of the gate, because the first is humans learn in a linear fashion.
I'm a little bit smarter tomorrow than I was today.
I'm a little bit smarter the next day.
I'm a little bit smarter.
The

Ken Lucci (04:45):
I don't think that's true.
'cause I don't think I'm any smarter now at 60 than I was at 16.

James Blain (04:50):
by the way, we go downhill a lot more like AI goes up,

Ken Lucci (04:54):
there you go.
There you go.
Yep.
I think I peaked, I think I peaked when I was 40, so it's

James Blain (04:58):
there you go.
Yeah.
Yeah.
Someone peaked in high

Ken Lucci (05:01):
So, so tell, tell us, tell us what you mean by that.
So, humans are linear.
We, we learn from our experiences.

James Blain (05:08):
we are linear in the way that we progress, in the way that we learn.
And the other part of that is our expectations of AI and technology are generally linear.
We expect them to be a little bit better than they were.
The problem is with technology, and we've talked about this in the past with Moore's Law, they don't, 1, 2, 3 isn't how you

(05:28):
count with AI when you're thinking of computers.
It goes 1, 2, 4, 8, 16, 32, 64, 128, 512, 1024.
And then from there it starts just absolutely hockey-sticking.
The same thing is happening with the advancement of AI and self-driving.
Um, they've been doing this stuff all the way back to the

(05:50):
nineties when NASA had their
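The 1, 2, 4, 8 doubling James is describing, versus the 1, 2, 3 counting we instinctively expect, can be sketched in a few lines of Python. This is a toy illustration with made-up "generation" counts, not a measurement of any real system:

```python
# Toy comparison of linear improvement (how humans learn, in James's framing)
# versus exponential doubling (how compute and AI capability tend to scale).

def linear_progress(generations):
    """Each step adds a fixed increment: 1, 2, 3, ..."""
    return [g + 1 for g in range(generations)]

def exponential_progress(generations):
    """Each step doubles the previous one: 1, 2, 4, 8, ..."""
    return [2 ** g for g in range(generations)]

print(linear_progress(10))       # steady, one step at a time
print(exponential_progress(10))  # the "hockey stick" James mentions
```

After about ten doublings the exponential curve is two orders of magnitude ahead of the linear one, which is why expectations calibrated to human-style learning keep being surprised.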

Ken Lucci (05:51):
Oh yeah, absolutely.
So are you saying that the more and more that

James Blain (05:55):
It's gonna get better

Ken Lucci (05:57):
it's gonna get better?
So, all right, so I'm gonna play devil's advocate.
Why didn't they start the damn things in Anchorage, Alaska, to prove your point?

James Blain (06:04):
It is the

Ken Lucci (06:05):
Because doesn't it, doesn't it snow up there, like, 340 days a year?

James Blain (06:10):
So, so on behalf of my friends, Charlie and Athena, I'm gonna say it's beautiful in the spring.
Come visit, right?
Go to Anchorage.
It's incredible.
You don't start by having a kid dead sprint, because at the end of the day, there are human engineers that are trying to do this.
And as people, we can't just start with an infinite amount of

(06:31):
money, an infinite amount of research, an infinite amount of everything.
And the other thing is, believe it or not, they have already, the Finns and the Swedes are already driving these things on frozen lakes.

Ken Lucci (06:44):
very good point.

James Blain (06:45):
The thing to, the thing to keep in mind is we can't beta test it with people on frozen lakes.
So what we're seeing with these rollouts in these highly controlled environments is you are seeing the public face of it: hey, we're ready for you to test this.
We're ready for you to drive this.
Versus, you know, going to Finland, going to Sweden, going to, you know, the northern parts of Russia to

Ken Lucci (07:06):
well, wait a minute, to follow your logic, then they had to do Phoenix, San Francisco, LA to get the other parts of topography down.
Driving, driving, driving on real world stuff.
And now they're gonna progress and they're gonna go into heavy rain areas.

(07:26):
Oh, wait till these, wait till these things get down into Miami and they get into a deluge at three o'clock in the afternoon.
There's gonna be 150 of these Teslas parked under, in the, on the highway, underneath the bridge.
Waiting for the rain.
Waiting for the rain.

James Blain (07:40):
I, I mean, we, we all know about the parking lot incident, right?
So I was like, where's that coming from?
I looked down and I was like, I think it's coming from the Waymo cars.
Yeah.
Kind of bizarre, right?
Neighbors in San Francisco dealing with some sleepless nights, as they say, driverless cars in a parking lot nearby.
Confused, they start honking at each other.

(08:01):
Maybe they're just talking.
And in a statement, Waymo says it's heard complaints.
They're working on a fix.
That is your rush,
So, I mean, these things are gonna happen, but would we rather that happen, like, I don't know, in the middle of New York in the winter?
Or do you want that to happen, you know, in Arizona, in the middle of spring, where, you know, if there's rain or anything, it just pulls over and

Ken Lucci (08:23):
Well, and to your point, you had said on a, on an episode a while ago that the Tesla is constantly learning.
So that episode where it had that close call now will be used for learning.
It will have learned that, and it will be taking that into account in the future.

James Blain (08:41):
Yeah, ideally.
Now I will, I will also, also preface with: not every incident gets looked at.
Not every incident gets reported, and if the Tesla doesn't know it's screwed up, it might not be in a training set.
So keep in mind, a lot of this is, it, it's kinda like software, you know, if you are using software and you're having problems and you just get frustrated and don't tell

(09:03):
anybody

Ken Lucci (09:03):
You gotta report it in.
Yeah.
All right.
Let's go to the next one.

James Blain (09:08):
Oh, it's Tesla again.
John's really beating on Elon today,

John Tyreman (09:11):
Yeah, we're gonna beat on Elon a little

Ken Lucci (09:12):
Are we gonna get a cease and desist order

James Blain (09:15):
by the way, Elon, if you wanna be a guest and explain this, we would love to have you on.

Ken Lucci (09:20):
You know.
Yeah.
You know what, and it's kind.
Yeah.

James Blain (09:22):
You, to us, are like the mad scientist.
Like, I, I like to, I like to give Elon crap, but for God's sake, he was like, I want a rocket.
I'm gonna build a rocket.
So we'd love to have him as a guest,

Ken Lucci (09:31):
And you know, you go on Joe Ro, you go on Joe Rogan.
I mean, are we not as good as Joe Rogan?
Is that what you're saying, Elon?

James Blain (09:38):
are you, are you saying that we gotta smoke pot if Elon comes on?
Because if that's what it takes to get Elon on, we will film an episode in Denver or California if we

Ken Lucci (09:46):
Well, that's all you, that's all on you.
I,

James Blain (09:49):
Now, now that said, I cannot partake because I have a CDL and I'm not willing to give that up, but

Ken Lucci (09:54):
John will smoke with Elon then.
All right.
Go ahead.

James Blain (09:57):
We're gonna get John high with Elon.

John Tyreman (09:59):
and I can toke up together.
All right, so, uh, so I think, um, I, I wanna switch things up, and I wanna preface this video with the big question that, James, I, I think, let's watch this video and I'd love for you to react and answer.
So, James, how do you train someone, a driver, uh, an AV attendant, if you will, to stay alert when

(10:21):
their main job is to watch a machine that rarely makes mistakes until it does.

James Blain (10:27):
He's given me an impossible question and now he wants me to watch a video.
This is gonna be great, guys.
Digging the techno music.
We've got the rainbow just driving down the highway.
Oh, snap.
Oh, we got him.
I love that.
All right, so we've got the Tesla driving down the highway, and that looks like a Toyota.
I don't know if that's a BRZ or what it is.

(10:48):
It's a, looks like a white Toyota crossover.
And the left hand lane, we're in the far right lane, and it looks like it's coming over.
This guy's in full self-driving.
You got the, the little rainbow road out in front of it.
You can hear the techno music running in the background.
Um, this guy clearly does not have his hands on the wheel.
I think he's holding the phone.
I think that's why it takes him a second to get his hand back on

(11:08):
the wheel.
Play it again real quick, 'cause this one went really fast.
I wanna make sure I have it before I... All right.
So, okay.
All right.

Ken Lucci (11:23):
Describe it again: now it's driving down the road, and then you see this white SUV veering over into the lane in front of the Tesla.
Why didn't the Tesla see it, James?

James Blain (11:35):
so, so two things.
This looks like almost a self-inflicted pit maneuver on behalf of the guy coming over, right?
So probably the front left, um, kind of fender area is probably right at the rear tires, at the door.
And the other thing is, this guy's use of a turn signal.
It's kinda like, um, I don't know if you guys have seen the

(11:56):
meme where it's like, I've gotta cross the left lane.
I've got across eight lanes.
I'm gonna turn the turn signal on.
Good luck, everybody.
And they just kinda, like, YOLO it over.
I feel like the Toyota put the turn signal on and then was like, all right, turn signal's on, wham.
Um, two things.
I think one, the thing that we forget is that even though this

(12:18):
thing can see ahead, it, it's not going to process intention the way we do.
So one of the cool things, and I believe he was actually a KU professor, so a professor came out to Orlando, and, and I'm on the executive committee for the Bus Industry Safety Council.
And one of the sessions that we did in Orlando that I absolutely loved was they had a professor come talk about, at a collegiate

(12:41):
level drive, like basically distracted driving.
And one of the things he did is he put up a picture and he is like, here's a highway.
And he goes, all right, I'm gonna have you guys look at it for a second.
Did you look at it for a second?
He goes, I'm gonna put up the next picture and I want you to look at it, and I want you to tell me what's wrong.
And he puts up the next picture.
And it's, it's basically like, find the differences.

(13:02):
Nobody, nobody in the freaking room got it.
Why?
Because we're trained to look at the road.
We're looking at the cars.
We're looking at the signs.
We're looking at the highway.
We're looking at everything we know that is important.
Then he goes, all right, I'm gonna circle it.
It was like a building off to the side of the highway.
Why?
Nobody cares.
It doesn't matter to us.
The building's not gonna cause the accident.

(13:24):
So I think what we're seeing here is it does not see the intention.
But the other thing is, if you've ever ridden in a Tesla, you will see that there's a little bit of reaction time.
The Tesla isn't as quick.
I, I saw the same thing with the Zoox.
If you go watch the Zoox video, when I'm in Vegas and I'm riding

(13:46):
around at the zoo, you'll seethe exact same thing.
It's got like 40 or 50 differentoptions.
And it's not like us where itgoes, oh crap.
And it commits, right?
A lot of times it's having tomake decisions and it did react,
but it didn't react fast enough,it didn't catch the intention.
And frankly, it's not gonna havethe same level of anticipation

(14:08):
that we do for years, because it's gonna have to get to the point where processing is gonna have to get faster, and anticipation and training data's gotta get better.

Ken Lucci (14:19):
wow.

John Tyreman (14:19):
Um,

Ken Lucci (14:20):
So what you're saying to me is every real life scenario that takes place has to be entered into its database or its memory bank.
It needs to look at how it reacted and then adjust itself accordingly.
It's gotta, it's gotta repeat that.

James Blain (14:41):
Basically. So, so in our world, you know, I coach hockey with my son.
So I think of it as with my son.
I can have him shoot from the middle of the ice, and then I can have him shoot from, you know, the blue line, and then I can have him come up a little closer, and he's gonna be okay shooting in between.
Um, the same thing happens with this, except the problem is,

(15:01):
unlike a kid that can kind of infer it and turn him loose, you've gotta have these massive sets of data for it to go through and get through and figure out.
And the other thing is, humans do really good in completely one-off, ridiculous situations, right?
If I throw you into a weird, ridiculous situation, we do

(15:22):
really well at improvisation.
That's one of our gifts.
If you've ever seen me give a talk, it's improvisation,

Ken Lucci (15:28):
Some, some of us do.

James Blain (15:30):
But it depends, because even if you suck at improvisation, if I put you in a life or death situation, your

Ken Lucci (15:37):
it's fight or

James Blain (15:37):
your everything, you're gonna react.
And, sometimes we screw it up.
But the reason that we've done so great as a species is we tend to get it right.
The problem with the AI is the AI has no adrenaline; it has no fight or flight.
It's just processing.
And so it's one of the interesting quirks that you see

(15:59):
in these scenarios.
Now, the other side is to actually answer John's, frankly, crappy question, 'cause I can't answer it.
Right.

Ken Lucci (16:06):
don't blame John because you can't answer the question.

James Blain (16:09):
I, I,

John Tyreman (16:11):
about the role of training

James Blain (16:13):
All right,

John Tyreman (16:13):
and how it's

James Blain (16:14):
so here's the role of training.
The role of training is that the safety driver's time is limited.
It, the whole point of having self-driving is not to need the safety driver, and the idea that we're gonna have someone that sits there just staring at the road, waiting for his moment to do something, frankly, it, it, it makes no sense, because that

(16:35):
defeats the whole purpose.
That's like saying, hey, I'm gonna, I'm gonna find the ability to fly, but I'm only ever gonna fly two feet off the ground.
So if something happens, I can jump outta the plane.
Like, it only makes sense in this teeter-totter phase

John Tyreman (16:49):
I disagree.

Ken Lucci (16:50):
yeah,

James Blain (16:51):
here.
Uh, I've, I've started a great debate.
Let's go, boys.

Ken Lucci (16:54):
no.
So, so one of the things that, that occurred at the Chauffeur Driven and NLA show is I had a great dinner with the, with the guys from Windels Marx, and, uh, while, uh, while Matt, Matt Daus was not there, Pat Russo was there, and we had a very spirited discussion about Waymos in New York.

(17:14):
And the thought process of, of New York to me was, in reading all of the analyst reports and reading through the Waymo announcements and, and the Zoox announcements, is that those vehicles are gonna be in New York within three years.
They had a,

James Blain (17:32):
and they're required to have a safety driver.

Ken Lucci (17:34):
but they had a different take.
And their take was, because there's going to be some serious blowback on labor, where the, the New York as a, a city is worried about displacing so many drivers.
And I di, I, I thought there was validity to that, that it's going to extend it.

(17:55):
Now let's not debate that,

James Blain (17:57):
Well, there's no debate there.
You're absolutely right.

Ken Lucci (18:00):
okay, but what if, what if the plan in New York City was: look, we know you want to launch these AVs, but I think we'll all agree that New York City is extraordinarily challenging.
Why don't we go into a period of five years where you have to have a body in there?

(18:22):
So that we can make sure that, that we have that human reaction to the, com, you know, to the darting out, the way you just saw it.
I mean, that to me seems to be the best of both worlds, because at that point, the vehicle could learn from what the human being does.
Now I know that if Elon was on here, God willing, we would love

(18:43):
to have him on, but he would give me the 15 scientific reasons why that makes no sense.
But what we just

James Blain (18:50):
No, you gotta split the middle, Ken.
You gotta split the middle.
Alright, so here's the thing.
So IATR, not this year, but the year before, one of the things that got brought up is

Ken Lucci (19:02):
tell the audience who IATR is.

James Blain (19:04):
So that's the International Association of Transportation Regulators.
That's when all the regulators that regulate transportation come together, and the whole focus that year was on autonomous vehicles and electric vehicles, and everybody was super excited.
This is gonna be awesome.
And New York, the, the, one of the representatives at New York was like, oh God.
Oh God.

(19:24):
And everybody's like, what's wrong, New York?
Like, it's almost like those, those cartoon memes you see, like, all right, New York, what's wrong?
Uh, well, we've got all of these drivers, and their pension and their retirement relies on, and I'm not gonna call it a Ponzi scheme, it's not a Ponzi scheme, it's a retirement fund, but it relies on new people coming in behind and paying into it.

(19:45):
So you have to have new drivers paying into the retirement fund to keep it going for the drivers.
No, no, no.
The, the Black Car Fund, uh, is something different.
Um, but anyways, the concern there was, at the point where we stop having new drivers come in and we displace all of these workers immediately, what happens?
Now, a couple things to note here.
One.

(20:06):
Safety drivers make sense in the beginning, to, to not give John crap and, and give you a direct answer.
The way that you get them to do the job effectively is you don't let them take their phone.
You don't let them have distractions, and you make the only form of entertainment or input or anything they have actually driving the vehicle.
To the point where if

(20:29):
you really wanted an effective safety driver, you would literally put a black partition and lock him in the driver's seat.
And his only option for any form of input or stimulus is to watch the damn road.

Ken Lucci (20:43):
And, and would you agree?
Would you agree that that would train the AV?

James Blain (21:28):
Uh, yes and no.
The point, the job of the, all right, so if we're looking to train the AV, we don't actually use an AV.
If I wanna train an AV, I take a, to think about it for a second,

Ken Lucci (21:40):
where, where's he going, John?

James Blain (21:42):
the AV doesn't learn the same way we do.
If

Ken Lucci (21:44):
What rabbit hole is he going

James Blain (21:46):
If you, wait, so if you're looking purely to train an AV, what you do is you have the AV vehicle and you put it in manual mode and you let the driver drive it, and then afterwards you have that driving graded by someone else, and you tell the AV: this is what the driver did.
Great.
We want you to do this.
This is what the driver did wrong.

(22:07):
We don't want you to do this.
To the point that Uber is already doing that.
So Uber recently announced that they are looking at employing drivers to train the AV to basically do that.
Because here's the thing, if you're trying to do that as a safety driver, the only thing you can do as a safety driver is either intervene when it screws up or tell it what it got wrong,

(22:30):
which is the true function of a safety driver.
That's why I said you almost have to put 'em in a completely partitioned box, because I don't want 'em distracted by the radio.
I don't want him distracted by the phone.
I don't want him distracted by the passenger.
Your job is, you are the "oh God," right?
When, when it's an "oh God" moment, you should take over and

(22:50):
be there.
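The workflow James sketches — a human drives the AV in manual mode, someone else grades the driving afterwards, and the AV is told which maneuvers to imitate and which to avoid — can be illustrated with a toy data-labeling step. The class, field names, and threshold below are illustrative assumptions, not any vendor's actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class DrivingSegment:
    """Hypothetical record of one human-driven maneuver captured in manual mode."""
    description: str
    reviewer_score: float  # 0.0 (bad) to 1.0 (good), assigned by a human grader afterwards

def build_training_sets(segments, approve_threshold=0.8):
    """Split graded maneuvers into 'do this' and 'don't do this' example pools."""
    do_this = [s for s in segments if s.reviewer_score >= approve_threshold]
    dont_do_this = [s for s in segments if s.reviewer_score < approve_threshold]
    return do_this, dont_do_this

segments = [
    DrivingSegment("smooth lane change with signal", 0.95),
    DrivingSegment("late hard braking at crosswalk", 0.40),
]
good, bad = build_training_sets(segments)
print([s.description for s in good])  # examples the model should imitate
print([s.description for s in bad])   # examples the model should avoid
```

The point of the sketch is the division of labor James describes: the human demonstrates, a separate grader labels, and only then does the machine learn — which is exactly what a safety driver, busy supervising in real time, cannot do.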

John Tyreman (22:50):
But that's just one job.
And, and I think that that was kind of my point, is we're, you're, you're going down, like, safety driver,

James Blain (22:58):
the safety driver.

John Tyreman (22:59):
but if we are looking at this through a black car operator's lens, right?
It's much more than just a driver.
This is

James Blain (23:05):
are two different jobs.

John Tyreman (23:08):
well, right.
And I guess maybe that was my mistake in not clarifying that, but that was kind of where I,

James Blain (23:14):
we're, we're talking computers.
It's one or zero; you don't get to try again.

Ken Lucci (23:17):
No, it, it wasn't a shitty question just 'cause he couldn't answer it,

James Blain (23:21):
I did answer.
I said it was a shitty question, and then I answered it anyway.

Ken Lucci (23:25):
you know, when you come off the road, you are sassy.
Okay.
You are sassy.
John and I have been doing this without you for the,

James Blain (23:32):
this, is why, this is why I don't do it while I'm on the road.
Y'all wouldn't be able to handle me when I'm traveling.

Ken Lucci (23:37):
we've been doing this alone for the past 30 days while you've been out there screwing around at all these association meetings.

James Blain (23:43):
By the way, I want, I want all the associations to know that our sponsorships and ad dollars are considered to be screwing around by, no, I'm just kidding.

Ken Lucci (23:50):
So, so no, what you just hit upon probably, and,

James Blain (23:53):
but it, you, you can't do both effectively.

Ken Lucci (23:56):
but, but to, but to, you know,

James Blain (23:58):
do both effectively.

Ken Lucci (23:59):
my realization, because you know how I've been with AVs, and because I've looked at the six cities in China that have 30,000 of these beasts on the road already.
You know what Pat, and I forget the other gentleman's name from Windels Marx, pointed out to me is that China's entire

(24:21):
business climate and relationship to government is completely different than ours.

James Blain (24:26):
Oh, yeah.

Ken Lucci (24:27):
completely different than ours.
So when he said to me,

James Blain (24:31):
By the way, we're not saying anything bad about China.
Please,

Ken Lucci (24:33):
Not at all.
No.
Well, geez, we don't want any cease and desist coming from, from them either.
Um, what, what the, and the gentleman from Windels Marx also used to manage the New York City and their entire fleet for many, many, many years.
So what, what he said to me was the human element of it, which is New York City will give massive pushback about all of

(24:55):
these drivers losing their jobs.
So what you are just saying to me seems to me to be a good solution.
Where,

James Blain (25:03):
but wait,

Ken Lucci (25:04):
wait a minute, you man, you mandate that the, the drivers have to, have to drive these AV units.

James Blain (25:11):
yeah, so, all right, so a couple things.
So, so one of the, one of my favorite Elon quotes, right?
And, and we, we give Elon a bunch of crap 'cause he's basically a mad scientist.
But one of my favorite Elon quotes is that the fundamental characteristic and mistake of an engineer is optimizing a thing that shouldn't exist.
Okay?

(25:31):
And the, the thing that people have to understand is a safety driver is a step to get where we're going.
So if you wanna look at an industry that's done this, it's the airline industry.
We used to have that when you would cross the Atlantic: you had four engine planes.
You had a flight mechanic or flight engineer.
You had a pilot in command and a second in command.

(25:54):
Over time, what's happened?
We've gone down.
We got rid.
Now we're happy with two engines, not four.
Now we're happy with two pilots, not one.
The Air Line Pilots Association is now going through a huge push to try and keep two pilots in the cockpit, because Boeing, all these others have come back and said, guys, we can push a button.
And the fricking thing flies itself.

(26:15):
What do we need two guys in here we're paying for, we

Ken Lucci (26:18):
no, no,

James Blain (26:19):
shortage, and they're trying to go down to one pilot.

Ken Lucci (26:21):
no, we need

James Blain (26:23):
now, now this is where it gets interesting, because it depends.
So you look at your Cessnas, you look at all your small private, your small individual aircraft.
You got a bunch of guys flying by themselves,

Ken Lucci (26:34):
Yeah, but they're in charge of themselves.
They're not in charge of 300 souls.
That's the issue I have.
And are we really gonna take Boeing's word for it? For Christ's sake, they're building planes that fall out of the sky just because some computer says it should go up and the damn thing goes down.
Do we wanna talk about the 737 MAX?

James Blain (26:52):
Neither James Blain nor PAX Training endorses any of the remarks made by Ken.
No, I'm just kidding.
Um, so, so here's the deal.
So, interestingly enough, I think we're at a point where the fundamental idea of autonomous driving is that you don't need a driver.
The problem that you have is what I talked about a second

(27:13):
ago.
We as humans trust really fast.
If it works for the first 10 minutes, we think it's gonna work for the next 10 minutes.
And so the problem is you've got all these people blindly trusting in it that don't understand it.
I always go back to when, quote unquote, autopilot came out for RVs.
It was cruise control, but because it was called autopilot,

(27:36):
People would literally go to theback in the back of the RV and
wonder why all of a sudden theycome outta the bathroom in a

Ken Lucci (27:41):
and try to make a, and try to make a sandwich

James Blain (27:43):
Yeah.
Yeah.
So, the, so, so to, to wrap this up in a nice little bow, the one thing that I would say is that we need to only have safety drivers until the data proves that the safety driver isn't needed.
Having a safety driver for the sake of a pension, or to have someone have a job, or just to pay them to be there, in my

(28:03):
mind, it makes more sense to try and find a better role, a better thing that person could be doing that we know AI or technology can't do.
Rather than having safety drivers longer than we need to just 'cause we have a pension fund.
We have to find more effective uses.
And not only that, like, think of coming home from work.
How was your day, honey?
I stared at a fricking robot driving the car like I do every

(28:26):
day.
Well, I scrolled through Twitter on the side, trying to do it in a way that the camera wouldn't notice and tell me that I'm scrolling through Twitter.
That's a meaningless

Ken Lucci (28:33):
But, to your point in New York, you know, they're gonna, New York City is gonna push towards: no, we don't want this evolution in technology, because it's going to displace, you know, 13,000 taxi drivers potentially.
And another, I don't know how many Ubers there are out there,

(28:54):
let's just say another 15,000, when what you are saying is: come
up with a, a plan
where you have a period of time where all of the data is being
collected, and at the, at the same time, come up with a plan
to retrain the, retrain the workforce.

James Blain (29:14):
Yeah.
You have to, you have to build a bridge to transition, right?
I, I'm gonna use the most cliche quote ever, but when one door
closes, another door opens, right?
Fighting progress is dumb.

Ken Lucci (29:27):
I, I, I also don't think you can.
I, I just don't think you can.

James Blain (29:30):
But the other piece of that is that, you know, a lot
of, a lot of it is, we get caught on these problems and we get
this analysis: oh my God, there's 80,000 drivers, there's
a million drivers, there's, you know, 10 million drivers, or
whatever number it is.
We can't solve this, we can't do this.
Why are we? We found something that might be better
done by technology.
We know it eventually will be done by better technology.

(29:51):
How do we find meaningful work for these people, that they can
do effectively, a job that only people can do?

Ken Lucci (29:58):
I've already got the answer to that.
It's to train all these guys to have CDL-P licenses and CDL
licenses, because there's 85,000 school bus jobs open.
There's 275,000 truck driving positions.
So if you like to drive, we're gonna train you on a CDL, or

James Blain (30:16):
the only problem is they're coming for that next, I
would, I

Ken Lucci (30:19):
Wait a minute.
But that'll be next.
But that will be next.
But you still, I refuse to believe, in my lifetime,
you're gonna have a 56-passenger vehicle with no driver in it.
I might be wrong, but, but to your point, then, then

James Blain (30:34):
Well, I don't know how much longer you got left,
Ken.

Ken Lucci (30:37):
I don't know, maybe five years.
I don't know.
Probably five.
Five.
I mean, good ones.
Five.
I've only got five good ones left in me.

James Blain (30:44):
Because, 'cause I think, I think
we're, I think we're 10, I think in 20 years.
Right.
And Ken, if you're not around for that, I'm gonna go, I'm, I'm
going to your grave and bitching every day.
Right.
I

Ken Lucci (30:53):
Let me tell you something.
If I'm on this frigging podcast 20 years from now, we're
gonna both have a problem anyway.
No, but to your point,
I think the point you're making is, instead of fight New
York, and that was the perspective I hadn't heard
before:
instead of fighting progress, why don't you interview Uber,

(31:17):
Waymo, Zoox.
Collectively, you need to interview all these people and
come up with a consensus of how they're gonna be trained to go
someplace else.
And guess what?
You guys are mega rich.
You guys are gonna have to pay for this shit.

James Blain (31:32):
Yeah.
And hey, guess what?
If, if you have to, and I'm not saying to do this, Uber, don't
sue me, right?
But maybe you go in and you say, hey, if you're gonna, like, uh,
for example, in the state of Massachusetts, every single ride
share ride, they have to pay 10 cents and it goes into a fund
that's supposed to be there to, to help people.
It would make total sense for me if they said, hey, for, and as, as

(31:55):
this goes down, if you wanna operate autonomous vehicles in
the state of New York, X out of every ride has to go to the
pension.
That helps transition all of the existing drivers out and take
care of all the ones.
And as those people shrink, we'll start scaling that back.
We all know that'll never happen.
It'll go to building something else.
But, but there are, there are plenty of ways to do that, that

(32:17):
allow progress to continue while still creating a soft landing
for the affected industries or displaced individuals.
And it's just a matter of figuring it

Ken Lucci (32:27):
And before, this is the last comment, and then, John,
we're gonna get to this video: I, I've heard from an
extraordinarily reliable source that Google had a meeting
with the Waymo team, and Google flat out said, we don't give a
shit what it costs.
Just keep going much, much faster.
Go as much faster as you possibly can.
They've got the money, and

James Blain (32:47):
they're winning the race.

Ken Lucci (32:49):
and this is, this retraining of, retraining of
drivers, to me, is something that they could do that would grease
the skids
across the board and move us forward, move us forward into
autonomy.
But I don't see that happening on the robotics or AI
side, but let's not go down that road.

(33:10):
But anyway, I think that that's a solution to what they should
be doing in New York.
If, if New York comes back and says, slow down, what are you
gonna do with these drivers?
You hit the nail on the head: they're gonna have to fund it.
All right, so let's go.
What's next? Tesla Autopilot takes control to avoid two accidents.

James Blain (33:29):
All right.
Let's

John Tyreman (33:30):
How about that headline?
All right.
Yeah.
So, uh, this next series of videos will
have a little bit of a different focus.
So the first batch was around humans that were taking over and
correcting the autonomous vehicles.
This next couple videos is actually the opposite.
This is when autonomous takes over

James Blain (33:49):
the digital nanny.

John Tyreman (33:51):
the digital nanny.
Exactly.
So, uh, I'll, I'll pose this question before we get into the
videos, and then

James Blain (33:57):
be Ken's question.

John Tyreman (33:58):
yeah.
Maybe Ken, we can start with you.
Do you think that AVs can actually drive down fleet
insurance costs?

James Blain (34:14):
No.

John Tyreman (34:15):
Look.

Ken Lucci (34:18):
Well, the first question I have is, John, did
you put that music in?

James Blain (34:22):
Yeah.
Like,

John Tyreman (34:23):
that was not, that was not my selection of music.
I would've picked something much

Ken Lucci (34:27):
Okay.
So let me give a narration for

James Blain (34:29):
oh boy.
Was buzzing.

Ken Lucci (34:31):
For those people in the studio audience that are not
in front of YouTube:
what happened was, the Tesla was in the middle, and on the right
hand side there was a pickup truck with a something on the
back of it, and it swerved into the Tesla's lane.
And so that vehicle swerved in front of
the Tesla going left, so the Tesla went further left over to

(34:52):
the other lane, but there was a car there, so that vehicle got
in the way.
So the Tesla kind of did two maneuvers to get out of the way.
Okay.

James Blain (35:02):
Yeah, this all could have been avoided by
hitting the brakes, by the way.
As the safety guy, I'm like, if he, if he wouldn't
have been hauling, absolutely flying down the center lane, and
just hit the brakes, he wouldn't have had to overcorrect.
'Cause I will tell you, Ken, where I thought that went, and
the reason I gave that reaction, is I totally thought it was
gonna spin out.
'Cause that, that thing was just flopping around

Ken Lucci (35:22):
Yeah.
One thing I did leave out is, the Tesla, whoever was in
that Tesla, got a little bit of a, a jarring ride, because it
did overcorrect a couple times.
Um, okay, so the question you asked is, tee up that question
again.

John Tyreman (35:39):
Yeah, so the, the big question is: with autonomous
vehicles, and presumably with the right level of training and
inputs, they would grow to the point where they would be able
to prevent more accidents, in theory.
So therefore, would that be a viable way to drive down fleet

(36:00):
insurance costs, which are plaguing the industry?

Ken Lucci (36:03):
Yeah.
My answer is yes.
One thing that the, the AVs have got going that the traditional
vehicles of the past don't have is the data
collection.
The data collection in the autonomous space is, is front
and center in every single platform.
Okay.
So the answer is yes: every single incident is being learned

(36:25):
from, especially on the Waymo side, and what we're seeing as
far as Tesla reporting into the cloud.
So, yeah, I do. And I, I also think that

(37:41):
the data collection is going to be done in weeks or months
instead of over many, many, many years.
You're going to have so much data input that it's going to
overwhelmingly make the case that these things are, yes, much
safer, or no.

(38:01):
And the way it's leaning right now is: they are
much, much safer. An autonomous vehicle is much, much safer than
a human being driving a vehicle.
That's what it looks like right now.
So yeah, I do, just purely because of the data collection.

John Tyreman (38:19):
All right, well, here's another video, kind of in
that same vein, of a, uh, a Tesla on Autopilot on the
highway.
Uh, and this one is from Wham Bam Tesla Cam.
Uh, so let's take a look.

Ken Lucci (38:32):
For those of you who want to continue to follow this,
go onto YouTube

James Blain (38:38):
Yeah.
Yeah.
You gotta see these

Ken Lucci (38:39):
and find Wham Bam Tesla Cam.

James Blain (38:49):
Is that bam or is that.

Ken Lucci (38:51):
wham bam.
Yeah,

John Tyreman (38:52):
Can't really tell.
But if you're also going to YouTube, also go to at Ground
Transportation Podcast and give us a

James Blain (38:59):
guys.
Yeah, yeah, yeah.
Forget those guys, right?
We're, we're just, we just gave him free,
we'll send him a bill for that, that call-out later.
Go to our, go to ours instead.

John Tyreman (39:07):
All right.
Well, we're, uh, you know, courtesy of Wham Bam Tesla Cam,
here is, uh, this video of the Tesla Autopilot on the highway.

James Blain (39:14):
Let's see here.

Ken Lucci (39:29):
Oh,

James Blain (39:42):
Okay.
Elon, what the hell?
You, you, what the hell?
He had a whole empty lane next to him.
Beep, beep, beep.
You're gonna get hit.
Move, stupid.
Like, are you kidding me?
It knows it's coming.
It could have just moved.
You saw in the rear view cam, the other car moved.

Ken Lucci (39:58):
But Okay.
But, but the typical human reaction would've been to slam
into the car in front of him.

James Blain (40:06):
Oh, hell no.
If you're PAX trained, you better have moved.
If I find out you're PAX trained and you slowed down and you
didn't look in the rear view and you didn't just move, I'm

Ken Lucci (40:14):
you?
So what you are, wait a minute.
What you're saying is you're not confident that the Tesla is
doing the 12 o'clock, three o'clock, six o'clock, nine
o'clock

James Blain (40:24):
Well, but All right.
All right, roll.

John Tyreman (40:25):
But if you get

James Blain (40:26):
Can we, can we roll this back?

John Tyreman (40:28):
wouldn't that car behind them slam into the, the
car, if you would get out?
Oh yeah.
Let's roll it

James Blain (40:32):
Roll it back, for, play the tape.
Play the tape.
All right.
Look, you've got a whole empty shoulder to the left.
Okay.
I don't know if, I don't know if there's a mute on this, but, but
there's, there's a whole empty left lane here. You can see in
the rear,
that guy's coming.
Get the hell out the way.
Like, I, I, I don't cuss, I don't cuss real often, but like

(40:54):
in my head, all I'm hearing is that Ludacris song from when I was
in high school.
Move, bitch, get out the way.
Get out the way, get out the way, get out, get.
Like

Ken Lucci (41:08):
there's no, there's no need for that kind of talk.

James Blain (41:12):
Says, says, Mr.
F Bomb.
But, but

Ken Lucci (41:14):
Are you, uh, wait a minute.
Are you saying go to the shoulder on

James Blain (41:17):
but look, look, you've got, you've got an empty
right lane.
You've got an empty left lane.
Yeah.
You've got two empty lanes, the guy behind him.
So if you watch that again, in that impact, what looks
like some kind of Ford Focus or something went around the van
that hit him.
I'm telling you, in this one, anytime I'm slowing down, right,

(41:38):
I'm looking rear of me and I'm looking behind me.
And if I see that guy's, look, brake lights, I'm, I'm
instantly,
oh crap,
the guy behind me doesn't see it.
Here he comes. Tesla: beep beep.
You should have moved.
Right.
It, it, effective drivers would've tried to move.
Are you successful?
I don't know, but I can tell you he has two empty options, left

(41:59):
and right.
Neither of them were used.
He's psyched because he didn't hit the car in front of him.
I'm pissed because the passenger in my backseat is now not gonna
make it to the airport on time,
because he had an accident.

Ken Lucci (42:10):
Well, not to mention the fact the passenger in the
back seat's gonna sue you for whiplash, you know?
Um,

James Blain (42:16):
things, speaking

Ken Lucci (42:17):
point.
Yeah.
To your point, to your point, I think that the beep beep, beep
beep is not an ample warning for

James Blain (42:24):
yeah.
It might as well have just said: you are screwed.

Ken Lucci (42:27):
No.
It should have sent: brace, brace, brace.
Rear crash coming.

James Blain (42:31):
I feel like, you know, that's the type of thing,
for me, look, I, I get that
not every accident is avoidable, but most unavoidable accidents
are a result of putting yourself in a compromised position, and
you can generally avoid putting yourself
in a compromised position.
And I think he was actually in good position to make a move.

(42:53):
Right, or make a move left.
We saw a Tesla practically fishtail out 'cause it moved
outta the way earlier.
It just, in this case, decided to stay straight instead of moving
outta the way.

Ken Lucci (43:03):
And lemme ask you a question.
When you think of the possibilities of an accident
occurring, doesn't that scenario almost always come up, where
someone, someone comes, brakes in front of you?
Okay?
My whole point is, that's not an autonomous thing, that's happened to
every one of us.

James Blain (43:23):
Every day.

Ken Lucci (43:24):
Okay,

James Blain (43:25):
We teach the safe zone.
Right?
So for, for me, my big thing is, if you ride with me on the
highway, right?
Especially, you know, all of us have our modes where we're not
as high alert, but if, but if you ride with me on the highway,
I typically am going to purposely ride in what I call
open pockets.
And so I'm going to try and find an open pocket where I have some

(43:47):
option to move.
And I don't care how angry you get, I'm gonna make sure that
there is space in front of me, because if I have to stop short,
I have to be able to get outta the way.
I have to make sure I've got plenty of room riding.
Having that space to maneuver is crucial.
And in this case, he had it, he just didn't use it.
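The open-pocket habit James describes comes down to simple arithmetic: a following gap is reasoned about in seconds, and the distance that buys grows with speed. A rough sketch, assuming the common 3-second defensive-driving rule of thumb (an illustrative figure, not something PAX Training prescribes in this episode):

```python
# Rough sketch of the "open pocket" gap. Assumption: the common
# 3-second following rule; nothing here is specific to PAX Training.

def gap_needed_ft(speed_mph: float, seconds: float = 3.0) -> float:
    """Distance covered in `seconds` at `speed_mph`, in feet."""
    feet_per_second = speed_mph * 5280 / 3600  # mph -> ft/s
    return feet_per_second * seconds

# At 70 mph, a 3-second pocket is over 300 feet of clear road:
print(round(gap_needed_ft(70)))  # 308
```

The point of computing it is the scale: at highway speed, the pocket that leaves you an escape option is the length of a football field.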

Ken Lucci (44:06):
Good point.
Okay, now

John Tyreman (44:08):
Now we're

Ken Lucci (44:08):
Romania.

John Tyreman (44:09):
Romania.
Yep.
Now, no, we're, we're off to Romania.
And this one is a little bit of a different question.
So thank you, Ken, for giving us your thoughts on whether you
think AVs could actually drive down fleet insurance costs.
But I think this next question, I would love for both of you to
chime in: who is at fault in an AV crash?

(44:30):
So what we're gonna

James Blain (44:31):
it's Elon's fault.

John Tyreman (44:33):
is we are gonna go to Romania and we're gonna check
out this video.
Alright.
You ready?

James Blain (44:38):
Yep.
Let's go.
Oh, and almost still hit him.

John Tyreman (44:54):
Alright.

Ken Lucci (44:55):
so let's tee that up.
For those of you in

James Blain (44:57):
life.
A human driver would've struggled with that.

Ken Lucci (45:00):
So those of you in the studio audience who can't
see this on YouTube, um, what happens is, the white Tesla is
going on by itself, and it is in the right hand lane, but next to
that lane there's pedestrians,

James Blain (45:15):
is like those little bollards you see in

Ken Lucci (45:17):
Right.
The road is separated by some bollards, but there's a sidewalk
there, and a pedestrian falls into the roadway, and the Tesla
completely, 100% avoids him by going the only place that they
could, which is into the next lane, the oncoming traffic.

(45:38):
Okay.

James Blain (45:39):
and it still almost hits him, 'cause it hits that car.
It bounces.
I've been watching this.
We've got it looping for us to watch here; when it bounces back,
that rear right tire almost...
Well, I think he gets barely hit with the bumper, but that tire
would've been on him,
Ken Lucci (45:53):
but uh, that is a fantastic save.

James Blain (45:56):
Yeah, but this is the trolley problem.
And for those that aren't familiar with the trolley
problem: you've got, you've got a trolley going down
a track, okay?
And you've got two different tracks.
You can throw a switch; there's a trolley coming.
You have no way to stop it.
You throw one switch, one person dies; you throw another switch,
three people die.
Right now it's heading towards the three people.

(46:18):
So the question becomes: if I go, oh God, I gotta save those
three people, and I throw the switch and one person dies, I am
now a direct murderer.
I murdered that person.
I am personally responsible for their death.
If I do nothing, three people die.
But that wasn't really my fault.
I didn't do anything.
I just watched them die.
So the question becomes, in the trolley problem, do you throw

(46:40):
the switch and kill the one person to save three?
Or do you say, hey, I don't wanna kill anybody?
If they die, they die.
I'm hoping to God Jesus takes the wheel and it derails the
trolley and everybody's okay.
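The switch choice James lays out can be sketched as a bare expected-harm comparison. This is illustrative only: no real AV planner reduces to this, and the act-versus-omission distinction that makes the dilemma hard for humans is exactly what the code ignores.

```python
# Illustrative sketch of the trolley choice as pure harm minimization.
# The function and numbers are invented for the example; this is not
# how any AV stack actually decides.

def choose_track(harm_if_no_action: int, harm_if_switch: int) -> str:
    """Pick the action with the lower expected harm."""
    # A purely utilitarian rule throws the switch whenever that
    # reduces total harm, with no weight on act vs. omission.
    if harm_if_switch < harm_if_no_action:
        return "throw switch"
    return "do nothing"

# Three people on the current track, one on the siding:
print(choose_track(harm_if_no_action=3, harm_if_switch=1))  # throw switch
```

Note the tie-breaking: when harms are equal, the rule does nothing, which is itself a moral choice baked in by whoever wrote it.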

Ken Lucci (46:52):
So, but I look at this differently.
This is all probability: the Tesla looked at it and said,
the probability, if I hit that human, is that human being's gonna
die.
If I go into the next lane and hit a vehicle, chances are lower
that the

James Blain (47:07):
You're giving it, you're

Ken Lucci (47:08):
that,

James Blain (47:09):
much credit.
I don't... you're giving

Ken Lucci (47:11):
I think it...
I, I'll say it made the right decision.

James Blain (47:15):
I, it made the right decision based on the
sequence of events, in my

Ken Lucci (47:19):
Oh, totally.
But what choice did it have?
The, the,

James Blain (47:24):
My point is, the guy falls into the street first.
All right.
So if you, so, all right, I'm gonna go full programmer for a
sec.
Yeah.
It's that jackoff's fault.
Um, so yeah.
Yeah.
Why you falling in the street?

Ken Lucci (47:35):
Wha, was he drinking?
He can't walk on a sidewalk.
What's your problem?

James Blain (47:40):
Romania, uh, no, but in all seriousness,

Ken Lucci (47:43):
He trips on the pothole.
He trips on a pothole and it throws him.
He gets thrown into the street.

James Blain (47:50):
All right, so let's, let's bring it back for a
second.
Computers work on a sequence of events.
Computers are typically monolithic, so in this case, it
sees someone fall into the road and it takes immediate action to
not hit that person.
There are bollards on the right.
There's an open lane on the left, okay?
If you look, there's actually two lanes of oncoming traffic.

(48:14):
Arguably, if it would've seen the guy falling and not just
been fully reactionary, it might've been able to, and a
human driver couldn't do this, but it might've been able to
calculate how to get into the far left lane and go around the
green car on the far side.
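The purely reactive, sequence-of-events behavior James is describing can be sketched as a loop that takes the first clear escape option as each hazard appears, with no lookahead. The lane names and blocked/clear inputs here are invented for illustration; a real planner is far more involved.

```python
# Hedged sketch of reactive hazard avoidance: scan escape options in
# order and take the first clear one. All names here are invented.

from typing import Optional

def pick_escape_lane(hazard_lane: str, lanes: dict[str, bool]) -> Optional[str]:
    """Return the first lane (other than the hazard's) reported clear.

    `lanes` maps lane name -> True if blocked (bollards, oncoming car).
    """
    for lane, blocked in lanes.items():
        if lane != hazard_lane and not blocked:
            return lane  # first clear option wins; no lookahead
    return None  # boxed in: nothing left but braking

# Pedestrian falls into the right lane; left is open, far left oncoming:
print(pick_escape_lane("right", {"right": True, "left": False, "far_left": True}))
# prints "left"
```

The limitation is exactly the one in the clip: a one-step reactive rule can dodge the first hazard and then "run out of clock" on whatever its chosen lane contains.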

Ken Lucci (48:28):
Uh, I dunno if there

James Blain (48:29):
this is, but this is, but this is the same thing
you see in movies where, like, the AI is shooting at 'em and the guy
is dodging the bullets.
When AI hits war, it's gonna suck,
'cause AI doesn't miss. AI isn't like us.
AI would only take the shot if it knows you're dead.
Right.
This whole, like, AI...
So I think what we're seeing here is we're seeing a sequence

(48:50):
of events that led to an outcome.
It made the, it made the decision to go to the other
lane.
It then had to figure out what to do with the green car, and the
clock ran out.
It just ran out of time.
I don't think it calculated that.
Um, 'cause now we have the I, Robot problem.
The kid has a lower chance of surviving than I do, but he's

(49:12):
got more life ahead of him.
Right.
That's the whole, if you watch, and by the way, um, the I, Robot
movie with Will Smith was based on the Isaac Asimov book that
develops the

Ken Lucci (49:22):
He's take, he's take, he's taken us down a

James Blain (49:24):
I know, I, this is what, this is what you get
when you bring up I, Robot.
But, but the, but the problem is ultimately going to become,
these are moral and ethical human problems, not robot
problems.
That's

Ken Lucci (49:39):
in that case, it is a good point.
He

James Blain (49:40):
And it only made the decision because of the
sequence of events.
We can't, we can't humanize it and say it didn't want to kill
the pedestrian.
It said, oh shit, there's something in the road.
I don't want to hit it.
And then it got to the other side.
It

Ken Lucci (49:52):
I refuse to believe that.
I think Tesla knew that it was a human being.

James Blain (49:57):
And your Roomba loves you, and it's not mapping
your house.
It's keeping your house clean.

Ken Lucci (50:01):
Well, the only way to prove my

James Blain (50:02):
My robotic cat isn't spying on me.
It really loves me.

Ken Lucci (50:06):
My, my, the only way to prove my hypothesis is to
throw a cardboard box in front of,

James Blain (50:13):
I thought you were gonna say a child.
Thank God it was a cardboard box.

Ken Lucci (50:16):
It's, and you know, it's, if I'm looking, if I'm
looking at that Romanian video clip and I wanna prove, if I
wanna prove my point, I'm duplicating that scenario and
throwing a cardboard box out there, and I'm betting you that
that Tesla goes right over the cardboard box.

James Blain (50:36):
uh, maybe

Ken Lucci (50:38):
Well, maybe there was a human, wait a minute, there's
a

James Blain (50:40):
Elon.
Come on the podcast.

John Tyreman (50:42):
Yeah, let's test that out.
Elon.
Come on.
Show us.

James Blain (50:45):
There's a, there's a robotic cat in the cardboard
box that we're throwing in front of the robotic taxi to see.
And then, and then we'll throw, from the other side, we'll throw
a real cat in a box, and we'll see which box it runs over.

Ken Lucci (50:57):
We really need Elon.
We need Elon on this.
We need him on the podcast to settle these important
questions.
Okay, now we've got a robo-taxi company:
Cruise.

John Tyreman (51:08):
All right, so the question remains: who's at fault
in an AV accident?
So let's play this clip, and then we'd love to get your reactions.

Ken Lucci (51:17):
Mm-hmm.

James Blain (51:41):
Oh God.
Rest that poor woman.

Ken Lucci (51:44):
But, but to be clear, she got hit by another car
first.

James Blain (51:47):
Yeah, so, so John, correct me if I'm wrong, this is
a pretty dark story.
So basically, the Cruise vehicle's driving along, this
woman gets hit, it tosses her under the Cruise vehicle.
The sensors don't notice it, right?
Uh, she gets dragged. I don't know what eventually stops it,
but she basically gets dragged.
And the argument here is a human would've known something

(52:10):
happened.
Now, that said, we should probably also pull up the clip
of the guy driving 90 down the highway with no wheel on his
vehicle, dragging the front caliper on the, the highway,
'cause I call BS.
But the whole idea here being that a human would've known
better, and that the fact that she got dragged, and, and like I
said, this is, but this goes back to what I just said a second

(52:31):
ago.
It is a tragic series of events, and the order of operations
matters, because if I remember right, the whole deal is that,
because she got
tossed under the vehicle,
the Cruise vehicle
didn't know she was down there getting dragged.

Ken Lucci (52:48):
Well, a, a couple things, you know, couple, couple
things.
First and foremost, um, Cruise is out of, Cruise is out
of business, you know, and I firmly believe for reasons.
Um, when I look at the dollars that Waymo has put into AV
compared to the dollars that GM put into it, I think
that, number one, it's an evolution of autonomous technology.

(53:12):
And today,
you know, today, I think if the Cruise had encountered anything
that had that kind of an impact, the tech is such that it
would've stopped the vehicle.
Um, would she still have died?
Sure.
She still, she still would've died, because that, that
sequence of events is tough to prevent.

(53:32):
Right.
She was tossed in front of it.

James Blain (53:34):
but these are human problems, right?
You hear stories all the time of someone that, you know, they
have, they have a, and, and this is again very tragic, but this
is what we're talking about.
You hear stories all the time of someone who, they get hit by a
subway or they have something massive fall on them, and
there's no way to save them. Like, they're done.
They, they're pinched in a way
that we cannot extract them and save their life.

(53:58):
And so I think at a certain point, you know, you have to be
careful of the hype versus buzz, because there's a whole group of
people ready to point the finger at AV and be like, that shit
doesn't work.
You're gonna kill us all.
While there's another group that's like, oh no, that's
great.
We need to do that tomorrow.
And I think, um, when, and again, back to the International

(54:18):
Association of Transportation Regulators, at their conference
this year, they had two guys back to back.
They had a, uh, the guy that basically was, hey, this is
great.
This is awesome.
Let's do it.
We believe in them a hundred percent.
And then they had a professor that was, hey, you're going too
fast.
This is too dangerous.
Don't do this.
And at least what, what came outta that for me is that,

(54:40):
especially in the US, it's left or right.
It's red or blue, it's, you know, black or white.
There's no in between.
And I think in our world there's a middle ground, because
there's unfortunately accidents like this all the time.
There's drunk drivers all the time.
There's things that happen, you know, every X amount, and this
is a tragedy, but I, I don't, there's, if she got hit and

(55:02):
thrown under the wheel, it's not like the AV could have jumped
and not hit her.
So, to a certain extent, some of these are there.
Like I said, not all events can be avoided.
A lot can; some circumstances are freak accidents.

Ken Lucci (55:18):
but if she was thrown in, in front of that vehicle, 25
feet in front of it

James Blain (55:25):
and

Ken Lucci (55:25):
And this... correct.
So, I hate to say it, but based on that video, you know,
engineers guaranteed looked at it and said, okay, wait a
minute, now we need to put sensors on the front bumper.
And, and I guarantee you that that senseless death caused some

(55:46):
changes in technology.
But, but to my way of thinking, we've gone through a
second revolution on AV. The first revolution on AV was done
by vehicle companies.
It was done by the, by the automotive companies, and it was
a miserable failure.
And then, and then Uber tried it, and,

(56:08):
again, they were just using basic vehicles.
Waymo, Waymo and Zoox started from the ground up.
And Waymo, Waymo and Tesla and Waymo and

James Blain (56:18):
and Apple had a

Ken Lucci (56:18):
X,

James Blain (56:19):
they give up on.

Ken Lucci (56:20):
they gave up on it.
So I think the, yeah, I think the evolution is, has, um,
moved forward tremendously since the Cruise incident.
The other thing I'll say is we have, without question, a great
source of data and a great source of what could happen and

(56:43):
has happened.
When you look at China having 30,000 robo-taxis on the road,
23 hours a day, one hour for cleaning, one hour for a new
battery.
And this, and the sad part is, I'm, I'm just gonna say it, I
think China has a different way of looking at things.
And there's a completely different relationship between

(57:04):
regulatory and business.
I mean, the,

James Blain (57:06):
Yeah.
That's the home of the suicide net at the factory.
Right?
They were having people's, people were jumping off the
factories to commit suicide, so instead of fixing the working
conditions, they put nets around so that when you would try to
kill yourself, you didn't die like that.
Clearly,

Ken Lucci (57:20):
a different, it's a different way of thinking.
And, but the, but, but they areway ahead on, I they're way
ahead on autonomous vehicles.
'cause they don't give a shit ifpeople drop to their debt.
No, I'm only kidding.
I'm only kidding.

James Blain (57:34):
but, well, but, but you've brought up a fundamental
business problem, right?
So I recently listened to a podcast on Dieselgate.
So Dieselgate, basically: Volkswagen had this idea that
they were gonna create
clean diesel vehicles, and the CEO comes down and he says,
we're doing this.
It's happening.
It's gonna work, it's gonna be there.
The engineers couldn't make it happen.
They couldn't make it happen.

(57:55):
But the mandate was, I don't care what you're doing, you get
it done.
So what did they do?
They cheated.
They made it, they, well, so what they did is they used
what's called a defeat device.
So when you were testing the vehicle, it would notice that
the steering wheel wasn't moving.
It would notice all of these inputs and it would go into like
a limp mode, and in that limp mode it would give out certain

(58:15):
numbers.
And the only reason they got caught is some college kids got
a grant to drive vehicles and test them.
And they had like this jerry-rigged cardboard box testing, and
they couldn't get the Volkswagen to do it.
And they didn't believe it.
Like, they'd marketed so hard, they'd sold it so well.
The kids are like, we screwed up.
Like, we, we got something wrong.
And the kids kept testing again and again and again.

(58:38):
They couldn't get it to actually do it in the real world, and
eventually it turned into a giant issue.
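The defeat device James describes amounts to inferring dyno-test conditions from vehicle inputs and switching calibrations. A minimal sketch of that shape; the names, inputs, and threshold are invented for illustration, not Volkswagen's actual logic.

```python
# Invented sketch of "defeat device" logic: wheels turning while the
# steering wheel stays fixed looks like a dyno test, so run clean.

def emissions_mode(speed_mph: float, steering_angle_deg: float) -> str:
    """Guess dyno vs. road from two inputs and pick a calibration."""
    on_dyno = speed_mph > 0 and abs(steering_angle_deg) < 0.5
    return "clean_test_mode" if on_dyno else "normal_road_mode"

print(emissions_mode(50.0, 0.0))   # clean_test_mode (moving, wheel fixed)
print(emissions_mode(50.0, 12.0))  # normal_road_mode (real steering input)
```

Which is exactly why on-road testing caught it: once the car is actually being steered, the "test" branch never fires.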
But I think the other side of this, and it kind of goes to
what you're saying about China, is, to a certain extent, I
think you have to be aware that there are times that business
wants results that will be good for business even if they're not

(59:02):
feasible with the current technology.
Right?
I think so.
I think that's, for me, what's scary about this is, and we've
seen the same thing, believe it or not, in nuclear power and the
nuclear regulatory industry.
And that's why you have, you know, nuclear, a, a lot of these
nuclear meltdowns. If you ever watch the Chernobyl series on HBO

(59:22):
Max, it's great, but it will scar you, because it was
literally the Soviet Union decided, hey,
we're gonna test this thing and we're gonna run the test.
And we don't really care about safety or anything like that.
The test has gotta be done.
And oh, by the way, we didn't tell anybody about the flaw in
the reactor because that would make us look bad.
You get, you know, incidents like Three Mile Island; they

(59:43):
start looking at Three Mile Island's maintenance record, and
it's crap.
You know, this is not nuclear power, but this is something
that, you know, I, I hear all the time: what if it gets
hacked?
What if it fails?
What if it gets struck by lightning?
I mean, these are all things that, you know, like I said
earlier, I, I support the technology, but to a certain

(01:00:04):
extent, you've gotta do a safe and effective rollout, and you
have to transition into something.
You can't just say, hey, I don't feel like driving anymore, and
instead of driving three hours, I want to be able to work on my
laptop, so I'm gonna adopt it tonight.

Ken Lucci (01:00:17):
Well, in the end, we had the, uh, CEO of Obi on,
who talked about the fact that Obi is a ride share
aggregator.
Think about, you know, Booking.com for the ride share
companies.
You can find the best prices on Obi. And her, one of the
statistics she gave was, when they did a survey of, I think it

(01:00:39):
was 10,000 ride share customers, 78% of them said, yeah, they
would prefer an autonomous, or they would, they would
absolutely, as long as the safety was there, they would,
they want an autonomous vehicle.
So we're in an exciting inflection point where, where
they've worked in Phoenix, they're working in Austin.

(01:01:00):
We'll see how they're going to go in more congested cities, and in cities that have older infrastructures and perhaps, you know, aren't all flat and don't have the best weather.
So I think it's gonna be critically important, more than ever before.
The answer is data.
The answer is safety data.
One of the things that I really like about the AV evolution is

(01:01:25):
every single one of these vehicles is a force multiplier for law enforcement, because they have cameras on them.
So to

James Blain (01:01:35):
So I think that only makes sense until you get someone like Apple that's like, nah, we're not giving it to you.

Ken Lucci (01:01:40):
Okay, but wait a minute. In the city of Miami, they have an autonomous police car that is going around, and one person can monitor six of those police cars.
And at some point AI is going to be looking at this data and saying, wait a minute, this all looks good, except the guy running down the street with the gun in his hand.

James Blain (01:02:03):
Well, but the problem is, now you start getting into the question that they have in England.
Right?
And so you look at England, and they have more CCTVs than anywhere on the planet.

Ken Lucci (01:02:13):
Yes, the city of London has

James Blain (01:02:15):
Right, they were testing technology at one point that literally listened to tone and voice inflection, and anybody that sounded aggressive or raised their voice, it would flag.
Right? And this is me swinging the pendulum as hard as I can.
At what point does this become Minority Report, and I'm in trouble for something I haven't done yet?

Ken Lucci (01:02:35):
Well, and what if you're inside the AV vehicle and it's listening to you?
There's no question.
I mean, I never thought I would live to see the day that we had robots and AI and the AV units, even as much adoption as there is now, even though

James Blain (01:02:53):
Oh, I'm still waiting on the Terminator.
I expect to see the Terminator in my lifetime.
And I don't mean that as a joke; I mean that in a serious sense, right?
Genuinely. We are already seeing drone strikes, right?
And now we're seeing it with Ukraine: you are seeing autonomous drone strikes.
They program the target, they turn it loose.

(01:03:14):
It determines what the target is and isn't.
And if it doesn't have the target it wants, it picks a secondary one.
So it's not the Terminator in the sense of Skynet, but I genuinely believe that in my lifetime we will see that.
And right now, we could have the Terminator tomorrow.
Elon's got his humanoid robot; give him an AR-15, right?

(01:03:36):
And all of a sudden he's the Terminator.
He's

Ken Lucci (01:03:39):
You know, John, we really do miss him when he's not here, but he's taken us quite far afield today.
All I wanna do is see autonomous vehicles in New York City.
He wants to see RoboCop out there with an AR.

James Blain (01:03:53):
Whoa, whoa, whoa, whoa.
I didn't say anything about putting a human brain in there.
Okay.
Now, you, you?

Ken Lucci (01:03:59):
Well, listen, we've given our audience something to think about.
As usual, this is the Ground Transportation Podcast.
We are gonna have more episodes with John, our producer, showing us videos.
Please subscribe to the Ground Transportation Podcast, and go on YouTube so you can see all of the shenanigans that have gone on, uh, in video as well.

James Blain (01:04:21):
Give us a like, give us a subscribe.
Let

Ken Lucci (01:04:23):
Absolutely.
And we've got some exciting stuff coming up in the future.
So thanks again for, uh, joining us today, and have a great weekend.
Thank you for listening to the Ground Transportation Podcast.
If you enjoyed this episode, please remember to subscribe to the show on Apple, Spotify, YouTube, or wherever you get your podcasts.

(01:04:43):
For more information about PAX Training and to contact James, go to paxtraining.com.
And for more information about Driving Transactions and to contact Ken, go to drivingtransactions.com.
We'll see you next time on the Ground Transportation Podcast.