Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
A self driving shuttle bus in Las Vegas operated for
almost two whole hours before being involved in a minor accident.
Are self driving vehicles really the safest option? Yes, they are.
I'm Jonathan Strickland and this is TechStuff Daily. Let's cover
the news first. A French company called Navya developed an
(00:27):
autonomous electric vehicle designed to travel along predetermined routes. The
city of Las Vegas had such a shuttle to carry
passengers around the Fremont East district. The shuttle itself has
a very modest route at just over a half mile.
It's really just a convenience vehicle. In early November, after
being in operation for about two hours, one such shuttle
(00:49):
collided with a truck, or rather, it's more appropriate to
say a human operated truck backed into the shuttle. According
to a Triple A representative, the accident was due to
human error, not the shuttle. Triple A, by the way,
is a sponsor of this driverless shuttle pilot program. A
representative of the city of Las Vegas gave a bit
more information on the official Tumblr page for the Las
(01:11):
Vegas City government. It read that the shuttle did what it
was supposed to do, that its sensors registered the truck,
and the shuttle stopped to avoid the accident. Unfortunately, the
delivery truck did not stop and grazed the front fender
of the shuttle. While the headlines may make it sound
like driverless vehicles are unreliable and unsafe, a closer look
(01:32):
at the stories shows that that's not really the case. Google's
self driving vehicles have been in operation for a few years,
running tests quietly in northern California, and while the cars
have been involved in several incidents, nearly all of those
were due to human error. In some cases, the Google
vehicle was in manual operation mode at the time, meaning
(01:53):
a human driver was at the controls and made a
mistake resulting in an accident. In other cases, drivers of
other vehicles failed to follow the rules of the road
and caused an accident. In fact, before Google spun
out its self driving car division into a new company
called Waymo, the list of incidents only contained one case
in which the company determined the self driving system was
(02:15):
at fault. In that incident, a Google car was navigating
a street in Mountain View, California, and approaching an intersection.
The car sensed an obstacle in the road. It was
a pile of sandbags around a storm drain. The car
moved slightly out of its own lane to pass the obstacle,
then attempted to get back into the lane it was
supposed to be in. The only problem was that a
(02:36):
bus was approaching from behind and started to fill that space.
The self driving car's AI assumed the bus driver would
slow or yield to the self driving vehicle, and it
began to merge back into its lane. Then the self
driving car learned the valuable lesson that two objects cannot
coexist in the same physical space at the same time.
(02:56):
There was a low speed collision. Google admitted the car
was at fault for making a baseless assumption that the
human driver would yield. And yes, the apology did sound
a bit like Google was throwing a little shade at
the bus driver, but only a little. If you look
at statistics, self driving cars are extremely safe, particularly if
you're looking only for the times they are at fault
(03:17):
in an accident. Google's cars have driven more than two
million miles on streets in the United States with only
one at fault accident. That's significantly lower than for human drivers.
New drivers are forty times more likely to get into
an accident than a self driving vehicle is. However, self
driving cars do tend to get into accidents caused by
(03:39):
human drivers at a rate higher than you'd see in
an experienced human driver. Self driving cars might be a
bit too polite and safe, avoiding aggressive tactics that can
sometimes prevent an accident. Maybe self driving cars place more
faith in humans to do the right thing when they
drive. Foolish robots, that trust will be their downfall. To be
(04:00):
serious for a moment, self driving cars have the potential
to make an enormous positive impact. For the year twenty sixteen
in the United States, the National Safety Council estimated that
forty thousand people died in motor vehicle crashes. From a
statistical standpoint, self driving cars have the potential to prevent
thousands of deaths every year. On a lesser note, they
(04:21):
could also reduce property damage incidents and reduce strain placed
on emergency rooms and first responders. They could also help
alleviate traffic snarls and reduce commute times in dense cities.
While several companies, including Waymo, are aggressive in getting self
driving cars out on the streets and into more real world scenarios,
we also have to face some facts. The National Highway
(04:42):
Traffic Safety Administration recognizes six levels of autonomy, from zero
to five. At zero, you have a car that is
completely human operated. Levels one and two cover driver assist systems,
in which an automated feature might kick into gear when needed,
but the car is still largely under the control
of a human driver. A Level three system is largely
(05:03):
automated but can allow for a human to take control
in a safety critical situation. This one is tricky because
handing off the control of a vehicle isn't easy to
do in a seamless way. It's sort of like having
a passenger reach over and grab the steering wheel to
yank it to the side in order to avoid a crash.
The passenger and driver might find themselves struggling against each
(05:24):
other for a moment. Level four is a fully autonomous
vehicle that can operate in an operational design domain or
ODD. That means the vehicle has restrictions on
where it goes and under what conditions it operates. This
is the level the shuttle bus I mentioned at the
top of the show is supposed to inhabit. The vehicle
should be capable of handling all situations that fall inside
(05:46):
that ODD. You could argue the shuttle failed
to do this and that it didn't find some means
of preventing the fender bender, though without all the details
that's hard to say. Level five is a fully autonomous
system that can work under any driving scenario. That
would be a car that could adapt to conditions the
way human drivers can, no matter the weather, traffic, or road conditions.
(06:07):
Most experts in the field say we are still a
long way away from this, as teaching machines how to
recognize a threat versus something that is ultimately harmless is tricky.
Humans can extrapolate based upon experience. Machines have trouble with that.
All that being said, I'd still hop into a self
driving shuttle without hesitation, assuming I was confident in the
abilities of the company that made the sensors and software. I,
(06:31):
for one, welcome our robot chauffeur Overlords. To learn more
about autonomous vehicles and all other things tech related, subscribe
to the TechStuff podcast. It's robot approved. See you
again soon.