Episode Transcript
(00:00):
This is the Elon Musk podcast, your daily hit of what is really going on at Tesla, SpaceX, xAI, and the rest of the Musk universe.
I'm your host, Will Walden, and I have covered Elon Musk for more than five years, spent a year on the ground at SpaceX's Starbase during early Starship development, and before this I spent my career as a software developer working
(00:20):
with billion-dollar companies.
I've also built and sold my own businesses, and now I make content and help other people grow their companies.
Now on this show, I use that experience to break down the news, filter out all the noise, and give you clear context you can actually use.
Tesla has begun pushing Full Self Driving Supervised version
(00:43):
14.2 to cars in the field, and the company is framing this build as the step that enables a wider release.
Now the update centers on a new, higher-resolution vision encoder and a collection of driving behavior refinements that aim to make the car smoother and more confident.
The big question is whether this rollout stays limited to a
(01:03):
small set of vehicles or expands quickly to much of the fleet, as the company hints.
Now today we're going to walk through what changed in V14.2, what hardware is receiving it first, how the release notes describe new emergency vehicle handling and routing, and why the parking and arrival options are a practical shift.
We'll also look at the October
(01:25):
comment that set expectations for broader availability.
Then we're going to talk about what it's going to be like and what a measured, widespread push likely looks like in practice.
Now, after the break, I'll reset with the core facts and then move feature by feature with clear examples pulled from Tesla's
(01:45):
notes. Now, Tesla started rolling out
FSD supervised 14.2, and the company ties this build to the
moment when a broader release could begin.
The release notes focus on vision, emergency vehicle protocols, routing that adapts to blockages, and a set of new parking choices, with several scenario-specific improvements
(02:06):
listed plainly.
Now with that baseline, let's unpack the details and what they add up to for drivers using FSD
on daily routes.
Now the headline change is a fully upgraded neural network vision encoder that reads higher-resolution features from the scene.
Now, in practice, that means the
system should capture finer details and reduce ambiguity in
(02:30):
cluttered environments, which supports more decisive actions.
Now this connects directly to better detection of emergency
vehicles, obstacles and even human gestures.
So the camera stack is not just seeing more pixels, it is trying
to interpret intent and motion in a more granular way.
Now emergency vehicle behavior gets specific treatment with
(02:52):
this one. Tesla describes added logic for
controlled pullovers and yielding behavior when the system
encounters police cars, fire trucks, and ambulances.
That's a practical step because these encounters carry legal
expectations and social norms that people take for granted,
and the software now aims to follow those norms with
structured decisions. If the system recognizes
(03:13):
flashing lights and adjusts predictably, it reduces
confusion for other drivers and for the occupants of the Tesla.
Now routing and navigation move deeper into the vision network
itself, which matters when the roads are blocked or detours
appear with little warning. And the notes say that the car
can respond to blocked roads and detours in real time because
planning is more tightly coupled to perception. That architecture
(03:37):
should help the system replan without awkward pauses when a
lane closes or when a route changes near construction, which
is where earlier versions sometimes hesitated.
Now there are several driving scenarios that receive targeted
refinements in this build. The notes call out unprotected
turns, lane changes, vehicle cut-ins, and interactions
(03:58):
with school buses. Each of these is a stress test
for prediction and yield behavior, so tuning them
suggests work on both comfort and compliance.
And when the car judges gaps more clearly on an unprotected
left, commits earlier on a safe lane change, or respects school
bus rules without abrupt braking, the occupant
experience improves while risk drops dramatically.
(04:21):
Now there's also parking and arrival options, and they've
added a new layer of user control to that.
The release lists choices for where FSD should park or drop
off, such as a lot, the street, a driveway, a garage, or the
curbside.
Now that matters for the last 50 meters of a trip, where earlier versions could second-guess a destination or stop in a suboptimal place a couple of blocks
(04:44):
away and just expect you to get out and park there.
Most of the time, that doesn't work.
And if the driver can pre-select the arrival style, the system can plan the final approach with fewer last-second adjustments,
which should feel more natural when you arrive at your
destination. Now the hardware scope is very
clear in the early hours of the rollout.
(05:05):
Reports that the update landed first on Model Y vehicles in
California with the company's AI4 hardware set the expectation
that this is not yet across every single car.
The notes also indicate that this wave is limited in terms of
included vehicles, which points to a staged approach as Tesla
validates the new model on a narrower slice of the fleet
(05:27):
before pushing it more widely. That matches how complex
perception changes typically ship, and the version string in
the notes reads 2025.38.9.5 with FSD Supervised v14.2 installed.
Now alongside the big items, the list includes smaller
(05:47):
quality-of-life changes, like an alert for residue buildup on the
interior windshield that could degrade front camera visibility.
That line is a reminder that sensor health is not just about software; real-world grime can still blunt
the vision system. Snow, dirt, mud, a leaf falling
in front of your camera, anything.
And the car now warns if that happens, so drivers can clean it
(06:11):
and restore fidelity on their own and not rely on the system.
Tesla also lays out a few upcoming improvements attached
to this release train, including continued work on overall
smoothness, on parking spot selection, and on parking
quality. The company positions these as
the next steps rather than finished work inside 14.2,
(06:32):
which signals that parking remains an active area of tuning.
Now for drivers, that means the car should get better at choosing a stall or a curbside space and completing
the maneuver without hunting. Now, context for the widespread
framing comes from a comment in October responding to a well-known tester who praised a prior 14.1.2 build for reducing
(06:52):
indecisive lane changes and braking.
Now, Elon Musk said that 14.2 is for broad use.
That comment sets a bar for today's release, because if 14.2
begins on a narrow hardware subset and then steps into a
much larger pool, the performance people experienced on 14.1.2 should carry forward, just with the higher-resolution
(07:13):
vision and added behaviors.
Now the release notes repeat some items several times in ways that show emphasis, including improvements
to handling static and dynamic gates and offsetting for road
debris such as tires, branches, and boxes.
And in daily driving, that translates to fewer awkward
slowdowns when an object sits partially in the lane, and better
(07:36):
lateral positioning to give space.
Those are subtle changes that add up over a commute, cutting
down on small corrections that used to break the flow of the
drive. Now decision making reliability
gets a nod with language about better management of system
faults and smoother recovery from degraded operation.
(07:56):
So in practical terms, that means if a sub component has a
momentary hiccup, the system should not overreact or exit
automation abruptly without need.
Smooth recovery reduces driver workload and builds trust
because consistent behavior is easier to supervise than
behavior that changes drastically
under small disturbances. Now there is also the question
(08:19):
of hardware diversity in the existing fleet.
The company's wide FSD fleet still includes a large number of
HW3 vehicles, and it remains to be seen how 14.2 maps onto that mix.
If early validation on AI4
looks strong, Tesla will need to communicate how the vision
(08:41):
encoder and the new behaviors scale down to earlier hardware,
or whether some features arrive in a trimmed form.
That's a normal step as architectures evolve.
Now Tesla is rolling out FSD Supervised v14.2 with a higher-resolution vision encoder, explicit emergency vehicle
protocols, routing that adapts to detours, new parking and
(09:02):
arrival options, and a list of scenario level improvements.
The first wave appears limited to AI4 hardware, with the
company signaling that this is the build meant for wider use
once validation supports it. If the earlier behavior holds
across more cars and hardware versions, the path to a broader
release is straightforward. Keep the comfort gains from 14.1
(09:23):
point x, apply the higher-resolution perception, and expand, expand, expand.
Hey, thank you so much for
listening today.
I really do appreciate your support.
If you could take a second and hit the subscribe or the follow button on whatever podcast platform that you're listening on right now, I greatly appreciate it.
It helps out the show tremendously.
(09:44):
And you'll never miss an episode.
Each episode is about 10 minutes or less to get you caught up quickly.
And please, if you want to support the show even more, go to Patreon.com slash Stage 0.
And please take care of yourselves and each other, and
I'll see you tomorrow.