This alarming video of bad Tesla Autopilot driving can actually help make using Autopilot safer

Screenshot: YouTube

Tesla, as usual, is being very generous in giving us plenty to talk about, especially when it comes to its Level 2 driver-assist systems, confusingly known as Autopilot and/or Full Self-Driving (FSD). Yesterday a Tesla on Autopilot crashed into a police car, and now a video of a largely Autopilot-assisted drive through Oakland is making the rounds, drawing a lot of attention for the often confusing and/or just plain wrong decisions the car makes. Strangely, though, it is the system’s poor performance that may help people use it safely.

All of this comes in the wake of a letter from the National Transportation Safety Board (NTSB) to the U.S. Department of Transportation (USDOT) regarding the National Highway Traffic Safety Administration’s (NHTSA) advance notice of proposed rulemaking (ANPRM), in which the NTSB effectively asks what the hell (WTF) we should be doing about autonomous vehicle (AV) testing on public roads.

From that letter:

Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV control system’s limitations. For example, Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability. By releasing the system, Tesla is testing highly automated AV technology on public roads, but with limited oversight or reporting requirements.

Although Tesla includes a disclaimer that “currently enabled features require active driver supervision and do not make the vehicle autonomous,” NHTSA’s hands-off approach to overseeing AV testing poses a potential risk to drivers and other road users.

At this point, the NTSB’s letter is not proposing any solutions, just laying out something we have been seeing for years: Tesla and other companies are beta-testing self-driving car software on public roads, surrounded by other drivers and pedestrians who never consented to be part of any test, and in this kind of beta-software testing, crashes have the potential to be very literal.

All of this provides good context for that video of Autopilot driving through Oakland, the highlights of which can be seen in this tweet:

… and the full 13-and-a-half-minute video can be watched here:

There’s a lot going on in this video, and it’s worth watching if you’re at all interested in Tesla’s Autopilot/FSD system. The video shows what I believe is the most recent version of the FSD beta, version 8.2, of which many other driving videos are available online.

There is no doubt that the system is technologically impressive; getting a car to drive itself at all is a colossal achievement, and Tesla’s engineers should be proud.

At the same time, however, it is nowhere near as good as a human driver, at least in many contexts. And as a Level 2 semi-automated system, it requires the driver to stay alert and be ready to take over at any moment, a task humans are notoriously bad at, and the reason I think every L2 system is inherently flawed.

Although many FSD videos show the system in use on highways, where the overall driving environment is far more predictable and easier to navigate, this video is interesting precisely because city driving is so much harder.

It’s also interesting because the guy in the passenger seat is such a constant, unflappable apologist that if the Tesla attacked and killed a litter of kittens, he would praise it for its excellent ability to track small targets.

During the drive through Oakland, there are plenty of places where the Tesla performs very well. There are also places where it makes truly terrible decisions, like drifting into the oncoming lane, turning the wrong way onto a one-way street, weaving like a drunken robot, clipping curbs, or just stopping, for no clear reason, right in the middle of the road.

In fact, the video is divided into chapters based on these interesting events:

0:00 Introduction

0:42 Double Parked Cars (1)

1:15 Pedestrian in Crosswalk

1:47 Crossing Solid Lines

2:05 Disengagement

2:15 Chinatown

3:13 Avoiding a Driver

3:48 Unprotected Left (1)

4:23 Right Turn in the Wrong Lane

5:02 Sideshow Incident

5:37 Acting Drunk

6:08 Unprotected Left (2)

6:46 Disengagement

7:09 No Turn on Red

7:26 “Take Over Immediately”

8:09 Wrong Lane; Behind Parked Cars

8:41 Double Parked Truck

9:08 Bus Only Lane

9:39 Close Call (Curb)

10:04 Left Turn; Lane Blocked

10:39 Wrong Way!!!

10:49 Double Parked Cars (2)

11:13 Stop Sign Delay

11:36 Hesitant Left

11:59 Near Collision (1)

12:20 Near Collision (2)

12:42 Close Call (Wall/Fence)

12:59 Verbal Driving / Beta Review

This sounds like the track list for a very strange concept album.

Nothing in this video, as objectively impressive as it is, suggests that this machine drives better than a human. If a human driver did the things seen here, you would be loudly and continually asking what the hell was wrong with them.

Some situations are clearly things the software has not been programmed to understand, such as double-parked cars with their hazard lights on, obstacles that must be carefully driven around. Other situations are the result of the system misinterpreting camera data, or overcompensating, or simply struggling to process its environment.

Some of the defenses of the video help bring up the bigger issues at play:

The argument that there are many, many more human-caused crashes on any given day is deeply misleading. Sure, there are many more, but there are also vastly more humans driving cars, and even if the proportions were the same, no automaker is trying to sell you shitty human drivers.

Furthermore, the reminders that FSD is a beta only serve to bring back that acronym-stuffed NTSB letter: should we be letting companies beta-test self-driving car software on public roads, unsupervised?

Tesla’s FSD is still not safer than a normal human driver, which is why videos like this one, showing so many worrying FSD driving events, are so important and may actually save lives. These videos erode confidence in FSD a little, which is exactly what needs to happen if this beta software is to be tested safely.

Blind faith in any L2 system is how you end up wrecked and maybe dead. L2 systems give little to no warning when they need a human to take over, and a distrustful person behind the wheel is far more likely to be ready to do so.

I’m also not the only one suggesting this:

The paradox here is that the better a Level 2 system gets, the more likely the people behind the wheel are to trust it, which means they pay less attention, which leaves them less able to take over when the system actually needs them to.

That’s why most Autopilot crashes happen on highways, where a combination of generally good Autopilot performance and high speeds leads to low driver attention and short reaction times, which can add up to disaster.

All Level 2 systems, not just Autopilot, suffer from this, and that is why they are all garbage.

While this video clearly shows that FSD’s basic driving skills still need work, that should not be Tesla’s focus. Instead, Tesla should be figuring out safe, manageable failover procedures that do not demand immediate driver attention.

Until then, the best way to use Autopilot, FSD, Super Cruise, or any other Level 2 system safely is to watch all these videos of the systems screwing up, lose a little faith in them, and stay a bit tense and alert whenever the machine is driving.

I know that’s not how anyone wants autonomous vehicles to be, but the truth is they are simply not done yet. It is time to accept that and treat them accordingly if we want to make real progress.

Getting defensive and making excuses for a machine’s bad driving helps no one.

So, if you love your Tesla and Autopilot and FSD, watch this video carefully. Enjoy the good parts, but really accept the bad parts. Don’t make excuses. Watch, learn, and keep that shit in the back of your mind when you sit behind a wheel you’re not really steering.

It is not fun, but this stage of any technology like this always takes work, and work is not always fun.
