Tesla’s “Full Self-Driving” beta is ridiculously bad and potentially dangerous

A beta version of Tesla’s Autopilot update, “Full Self-Driving,” has begun rolling out to some users. And if you thought “Full Self-Driving” was anywhere close to reality, this video of the system in action should disabuse you of that notion. It is perhaps the best single video for illustrating how morally dubious, technologically limited, and potentially dangerous the Autopilot “Full Self-Driving” beta program is.


In a 13-minute video posted on YouTube by user “AI Addict,” we see a Model 3 running FSD Beta 8.2 fumbling its way around Oakland. It looks hapless and utterly confused the entire time, never passably imitating a human driver. Early in the video, the front-seat passenger praises the car’s decision to drive around a group of double-parked cars rather than wait behind them – but the moment of praise is cut short when the car drives down the center line while trying to get into a left-turn lane.

That’s because – like every semi-autonomous system on sale today – Tesla’s “Full Self-Driving” and “Autopilot” systems are not, in fact, fully autonomous. They require constant human supervision and split-second intervention. And judging by this latest beta release, the software seems to demand more attention than ever.

The video quickly escalates from “embarrassing mistakes” to “extremely risky, potentially harmful driving.” In autonomous mode, the Tesla breaks a variety of traffic laws, starting with a last-minute attempt to cross a solid line and make an illegal lane change. It then tries to make a left turn alongside another car, only to give up in the middle of the intersection and disengage.

It goes on to take another turn far too wide, ending up in the oncoming lane and requiring the driver to intervene. Shortly afterward, it drifts into the oncoming lane again on a straight stretch of road, with motorcyclists and oncoming traffic present. Then it staggers drunkenly through an intersection and once more needs the driver’s intervention to get through. Attempting an unprotected left turn after a stop sign, it slows down mid-turn and dawdles in the path of oncoming cars, which have to brake to avoid hitting it.

The video isn’t even halfway through, but the litany of errors continues with another random disengagement. The Tesla tries to turn right at a red light where that is prohibited, once again nearly breaking the law and requiring the driver to actively stop it. It halts randomly in the middle of the road, drives straight ahead from a turn-only lane, gets stuck behind a parked car, and eventually nearly clips a curb while making a turn. After waiting for traffic in order to get around a parked car, it drives straight into the oncoming lane before realizing its mistake and correcting course. Another traffic violation for the books – and yet another moment when the confused car simply gives up and leaves the human driver to sort out the mess.

Tesla’s software is defeated by cars stopped in the road and by an intersection where it clearly has the right of way. Then comes another near collision. This time, the Tesla arrives at an intersection where it has a stop sign and cross traffic does not. It proceeds anyway with two cars approaching; the first barely clears the Model 3’s front bumper, and the trailing car has to brake to avoid T-boning the Model 3. It is absolutely unbelievable and indefensible that the driver, who is supposed to be monitoring the car to ensure safe operation, did not intervene there. It is even more incredible that this software is available to the public at all.

And that is still not the end of the video. To top it off, the Model 3 nearly hits a Camry that has the right of way while trying to negotiate a curve in the road. After clearing that intersection, it heads straight toward a fence and almost hits it head-on. Both incidents required driver intervention to avoid a collision.

To be sure, no one has solved autonomous driving. It is a hard problem that some experts say will only be cracked with highly advanced artificial intelligence. Tesla’s software clearly does a decent job of identifying cars, stop signs, pedestrians, bicycles, traffic lights, and other basic obstacles. But to call this anything like “full self-driving” is ridiculous. There is nothing wrong with a system having limited capabilities; Tesla stands alone, though, in its unwillingness to acknowledge its own shortcomings.

When a technology is immature, the natural response is to keep working on it until it is ready. Tesla opted against that strategy here, choosing instead to sell software it knows is incomplete, charging a substantial premium for it, and hoping that those who buy it will have an advanced, nuanced understanding of its limitations – and the skill and diligence to jump in and save the day when it inevitably gets confused. In short, every Tesla owner who buys “Full Self-Driving” is serving as an unpaid safety supervisor, conducting research on Tesla’s behalf. Perhaps more damning, the company accepts no responsibility for the system’s actions and leaves it up to the driver to decide when and where to test it.


That leads to videos like this one, in which early-access users run uncontrolled tests on city streets, with pedestrians, cyclists, and other drivers unaware that they are part of the experiment. If even one of these Tesla drivers slips up, the consequences can be deadly.

All of this testing is being carried out on public roads, for the benefit of the world’s most valuable automaker, at essentially zero cost. We contacted Tesla for comment on the video, but the company does not have a press office and does not typically respond to inquiries.

