
This Alarming Video Of Tesla Autopilot's Poor Driving Could Really Help Make Autopilot Safer


Screenshot: YouTube

Tesla, as usual, is generous in giving us plenty to talk about, especially when it comes to its Level 2 driver-assistance system, confusingly known as Autopilot and/or Full Self-Driving (FSD). Yesterday a Tesla on Autopilot crashed into a police car, and now a video of a largely FSD-assisted drive through Oakland is making the rounds, drawing a lot of attention for the often confusing and/or just plain bad decisions the car makes. Interestingly, though, it’s the system’s poor performance that could actually help people use it safely.

All of this comes right on the heels of a letter from the National Transportation Safety Board (NTSB) to the US Department of Transportation (USDOT) regarding the National Highway Traffic Safety Administration’s (NHTSA) “advance notice of proposed rulemaking” (ANPRM), in which the NTSB effectively asks what the heck we should be doing about autonomous vehicle (AV) testing on public roads.

From that letter:

Because the NHTSA has no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the limitations of the AV control system. For example, Tesla recently released a beta version of its Level 2 autopilot system, described as fully self-driving. By launching the system, Tesla is testing highly automated AV technology on public roads but with limited monitoring or reporting requirements.

Although Tesla includes a disclaimer that says “currently enabled features require active driver supervision and do not make the vehicle autonomous,” NHTSA’s hands-off approach to AV testing supervision poses a potential risk for drivers and other road users.

At this point, the NTSB/NHTSA/USDOT letter doesn’t propose any solutions; it just points out something we’ve been seeing for years: Tesla and other companies are beta-testing autonomous car software on public roads, surrounded by other drivers and pedestrians who never consented to being part of any test, and in a beta test of this kind of software, the crashes have the potential to be very literal.

All of this provides good context for the video of the Autopilot drive in Oakland, the highlights of which can be seen in this tweet.

… and the full 13-and-a-half-minute video can be viewed here:

There is a lot in this video worth watching if you’re interested in Tesla’s FSD/Autopilot system. It uses what I believe is the latest version of the FSD beta, version 8.2, of which there are many other driving videos available online.

There is no doubt that the system is technologically impressive; getting a car to do all of this is a colossal achievement, and Tesla’s engineers should be proud.

Yet at the same time, it’s nowhere near as good as a human driver, at least in many contexts, and yes, as a Level 2 semi-autonomous system, it requires a driver to be alert and ready to take control at any moment, a task humans are notoriously bad at, and the reason I think any L2 system is inherently flawed.

While many FSD videos show the system in use on highways, where the overall driving environment is much more predictable and easier to navigate, this video is interesting precisely because city driving is so much more difficult.

It’s also interesting because the guy in the passenger seat is such a constant and unflappable apologist, to the point that if the Tesla plowed through a litter of kittens, he’d likely praise it for its excellent ability to track a small target.

Over the course of the trip through Oakland, there are many places where the Tesla performs well. There are also places where it makes really terrible decisions, like driving in the lane of oncoming traffic, turning the wrong way onto a one-way street, weaving around like a drunk robot, cutting curbs, or just stopping, for no clear reason, right in the middle of the road.

In fact, the video is usefully divided into chapters based on these, um, interesting events:

0:00 Introduction
0:42 Double-parked cars (1)
1:15 Pedestrian in crosswalk
1:47 Crosses solid lines
2:05 Disengagement
2:15 Chinatown
3:13 Avoids driver
3:48 Unprotected left (1)
4:23 Right turn into the wrong lane
5:02 Near head-on incident
5:37 Acting drunk
6:08 Unprotected left (2)
6:46 Disengagement
7:09 No turn on red
7:26 “Take control immediately”
8:09 Wrong lane; behind parked cars
8:41 Double-parked truck
9:08 Bus-only lane
9:39 Close call (sidewalk)
10:04 Left turn; lane blocked
10:39 Wrong way!!!
10:49 Double-parked cars (2)
11:13 Stop sign delay
11:36 Hesitant left
11:59 Near collision (1)
12:20 Near collision (2)
12:42 Close call (wall/fence)
12:59 Verbal drive/beta review

That reads like the song list from a very weird concept album.

Nothing in this video, objectively impressive as it is, says that this machine drives better than a human. If a human did the things you see here, you would be asking out loud what the hell is wrong with them, over and over again.

Some situations are clearly things the software simply hasn’t been programmed to understand, such as the fact that parked cars with their hazard lights on are obstacles that need to be carefully driven around. Other situations are the result of the system misinterpreting camera data, or overcompensating, or simply having difficulty processing its environment.

Some of the defenses offered in the video actually help raise the most important issues involved:

The argument that there are many, many more human-caused crashes on any given day is highly misleading. Sure, there are many more, but there are also many, many more humans driving cars, and even if the numbers were comparable, no automaker is trying to sell crappy human drivers as a product.

Also, the reminders that FSD is a beta only serve to remind us of that acronym-filled NTSB letter: should we be letting companies beta-test self-driving car software on public roads, without oversight?

Tesla’s FSD is still not safer than a normal human driver, which is why videos like this, showing the many troubling FSD driving events, are so important and could save lives. These videos erode some trust in FSD, which is exactly what needs to happen if this beta software is going to be tested safely.

Blind faith in any L2 system is how you end up crashed and maybe dead. Because L2 systems give little to no warning when they need humans to take over, a person behind the wheel who doesn’t fully trust the system is much more likely to be ready to take control.

I’m also not the only one suggesting this:

The paradox here is that the better a Level 2 system gets, the more the people behind the wheel are likely to trust it, which means they’ll pay less attention, which in turn makes them less able to take control when the system really needs them to.

That is why most Autopilot crashes happen on highways, where a combination of generally good Autopilot performance and high speeds leads to poor driver attention and less reaction time, which can lead to disaster.

All Level 2 systems, not just Autopilot, suffer from this, and that’s why they’re all rubbish.

While this video clearly shows that FSD’s basic driving skills still need improvement, Tesla’s focus shouldn’t be on that, but on figuring out safe and manageable failover procedures that don’t require the driver’s immediate attention.

Until then, the best approach for the safe use of Autopilot, FSD, Super Cruise, or any other Level 2 system is to watch all these videos of the systems screwing up, lose some confidence in them, and stay a bit tense and alert when the machine is doing the driving.

I know that’s not what anyone wants from autonomous vehicles, but the truth is there’s still a lot of work to do. It’s time to accept that and treat these systems as the works in progress they are if we ever want to make real progress.

Getting defensive and trying to sugarcoat shitty machine driving doesn’t help anyone.

So if you love your Tesla and you love Autopilot and FSD, watch that video carefully. Appreciate the good parts, but really take in the bad parts. Don’t try to make excuses. Watch, learn, and keep these screwups in the back of your mind when you sit behind a wheel you’re not steering.

It’s not fun, but this stage of any technology like this always requires work, and work isn’t always fun.
