The recording raises several questions.
In the first week of this year, we reported on the case of a Tesla – presumably using its self-driving mode (Full Self-Driving, or FSD) – that caused a pile-up involving eight cars. The car reportedly slowed from the 55 miles per hour (88 km/h) speed limit to 20 miles per hour (32 km/h) on California’s I-80 after multiple lane changes.
This is a life-threatening maneuver on a highway, so it is no wonder that several vehicles ran into each other: one person needed hospital treatment, and eight people suffered minor injuries. According to the Tesla driver, he was using the car’s FSD mode, which has a known problem with so-called phantom braking. The California police could not confirm this; according to them, only Tesla can say whether that was the case.
This week, however, a video of the accident in question surfaced, The Intercept reports. According to the footage, the Tesla first flashed its left indicator, then began to brake while pulling into the outer lane, and finally stopped in the middle of the lane. Because of the unexpected and unjustified braking, the drivers behind it could no longer react in time, and the pile-up involving eight cars followed.
I obtained surveillance footage of the self-driving Tesla that abruptly stopped on the Bay Bridge, resulting in an eight-vehicle crash that injured 9 people including a 2-year-old child just hours after Musk announced the self-driving feature.
— Ken Klippenstein (@kenklippenstein) January 10, 2023
The case is being investigated by the US highway safety regulator, the National Highway Traffic Safety Administration (NHTSA), whose data shows that between July 2021 and June 2022, 273 accidents were caused by cars using Tesla’s Autopilot and FSD modes. More telling still: of the 329 accidents caused in the United States during that year by cars using driver-assistance systems, 70 percent can be attributed to Tesla.
The company has received a lot of criticism for its misleadingly named driver-assistance systems: Autopilot, despite its name, cannot drive the car by itself; it corresponds only to Level 2 automation, like any modern car with lane-keeping assist and adaptive cruise control. Tesla has faced multiple investigations over consumer deception, and a federal investigation has also been launched against it.
Waymo, the Google subsidiary that actually develops self-driving taxis, has also protested against the misleading terminology:
Unfortunately, we see some car manufacturers use the term self-driving inaccurately, giving consumers the wrong impression of driver-assistance – and not fully self-driving – technologies. This misconception can result in some people, through no fault of their own, taking risks (for example, taking their hands off the steering wheel) that can endanger not only their own health, but also that of others
– reads Waymo’s blog post.
Incidentally, the accident happened in the same week that Elon Musk announced on Twitter that FSD was now available to any Tesla owner in North America, and then congratulated Tesla’s self-driving team on their great success.
Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen, assuming you have bought this option.
Congratulations to Tesla Autopilot/AI team on achieving a major milestone!
— Elon Musk (@elonmusk) November 24, 2022
US Transportation Secretary Pete Buttigieg also protested against the misleadingly named driver support technologies.
I’ll repeat it until I’m blue in the face: today, only driver support technologies are available on the market, which do not replace the driver. I don’t care what the companies call it, we have to make sure that everyone has a crystal clear understanding of what it’s about – even if the companies don’t want it
In the United States, there is currently no federal law prohibiting self-driving vehicles from being tested on public roads. Although certain restrictions have been introduced at the level of individual states, Tesla continues to happily use American roads as a test laboratory.
Meanwhile, NHTSA is also investigating Elon Musk’s statement that some FSD users may be able to turn off the warnings that remind drivers to keep their hands on the steering wheel at all times. Tesla recently announced that repeat violators may have their FSD access suspended for two weeks.