
A video of the Tesla causing the mass accident has surfaced



The recording raises several questions.

In the first week of this year, we reported on a case in which a Tesla – presumably operating in its self-driving mode (Full Self-Driving, or FSD) – caused a mass crash involving eight cars. The car reportedly slowed from the speed limit of 55 miles per hour (88 km/h) to 20 miles per hour (32 km/h) on California’s I-80 after multiple lane changes.

This is a life-threatening maneuver on a highway, so it is no wonder that several vehicles ran into each other: one person needed hospital treatment, and eight people suffered minor injuries. According to the Tesla driver, he was using the car’s FSD mode, which has a known issue with so-called phantom braking. The California police could not confirm this; according to them, only Tesla can determine whether that was the case.

There are many question marks surrounding Tesla’s self-driving driver support systems (Photo: Tesla)

This week, however, a video showing the accident in question surfaced, The Intercept reports. According to the footage, the Tesla first flashed its left indicator, then began to brake while pulling into the outer lane, and finally stopped in the middle of the lane. Because of the unexpected and unjustified braking, the drivers behind it could not react in time, resulting in the eight-car pileup.

The case is being investigated by the American highway safety regulator, the National Highway Traffic Safety Administration (NHTSA). According to its data, 273 accidents were caused by cars using Tesla’s Autopilot and FSD modes between July 2021 and June 2022. More telling still: of the 329 accidents in the United States during that year caused by cars using driver support systems, 70 percent can be attributed to Tesla.

The company has received a lot of criticism for its misleadingly named driver assistance systems: Autopilot, despite its name, cannot drive the car by itself; it corresponds only to Level 2 automation, just like any modern car with lane-keeping assist and adaptive cruise control. Several investigations into consumer deception are under way, and a federal investigation has also been launched against Tesla.

Waymo, the Google subsidiary that actually develops self-driving taxis, has also protested against the misleading terminology:

Unfortunately, we see some car manufacturers use the term self-driving inaccurately, giving consumers the wrong impression of driver-assistance – and not fully self-driving – technologies. And this misconception can result in some people, through no fault of their own, taking risks (for example taking their hands off the steering wheel), which can endanger not only their own health, but also that of others

– says Waymo’s blog post.

Incidentally, the accident happened the same week when Elon Musk announced on Twitter that FSD was now available to any Tesla owner in North America, and then congratulated Tesla’s self-driving team on their great success.

US Transportation Secretary Pete Buttigieg also protested against the misleadingly named driver support technologies.

I’ll repeat it until I’m blue in the face: today, only driver support technologies are available on the market, which do not replace the driver. I don’t care what the companies call it, we have to make sure that everyone has a crystal clear understanding of what it’s about – even if the companies don’t want it

Buttigieg said.

In the United States, there is currently no federal law prohibiting the testing of self-driving vehicles on public roads – although some states have introduced restrictions of their own, Tesla continues to happily use American roads as a test laboratory.

Meanwhile, NHTSA is also investigating Elon Musk’s statement that some FSD users may be able to turn off the warnings that remind drivers to keep their hands on the steering wheel at all times. Tesla recently announced that repeat offenders may have their FSD access suspended for two weeks.

