Cheraw Chronicle


United States: Tesla does not take adequate measures to prevent abuse of Autopilot

The design of Tesla's Autopilot software led to “predictable misuse and avoidable accidents.” That is the conclusion of an American investigation into the system. It is not yet clear whether an Autopilot update released last December fixed the issues.

The US traffic regulator, the National Highway Traffic Safety Administration (NHTSA), has released the results of its investigation into the operation of the Autopilot function in Tesla cars. To this end, 956 incidents spanning three years were analyzed. The analysis showed that the Autopilot function played a role in 467 of those accidents; thirteen of them were fatal.

According to NHTSA, these accidents were caused by drivers misusing the system. However, the regulator blames Tesla for not implementing adequate safety measures to prevent such misuse. It says that “under certain circumstances” the Autopilot system does not sufficiently ensure that drivers are paying attention and using the function correctly.

NHTSA said drivers expect the Autopilot system to require far less supervision than is actually the case, creating a “serious safety gap.” According to the regulator, Tesla must make its Autopilot warnings more effective and ensure that users better understand what the system can and cannot be used for.

The regulator is also concerned about the name “Autopilot.” The name suggests that this mode allows the car to drive autonomously, whereas a name containing a word such as “assist” or “team” would describe it better, the agency argues. In addition, attempts to make manual adjustments during automated driving may completely deactivate the system. As a result, drivers may be discouraged from staying engaged in the driving task, NHTSA wrote.


Last December, Tesla released a software update for part of the Autopilot system, the Autosteer function. The company did so after NHTSA informed Tesla that the manufacturer did not adequately check whether drivers were holding the steering wheel. All incidents analyzed in the current study occurred before the release of this update, so it is not clear whether the update addresses the regulator's concerns. NHTSA has launched a new investigation to find out.

However, the agency already knows that the update may not be enough to resolve the issues, as several new reports of incidents involving Autopilot have emerged since its release. In addition, drivers can choose whether to download the update, and it is also possible to roll it back.