April 19, 2024

The federal government’s top auto safety agency is dramatically expanding its investigation into Tesla and its Autopilot driver-assistance system to determine whether the technology poses a safety risk.

The National Highway Traffic Safety Administration said on Thursday that it was upgrading its preliminary evaluation of Autopilot to an engineering analysis, a more intensive level of scrutiny that is required before a recall can be ordered.

The analysis will look at whether Autopilot fails to prevent drivers from taking their attention from the road and engaging in other predictable and risky behaviors while using the system.

“We’ve been calling for greater scrutiny of Autopilot for some time now,” said Jonathan Adkins, executive director of the Governors Highway Safety Association, which coordinates state efforts to promote safe driving.

NHTSA said it was aware of 35 crashes that occurred while Autopilot was activated, nine of which resulted in 14 deaths. But the agency said Thursday that it had not yet determined whether Autopilot has a defect that can cause cars to crash while it is engaged.

The broader investigation covers 830,000 vehicles sold in the United States. They include all four Tesla models – the Model S, X, 3 and Y – from the 2014 through 2021 model years. The agency will examine Autopilot and its various component systems that handle steering, braking and other driving tasks, as well as a more advanced system that Tesla calls Full Self-Driving.

Tesla did not respond to a request for comment on the agency’s move.

The preliminary evaluation focused on 11 incidents in which Tesla vehicles operating under Autopilot control collided with parked emergency vehicles that had their lights flashing. NHTSA said Thursday that during that review it learned of 191 crashes, not only those involving emergency vehicles, that warranted closer examination. The agency said those crashes occurred while the vehicles were operating on Autopilot, Full Self-Driving or associated features.

Tesla says Full Self-Driving software can guide a car on city streets but does not make the car fully autonomous, and the driver must remain attentive. It is also available only to a limited set of customers in what Tesla calls a “beta,” or test, version that is not fully developed.

The depth of the investigation suggests that NHTSA is taking more seriously the concern that Autopilot lacks safeguards to prevent drivers from using it in dangerous ways.

“This is not a typical defect case,” said Michael Brooks, acting executive director of the Center for Auto Safety, a nonprofit consumer advocacy group. “They’re actively looking for problems that can be solved, they’re looking at driver behavior, and the problem might not be a component in the vehicle.”

Tesla and its chief executive, Elon Musk, have been criticized for hyping Autopilot and Full Self-Driving in ways that suggest the technology can drive cars without intervention from the driver.

“At least they should be renamed,” said Mr Adkins of the Governors Highway Safety Association. “These names confuse people into thinking they can do more than they actually can.”

Competing systems developed by GM and Ford use infrared cameras to closely track the driver’s eyes and sound a warning chime when the driver looks away from the road for more than two or three seconds. Tesla did not initially include such a driver monitoring system in its cars and later added only a standard camera, which is far less accurate than infrared cameras at tracking a driver’s eyes.

Tesla tells drivers to use Autopilot only on divided highways, but the system can be activated on any road that has lines down the middle. The GM and Ford systems, known as Super Cruise and BlueCruise, can be activated only on highways.

Autopilot was first offered in Tesla models in late 2015. It uses cameras and other sensors to steer, accelerate and brake with little or no input from the driver. Owner’s manuals tell drivers to keep their hands on the steering wheel and their eyes on the road, but earlier versions of the system allowed drivers to keep their hands off the wheel for five minutes or more under certain conditions.

Unlike technologists at nearly every other company working on self-driving vehicles, Mr Musk has insisted that autonomy can be achieved with cameras alone tracking a car’s surroundings. But many Tesla engineers have questioned whether relying on cameras without other sensing devices is safe enough.

Mr Musk has often touted Autopilot’s capabilities, calling self-driving a “solved problem” and predicting that drivers will soon be able to sleep while their cars drive to work.

Questions about the system arose in 2016 when an Ohio man was killed after his car crashed into a tractor-trailer on a Florida highway while Autopilot was activated. NHTSA investigated that crash and said in 2017 that it had found no safety defect in Autopilot.

But the agency said in 2016 that driver-assistance systems that fail to keep drivers engaged “may also pose an unreasonable risk to safety.” In a separate investigation, the National Transportation Safety Board concluded that Autopilot “played a significant role” in the Florida crash because, while it worked as intended, it lacked safeguards against misuse.

Tesla faces lawsuits from the families of victims of fatal crashes, and some customers have sued the company over its claims about Autopilot and Full Self-Driving.

Last year, Mr Musk acknowledged that developing self-driving cars was more difficult than he had thought.

NHTSA opened its preliminary evaluation of Autopilot in August, focusing on 11 crashes in which Teslas operating with Autopilot engaged struck police cars, fire trucks and other emergency vehicles that had stopped with their lights flashing. Those crashes resulted in one death and 17 injuries.

While examining those crashes, the agency found six more involving emergency vehicles and excluded one of the original 11 from further study.

At the same time, the agency learned of dozens of crashes that occurred while Autopilot was in use but did not involve emergency vehicles. Of those, it first focused on 191 and excluded 85 from further review because it could not obtain enough information to determine clearly whether Autopilot was a main cause.

In about half of the remaining 106 crashes, NHTSA found evidence that drivers were not paying full attention to the road. About a quarter of the 106 occurred on roads where Autopilot is not supposed to be used.

In an engineering analysis, NHTSA’s Office of Defects Investigation sometimes acquires the vehicles it is examining and schedules tests to try to identify defects and replicate the problems they can cause. In the past, it has disassembled components to find faults and asked manufacturers for detailed data, often including proprietary information, on how the components function.

This process can take months or even a year or more. NHTSA aims to complete the analysis within a year. If it concludes that there is a safety defect, it can urge the manufacturer to initiate a recall and correct the problem.

On rare occasions, automakers have challenged the agency’s conclusions in court and prevailed in halting recalls.


