A Tesla Model 3 car on the road in San Diego, Calif., on July 21, 2021. (Roger Kisby/The New York Times)
The federal government’s top auto safety agency is significantly expanding an investigation into Tesla and its Autopilot driver assistance system to determine if the technology poses a safety risk.
The National Highway Traffic Safety Administration said Thursday that it was upgrading its preliminary evaluation of Autopilot to an engineering analysis, a more intensive level of scrutiny that is required before a recall can be ordered.
The analysis will look at whether Autopilot fails to prevent drivers from diverting their attention from the road and engaging in other predictable and risky behavior while using the system.
“We’ve been asking for closer scrutiny of Autopilot for some time,” said Jonathan Adkins, executive director of the Governors Highway Safety Association, which coordinates state efforts to promote safe driving.
NHTSA has said it is aware of 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people. But it said Thursday that it had not determined whether Autopilot has defects that can cause cars to crash while it is engaged.
The wider investigation covers 830,000 vehicles sold in the United States. They include all four Tesla cars — the Models S, X, 3 and Y — in model years from 2014 to 2021. The agency will look at Autopilot and its various component systems that handle steering, braking and other driving tasks, and a more advanced system that Tesla calls Full Self-Driving.
Tesla did not respond to a request for comment on the agency’s move.
The preliminary evaluation focused on 11 crashes in which Tesla cars operating under Autopilot control struck parked emergency vehicles that had their lights flashing. In that review, NHTSA said Thursday, the agency became aware of 191 crashes — not limited to ones involving emergency vehicles — that warranted closer investigation. They occurred while the cars were operating under Autopilot, Full Self-Driving or associated features, the agency said.
Tesla and its CEO, Elon Musk, have come under criticism for hyping Autopilot and Full Self-Driving in ways that suggest they are capable of piloting cars without input from drivers.
“At a minimum they should be renamed,” said Adkins of the Governors Highway Safety Association. “Those names confuse people into thinking they can do more than they are actually capable of.”
©2019 New York Times News Service