WASHINGTON (Reuters) – The chairman of the U.S. National Transportation Safety Board (NTSB) said on Tuesday “operational limitations” in the Tesla Model S played a “major role” in a May 2016 crash that killed a driver using the vehicle’s semi-autonomous “Autopilot” system.
The system's limitations include an inability to ensure driver attention at high speeds, to restrict Autopilot use to appropriate roads and to monitor driver engagement, the NTSB said.
The NTSB recommended auto safety regulators and automakers take steps to ensure that semi-autonomous systems are not misused.
“System safeguards were lacking,” NTSB Chairman Robert Sumwalt said. “Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention.”
Tesla Inc (TSLA.O) had no immediate comment on the NTSB report.
Joshua Brown, a 40-year-old Ohio man, was killed near Williston, Florida, when his Model S collided with a truck while the “Autopilot” system was engaged.
The NTSB on Tuesday found that the system’s “operational design” was a contributing factor in the 2016 crash because it allowed drivers to go without steering or watching the road for lengthy periods of time that were “inconsistent” with warnings from Tesla.
The NTSB said Tesla could have taken further steps to prevent the system’s misuse, and faulted the driver for not paying attention and “overreliance on vehicle automation.”
The agency said the Autopilot system operated as designed but did not do enough to ensure drivers paid adequate attention. On some roads, drivers could use Autopilot at up to 90 miles (145 km) per hour, it said.
Sumwalt noted that Tesla did not ensure the system was used only on highways and limited-access roads, as recommended in the owner’s manual.
The NTSB recommended that automakers monitor driver attention by means other than detecting steering-wheel engagement.
The system could not reliably detect cross traffic and “did little to constrain the use of autopilot to roadways for which it was designed,” the board said.
‘TEN SECONDS TO REACT’
Monitoring driver attention by measuring the driver’s touching of the steering wheel “was a poor surrogate for monitored driving engagement,” said the board.
Tesla said in June 2016 that Autopilot “is not perfect and still requires the driver to remain alert.”
At a public hearing on Tuesday on the crash involving Brown, NTSB said the truck driver and the Tesla driver “had at least 10 seconds to observe and respond to each other.”
On Monday, Brown’s family said the car was not to blame for the crash.
“We heard numerous times that the car killed our son. That is simply not the case,” the family’s statement said. “There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.”
“People die every day in car accidents,” the statement said. “Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”
A spokeswoman for Tesla and a lawyer for the family, Jack Landskroner, have declined to say if the automaker had reached a legal settlement with the Brown family.
NTSB recommended that NHTSA require automakers to have safeguards to prevent the misuse of semi-autonomous vehicle features.
The National Highway Traffic Safety Administration (NHTSA) said it would review the findings of the safety board.
In January, NHTSA said it found no evidence of defects in the crash. NHTSA and NTSB said Brown did not apply the brakes and his last action was to set the cruise control at 74 miles per hour (119 kph), less than 2 minutes before the crash – above the 65-mph speed limit.
Editing by Bernadette Baum and Matthew Lewis