The inside of a Tesla vehicle is viewed as it sits parked in a new Tesla showroom and service center in Red Hook, Brooklyn, on July 5, 2016, in New York City. (Photo: Spencer Platt/Getty Images)

NTSB: Fatal Crash Involving Tesla Autopilot Resulted from Driver Errors, Overreliance on Automation

Sept. 14, 2017
The National Transportation Safety Board announced Sept. 12 that a truck driver’s failure to yield the right of way and a car driver’s inattention due to overreliance on vehicle automation are the probable cause of a fatal crash.

On May 7, 2016, Joshua Brown was driving his Tesla, equipped with the company's Autopilot driver-assistance system, when it struck and traveled under a tractor-trailer crossing his lane of traffic near Williston, Fla. Brown had set the speed on the Tesla Autopilot at 74 mph, and neither the automated system nor Brown attempted to brake before the crash.

According to reports, Brown had not attempted to control the car for at least two minutes before the crash and had his hands on the wheel of the car, a Tesla Model S, for only 25 seconds of the 37 minutes the car was on Autopilot.

In its report, the National Transportation Safety Board (NTSB) determined the operational design of the Tesla’s vehicle automation permitted Brown’s overreliance on the automation, noting its design “allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.”

As a result of its investigation, the NTSB issued seven new safety recommendations and reiterated two previously issued safety recommendations.

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments.”

These systems, he added, require the driver "to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards that should have prevented the Tesla's driver from using the car's automation system on certain roadways were lacking, and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened."

Findings in the NTSB’s Report

The NTSB determined the Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert and the automatic emergency braking did not activate. 

Brown’s pattern of use of the Autopilot system indicated an overreliance on the automation and a lack of understanding of the system's limitations. “If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains,” the report noted.

The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement, the agency added. Brown’s Tesla warned him seven times to place his hands on the wheel before the fatal crash, warnings that Brown allegedly ignored.

Tesla made design changes to its Autopilot system following the crash. The change reduced the period of time before the “Autopilot” system issues a warning when the driver’s hands are off the steering wheel, and added a preferred road constraint to the alert timing sequence.

Fatigue, Mechanical Failure Ruled Out

Fatigue, highway design and mechanical system failures were not factors in the crash, according to the NTSB report. There was no evidence indicating the truck driver was distracted by cell phone use. While evidence revealed Brown was not attentive to the driving task, investigators could not determine from available evidence the reason for his inattention.

Although the results of post-crash drug testing established that the truck driver had used marijuana before the crash, his level of impairment, if any, at the time of the crash could not be determined from the available evidence.

The NTSB issued a total of seven safety recommendations based upon its findings, with one recommendation issued to the U.S. Department of Transportation (DOT), three to the National Highway Traffic Safety Administration, two to the manufacturers of vehicles equipped with Level 2 vehicle automation systems and one each to the Alliance of Automobile Manufacturers and Global Automakers. The safety recommendations address the need for:

  • Event data to be captured and available in standard formats on new vehicles equipped with automated vehicle control systems;
  • Manufacturers to incorporate system safeguards that limit the use of automated control systems to the conditions for which they were designed, and a method to verify those safeguards;
  • Development of applications that more effectively sense a driver’s level of engagement and alert the driver when engagement is lacking;
  • Manufacturers to report incidents, crashes and exposure numbers involving vehicles equipped with automated vehicle control systems.

The board reiterated two safety recommendations issued to the National Highway Traffic Safety Administration (NHTSA) in 2013, dealing with minimum performance standards for connected vehicle technology for all highway vehicles and the need to require installation of the technology, once developed, on all newly manufactured highway vehicles. In January, NHTSA released a report indicating the crash was not the result of a defect in the Tesla Autopilot.

Automated Driving Systems (ADS): A Vision for Safety 2.0

The U.S. DOT and NHTSA on Sept. 12 released Automated Driving Systems (ADS): A Vision for Safety 2.0, new federal guidance on automated driving systems for industry and the states.

“The new guidance supports further development of this important new technology, which has the potential to change the way we travel and how we deliver goods and services,” said U.S. Transportation Secretary Elaine L. Chao. “The safe deployment of automated vehicle technologies means we can look forward to a future with fewer traffic fatalities and increased mobility for all Americans.”

A Vision for Safety 2.0 is voluntary guidance that “encourages best practices and prioritizes safety,” according to DOT. It calls on industry, state and local governments, safety and mobility advocates and the public to lay the path for the deployment of automated vehicles and technologies.

“In addition to safety, ADS technology offers important social benefits by improving access to transportation, independence and quality of life for those who cannot drive because of illness, advanced age or disability,” continued Chao. 
