
Uber Self-Driving Car Unit’s Safety Culture Slammed by NTSB

Nov. 19, 2019
In a statement, Chairman Robert Sumwalt said the inappropriate actions of the automated driving system and the vehicle's operator were symptoms of an "ineffective safety culture."

Uber Technologies Inc.’s self-driving vehicle unit lacked an effective safety culture when one of its test vehicles struck and killed a pedestrian in Tempe, Arizona, last year, National Transportation Safety Board Chairman Robert Sumwalt said Tuesday.

“The inappropriate actions of both the automatic driving system as implemented and the vehicle’s human operator were symptoms of a deeper problem, the ineffective safety culture that existed at the time,” Sumwalt said as he opened a board meeting to determine the probable cause of the collision.

The probe is the NTSB’s first to examine a fatal crash involving a self-driving test vehicle. The case is being closely watched in the emerging autonomous vehicle industry, a sector that has attracted billions of dollars in investment from companies such as General Motors Co. and Alphabet Inc. in an attempt to transform transportation.

Elaine Herzberg, 49, was hit and killed by an Uber self-driving SUV as she walked her bicycle across a road at night. Uber halted self-driving car tests after the crash. Investigative information released since the March 2018 collision highlighted a series of lapses -- both technological and human -- that the board may cite as having contributed to the crash. Uber resumed self-driving testing late last year in Pittsburgh.

The Uber vehicle’s radar sensors first observed Herzberg about 5.6 seconds prior to impact, before she entered the vehicle’s lane of travel, and initially classified her as a vehicle. The self-driving computers changed their classification of her as different types of objects several times and failed to predict that her path would cross the lane of the self-driving test SUV, according to the NTSB.

The modified Volvo SUV being tested by Uber wasn’t programmed to recognize and respond to pedestrians walking outside of marked crosswalks, nor did the system allow the vehicle to automatically brake ahead of an imminent collision. The responsibility to avoid accidents fell to the single safety driver monitoring the vehicle’s automation system, while other companies place a second human in the vehicle for added safety.

The safety driver was streaming a television show on her mobile phone in the moments before the crash, despite company policy prohibiting drivers from using mobile devices, according to police. The NTSB has also said that Uber’s Advanced Technologies Group that was testing self-driving cars on public streets in Tempe didn’t have a standalone safety division, a formal safety plan, standard operating procedures or a manager focused on preventing accidents.

Uber made extensive changes to its self-driving system after several reviews of its operation and findings by NTSB investigators. The company told the NTSB that the new software would have correctly identified Herzberg and triggered controlled braking to avoid her more than 4 seconds before the original impact, the agency has said.

Licensed content from Bloomberg, copyright 2016.
