The National Transportation Safety Board (NTSB) has issued a preliminary report on the recent fatal crash of a Tesla vehicle operating in its semi-autonomous “Autopilot” mode in Florida. The crash has cast a shadow over the efforts of Tesla, Google, and other companies that have been testing autonomous vehicles for several years.
A formal investigation by the National Highway Traffic Safety Administration (NHTSA) is still ongoing, and the Securities and Exchange Commission (SEC) is looking into whether Tesla should have informed investors of the crash in May.
According to the NTSB report, the driver of the Tesla Model S was operating the car using Tesla’s Traffic-Aware Cruise Control and Autosteer lane-keeping assistance systems, and was traveling approximately 10 mph over the posted speed limit.
In July, a 2016 Tesla Model X crashed, injuring the driver and a passenger; the incident was initially reported to have involved Autopilot, but Tesla later announced it had no evidence to determine whether the feature was activated at the time of the collision. A second confirmed Autopilot crash occurred in August, when a driver in Beijing sideswiped another vehicle that was parked on the side of the road. There were no injuries.
The Tesla Autopilot feature is not a true self-driving system; it is designed to keep the car within its lane and prevent it from striking other vehicles under very specific circumstances. The company has been beta testing the feature with its customers since 2015. In the aftermath of the crashes, Tesla has drawn criticism for its marketing of the feature, particularly for naming it “Autopilot” in the first place.
In a blog post published after NHTSA announced its investigation into the crash, Tesla noted that neither the Autopilot system nor the driver saw the white side of the trailer because it was backlit against a bright sky.
According to Tesla: “The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”
On May 7, 2016, the Tesla was traveling on US-27A near Williston, Fla., when it struck and passed beneath a 53-foot semitrailer before colliding with a telephone pole. The driver, 40-year-old Joshua Brown of Canton, Ohio, died in the crash.
Earlier this year, a Google self-driving car was involved in a minor fender bender in Mountain View, Calif., when it drove into the side of a bus. The accidents have raised questions about the technology’s safety.
“It’s a wakeup call to a lot of people,” says Joe Register, director of emerging technologies at the Auto Care Association. “Having the ability to do something, and the practical application of it are two different things.”
The crashes have not deterred Tesla. Speaking at a conference early in the summer, CEO Elon Musk said he thought the industry was less than two years from having fully autonomous vehicles on the road, and appeared to hint that full self-driving capabilities could be available on Tesla’s new Model 3 sedan.
During an August 3 earnings call, Musk was even more bullish: “Autonomy is going to come a hell of a lot faster than anyone thinks it will, and I think what we’ve got under development is going to blow people’s minds. It blows my mind,” he said.
Consumer Reports was less impressed. “By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” says Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. “In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we’re deeply concerned that consumers are being sold a pile of promises about unproven technology. ‘Autopilot’ can’t actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel.”
Other automakers are also moving forward with autonomous driving initiatives. BMW has teamed with Intel and Mobileye on its iNEXT, which is scheduled for production in 2021. Over the summer, Mobileye announced it would end its partnership with Tesla on the Autopilot system.
There are currently a number of autonomous car projects around the country, including a 32-acre proving ground called Mcity, where the University of Michigan and the Michigan Department of Transportation will test connected and autonomous vehicles in different driving scenarios. “There is a great potential, but we haven’t really got this technology nailed down yet,” Register says. “To have vehicles on the road with autonomous driving capability – people are just now beginning to understand what the implications are.”
The NTSB will continue to review the accident data and collect additional information from the vehicle’s electronic systems. The agency’s final report is expected within the next year. Until then, the NTSB’s preliminary report is available on the agency’s website.