Human Crash Test Dummies?
Put on the brakes!
The NTSB has dispatched investigators to examine another fatal crash of a Tesla electric vehicle, this one last week in California.
One of the goals of their investigation will be to determine whether its semi-automated driving system was engaged, according to a report from The Wall Street Journal.
“Unclear if automated control system was active at time of crash,” the NTSB said in a social-media posting, in a reference to Tesla’s Autopilot feature. “Issues examined include: post-crash fire, steps to make vehicle safe for removal from scene.”
According to the story, a man died in the accident after his Tesla Model X sport-utility vehicle, traveling south on Highway 101, struck a barrier and was then hit by two other vehicles.
“This investigation is not focused on the automation, rather, it is focused on understanding the post-crash fire and the steps taken to make the vehicle safe for removal/transport from the scene,” an NTSB spokesman said in an email message. “We are working with Tesla to determine if automation was in use at the time of the accident, but the focus of this field investigation is on the other two points.”
Still, the NTSB has found itself increasingly scrutinizing emerging automated-driving technologies, on top of the investigations the agency typically conducts into crashes involving aircraft, trains, buses, and other incidents.
I’ll ask the question: Will this increasingly put the NTSB in a difficult position — asked to investigate accidents involving semi- or fully-autonomous vehicles only after the fact, rather than having the U.S. Congress or another authority enact proactive, prescriptive laws?
The SELF-DRIVE Act, which was passed by the U.S. House of Representatives late last year, aimed to allow automakers and tech giants to eventually test as many as 100,000 experimental autonomous vehicles annually.
But as reported by Recode, under the proposal those companies could obtain exemptions from the federal safety standards that govern all motor vehicles, and they would not have to seek review of their technology before it hits the market.
The measure got hung up in the U.S. Senate, and perhaps for good reason (although the hitch was concern about cybersecurity intrusions, not the autonomous technology itself).
When I went through driving school, they always told me to keep two hands on the wheel and my eyes on the road. I always thought it was a pretty good prescription for safer driving.
And though I fully expect in the long term autonomous vehicles will be a boon for highway safety, in the short term, we humans seem to be sitting in the crash test dummy driver(less) seat.