Can Self-Driving Cars Ever Really Be Safe?
No. Self-driving cars can never really be safe. They will be safer!
By: Shelly Palmer
Apr. 27, 2017 04:00 PM
Analysts estimate that by 2030, self-driving cars and trucks (autonomous vehicles) could account for as much as 60 percent of US auto sales. That’s great! But autonomous vehicles are basically computers on wheels, and computers crash all the time. Besides that, computers get hacked every day. So you gotta ask, “Can self-driving cars ever really be safe?”
The Short Answer
Humans Are Very Dangerous
Remove human error from driving, and you will not only save a significant number of lives, you will also dramatically reduce the number of serious injuries associated with traffic accidents – there were over 4.4 million in the United States during 2015.
Data Begins to Make a Case
When a Tesla driving in Autopilot mode was involved in a fatal crash in May 2016, killing its owner, Joshua Brown, rage against the machine quickly followed, along with some valid questions about whether Tesla had pushed this nascent technology too fast and too far. Everyone expected the accident to be the fault of a software glitch or a technology failure, but it was not.
The NHTSA investigation found that “a safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted.” In other words, the car didn’t cause the crash. But there was more to the story. The NHTSA’s report concluded, “The data show that the Tesla vehicles’ crash rate dropped by almost 40 percent after Autosteer installation.” In reality, while Mr. Brown’s death was both tragic and unprecedented, the investigation highlighted a simple truth: semi-autonomous vehicles crash significantly less often than vehicles piloted by humans.
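The arithmetic behind that “almost 40 percent” is straightforward. As a hedged sketch, assume the before/after airbag-deployment crash rates commonly cited from the NHTSA report (roughly 1.3 and 0.8 crashes per million miles; treat the exact values as an assumption here):

```python
# Sketch of the arithmetic behind NHTSA's "almost 40 percent" claim.
# Assumed figures (as commonly cited from the report): airbag-deployment
# crash rates per million miles, before and after Autosteer installation.
before = 1.3  # crashes per million miles, pre-Autosteer (assumed)
after = 0.8   # crashes per million miles, post-Autosteer (assumed)

drop = (before - after) / before
print(f"Crash rate drop: {drop:.0%}")  # about 38%, i.e. "almost 40 percent"
```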
What Do You Mean by “Safe”?
That said, this is very new technology, and regulators will need to define what they mean by “safe.” Must our autonomous vehicles drive flawlessly, or do they just need to be better at it than we are? The RAND Corp think tank says, “A fleet of 100 cars would have to drive 275 million miles without failure to meet the safety standards of today’s vehicles in terms of deaths. At the time of the fatal May 2016 crash, Tesla car owners had logged 130 million miles in Autopilot mode.”
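RAND’s 275-million-mile figure can be reproduced with a standard statistical argument, the so-called “rule of three.” This is a sketch under assumptions: fatalities follow a Poisson process, the benchmark is roughly 1.09 deaths per 100 million vehicle miles (the U.S. rate RAND worked from), and we demand 95 percent confidence:

```python
import math

# Hedged sketch: one way to arrive at RAND's ~275-million-mile figure.
# Assumptions: Poisson-distributed fatalities, a benchmark rate of about
# 1.09 deaths per 100 million vehicle miles, and a 95% confidence target.
fatality_rate = 1.09 / 100_000_000  # deaths per mile (assumed benchmark)
confidence = 0.95

# Driving n fatality-free miles rules out a rate of `fatality_rate`
# at the chosen confidence when exp(-fatality_rate * n) <= 1 - confidence,
# i.e. n >= -ln(1 - confidence) / fatality_rate (the "rule of three").
miles_needed = -math.log(1 - confidence) / fatality_rate
print(f"{miles_needed / 1e6:.0f} million fatality-free miles")
```

With these inputs the result comes out at roughly 275 million miles, which is why Tesla’s 130 million Autopilot miles at the time of the crash, impressive as they were, fell short of a statistical safety case.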
The Transition to Fully Autonomous Vehicles
Self-Driving Cars Need to Be Trained
Hacks and Crashes
As for computer crashes, yes, it is possible for the computer that runs your self-driving car to crash, but it will happen so infrequently that, by the numbers, you will be significantly safer in an autonomous vehicle than if you were driving yourself.
Fear and Assessment of Risk
(BTW: Please do not bring up the absurd “Why Self-Driving Cars Must Be Programmed to Kill” scenario where “One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?” If you had situational awareness and time to consider all of the outcomes posited by this nonsense hypothetical, you’d have time to step on the brake. If you didn’t have time to consider all of the potential actions and outcomes, the automatic emergency braking (AEB) system would have engaged to prevent the car from hitting what was in front of it – the people you would have killed while you were thinking about what to do.)
With any luck, the fear-mongers and bureaucrats will get out of the way, and we will all be much safer sooner.