In what is believed to be the first U.S. pedestrian death involving an autonomous vehicle on public roads, a woman in Tempe, Arizona, was struck and killed by a self-driving Uber vehicle on March 18. She was reportedly walking her bicycle across the street, outside a crosswalk, and the Uber—which had a human operator aboard as backup—reportedly did not slow down.
Of course, a new technology claiming its first pedestrian fatality set off alarm bells and made instant headlines, with countless shares on social media.
Uber immediately suspended its self-driving tests in Arizona (where testing is encouraged by light regulation) and other North American locations, and a number of public officials made statements urging autonomous vehicle manufacturers to slow down and proceed responsibly with technology that, according to Sen. Richard Blumenthal (D-Connecticut), “has a long way to go before it is truly safe for the passengers, pedestrians and drivers who share America’s roads.”
Are Autonomous Cars Really Less Safe?
It’s no surprise that some people are skeptical that driverless cars can be programmed to respond properly to the many unpredictable situations that will confront them. But the flip side of Blumenthal’s caution is this: Vehicles driven by humans are not truly safe for the passengers, pedestrians, bicyclists and drivers who share America’s roads, either.
According to the National Highway Traffic Safety Administration (NHTSA), 37,461 lives were lost on U.S. roads in 2016, an increase of 5.6 percent over the previous year. There were 840 bicyclist fatalities (the highest number since 1991), while pedestrian deaths rose a whopping 9 percent to 5,987 (the highest number since 1990). To put the single Tempe fatality of March 18 in perspective: on an average day in the United States, cars with drivers kill about 16 pedestrians (5,987 deaths ÷ 365 days ≈ 16).
A Question of Man vs. Machine
Much has been written about the potential pros and cons of autonomous vehicles; one of the biggest upsides is that the most dangerous driving behaviors of human beings will be programmed out of driverless cars. With computer algorithms to calculate appropriate following distance, stopping distance, when to turn, appropriate and legal speed, and so on, experts predict autonomous vehicles may actually be safer “drivers” than many humans. They will not be drunk or distracted, they will not tailgate or run red lights, and they will be programmed to follow the rules of the road with precision. In fact, anecdotal evidence suggests there have been incidents in which human-driven vehicles rear-ended autonomous test vehicles precisely because the latter came to a full stop when they were supposed to.
On the downside, there are a number of concerns: cost, insurance liability in driverless crashes, training of backup operators, the reliability of the GPS systems on which the cars will depend, and the possibility of the computer technology failing. Probably the most front-and-center concern is how sophisticated the programming will be when it comes to detecting sudden, unforeseen dangers involving other vehicles, pedestrians and bicyclists. Bike advocates are particularly concerned that pedestrian-detection programming will not be sufficient to account for faster-moving and less predictable bicycles.
A Plea for Caution
In a letter to commissioners at the state Departments of Transportation and Motor Vehicles—who are charged with developing state regulations for the autonomous cars and their sensing technology—New York Biking Coalition Executive Director Paul Winkeller wrote:
“NYBC remains very concerned that taller, more fast-moving and less predictable bicyclists could be in further danger if the sensing technology, per government oversight, are not well thought out or carefully designed. Please keep our perspective in mind and our request to be at the table if and when an interagency AV Task Force is put together!”
Stay tuned for more as this new technology and its impact on society continue to develop.