Self-driving cars: The first pedestrian fatality
It was an accident that was never supposed to happen, said Tim Higgins in The Wall Street Journal. The promise of self-driving cars “is that robot eyes are supposed to detect things humans can’t.” So when one of Uber’s autonomous vehicles “plowed straight into” a pedestrian in Tempe, Ariz., last week, killing her, many experts said that it was “clear the system failed.” The self-driving Volvo “never seemed to detect” the 49-year-old woman and did not appear “to brake or veer.” The woman was crossing the road outside of a crosswalk and stepped into the path of the vehicle. But autonomous technology is supposed “to ‘see’ objects in the dark,” and conditions for the car’s cameras and sensors were optimal: The night was clear, and the roads in Tempe are wide and clearly marked, with minimal traffic. This first-ever pedestrian death “comes at a critical time for the nascent self-driving vehicle sector,” said Aarian Marshall in Wired.com. Uber, Alphabet, and others have spent billions on research and development to prove the technology is safer and more efficient than human drivers. “But now is the in-between time, the moment when autonomous vehicles are less than perfect, even as they take to public streets in ever greater numbers.”
It’s “appealing” to assume that self-driving cars are safer than those driven by humans, said Megan McArdle in The Washington Post. “Unfortunately, it’s wrong.” In 2016, there were 1.18 fatalities for every 100 million miles that Americans drove. “Since Americans drove nearly 3.2 trillion miles that year, that added up to tens of thousands of deaths.” Self-driving cars, by contrast, have gotten nowhere near racking up 100 million miles; Alphabet’s Waymo, the industry leader, has logged about 4 million miles of road travel, while Uber, which has now suspended its testing, just reached 2 million. “We won’t know how dangerous self-driving cars are compared with human drivers until they’ve driven billions more miles.” Self-driving evangelists insist that removing humans from the equation will improve safety, said Leonid Bershidsky in Bloomberg.com. But here was a car driving in perfect conditions on “streets laid out on a perfect grid”—and it wasn’t able to drive better than the average human.
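McArdle’s arithmetic can be checked with a quick back-of-the-envelope calculation; the sketch below uses only the figures quoted above (1.18 fatalities per 100 million miles, roughly 3.2 trillion miles driven in 2016, and the Waymo and Uber mileage totals), and the variable names are illustrative, not from any source.

```python
# Back-of-the-envelope check of the fatality-rate figures quoted above.
fatalities_per_100m_miles = 1.18   # U.S. rate for 2016, as cited
total_miles_2016 = 3.2e12          # ~3.2 trillion miles driven that year
benchmark_miles = 1e8              # the 100-million-mile yardstick

# Implied annual death toll: rate times the number of 100M-mile blocks driven.
implied_deaths = fatalities_per_100m_miles * (total_miles_2016 / benchmark_miles)
print(f"Implied 2016 deaths: about {implied_deaths:,.0f}")  # ~37,760

# How far the autonomous fleets are from even one 100M-mile sample.
waymo_miles = 4e6   # ~4 million miles logged
uber_miles = 2e6    # ~2 million miles logged
print(f"Waymo: {waymo_miles / benchmark_miles:.0%} of the benchmark")  # 4%
print(f"Uber:  {uber_miles / benchmark_miles:.0%} of the benchmark")   # 2%
```

The result, roughly 37,760 deaths, matches the column’s “tens of thousands,” and the fleet percentages show why no statistically meaningful comparison with human drivers is yet possible.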
It is “getting increasingly hard to fathom why we’re trusting” private tech companies with matters of such critical public importance, said Patrick George in Jalopnik.com. Uber, Alphabet, and others want us to imagine a future of driverless cars whisking us “around our futuristic cities in quiet comfort, reading tablets and getting work done.” But we know from Uber’s record that the company can be “irresponsible, exploitative, and downright scummy.” Shouldn’t some “large NASA-type organization” be placed in charge instead? “The tech free-for-all must end at some point, and perhaps this horrible crash is the beginning of that.” ■