Editorial: Arizona Uber Robo-Car Death Highlights Backwards Approach of Regulators

If robot cars were about safety, the computers would be making sure humans drive right

Photo: ABC15 Arizona

Note: GJEL Accident Attorneys regularly sponsors coverage on Streetsblog San Francisco and Streetsblog California. Unless noted in the story, GJEL Accident Attorneys is not consulted for the content or editorial direction of the sponsored content.

If you’ve ever sat near the front of a modern passenger train, you may have heard an occasional bell or buzzer coming from inside the driver’s compartment. The bell is part of an electronic safety tool called a “Vigilance” or “Automatic Warning” system. When that bell sounds, train drivers have a few seconds to take certain steps, such as applying the brakes or pushing a button, to confirm that they are alert and doing what they’re supposed to do.

If they don’t, computers intervene and stop the train. This is a big part of the reason that, despite some recent and well-publicized crashes, trains are far safer than automobiles.
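
For readers who haven’t encountered one, the logic of these systems is simple enough to sketch in a few lines of code. The sketch below is purely illustrative: the class name, prompt interval, and response window are assumptions chosen for clarity, not the parameters of any particular rail system.

```python
import time

class VigilanceTimer:
    """Illustrative dead-man/vigilance timer; names and timings are made up."""

    def __init__(self, prompt_interval=60.0, response_window=5.0):
        self.prompt_interval = prompt_interval  # seconds between alertness prompts
        self.response_window = response_window  # seconds the driver has to respond
        self.last_prompt = time.monotonic()
        self.waiting_for_ack = False

    def driver_acknowledge(self):
        # Called when the driver pushes the button or applies the brakes.
        self.waiting_for_ack = False

    def tick(self, sound_bell, apply_emergency_brake):
        # Called periodically by the train's control loop.
        now = time.monotonic()
        if not self.waiting_for_ack and now - self.last_prompt >= self.prompt_interval:
            sound_bell()                 # prompt the driver
            self.waiting_for_ack = True
            self.last_prompt = now
        elif self.waiting_for_ack and now - self.last_prompt >= self.response_window:
            apply_emergency_brake()      # no response in time: the computer stops the train
```

The essential design choice is that the default outcome of driver inaction is a stopped train, not a moving one.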

California’s roads remain deadly, in ways that would never be tolerated if cars were regulated with the same kind of vigilance applied to rail or aviation: 3,680 people died on California’s roads and highways in 2016. And yet last month Department of Motor Vehicles (DMV) Director Jean Shiomoto decided it would be a good idea to give autonomous vehicle (AV) makers permission to test robot cars on city streets without a backup driver behind the wheel. “Safety is our top concern [emphasis added],” she wrote in the announcement.

Shiomoto’s disconnect was underscored yesterday by the death of Elaine Herzberg, who was walking her bike across a street in Tempe, Arizona, when she was struck and killed by an autonomously driven Uber SUV.

It’s unclear if a human driver could have done anything to avoid the crash. That said, Tempe police initially reported that the car was going 38 mph in a 35 mph zone, and it would not be the first time an Uber driverless car had broken the law or put lives in danger. [CORRECTION: Early reports that the car was speeding were apparently wrong.]

Instead of allowing AV companies to experiment with driverless cars on public streets, the DMV should require these companies to develop technology akin to vigilance systems for trains. In other words, use AV technology to assure that humans drive safely.

First, the DMV should work with AV developers on technology to make sure human drivers don’t speed or run stop signs or red lights. They can work on tech that sets off an alarm if the motorist starts texting or otherwise stops paying attention. Why can’t the DMV work with transportation network companies (TNCs), such as Uber and Lyft, to suspend the licenses of drivers who break the speed limit, block bike lanes, or violate crosswalks? Obviously, the TNCs already know where their cars are, what they’re doing, and who is driving.

The DMV can then require companies to develop technology that can take over control of a car if a motorist shows signs of being incapacitated, starts driving erratically, is texting or otherwise distracted, or runs a red light. Later versions could have enough aptitude to pull over to a safe location, turn off the engine and disable the ignition, and signal the police. Eventually, these safety features could be required for a car to operate legally in California.
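
To make the two preceding paragraphs concrete, here is a rough sketch of the kind of escalation policy being described, from warning a driver up to taking over and pulling the car off the road. It is a hypothetical illustration; the thresholds, input names, and actions are assumptions, not anything the DMV or an AV developer has specified.

```python
from enum import Enum, auto

class Action(Enum):
    NONE = auto()
    WARN_DRIVER = auto()            # audible/visual alarm
    ASSUME_CONTROL = auto()         # vehicle systems take over from the driver
    PULL_OVER_AND_REPORT = auto()   # stop safely, disable the ignition, notify police

def escalation_step(speed_mph, speed_limit_mph, driver_attentive,
                    ran_red_light, driver_incapacitated):
    """Hypothetical escalation policy; inputs and thresholds are assumptions."""
    if driver_incapacitated:
        return Action.PULL_OVER_AND_REPORT
    if ran_red_light:
        return Action.ASSUME_CONTROL
    if speed_mph > speed_limit_mph or not driver_attentive:
        return Action.WARN_DRIVER
    return Action.NONE

# Example: speeding while texting triggers a warning first, not an immediate takeover.
print(escalation_step(38, 35, driver_attentive=False,
                      ran_red_light=False, driver_incapacitated=False))
# -> Action.WARN_DRIVER
```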

Some cars already have features such as lane assist and automatic braking. These are a step in the right direction when they’re designed to set off an alarm and warn a driver who seems to be losing focus. But they’re obviously not required, and some, such as Tesla’s Autopilot system, seem engineered to encourage distracted driving instead of preventing it.

In the rail industry, vigilance systems were designed to assure that trains were operated safely by their human drivers. But these systems eventually evolved to the point that they could take over train operation almost entirely, and after decades of trial and error there are now many transit systems from which human operators have been removed altogether.

DMV regulations should require AV developers to follow this model; government and developers can work together to create technology and legal frameworks that assure that human drivers drive safely.

Then, and only then, should we be talking about allowing fully autonomous cars on public streets.

  • Stuart

    the DMV should require these companies to develop technology akin to vigilance systems for trains

    How, exactly? The basic premise of your argument seems to be that, instead of focusing on making a product consumers actually want safer than human drivers, the government should force companies to build systems that consumers don’t want, force car companies to put them in cars, and force consumers to pay for them. You think that’s a viable approach in a world where we can’t even manage to get the political will together to test speed cameras in SF?

    the DMV should work with AV developers on tech to make sure human drivers don’t speed

    Speed regulators are a well-established technology. Similarly, a system that automatically reports speeding to the police would pretty much be a speedometer plus a mobile data connection (or just speed cameras). The technology for either or both has been available for quite some time; it’s not a tech problem. The answer to the question “so why aren’t they already required?” explains the fundamental problem with what you are describing.

    (Separately, it’s not at all clear to me why you think the systems you’ve described would help nearly as much as actual AVs can. A focus-monitoring system ignores studies showing the substantial distraction effect of just talking on a hands-free phone while driving, for instance. A system that takes over after you run a red light doesn’t help if the driver hits a person while running the light. Keeping a driver focused doesn’t help with the reaction-time limits that are fundamental to humans but don’t need to exist in computers.)

  • murphstahoe

    If Tempe police ticket this car for going 38 in a 35, it will be the first time they’ve ticketed “an entity” for going 38 in a 35.

  • You want to slow down the huge benefits for cyclists, pedestrians, and pollution that come when cars can drive themselves to pick you up: low-cost one-way trips, the ability to mix walking, biking, transit, and driving on the same trip as desired, and cars that can take themselves to electric charging stations, removing all the barriers to electric transportation. Safety is important, and it is task #1, but it is far from everything.

  • It is a 45 zone. Original reports were in error.

  • Roger R.

    Thanks. I put in a correction note.

  • At the end of the day, no amount of software optimization or regulation can prevent injury when someone steps out in front of a vehicle moving at 40 mph. Physics does not bow to concerns about pedestrian safety or AV regulation. There simply *will* be cases where someone jumps out from behind a van in front of a car, and that person may die.

    We simply don’t know at this point if these vehicles are more or less safe than regular drivers. There are 1.81 pedestrian fatalities for every 100 million miles driven in the US. Since there are literally trillions of miles driven each year in the US, that comes out to about 6,000 pedestrian deaths last year, or 16 a day. 35% of those involve drunk pedestrians.

    Autonomous vehicles have driven far less than 100 million miles under testing conditions. So, statistically, we have no idea whether this person’s death is an outlier or a harbinger of things to come.
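
A quick back-of-the-envelope calculation makes the commenter’s point about sample size concrete. Using the 1.81-per-100-million-mile figure cited above and an assumed 10 million cumulative autonomous test miles (a placeholder, not a reported number), the expected number of pedestrian fatalities at the human baseline comes out well below one:

```python
# Back-of-the-envelope check; the AV mileage is an assumed placeholder,
# not a reported statistic.
human_rate_per_mile = 1.81 / 100_000_000   # pedestrian fatalities per mile, as cited above
assumed_av_test_miles = 10_000_000         # hypothetical cumulative AV test mileage

expected_fatalities = human_rate_per_mile * assumed_av_test_miles
print(f"Expected fatalities at the human baseline: {expected_fatalities:.2f}")  # ~0.18
# One observed death over so few miles cannot, by itself, show that AVs
# are better or worse than human drivers.
```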

  • Nancy Johnson

    They are safer as long as everyone follows the same set of parameters. The computer can only accurately predict what is going to occur if it knows what the other object is going to do. I believe all of the accidents thus far were caused by the other driver or, in this case, the pedestrian. If every car, cyclist, and pedestrian does exactly what they should be doing, and the computer is programmed to know that is what will happen, it is safer (assuming the computer does what it is programmed to do).

  • I don’t think this is correct at all. It’s not the job of a computer to predict. It’s the computer’s job to react, and quickly.

    Humans drive through an intersection when they have the right of way assuming that other cars will stop. If a car being chased by a cop runs through a light, I may or may not be able to react in time…but the computer almost certainly will be able to do so better than me.

  • Ethan

    You should want the computer to predict that a cross-traffic car isn’t slowing down and is going to run the red light. You shouldn’t want it to wait until the car is actually running the red and headed for your door before reacting.

  • The video of the incident shows a situation where a self-driving vehicle should presumably be able to avoid a collision.

  • Nancy Johnson

    The question posed was whether autonomous cars are safer. What I’m saying is that, assuming everyone follows a set of parameters and the autonomous vehicle is programmed to follow those parameters, it will be safer than human drivers. If people aren’t following the set of parameters (like running a red light), then I cannot say whether autonomous vehicles are safer, because I don’t know how the computer will react.
