March 22, 2018
The heated debate about self-driving vehicles has been reignited after a 49-year-old woman was struck and killed by a self-driving Uber being tested on Arizona roads. The woman was attempting to cross the road with her bicycle in low light at the time of the accident, and although a human safety driver was behind the Uber’s wheel, he did not react in time to prevent the collision.
Conflicting viewpoints have been taken regarding autonomous cars, with advocates arguing that they have a reduced risk of accidents compared to vehicles driven by humans and critics saying that there are driving variables machines can’t predict or appropriately respond to.
Neither Vehicle nor Human Safety Driver Detected Pedestrian
It was around 10:00 p.m. on March 18, 2018, when 49-year-old Elaine Herzberg attempted to cross an unidentified road in Tempe, Arizona, with her bicycle. Dashcam video from the self-driving Volvo XC90 (outfitted with non-Volvo software) shows dim lighting and an empty roadway when a figure suddenly appears illuminated by the vehicle’s headlights in the middle of the right-hand lane. The video, obtained by Tempe Police, cuts out just as the front right of the vehicle is about to collide with Herzberg. The vehicle does not appear to slow down at any point in the video.
Video Shows Driver Looking Down Before Crash
A second video, focused on the human safety driver of the Volvo, shows 44-year-old Rafael Vasquez looking down, possibly at a phone, before glancing up and reacting in shock to something ahead. Tempe Police sergeant Ronald Elcock said that Vasquez did not appear to be impaired in any way and that he was cooperating with investigators. Officials believe the Volvo was traveling at 38 m.p.h. in a 35 m.p.h. zone in good weather conditions.
Herzberg was later pronounced dead at a local hospital.
Police Say Uber “Likely Not at Fault”
Initial responses from Tempe Police seemed to favor Uber, with Tempe Police Chief Sylvia Moir saying that the video footage made her think it was “very clear that it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how [the pedestrian] came from the shadows right into the roadway.”
Moir even went on to say that she doubted the blame would be on the ride-sharing giant.
“I suspect preliminarily it appears that the Uber would likely not be at fault in this accident,” Moir told media after the incident. The Tempe Police Department, however, later released a statement clarifying that it “does not determine fault in vehicular collisions” and that the Maricopa County Attorney’s Office would be the ones to review the investigation and make any charges in the matter. The National Transportation Safety Board is also investigating.
Experts Disagree, Saying Self-Driving Car Should Have “Seen” Pedestrian
Two autonomous vehicle experts interviewed by CTV News in Canada had a vastly different take on the crash.
“[The vehicle] absolutely should have been able to pick [Herzberg] up,” Sam Abuelsmaid, a Navigant Research analyst who studies self-driving vehicles, told the news outlet. “From what I see in the video it sure looks like the car is at fault, not the pedestrian.” He noted that the vehicle’s laser and radar sensors are better able to detect objects in the dark and that Herzberg would have been within range of both.
Bryant Walker Smith is a law professor at the University of South Carolina who also studies self-driving vehicles, and he echoed Abuelsmaid’s statement.
“The victim did not come out of nowhere. She’s moving on a dark road, but it’s an open road, so Lidar (laser) and radar should have detected and classified her [as a human],” Smith said.
The accident is exactly the kind of thing John M. Simpson, privacy and technology project director for Consumer Watchdog, has been worried about.
“The robot cars cannot accurately predict human behavior, and the real problem comes in the interaction between humans and the robot vehicles,” Simpson told The Guardian.
Uber and Toyota Suspend Self-Driving Vehicle Programs After Crash
Uber took to Twitter after the fatal crash, saying that their hearts went out to the victim’s family. Shortly thereafter the ride-sharing company announced it would temporarily halt its self-driving vehicle testing across North America. A follow-up statement on Uber’s communications Twitter account on March 21, 2018, described the video from the crash as “disturbing” and “heartbreaking to watch” and confirmed that the company’s autonomous cars were still grounded.
Toyota responded similarly to the incident, saying that they would temporarily suspend testing of their autonomous vehicles. The carmaker said the decision was inspired in part by concern for their test drivers, who they believed could be experiencing “an emotional effect” from the Uber crash.
Advocates Worry Legal Loophole Will Prevent Victims from Being Able to Fight Car Manufacturers
Beyond the safety concerns consumers have about autonomous vehicles are legal fears. A bill known as the AV Start Act is currently under consideration in the Senate, and, if passed, it would allow manufacturers of self-driving vehicles to force consumers into arbitration.
Arbitration has a reputation for benefiting the corporations involved rather than the consumers, and in this case it would prevent consumers from pursuing lawsuits. At the same time, information from arbitration proceedings is unlikely to be shared publicly, possibly preventing important safety information about autonomous cars, or the risks they pose, from reaching the public.
If you or a loved one has been injured in an auto accident, the attorneys at Cutter Law have the experience and expertise to help you understand your legal options and hold those responsible accountable. Contact us for a free case evaluation.