Ethical Concerns Rise Over the Future of Autonomous Vehicles

Automated machinery is remarkable, but it carries risks that come with reduced human supervision and intervention. The death of a pedestrian struck by one of Uber’s driverless cars last month brought these risks to light and prompted the public to question the ethics of putting autonomous vehicles on the road.

A Tragic Accident

The Uber vehicle in question struck a 49-year-old woman in Tempe, Arizona, while traveling in autonomous mode. The woman, Elaine Herzberg, was transported to a local hospital, where she later died from her injuries. This marked the first known pedestrian fatality in an accident involving a driverless vehicle.

Police later released video footage of the accident, taken from cameras installed in the vehicle. The incident occurred at night, and in the released footage the woman appears obscured by shadows and her dark clothing; she was not spotted until it was too late. Although a human operator was behind the wheel, the operator was looking down in the moments before the accident, glancing up only periodically to check the road. By the time the woman came into view, it was too late to avoid her, even if the operator had been watching the road the entire time.

Public Response

Backlash following the accident could put autonomous vehicles and their further development on hold unless more is done to prevent tragedies like this one. Tensions mounted during the investigation as those involved searched for someone to blame and the general public weighed in with opinions on the matter.

If Uber’s technology was at fault, the company could face public and commercial resistance as well as criminal charges. Even without a finding of fault, the incident has raised public doubts about the safety and reliability of autonomous vehicles.

The Investigation

While the accident certainly underscores the need for human safety drivers to stay fully alert with their eyes on the road, it also raises broader questions about autonomous vehicles and road safety. Although the woman was difficult to see until she was directly in the path of the car, the vehicle itself should have been able to detect her.

“The sensors should have detected the pedestrian in this case,” said Carnegie Mellon professor Raj Rajkumar. “The cameras were likely useless, but both the radars and the Lidar must have picked up the pedestrian. Though no information is available, one would have to conclude, based on this video alone, that there are problems in the Uber vehicle software that need to be rectified.”

Timothy Carone, an associate teaching professor at the University of Notre Dame’s Mendoza College of Business, also found fault with the vehicle’s inability to detect the woman, stating that the question now is whether adequate testing was conducted before the vehicles were sent out for road tests. “Did they jump the gun?” he asked. “If their testing is found to be inefficient, that cannot be allowed to happen again because these systems have to be ready for road tests.”

An Uber spokesperson discussed the footage in a statement, calling it “disturbing and heartbreaking” and expressing condolences to Herzberg’s family and friends. The spokesperson also stated that the fleet of self-driving vehicles would remain grounded and that the company was assisting authorities in the investigation.

Despite the vehicle’s failure to detect Herzberg, police stated early in the investigation that Uber was unlikely to be held liable for the accident, primarily because she was not crossing at a crosswalk. “It’s very clear it would have been difficult to avoid this collision in any kind of mode based on how she came from the shadows right into the roadway,” said Tempe Chief of Police Sylvia Moir.

Automotive Response

Nvidia suspended its own self-driving tests following the accident, which appears to have stemmed from issues specific to Uber’s hardware or software. Other automotive companies grounded their autonomous fleets in response to the tragedy as well.

Toyota was the first major auto manufacturer to halt tests of driverless cars. Toyota spokesman Brian Lyons cited the incident and the “emotional effect” it might have on test drivers as the reason for pausing testing of Toyota’s Chauffeur mode on public roads. Toyota also stated that it did not have information regarding the accident and had not yet decided whether to purchase Uber’s driverless vehicle software.

Alphabet’s Waymo, however, stands apart due to its independent approach to autonomous vehicles. Waymo uses its own hardware, software, and mapping, developed with the aid of Intel. Although its vehicles have performed well thus far, their future, and that of the underlying hardware and software, depends on the company’s ability to convince top automobile manufacturers to purchase from Waymo rather than develop their own versions.

The Takeaway

Autonomous vehicles are an impressive innovation. They allow people to get from point A to point B with limited intervention from the human operator, which is especially useful in situations where drivers are least comfortable, such as parallel parking and traveling unfamiliar routes. Driverless vehicles also have the potential to help disabled people become more independent. However, until problems such as those revealed by this accident are addressed, that potential may take quite a bit longer to become reality. In the meantime, any “driverless” vehicle on the road will still require an alert and capable operator behind the wheel, just in case.
