Uber’s self-driving car kills a pedestrian: who is at fault?

In the United States, experiments with autonomous cars have been encouraged at the federal level since 2017.

However, a fatal collision involving an autonomous car being developed by Uber has revived the debate over manufacturers’ liability in accidents with this kind of vehicle.

For the second time, an autonomous vehicle has been involved in a fatal accident. On Monday, an experimental self-driving Uber car, with a safety driver on board, struck and killed a pedestrian crossing the road in Tempe, Arizona. In 2016, a man had already died at the wheel of a Tesla Model S while its Autopilot system was engaged. With an algorithm at the controls, autonomous cars are supposed to eliminate human error and reduce the number of accidents. They are not, however, infallible.

In the United States, experimentation with autonomous cars has been permitted nationwide since 2017. The Self Drive Act, passed by the House of Representatives in September of that year, provides for the testing of up to 100,000 autonomous vehicles on American roads within three years. The act also aims to prevent states from curbing the development of autonomous cars by imposing stricter regulations.

However, “there are still no specific rules on liability in the case of an accident involving an autonomous car,” says Solën Guezille, associate attorney at Chatain & Associés, who specializes in risk management.

In the event of an accident, the manufacturer’s liability may be called into question if technical failures are demonstrated. Most autonomous cars are currently tested with a person on board who can, in theory, intervene to prevent problems and accidents. Some states, such as Arizona, allow testing of autonomous vehicles without anyone on board. Arizona will be joined in early April by California, which will nonetheless require that such vehicles be remotely controllable in case of danger.

The responsibility of the driver (when there is one), of the manufacturer, or of an external factor must, however, be established on a case-by-case basis. Tempe’s police chief, whose department is investigating Uber’s latest accident, said the company was probably not at fault. “It is clear that it would have been difficult to avoid this collision, whether in autonomous mode or with a human being in control,” explains Sylvia Moir. A year ago, also in Tempe, an autonomous Uber car collided with another vehicle; it was ultimately the driver of the other vehicle whom police found at fault.

The development of autonomous cars also raises an ethical question about the algorithms that drive them. In the event of imminent danger, should the vehicle prioritize the survival of its passengers or of pedestrians? “It’s an ethical subject that will have to be debated,” says Solën Guezille.

Shakes Gilles

Editor of The Talking Democrat. He enjoys bike riding, kayaking, and playing soccer. On a slow weekend, you’ll find him with a book by the lake.