In February of this year, we got a glimpse of the future of auto accidents when a self-driving car being tested in Mountain View, California, got into an accident. This in itself is not notable, as self-driving cars have been involved in fender benders before. However, as the data from the cars themselves proved, those earlier accidents were never the fault of the car; the blame usually lay with other drivers ignoring traffic signals.
This time, however, was different. The self-driving car encountered an obstacle on the road: sandbags acting as a barrier around an open storm drain. It moved toward the center lane to avoid the obstacle, and a few seconds later it hit the side of a bus. In this specific instance, the fault lies with the car, as the car’s programming assumed the bus would yield the right of way, and that didn’t happen.
This brings up an interesting question. In a case like this, if the accident is the fault of the car, then who is actually to blame? There’s no human driver, so the driver can’t take responsibility. Even if someone is sitting in the driver’s seat, there usually isn’t enough time in an accident situation for a person to take over and attempt to avoid the collision.
So what happens when you know a self-driving car is at fault, and there’s no driver?
Going Up The Chain
As the future brings more self-driving cars onto the road, a few things are becoming clear. One thing that self-driving cars do better than any human being is collect accurate information. A self-driving car is full of cameras, sensors, and even laser-based radar (lidar), so it keeps an accurate account of exactly what other vehicles are doing on the road. However, nothing is perfect, which means that if a self-driving car gets into an accident where it is clearly at fault, the very data it has been collecting about traffic movement will implicate the car itself. With that kind of hard evidence in a court of law, finding fault will not be difficult.
The next question then becomes: who, ultimately, will take responsibility? In a normal accident, the driver is at fault, whether through negligence, poor judgment, or a combination of both. In this case, the people in the car are passengers, so no one actually inside the vehicle is responsible for the car’s poor choices if the car implicates itself in an accident.
That means we need to go “up the chain” to whoever it was that “taught” the car to make the choices it did. In practice, this means either the automakers themselves or the companies—such as Google—that make the software driving the car’s decision-making process. Volvo, for example, is aiming to have a self-driving car on the roads by 2020, and it has already announced that it will take legal and financial responsibility for any accident in which its car is at fault.
It’s a brave new world for accident law, and if you’re not familiar with it, get an expert, such as a car crash lawyer, on your side to help you.