
Car Collisions: Can Self-driving Cars Blame Humans?


When a self-driving car collides with another self-driving car in a comparative fault state like Washington, who is liable?

Compared with an accident involving only human drivers, a crash between self-driving cars brings a different set of factors into play: the level of automation, the role of the human driver, and the limitations of the autopilot software. In cases like this, it’s best to seek the counsel of a car accident lawyer.

When partially autonomous cars are liable

Fully autonomous cars are not yet allowed on US roads, but partially autonomous cars, those that require some level of supervision from a human driver, are. This means there’s a chance the driver of a partially autonomous vehicle may be at fault in a crash.

Some self-driving cars are more autonomous than others. The less autonomous ones intervene only occasionally, for example through automatic braking. Others can drive themselves for long stretches, alerting the driver only when they encounter a situation they’re not programmed to handle.

Thus, when a self-driving car crashes into another car, the accident needs to be investigated to determine whether the human driver, the autopilot program, or both were at fault.

How safe is safer?


Robots are supposed to be safer drivers than humans. Estimates say that advanced driver assistance systems, the less autonomous relatives of fully automated vehicles, could save 50,000 lives per year. Hopes are high that self-driving cars, their more advanced relatives, will also significantly reduce deaths from road accidents, 94 percent of which are attributed to human error.

But how safe is safer, and how can you tell whether a self-driving car actually made a mistake?

For graduate student Cian Ryan, there could be a way for people to assess the safety of a self-driving car. He is developing a method to estimate a self-driving car’s likelihood of getting into a crash by studying how it swerves, accelerates, and brakes too hard, behaviors associated with reckless driving in humans. He presented a paper on this topic in Portugal.

Together with other researchers, Ryan designed an algorithm to monitor a car’s behavior and compare it with that of similar cars. If successful, this work could help insurance companies measure risk and liability for self-driving car insurance policies.
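
To make the idea concrete, here is a minimal sketch in Python of how a behavior-based risk score could work in principle. It is not Ryan’s algorithm; the field names, thresholds, and scoring rule are assumptions made purely for illustration.

    # Toy illustration only: the fields and thresholds below are hypothetical,
    # not taken from Ryan's research.
    from dataclasses import dataclass

    @dataclass
    class TelemetrySample:
        accel_ms2: float     # longitudinal acceleration in m/s^2 (negative = braking)
        yaw_rate_dps: float  # yaw rate in degrees per second (proxy for swerving)

    def risk_score(samples, hard_brake=-4.0, hard_accel=3.0, sharp_swerve=25.0):
        """Fraction of samples showing hard braking, abrupt acceleration, or sharp swerving."""
        if not samples:
            return 0.0
        flagged = sum(
            1 for s in samples
            if s.accel_ms2 <= hard_brake
            or s.accel_ms2 >= hard_accel
            or abs(s.yaw_rate_dps) >= sharp_swerve
        )
        return flagged / len(samples)

    # A car's score could then be compared with the average score of similar cars;
    # a score well above that baseline would flag unusually risky driving behavior.
    trip = [TelemetrySample(-5.2, 3.0), TelemetrySample(0.8, 2.0), TelemetrySample(1.1, 30.0)]
    print(round(risk_score(trip), 2))  # 0.67 for this toy trip

In practice, a comparison like this against a fleet baseline is one way an insurer might translate driving behavior into a measure of risk.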

The fact is, even if a computer is a safer driver than a person, that doesn’t mean an autopilot program is beyond liability when it makes a mistake. When that happens, the car’s manufacturer may have to answer for the accident.

Patience for answers

So, if both human and machine can make mistakes, and both partially autonomous cars in a crash can be at fault, does this mean four parties could share the blame in one car accident? Maybe.

That said, car accident lawyers and car insurance companies need a way to determine which of the four bears the greater share of fault. But is there a way to do this, and is it fair? What if a small human error during a partially automated drive magnifies into an accident that costs lives?

Self-driving cars have a noble purpose: to minimize human errors in driving, thereby reducing accidents. But we need to know how to make the most of this new technology. Relevant laws are needed to help people handle accidents of this nature. Insurance policies must tackle the complexities of such accidents. And people need to have patience until this technology is perfected.

