Tesla driver death highlights risks from self-driving vehicles

Tesla car testing the autopilot

Assigning liability might be the greatest problem that the advent of self-driving vehicles poses to insurers. The first confirmed death of a motorist in an autonomous car indicates that nobody knows who is liable in such cases.

Joshua Brown was using the “autopilot” feature when his Tesla Model S collided with a tractor-trailer in Florida on 7 May, news reports indicate. The car failed to stop when the truck pulled in front of it, the Florida Highway Patrol reported. Brown, a known Tesla enthusiast, was killed in the accident.

The Model S was knocked clear of the highway and landed in a nearby field. One of the first people to reach the wrecked vehicle claimed a Harry Potter movie was playing on a DVD player in the car, Reuters reported. Another witness told Reuters that he heard no movie playing when he reached the crash scene.

A highway patrol officer did find a DVD player with a Harry Potter video on it in the wrecked car, Reuters reported. The possibility that Brown was watching a movie instead of the road reinforces charges that autonomous vehicles encourage reckless behaviour on the part of motorists.

What is Tesla’s liability?

The obvious liability issue for Tesla is whether the autopilot failed to detect the semi-tractor and the trailer it was pulling. The autopilot system is supposed to detect vehicles and other objects and brake or steer around them, as seen in a video Mr Brown posted in early April of this year, before his death.

Witnesses said the Model S did not stop but kept going straight into the trailer. An unidentified Tesla spokesman admitted that the sensors in the vehicle might not be able to distinguish white vehicles, like the truck involved in the accident, against bright sunlight.

An even greater issue is who was ultimately at fault: Brown, for behaving irresponsibly, or Tesla, for encouraging his behaviour with claims about its autopilot features. Brown’s family may well have a case against Tesla and its CEO Elon Musk, who has bragged about the autopilot’s capabilities in the past.

The case can certainly be made that Tesla assumed liability for Brown’s actions by designing a feature that performs some of the driver’s functions. That liability is reinforced by Tesla’s claims that the autopilot is capable of steering the vehicle under certain conditions.

The limited evidence available from news articles suggests that Tesla may have inadvertently assumed liability for the damage done in the accident. Whether it actually did will have to be determined by American courts – a process that could take years.

Tesla has taken some steps to limit the liability created by self-driving features. Last year The Wall Street Journal reported that the Model S contains a feature that allows it to automatically pass other cars. This feature is apparently activated by hitting the turn signal.

The thinking is that by pressing the turn-signal button the driver assumes liability for the vehicle’s actions. The onus is on the driver to judge that conditions are safe enough to let the car operate autonomously.

What is the driver’s liability?

That ploy might not help Tesla in the Brown case, because the vehicle was travelling straight rather than passing. The liability issue is cloudy because Brown had the ability to take control of the car at any time, but he failed to do so.

Such liability is an important issue in the United States because it may determine who will pay any claims arising from accidents. Is Tesla at fault, or is it Mr Brown, and now possibly his heirs?

If Tesla is found at fault, it could create a precedent for American auto insurance companies to refuse to pay claims in cases involving autonomous vehicles. A related issue is whether Mr Brown, or his heirs, assumed liability for the inherent risks when he activated the autopilot.

Tesla, or an insurer, might argue that Brown created unnecessary risks by abusing the autopilot system. That, of course, raises the question of whether there was a legal contract requiring Brown to behave in a certain way while using autopilot. A related question is whether Brown broke such a contract by watching a movie instead of the road.

The existence of such a legal relationship or contract would necessitate the existence of a policy designed to cover those circumstances. If no such policy exists, the question has to be asked: did Tesla violate mandatory insurance laws by putting a vehicle with autopilot on the road?

An even more troubling question is whether Brown was driving without insurance when he was using autopilot. If no existing policy covered his special circumstances, it can be argued that Brown was violating Florida law, which requires insurance for all motorists, even if he had paid for standard auto coverage. It might also be argued that Tesla violated the law by making the autopilot available without providing insurance coverage for it.

New insurance products will be needed

There is one clear lesson that insurers can learn from the Brown tragedy. New insurance products will have to be designed to cover autonomous cars, because present liability standards are inadequate.

A new kind of policy that covers both the driver and the manufacturer might be necessary for self-driving vehicles. That raises many other questions, including who will pay for such a policy.

The Brown case demonstrates that insurers may have to go back to the drawing board and create an entirely new class of policies for autonomous vehicles. Such products might be necessary, because traditional notions of liability may not apply to claims arising from accidents involving self-driving vehicles.
