An 18-year-old college freshman finishes her final mid-term.


Exhausted but excited, she walks to the parking lot, spots her already packed sedan and gets in to head out for spring break. She pulls away from campus and places her car into autonomous mode.


Wanting to let her friends know she is on her way, she reaches into her purse to grab her cell phone and starts sending a text. What feels like an instant later, she squints open her eyes to the harsh fluorescent bulb above as a nurse tells her she was in an accident. In the few seconds she took her eyes off the road, trusting her safety to her car's "autonomy," an SUV broadsided her at an intersection at 50 miles per hour.


New era changes notions of liability

Fortunately, this account is fictitious, but the scenario exemplifies how, from a legal perspective, the autonomous vehicle era will change notions of liability, where even the distracted driver is not necessarily at fault. There will be a whole new playbook, new theories of liability, and human vs. machine accounts that amount to "he said, it said."


Autonomous vehicles will become part of many types of claims, including personal injuries, property damage and, with respect to autonomous cargo trucks, issues such as business interruption and damaged or lost goods. Adjusters are facing a minefield in sorting out fault, coverage and exposure in claims involving autonomous vehicles.


Between full autonomy and total driver control, consumers will have many intermediate options. (Photo: Shutterstock)


Autonomous vehicle claims

When investigating a claim involving an autonomous vehicle, it is important to first determine the vehicle's classification and therefore its level of "autonomy." Between full autonomy and total driver control, consumers will have many intermediate options.


Related: How self-driving cars will change the rules of the road


In September 2016, the National Highway Traffic Safety Administration (NHTSA) issued new guidelines identifying six levels of autonomy:

  • Level 0: No automation.

  • Level 1: Some automation (like automatic braking).

  • Level 2: Automation can conduct some of the driving (like lane assistance and adaptive cruise control).

  • Level 3: Driver can cede control over some driving in some circumstances.

  • Level 4: Fully autonomous under certain conditions.

  • Level 5: Fully autonomous for all conditions (no steering wheel required).

Level 2 and 3 vehicles will most complicate fault determinations due in part to the very concept of rotating responsibility between human and machine. With Level 4 and 5 vehicles, the "driver" is the vehicle itself. Thus, if the vehicle is at fault, the claim would be based only in product liability unless there is evidence of independent negligence such as improper maintenance.
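
For claims teams that encode this kind of triage in intake tooling, the relationship between automation level and the likely starting theory of liability can be captured in a simple lookup. The sketch below is purely illustrative: the level summaries track the NHTSA guidance above, while the names and the "starting theory" labels are assumptions for illustration, not legal conclusions.

# Illustrative only: a simple lookup a claims-intake tool might use to flag
# the liability theory suggested by a vehicle's NHTSA automation level.
# Levels 0-1 are assumed to point toward ordinary driver negligence.
AUTONOMY_LIABILITY_MAP = {
    0: ("No automation", "driver negligence"),
    1: ("Some automation, e.g. automatic braking", "driver negligence"),
    2: ("Automation conducts some driving", "mixed: negligence and product liability"),
    3: ("Driver can cede control in some circumstances", "mixed: negligence and product liability"),
    4: ("Fully autonomous under certain conditions", "product liability, absent independent negligence"),
    5: ("Fully autonomous for all conditions", "product liability, absent independent negligence"),
}

def starting_theory(level: int) -> str:
    """Return an illustrative starting point for a claim at a given automation level."""
    summary, theory = AUTONOMY_LIABILITY_MAP[level]
    return f"Level {level} ({summary}): begin the analysis with {theory}."

print(starting_theory(2))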


Vehicle data recorder

After determining the vehicle's classification, a major priority will be obtaining data from the vehicle's event data recorder or "black box." NHTSA recommends all autonomous vehicles have a black box, which often records information such as speed, seat belt usage and braking. With autonomous vehicles, there could be additional data.
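
As a rough illustration of the kind of record an adjuster might request, the sketch below assumes a simplified, hypothetical schema. Actual event data recorders vary by manufacturer; the field names and sample values here are not drawn from any real device or crash report.

from dataclasses import dataclass

@dataclass
class EventDataRecord:
    # Fields commonly captured by conventional black boxes (per the article above)
    speed_mph: float
    seat_belt_fastened: bool
    braking_applied: bool
    # Additional, autonomy-specific fields a self-driving vehicle might log (assumed)
    automation_engaged: bool           # was the driving-automation system active?
    automatic_braking_triggered: bool  # did the emergency braking system intervene?
    driver_input_detected: bool        # any steering or braking input from the human driver?

# Hypothetical record for illustration only
record = EventDataRecord(
    speed_mph=50.0,
    seat_belt_fastened=True,
    braking_applied=False,
    automation_engaged=True,
    automatic_braking_triggered=False,
    driver_input_detected=False,
)
print(record)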


For example, after the fatal May 2016 crash involving a Tesla Model S with "autopilot" (a Level 2 vehicle), the black box showed that autopilot was engaged, that the automatic emergency braking system did not provide any warning or braking, and that the driver took no braking, steering or other actions to avoid the collision.


NHTSA ultimately used this data to help reconstruct the accident and found the Tesla performed as designed, and both the vehicle and the driver failed to respond to a crossing tractor-trailer despite it being visible for seven seconds before impact. Note, however, that NHTSA's role is to determine whether there is a defect necessitating a recall and not to determine potential civil liability.


Cameras may make images possible

In addition to black boxes, some autonomous vehicles will use cameras as component parts. Thus, it is plausible that images may be available and efforts should be made to retrieve them.


After obtaining any available hard data, the autonomous vehicle's components should ideally be inspected for pre-existing damage or improper maintenance. Furthermore, the vehicle should be checked to ensure any software updates were downloaded. As an example of its importance, four months after the fatal Model S crash, Tesla wirelessly beamed a software update into its vehicles that created new safety measures.


In further determining potential liability, an area of contention to explore involves consumer expectations. (Photo: Shutterstock)


Who's at fault?

In further determining potential liability, an area of contention to explore involves consumer expectations. Among the public criticisms of the Model S is the word "autopilot," which could suggest a total cessation of control more suitable for at least a Level 3 vehicle. For example, following the fatal crash, Consumer Reports criticized the name "autopilot," saying it could give consumers a false sense of security.


The Model S does describe its limitations in the owner's manual. The "Driver Assistance" section includes 52 warnings and six cautions. Then, there is the catch-all: "Never depend on these components to keep you safe. It is the driver's responsibility to stay alert, drive safely, and be in control of the vehicle at all times."


In determining case value and exposure, it's important to realize that American attitudes have not yet caught up to manufacturer excitement. (Photo: Shutterstock)


Anticipating the inattentive driver

However, in its report on the Model S, NHTSA noted that realistically, drivers do not always read their manuals. The new NHTSA guidelines encourage manufacturers of Level 2 and 3 vehicles to keep the inattentive driver in mind.


Related: How driving habits are changing the auto insurance industry


Manufacturers are listening. The September 2016 Model S update created a "three strikes" feature: the vehicle determines whether the driver's hands are off the wheel, provides warnings, and after the third "strike," the autopilot system shuts off.
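
A minimal sketch of that behavior, assuming a simple counter, is below. This is not Tesla's implementation; the class name, thresholds and messages are invented for illustration only.

class HandsOffMonitor:
    """Illustrative 'three strikes' logic: warn on hands-off detections, then disable."""
    MAX_STRIKES = 3

    def __init__(self) -> None:
        self.strikes = 0
        self.autopilot_enabled = True

    def hands_off_detected(self) -> str:
        # Called each time the vehicle determines the driver's hands are off the wheel
        if not self.autopilot_enabled:
            return "Autopilot already disabled for this drive."
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            self.autopilot_enabled = False
            return "Third strike: autopilot shut off."
        return f"Warning {self.strikes} of {self.MAX_STRIKES}: keep hands on the wheel."

monitor = HandsOffMonitor()
for _ in range(4):
    print(monitor.hands_off_detected())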


Other manufacturers, like GM and Audi, are taking a different route and will be installing a camera near the rearview mirror that can monitor the driver's eyes and head to tell if the driver is being attentive.


Not all manufacturers are convinced those warning systems will work, however. Honda and Toyota have added radar, cameras and automatic braking, but they have purposefully avoided an autopilot-like system. Taking it a step further, Google long ago abandoned development of Level 2 vehicles after finding test drivers were not paying attention even when told to do so.


Programs to educate consumers?

To help ensure accurate consumer expectations, NHTSA recommends that manufacturers and dealers develop programs to educate consumers regarding the abilities and limits of their vehicles, and even suggests providing on-road, hands-on training. Thus, potentially liable parties and theories can expand from manufacturers to dealers and distributors based on arguably negligent warnings, education and training.


Finally, in determining case value and exposure, it is important to realize that American attitudes have not yet caught up to manufacturer excitement. An American Automobile Association survey released in March 2017 found 78 percent of Americans are afraid to ride in a self-driving vehicle. Perhaps movies like Terminator and The Matrix have conditioned us to mistrust machines with our safety.


Fears regarding autonomous vehicles are only heightened by media accounts sensationalizing the autonomous vehicle in an accident, whether it is the fatal Model S crash or the March 2017 viral photo of the Uber autonomous Volvo SUV on its side. In both cases, the autonomous vehicle was ultimately found not at fault by investigators. For the foreseeable future, however, potential jurors may be instinctively distrustful of autonomous vehicles.


New way of thinking about claims investigations

Claims investigations involving autonomous vehicles will simply require a new way of thinking. No longer will it necessarily be conclusive that the texting 18-year-old driver is at fault if it were reasonable for her to rely on her vehicle's autonomy. The autonomous vehicle revolution has begun. It's time to change the playbook.


Eric Ruben ([email protected]) focuses his practice on premises and products liability lawsuits and serves as regional counsel for various tire manufacturers. He represents a wide range of clients, including tire manufacturers, banks, telecommunications companies, insurance companies and individuals.
