(Bloomberg) – Telling Tesla drivers that its Autopilot feature doesn't mean their cars can drive themselves may not be enough to keep Elon Musk off the hot seat if the technology comes up short.

This month, two Teslas equipped with Autopilot veered into barriers following disclosure of the first fatal wreck, a Model S slamming into an 18-wheeler crossing a Florida highway after the semi-autonomous car failed to distinguish the truck's white trailer from the sky.

Tesla Motors Inc. warns drivers they must still pay attention and be ready to grab back control of the car, but there's a lot in a name.

“The moment I saw Tesla calling it Autopilot, I thought it was a bad move,” said Lynn Shumway, a lawyer who specializes in product liability cases against carmakers. “Just by the name, aren't you telling people not to pay attention?”

Joshua Brown's death in Florida was the first involving Tesla's semi-autonomous technology, triggering chatter in legal circles about who was liable for the crash and prompting a probe by the National Highway Traffic Safety Administration as well as the National Transportation Safety Board, which typically devotes its attention to mishaps involving planes and trains. Some details remain in dispute, including whether Brown, a former Navy SEAL, might have been watching a Harry Potter movie on a DVD player found in the car.

Musk had anticipated the moment for at least two years, telling drivers to keep their hands on the wheel because they will be accountable if cars on Autopilot crash. Tesla buyers must activate the Autopilot software, which requires them to acknowledge the technology is a beta platform and isn't meant to be used as a substitute for the driver.

Driver's responsibility

When U.S. investigators began evaluating Brown's crash, Tesla doubled down in a statement: “Autopilot is an assist feature. You need to maintain control and responsibility of your vehicle.”

But people will be people, and they often don't do what they're supposed to do.

Lawyers compare giving Tesla drivers Autopilot to building a swimming pool without a fence; the property owner should know that neighborhood kids will find it hard to resist and may get hurt.

“There's a concept in the legal profession called an attractive nuisance,” said Tab Turner, another lawyer specializing in auto-defect cases. “These devices are much that way right now. They're all trying to sell them as a wave of the future, but putting in fine print, 'Don't do anything but monitor it.' It's a dangerous concept.”

As with so-called smart features before it, such as anti-lock brakes and electronic stability control, telling drivers Autopilot might not prevent an accident won't help Tesla in court if the technology is found to be defective, Turner said.

“Warnings alone are never the answer to a design problem,” he said.

Possible arguments

In a court case, lawyers for accident victims or their families would have other lines of attack if Tesla blames accidents on drivers failing to heed warnings. They could assert that Tesla's software is defective because it doesn't do enough to make sure drivers are paying attention.

Attorneys could also argue that, in Brown's case for example, the car should have recognized the tractor-trailer as an obstacle, or that Tesla could have easily updated its system to address such a foreseeable problem.

“Any argument will try to establish that Tesla acted in an unreasonable way that was a cause of the crash,” said Bryant Walker Smith, a University of South Carolina law professor who researches automation and connectivity. “It doesn't even need to be the biggest cause, but just a cause.”

If Brown's May 7 crash doesn't end up in court, others might.

A 77-year-old driver from Michigan, which passed laws allowing semi-autonomous and fully autonomous vehicles, struck a concrete median in Pennsylvania and his 2016 Model X SUV rolled over. Also this month, a driver in Montana said his Tesla veered off the highway and into a guardrail. Both drivers said their cars were operating on Autopilot at the time, and both were cited for careless driving.

Pennsylvania and Montana are among the 42 states without legislation regulating autonomous and semi-autonomous cars. John Thune, chairman of the U.S. Senate Committee on Commerce, Science and Transportation, has asked Musk to brief the committee on details of Brown's crash, according to an e-mailed statement from the committee.

Musk response

Musk fired back in a tweet, saying the onboard vehicle logs show Autopilot was turned off in the Pennsylvania crash and that the accident wouldn't have happened if it had been on. The company said the Montana driver hadn't placed his hands on the wheel for more than two minutes while the car was on Autopilot.

Musk and Tesla are certain to argue that while their technology has yet to meet the threshold for “autonomous vehicle,” its Model S has achieved the best safety rating of any car ever tested.

Even with that record, Consumer Reports on Thursday called on Tesla to disable Autopilot on more than 70,000 vehicles. “By marketing their feature as 'Autopilot,' Tesla gives consumers a false sense of security,” Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports, said in the report, called “Tesla's Autopilot: Too Much Autonomy Too Soon.”

Tesla shares fell almost 1 percent Friday to $219.64, the lowest since Monday, and were trading at $220.07 at 2:20 p.m. in New York.

“Tesla is consistently introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance,” the carmaker said Thursday in a statement. “We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”

Khobi Brooklyn, a spokeswoman for the Palo Alto, California-based carmaker, cited the company's earlier comments on the three accidents and declined to comment further on possible litigation involving Autopilot.

National rules

The U.S. government will soon offer the auto industry guiding principles for safe operation of fully autonomous vehicles, part of a plan that includes $4 billion for safety research by 2026.

For now, the double line between Autopilot and full autonomy is a blurry one.

In 2013, NHTSA released a five-rung autonomous vehicle rating system based on cars' computerized capabilities, ranging from level 0 for “no automation” to level 4 for “full self-driving automation.”

Tesla's likely to argue its technology has yet to surpass level 2: automation designed to relieve the driver of at least two control functions. Plaintiffs will counter that the car has been marketed more like a level 3, in which the driver can fully cede control of all safety-critical functions while remaining available for occasional intervention.

“It's great technology, I hope they get this right and put people like us out of business,” says Steve Van Gaasbeck, an auto products lawyer in San Antonio, Texas. “There's really no excuse for missing an 18-wheeler.”

Copyright 2018 Bloomberg. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
