
Tesla Autopilot too much, too soon, Consumer Reports warns

(Photo: Shutterstock)

(Bloomberg) -- In an unusual move, Consumer Reports has labeled Tesla's Autopilot "Too Much Autonomy Too Soon" and called on the automaker to disable the hands-free feature until its safety can be improved. The system has come under increased scrutiny in the wake of a fatal May 7 crash in Florida, which U.S. safety regulators are investigating.

'Deeply concerned'

"By marketing their feature as 'Autopilot,' Tesla gives consumers a false sense of security," said Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports, in the article published Thursday. The article continues: 

"In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we're deeply concerned that consumers are being sold a pile of promises about unproven technology. 'Autopilot' can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver's hands are on the wheel."

More than 70,000 Tesla cars are equipped with Autopilot worldwide, and drivers must actively choose to engage the system. In a blog post about the Florida accident, Tesla stressed that those cars have driven more than 130 million miles while using the feature, giving Tesla valuable data in real-world driving conditions. Some customers see the system as an enormous asset during grueling commutes, but YouTube is also filled with videos of people taking risks with it. 

'Appreciate well-meaning advice'

"Tesla is consistently introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance," said Tesla in a statement Thursday. "We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media."

Adam Jonas, a Morgan Stanley analyst who has historically been bullish on the company, also chimed in Thursday. "The name 'Autopilot' could create a consumer expectation and a potential moral hazard," he said in a research note to clients. The NHTSA says that 94% of all crashes are caused by human choice or error.

The promise of semi-autonomous or fully self-driving cars has always been that they are safer. Road & Track recently published a piece of its own, called "Leave Tesla Alone," that argues that the safety record of Autopilot is "at least in the same ballpark with that of human driver[s] on the highway."

And don't expect Tesla Chief Executive Officer Elon Musk to heed the call for a name change. Tesla has always said that Autopilot "functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car."

Copyright 2017 Bloomberg. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
