The recent incident of a Tesla plowing into the back of a parked fire truck has sparked more than a little interest. Many of us have been looking forward to the ease of driving to and from work on autopilot, freeing up time to take care of the business at hand, or perhaps to catch up on some shuteye. We look forward to having our vehicles relieve us of the dangers and frustrations of staying attentive at the wheel. In fact, as of August 2017, Tesla was averaging more than 1,800 reservations each day for its most reasonably priced electric car, the Model 3, and the company expected to produce 10,000 vehicles per week by the end of 2018.
Since October of last year, autonomous trucks built and operated by Embark have been delivering refrigerators along the I-10 freeway, from a warehouse in El Paso, TX to a distribution center in Palm Springs, CA. While these trucks currently carry a human driver to monitor the computer chauffeur, the ultimate goal is to let the trucks drive solo. Embark chose the freeway for the trucks' robotic driving because it is the most predictable environment, with less potential for unexpected obstacles such as cyclists and pedestrians. A professional driver takes over and safely navigates the truck once it returns to the city, where there are more obstacles, more congestion, and more construction. Clearly, a human driver is still needed to take the wheel and avoid collisions with unexpected obstacles and pedestrians in the roadway.
Since the recent incident in which a Tesla Model S, purportedly driving in Autopilot mode, crashed into the back of a parked fire truck that was responding to an accident on I-405 in Culver City, California, we have come to understand that Autopilot is not what it seems to be. An attentive human driver was needed at the wheel to stop the Tesla before the crash, and Tesla has since released a statement that Autopilot is intended to be used only by an attentive driver.
However, Tesla's Autopilot website states, "All Tesla vehicles produced in our factory … have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver." The accompanying video narration states, "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself." The website claims "full self-driving capability" but qualifies it as follows: "…enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver." The phrase "almost all circumstances" is the critical wording here.
While the recent Tesla crash is unfortunate, so far this year there have been fewer than five crashes involving a Tesla, none of them fatal, whereas an estimated 3,000 deaths this year have resulted from traffic accidents involving human-controlled vehicles. The NHTSA published a report with data from the first six months of 2016 showing a 10.4 percent increase in traffic deaths, the largest jump since 1966, mostly due to distracted driving. A 2016 study conducted by KPMG and the Center for Automotive Research found that 93% of accidents were due to human error. In 2015, researchers estimated that by mid-century driverless cars could reduce traffic fatalities by up to 90%. Using 2013 as a baseline, that would save 29,447 lives a year. In the U.S., that equates to roughly 300,000 fatalities prevented over the course of a decade.
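The projection above can be checked with back-of-the-envelope arithmetic. The figures are the article's own; the 2013 baseline of annual fatalities is implied by them rather than stated:

```python
# Rough check of the fatality-reduction projection cited above.
# The 29,447 and 90% figures come from the article; the baseline is derived.
reduction = 0.90                 # projected fatality reduction from driverless cars
lives_saved_per_year = 29_447    # article's estimate, using 2013 as a baseline

# Annual fatality baseline implied by those two numbers
implied_2013_baseline = lives_saved_per_year / reduction
print(f"Implied 2013 baseline: {implied_2013_baseline:,.0f}")  # ≈ 32,719

# Fatalities prevented over a decade, which the article rounds to 300,000
decade_total = lives_saved_per_year * 10
print(f"Decade total: {decade_total:,}")  # 294,470
```

The derived baseline of roughly 32,700 deaths is consistent with the article's other figures, and ten years of savings rounds to the 300,000 the article cites.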
However, greater understanding is needed of the term "autopilot" as applied to vehicles. Merriam-Webster defines autopilot as 1) "a device for automatically steering ships, aircraft, and spacecraft," also "the automatic control provided by such a device," and 2) "automatic pilot." Automatic pilot, in turn, is defined as "a state or condition in which activity or behavior is regulated automatically in a predetermined or instinctive manner," as in "doing his job on automatic pilot."
By these definitions, and our common understanding of the term, we question the use of "Autopilot" to describe the operation of the Tesla at the time of the crash. What we now know is that a pickup truck suddenly swerved into the right lane to avoid hitting the fire truck, and with the Tesla traveling at 65 mph, there was no time for the driver to react.
Despite the website's claims, the Tesla owner's manual warns that the system is ill-equipped to handle this exact sort of situation: "Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead." What does this mean? Simply stated, a vehicle in Autopilot mode, or any car currently equipped with adaptive cruise control or automated emergency braking, may not brake to avoid hitting a stopped vehicle. For example, if the vehicle ahead of a car with adaptive cruise control changes lanes or turns off the road, and a stopped vehicle sits immediately beyond it, the adaptive cruise control may not detect the stopped vehicle. The system tracks only the moving vehicle in front; when that vehicle leaves the lane, the system sees the roadway as "clear" and may even accelerate toward the stopped vehicle to reach the set cruise speed. It is not yet known what types of stationary objects adaptive cruise control would ignore in the roadway: a downed pole, perhaps, or a child stopped on a bicycle? It would seem to be in the interest of public safety to reveal the types of objects the vehicle may not brake for, and to better explain the features of these systems.
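The failure mode described above can be sketched as simplified target-selection logic. All names and values here are hypothetical illustrations, not Tesla's actual implementation; real systems fuse radar and camera data and are far more complex:

```python
# Simplified sketch of the adaptive-cruise-control failure mode described
# above. Hypothetical names and values for illustration only.

SET_SPEED_MPH = 65  # driver-selected cruise speed

def acc_target_speed(lead_vehicle_speed_mph):
    """Return the speed the ACC tries to hold, given the tracked lead vehicle.

    lead_vehicle_speed_mph is None when no moving lead vehicle is tracked.
    Many ACC sensors filter out stationary returns at highway speed, so a
    stopped vehicle may never become the tracked target.
    """
    if lead_vehicle_speed_mph is None:
        # Lane reads as "clear": accelerate back toward the set speed,
        # even if a stationary object is actually ahead.
        return SET_SPEED_MPH
    # Otherwise, follow the slower of the lead vehicle and the set speed.
    return min(lead_vehicle_speed_mph, SET_SPEED_MPH)

# Lead pickup truck tracked at 55 mph: the ACC matches its speed.
print(acc_target_speed(55))    # 55

# The pickup swerves out of the lane; the stopped fire truck is filtered
# out as a stationary return, so the target is lost and the ACC speeds up.
print(acc_target_speed(None))  # 65
```

The point of the sketch is that the system's target speed depends entirely on a tracked *moving* vehicle; once that vehicle exits the lane, nothing in this logic represents the stationary obstacle at all.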
Suffice it to say, it is always important to read the vehicle owner's manual, but with newer vehicles equipped with an autopilot mode, adaptive cruise control, automated emergency braking, or similar systems, it is vital for the driver to fully understand their benefits and shortcomings before driving. Likewise, manufacturers need to be cautious when naming these systems. "Autopilot" gives the impression that the vehicle can run on its own without human assistance. If sued, would a court agree with a driver who claimed that, based on the name, he thought the vehicle could drive itself? The name does fit the common understanding of the term. We realize other issues would also be at play, but semantics matter. Otherwise, there could be more crashes of the type experienced with the Tesla and the fire truck, or perhaps worse. Insurers and agents may need to step in and educate their insureds as manufacturers continue to add more computerized driver-assist features to vehicles.

