(Bloomberg) -- The self-driving car, that cutting-edge creation that’s supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of those with human drivers.

The glitch?

They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well. As the accidents have piled up — all minor scrape-ups for now — the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?

“It’s a constant debate inside our group,” said Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh. “And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people.”

Last year, Rajkumar offered test drives to members of Congress in his lab’s self-driving Cadillac SRX sport utility vehicle. The Caddy performed perfectly, except when it had to merge onto I-395 South and swing across three lanes of traffic in 150 yards (137 meters) to head toward the Pentagon. The car’s cameras and laser sensors detected traffic in a 360-degree view but didn’t know how to trust that drivers would make room in the ceaseless flow, so the human minder had to take control to complete the maneuver.

“We end up being cautious,” Rajkumar said. “We don’t want to get into an accident because that would be front-page news. People expect more of autonomous cars.”

Not at fault

Turns out, though, their accident rates are twice as high as for regular cars, according to a study by the University of Michigan’s Transportation Research Institute in Ann Arbor, Michigan. Driverless vehicles have never been at fault, the study found: They’re usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.

Related: Robot cars get 32-acre Michigan city to hone driving skills

“It’s a dilemma that needs to be addressed,” Rajkumar said.

It’s similar to the thorny ethical issues driverless car creators are wrestling with over how to program them to make life-or-death decisions in an accident. For example, should an autonomous vehicle sacrifice its occupant by swerving off a cliff to avoid killing a school bus full of children?

California is urging caution in the deployment of driverless cars. It published proposed rules this week that would require a human always to be ready to take the wheel and also compel companies creating the cars to file monthly reports on their behavior. Google — which developed a model with no steering wheel or gas pedal — said it is “gravely disappointed” in the proposed rules, which could set the standard for autonomous-car regulations nationwide.

Fast track

Google is on a fast track. It plans to make its self-driving-cars unit a stand-alone business next year and eventually offer a ride-for-hire service, according to a person briefed on the company’s strategy.

Google cars have been in 17 minor crashes in 2 million miles (3.2 million kilometers) of testing and account for most of the reported accidents, according to the Michigan study. That’s partly because the company is testing mainly in California, where accidents involving driverless cars must be reported.

Related: Driver-assist systems seen saving about 10,000 U.S. lives a year

The most recent reported incident was Nov. 2 in Mountain View, California, home of Google’s headquarters, when a self-driving Google Lexus SUV attempted to turn right on a red light. It came to a full stop, activated its turn signal and began creeping slowly into the intersection to get a better look, according to a report the company posted online. Another car stopped behind it and also began rolling forward, rear-ending the SUV at 4 mph. There were no injuries and only minor damage to both vehicles.

Robot-car stop

Ten days later, a Mountain View motorcycle cop noticed traffic stacking up behind a Google car going 24 miles an hour in a busy 35 mph zone. He zoomed over and became the first officer to stop a robot car. He didn’t issue a ticket — who would he give it to? — but he warned the two engineers on board about creating a hazard.

“The right thing would have been for this car to pull over, let the traffic go and then pull back on the roadway,” said Sergeant Saul Jaeger, head of the police department’s traffic-enforcement unit. “I like it when people err on the side of caution. But can something be too cautious? Yeah.”

While Google rejects the notion that its careful cars cause crashes, “we err on the conservative side,” said Dmitri Dolgov, principal engineer of the program. “They’re a little bit like a cautious student driver or a grandma.”

More aggressive

Google is working to make the vehicles more “aggressive” like humans — law-abiding, safe humans — so they “can naturally fit into the traffic flow, and other people understand what we’re doing and why we’re doing it,” Dolgov said. “Driving is a social game.”

Google has already programmed its cars to behave in more familiar ways, such as inching forward at a four-way stop to signal they’re going next. But autonomous models still surprise human drivers with their quick reflexes, coming to an abrupt halt, for example, when they sense a pedestrian near the edge of a sidewalk who might step into traffic.

Related: Self-driving cars: Who's liable when software is at the wheel?

“These vehicles are either stopping in a situation or slowing down when a human driver might not,” said Brandon Schoettle, co-author of the Michigan study. “They’re a little faster to react, taking drivers behind them off guard.”

That could account for the prevalence of slow-speed, rear-end crashes, he added.

Behave differently

“They do behave differently,” said Egil Juliussen, senior director at consultant IHS Technology and author of a study on how Google leads development of autonomous technology. “It’s a problem that I’m sure Google is working on, but how to solve it is not clear.”

One approach is to teach the vehicles when it’s OK to break the rules, such as crossing a double yellow line to avoid a bicyclist or road workers.

“It’s a sticky area,” Schoettle said. “If you program them to not follow the law, how much do you let them break the law?”

Initially, crashes may rise as more robot autos share the road, but injuries should diminish because most accidents will be minor, Schoettle said.

“There’s a learning curve for everybody,” said Jaeger, of the Mountain View Police, which interacts more with driverless cars than any other law-enforcement unit. “Computers are learning, the programmers are learning and the people are learning to get used to these things.”
