(Bloomberg) -- The self-driving car, that cutting-edge creation that's supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of those with human drivers.

The glitch?

They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well. As the accidents have piled up — all minor scrape-ups for now — the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?

© 2024 ALM Global, LLC, All Rights Reserved.