Only in the twisted logic of a through-the-windshield view of the world could Keith Naughton of Bloomberg frame the story of human drivers rear-ending self-driving cars — because the robot cars follow the rules of the road — as a bug, not a feature. A glitch.
They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well.
The crashes are mostly minor fender-benders, caused by aggressive drivers who are not expecting someone to actually stop at a red light before making a turn. In other cases, the self-driving cars are simply reacting to situations more quickly than humans can. But Google is working on the problem:
Google has already programmed its cars to behave in more familiar ways, such as inching forward at a four-way stop to signal they’re going next. But autonomous models still surprise human drivers with their quick reflexes, coming to an abrupt halt, for example, when they sense a pedestrian near the edge of a sidewalk who might step into traffic.
What a surprise that would be for a human driver: a car actually taking care not to hit a pedestrian. You can't have that! One consultant notes, “It’s a problem that I’m sure Google is working on, but how to solve it is not clear.”
It is totally clear how to solve this problem: Enforce speed limits and the other rules meant to control cars. Fine the crap out of people who follow too closely and rear-end the Google cars. Then there wouldn't be a problem.
If this is the tragic flaw with self-driving cars, then bring them on.