As readers of my Observer column know, I regard the Google self-driving car project as very significant, for two reasons. One is that it signals a need to re-examine our assumptions about what machines can and cannot do. (I had hitherto assumed that driving was a task that only a human could do with reasonable safety.) The other is that the technology could have a devastating (and as yet undiscussed) impact on employment. (Millions of people earn their living from driving; and in many cultures it’s a route to first employment for immigrants — cf. New York taxi-drivers.)
This Business Insider piece is useful not because it undermines that logic, but because it puts the astonishing success of the technology into perspective by highlighting the circumstances in which self-driving cars can run into difficulties.
The first challenge is driving in snow.
When snow is on the road, the cars often have a tough time “seeing” the lane markers and other cues that they use to stay correctly positioned on the road. It will be interesting to see how the Google team sorts that one out. [Yes, but human drivers have the same problems, as I know from my own experience driving on East Anglian roads in a blizzard.]
A second challenge, apparently, is when the car encounters a change in a road that is not yet reflected in its onboard “map.” In those situations, the car can presumably get lost, just the way a human can. [In this case a human copes better — I know because I have an outdated SatNav map which sometimes has me driving through open fields on new motorway sections.]
A third challenge is driving through construction zones, accident zones, or other situations in which a human is directing traffic with hand signals. The cars are excellent at observing stop signs, traffic lights, speed limits, the behavior of other cars, and other common cues that human drivers use to figure out how fast to go and where and when to turn. But when a human is directing traffic with hand signals, and especially when those hand signals conflict with a traffic light or stop sign, the cars get confused.
(Imagine pulling up to an intersection in which a police officer is temporarily directing traffic and overriding a traffic light. What should the car pay attention to? How should the car be “taught” to give the police officer’s hand signals more weight than the traffic light? How should the car interpret the hand signals, which are often different from person to person? And what if the cop is just pointing at you and yelling, which happens frequently in intersections in New York?)
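The priority question posed in that parenthesis can be made concrete with a toy sketch. To be clear, this is purely illustrative and has nothing to do with Google's actual software: all the names are hypothetical, and a real system would be fusing noisy perception outputs, not clean flags like these. But it shows the shape of the rule the engineers have to encode: a recognised human directing traffic must outrank fixed signals, which in turn outrank default road rules.

```python
# Toy illustration only (not any real self-driving stack): one naive way
# to encode the priority ordering discussed above.

def controlling_signal(traffic_light=None, stop_sign=False, human_director=None):
    """Return which cue should govern the car's behaviour.

    Parameter names are hypothetical. The genuinely hard problems --
    recognising that a person is directing traffic at all, and
    interpreting idiosyncratic gestures -- are simply assumed solved
    here and handed in as a clean value.
    """
    if human_director is not None:
        # A human directing traffic overrides everything else,
        # including a conflicting traffic light.
        return ("human", human_director)
    if traffic_light is not None:
        return ("light", traffic_light)
    if stop_sign:
        return ("sign", "stop")
    # No explicit cue: fall back to default road rules.
    return ("default", "proceed-with-care")
```

Even in this cartoon form, the article's point stands: the first branch is trivial to write and enormously difficult to feed reliably, since the gestures differ from person to person (and sometimes amount to pointing and yelling).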
According to an engineer (not a Googler) who was involved in the conversation I had about this latter challenge, none of these problems are insurmountable. But they’re certainly interesting.
One of the other interesting points made in the article is that insurance premiums might one day be higher for human-driven vehicles, because they will be, statistically, less ‘safe’.