How many times have you encountered road conditions that didn’t match what Google Maps told you? Plenty of people have wrecked after tossing out their common sense in favor of depending solely upon GPS and Google Maps. There’s quite a bit to be excited about when it comes to Google’s self-driving car project, but many people have no clue that those autonomous cars can’t drive in 99% of the U.S. Part of that is because Google’s self-driving cars can’t be taken out in snow or in heavy rain. Another very real limitation comes into play if the self-driving car encounters something different from what the map says, such as unmapped traffic lights or stop signs.
*Please see update at the end of this article.*
Many people have heard that Google's autonomous cars can “drive anywhere a car can legally drive.” But that’s only true in some cases. As Technology Review pointed out, Google’s self-driving cars can “see” moving objects in real time, but only recognize stationary objects like traffic lights if they’re on a pre-made map. “If it encountered an unmapped traffic light, and there were no cars or pedestrians around, the car could run a red light simply because it wouldn't know the light is there.” It also would plow right into a huge pothole or over an open manhole; if there was a cop directing traffic around an accident, it wouldn't recognize that either.
Certain aspects of the car’s design do not seem to be widely appreciated. For example, Bernard Soriano, the California DMV official responsible for autonomous vehicles in the state, was unaware that the car couldn't handle unmapped intersection stop signs, despite numerous briefings from Google.
Google is reluctant to talk about how dependent its cars are on maps and how dangerous mistakes on maps could be. But the “public has a right to be concerned” about Google’s refusal to provide that sort of safety-related information.
The Google self-driving car project recently reported that the team “has ‘driven’ over 30,000 miles in our self-driving vehicles – that’s the equivalent of roughly 10 trips across the United States!” That sounds great, but in reality, Google’s self-driving cars have driven about 700,000 miles on real roads as opposed to the “matrix.”
Ron Medford, safety director for Google’s self-driving car program, claims its self-driving cars have driven more than 4 million miles within the company’s “Matrix-style” computer simulation of California roads and that type of virtual simulation is “more valuable” than real driving as it allows “manufacturers to test their software under far more conditions and stresses than could possibly be achieved on a test track.”
California, however, ruled (pdf) that Google’s “just press go” self-driving prototype car isn't going to cut it as self-driving cars must come equipped with a steering wheel and brake and accelerator pedals so a driver would be able to take “immediate physical control” of the car on public roads if needed. The rules go into effect on September 16, meaning Google’s prototype cars with no manual controls will be banned on California roads. Google said it will add “temporary manual controls” for driving tests on private roads.
Did you know that Google’s self-driving cars know “almost nothing about parking?” Four years ago, during an extreme autonomous driving test, researchers took what the autonomous car could accurately do, such as “forward driving at speeds up to 70mph, and in reverse at speeds up to 30mph," and applied that to autonomous sliding, which is attempting to slide the autonomous car into a parking slot. They managed to place “the car within about two feet of the desired location,” adding “to the best of our knowledge, this represents the state of the art in terms of accurately controlling a vehicle in such a maneuver.” Yet the self-driving cars still don’t know much about normal parking in “big, open parking lots or multilevel garages.” Instead, the driverless car would drop you off at your destination and drive away.
Other things Google’s self-driving cars can’t do: they can’t avoid “creaming squirrels,” as animals of that size are still too small for the sensors to detect, and they can’t go “off the grid” where there is no cell signal and therefore no access to the all-important Google maps.
Google said it put safety first, as its self-driving cars “have sensors that remove blind spots, and they can detect objects out to a distance of more than two football fields in all directions, which is especially helpful on busy streets with lots of intersections.” About 100 of Google’s prototype autonomous cars have a capped “speed of 25 mph, to make them easier to handle and limit damage if there’s an accident.”
There’s a new patent, however, to detect road rage in other drivers. It’s called a “method to detect nearby aggressive drivers and adjust driving modes.” Dmitri Dolgov, lead software engineer for Google’s self-driving car project, told Reuters the autonomous cars can go 10 mph over the speed limit “when traffic conditions warrant.”
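Dolgov’s “10 mph over, when traffic conditions warrant” policy can be read as a simple speed-selection rule: hold the posted limit, follow faster surrounding traffic, but never exceed a hard cap. Here’s a rough sketch of that idea in Python; the function name, inputs, and thresholds are invented for illustration, since Google has not published its actual logic:

```python
# Hypothetical sketch of a "keep up with traffic" speed policy:
# drive at the posted limit, follow faster surrounding traffic,
# but never exceed a hard cap of limit + 10 mph.

def target_speed(speed_limit, surrounding_traffic_speed):
    """Pick a cruise speed: at least the posted limit when traffic
    is slower, matching traffic when it's faster, capped at +10 mph."""
    cap = speed_limit + 10
    return min(max(speed_limit, surrounding_traffic_speed), cap)

print(target_speed(55, 50))  # 55 - traffic is slow, hold the limit
print(target_speed(55, 62))  # 62 - match faster surrounding traffic
print(target_speed(55, 70))  # 65 - capped at limit + 10
```

The design choice worth noting is the cap: the car will deliberately exceed the posted limit to avoid being a slow-moving hazard, which is exactly the “car’s decision to break the law” issue raised below.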
That is the car’s decision to “break the law” as opposed to the driver’s decision to speed, which happens to be something tracked by car manufacturers. The idea that your car might narc on your speeding habits had enough people worried that Ford Global Vice President of Marketing Jim Farley retracted his statement made at CES. He originally said, “We know everyone who breaks the law, we know when you’re doing it. We have GPS in your car, so we know what you’re doing.” His point was supposed to be how Ford knows if drivers are speeding, but doesn’t pass along that knowledge to law enforcement or auto insurance companies.
Autonomous cars are coming and for many people with disabilities that stop them from driving, it will be a terrific quality of life improvement. It’s just not as close as some folks believe. There’s plenty of competition in the autonomous car market. Some folks are trying to decide “if your robot driver should kill you to save a child.”
Meanwhile Israeli software and microchip company Mobileye has outfitted over “100 vehicle models and 3.3 million actual cars” with technology capable of “detecting pedestrians, animals, debris, lanes, street signs, traffic lights.” According to Morgan Stanley, “Mobileye’s secret sauce is its proprietary software algorithm that interprets the video feed from a front mounted camera to analyze the road ahead of the car and interpret the environment around the car.” RBC Capital Markets forecasts that 80% of cars in Europe and 55% in North America will be fitted with such camera-based features by the end of this decade.
Keep an eye on the UK, as Britain will allow driverless cars on public roads starting in January 2015.
*Update* sent by email from Kelly Mason, Global Communications & Public Affairs, Google:
The critical piece of misinformation that was reported was that the cars cannot detect "unmapped traffic lights or stop signs." This is not true.
The fact is that the car operates by combining mapped information with information collected in real time from the car's sensors as it drives down the street. The combination of these two sources of information is what allows the car to drive safely -- if there is an unmapped stop sign or traffic light, the sensors will identify it.
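Google’s description amounts to a simple fusion rule: a stop feature matters if either the prebuilt map or the live sensor feed reports it. A minimal sketch of that idea, with all names hypothetical and no resemblance claimed to Google’s actual software:

```python
# Hypothetical illustration of map/sensor fusion for stop features.
# A feature (stop sign, red light) triggers a stop if EITHER the
# prebuilt map OR the live sensor feed reports it at this location.

STOP_FEATURES = {"stop_sign", "red_light"}

def must_stop(mapped_features, sensed_features):
    """Return True if any mapped or newly sensed feature demands a stop."""
    return bool(STOP_FEATURES & (set(mapped_features) | set(sensed_features)))

# Per Google's account, an unmapped stop sign is still honored
# as long as the sensors detect it:
print(must_stop(mapped_features=[], sensed_features=["stop_sign"]))  # True
# The failure mode critics describe: neither source reports the light.
print(must_stop(mapped_features=[], sensed_features=[]))             # False
```

The dispute in this article boils down to the second case: whether the sensors can be counted on to supply the feature when the map doesn’t.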
*Yet another update!*
On the backend, this article has caused a regular pooh storm and a flurry of emails. Google is unhappy. MIT is not particularly thrilled. Google wants you to know its cars would “identify” an unmapped traffic light or stop sign, yet the fact is that MIT Technology Review’s Lee Gomes has documented email proof that Chris Urmson, director of the car project, said a car could run a red light. Here is that quote:
Yes, if a traffic light were teleported into existence somewhere in the map, and the car didn't know about it and there was no other traffic causing it to slow or stop, it could potentially run a red light. In practice this is very unlikely, as adding a traffic light takes time and construction and that this construction would have been detected by other cars, or from announcements made, and the traffic lights would be mapped when they came on line.
So Google’s self-driving cars might identify the light, but the real question, as Gomes pointed out, is this: the car may see it, but will it obey whatever the light or sign says?