IN the United States, a person cannot legally drive until he or she has passed a licensing test. Shouldn’t self-driving vehicles have to pass a test as well?
Of course they should. Self-driving technology represents a radical shift in the way we travel, but it is still very new and not yet proven safe. We should at least apply the same standards to a computer system that we apply to a 16-year-old who is behind the wheel for the first time.
The issue has become more urgent since the recent fatal crash of a Tesla vehicle in self-driving mode. Joshua Brown, 40, was in his Tesla Model S, an electric sedan, using the autopilot option (called a “public beta release” by the company) when a tractor-trailer turned in front of him. The autopilot system did not apply the brakes, and Mr. Brown was killed in the collision.
Self-driving cars (or vehicles in a self-driving mode like the Tesla) aren’t required to pass any special tests before they go on the roads. Like any other new automobile feature, autonomous driving is tested by the manufacturer, which gives assurances that the technology meets applicable laws and safety standards. The Department of Transportation gets involved only when a problem is detected.
From the initial information about the crash, the system apparently failed in pattern recognition, which, at its most basic, distinguishes between visually ambiguous objects. In a press release, Tesla said, “Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.”
Humans excel at pattern recognition, but computer vision can have trouble with it, as we recently outlined in a white paper issued by Sustainable Worldwide Transportation, a research consortium at the University of Michigan. This is one reason many secure websites use Captchas — those pattern-recognition tests typically consisting of oddly shaped numbers and letters — to prevent computers (“bots”) from automatically registering on websites, spamming comment sections or joining email lists.
Other pattern-recognition challenges are known to cause difficulties for self-driving systems, or are difficult (but doable) for human drivers and have not yet been solved for autonomous systems. They include detecting downed power lines (hard to detect at all, let alone interpret), flooded roads (water may appear to be the road surface), large potholes and road debris (some potentially damaging, some not), temporary traffic-control devices (unexpected but critical to respond to appropriately), fire and smoke on or near the road (may not be detected by all required sensors) and other unusual hazards.
Pattern recognition is not the only aspect of self-driving vehicle performance that needs testing. We should also assess each vehicle’s sensing hardware, software and decision algorithms in situations ranging from the mundane — driving on a sunny day in little traffic — to those that pose potential problems for human and computer drivers alike, like driving at night in rain with glare reflected from wet roads.
And other issues need testing, too. First, the cars’ visual and sensing abilities in bad weather are still subpar; manufacturers admit that rain and especially snow pose major obstacles. Second, self-driving vehicles have not been tested thoroughly under a variety of demanding conditions, such as driving at night or on poorly marked or unmapped roads. Third, self-driving vehicles will face the occasional ethical dilemma in their decision making: If a crash is unavoidable, should the car hit the other vehicle or the pedestrian nearby?
Given the potential safety implications, we need standardized, comprehensive tests of driverless technology.
We should establish a graduated license system similar to that in place for human drivers, in which the license for a class of self-driving vehicle is limited to the situations that it can safely negotiate. A full license would be granted only when the vehicles pass an unrestricted test.
Self-driving vehicles are not specifically regulated on a federal level, although the National Highway Traffic Safety Administration is developing guidelines. (Eight states and the District of Columbia have basic legislation to allow driving or testing on their roads.) Since the authority to set vehicle-safety standards for vehicle manufacturers rests with the federal government, the highway administration should take the lead in defining the specific requirements for the licensing test.
We have tests for driver’s licenses because people differ in their skills and abilities. Systems for self-driving vehicles are no different in this respect: When we share the road, we need to know who, or what, is behind the wheel.
FIND THE ORIGINAL HERE: http://www.nytimes.com/2016/07/07/opinion/a-16-year-old-needs-a-license-shouldnt-a-self-driving-car.html?_r=0