It’s time to curb the enthusiasm around self-driving cars.
After an Uber test vehicle operating in autonomous mode struck and killed 49-year-old Elaine Herzberg in Tempe, Ariz., on the night of March 18, manufacturers, suppliers and regulators have been forced to re-evaluate the development of self-driving technology. To make these vehicles and their testing programs safe, companies need to recognize their limits and respond accordingly, experts say.
“This incident was uncomfortably soon in the history of automated driving,” wrote Bryant Walker Smith, a University of South Carolina law professor who specializes in self-driving vehicles, in response to the crash. “Automated driving is a challenging work in progress that may never be perfected.”
While core autonomous vehicle technology, such as cameras, radar, lidar and decision-making algorithms, has become much more sophisticated, test vehicles still struggle with basic maneuvers that human drivers would easily handle. Often these situations involve predicting and responding to the actions of human-driven vehicles and pedestrians, meaning human lives are at stake.
“Most [self-driving] cars just safely follow the rules,” said Alexander Mankowsky, an expert in future R&D at Daimler.
Cars need to learn how to interpret situations where they need to cooperate with other drivers and pedestrians, Mankowsky said. Otherwise, they will never reach 100 percent effectiveness.
The industry kicked off 2018 at CES by touting the near-term arrival of self-driving cars. But some companies have pulled back now that the worst-case scenario has been realized.
Uber suspended its autonomous vehicle pilot programs in Arizona, California, Pittsburgh and Toronto after the fatality. Toyota Motor Corp. stopped its supervised self-driving tests, and Aptiv’s autonomous driving subsidiary, NuTonomy, suspended its pilot in Boston following demands from local regulators. Hyundai told Reuters it is now “cautious about mass producing self-driving cars.”
These measured responses, along with potential shifts in regulatory and consumer sentiment, are likely to affect timelines for deploying driverless vehicles within the next three years.
“What remains clear is that the corner cases will take a long time to solve,” wrote Barclays analyst Brian Johnson in a note to investors.
As manufacturers sober up after the Uber fatality, here are the questions and concerns they face in making these cars a safe reality.
Chaotic world
Compared with controlled environments, self-driving cars have a harder time navigating streets in busy city centers, where a multitude of pedestrians, human-driven cars, light-rail trains, cyclists and other actors complicate their decisions.
“Mobility in cities is very dense,” Daimler’s Mankowsky said. “You are never alone.”
In the Uber crash, Tempe police said, video showed Herzberg stepping unexpectedly in front of the vehicle. Police said the vehicle did not show any signs of slowing down.
Mankowsky has been tasked with researching how autonomous vehicles can successfully communicate their intentions and actions to pedestrians and other transport modes around them, allowing individuals to properly engage with the vehicle. Daimler is running a trial using external light and sound configurations to send messages to pedestrians.
“All the social interactions are not researched,” Mankowsky said. “Designwise, we need to enable agency in others.”
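Daimler has not published the details of its light-and-sound trial, but the basic design problem, mapping each driving intention to an unambiguous external cue, can be sketched. Everything below is hypothetical: the intent states, cue names and mapping are illustrative, not Daimler's actual scheme.

```python
from enum import Enum

class Intent(Enum):
    """Driving intentions a car might need to communicate to pedestrians."""
    CRUISING = "cruising"
    YIELDING = "yielding"    # slowing to let someone cross
    WAITING = "waiting"      # stopped, inviting others to proceed
    RESUMING = "resuming"    # about to pull away

# Hypothetical mapping of intents to external light/sound cues, in the
# spirit of Daimler's trial; the real cues are not public.
SIGNALS = {
    Intent.CRUISING: {"light": "steady", "sound": None},
    Intent.YIELDING: {"light": "slow_pulse", "sound": None},
    Intent.WAITING:  {"light": "sweep_toward_crossing", "sound": "soft_chime"},
    Intent.RESUMING: {"light": "fast_pulse", "sound": "rising_tone"},
}

def external_cue(intent: Intent) -> dict:
    """Look up the cue the car should display for a given intent."""
    return SIGNALS[intent]

print(external_cue(Intent.WAITING)["light"])  # sweep_toward_crossing
```

The point of such a table is exactly Mankowsky's: each state must give the pedestrian enough information to act, so every intent needs a distinct, legible cue.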
Reaching the rider
Companies such as Uber, Waymo and General Motors’ Cruise Automation are targeting autonomous ride-hailing fleets, sometimes dubbed robotaxis, as the first commercial application of self-driving technology. But these cars could struggle to replicate today’s customer experience. For instance, Uber and Lyft customers in out-of-the-way locations can call their drivers and direct them verbally, often asking the driver to find them in a crowd of people or down a side road.
“Humans do a lot to get to the passenger,” said Molly Nix, head of autonomous vehicle user experience for Uber’s Advanced Technology Group. “There’s a crazy amount of complexity with how humans navigate these situations.”
Nix works to understand how Uber customers use a self-driving ride-hailing service, from how the app works to what the in-car experience will be like.
“We have to get people to shift over behaviors,” Nix said. “We’ve got a lot of short-term problems to solve.”
Hills
Full self-driving capability can be an uphill battle, literally. To safely navigate hills, cars need to sense vertically as well as they do laterally, a task current sensors struggle to handle.
“If you’re approaching a hill, you’re basically looking into the road,” said Kay Stepper, head of driver assistance and automated driving at Robert Bosch. “On the opposite end, if you’re going downwards, you’re looking into air.”
He added that a sensor’s placement, whether behind the windshield, within the fascia or on the side of the vehicle, can limit its ability to accurately detect objects when a car is going up or down a slope.
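Stepper's point can be illustrated with simple geometry. In this rough sketch (all numbers hypothetical), a sensor mounted at some height, pitched near level, approaches an upgrade that begins a fixed distance ahead; its boresight ray meets the rising pavement at a finite range, so the sensor is literally "looking into the road":

```python
import math

def sight_limit_m(sensor_height_m, dist_to_grade_m, grade, pitch_rad=0.0):
    """Range at which a sensor's boresight ray meets an upgrade's surface.

    The road rises at slope `grade` starting `dist_to_grade_m` ahead; the
    ray leaves the sensor at `sensor_height_m` with pitch `pitch_rad`
    (0 = level). Returns math.inf if the ray never meets the road,
    e.g. on a downgrade, where the ray "looks into air" instead.
    """
    t = math.tan(pitch_rad)
    if t >= grade:  # ray climbs at least as fast as the road rises
        return math.inf
    return (sensor_height_m + grade * dist_to_grade_m) / (grade - t)

# A bumper-height sensor (0.3 m) looking level at a 6 percent grade
# starting 20 m ahead hits pavement only about 25 m out, far short of
# the range the same sensor would cover on flat ground.
print(sight_limit_m(0.3, 20.0, 0.06))
```

The same formula shows why a negative grade is the opposite failure: the ray never intersects the road, so anything low on the downslope sits below the sensor's line of sight.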
Current autonomous vehicle pilot programs suggest this is an obstacle for most manufacturers. Waymo is testing its Chrysler Pacifica minivans in Mountain View, Calif., Atlanta, Phoenix, Austin, Texas, and Michigan — all relatively flat areas. Ford Motor Co. is testing its self-driving delivery service in Ann Arbor, Mich., and Miami.
Cruise is testing in hilly San Francisco, though locations of its crash reports suggest the vehicles are staying primarily in flat neighborhoods.
Regulatory unknowns
How these vehicles will be regulated may become one of the biggest unknowns after Herzberg’s death. And without a solid legal framework for the future, it is nearly impossible to plan vehicles and services for the long term with any assurance they will remain compliant.
“How safe is safe enough?” said Jim Adler, managing director of Toyota AI Ventures. “It’s not even a technology question or a business question, it’s a policy question.”
The Department of Transportation has released guidelines for vehicle manufacturers, though following them is voluntary.
The federal government is also considering legislation, known as the AV START bill, that would allow manufacturers to bypass federal safety standards to deploy tens of thousands of autonomous vehicles. The bill has been facing opposition from Democratic senators, who reiterated their positions after the Uber crash.
“Congress must take concrete steps to strengthen the AV START Act with the kind of safeguards that will prevent future fatalities,” said Sen. Richard Blumenthal, D-Conn. “In our haste to enable innovation, we cannot forget basic safety.”
The National Transportation Safety Board and NHTSA are investigating the crash, and their findings may influence legislation.
However, to be effective and facilitate development, regulations need to be developed alongside the technology, Adler said, not after it.
“It’s important that we have these conversations simultaneously, not serially,” he said. “Then you can get ahead of a lot of these issues.”