Last August, I was a passenger in an autonomous Uber in downtown Pittsburgh in the same type of Volvo XC90 crossover with the same hardware as in the one that struck and killed a pedestrian in Tempe, Arizona, on Sunday night.

As the Volvo I was in headed down a one-way street at 25 miles per hour, two women abruptly stepped off the curb in front of the vehicle. The Volvo detected them and slowed, letting the women – who seemed oblivious to traffic – pass.

As experts try to learn why Elaine Herzberg, 49, was hit as she walked across a road, questions abound. A safety driver was behind the steering wheel, but video released Wednesday by police shows the driver looking down before the crash.

The initial focus is on why the lidar-equipped car never reacted to the pedestrian – and whether one Uber engineer at the wheel was enough supervision when many companies use two.

The Tempe incident shows the Volvo flunked Autonomous 101. Indeed, in a Google car ride I took at the company’s Mountain View headquarters in 2015, Google deliberately sent a bicyclist in front of my car. The lidar-equipped, no-steering-wheel auto-bot slowed and avoided the cyclist without incident. Earlier that year, on a public street in a Lexus SUV Google test vehicle, a similar scenario occurred with a pair of jaywalkers.

Uber and Google point to the aptitude of lidar-based systems (short for “light detection and ranging,” the units bounce lasers off their surroundings in all directions) as proof that these vehicles are ready for public roads. Lidar is superior to the camera and radar systems found on the semi-autonomous showroom cars I’ve driven like the Tesla Model S and Cadillac CT6 with Super Cruise. Experts say lidar’s detection ability is actually enhanced at night – when the Tempe incident occurred – because there is no glare from the sun.

Yet the video from the Tempe Uber XC90’s dash cam shows that the vehicle not only fails to slow down for the pedestrian but seems totally unaware of her presence. The self-driving car doesn’t appear to brake or swerve before impact. Even without expensive lidar systems, many modern cars use simpler radars and cameras to brake for pedestrians and other obstacles. Uber’s Volvo XC90 was equipped with all three types of hardware in addition to an on-board computer that contained mapping – a so-called “geo-fence” – of the Uber’s route.

If lidar has a weakness, says Varudevan, it is rain or dense fog – elements that occur infrequently in Arizona. “It’s why the companies like to test there, because they are ‘weather-fenced’ as well as geo-fenced.”

The weather was clear on the night of the Tempe incident.

At root, autonomous cars are computer-based systems that engineers put through rigorous “stress testing” so that autonomous functions are repeatable. It’s why engineers are determined to get a report on Uber’s incident: Did a software algorithm misinterpret the situation? Did the lidar fail?

Experts are also worried about the need for better redundant systems – including the human safety drivers behind the wheel, who must be ready to take over.

Notably, there was only one person in the Tempe vehicle. My Pittsburgh test had two humans – one “operator” (Uber-speak for on-board engineer) ready to take over the wheel, the other monitoring on-board systems via a laptop and center screen. In the dashcam video released by the authorities after the Tempe incident, the single operator, Rafaela Vasquez, seems to be fiddling with the screen, distracted from the road until the moment of impact.

Through a spokesperson, Uber declined to comment on operator guidelines, saying “our cars remain grounded, and we’re assisting local, state and federal authorities in any way we can.”

Teams from the National Transportation Safety Board and National Highway Traffic Safety Administration are investigating the crash.

Kelley Blue Book auto analyst Karl Brauer notes that Waymo (Google’s autonomous testing arm) also tests in Arizona, but without anyone at the wheel (an engineer rides in the back seat of its Chrysler Pacifica minivans). Other companies are testing the idea of “teleoperation” in which driverless vehicles are monitored remotely should they get into intractable situations.

The Tempe incident raises a question: Should autonomous autos be held to a higher standard than human drivers?

Brauer says the woman would likely have been struck by a human driver as well, given that she was crossing a dark four-lane road at night.

“We know that autonomous cars can see their surroundings better than humans,” he says. “So do we hold autonomy culpable for their theoretical capabilities? Do we hold them accountable to a higher standard?”

Whichever the case, self-driving cars must be able to deal with unusual situations like the one in Tempe. “Otherwise,” says Brauer, “what’s the point?”