The safety driver in a self-driving Uber was not being very safe — that is, not paying attention — when the vehicle, operating in autonomous mode, struck and killed a woman in Tempe, Arizona, earlier this year, police records show.

Included in a massive Tempe Police Department report released this week were details about the March 18 fatal crash. The 318-page report found that Rafaela Vasquez, the 44-year-old driver, was frequently looking down, even smiling and laughing, at what appeared to be a cellphone streaming an episode of the talent show The Voice.

In the moments before the test vehicle hit 49-year-old Elaine Herzberg, who was walking her bicycle across a Tempe, Arizona, road, the test driver, Vasquez, was apparently streaming the TV show through Hulu. A video of the moments before the crash shows Vasquez looking toward her right knee while occasionally looking up and around.

Police seized Vasquez’s two LG cellphones and found that YouTube, Netflix, and Hulu had been used on them. After search warrants were served, Hulu confirmed that the phone and account linked to Vasquez were streaming an episode of the show up until the time of the crash.

Last month, the National Transportation Safety Board’s preliminary report noted that the test driver said she’d been looking at the self-driving system interface before the crash. The NTSB also found that the vehicle didn’t alert the driver about the pedestrian, even after sensors detected Herzberg six seconds before impact. The car didn’t automatically brake or slow down after detecting the pedestrian. Vasquez manually hit the brakes after hitting the woman. 

With this week’s release of police data, Uber responded with a statement about its own internal review and its strict rules about phone usage during testing. Uber suspended its self-driving program after the fatality and shut down its Arizona operation. Autonomous testing is expected to resume in other cities in the coming months.

“We continue to cooperate fully with ongoing investigations while conducting our own internal safety review. We have a strict policy prohibiting mobile device usage for anyone operating our self-driving vehicles. We plan to share more on the changes we’ll make to our program soon,” an Uber spokesperson said in an email. 

Other information was included in the report as well: post-crash photos of the damaged Uber and the red bicycle Herzberg was walking, audio clips of 911 calls from Vasquez and bystanders, and police body-camera footage from after emergency crews arrived at the crash site.

A photo from the police report shows the self-driving Uber's front-end damage after the fatal crash.


Image: Tempe Police Department

We’ve seen this before — drivers using semi-autonomous or nearly autonomous features start looking away from the road, picking up their phones, and watching something more entertaining, like TV shows and movies.

A fatal Tesla crash in Florida may have involved the driver watching a Harry Potter movie on a portable DVD player while the car’s Autopilot mode was engaged. It’s the same situation: the driver thinks the car’s got this because they’ve repeatedly seen the car handle the road without them. In the Uber crash, the car was in autonomous mode for 19 minutes before hitting the pedestrian. 

Experts and researchers know that humans get bored and lose focus once machines start taking over most of the work. 

Cody Fleming, assistant professor of systems engineering at the University of Virginia, told me in a conversation about autonomous vehicle levels that humans don’t do well with boredom. (In the standard taxonomy, Level 0 means no vehicle automation, with the driver fully in control; Level 5 is the highest, with the car able to drive itself in any situation and condition.) Boredom is precisely what sets in once the autonomous vehicle takes over the main parts of driving.

Suddenly you have to “come online” in a critical situation — instantaneously. “Imagine that the first thing when you wake up in the morning is you have to make a safety critical decision,” Fleming offered as a comparison.

The handoff between robot and human responsibility is never as smooth as it theoretically could be. That’s because humans aren’t robots. It’s very difficult to go from barely aware of the cars around you to alert and focused, making split-second decisions to brake or slow down. That’s why Waymo, Google’s self-driving car company, has sped ahead: it has left behind those murky, semi-autonomous levels with a driver at the ready (while the car performs basic functions on its own) and moved to fully driverless cars.

Waymo is leap-frogging the stages where human intervention is needed at the drop of a hat. It just doesn’t work well enough. Either the human needs to be in charge — or the robot.
