Uber self-driving test car involved in accident resulting in pedestrian death | TechCrunch

https://techcrunch.com/2018/03/19/uber-self-driving-test-car-involved-in-accident-resulting-in-pedestrian-death/
Some incredibly sad news out of Arizona: an autonomous Uber test SUV driving in Tempe, Arizona, was involved in a fatal collision last night. The Uber vehicle was in autonomous mode at the time.
This is reported as the first time an autonomous vehicle operating in self-driving mode has caused a human death. It may also be the first time an autonomous vehicle has killed someone other than the person behind the wheel; recall that the earlier fatal Tesla Autopilot crash killed the car's own driver.

The outcome of this latest incident could affect the path forward for autonomous vehicle regulation, as well as Uber's plans to sell its technology to car makers.

82 Comments

  1. Tomi Engdahl says:

    Tesla rebuked by death crash investigators
    http://www.bbc.com/news/technology-43617752

    Electric car-maker Tesla has been admonished by the US watchdog investigating a recent fatal crash involving one of its cars.

    The National Transportation Safety Board said it was “unhappy” the firm had made public details of the probe.

    The company had blogged on Friday that the Model X car’s Autopilot self-steering system was in use at the time of the accident.

    Walter Huang, an Apple engineer, died after his car hit a barrier.

    “In each of our investigations involving a Tesla vehicle, Tesla has been extremely co-operative on assisting with the vehicle data.

    “However, the NTSB is unhappy with the release of investigative information by Tesla.”

    Tesla’s semi-autonomous driving technology had been involved in an earlier fatality when a Model S car collided with a lorry in 2016.

    The death of a woman in Arizona, who was hit by one of Uber’s self-drive test vehicles in March, had also recently raised fresh concerns about the use of automated car tech on public roads.

  2. Tomi Engdahl says:

    Self-Driven: Uber and Tesla
    https://hackaday.com/2018/04/02/self-driven-uber-and-tesla/

    Self-driving cars have been in the news a lot in the past two weeks. Uber’s self-driving taxi hit and killed a pedestrian on March 18, and just a few days later a Tesla running in “autopilot” mode slammed into a road barrier at full speed, killing the driver. In both cases, there was a human driver who was supposed to be watching over the shoulder of the machine, but in the Uber case the driver appears to have been distracted and in the Tesla case, the driver had hands off the steering wheel for six seconds prior to the crash. How safe are self-driving cars?

    Trick question! Neither of these cars were “self-driving” in at least one sense: both had a person behind the wheel who was ultimately responsible for piloting the vehicle. The Uber and Tesla driving systems aren’t even comparable. The Uber taxi does routing and planning, knows the speed limit, and should be able to see red traffic lights and stop at them (more on this below!). The Tesla “Autopilot” system is really just the combination of adaptive cruise control and lane-holding subsystems, which isn’t even enough to get it classified as autonomous in the state of California. Indeed, it’s a failure of the people behind the wheels, and the failure to properly train those people, that make the pilot-and-self-driving-car combination more dangerous than a human driver alone would be.

    You could still imagine wanting to dig into the numbers for self-driving cars’ safety records.

    Nonetheless, Tesla’s Autopilot has three fatalities now, and all have one thing in common — all three drivers trusted the lane-holding feature well enough to not take control of the wheel in the last few seconds of their lives. With Uber, there’s very little autonomous vehicle performance history, but there are leaked documents and a pattern that make Uber look like a risk-taking scofflaw with sub-par technology and a vested interest in making it look better than it is.

    When only self-driving cars are allowed on the road, they’ll be responsible for 100% of accidents. The only thing that matters is the relative safety of humans and machines, expressed per mile or per trip or per hour. So let’s talk about that.

    Humans are often said to be terrible drivers because 35,000 people died on US highways last year. In fact, the numbers show that people are fantastically good at driving, despite their flaws: US drivers also covered three trillion miles in the process. Three relevant statistics to keep in mind as a baseline are 90 million miles per fatality, a million miles per injury, and half a million miles per accident of any kind.
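
    A quick back-of-the-envelope check of those baseline figures, using only the two totals quoted above (a sketch, not official statistics):

        # Rough sanity check of the human-driver baseline cited above.
        fatalities_per_year = 35_000   # approximate US road deaths per year
        miles_per_year = 3e12          # approximate US vehicle-miles per year

        miles_per_fatality = miles_per_year / fatalities_per_year
        print(f"{miles_per_fatality:,.0f} miles per fatality")  # ~85,700,000,
        # which the article rounds to roughly 90 million miles per fatality.
        # The other two figures imply ~3 million injuries per year (at 1M
        # miles per injury) and ~6 million accidents per year (at 0.5M miles).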

    Tesla: Autopilot, but not Autonomous

    Uber: Scofflaws and Shambles

    Takeaway

    Other self-driving car technologies have significantly better performance records and no fatalities, and have subjected themselves to at least the lightest public scrutiny by testing their vehicles in California, the only state with disclosure requirements for autonomous vehicles. As a California rule comes into effect (today!) that enables full deployment of autonomous vehicles in the state, rather than just testing, we expect to see more data in the future. I’ll write up the good side of self-drivers in another article.

  3. Tomi Engdahl says:

    Jensen Huang on the Uber Tragedy and Why Nvidia Suspended Testing
    https://spectrum.ieee.org/view-from-the-valley/transportation/self-driving/jensen-huang-on-the-uber-tragedy-and-why-nvidia-suspended-testing

    The Uber tragedy earlier this month sent a shudder throughout the autonomous vehicle industry. Some of the companies working on the technology paused testing; Nvidia was one of them.

  4. Tomi Engdahl says:

    Uber’s fatal self-driving car accident reactions and fallout
    https://www.vision-systems.com/articles/2018/03/uber-s-fatal-self-driving-car-accident-reactions-and-fallout.html?cmpid=enl_vsd_vsd_newsletter_2018-04-10&pwhid=6b9badc08db25d04d04ee00b499089ffc280910702f8ef99951bdbdad3175f54dcae8b7ad9fa2c1f5697ffa19d05535df56b8dc1e6f75b7b6f6f8c7461ce0b24&eid=289644432&bid=2061449

    Reactions

    Here is where all the speculation, reactions, and strong opinions on the issue come into play. Missy Cummings, an engineering professor and director of the Humans and Autonomy Laboratory at Duke University, told CNN that the video released of the accident shows that the autonomous technology fell short of how it should be designed to perform.

    In terms of what this means going forward, IEEE Spectrum raises an excellent point. Through this tragic accident, the autonomous vehicle industry can learn from its failures and reduce, or even remove, the likelihood that something like this ever happens again. But for now, for anyone who was already skeptical of driverless cars on public roads, especially those outside the industry, the level of trepidation has officially been raised.

  5. Tomi Engdahl says:

    How to Make Self-Driving Car Road Testing Safe
    https://www.eetimes.com/author.asp?section_id=36&doc_id=1333143

    Until now, I’ve just had to take it on faith that self-driving test vehicles are safe. Going forward, I’d rather have some assurance that these companies are actually getting their test-platform safety right.

  6. Tomi Engdahl says:

    Uber’s First Tragedy Likely Not the Last
    https://www.eetimes.com/author.asp?section_id=36&doc_id=1333166

    Over the last 18 months, tech companies (and the media), busy promoting the imminent advent of the fully autonomous robocar, have given short shrift to the myriad lingering “unknowns” of autonomy.

    After Uber’s autonomous vehicle killed a pedestrian last month, I filed a story entitled “Robocar Testing: It’s Simulation, Stupid!” and posted it on LinkedIn. Rick Calle, head of AI research business development at Qualcomm, responded and asked the following questions:

    Über was the first tragedy, how do we make them the last one, Junko Yoshida? I’m pretty sure they use Simulation too, yet does everyone simulate scenarios of sensor failure, effects of sparse Lidar sampling at distance, and unpredictable events?

  7. Tomi Engdahl says:

    Tesla Autopilot chief leaves to join Intel
    https://www.reuters.com/article/us-tesla-moves-jimkeller/tesla-autopilot-chief-leaves-to-join-intel-idUSKBN1HX0XP

    (Reuters) – Tesla Inc’s (TSLA.O) Autopilot chief Jim Keller is leaving the electric vehicle manufacturer to join chipmaker Intel Corp (INTC.O).

  8. Tomi Engdahl says:

    Waymo van involved in serious collision in Arizona
    https://techcrunch.com/2018/05/04/waymo-van-involved-in-serious-collision-in-arizona/?utm_source=tcfbpage&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&sr_share=facebook

    A Waymo self-driving vehicle was involved in a serious accident in Chandler, Arizona earlier this afternoon. Local police said there were minor injuries from the incident after a sedan swerved into the Waymo van to avoid another collision.

    Although Waymo has said it will be testing vehicles without safety drivers in Arizona, this was not one of them. An operator was in the driver’s seat at the time of the crash, though the car was in autonomous mode, police said.

  9. Tomi Engdahl says:

    Uber reportedly thinks its self-driving car killed someone because it ‘decided’ not to swerve
    https://www.theverge.com/2018/5/7/17327682/uber-self-driving-car-decision-kill-swerve

    The car’s sensors saw her, but may have flagged the detection as a ‘false positive’

    Uber has discovered the reason why one of the test cars in its fledgling self-driving car fleet struck and killed a pedestrian earlier this year, according to The Information. While the company believes the car’s suite of sensors spotted 49-year-old Elaine Herzberg as she crossed the road in front of the modified Volvo XC90 on March 18th, two sources tell the publication that the software was tuned in such a way that it “decided” it didn’t need to take evasive action, and possibly flagged the detection as a “false positive.”

    Uber Finds Deadly Accident Likely Caused By Software Set to Ignore Objects On Road
    https://www.theinformation.com/articles/uber-finds-deadly-accident-likely-caused-by-software-set-to-ignore-objects-on-road?shared=56c9f0114b0bb781

    Uber has determined that the likely cause of a fatal collision involving one of its prototype self-driving cars in Arizona in March was a problem with the software that decides how the car should react to objects it detects, according to two people briefed about the matter.

    The car’s sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber’s software decided it didn’t need to react right away.

    Like other autonomous vehicle systems, Uber’s software has the ability to ignore “false positives,” or objects in its path that wouldn’t actually be a problem for the vehicle, such as a plastic bag floating over a road.
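
    The gating behavior described above can be illustrated with a minimal, hypothetical sketch; the class, threshold, and function names here are invented for illustration and are not Uber’s actual software:

        from dataclasses import dataclass

        @dataclass
        class Detection:
            label: str         # e.g. "pedestrian", "bicycle", "unknown"
            confidence: float  # classifier confidence in [0, 1]

        # Hypothetical tuning knob: detections below this confidence are
        # treated as false positives (like a drifting plastic bag) and the
        # planner takes no evasive action for them.
        FALSE_POSITIVE_THRESHOLD = 0.8

        def should_react(detection: Detection) -> bool:
            """Decide whether the planner treats a detection as a real obstacle."""
            return detection.confidence >= FALSE_POSITIVE_THRESHOLD

        print(should_react(Detection("pedestrian", 0.65)))  # False: ignored

    Raising such a threshold suppresses nuisance braking, but it also raises the risk of dismissing a real pedestrian: the trade-off at the heart of this report.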

  10. Tomi Engdahl says:

    David Shepardson / Reuters:
    Uber hires former NTSB chairman to advise the company on its safety culture; report says that a software flaw was responsible for the fatal accident in March — (Reuters) – Uber Technologies Inc [UBER.UL] said Monday it has hired a former National Transportation Safety Board (NTSB) …

    Uber sets safety review; media report says software cited in fatal crash
    https://www.reuters.com/article/us-uber-selfdriving/uber-hires-former-ntsb-chair-to-advise-on-safety-culture-after-fatal-crash-idUSKBN1I81Z4

  11. Tomi Engdahl says:

    David Shepardson / Reuters:
    US NTSB report on Uber’s self-driving car that killed a pedestrian in March in AZ says the car failed to identify the pedestrian or brake until it was too late — WASHINGTON (Reuters) – An Uber Technologies Inc [UBER.UL] self-driving vehicle that struck and killed a woman in Tempe …

    Uber disabled emergency braking in self-driving car: U.S. agency
    https://www.reuters.com/article/us-uber-crash/ntsb-uber-self-driving-car-failed-to-recognize-pedestrian-brake-idUSKCN1IP26K

    Uber disabled an emergency braking system in a self-driving vehicle that struck and killed a woman in Arizona in March and which failed to properly identify the pedestrian, raising serious questions about its performance, the National Transportation Safety Board said in a preliminary report released on Thursday.

  12. Tomi Engdahl says:

    Uber shuts Arizona self-driving program two months after fatal crash
    https://www.reuters.com/article/us-autos-selfdriving-uber/uber-shuts-arizona-self-driving-program-two-months-after-fatal-crash-idUSKCN1IO2SD

    Uber has shut down its self-driving car operation in Arizona two months after a fatal crash involving one of its vehicles, the company said on Wednesday.

  13. Tomi Engdahl says:

    EEVblog #1088 – Uber Autonomous Car Accident Report
    https://www.youtube.com/watch?v=DbT2aBlC0no

    The NTSB today released the report into the fatal Uber Autonomous car accident.

    The RADAR, LIDAR, and cameras DID detect and classify the pedestrian and bicycle correctly.
    The system DID determine that emergency braking was required.
    But Uber had disabled the system’s emergency braking feature in autonomous mode.
    Uber had also disabled Volvo’s built-in pedestrian detection system.
    And there was no system to alert the driver that an emergency braking scenario had been detected.
    Uber are rooted.
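
    The failure chain in the report can be summarized in a short hypothetical sketch; the function and flag names are invented, and the defaults mirror the findings that emergency braking and driver alerting were both unavailable in autonomous mode:

        def plan_action(obstacle_detected: bool,
                        emergency_braking_enabled: bool = False,   # disabled by Uber
                        driver_alert_enabled: bool = False) -> str:  # no alert system existed
            """Toy model of the decision chain described in the NTSB report."""
            if not obstacle_detected:
                return "continue"
            # Perception worked and the system called for emergency braking...
            if emergency_braking_enabled:
                return "emergency brake"
            # ...but with actuation disabled and no alert path, nothing happens.
            if driver_alert_enabled:
                return "alert driver"
            return "no action"

        print(plan_action(obstacle_detected=True))  # "no action"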

    Preliminary Report Released for Crash Involving Pedestrian, Uber Technologies, Inc., Test Vehicle
    https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx

  14. Tomi Engdahl says:

    NTSB: Uber’s Autonomous Car Had Disabled Emergency Braking System
    http://innovation-destination.com/2018/06/12/ntsb-ubers-autonomous-car-had-disabled-emergency-braking-system/?NL=ED-004&Issue=ED-004_20180619_ED-004_939&sfvc4enews=42&cl=article_2_b&utm_rid=CPG05000002750211&utm_campaign=18002&utm_medium=email&elq2=1d82e0c54b6545eb9bee2759ab3db445

    Early results seem to indicate a conflict between Volvo’s ADAS system programming and Uber’s software stack for autonomous-vehicle operation.

    By Murray Slovick, Contributing Editor

    The National Transportation Safety Board (NTSB) released its preliminary report after investigating the crash of an Uber Technologies Inc. test vehicle. The vehicle was based on a modified 2017 Volvo XC90 and operating with a self-driving system in computer control mode. It struck a pedestrian who was walking her bike across the road on Sunday, March 18, 2018, in Tempe, Ariz. The woman, Elaine Herzberg, died as a result of the accident.

    While NTSB doesn’t yet indicate probable cause or fault, the report found that the Uber self-drive system detected the pedestrian six seconds before the crash. However, its software did not engage the car’s brakes to prevent the collision.

  15. Tomi Engdahl says:

    Pwned with ’4 lines of code’: Researchers warn SCADA systems are still hopelessly insecure
    How Shamoon and Stuxnet et al ran riot
    https://www.theregister.co.uk/2018/06/18/physically_hacking_scada_infosec/

    Industrial control systems could be exposed not just to remote hackers, but to local attacks and physical manipulation as well.

    A presentation at last week’s BSides conference by researchers from INSINIA explained how a device planted on a factory floor can identify and list networks, and trigger controllers to stop processes or production lines.

    The talk – Hacking SCADA: How We Attacked a Company and Lost them £1.6M with Only 4 Lines of Code – reviewed 25 years of industrial control kit, going back to the days of proprietary equipment and X21 connections before discussing proof-of-concept attacks.

    Historically, everything was “air-gapped”, but this has changed as the equipment has been adapted to incorporate internet functionality, which facilitates remote monitoring.

    Godfrey explained that security has never been a design criterion for industrial control kit, and this hasn’t changed with the advent of IoT in the domain of SCADA systems. As a result, issues such as default hard-coded credentials and lack of encryption abound.

    Worse yet, most systems are running either old or hopelessly obsolete versions of Windows. Most terminals run Windows 7, but some run Windows 98.

    “Industrial control setups certainly don’t have the maturity of enterprise environments,”

    Industrial control systems run water supply, power grid and gas distribution systems as well as factories, building management systems and more.

    Denial-of-service in industrial control environments is easy, and fuzzing (trying a range of inputs to see which causes an undesigned effect) offers a straightforward way to uncover vulnerabilities.
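
    Fuzzing in this sense can be sketched in a few self-contained lines; the parser below is a deliberately fragile stand-in, not a real SCADA protocol implementation:

        import random

        def parse_command(payload: bytes) -> str:
            """Stand-in for a fragile protocol parser."""
            text = payload.decode("ascii")       # crashes on non-ASCII bytes
            opcode, _, arg = text.partition(":")
            return f"{opcode}({int(arg)})"       # crashes on non-numeric args

        random.seed(0)
        crashes = []
        for _ in range(1000):
            # Feed the parser short bursts of random bytes and record failures.
            payload = bytes(random.randrange(256) for _ in range(random.randrange(1, 16)))
            try:
                parse_command(payload)
            except Exception as exc:
                crashes.append((payload, type(exc).__name__))

        print(f"{len(crashes)} of 1000 random inputs crashed the parser")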

    INSINIA has developed a device that automatically scans networks and shuts down components. The “weaponised” Arduino micro-controller looks like a regular programmable logic controller (PLC) to other devices on the network. If it is physically planted on a targeted environment, it can quickly enumerate networks before sending stop commands. It can “kill industrial processes with only four lines of code”, according to Godfrey.

    The wider security community has recognised the risk posed to industrial control systems from malware in the wake of high-profile attacks such as the Shamoon assault on Saudi Aramco and the BlackEnergy attacks on electricity distribution facilities in Ukraine.

    The famous Stuxnet attack on Iran’s uranium-enrichment facilities is perhaps the best-known example.

    A large number of industrial control systems are exposed to the internet and are easily found using Shodan, the search engine for the IoT.

  16. Tomi Engdahl says:

    Uber safety driver of fatal self-driving crash was watching Hulu, not the road
    https://techcrunch.com/2018/06/22/uber-safety-driver-of-fatal-self-driving-crash-was-watching-hulu-not-the-road/?utm_source=tcfbpage&sr_share=facebook

    A safety driver operating an Uber self-driving vehicle looked down at a phone that was streaming The Voice on Hulu

    The lengthy report reveals that safety driver Rafaela Vasquez was streaming the show The Voice on her phone at the time of the crash.

    Police determined that Vasquez’s eyes were off the road for 3.67 miles of the 11.8 total miles driven, or about 31 percent of the distance.

    Based on the data, police reported that Vasquez could have avoided hitting Herzberg had her eyes been on the road. The case has been submitted to the Maricopa County Attorney’s office for review, and Vasquez could face charges of vehicular manslaughter.

  17. Tomi Engdahl says:

    Jon Porter / The Verge:
    Report: Uber operations manager sent 890-word email to execs with serious concerns about autonomous program just days before an Uber vehicle killed a pedestrian — Solutions that could have prevented collision were proposed — An 890-word email was sent to Uber’s executives …

    Uber manager raised concerns about self-driving program just days before fatal collision
    https://www.theverge.com/2018/12/11/18135983/uber-whistleblower-fatal-tampa-collision-ipo-safety-email

    Solutions that could have prevented collision were proposed

    An 890-word email was sent to Uber’s executives that raised safety concerns about the company’s autonomous vehicle program just days before an Uber vehicle killed a pedestrian in Arizona last March. The email was sent to the head of Uber’s autonomous vehicle division and other top executives and lawyers by a manager in the group. It was made public by The Information, which validated the assertions through interviews with current and former employees. The email complained about near misses that frequently weren’t investigated properly or were even ignored, and about backup drivers who lacked proper training and vetting.

    Even before the fatal crash, Miller’s email claims that Uber’s vehicles were getting into accidents with alarming regularity. It notes that the company’s fleet was “hitting things every 15,000 miles,” and that a vehicle was damaged “nearly every other day” in February 2018. Near misses reportedly occurred as frequently as every 100 miles, while backup drivers had to take control once every one to three miles. The more miles the fleet drove, the more likely incidents were to arise.

  18. Tomi Engdahl says:

    Uber’s self-driving cars return to public roads after fatal crash
    https://www.cnet.com/news/ubers-self-driving-cars-return-to-public-roads-after-fatal-crash/

    After a nine-month hiatus, Uber says, its autonomous vehicles are now safer.

  19. Tomi Engdahl says:

    Uber’s Fatal Self-Driving Car Crash
    https://www.thedrive.com/tech/27023/10-lessons-from-ubers-fatal-self-driving-car-crash

    The fallout from the first autonomous car fatality continues to swirl a year later, as the once high-flying technology faces a “trough of disillusionment.”

  20. Tomi Engdahl says:

    … Two things are noteworthy about this sequence of events. First, at no point did the system classify her as a pedestrian. According to the NTSB, that’s because “the system design did not include consideration for jaywalking pedestrians.”
    Second, the constantly switching classifications prevented Uber’s software from accurately computing her trajectory and realizing she was on a collision course with the vehicle. The system used an object’s previously observed locations to help compute its speed and predict its future path. However, “if the perception system changes the classification of a detected object, the tracking history of that object is no longer considered when generating new trajectories,” the NTSB reports.

    How terrible software design decisions led to Uber’s deadly 2018 crash
    NTSB says the system “did not include consideration for jaywalking pedestrians.”
    https://arstechnica.com/cars/2019/11/how-terrible-software-design-decisions-led-to-ubers-deadly-2018-crash/
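
    The flaw quoted above can be made concrete with a minimal sketch; the class and field names are illustrative, not Uber’s actual code. Each reclassification wipes the object’s position history, so the tracker never accumulates the two points needed to estimate a path:

        class TrackedObject:
            def __init__(self, label, position):
                self.label = label
                self.history = [position]  # observed positions, used for prediction

            def update(self, label, position):
                if label != self.label:
                    self.label = label
                    self.history = []      # the flaw: reclassification wipes history
                self.history.append(position)

            def predicted_velocity(self):
                if len(self.history) < 2:
                    return None            # not enough history to compute a path
                (x0, y0), (x1, y1) = self.history[-2], self.history[-1]
                return (x1 - x0, y1 - y0)

        obj = TrackedObject("vehicle", (0.0, 0.0))
        obj.update("bicycle", (1.0, 0.5))   # reclassified: history reset
        obj.update("other", (2.0, 1.0))     # reclassified again: reset again
        print(obj.predicted_velocity())     # None: no usable trajectory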

  21. Tomi Engdahl says:

    NTSB Investigation Into Deadly Uber Self-Driving Car Crash Reveals Lax Attitude Toward Safety
    https://spectrum.ieee.org/cars-that-think/transportation/self-driving/ntsb-investigation-into-deadly-uber-selfdriving-car-crash-reveals-lax-attitude-toward-safety

    The Uber car that hit and killed Elaine Herzberg in Tempe, Ariz., in March 2018 could not recognize all pedestrians, and was being driven by an operator likely distracted by streaming video, according to documents released by the U.S. National Transportation Safety Board (NTSB) this week.

    But while the technical failures and omissions in Uber’s self-driving car program are shocking, the NTSB investigation also highlights safety failures that include the vehicle operator’s lapses, lax corporate governance of the project, and limited public oversight.

    A radar on the modified Volvo XC90 SUV first detected Herzberg roughly six seconds before the impact, followed quickly by the car’s laser-ranging lidar. However, the car’s self-driving system did not have the capability to classify an object as a pedestrian unless it was near a crosswalk.
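
    That design gap can be illustrated with a hypothetical one-function sketch (names invented): the “pedestrian” label is simply unreachable away from a crosswalk, so a jaywalker can never be handled as one.

        def classify(looks_like_person: bool, near_crosswalk: bool) -> str:
            """Toy illustration of crosswalk-conditional pedestrian classification."""
            if near_crosswalk and looks_like_person:
                return "pedestrian"
            # Away from a crosswalk the same detection falls through to
            # other labels, as in the Tempe crash.
            return "other"

        print(classify(looks_like_person=True, near_crosswalk=False))  # "other"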

  23. Tomi Engdahl says:

    Brad Templeton / Forbes:
    NTSB finds Uber’s safety driver the primary cause of fatal Tempe crash, also blames lack of “safety culture” at Uber; AI not detecting jaywalkers played no role — The National Transportation Safety Board presented its findings today on the fatal crash involving an Uber test robocar and Elaine Herzberg.

    NTSB Hearing Blames Humans, Software And Policy For Fatal Uber Robocar Crash – But Mostly Humans
    https://www.forbes.com/sites/bradtempleton/2019/11/19/ntsb-hearing-blames-humans-software-and-policy-for-fatal-uber-robocar-crash/

  24. Tomi Engdahl says:

    Backup Driver In Fatal Uber Self-Driving Car Crash Charged With Negligent Homicide
    https://www.forbes.com/sites/rachelsandler/2020/09/15/backup-driver-in-fatal-uber-self-driving-car-crash-charged-with-negligent-homicide/?utm_campaign=forbes&utm_source=facebook&utm_medium=social&utm_term=Gordie/#676f7264696

    The safety driver in an Uber self-driving car that struck and killed a pedestrian in 2018 has been charged with negligent homicide, Arizona officials announced Tuesday.

    A grand jury charged Rafaela Vasquez with one count of negligent homicide for allegedly causing the death of 49-year-old Elaine Herzberg, who was crossing the street at night in Tempe, Arizona.

    Vasquez, who was the safety monitor in the self-driving test vehicle that hit Herzberg, failed to brake until it was too late, investigators said.

    It’s believed to be the first pedestrian death involving a self-driving vehicle.

    The Tempe Police Department found the accident could have been avoided if Vasquez wasn’t distracted watching “The Voice” on her phone.

    Prosecutors decided in 2019 that Uber was not criminally liable for the accident and declined to pursue charges against the company, though the National Transportation Safety Board concluded that Uber also “had an inadequate safety culture, exhibited by a lack of risk assessment mechanisms, of oversight of vehicle operators, and of personnel with backgrounds in safety management.”

    The crash caused Uber to pull its autonomous vehicle testing from Arizona altogether, and now it tests cars in Pittsburgh. 

