Uber self-driving test car involved in accident resulting in pedestrian death | TechCrunch

Some incredibly sad news out of Arizona. An autonomous Uber test SUV driving in Tempe, Arizona was involved in a fatal collision last night. The Uber vehicle was in autonomous mode at the time.
This is reported as the first time an autonomous vehicle operating in self-driving mode has caused a human death. It may also be the first time an AV has killed someone other than the vehicle's own driver; recall that in the earlier Tesla Autopilot accident, the person killed was the Tesla's driver.

The outcome of this latest incident could affect the path forward for AV regulation and Uber's plans to sell its technology to car makers.


  1. Tomi Engdahl says:

    Uber Robocar Kills Pedestrian, Despite Presence of Safety Driver

    A self-driving Uber vehicle reportedly killed a pedestrian in Tempe, Ariz. last night, the first time such a thing has happened. The only other self-driving fatality—in May 2016—involved the driver of a Tesla that crashed into a truck.

    Uber has not confirmed that the vehicle, a modified Volvo XC90 SUV, was in self-driving mode. However, the Tempe Police Department said, in a statement, that it “was in autonomous mode at the time of the collision, with a vehicle operator behind the wheel.”

  2. Tomi Engdahl says:

    Here’s how Uber’s self-driving cars are supposed to detect pedestrians
    Redundant, overlapping vision systems

    Something unexpectedly entering the vehicle’s path is pretty much the first emergency event that autonomous car engineers look at. The situation could be many things — a stopped car, a deer, a pedestrian — and the systems are one and all designed to detect them as early as possible, identify them and take appropriate action. That could be slowing, stopping, swerving, anything.

    Uber’s vehicles are equipped with several different imaging systems which work both ordinary duty (monitoring nearby cars, signs and lane markings) and extraordinary duty like that just described. No less than four different ones should have picked up the victim in this case.

  3. Tomi Engdahl says:

    Exclusive: Tempe police chief says early probe shows no fault by Uber

    Pushing a bicycle laden with plastic shopping bags, a woman abruptly walked from a center median into a lane of traffic and was struck by a self-driving Uber operating in autonomous mode.

    “The driver said it was like a flash, the person walked out in front of them,” said Sylvia Moir, police chief in Tempe, Ariz., the location for the first pedestrian fatality involving a self-driving car. “His first alert to the collision was the sound of the collision.”

    Traveling at 38 mph in a 35 mph zone on Sunday night, the Uber self-driving car made no attempt to brake, according to the Police Department’s preliminary investigation.

    Elaine Herzberg, 49, was unconscious at the scene and later died of her injuries at a local hospital.

    The self-driving Volvo SUV was outfitted with at least two video cameras, one facing forward toward the street, the other focused inside the car on the driver, Moir said in an interview.

    From viewing the videos, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said. The police have not released the videos.

    The Tempe police will collaborate with investigators from the National Transportation Safety Board and the National Highway Traffic Safety Administration in probing the accident.

    While hundreds of autonomous cars operate in Arizona, Moir said she was aware of only one other accident, which occurred a year ago. It also involved an Uber in self-driving mode, which was flipped onto its side. But authorities determined that the other car involved was at fault.

    “I suspect preliminarily it appears that the Uber would likely not be at fault in this accident, either,” Moir said.

    However, if Uber is found responsible, that could open a legal quagmire.

    “I won’t rule out the potential to file charges against the (backup driver) in the Uber vehicle,” Moir said.

    But if the robot car itself were found at fault? “This is really new ground we’re venturing into,” she said.

  4. Tomi Engdahl says:

    Toyota pauses automated driving testing on U.S. roads following Uber accident

    Automaker Toyota has temporarily ceased public road testing of its fully autonomous 'Chauffeur' system in the U.S. after an accident earlier this week in which an Uber self-driving test vehicle struck a pedestrian, ultimately resulting in her death.

    Police have stated that initial findings suggest the accident would've been extremely difficult to avoid regardless of whether a human or an AV system was in control at the time.

  5. Tomi Engdahl says:

    Is Robocar Death the Price of Progress?

    A self-driving Uber vehicle, driving in autonomous mode with a safety driver behind the wheel, hit and killed a pedestrian Sunday night in Tempe, Arizona.

    The preliminary investigation shows that Uber’s robocar, identified as 2017 Volvo XC90, was driving “at approximately 40 miles per hour,” said Tempe police during a press briefing Monday.

    Police told reporters that they found “no significant signs of [the Uber vehicle] slowing down” as it approached the pedestrian with a bicycle crossing the street outside a crosswalk.

    It remains unclear if the accident, which appears to be the first reported fatal crash involving a self-driving vehicle and a pedestrian in the United States, might trigger a slowdown in the autonomous vehicle race.

    Heated discussions, however, have already erupted on social media forums. One investor on LinkedIn deemed the death the price of progress: “We lose one to save thousands; it’s called acceptable casualties.” An insurance analyst saw the accident as a novel — and unwelcome — hazard: “… you did sign up to face drunk drivers and cars with flat tires careening out of control and a host of other risks from cars on the roads because those risks are part and parcel of vehicles being driven by people … but no one signed up to face AVs.”

    This tragedy is certain to affect consumer perceptions of autonomous vehicles and, more significantly, the AV discussions unfolding in Washington, D.C.

  6. Tomi Engdahl says:

    Apportioning Blame When Robocars Have Accidents

    When a self-driving car is involved in an accident, the rush to judgment begins before the hubcaps stop rolling.

    Was it the fault of the car’s programmers? The fleet operator? The person in the car, or in the car in front of it, or on the bicycle next to it? How about the road regulators—federal, state, and local?

    The blame game played out on Monday

    At first it was unclear who, if anyone, was at fault. Then, later in the day, Tempe Police Chief Sylvia Moir told the San Francisco Chronicle that the car’s own video record of the accident had led her to conclude, preliminarily, that the automated system and its professional minder probably did nothing wrong.

    “It’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said.

  7. Tomi Engdahl says:

    Uber’s former head of self-driving cars put safety second

    Evidence previously surfaced in the Waymo v. Uber trial told of Anthony Levandowski’s ambivalence towards car safety

    “We don’t need redundant brakes & steering, or a fancy new car, we need better software,” then-Google engineer Anthony Levandowski wrote in an email to Larry Page in January 2016. “To get to that better software faster we should deploy the first 1000 cars asap. I don’t understand why we are not doing that. Part of our team seems to be afraid to ship.” Shortly thereafter, Levandowski would leave to found his own self-driving trucking company, which was quickly acquired by Uber.

  8. Tomi Engdahl says:

    Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam

    Arizona officials saw opportunity when Uber and other companies began testing driverless cars a few years ago. Promising to keep oversight light, they invited the companies to test their robotic vehicles on the state’s roads.

    Then on Sunday night, an autonomous car operated by Uber — and with an emergency backup driver behind the wheel — struck and killed a woman on a street in Tempe, Ariz.

  9. Tomi Engdahl says:

    Tempe Police release footage of fatal crash from inside self-driving Uber

    The video raises more questions about what happened this past weekend

    The Tempe Police Department has released the first footage of this week’s fatal crash involving a self-driving Uber. Two angles of the crash — one facing out at the road, and one facing in at the Uber safety driver — were compiled into a 22-second video that was released on the Tempe Police’s Twitter account Wednesday night.

  10. Tomi Engdahl says:

    Here’s how Uber’s self-driving cars are supposed to detect pedestrians
    Redundant, overlapping vision systems

    This is the first such incident and will certainly be scrutinized like no other autonomous vehicle interaction in the past. But on the face of it, it's hard to understand how, short of a total system failure, this could happen, when the entire car has essentially been designed around preventing exactly this situation from occurring.

    Something unexpectedly entering the vehicle’s path is pretty much the first emergency event that autonomous car engineers look at.

    Uber’s vehicles are equipped with several different imaging systems which work both ordinary duty (monitoring nearby cars, signs and lane markings) and extraordinary duty like that just described. No less than four different ones should have picked up the victim in this case.

    The lidar unit, if operating correctly, should have been able to make out the person in question, if they were not totally obscured, while they were still more than a hundred feet away

    Front-mounted radar.
    Depending on the radar unit Uber employed — likely multiple in both front and back to provide 360 degrees of coverage — the range could differ considerably.
    The radar signature of a person is not nearly so recognizable

    Short and long-range optical cameras.
    The cameras on the Uber vehicle watch for telltale patterns that indicate braking vehicles (sudden red lights), traffic lights, crossing pedestrians and so on. Especially on the front end of the car, multiple angles and types of camera would be used

    Detecting people is one of the most commonly attempted computer vision problems, and the algorithms that do it have gotten quite good.

    That said, it can be hard at night.

    Safety driver. It may sound cynical to refer to a person as a system, but the safety drivers in these cars are very much acting in the capacity of an all-purpose failsafe. People are very good at detecting things

    The car was certainly equipped with technology that was intended to, and should have, detected the person and caused the car to react appropriately. Furthermore, if one system didn't work, another should have sufficed; multiple fail-safes are the only practical approach in high-stakes matters like driving on public roads.
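    The redundancy argument in this excerpt can be sketched as an any-of-N detection policy: if any one independent sensor reports an obstacle in the planned path, the vehicle brakes. This is an illustrative toy (the sensor list and function names are made up here), not Uber's actual fusion logic, which would combine probabilistic sensor tracks rather than a plain boolean OR.

```python
# Toy any-of-N redundant detection policy (illustrative only; real AV
# stacks fuse probabilistic sensor tracks, not booleans).

def obstacle_in_path(sensor_reports):
    """sensor_reports: list of (sensor_name, detected) pairs.
    Returns True if ANY independent sensor flags an obstacle."""
    return any(detected for _, detected in sensor_reports)

# The four systems the article says should each have picked up the victim:
reports = [
    ("lidar", True),          # works in darkness
    ("radar", True),          # works in darkness
    ("cameras", False),       # weakest sensor at night
    ("safety_driver", False), # human failsafe
]

action = "brake" if obstacle_in_path(reports) else "continue"

# If each system independently misses with probability p, all four
# missing together has probability p**4 -- e.g. 0.1**4 = 0.0001.
p_all_miss = 0.1 ** 4
```

    The point of this arithmetic: even individually unreliable sensors, combined, should make a total miss extraordinarily unlikely, which is why a complete non-reaction is so puzzling.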

  11. Tomi Engdahl says:

    All cyclists will need to fit detection beacons, says cycle industry boss

    “Bicycles will definitely have to communicate with other vehicles,” says CONEBI’s general manager.

  12. Tomi Engdahl says:

    Experts analyzing video of Uber’s autonomous car hitting a pedestrian argue car’s sensors should have detected her and a human driver could’ve responded quicker

    Human Driver Could Have Avoided Fatal Uber Crash, Experts Say

    Human driver may have avoided impact: forensic crash analysts
    Self-driving sensors should have detected victim, experts say

    The pedestrian killed Sunday by a self-driving Uber Technologies Inc. SUV had crossed at least one open lane of road before being hit, according to a video of the crash that raises new questions about autonomous-vehicle technology.

    Forensic crash analysts who reviewed the video said a human driver could have responded more quickly to the situation, potentially saving the life of the victim, 49-year-old Elaine Herzberg. Other experts said Uber’s self-driving sensors should have detected the pedestrian as she walked a bicycle across the open road at 10 p.m., despite the dark conditions.

    Herzberg’s death is the first major test of a nascent autonomous vehicle industry that has presented the technology as safer than humans who often get distracted while driving. For human driving in the U.S., there’s roughly one death every 86 million miles, while autonomous vehicles have driven no more than 15 to 20 million miles in the country so far, according to Morgan Stanley analysts.

    “As an ever greater number of autonomous vehicles drive an ever greater number of miles, investors must contemplate a legal and ethical landscape that may be difficult to predict,” the analysts wrote in a research note following the Sunday collision. “The stock market is likely too aggressive on the pace of adoption.”
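    A back-of-the-envelope check of the Morgan Stanley figures quoted above (one human-driving death per roughly 86 million miles, versus at most ~20 million total AV miles) shows why the sample cuts both ways: it is far too small to be meaningful, yet one fatality in it already exceeds the naive human per-mile rate.

```python
# Naive per-mile comparison using the Morgan Stanley figures quoted above.
human_miles_per_death = 86e6   # ~1 death per 86M miles of human driving (US)
av_total_miles = 20e6          # upper bound of the 15-20M AV miles estimate

# One AV fatality in at most 20M miles vs. the human benchmark:
ratio = human_miles_per_death / av_total_miles
print(f"Naive AV fatality rate is at least {ratio:.1f}x the human rate")
```

    A single event in so few miles carries enormous statistical uncertainty, of course; the calculation only shows that the data so far cannot demonstrate that AVs are safer.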

    Smith said the video doesn’t fully explain the incident but “strongly suggests a failure by Uber’s automated driving system and a lack of due care by Uber’s driver (as well as by the victim).”

  13. Tomi Engdahl says:

    Who’s at Fault in Uber’s Fatal Collision?

    Uber’s accident earlier this week—the first fatality involving a pedestrian and an autonomous car—already has fingers pointing to various parties, though it’s still too early to tell who’s responsible. The plot thickens every day with new rumors and information.

    A woman, jaywalking at night. Walking a bicycle. Possibly homeless. On a busy street known for partying. In Arizona, with permissive laws. Involving Uber, with permissive ethics. Which had hired an ex-felon to be the safety driver in the autonomous vehicle (AV).

    As we wait for a full investigation, we can start untangling the strands of responsibility.

  14. Tomi Engdahl says:


    “I think the sensors on the vehicles should have seen the pedestrian well in advance,” says Steven Shladover, a UC Berkeley research engineer who has been studying automated systems for decades and watched the video. “If she had been moving erratically, it would have been difficult for the systems to predict where this person was going,” he says, but the video shows no evidence of that.

  15. Tomi Engdahl says:

    Robo-Uber: What Went Wrong

    But for the tech community, it is past time to start thinking about what could have prevented the autonomous car from killing a woman crossing the street.

    Footage of the collision shows that the self-driving car not only did not stop; it didn’t even slow down when a human — whose movements were neither fast nor sudden — crossed its path. Mike Demler, senior analyst, The Linley Group, calls it “the million-dollar question.”

    He asked: “Uber needs to answer — what was the purpose of the driver being on the road at that hour? Was it a nighttime test? Were the radar/lidar functioning? Is their software just totally incapable of responding to emergency conditions?”

    Shocking to many automotive experts is that none of the sensors — including radars, lidars, vision — that were embedded inside Uber’s self-driving car (Volvo XC90) seemed to have their eyes on the road. Nor — as indicated by the fully functional driver-facing camera — did the so-called “safety driver.”

    Phil Magney, founder and principal of VSI Labs, told us, “The fact that the AV stack failed to detect is surprising.” However, he added, “Unless some sensors or features were temporarily disabled for testing or isolating certain functions.”

    “VSI Labs believes that lidar would have been the most beneficial in this accident, although it is entirely possible that the radar and/or the camera could have picked up the pedestrian as well,”

    But considering that the accident happened in the dark, Demler said, “The radar and lidar should have picked her up first. At the reported 38 mph, the Volvo traveled approximately 17 m/s. Even short-range radar (SRR) has a range of up to ~100 m. For example, in a TI whitepaper, they describe a device with an unambiguous range of 80 m. That would give the vehicle four to five seconds to respond.”

    Demler added, “But the vehicle must have long-range radar (LRR), too. This Bosch LRR has a detection range of up to 250 m .”

    Furthermore, “There’s the lidar. The Uber Volvo has a rotating 64-element Velodyne lidar on its roof. That has a range of 120 m.”

    Asked for the weakest link among all of these sensors, Demler described the cameras as “obviously the weakest sensor to use at night.”

    Given all that, Demler believes, “There’s no excuse for the sensor system not detecting the woman crossing the road.”

    “Ideally, thermal cameras looking out the front corners could be used to reduce this kind of pedestrian accident where the target is dressed in dark clothing and it is dark.” Millimeter-wave radar would have certainly picked up the pedestrian, he added.
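    Demler's arithmetic is easy to verify: at 38 mph the vehicle covers roughly 17 meters per second, so the sensor ranges he cites translate into the response windows below (the ranges are the ones quoted in the comment; everything else is simple unit conversion).

```python
# Verify the response-time arithmetic in the quotes above.
MPH_TO_MS = 0.44704            # exact miles-per-hour to m/s conversion

speed_ms = 38 * MPH_TO_MS      # ~17.0 m/s at the reported 38 mph

sensor_ranges_m = {
    "short-range radar (TI example)": 80,
    "long-range radar (Bosch)": 250,
    "Velodyne 64-element lidar": 120,
}

for sensor, range_m in sensor_ranges_m.items():
    window_s = range_m / speed_ms
    print(f"{sensor}: {range_m} m -> {window_s:.1f} s to respond")
```

    The 80 m short-range-radar figure gives about 4.7 seconds, matching Demler's "four to five seconds."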

  16. Tomi Engdahl says:

    Police chief said Uber victim “came from the shadows”—don’t believe it

    YouTube videos give a different impression of the site of a deadly Uber crash.

    On Sunday night, an Uber self-driving car killed 49-year-old Elaine Herzberg in Tempe, Arizona. A key argument in Uber’s defense has been that the road was so dark that even an attentive driver would not have spotted Herzberg in the seconds before the crash.

    Herzberg “came from the shadows right into the roadway,” Tempe police chief Sylvia Moir told the San Francisco Chronicle on Monday. “The driver said it was like a flash.”

    When police released footage from the Uber vehicle’s onboard camera on Wednesday, it seemed to somewhat support this view.

    But then people in the Tempe area started making their own videos—videos that give a dramatically different impression of that section of roadway.

    “It’s not as dark as that video made it look,”

    The more likely explanation is that the Uber vehicle’s dashcam was poorly configured for nighttime recording, and so the video gives a misleading impression of how bright the scene was and how much warning the driver had.

    And even if it's true that the road was poorly lit, it's not clear whether that would exonerate Uber. Uber's cars have lidar and radar sensors in addition to cameras, and those sensors don't require ambient light to function. So the vehicle should have spotted Herzberg even if the road was pitch black.

  17. Tomi Engdahl says:

    Daisuke Wakabayashi / New York Times:
    Sources: Uber’s autonomous cars struggled with goal of 13 miles before driver intervention; Waymo says its cars had average of ~5.6K miles between interventions — SAN FRANCISCO — Uber’s robotic vehicle project was not living up to expectations months before a self-driving car operated …

    Uber’s Self-Driving Cars Were Struggling Before Arizona Crash

    Uber’s robotic vehicle project was not living up to expectations months before a self-driving car operated by the company struck and killed a woman in Tempe, Ariz.

    The cars were having trouble driving through construction zones and next to tall vehicles, like big rigs. And Uber’s human drivers had to intervene far more frequently than the drivers of competing autonomous car projects.

    Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it.

    Yet Uber’s test drivers were being asked to do more — going on solo runs when they had worked in pairs.
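    The disengagement figures quoted above make the gap concrete with a single division (13 and 5,600 are the numbers reported by the Times; note the comparison ignores that the two companies tested on different roads and may have used different reporting criteria):

```python
# Ratio of the miles-per-intervention figures quoted above.
uber_target = 13       # Uber's Arizona target, per the NYT documents
waymo_average = 5600   # Waymo's California average, per the NYT

ratio = waymo_average / uber_target
print(f"Waymo averaged roughly {ratio:.0f}x more miles between interventions")
```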

  18. Tomi Engdahl says:

    Waymo CEO on fatal autonomous Uber crash: Our car would have been able to handle it

    Nearly a week after an autonomous Uber SUV claimed the first life in testing of self-driving vehicles, the CEO of another tech company says he is confident its cars would have performed differently under the circumstances.

    “I can say with some confidence that in situations like that one with pedestrians — in this case a pedestrian with a bicycle — we have a lot of confidence that our technology would be robust and would be able to handle situations like that,” Krafcik said.

    Waymo is testing its self-driving fleet in parts of Phoenix, where autonomous taxis are shuttling some members of the public. The company has a fleet of Chrysler Pacifica minivans with custom-designed hardware.

    Last fall, Waymo became the first company to issue a detailed safety report to federal officials on its self-driving vehicle program, under a voluntary self-assessment program.

  19. Tomi Engdahl says:

    Uber robocar aftermath, and beyond

    On March 18, 2018, it happened: an autonomous car, with a human safety driver aboard, hit and killed a person. The immediate coverage read like terrorism headlines ("Uber killer robocar," etc.). Then the reporting became more reasonable: "Even a human driver could not have saved the victim," "She came out of nowhere," "It was not the robot car's fault."

    The good thing here is that everything is logged, like an airplane's famous black box, so it can easily be established what the robocar did and didn't do. This is a much better situation than asking (human) eyewitnesses, passengers or even the driver after the fact, whose accounts are likely biased. Soon we will know what actually happened: facts, not assumptions.

    Robocars don't see! It is a common misunderstanding to use the word "see" for robots. Robots (cars) perceive the world around them and use this information to decide what to do next. Robot cars use multiple sensors to perceive their surroundings. Almost all robocars carry the following sensors: LIDAR, RADAR and stereo cameras.

    Lidar sends a laser beam and, because it sweeps both horizontally and vertically, builds a point cloud of the environment; it can detect objects around 100 meters away. Radar is based on radio-frequency waves and can detect objects around 150 meters away, though less accurately than lidar. Stereo cameras are mainly used to keep the robot car in its lane and, given enough light, also to detect objects up to about 30 meters away. Note that all these sensors overlap.

    If an object's trajectory intersects the autonomous car's trajectory, i.e. they are on a collision course, the analyser calculates the best action to take, such as braking and/or swerving.
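    The collision-course check described here can be sketched with constant-velocity extrapolation: project both tracks forward, find the time of closest approach, and flag a collision course if the minimum separation falls below a safety margin. All numbers and names below are illustrative assumptions, not any vendor's algorithm.

```python
import math

def time_of_closest_approach(p_car, v_car, p_obj, v_obj):
    """Positions/velocities as (x, y) tuples in meters and m/s.
    Returns the time t >= 0 at which the two constant-velocity
    tracks are closest to each other."""
    rx, ry = p_obj[0] - p_car[0], p_obj[1] - p_car[1]
    vx, vy = v_obj[0] - v_car[0], v_obj[1] - v_car[1]
    v2 = vx * vx + vy * vy
    if v2 == 0:               # identical velocities: gap never changes
        return 0.0
    return max(0.0, -(rx * vx + ry * vy) / v2)

def on_collision_course(p_car, v_car, p_obj, v_obj, safety_margin_m=2.0):
    t = time_of_closest_approach(p_car, v_car, p_obj, v_obj)
    dx = (p_obj[0] + v_obj[0] * t) - (p_car[0] + v_car[0] * t)
    dy = (p_obj[1] + v_obj[1] * t) - (p_car[1] + v_car[1] * t)
    return math.hypot(dx, dy) < safety_margin_m

# Car heading +y at ~17 m/s; pedestrian 60 m ahead, crossing at 1.4 m/s.
car_p, car_v = (0.0, 0.0), (0.0, 17.0)
ped_p, ped_v = (-4.0, 60.0), (1.4, 0.0)
print(on_collision_course(car_p, car_v, ped_p, ped_v))
```

    With 60 m of headway at 17 m/s, this toy check fires about 3.5 seconds before impact; a real planner would also model acceleration and sensor-track uncertainty.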

    The human is the weak link! The biggest problem with evolving robocars is not technology or even price, but the human. A focused human driver can react in under 500 ms, but what happens when you sit in an autonomous car? The more the car is automated, the more the driver relaxes and starts doing something with a phone, laptop or newspaper. Now, if the autonomous car fails for one reason or another (and they still do), the driver's reaction time is measured in seconds, which in most cases is far too late.
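    Those reaction times translate directly into distance traveled before the driver even begins to act (0.5 s is the focused-driver figure mentioned above; the 2.5 s distracted-driver figure is an illustrative assumption):

```python
speed_ms = 17.0  # roughly the 38 mph reported in this incident

# Distance covered during the reaction time alone, before any braking.
for label, reaction_s in [("focused driver", 0.5),
                          ("distracted driver", 2.5)]:
    distance_m = speed_ms * reaction_s
    print(f"{label}: {distance_m:.1f} m traveled before reacting")
```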

    The machine should have done better!

    The radar and lidar had most likely easily spotted, i.e. measured, the object (a pedestrian with a bicycle, as we humans now know), because it was within their range. Now comes the trickier part: the analyser had to work out what that object was and what its trajectory would be.

    The promise of autonomous cars is that they are better drivers than humans, not worse.

    Finally, who will be prosecuted? The human driver, Uber, Volvo, no one? (Okay, not the victim, anyway.) In many countries the human is always the responsible driver, whatever the autonomous car's autonomy level. But as stated earlier, it is almost inhuman to expect a person to stay focused, kilometre after kilometre, while the car drives smoothly, accurately and reliably. Should we then hold the car manufacturer liable? The biggest car manufacturer in Europe has said that it will insure all its autonomous cars and that this insurance will cover all accidents involving its robocars.

    So, what will happen now? Courts and lawyers will do their work, but what will this mean for robocar development? This will not be the last robocar accident in which a human is killed.

    It depends a little on who or what is found to be at fault; if it is the technology, then current robocar driving permits will most likely be suspended for a while, much as planes of the same model are grounded after a crash. In any case, this was not breaking news, not really news at all: it was inevitable, it was foreseen, it was just a matter of time. Now it has happened; the honeymoon is over.

  20. Tomi Engdahl says:

    Robo-Uber: What Went Wrong

    It's prudent, of course, to await reports by the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) before assigning blame for the self-driving Uber vehicle's fatal accident last Sunday night.

    But for the tech community, it is past time to start thinking about what could have prevented the autonomous car from killing a woman crossing the street.

    “VSI Labs believes that lidar would have been the most beneficial in this accident, although it is entirely possible that the radar and/or the camera could have picked up the pedestrian as well,” said Magney.
    Demler said, “At the distance we see in the video, all of the sensors should have picked up the woman crossing the street with her bike. As soon as she was illuminated by the headlights, which was one to two seconds before impact, the vehicle cameras should have detected her.”

    But considering that the accident happened in the dark, Demler said, “The radar and lidar should have picked her up first. At the reported 38 mph, the Volvo traveled approximately 17 m/s. Even short-range radar (SRR) has a range of up to ~100 m. For example, in a TI whitepaper, they describe a device with an unambiguous range of 80 m. That would give the vehicle four to five seconds to respond.”

    Demler added, “But the vehicle must have long-range radar (LRR), too. This Bosch LRR has a detection range of up to 250 m .”

    Furthermore, “There’s the lidar. The Uber Volvo has a rotating 64-element Velodyne lidar on its roof. That has a range of 120 m.”

  21. Tomi Engdahl says:

    Arizona governor suspends Uber from autonomous testing

    Arizona Gov. Doug Ducey suspended Uber’s self-driving vehicle testing privileges Monday in the wake of a pedestrian fatality in a Phoenix suburb last week.

    He said he expects public safety to be the top priority for those who operate self-driving cars.

    “The incident that took place on March 18 is an unquestionable failure to comply with this expectation,” Ducey said.

    The move by the Republican governor marks a major step back from his embrace of self-driving vehicles. He previously welcomed Uber and other autonomous vehicle companies to use Arizona as a place for testing under few, if any, regulations. In early March, he authorized self-driving vehicle companies to run tests without a person in the car to act as a safety operator.

  22. Tomi Engdahl says:

    Uber blocked from testing self-driving cars on Arizona roads

    Uber has been barred from testing its self-driving cars on public roads in Arizona following the accident last week involving one of its testing vehicles that killed a pedestrian crossing the street in its path. Arizona Governor Doug Ducey released a letter sent to Uber CEO Dara Khosrowshahi in which he described the accident as captured by onboard cameras as “disturbing and alarming.”

    The governor, who has been a strong proponent of self-driving testing in the state up until this point

  23. Tomi Engdahl says:

    Uber Disabled Volvo SUV’s Safety System Before Fatality

    Uber Technologies Inc. disabled the standard collision-avoidance technology in the Volvo SUV that struck and killed a woman in Arizona last week, according to the auto-parts maker that supplied the vehicle’s radar and camera.

    “We don’t want people to be confused or think it was a failure of the technology that we supply for Volvo, because that’s not the case,” Zach Peterson, a spokesman for Aptiv Plc, said by phone. The Volvo XC90’s standard advanced driver-assistance system “has nothing to do” with the Uber test vehicle’s autonomous driving system, he said.

    Aptiv is speaking up for its technology to avoid being tainted by the fatality involving Uber, which may have been following standard practice by disabling other tech as it develops and tests its own autonomous driving system. Uber’s system failed to slow the vehicle as 49-year-old victim Elaine Herzberg crossed the street pushing a bicycle.

  24. Tomi Engdahl says:

    Are Autonomous Vehicles Safe? Not Yet Apparently
    An Uber autonomous vehicle was recently involved in a fatal accident in Tempe, Ariz. So, what happens next?

    Murphy's Law says that "anything that can go wrong will go wrong," and it has finally caught up with autonomous vehicles. On Sunday night, March 18, an autonomous car operated by Uber, with an emergency backup driver behind the wheel, struck and killed a woman on a street in Tempe, Ariz. Hours after the crash, Uber announced the suspension of all tests of its autonomous vehicles in Pittsburgh, Phoenix, San Francisco, and Toronto.

  25. Tomi Engdahl says:

    Uber Ordered To Take Its Self-Driving Cars Off Arizona Roads

    After failing to meet an expectation that it would prioritize public safety as it tested its self-driving technology, Uber has been ordered to take its self-driving cars off Arizona roads. "The incident that took place on March 18 is an unquestionable failure to comply with this expectation," Ducey wrote.


  26. Tomi Engdahl says:

    First Uber, now Tesla?! The autonomous car industry is going to have some tough times ahead. :(

    Tesla fatal car crash prompts NTSB investigation

    The United States National Transportation Safety Board is conducting an investigation into a fatal car crash involving a Tesla Model X car. On March 23, a Tesla car crashed into a freeway divider, killing the driver, causing a fire and shutting down two lanes of Highway 101 near Mountain View, Calif. It’s not clear if Tesla’s automated control system, Autopilot, was active at the time of the crash, the NTSB said in a tweet

  27. Tomi Engdahl says:

    Nvidia’s Jensen Huang cautions patience in judging Uber AV engineers

    Nvidia CEO Jensen Huang faced a number of questions regarding Uber’s recent self-driving test vehicle accident, in which an SUV equipped with Uber’s autonomous technology

    Earlier on Tuesday, Reuters broke the news that Nvidia was suspending its own autonomous testing programs around the world. Huang didn’t address the suspension on stage

    “First of all, what happened is tragic and sad,”

    “It also is a reminder of exactly why we’re doing this.”

    Huang explained that in fact, as a result of the accident, he actually believes that investment will rise in self-driving system design, specifically because previously companies might have thought they could get away with meager or minimal investment in those areas, and instead will be realizing it’s the one area where they can’t compromise in favor of attempting to lower costs.

    “I think that the world is going to, as a result, be much more serious about investing in development systems, which is good,” he said.

    Huang said that Uber has engineers who are “intensely serious about what they do,” and said that he “wouldn’t judge them” until we have more information about what occurred with the accident. “We don’t know exactly what happened,” he said. “And we gotta give them the chance to go and understand for themselves.”

  28. Tomi Engdahl says:

    Uber will not reapply for self-driving car permit in California

    Uber, after suspending its self-driving car operations in all markets following a fatal crash, has decided not to reapply for its self-driving car permit in California. Uber’s current permit in California expires March 31.

    “We proactively suspended our self-driving operations, including in California, immediately following the Tempe incident,” an Uber spokesperson told TechCrunch.

    Uber’s decision not to reapply comes in tandem with a letter the DMV sent to Uber’s head of public affairs, Austin Heyworth, today.

    This also follows Arizona’s decision to suspend Uber’s self-driving cars in the state. In Arizona Governor Doug Ducey’s letter to Uber CEO Dara Khosrowshahi, Ducey said the video from the accident was “disturbing and alarming.”

  29. Tomi Engdahl says:

    EEVblog #1068 – Autonomous Uber Incident Update

    An update on the autonomous self driving Uber Volvo XC-90 involved in the pedestrian fatality.
    It is being reported that Uber disabled the Intel Mobileye collision avoidance sensor that is factory fitted in the Volvo XC90.
    Intel has run the dashcam footage of the accident through the Mobileye system and said that even with the dark footage it would have detected the pedestrian a second before the incident.

    EEVblog #1066 – Uber Autonomous Car Accident – LIDAR Failed?

  30. Tomi Engdahl says:

    Carolyn Said / San Francisco Chronicle:
    Uber says it will not renew its self-driving permit in California and that it has suspended testing in Pittsburgh, Toronto and San Francisco — Uber plans to end all self-driving-car testing in California, according to a letter from the state Department of Motor Vehicles to the company.

    Uber puts the brakes on testing robot cars in California after Arizona fatality

  31. Tomi Engdahl says:

    Devin Coldewey / TechCrunch:
    Nvidia says it will suspend self-driving tests globally, following Uber’s self-driving incident last week; stock down 9%+

    Nvidia suspends all autonomous vehicle testing

    Nvidia is temporarily stopping testing of its autonomous vehicle platform in response to last week’s fatal collision of a self-driving Uber car with a pedestrian. TechCrunch confirmed this with the company, which offered the following statement:

    “The accident was tragic. It’s a reminder of how difficult [self-driving car] technology is and that it needs to be approached with extreme caution and the best safety technologies. This tragedy is exactly why we’ve committed ourselves to perfecting this life-saving technology.”

    Likely someone pointed out that it wasn’t particularly charming to respond to a fatal system failure in an autonomous vehicle by saying that “ultimately” they’ll be safer, even if it’s true.

    Toyota also suspended its autonomous vehicle testing out of concern for its own drivers’ well-being. Uber of course ceased its testing operations at once.

  32. Tomi Engdahl says:

    Fatal driverless crash: Radar-maker says Uber disabled safety systems
    App biz refuses to comment – but it DID write the software

    Uber reportedly disabled safety systems on the autonomous Volvo XC90 that killed a pedestrian Stateside last week, according to the makers of the car’s sensors.

    “We don’t want people to be confused or think it was a failure of the technology that we supply for Volvo, because that’s not the case,” Zach Peterson, a spokesman for Aptiv, told Bloomberg.

    Uber declined to comment to The Register, though it did confirm that it wrote the software the car was running. Aptiv, a UK-based maker of car parts including radars and cameras, did not respond to our enquiries. The company was formerly known as Delphi Automotive. In 2015 Delphi stated, and later denied, that a vehicle it was using for self-driving technology trials was involved in a near-miss with a rival car operated by Google.

    Uber’s modified XC90s are fitted with front, side and rear-facing cameras “watching for braking vehicles, crossing pedestrians, traffic lights, and signage,” according to a document produced by the controversial taxi app firm’s Advanced Technologies Group. The cars are also fitted with a top-mounted LIDAR sensor with all-round coverage.

    Crash investigators are expected to focus on why the XC90’s sensor suite appeared to have failed to detect Herzberg crossing the road with her bike, particularly the LIDAR, since it uses lasers and can see in the dark.

    The relatively low-resolution camera footage released by police is unlikely to represent what the car’s LIDAR and radar packages should have picked up.

  33. Tomi Engdahl says:

    Nvidia CEO Jensen Huang clarifies Uber is not using its Drive platform

    While Uber makes use of Nvidia hardware in its own self-driving automotive technology, it does not employ Nvidia’s Drive autonomous computing platform, which includes the GPU maker’s own real-time sensor fusion, HD mapping and path planning. Nvidia CEO Jensen Huang shared this information today during a Q&A session attended by reporters at the company’s GPU Technology Conference in San Jose.

    “Uber does not use Nvidia’s Drive technology,” Huang said. “Uber develops their own sensing and drive technology.”

    Others running self-driving test programs on public roads, including Toyota Research Institute, have also paused. Some, however, including Waymo and Intel, have publicly declared that their own systems would not have failed where Uber’s did in this instance, and have continued their own testing programs on public roads.

  34. Tomi Engdahl says:

    Robocars: Time to Discuss Safety Validation

    I am not suggesting that anyone should just up and halt development or production of their autonomous vehicles. But it’s time for the industry to come together on robocars’ safety validation.

  35. Tomi Engdahl says:

    Exclusive: Arizona governor and Uber kept self-driving program secret, emails reveal

    A cozy relationship with governor Doug Ducey enabled an autonomous program with limited expert oversight – but governor denies it was ‘secret’

  36. Tomi Engdahl says:

    Uber settles with family of woman killed by self-driving car

    Elaine Herzberg, 49, died after being hit by the automated Uber in Tempe, Arizona

    The family of the woman killed by an Uber self-driving vehicle in Arizona has reached a settlement with the ride services company, ending a potential legal battle over the first fatality caused by an autonomous vehicle.

    Terms of the settlement were not given.

    Fall-out from the accident could stall the development and testing of self-driving vehicles, which are designed to eventually perform far better than human drivers and sharply reduce the number of motor vehicle fatalities.

    Uber and microchip developer Nvidia Corp have put self-driving car testing programs on hold following the fatality, which is believed to be the first death of a pedestrian struck by a self-driving vehicle.

    The fatality also presents an unprecedented liability challenge.

    Meanwhile, Nvidia Corp has sought to distance itself from Uber saying it does not use Nvidia’s self-driving platform architecture.

    The ride-hailing service uses Nvidia’s graphics processing units known as GPUs, its chief executive Jensen Huang said.

    “Uber does not use Nvidia drive technology. Uber develops its own sensing and drive technology,” Huang said.

    Nvidia’s platform is used by more than 370 companies developing self-driving technology, including automakers and robotaxi companies and makers of self-driving hardware, such as sensors.

    Nvidia’s shares have fallen by about 9.5% since the company said on Tuesday it was temporarily halting its self-driving tests on public roads out of respect for the victim in the 18 March crash in Tempe.

  37. Tomi Engdahl says:

    Nvidia, Marvell share driver’s seat

    Uber’s tragic fatal accident in Arizona put Nvidia’s autonomous vehicle tests on hold, but hasn’t dented the company’s long-term enthusiasm for the technology.

    Marvell Semiconductor this week announced its secure automotive Ethernet switch is to be integrated into the Nvidia DRIVE Pegasus Platform.

    The switch silicon, Marvell’s 88Q5050, includes features like trusted boot, deep packet inspection, and low power consumption.

    Nvidia pitches the 320-trillion-operations-per-second DRIVE Pegasus platform at “no human required” Level 5 autonomous operations.

    The Ethernet chip gives Pegasus DRIVE a secure connection to sensors, cameras, controllers and the like, with blacklisting and whitelisting on all Ethernet ports.

    Source: https://www.theregister.co.uk/2018/03/29/network_roundup_march_29/

  38. Tomi Engdahl says:

    Tesla Says Driver’s Hands Weren’t on Wheel at Time of Accident

    Driver died after vehicle hit a California highway barrier
    Tesla lost more than $5 billion in market value the past week

    Tesla Inc. said computer logs of the Model X vehicle involved in a fatal crash a week ago showed the driver didn’t have his hands on the steering wheel for six seconds before the accident.

    “The driver had received several visual and one audible hands-on warning earlier in the drive,” Tesla said. “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

    The collision occurred days after a separate Uber Technologies Inc. accident killed a pedestrian, raising fresh questions about self-driving features and sending ripples across the broader autonomous-vehicle industry.

  39. Tomi Engdahl says:

    Jensen Huang on the Uber Tragedy and Why Nvidia Suspended Testing

    The Uber tragedy earlier this month sent a shudder throughout the autonomous vehicle industry. Some of the companies working on the technology paused testing. Nvidia was one of them.

    “We suspended because there is obviously a new data point as a result of the accident,” said Nvidia CEO Jensen Huang. “As good engineers, we should simply wait to see if we can learn something from that experience. We don’t know that we would do anything different, but we should give ourselves time to see if we can learn from that incident. It won’t take long.”

  40. Tomi Engdahl says:

    Tesla says fatal crash involved Autopilot

    Tesla has provided another update to last week’s fatal crash. As it turns out, Tesla said the driver had Autopilot on with the adaptive cruise control follow-distance set to minimum. However, it seems the driver ignored the vehicle’s warnings to take back control.

    The promise of Tesla’s Autopilot system is to reduce car accidents. In the company’s blog post, Tesla notes Autopilot reduces crash rates by 40 percent, according to an independent review by the U.S. government. Of course, that does not mean the technology is perfect in preventing all accidents.


  41. Tomi Engdahl says:

    Tesla Driver Died Using Autopilot, With Hands Off Steering Wheel

    A week after crash, company says it recovered computer logs
    Driver received several visual, one audible ‘hands-on’ warning

    Tesla Inc. confirmed the Model X driver who died in a gruesome crash a week ago was using Autopilot and defended the safety record of its driver-assistance system that’s back under scrutiny following a fatality.

  42. Tomi Engdahl says:

    Uber’s fatal self-driving car accident reactions and fallout

    First, the details of the accident, which is believed to be the first self-driving car accident that has resulted in a pedestrian fatality. NBC News reported that the vehicle, a gray 2017 Volvo XC90 SUV, had an operator in the driver’s seat and was traveling at about 40 miles per hour in autonomous mode when it struck Herzberg, who was walking her bicycle across the street—outside the crosswalk—late at night. Onboard video footage of the accident was released by Tempe police.


    Here is where all the speculation, reactions, and strong opinions on the issue come into play. Missy Cummings, an engineering professor and director of the Humans and Autonomy Laboratory at Duke University, told CNN that the video released of the accident shows that the autonomous technology fell short of how it should be designed to perform.

    Others, such as Matthew Johnson-Roberson, an engineering professor at the University of Michigan who works with Ford Motor Co. on autonomous vehicle research, are more confident that the problem here likely lies in the autonomous vehicle software.

    “The real challenge is you need to distinguish the difference between people and cars and bushes and paper bags and anything else that could be out in the road environment,” he told Bloomberg. “The detection algorithms may have failed to detect the person or distinguish her from a bush.”

    Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University who works on autonomous vehicles, offered a similar sentiment in the Bloomberg article, noting that LiDAR actually functions better in the dark because the glare of sunshine can sometimes create interference. Because of this, the LiDAR would certainly have detected an obstacle, and any shortcoming would likely be ascribed to classification software “because it was an interesting combination of bicycle, bags and a pedestrian standing stationary on the median,” he said. Had the software recognized a pedestrian standing close to the road, he added, “it would have at least slammed on the brakes.”
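    The detection-versus-classification distinction Rajkumar describes can be illustrated with a minimal sketch. This is hypothetical code, not Uber’s actual stack; the names, the 7 m/s² deceleration figure and the 5 m safety margin are assumptions chosen for illustration. The point is the design choice: a path-intersecting obstacle triggers braking even when the classifier returns “unknown”, rather than being ignored until it is confidently labeled a pedestrian.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Detection:
        distance_m: float   # range to the obstacle along the planned path
        in_path: bool       # does the obstacle intersect the planned path?
        label: str          # classifier output, e.g. "pedestrian", "unknown"
        confidence: float   # classifier confidence in [0, 1]

    def should_brake(det: Detection, speed_mps: float, decel_mps2: float = 7.0) -> bool:
        """Brake if a path-intersecting obstacle is inside stopping distance.

        An "unknown" or low-confidence label is treated exactly like a
        pedestrian -- the conservative choice the quoted experts argue for.
        """
        if not det.in_path:
            return False
        # v^2 / (2a): distance needed to stop from the current speed
        stopping_distance = speed_mps ** 2 / (2 * decel_mps2)
        margin = 5.0  # extra safety buffer in metres (assumed value)
        return det.distance_m <= stopping_distance + margin

    # ~40 mph is about 17.9 m/s; stopping distance is about 22.9 m at 7 m/s^2.
    det = Detection(distance_m=25.0, in_path=True, label="unknown", confidence=0.3)
    print(should_brake(det, speed_mps=17.9))  # True: brake even for "unknown"
    ```

    Under this policy the classifier’s indecision between “bicycle, bags and a pedestrian” would not matter: anything solid in the path inside stopping distance triggers the brakes.
    
    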

    In terms of what this means going forward, IEEE Spectrum raises an excellent point. Through this tragic accident, the autonomous vehicle industry can potentially learn from and improve on its failures, to decrease or remove the likelihood that something like this could ever happen again. But for now, for anyone who was already skeptical of the idea of driverless cars on our public roads, especially, of course, those outside the industry, the level of trepidation has officially been raised.

  43. Tomi Engdahl says:

    Nvidia’s CEO on Uber, AI, and Moore
    Robocar road tests suspended

    Nvidia’s chief executive faced questions on Uber, China, cryptocurrencies, and more at the company’s annual GTC event here. Like most CEOs, Jensen Huang was upbeat and often turned his answers to favorite topics such as the company’s work in AI.

    The event drew 8,500 registrants interested in hearing about the latest in GPU computing, especially in AI and self-driving cars.

    The most sensitive questions focused on a pedestrian killed by a self-driving Uber car. Nvidia joined companies including Toyota and Volvo in suspending road tests of its robocars, but its timing was awkward. The GPU company announced its suspension a week after the accident, the day of Huang’s keynote here.

