Tesla’s Autopilot being investigated by the government following fatal crash | Ars Technica

http://arstechnica.com/cars/2016/06/teslas-autopilot-being-investigated-by-the-government-in-a-fatal-crash/

This unfortunate accident could affect the development of self-driving cars.

24 Comments

  1. Tomi Engdahl says:

    Driver killed in Tesla Autopilot crash filmed a close call earlier this year
    http://www.theverge.com/2016/6/30/12072634/tesla-autopilot-crash-autonomous-mode-viral-video

    The driver killed in the first known fatal crash of a Tesla vehicle in the semi-autonomous Autopilot mode had previously posted a viral video of his Model S avoiding an accident with Autopilot engaged.

    The video even garnered a tweet and a link from Elon Musk.

    https://www.youtube.com/watch?feature=youtu.be&v=9I5rraWJq6E

    Reply
  2. Tomi Engdahl says:

    Man killed in gruesome Tesla autopilot crash was saved by his car’s software weeks earlier
    Probe launched after ex-Navy SEAL, 40, dies in semi smash
    http://www.theregister.co.uk/2016/06/30/tesla_autopilot_crash_leaves_motorist_dead/

    An investigation was launched today after the driver of a Tesla was killed in what is understood to be a malfunction of the car’s autopilot.

    Joshua Brown, a 40-year-old Ohio man, was killed on May 7 while driving his 2015 Tesla Model S on Route 27 in Florida. The car was using the optional autopilot system, which controls the luxury sedan’s speed and distance from objects, when both he and the computer system failed to spot an 18-wheel tractor trailer on the road.

    “The vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S,” Tesla said in a statement on Thursday.

    “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

    A Tragic Loss
    https://www.teslamotors.com/blog/tragic-loss

    Reply
  3. Tomi Engdahl says:

    Jordan Golson / The Verge:
    US NHTSA opens investigation into fatal accident involving Tesla’s Autopilot mode; Tesla says it’s the first fatality in 130M+ miles driven with Autopilot — A Tesla Model S with the Autopilot system activated was involved in a fatal crash, the first known fatality in a Tesla where Autopilot was active.

    Tesla driver killed in crash with Autopilot active, NHTSA investigating
    http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s

    A Tesla Model S with the Autopilot system activated was involved in a fatal crash, the first known fatality in a Tesla where Autopilot was active. The company revealed the crash in a blog post posted today and says it informed the National Highway Traffic Safety Administration (NHTSA) of the incident, which is now investigating.

    The accident occurred on a divided highway in central Florida when a tractor trailer drove across the highway perpendicular to the Model S. Neither the driver — who Tesla notes is ultimately responsible for the vehicle’s actions, even with Autopilot on — nor the car noticed the big rig or the trailer “against a brightly lit sky” and brakes were not applied.

    In a tweet, Tesla CEO Elon Musk said that the vehicle’s radar didn’t help in this case because it “tunes out what looks like an overhead road sign to avoid false braking events.”

    “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.”

    Reply
  4. Tomi Engdahl says:

    Wall Street Journal:
    First known fatality involving an autonomous vehicle opens the door for regulatory oversight

    Scant Oversight of Self-Driving Technology
    First known fatality involving autonomous vehicle opens a door that has largely been closed
    http://www.wsj.com/article_email/teslas-autopilot-flew-under-regulators-radar-1467402007-lMyQjAxMTE2MzA4MTIwMzE5Wj

    Last October, Tesla Motors Inc. Chief Executive Elon Musk heralded the arrival of the company’s autonomous-driving technology, inviting owners of its electric cars to download software that let the autos operate themselves under certain conditions.

    His message to owners on its website: “Your Autopilot has arrived.”

    Auto-safety regulators, meanwhile, were relatively silent on the technology even though many experts viewed Tesla’s program as the most aggressive self-driving system on U.S. roads. The National Highway Traffic Safety Administration, embroiled in managing a sharp increase in safety recalls, including tens of millions of rupture-prone air bags, lacks authority to approve or disapprove of the advanced technology or meaningfully slow its deployment.

    Instead, car-safety regulators were forced to wait until a major mishap before significantly addressing Tesla’s Autopilot system.

    The May 7 fatal crash in Florida that killed 40-year-old Joshua Brown when his Tesla Model S drove under the trailer of an 18-wheel semi truck turning in front of the car offers NHTSA officials their first significant chance to flex regulatory muscle.

    Seven weeks later, NHTSA opened an initial investigation that, depending on the findings, could result in a recall of 25,000 Tesla vehicles and pressure to change the software.

    “NHTSA has no premarket regulatory authority,” David Strickland, a former agency head who now represents auto makers pursuing driverless technologies at law firm Venable LLP, said on Friday. “The only thing the agency can do is make a decision whether the vehicle is noncompliant with the existing federal motor vehicle safety standards.”

    NHTSA is readying new guidelines for autonomous-vehicle developers

    To protect drivers, Tesla offered disclosures and a three-step warning system to keep occupants attentive even as the car drove itself. The system received critical acclaim, with a Norwegian publication finding Autopilot was far more effective at reliably operating itself compared with Daimler AG ’s suite of similar technologies on its Mercedes-Benz vehicles.

    Reply
  5. Tomi Engdahl says:

    Tesla Crashes BMW-Mobileye-Intel Event
    Recent auto-pilot fatality casts pall
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1330034&

    The three-way partnership event in Munich announcing the development of a standard platform for autonomous driving was somewhat somber and low energy.

    Ironically, Tesla’s first fatal crash — involving a Tesla driver using his car’s auto-pilot mode — triggered one of the first questions asked at the BMW/Mobileye/Intel press conference Friday (July 1).

    The accident, revealed Thursday, June 30, and currently under investigation by U.S. auto-safety regulators, is likely to direct more scrutiny toward autonomous driving technology, which thus far has evolved with little oversight.

    Why Intel Got Inside BMW-Mobileye Deal
    Push for ‘industry standard’ for autonomous cars
    http://www.eetimes.com/document.asp?doc_id=1330031&

    Reply
  6. Tomi Engdahl says:

    Fatal Tesla crash spurs criticism of on-the-road beta testing
    http://www.autonews.com/article/20160702/OEM11/160709981/fatal-tesla-crash-spurs-criticism-of-on-the-road-beta-testing?X-IgnoreUserAgent=1&X-IgnoreUserAgent=1

    WASHINGTON/SAN FRANCISCO — Tesla Motors Inc. says the self-driving feature suspected of being involved in a May 7 fatal crash is experimental, yet it’s been installed on all 70,000 of its cars since October 2014.

    For groups that have lobbied for stronger safety rules, that’s precisely what’s wrong with U.S. regulators’ increasingly anything-goes approach.

    “Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment,” said Jackie Gillan, president for Advocates for Highway and Auto Safety, a longtime Washington consumer lobbyist who has helped shape numerous auto-technology mandates. “This is going to happen again and again and again.”

    Tesla’s use of technology still in development

    The National Highway Traffic Safety Administration said Thursday that it is investigating the crash, which comes as the regulator says it is looking for ways to collaborate with the industry.

    NHTSA is expected to announce guidelines as soon as this month that will set some parameters for self-driving cars on U.S. roads.

    “We’re crafting a Declaration of Independence, not a Constitution,”

    In the Florida crash, Tesla’s “Autopilot” semi-autonomous driving feature failed to detect the white side of the tractor trailer against a brightly lit sky, so it didn’t hit the brakes, according to the company.

    In a blog post, Tesla said the May accident was the first known fatality in more than 130 million miles of Autopilot driving. That compares with one fatality in every 94 million miles among all U.S. vehicles, according to Tesla.

    In fact, highway deaths are on the rise.

    “Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” the company said.

    Reply
  7. Tomi Engdahl says:

    Tesla Autopilot 2.0 Is Coming This Year, Source Confirms
    https://hardware.slashdot.org/story/16/07/05/2138250/tesla-autopilot-20-is-coming-this-year-source-confirms

    A source close to Tesla Motors confirmed to TechnoBuffalo that Tesla Autopilot 2.0 is coming soon. Other media outlets like Teslarati have reported on prototype Model S and Model X vehicles operating in the wild sporting two forward-facing cameras, which may indicate part of the new hardware necessary to take advantage of Autopilot 2.0’s additional features. “The dual camera system is capable of recognizing and reacting to stop signs and traffic lights with no driver input,”

    U.S. regulators are actively investigating 25,000 Tesla Model S cars after a fatal crash involving a vehicle using the “Autopilot” mode was reported.

    Reply
  8. Tomi Engdahl says:

    Second Tesla Autopilot Crash Under Review By US Regulators
    https://yro.slashdot.org/story/16/07/06/2321228/second-tesla-autopilot-crash-under-review-by-us-regulators

    The Wall Street Journal and many other publications are reporting that U.S. auto-safety regulators are currently reviewing a second crash that occurred while Tesla’s Autopilot mode was active. The Detroit Free Press reports that a Michigan art gallery owner told police that he survived a rollover crash that happened when his Tesla Model X was in self-driving mode last Friday.

    A Second Self-Driving Tesla Crash Is Reported
    http://time.com/money/4395345/tesla-self-driving-second-crash/

    Are drivers relying too much on technology?


    Tesla’s plans for self-driving cars took another hit on Wednesday, when reports came out that a second one of the company’s cars crashed while being tested in autopilot mode.

    According to the Detroit Free Press, a Michigan art gallery owner told police that he survived a rollover crash that happened when his Tesla Model X was in self-driving mode last Friday.

    Reply
  9. Tomi Engdahl says:

    Bloomberg:
    National Transportation Safety Board opens investigation into fatal Tesla crash to see whether it reveals any systemic issues with driverless car technology

    Driver Automation to Be Scrutinized in NTSB Probe of Tesla Crash
    http://www.bloomberg.com/news/articles/2016-07-08/driver-automation-to-be-scrutinized-in-ntsb-probe-of-tesla-crash

    For years, U.S. investigators have been calling for more automation on motor vehicles, such as sensors that slam on the brakes to prevent a crash.

    At the same time, the National Transportation Safety Board, in its probes of transportation mishaps, has warned that such devices may also have a downside: the technology can confuse operators if it’s poorly designed, or lead to complacency that breeds its own hazards.

    “It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible,” O’Neil said.

    The crash was the first with a known fatality in more than 130 million miles of Autopilot driving, according to the carmaker.

    Reply
  10. Tomi Engdahl says:

    Tesla reportedly eyes brakes in fatal Model S crash
    http://www.autoblog.com/2016/07/31/tesla-autopilot-crash-auto-braking-report/

    Tesla is considering two possible scenarios that would explain the fatal Model S crash in Florida, and according to Reuters and The New York Times, neither is about Autopilot. During a meeting with the US Senate Commerce Committee, the automaker reportedly presented two theories. First is the possibility that the car’s automatic emergency braking system’s camera and radar didn’t detect the incoming truck at all. The other theory is that the braking system’s radar saw the truck but thought it was part of a big structure, such as a bridge or a building. It’s programmed to ignore huge structures to prevent false braking, after all.

    If you’ll recall, the Model S in this incident collided with a tractor trailer while Autopilot was on.

    It’s worth noting that the automaker considers its braking system a separate entity from Autopilot, which is in charge of steering and changing lanes.

    Reply
  11. Tomi Engdahl says:

    Tesla Owner In China Blames Autopilot For Crash
    https://tech.slashdot.org/story/16/08/10/2345213/tesla-owner-in-china-blames-autopilot-for-crash

    The owner of a Tesla Motors Model S sedan in China reportedly said his vehicle crashed into a car on the side of the road while the vehicle’s Autopilot system was engaged, but the automaker said the driver was using the system improperly. Luo Zhen, 33, of Beijing told Reuters that his vehicle collided with a parked car on the left side of a highway, damaging both vehicles but injuring no one.

    Tesla owner in China blames Autopilot for crash
    http://www.usatoday.com/story/money/cars/2016/08/10/tesla-model-s-autopilot-china-crash/88510532/

    Reply
  12. Tomi Engdahl says:

    NTSB: Tesla in fatal crash was speeding with Autopilot on
    http://www.computerworld.com/article/3100650/car-tech/ntsb-tesla-in-fatal-crash-was-speeding-with-autopilot-on.html#tk.rss_news

    The Tesla’s Autosteer lane-keeping assistance and traffic-aware cruise control system was engaged

    The National Transportation Safety Board (NTSB) today released a preliminary report that details the circumstances of the fatal accident involving a Tesla Model S driving with its Autopilot engaged.

    The accident, which took place May 7 in Williston, Fla., was the first known fatal crash involving a vehicle using an advanced driver assistance system (ADAS) based on computer software, sensors, cameras and radar. The all-electric Tesla hit the side of an 18-wheeler that turned left in front of it. The impact sheared away the roof of the Model S and killed Joshua Brown, 40, of Canton, Ohio.

    Tesla has been at the center of a media feeding frenzy since it revealed the accident

    The NTSB report stated that Tesla system performance data downloaded from the car indicated that the Model S’ speed just before impact was 74 mph, nine miles an hour over the speed limit on the four-lane, divided highway. The Autopilot’s Autosteer lane-keeping assistance and traffic-aware cruise control system was engaged at the time of the crash.

    “The car was also equipped with automatic emergency braking that is designed to automatically apply the brakes to reduce the severity of or assist in avoiding frontal collisions,” the report said.

    As the result of the impact with the truck, the Tesla’s battery “disengaged from the electric motors that were powering the car. After exiting from underneath the semitrailer, the car coasted at a shallow angle off the right side of the roadway, traveled approximately 297 feet and then collided with a utility pole,” the report said. “The car broke the pole and traveled an additional 50 feet.”

    Tesla said that neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brakes were not applied.

    The accident remains under investigation.

    “NTSB investigators will continue to analyze [the] data, and use it along with other information collected during the investigation in evaluating crash events,”

    While Tesla’s Autopilot ADAS uses cameras and radar, the carmaker hasn’t yet embraced LiDAR (Light Detection and Ranging) technology, which uses lasers to create 3D scans of objects around a vehicle.

    Reply
  13. Tomi Engdahl says:

    Self-driving Tesla SUV saves Branson man’s life
    http://www.ky3.com/content/news/Self-driving-Tesla-SUV-saves-the-day-389392262.html

    BRANSON, Mo. (KY3) The future is coming, actually, it just drove into the driveway.

    We’re talking about a Tesla Model X.

    He just bought a high-tech Tesla Model X a couple weeks ago.

    “It’s the ultimate gadget. It’s the coolest technology I’ve ever seen I would say, let alone owned,” says Neally.

    “I would characterize it as the ultimate cruise control,” Neally explains about the autopilot.

    Neally says the car allows him to take his hands off the wheel for up to four minutes at a time. Then the car lets him know he needs to take the wheel, at least briefly, or else it looks for a spot to pull over and stop on the side of the road.

    In May, a man driving a Tesla on autopilot was killed in a much-publicised crash in Florida.

    It cast doubt on the capabilities of the technology.

    Then, just a week after Josh bought his space ship on wheels, he was in the middle of his commute home for his daughter’s birthday, when he felt a sudden pain.

    “It was kinda getting scary. I called my wife and just said ‘somethings wrong’ and I couldn’t breathe, I was gasping, kind of hyperventilating,” Neally says he was writhing in pain, and totally distracted from driving.

    “I just knew I had to get there, to the ER,” says Neally.

    So he trusted the self-driving Tesla to stay on the road, until it got near a hospital. Josh was able to drive himself the last couple of blocks to the ER.

    Now Neally says he’s recovered and is receiving treatment for the issue.

    Neally says the Tesla may have saved his life; it certainly helped.

    Reply
  14. Tomi Engdahl says:

    Darrell Etherington / TechCrunch:
    Tesla announces Autopilot 8.0, which relies more on radar to avoid accidents like Model S crash, available soon for cars from 2014 and on via free OTA upgrade — Today, Tesla revealed Version 8 of its Autopilot software, going live in one to two weeks. Version 8.0 includes updates …

    Tesla Autopilot 8.0 uses radar to prevent accidents like the fatal Model S crash
    https://techcrunch.com/2016/09/11/tesla-autopilot-8-0-uses-radar-to-prevent-accidents-like-the-fatal-model-s-crash/

    Today, Tesla revealed Version 8 of its Autopilot software, going live in one to two weeks. Version 8.0 includes updates to the signal processing tech used to interpret images received from the onboard radar. The update focuses primarily on the radar component of the Autopilot sensor system, turning it from a supplementary part of the overall tech, designed to complement the cameras, into a primary control sensor that, according to Elon Musk himself, should prevent accidents like the one that resulted in Josh Brown’s death.

    “We’re making much more effective use of radar,” Musk explained on a press call regarding the updates. “We weren’t confident that we could resolve false positives where the radar would think that it should brake, but it shouldn’t.”

    Reply
  15. Tomi Engdahl says:

    Elon Musk Says Tesla New Autopilot Features Would Have Prevented Recent Death
    https://tech.slashdot.org/story/16/09/12/1347217/elon-musk-says-tesla-new-autopilot-features-would-have-prevented-recent-death

    Tesla Motors Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system Autopilot with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May. Musk said the update, which will be available within a week or two through an “over-the-air” software update, would rely foremost on radar to give Tesla’s electric luxury cars a better sense of what is around them and when to brake.

    Elon Musk Says Tesla’s New Autopilot Likely Would Have Prevented Death
    http://fortune.com/2016/09/12/elon-musk-tesla-new-autopilot-death/

    The new Autopilot will be available within a week or two through an “over-the-air” software update.

    Tesla Motors Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system Autopilot with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May.

    New restrictions of Autopilot 8.0 are a nod to widespread concerns that the system lulled users into a false sense of security through its “hands-off” driving capability. The updated system now will temporarily prevent drivers from using the system if they do not respond to audible warnings to take back control of the car.

    “We’re making much more effective use of radar,” Musk told journalists on a phone call. “It will be a dramatic improvement in the safety of the system done entirely through software.”

    The National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla’s Autopilot system since June because of the fatal accident. The agency had been briefed on the changes by Tesla TSLA 2.11% and would review them, spokesman Bryan Thomas said.

    Musk said it was “very likely” the improved Autopilot would have prevented the death of Brown, whose car sped into the trailer of a truck crossing a highway, but he cautioned that the update “doesn’t mean perfect safety.”

    “Probability of Safety”

    “Perfect safety is really an impossible goal,” Musk said. “It’s about improving the probability of safety. There won’t ever be zero fatalities, there won’t ever be zero injuries.”

    One of the main challenges of using cameras and radars for a braking system is how to prevent so-called false positives

    “Anything metallic or dense, the radar system we’re confident will be able to detect that and initiate a braking event,” he said.

    The revised system will sound warnings if drivers take their hands off the wheel for more than a minute at speeds above 45 miles per hour (72 kph) when there is no vehicle ahead, Musk said.

    If the driver ignores three audible warnings in an hour, the system will temporarily shut off until it is parked.
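    The warning scheme described above can be sketched as a simple state check. This is purely an illustration of the behavior as reported; the real Autopilot implementation is not public, so the function and variable names here are invented, and only the thresholds come from the article.

```python
# Hypothetical sketch of the Autopilot 8.0 hands-off warning rules as described
# above. Names are illustrative; only the thresholds come from the article.

SPEED_THRESHOLD_MPH = 45   # warnings apply above 45 mph with no vehicle ahead
HANDS_OFF_LIMIT_S = 60     # warn after one minute of hands-off driving
MAX_IGNORED_WARNINGS = 3   # three ignored warnings in an hour -> lockout

def autopilot_state(speed_mph, vehicle_ahead, hands_off_s, ignored_warnings):
    """Return (engaged, warn) under the described warning scheme."""
    if ignored_warnings >= MAX_IGNORED_WARNINGS:
        return (False, False)  # shut off until the car is parked
    warn = (speed_mph > SPEED_THRESHOLD_MPH
            and not vehicle_ahead
            and hands_off_s > HANDS_OFF_LIMIT_S)
    return (True, warn)
```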

    Reply
  16. Tomi Engdahl says:

    Questions About Tesla Autopilot Safety Hit Stone Wall
    http://www.eetimes.com/document.asp?doc_id=1330477&

    A fatal accident in China thrust Tesla’s transparency into sharp focus this week, posing fresh and daunting questions as to how safe Tesla’s Autopilot really is.

    New reports surfaced this week in China about a crash that killed a 23-year-old occupant while driving a Tesla Model S in Handan, a city about 300 miles south of Beijing.

    This took place on January 20, 2016 — four months before Joshua Brown died in Florida, in a Tesla Model S on Autopilot.

    The Chinese government news channel CCTV reported that the Chinese driver, Gao Yaning, borrowed his father’s Tesla Model S. He was driving on the highway, when his car hit a street-sweeper truck on the side of the road at highway speed.

    CCTV showed video footage of the accident captured by the Tesla Model S driver’s dash camera.

    The police found no sign that the vehicle applied the brakes before hitting the truck. Local media reported that the Autopilot was engaged at the time of the accident.

    That crash, according to the Chinese reports, was under investigation for the first half of this year, the result of which is a lawsuit filed in July by the victim’s family against Tesla China.

    Tesla’s credibility and transparency in question
    If reports are true, China’s Tesla fatality in January presents a problem for Tesla.

    Reply
  17. Tomi Engdahl says:

    Are Tesla Crashes Balanced Out By The Lives That They Save?
    https://tech.slashdot.org/story/16/11/13/2245224/are-tesla-crashes-balanced-out-by-the-lives-that-they-save

    Friday EE Times shared the story of a Tesla crash that occurred during a test drive. “The salesperson suggested that my friend not brake, letting the system do the work. It didn’t…” One Oregon news site even argues autopiloted Teslas may actually have a higher crash rate.

    Tesla’s own numbers show Autopilot has higher crash rate than human drivers
    http://katu.com/news/auto-matters/teslas-own-numbers-show-autopilot-has-higher-crash-rate-than-human-drivers

    A couple of weeks ago, I wrote about Tesla’s claim that its Autopilot driver-assistance software is safer than a human driver.

    After a fatal Autopilot crash last May, the company said the death was the first in 130 million miles of Autopilot driving—and noted that, “among all vehicles, in the U.S., there is a fatality every 94 million miles.”

    The clear implication: Autopiloted Teslas are safer than human-piloted cars, and lives would be saved if every car had Autopilot.

    But Tesla’s statistics are questionable at best. The small sample size—one crash—makes any calculation of Autopilot fatality rate almost meaningless.

    Furthermore, Tesla compared its Autopilot crash rate to the overall U.S. traffic fatality rate—which includes bicyclists, pedestrians, buses and 18-wheelers.

    A better yardstick for comparison is the fatality rate for U.S. drivers of cars and light trucks compiled by the Insurance Institute for Highway Safety.

    By that yardstick, the Tesla Autopilot driver fatality rate is almost four times higher than typical passenger vehicles.

    Reply
  18. Tomi Engdahl says:

    Adding Some Statistical Perspective To Tesla Autopilot Safety Claims
    http://www.forbes.com/sites/samabuelsamid/2016/07/05/adding-some-statistical-perspective-to-tesla-autopilot-safety-claims/#70f1f2a2f8f6

    “Figures never lie, but liars always figure.”

    “Tell me which side of the argument you are on and I will give you the statistics to prove you are right.”

    Variations of those quotes have been around for ages and the origins are debatable. However, there is a great deal of truth to both idioms. Whether discussing unemployment numbers, economic growth, the latest poll numbers or safety claims, the first thing you must always do before accepting the data is to understand the question that was asked to get that data. Subtle changes in the question can have a huge impact on the results.

    Based on Tesla’s statements we can assume the 130 million miles is the total of all miles traveled in Autopilot mode by all Model S and X vehicles globally since the update was released in October 2015. If we assume that, we must also assume there have been no fatal accidents in other parts of the world that we don’t know about yet. Given the amount of telemetry that Tesla collects from their vehicles, let’s give them the benefit of the doubt on this one. So one fatality in 130 million miles stands for the moment.

    How about the one fatality every 94 million miles in the United States? The best source for such data is the Department of Transportation’s Fatality Analysis Reporting System (FARS) which compiles accident data from state and local agencies nationwide.

    In 2014, Americans traveled 3.026 trillion miles on the road and a total of 32,675 people died along the way. That actually works out to one death every 92.6 million miles.

    That last part is important because the FARS data includes all traffic deaths, those in cars and trucks as well as those riding motorized or pedal cycles and pedestrians struck by a vehicle. As far as we know, no Autopilot equipped vehicle has struck and killed a pedestrian or cyclist. So Tesla’s comparison is actually looking at two quite different data sets. In 2014, 4,586 motorcyclists and 5,813 pedestrians/cyclists were killed.

    That leaves 22,276 vehicle occupants (drivers and passengers) that died. This latter set are probably the ones we should be comparing to Tesla’s one death in 130 million miles

    Based on that statistic, humans are actually better drivers than computers. However, even that isn’t necessarily a valid comparison.
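    The arithmetic above can be checked directly from the figures the article quotes; a quick Python sketch, using only those numbers:

```python
# Verify the per-mile fatality arithmetic using only figures quoted in the article.
total_miles = 3.026e12        # U.S. vehicle miles traveled in 2014 (FARS)
all_deaths = 32_675           # all 2014 U.S. traffic deaths
motorcyclists = 4_586
pedestrians_cyclists = 5_813

# All traffic deaths: roughly one every 92.6 million miles.
miles_per_death_all = total_miles / all_deaths

# Vehicle occupants only: excluding motorcyclists and pedestrians/cyclists
# leaves 22,276 deaths, or roughly one every 135.8 million miles -- a longer
# interval than Tesla's one fatality in ~130 million Autopilot miles.
occupant_deaths = all_deaths - motorcyclists - pedestrians_cyclists
miles_per_death_occupants = total_miles / occupant_deaths

print(round(miles_per_death_all / 1e6, 1))        # 92.6
print(occupant_deaths)                            # 22276
print(round(miles_per_death_occupants / 1e6, 1))  # 135.8
```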

    Reply
  19. Tomi Engdahl says:

    Another Tesla Crash, What It Teaches Us
    Clash between Tesla’s prudence and human nature
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1330813&

    Tesla crashed on a test drive while AutoPilot was engaged. Nobody got hurt. But the minor incident gives us plenty to think about.

    Earlier this week, I came across a report about a Tesla AutoPilot crash. It appeared on Tesla Motors Club’s site, posted by a Tesla fan planning to purchase a car.

    Thankfully, nobody got hurt. This post got no traction in the media.

    This could have been easily filed under the rubric, “minor accidents,” the sort of news we all ignore.

    However, this accident, and moreso, subsequent discussions in the Tesla Motors Club forum, intrigued me for two reasons.

    First, it’s a reminder that it ain’t easy to curb drivers’ appetite to “test the limits” of so-called AutoPilot, despite the carmaker’s stern warnings.

    The key case in point is Tesla’s first fatal accident, which took place in Florida last May. After splurging on such a well-regarded, expensive automobile, who wouldn’t want to test its limits in driving and brag about it? The inevitable result is a clash between Tesla’s prudence and human nature.

    Second, AutoPilot is still in the making. New technologies keep emerging, allowing the automaker to continue to improve it via software updates.

    I was amazed to see so many posts by other Tesla users — all apparently very knowledgeable. They discussed the limitations of radar, problems AutoPilot has handling hills, and the differences between software versions 7.1 and 8.0.

    If this isn’t “inside baseball,” what is? I’d hate to think that an average driver needs to do this much homework to really understand why AutoPilot just doesn’t work in certain situations and why it isn’t autonomous.

    Tesla issued a statement:

    “After speaking with our customer and the Tesla employee, we determined that this accident was the result of a miscommunication between the individuals inside the car.”

    The accident happened in the Los Angeles area. The vehicle that crashed on a test drive was running software version 8.0.

    As expressed in the statement above, Tesla stressed that this accident was not the result of the driver intentionally testing the limits of AutoPilot, but happened because of a miscommunication inside the car.

    To be clear, the radar was already added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but it was only meant to be a supplementary sensor to the primary camera and image processing system, the company explained.

    At the time of the version 8.0 release, Tesla made it clear that it had changed its mind. “We believe radar can be used as a primary control sensor without requiring the camera to confirm visual image recognition.”

  20. Tomi Engdahl says:

    Darrell Etherington / TechCrunch:
    NHTSA closes probe into June 2016 Tesla crash, clearing Autopilot of fault and praising its safety features, including a ~40% drop in crashes since introduction — The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation …

    NHTSA’s full final investigation into Tesla’s Autopilot shows 40% crash rate reduction
    https://techcrunch.com/2017/01/19/nhtsas-full-final-investigation-into-teslas-autopilot-shows-40-crash-rate-reduction/

    The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation into last year’s fatal crash involving a driver’s use of Tesla’s semi-autonomous Autopilot feature. The report clears Tesla’s Autopilot system of any fault in the incident, and in fact at multiple points within the report praises its design in terms of safety, and highlights its impact on lowering the number of traffic incidents involving Tesla vehicles overall.

    NHTSA notes that crash rates involving Tesla cars have dropped by almost 40 percent since the wide introduction of Autopilot.
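    The "almost 40 percent" figure follows from the crash rates widely cited from NHTSA's report: roughly 1.3 airbag-deployment crashes per million miles before Autosteer installation versus 0.8 after. A quick check of the arithmetic:

    ```python
    # Crash rates (airbag-deployment crashes per million miles) as cited
    # from NHTSA's investigation report: before vs. after Autosteer.
    before = 1.3
    after = 0.8

    reduction = (before - after) / before
    print(f"Relative reduction: {reduction:.1%}")  # Relative reduction: 38.5%
    ```

    38.5 percent is what gets rounded to "almost 40 percent" in the coverage.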

    It’s essentially as good a result as Tesla could have hoped for from the U.S. traffic safety agency’s investigation, which took place over the last six months. Reuters reported earlier on Thursday that the investigation would not result in a recall of Tesla vehicles, but the full findings show that, in fact, the federal regulatory body found plenty to praise while conducting its inquiry.

    The investigation does conclude with a minor admonition that Tesla could perhaps be more specific about the limitations of its driver-assist features.

    U.S. traffic safety agency to close Tesla Autopilot investigation without recall request
    https://techcrunch.com/2017/01/19/u-s-traffic-safety-agency-to-close-tesla-autopilot-investigation-without-recall-request/

    The U.S. National Highway Traffic Safety Administration (NHTSA) will close the investigation it began six months ago into a driver death that occurred while using Tesla’s Autopilot highway semi-autonomous driving feature, Reuters reports. The investigation did not find cause for a recall of Tesla vehicles with Autopilot, the report claims.

    U.S. regulator finds no evidence of defects after Tesla death probe
    http://www.reuters.com/article/us-tesla-safety-idUSKBN1532F8

    U.S. auto safety regulators said on Thursday they found no evidence of defects in a Tesla Motors Inc (TSLA.O) car involved in the death of a man whose Model S collided with a truck while he was using its Autopilot system.

    The case has been closely watched as automakers race to automate more driving tasks without exposing themselves to increased liability risks.

  21. Tomi Engdahl says:

    Driver in fatal Tesla Autopilot crash had seven seconds to take action
    It’s not known what he was doing at the time
    http://www.theverge.com/2017/1/19/14326604/tesla-autopilot-crash-driver-seven-seconds-inattentive-nhtsa

    The Tesla driver killed in a crash while the Autopilot system was activated last year would have seen the tractor trailer for at least seven seconds prior to impact, according to the NHTSA investigation of the accident. This should have given the driver enough time to take “some action,” said Bryan Thomas, communications director for NHTSA, though it’s not known “whether that was enough time to avoid or mitigate the crash.”

    The NHTSA investigation report called seven seconds a “period of extended distraction,” and noted that similar crashes generally had a “much shorter time” available for both the system and driver to detect and respond to a pending collision, usually less than three seconds. The report called distractions longer than seven seconds “uncommon, but foreseeable.”
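    To put seven seconds in perspective, the NTSB docket reported the Model S traveling at 74 mph. A quick back-of-the-envelope calculation (the speed is taken from that reporting; treat it as an assumption here) shows how much road that window covers:

    ```python
    # Distance covered during the seven seconds the trailer would have
    # been visible, assuming the 74 mph speed reported in the NTSB docket.
    MPH_TO_MS = 0.44704   # exact conversion: miles/hour to meters/second

    speed_mph = 74
    t_visible = 7.0       # seconds, per NHTSA's finding

    distance_m = speed_mph * MPH_TO_MS * t_visible
    print(f"{distance_m:.0f} m covered in {t_visible:.0f} s at {speed_mph} mph")
    # 232 m covered in 7 s at 74 mph
    ```

    Roughly 230 meters, which is why NHTSA characterized the window as ample time for "some action."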

  22. Tomi Engdahl says:

    Fatal Tesla Crash: That’s Not All, Folks
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1331950&

    The NTSB’s report contains a few surprises, including how and where Tesla’s captured data is stored, routed and saved inside a vehicle.

    The National Transportation Safety Board (NTSB) last week released, after more than a year of suspense, a 500-page document or “docket,” about a fatal highway crash involving a Tesla S and a tractor-semitrailer truck.

    For an automotive industry now fixated on the development of self-driving cars — among which the Tesla S is one of the grand experiments — the NTSB probe is a treasure trove of data. However, it falls short of determining who, or what, is to blame for the death of the driver.

    Driver Assistance System
    Among the documents released last week, the one entitled “Driver Assistance System” is of particular interest to EE Times readers.

    The report delves into details of how Tesla’s driver assistance system — consisting of a Bosch radar system, Mobileye image capture & processing system, an ultrasonic sensor system and gateway electronic control unit — works. It literally reads like a teardown of the Model S driver-assistance system.

    It also contains a few surprises, including how and where Tesla’s captured data is stored, routed and saved inside a vehicle, and how it’s sent to Tesla’s server using a virtual private network connection established via Wi-Fi, or using the vehicle’s 3G cellular data capability.

    The report says that the Tesla S stores non-geo-located data in-vehicle in non-volatile memory, using a removable SD card installed within the Gateway ECU.

    Really, an SD card? What role does the SD card play?

    No Event Data Recorder?
    After reading the report, Mike Demler, a senior analyst at The Linley Group, told EE Times, “I find the description of some of Tesla’s control and data-recording systems to be interesting.” In particular, he said, “The statement in the report that says, ‘This SD card is large enough to typically maintain a complete record of all stored data for the lifetime of the vehicle’ is interesting.” He asked: “How could they determine how much data will be generated over the lifetime of the vehicle?”

    Unfortunately, the NTSB report doesn’t answer such questions.

    But one thing is clear. The NTSB apparently sees this “removable” SD card as a proxy for an event data recorder (EDR). Because current NTSB specs do not require an EDR (it’s completely voluntary), the NTSB appears to conclude that Tesla did enough.

  23. Tomi Engdahl says:

    Johana Bhuiyan / Recode:
    NTSB says causes of fatal Tesla crash in May 2016 included driver’s over-reliance on Autopilot, approves safety recommendations for automakers, DoT, and NHTSA

    A federal agency says an overreliance on Tesla’s Autopilot contributed to a fatal crash
    The National Transportation Safety Board met on Tuesday to determine the cause of the May 2016 fatal Tesla crash.
    https://www.recode.net/2017/9/12/16294510/fatal-tesla-crash-self-driving-elon-musk-autopilot

  24. Tomi Engdahl says:

    Two die in Texas after Tesla ‘on auto-pilot with no one in driving seat’ crashes into tree and starts massive four-hour fire that took 32,000 GALLONS of water to extinguish
    https://www.dailymail.co.uk/news/article-9484391/Two-die-Tesla-auto-pilot-no-one-driving-crashes-tree.html

    The 2019 Tesla Model S slammed into a tree in Carlton Woods at 11.25pm Saturday
    Constable Mark Herman said ‘no one was driving’ at the time of the accident
    Vehicle ‘was moving at high speed when it failed to negotiate a cul-de-sac turn’

    Officials told KPRC 2 that the $80,000 vehicle was moving at high speed when it failed to negotiate a cul-de-sac turn, ran off the road and crashed.

    Firefighters used 32,000 gallons of water over four hours to try to put out the flames because the car’s batteries kept reigniting.

    At one point, deputies had to call Tesla to ask them how to put out a fire in the battery.

    The National Highway Traffic Safety Administration (NHTSA) is now investigating 23 crashes involving Tesla cars believed to be on Autopilot, the New York Times reported.

    Tesla CEO Elon Musk tweeted Saturday to say that vehicles with Autopilot engaged were ‘now approaching a 10 times lower chance of accident’ than the average vehicle.

