<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: Tesla’s Autopilot being investigated by the government following fatal crash &#124; Ars Technica</title>
	<atom:link href="http://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/</link>
	<description>All about electronics and circuit design</description>
	<lastBuildDate>Thu, 16 Apr 2026 22:29:11 +0000</lastBuildDate>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.9.14</generator>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1708632</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Mon, 19 Apr 2021 15:27:49 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1708632</guid>
		<description><![CDATA[Two die in Texas after Tesla &#039;on auto-pilot with no one in driving seat&#039; crashes into tree and starts massive four-hour fire that took 32,000 GALLONS of water to extinguish
https://www.dailymail.co.uk/news/article-9484391/Two-die-Tesla-auto-pilot-no-one-driving-crashes-tree.html

The 2019 Tesla Model S slammed into a tree in Carlton Woods at 11.25pm Saturday
Constable Mark Herman said &#039;no one was driving&#039; at the time of the accident
Vehicle &#039;was moving at high speed when it failed to negotiate a cul-de-sac turn&#039;

Officials told KPRC 2 that the $80,000 vehicle was moving at high speed when it failed to negotiate a cul-de-sac turn, ran off the road and crashed.

Fire fighters used 32,000 gallons of water over four hours to try to put out the flames because the car&#039;s batteries kept reigniting.

At one point, deputies had to call Tesla to ask them how to put out a fire in the battery.

The National Highway Traffic Safety Administration (NHTSA) is now investigating 23 crashes involving Tesla cars believed to be on Autopilot, the New York Times reported.

Tesla CEO Elon Musk tweeted Saturday to say that vehicles with Autopilot engaged were &#039;now approaching a 10 times lower chance of accident&#039; than the average vehicle.]]></description>
		<content:encoded><![CDATA[<p>Two die in Texas after Tesla &#8216;on auto-pilot with no one in driving seat&#8217; crashes into tree and starts massive four-hour fire that took 32,000 GALLONS of water to extinguish<br />
<a href="https://www.dailymail.co.uk/news/article-9484391/Two-die-Tesla-auto-pilot-no-one-driving-crashes-tree.html" rel="nofollow">https://www.dailymail.co.uk/news/article-9484391/Two-die-Tesla-auto-pilot-no-one-driving-crashes-tree.html</a></p>
<p>The 2019 Tesla Model S slammed into a tree in Carlton Woods at 11.25pm Saturday<br />
Constable Mark Herman said &#8216;no one was driving&#8217; at the time of the accident<br />
Vehicle &#8216;was moving at high speed when it failed to negotiate a cul-de-sac turn&#8217;</p>
<p>Officials told KPRC 2 that the $80,000 vehicle was moving at high speed when it failed to negotiate a cul-de-sac turn, ran off the road and crashed.</p>
<p>Fire fighters used 32,000 gallons of water over four hours to try to put out the flames because the car&#8217;s batteries kept reigniting.</p>
<p>At one point, deputies had to call Tesla to ask them how to put out a fire in the battery.</p>
<p>The National Highway Traffic Safety Administration (NHTSA) is now investigating 23 crashes involving Tesla cars believed to be on Autopilot, the New York Times reported.</p>
<p>Tesla CEO Elon Musk tweeted Saturday to say that vehicles with Autopilot engaged were &#8216;now approaching a 10 times lower chance of accident&#8217; than the average vehicle.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1562684</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Wed, 13 Sep 2017 11:48:34 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1562684</guid>
		<description><![CDATA[Johana Bhuiyan / Recode: 	
NTSB says causes of fatal Tesla crash in May 2016 included driver&#039;s over-reliance on Autopilot, approves safety recommendations for automakers, DoT, and NHTSA


A federal agency says an overreliance on Tesla’s Autopilot contributed to a fatal crash
The National Transportation Safety Board met on Tuesday to determine the cause of May’s fatal Tesla crash. 
https://www.recode.net/2017/9/12/16294510/fatal-tesla-crash-self-driving-elon-musk-autopilot]]></description>
		<content:encoded><![CDATA[<p>Johana Bhuiyan / Recode:<br />
NTSB says causes of fatal Tesla crash in May 2016 included driver&#8217;s over-reliance on Autopilot, approves safety recommendations for automakers, DoT, and NHTSA</p>
<p>A federal agency says an overreliance on Tesla’s Autopilot contributed to a fatal crash<br />
The National Transportation Safety Board met on Tuesday to determine the cause of May’s fatal Tesla crash.<br />
<a href="https://www.recode.net/2017/9/12/16294510/fatal-tesla-crash-self-driving-elon-musk-autopilot" rel="nofollow">https://www.recode.net/2017/9/12/16294510/fatal-tesla-crash-self-driving-elon-musk-autopilot</a></p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1552753</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Wed, 28 Jun 2017 19:50:41 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1552753</guid>
		<description><![CDATA[Fatal Tesla Crash: That’s Not All, Folks
http://www.eetimes.com/author.asp?section_id=36&amp;doc_id=1331950

The NTSB&#039;s report contains a few surprises, including how and where Tesla&#039;s captured data is stored, routed and saved inside a vehicle.

The National Transportation Safety Board (NTSB) last week released, after more than a year of suspense, a 500-page document or “docket,” about a fatal highway crash involving a Tesla S and a tractor-semitrailer truck.

For an automotive industry now fixated on the development of self-driving cars — among which the Tesla S is one of the grand experiments — the NTSB probe is a treasure trove of data. However, it falls short of determining who, or what, is to blame for the death of the driver.  

Driver Assistance System
Among the documents released last week, the one entitled “Driver Assistance System” poses particular interest for EE Times readers.

The report delves into details of how Tesla’s driver assistance system — consisting of a Bosch radar system, Mobileye image capture &amp; processing system, an ultrasonic sensor system and gateway electronic control unit — works. It literally reads like a teardown of the Model S driver-assistance system.

It also contains a few surprises, including how and where Tesla’s captured data is stored, routed and saved inside a vehicle, and how it’s sent to Tesla’s server using a virtual private network connection established via Wi-Fi, or using the vehicle’s 3G cellular data capability.

The report says that Tesla S stores non-geo-located data in-vehicle in non-volatile memory using a removable SD card installed within the Gateway ECU.

Really, an SD card? What role does the SD card play?

No Event Data Recorder?
After reading the report, Mike Demler, a senior analyst at The Linley Group, told EE Times, “I find the description of some of Tesla’s control and data-recording systems to be interesting.” In particular, he said, “The statement in the report that says, ‘This SD card is large enough to typically maintain a complete record of all stored data for the lifetime of the vehicle’ is interesting.” He asked: “How could they determine how much data will be generated over the lifetime of the vehicle?”

Unfortunately, the NTSB report doesn’t answer such questions.

But one thing is clear. The NTSB apparently sees this “removable” SD card as a proxy for an event data recorder (EDR). Because current NTSB specs do not require an EDR (it&#039;s completely voluntary), the NTSB appears to conclude that Tesla did enough.]]></description>
		<content:encoded><![CDATA[<p>Fatal Tesla Crash: That’s Not All, Folks<br />
<a href="http://www.eetimes.com/author.asp?section_id=36&#038;doc_id=1331950" rel="nofollow">http://www.eetimes.com/author.asp?section_id=36&#038;doc_id=1331950</a></p>
<p>The NTSB&#8217;s report contains a few surprises, including how and where Tesla&#8217;s captured data is stored, routed and saved inside a vehicle.</p>
<p>The National Transportation Safety Board (NTSB) last week released, after more than a year of suspense, a 500-page document or “docket,” about a fatal highway crash involving a Tesla S and a tractor-semitrailer truck.</p>
<p>For an automotive industry now fixated on the development of self-driving cars — among which the Tesla S is one of the grand experiments — the NTSB probe is a treasure trove of data. However, it falls short of determining who, or what, is to blame for the death of the driver.  </p>
<p>Driver Assistance System<br />
Among the documents released last week, the one entitled “Driver Assistance System” poses particular interest for EE Times readers.</p>
<p>The report delves into details of how Tesla’s driver assistance system — consisting of a Bosch radar system, Mobileye image capture &amp; processing system, an ultrasonic sensor system and gateway electronic control unit — works. It literally reads like a teardown of the Model S driver-assistance system.</p>
<p>It also contains a few surprises, including how and where Tesla’s captured data is stored, routed and saved inside a vehicle, and how it’s sent to Tesla’s server using a virtual private network connection established via Wi-Fi, or using the vehicle’s 3G cellular data capability.</p>
<p>The report says that Tesla S stores non-geo-located data in-vehicle in non-volatile memory using a removable SD card installed within the Gateway ECU.</p>
<p>Really, an SD card? What role does the SD card play?</p>
<p>No Event Data Recorder?<br />
After reading the report, Mike Demler, a senior analyst at The Linley Group, told EE Times, “I find the description of some of Tesla’s control and data-recording systems to be interesting.” In particular, he said, “The statement in the report that says, ‘This SD card is large enough to typically maintain a complete record of all stored data for the lifetime of the vehicle’ is interesting.” He asked: “How could they determine how much data will be generated over the lifetime of the vehicle?”</p>
<p>Unfortunately, the NTSB report doesn’t answer such questions.</p>
<p>But one thing is clear. The NTSB apparently sees this “removable” SD card as a proxy for an event data recorder (EDR). Because current NTSB specs do not require an EDR (it&#8217;s completely voluntary), the NTSB appears to conclude that Tesla did enough.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1534012</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Fri, 20 Jan 2017 15:25:59 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1534012</guid>
		<description><![CDATA[Driver in fatal Tesla Autopilot crash had seven seconds to take action
It’s not known what he was doing at the time
http://www.theverge.com/2017/1/19/14326604/tesla-autopilot-crash-driver-seven-seconds-inattentive-nhtsa

The Tesla driver killed in a crash while the Autopilot system was activated last year would have seen the tractor trailer for at least seven seconds prior to impact, according to the NHTSA investigation of the accident. This should have given the driver enough time to take “some action,” said Bryan Thomas, communications director for NHTSA, though it’s not known “whether that was enough time to avoid or mitigate the crash.”

The NHTSA investigation report called seven seconds a “period of extended distraction,” and noted that similar crashes generally had a “much shorter time” available for both the system and driver to detect and respond to a pending collision, usually less than three seconds. The report called distractions longer than seven seconds to be “uncommon, but foreseeable.”]]></description>
		<content:encoded><![CDATA[<p>Driver in fatal Tesla Autopilot crash had seven seconds to take action<br />
It’s not known what he was doing at the time<br />
<a href="http://www.theverge.com/2017/1/19/14326604/tesla-autopilot-crash-driver-seven-seconds-inattentive-nhtsa" rel="nofollow">http://www.theverge.com/2017/1/19/14326604/tesla-autopilot-crash-driver-seven-seconds-inattentive-nhtsa</a></p>
<p>The Tesla driver killed in a crash while the Autopilot system was activated last year would have seen the tractor trailer for at least seven seconds prior to impact, according to the NHTSA investigation of the accident. This should have given the driver enough time to take “some action,” said Bryan Thomas, communications director for NHTSA, though it’s not known “whether that was enough time to avoid or mitigate the crash.”</p>
<p>The NHTSA investigation report called seven seconds a “period of extended distraction,” and noted that similar crashes generally had a “much shorter time” available for both the system and driver to detect and respond to a pending collision, usually less than three seconds. The report called distractions longer than seven seconds to be “uncommon, but foreseeable.”</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1533942</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Fri, 20 Jan 2017 08:35:21 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1533942</guid>
		<description><![CDATA[Darrell Etherington / TechCrunch: 	
NHTSA closes probe into June 2016 Tesla crash, clearing Autopilot of fault and praising its safety features, including a ~40% drop in crashes since introduction  —  The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation … 


NHTSA’s full final investigation into Tesla’s Autopilot shows 40% crash rate reduction
https://techcrunch.com/2017/01/19/nhtsas-full-final-investigation-into-teslas-autopilot-shows-40-crash-rate-reduction/

The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation into last year’s fatal crash involving a driver’s use of Tesla’s semi-autonomous Autopilot feature. The report clears Tesla’s Autopilot system of any fault in the incident, and in fact at multiple points within the report praises its design in terms of safety, and highlights its impact on lowering the number of traffic incidents involving Tesla vehicles overall.

NHTSA notes that crash rates involving Tesla cars have dropped by almost 40 percent since the wide introduction of Autopilot.

It’s essentially as good a result as Tesla could have hoped for from the U.S. traffic safety agency’s investigation, which took place over the last six months. Reuters reported earlier on Thursday that the investigation would not result in a recall of Tesla vehicles, but the full findings show that, in fact, the federal regulatory body found plenty to praise while conducting its inquiry.

The investigation does conclude with a minor admonition that Tesla could perhaps be more specific about its system limitations in its driver-assist features

U.S. traffic safety agency to close Tesla Autopilot investigation without recall request
https://techcrunch.com/2017/01/19/u-s-traffic-safety-agency-to-close-tesla-autopilot-investigation-without-recall-request/

The U.S. National Highway Traffic Safety Administration (NHTSA) will close the investigation it began six months ago into a driver death that occurred while using Tesla’s Autopilot highway semi-autonomous driving feature, Reuters reports. The investigation did not find cause for a recall of Tesla vehicles with Autopilot, the report claims.

U.S. regulator finds no evidence of defects after Tesla death probe
http://www.reuters.com/article/us-tesla-safety-idUSKBN1532F8

U.S. auto safety regulators said on Thursday they found no evidence of defects in a Tesla Motors Inc (TSLA.O) car involved in the death of a man whose Model S collided with a truck while he was using its Autopilot system.

The case has been closely watched as automakers race to automate more driving tasks without exposing themselves to increased liability risks.]]></description>
		<content:encoded><![CDATA[<p>Darrell Etherington / TechCrunch:<br />
NHTSA closes probe into June 2016 Tesla crash, clearing Autopilot of fault and praising its safety features, including a ~40% drop in crashes since introduction  —  The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation … </p>
<p>NHTSA’s full final investigation into Tesla’s Autopilot shows 40% crash rate reduction<br />
<a href="https://techcrunch.com/2017/01/19/nhtsas-full-final-investigation-into-teslas-autopilot-shows-40-crash-rate-reduction/" rel="nofollow">https://techcrunch.com/2017/01/19/nhtsas-full-final-investigation-into-teslas-autopilot-shows-40-crash-rate-reduction/</a></p>
<p>The U.S. National Highway Traffic Safety Administration has released its full findings following the investigation into last year’s fatal crash involving a driver’s use of Tesla’s semi-autonomous Autopilot feature. The report clears Tesla’s Autopilot system of any fault in the incident, and in fact at multiple points within the report praises its design in terms of safety, and highlights its impact on lowering the number of traffic incidents involving Tesla vehicles overall.</p>
<p>NHTSA notes that crash rates involving Tesla cars have dropped by almost 40 percent since the wide introduction of Autopilot.</p>
<p>It’s essentially as good a result as Tesla could have hoped for from the U.S. traffic safety agency’s investigation, which took place over the last six months. Reuters reported earlier on Thursday that the investigation would not result in a recall of Tesla vehicles, but the full findings show that, in fact, the federal regulatory body found plenty to praise while conducting its inquiry.</p>
<p>The investigation does conclude with a minor admonition that Tesla could perhaps be more specific about its system limitations in its driver-assist features</p>
<p>U.S. traffic safety agency to close Tesla Autopilot investigation without recall request<br />
<a href="https://techcrunch.com/2017/01/19/u-s-traffic-safety-agency-to-close-tesla-autopilot-investigation-without-recall-request/" rel="nofollow">https://techcrunch.com/2017/01/19/u-s-traffic-safety-agency-to-close-tesla-autopilot-investigation-without-recall-request/</a></p>
<p>The U.S. National Highway Traffic Safety Administration (NHTSA) will close the investigation it began six months ago into a driver death that occurred while using Tesla’s Autopilot highway semi-autonomous driving feature, Reuters reports. The investigation did not find cause for a recall of Tesla vehicles with Autopilot, the report claims.</p>
<p>U.S. regulator finds no evidence of defects after Tesla death probe<br />
<a href="http://www.reuters.com/article/us-tesla-safety-idUSKBN1532F8" rel="nofollow">http://www.reuters.com/article/us-tesla-safety-idUSKBN1532F8</a></p>
<p>U.S. auto safety regulators said on Thursday they found no evidence of defects in a Tesla Motors Inc (TSLA.O) car involved in the death of a man whose Model S collided with a truck while he was using its Autopilot system.</p>
<p>The case has been closely watched as automakers race to automate more driving tasks without exposing themselves to increased liability risks.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1523590</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Mon, 14 Nov 2016 11:42:19 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1523590</guid>
		<description><![CDATA[Another Tesla Crash, What It Teaches Us
Clash between Tesla’s prudence and human nature
http://www.eetimes.com/author.asp?section_id=36&amp;doc_id=1330813

A Tesla crashed on a test drive while AutoPilot was engaged. Nobody got hurt. But the minor incident gives us plenty to think about.

Earlier this week, I came across a report about a Tesla AutoPilot crash. It appeared on Tesla Motors Club’s site, posted by a Tesla fan planning to purchase a car.

Thankfully, nobody got hurt. This post got no traction in the media. 

This could have been easily filed under the rubric, “minor accidents,” the sort of news we all ignore.

However, this accident, and more so the subsequent discussions in the Tesla Motors Club forum, intrigued me for two reasons.

First, it’s a reminder that it ain’t easy to curb drivers’ appetite to “test the limits” of so-called AutoPilot, despite the carmaker’s stern warnings.

The key case in point is Tesla’s first fatal accident, which took place in Florida last May. After splurging on such a well-regarded, expensive automobile, who wouldn’t want to test its limits in driving and brag about it? The inevitable result is a clash between Tesla’s prudence and human nature.

Second, AutoPilot is still in the making. New technologies keep emerging, allowing the automaker to continue to improve it via software updates.

I was amazed to see so many posts by other Tesla users — all apparently very knowledgeable. They discussed the limitations of radar, the problems AutoPilot has handling hills, and the differences between software versions 7.1 and 8.0.

If this isn’t “inside baseball,” what is? I’d hate to think that an average driver needs to do this much homework to really understand why AutoPilot just doesn’t work in certain situations and why it isn’t autonomous.

Tesla said in a statement:

“After speaking with our customer and the Tesla employee, we determined that this accident was the result of a miscommunication between the individuals inside the car.”

The accident happened in the Los Angeles area. The vehicle that crashed on a test drive was running software version 8.0.

As expressed in the statement above, Tesla stressed that this accident was not the result of the driver intentionally testing the limits of AutoPilot, but happened because of a miscommunication inside the car.

To be clear, the radar was already added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but it was only meant to be a supplementary sensor to the primary camera and image processing system, the company explained.

At the time of the version 8.0 release, Tesla made it clear that it had changed its mind. &quot;We believe radar can be used as a primary control sensor without requiring the camera to confirm visual image recognition.&quot;
		<content:encoded><![CDATA[<p>Another Tesla Crash, What It Teaches Us<br />
Clash between Tesla’s prudence and human nature<br />
<a href="http://www.eetimes.com/author.asp?section_id=36&#038;doc_id=1330813" rel="nofollow">http://www.eetimes.com/author.asp?section_id=36&#038;doc_id=1330813</a></p>
<p>A Tesla crashed on a test drive while AutoPilot was engaged. Nobody got hurt. But the minor incident gives us plenty to think about.</p>
<p>Earlier this week, I came across a report about a Tesla AutoPilot crash. It appeared on Tesla Motors Club’s site, posted by a Tesla fan planning to purchase a car.</p>
<p>Thankfully, nobody got hurt. This post got no traction in the media. </p>
<p>This could have been easily filed under the rubric, “minor accidents,” the sort of news we all ignore.</p>
<p>However, this accident, and more so the subsequent discussions in the Tesla Motors Club forum, intrigued me for two reasons.</p>
<p>First, it’s a reminder that it ain’t easy to curb drivers’ appetite to “test the limits” of so-called AutoPilot, despite the carmaker’s stern warnings.</p>
<p>The key case in point is Tesla’s first fatal accident, which took place in Florida last May. After splurging on such a well-regarded, expensive automobile, who wouldn’t want to test its limits in driving and brag about it? The inevitable result is a clash between Tesla’s prudence and human nature.</p>
<p>Second, AutoPilot is still in the making. New technologies keep emerging, allowing the automaker to continue to improve it via software updates.</p>
<p>I was amazed to see so many posts by other Tesla users — all apparently very knowledgeable. They discussed the limitations of radar, the problems AutoPilot has handling hills, and the differences between software versions 7.1 and 8.0.</p>
<p>If this isn’t “inside baseball,” what is? I’d hate to think that an average driver needs to do this much homework to really understand why AutoPilot just doesn’t work in certain situations and why it isn’t autonomous.</p>
<p>Tesla said in a statement:</p>
<p>“After speaking with our customer and the Tesla employee, we determined that this accident was the result of a miscommunication between the individuals inside the car.”</p>
<p>The accident happened in the Los Angeles area. The vehicle that crashed on a test drive was running software version 8.0.</p>
<p>As expressed in the statement above, Tesla stressed that this accident was not the result of the driver intentionally testing the limits of AutoPilot, but happened because of a miscommunication inside the car.</p>
<p>To be clear, the radar was already added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but it was only meant to be a supplementary sensor to the primary camera and image processing system, the company explained.</p>
<p>At the time of the version 8.0 release, Tesla made it clear that it had changed its mind. &#8220;We believe radar can be used as a primary control sensor without requiring the camera to confirm visual image recognition.&#8221;</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1523548</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Mon, 14 Nov 2016 09:38:17 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1523548</guid>
		<description><![CDATA[Adding Some Statistical Perspective To Tesla Autopilot Safety Claims
http://www.forbes.com/sites/samabuelsamid/2016/07/05/adding-some-statistical-perspective-to-tesla-autopilot-safety-claims/#70f1f2a2f8f6

“Figures never lie, but liars always figure.”

“Tell me which side of the argument you are on and I will give you the statistics to prove you are right.”

Variations of those quotes have been around for ages and the origins are debatable. However, there is a great deal of truth to both idioms. Whether discussing unemployment numbers, economic growth, the latest poll numbers or safety claims, the first thing you must always do before accepting the data is to understand the question that was asked to get that data. Subtle changes in the question can have a huge impact on the results. 

Based on Tesla’s statements we can assume the 130 million miles is the total of all miles traveled in Autopilot mode by all Model S and X vehicles globally since the update was released in October 2015. If we assume that, we must also assume there have been no fatal accidents in other parts of the world that we don’t know about yet. Given the amount of telemetry that Tesla collects from their vehicles, let’s give them the benefit of the doubt on this one. So one fatality in 130 million miles stands for the moment.

How about the one fatality every 94 million miles in the United States? The best source for such data is the Department of Transportation’s Fatality Analysis Reporting System (FARS) which compiles accident data from state and local agencies nationwide.

In 2014, Americans traveled 3.026 trillion miles on the road and a total of 32,675 people died along the way. That actually works out to one death every 92.6 million miles.

That last part is important because the FARS data includes all traffic deaths, those in cars and trucks as well as those riding motorized or pedal cycles and pedestrians struck by a vehicle. As far as we know, no Autopilot equipped vehicle has struck and killed a pedestrian or cyclist. So Tesla’s comparison is actually looking at two quite different data sets. In 2014, 4,586 motorcyclists and 5,813 pedestrians/cyclists were killed. 

That leaves 22,276 vehicle occupants (drivers and passengers) who died. This latter set is probably the one we should be comparing to Tesla’s one death in 130 million miles.

Based on that statistic, humans are actually better drivers than computers. However, even that isn’t necessarily a valid comparison.]]></description>
		<content:encoded><![CDATA[<p>Adding Some Statistical Perspective To Tesla Autopilot Safety Claims<br />
<a href="http://www.forbes.com/sites/samabuelsamid/2016/07/05/adding-some-statistical-perspective-to-tesla-autopilot-safety-claims/#70f1f2a2f8f6" rel="nofollow">http://www.forbes.com/sites/samabuelsamid/2016/07/05/adding-some-statistical-perspective-to-tesla-autopilot-safety-claims/#70f1f2a2f8f6</a></p>
<p>“Figures never lie, but liars always figure.”</p>
<p>“Tell me which side of the argument you are on and I will give you the statistics to prove you are right.”</p>
<p>Variations of those quotes have been around for ages and the origins are debatable. However, there is a great deal of truth to both idioms. Whether discussing unemployment numbers, economic growth, the latest poll numbers or safety claims, the first thing you must always do before accepting the data is to understand the question that was asked to get that data. Subtle changes in the question can have a huge impact on the results. </p>
<p>Based on Tesla’s statements we can assume the 130 million miles is the total of all miles traveled in Autopilot mode by all Model S and X vehicles globally since the update was released in October 2015. If we assume that, we must also assume there have been no fatal accidents in other parts of the world that we don’t know about yet. Given the amount of telemetry that Tesla collects from their vehicles, let’s give them the benefit of the doubt on this one. So one fatality in 130 million miles stands for the moment.</p>
<p>How about the one fatality every 94 million miles in the United States? The best source for such data is the Department of Transportation’s Fatality Analysis Reporting System (FARS) which compiles accident data from state and local agencies nationwide.</p>
<p>In 2014, Americans traveled 3.026 trillion miles on the road and a total of 32,675 people died along the way. That actually works out to one death every 92.6 million miles.</p>
<p>That last part is important because the FARS data includes all traffic deaths, those in cars and trucks as well as those riding motorized or pedal cycles and pedestrians struck by a vehicle. As far as we know, no Autopilot equipped vehicle has struck and killed a pedestrian or cyclist. So Tesla’s comparison is actually looking at two quite different data sets. In 2014, 4,586 motorcyclists and 5,813 pedestrians/cyclists were killed. </p>
<p>That leaves 22,276 vehicle occupants (drivers and passengers) who died. This latter set is probably the one we should be comparing to Tesla’s one death in 130 million miles.</p>
<p>Based on that statistic, humans are actually better drivers than computers. However, even that isn’t necessarily a valid comparison.</p>
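<p>The occupant-only comparison above can be sanity-checked with a quick calculation. This is only a sketch using the figures quoted in this article; the 130-million-mile denominator is Tesla’s own claim, not an independently verified number:</p>

```python
# Recompute the fatality rates from the 2014 FARS figures quoted above.
total_miles = 3.026e12   # US vehicle miles traveled in 2014
total_deaths = 32_675    # all 2014 traffic deaths (FARS)
motorcyclists = 4_586    # motorcyclist deaths
peds_cyclists = 5_813    # pedestrian/cyclist deaths

# Overall rate, the one Tesla compared itself against.
overall_rate = total_miles / total_deaths
print(round(overall_rate / 1e6, 1))    # 92.6 (million miles per death)

# Occupant-only rate: exclude deaths Autopilot-style systems are not
# plausibly comparable to (motorcyclists, pedestrians, cyclists).
occupants = total_deaths - motorcyclists - peds_cyclists
print(occupants)                       # 22276

occupant_rate = total_miles / occupants
print(round(occupant_rate / 1e6, 1))   # 135.8 (million miles per death)
```

<p>By that occupant-only yardstick the human fleet averages roughly 135.8 million miles per death, which is slightly better than the one death per 130 million miles implied by Tesla’s Autopilot figure.</p>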
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1523543</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Mon, 14 Nov 2016 09:33:23 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1523543</guid>
		<description><![CDATA[Are Tesla Crashes Balanced Out By The Lives That They Save? 
https://tech.slashdot.org/story/16/11/13/2245224/are-tesla-crashes-balanced-out-by-the-lives-that-they-save

Friday EE Times shared the story of a Tesla crash that occurred during a test drive. &quot;The salesperson suggested that my friend not brake, letting the system do the work. It didn&#039;t...&quot; One Oregon news site even argues autopiloted Teslas may actually have a higher crash rate.


Tesla&#039;s own numbers show Autopilot has higher crash rate than human drivers
http://katu.com/news/auto-matters/teslas-own-numbers-show-autopilot-has-higher-crash-rate-than-human-drivers

A couple of weeks ago, I wrote about Tesla’s claim that its Autopilot driver-assistance software is safer than a human driver.

After a fatal Autopilot crash last May, the company said the death was the first in 130 million miles of Autopilot driving—and noted that, “among all vehicles, in the U.S., there is a fatality every 94 million miles.”

The clear implication: Autopiloted Teslas are safer than human-piloted cars, and lives would be saved if every car had Autopilot.

But Tesla’s statistics are questionable at best. The small sample size—one crash—makes any calculation of Autopilot fatality rate almost meaningless.

Furthermore, Tesla compared its Autopilot crash rate to the overall U.S. traffic fatality rate—which includes bicyclists, pedestrians, buses and 18-wheelers. 

A better yardstick for comparison is the fatality rate for U.S. drivers of cars and light trucks compiled by the Insurance Institute for Highway Safety.

By that yardstick, the Tesla Autopilot driver fatality rate is almost four times higher than that of typical passenger vehicles.]]></description>
		<content:encoded><![CDATA[<p>Are Tesla Crashes Balanced Out By The Lives That They Save?<br />
<a href="https://tech.slashdot.org/story/16/11/13/2245224/are-tesla-crashes-balanced-out-by-the-lives-that-they-save" rel="nofollow">https://tech.slashdot.org/story/16/11/13/2245224/are-tesla-crashes-balanced-out-by-the-lives-that-they-save</a></p>
<p>Friday EE Times shared the story of a Tesla crash that occurred during a test drive. &#8220;The salesperson suggested that my friend not brake, letting the system do the work. It didn&#8217;t&#8230;&#8221; One Oregon news site even argues autopiloted Teslas may actually have a higher crash rate.</p>
<p>Tesla&#8217;s own numbers show Autopilot has higher crash rate than human drivers<br />
<a href="http://katu.com/news/auto-matters/teslas-own-numbers-show-autopilot-has-higher-crash-rate-than-human-drivers" rel="nofollow">http://katu.com/news/auto-matters/teslas-own-numbers-show-autopilot-has-higher-crash-rate-than-human-drivers</a></p>
<p>A couple of weeks ago, I wrote about Tesla’s claim that its Autopilot driver-assistance software is safer than a human driver.</p>
<p>After a fatal Autopilot crash last May, the company said the death was the first in 130 million miles of Autopilot driving—and noted that, “among all vehicles, in the U.S., there is a fatality every 94 million miles.”</p>
<p>The clear implication: Autopiloted Teslas are safer than human-piloted cars, and lives would be saved if every car had Autopilot.</p>
<p>But Tesla’s statistics are questionable at best. The small sample size—one crash—makes any calculation of Autopilot fatality rate almost meaningless.</p>
<p>Furthermore, Tesla compared its Autopilot crash rate to the overall U.S. traffic fatality rate—which includes bicyclists, pedestrians, buses and 18-wheelers. </p>
<p>A better yardstick for comparison is the fatality rate for U.S. drivers of cars and light trucks compiled by the Insurance Institute for Highway Safety.</p>
<p>By that yardstick, the Tesla Autopilot driver fatality rate is almost four times higher than that of typical passenger vehicles.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1514055</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Tue, 20 Sep 2016 08:19:17 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1514055</guid>
		<description><![CDATA[Questions About Tesla Autopilot Safety Hit Stone Wall
 http://www.eetimes.com/document.asp?doc_id=1330477

A fatal accident in China thrust Tesla’s transparency into sharp focus this week, posing fresh and daunting questions as to how safe Tesla’s Autopilot really is.

New reports surfaced this week in China about a crash that killed the 23-year-old driver of a Tesla Model S in Handan, a city about 300 miles south of Beijing.

This took place on January 20, 2016 — four months before Joshua Brown died in Florida, in a Tesla Model S on Autopilot.


The Chinese government news channel CCTV reported that the Chinese driver, Gao Yaning, borrowed his father’s Tesla Model S. He was driving on the highway when his car hit a street-sweeper truck on the side of the road at highway speed.

CCTV showed video footage of the accident captured by the Tesla Model S driver’s dash camera.

The police found no sign that the vehicle applied the brakes before hitting the truck. Local media reported that the Autopilot was engaged at the time of the accident.


That crash, according to the Chinese reports, was under investigation for the first half of this year, the result of which is a lawsuit filed in July by the victim’s family against Tesla China.

Tesla’s credibility and transparency in question
If reports are true, China’s Tesla fatality in January presents a problem for Tesla.]]></description>
		<content:encoded><![CDATA[<p>Questions About Tesla Autopilot Safety Hit Stone Wall<br />
 <a href="http://www.eetimes.com/document.asp?doc_id=1330477" rel="nofollow">http://www.eetimes.com/document.asp?doc_id=1330477</a></p>
<p>A fatal accident in China thrust Tesla’s transparency into sharp focus this week, posing fresh and daunting questions as to how safe Tesla’s Autopilot really is.</p>
<p>New reports surfaced this week in China about a crash that killed the 23-year-old driver of a Tesla Model S in Handan, a city about 300 miles south of Beijing.</p>
<p>This took place on January 20, 2016 — four months before Joshua Brown died in Florida, in a Tesla Model S on Autopilot.</p>
<p>The Chinese government news channel CCTV reported that the Chinese driver, Gao Yaning, borrowed his father’s Tesla Model S. He was driving on the highway when his car hit a street-sweeper truck on the side of the road at highway speed.</p>
<p>CCTV showed video footage of the accident captured by the Tesla Model S driver’s dash camera.</p>
<p>The police found no sign that the vehicle applied the brakes before hitting the truck. Local media reported that the Autopilot was engaged at the time of the accident.</p>
<p>That crash, according to the Chinese reports, was under investigation for the first half of this year, the result of which is a lawsuit filed in July by the victim’s family against Tesla China.</p>
<p>Tesla’s credibility and transparency in question<br />
If reports are true, China’s Tesla fatality in January presents a problem for Tesla.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Tomi Engdahl</title>
		<link>https://www.epanorama.net/blog/2016/07/01/teslas-autopilot-being-investigated-by-the-government-following-fatal-crash-ars-technica/comment-page-1/#comment-1512652</link>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
		<pubDate>Mon, 12 Sep 2016 14:34:21 +0000</pubDate>
		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=43425#comment-1512652</guid>
		<description><![CDATA[Elon Musk Says Tesla New Autopilot Features Would Have Prevented Recent Death 
https://tech.slashdot.org/story/16/09/12/1347217/elon-musk-says-tesla-new-autopilot-features-would-have-prevented-recent-death

Tesla Motors Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system Autopilot with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May. Musk said the update, which will be available within a week or two through an &quot;over-the-air&quot; software update, would rely foremost on radar to give Tesla&#039;s electric luxury cars a better sense of what is around them and when to brake. 

Elon Musk Says Tesla&#039;s New Autopilot Likely Would Have Prevented Death
http://fortune.com/2016/09/12/elon-musk-tesla-new-autopilot-death/

The new Autopilot will be available within a week or two through an “over-the-air” software update.

Tesla Motors Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system Autopilot with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May.

New restrictions of Autopilot 8.0 are a nod to widespread concerns that the system lulled users into a false sense of security through its “hands-off” driving capability. The updated system now will temporarily prevent drivers from using the system if they do not respond to audible warnings to take back control of the car.

“We’re making much more effective use of radar,” Musk told journalists on a phone call. “It will be a dramatic improvement in the safety of the system done entirely through software.”

The National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla’s Autopilot system since June because of the fatal accident. The agency had been briefed on the changes by Tesla and would review them, spokesman Bryan Thomas said.

Musk said it was “very likely” the improved Autopilot would have prevented the death of Brown, whose car sped into the trailer of a truck crossing a highway, but he cautioned that the update “doesn’t mean perfect safety.”

“Probability of Safety”

“Perfect safety is really an impossible goal,” Musk said. “It’s about improving the probability of safety. There won’t ever be zero fatalities, there won’t ever be zero injuries.”

One of the main challenges of using cameras and radars for a braking system is how to prevent so-called false positives.

“Anything metallic or dense, the radar system we’re confident will be able to detect that and initiate a braking event,” he said.

The revised system will sound warnings if drivers take their hands off the wheel for more than a minute at speeds above 45 miles per hour (72 kph) when there is no vehicle ahead, Musk said.

If the driver ignores three audible warnings in an hour, the system will temporarily shut off until it is parked.]]></description>
		<content:encoded><![CDATA[<p>Elon Musk Says Tesla New Autopilot Features Would Have Prevented Recent Death<br />
<a href="https://tech.slashdot.org/story/16/09/12/1347217/elon-musk-says-tesla-new-autopilot-features-would-have-prevented-recent-death" rel="nofollow">https://tech.slashdot.org/story/16/09/12/1347217/elon-musk-says-tesla-new-autopilot-features-would-have-prevented-recent-death</a></p>
<p>Tesla Motors Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system Autopilot with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May. Musk said the update, which will be available within a week or two through an &#8220;over-the-air&#8221; software update, would rely foremost on radar to give Tesla&#8217;s electric luxury cars a better sense of what is around them and when to brake. </p>
<p>Elon Musk Says Tesla&#8217;s New Autopilot Likely Would Have Prevented Death<br />
<a href="http://fortune.com/2016/09/12/elon-musk-tesla-new-autopilot-death/" rel="nofollow">http://fortune.com/2016/09/12/elon-musk-tesla-new-autopilot-death/</a></p>
<p>The new Autopilot will be available within a week or two through an “over-the-air” software update.</p>
<p>Tesla Motors Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system Autopilot with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May.</p>
<p>New restrictions of Autopilot 8.0 are a nod to widespread concerns that the system lulled users into a false sense of security through its “hands-off” driving capability. The updated system now will temporarily prevent drivers from using the system if they do not respond to audible warnings to take back control of the car.</p>
<p>“We’re making much more effective use of radar,” Musk told journalists on a phone call. “It will be a dramatic improvement in the safety of the system done entirely through software.”</p>
<p>The National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla’s Autopilot system since June because of the fatal accident. The agency had been briefed on the changes by Tesla and would review them, spokesman Bryan Thomas said.</p>
<p>Musk said it was “very likely” the improved Autopilot would have prevented the death of Brown, whose car sped into the trailer of a truck crossing a highway, but he cautioned that the update “doesn’t mean perfect safety.”</p>
<p>“Probability of Safety”</p>
<p>“Perfect safety is really an impossible goal,” Musk said. “It’s about improving the probability of safety. There won’t ever be zero fatalities, there won’t ever be zero injuries.”</p>
<p>One of the main challenges of using cameras and radars for a braking system is how to prevent so-called false positives.</p>
<p>“Anything metallic or dense, the radar system we’re confident will be able to detect that and initiate a braking event,” he said.</p>
<p>The revised system will sound warnings if drivers take their hands off the wheel for more than a minute at speeds above 45 miles per hour (72 kph) when there is no vehicle ahead, Musk said.</p>
<p>If the driver ignores three audible warnings in an hour, the system will temporarily shut off until it is parked.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
