The Florida crash that killed the driver of a Tesla Model S while that car was operating under Autopilot software reverberated far and wide in global media.
While human drivers are imperfect, with reaction times limited by millions of years of evolution, they at least provide a focus for blame if they crash their cars.
So the results of a National Transportation Safety Board inquiry into the fatal Tesla crash in May 2016, released on Tuesday, have been awaited with some interest.
DON’T MISS: NHTSA to investigate Tesla Model S Autopilot crash that killed driver
The report concluded that Tesla’s Autopilot system was lacking “system safeguards” to ensure it was used only within appropriate circumstances.
“Tesla allowed the driver to use the system outside of the environment for which it was designed,” said NTSB Chairman Robert Sumwalt, and Autopilot as implemented in that Model S “gave far too much leeway to the driver to divert his attention.”
The driver, later identified as 40-year-old Joshua Brown of Ohio, died at the scene of the Florida crash on May 7, 2016, which occurred after a tractor-trailer rig turned left in front of him.
2016 Tesla Model S
Brown’s Tesla Model S, running under Autopilot control, hit the semi and passed entirely under its trailer, shearing off part of the Tesla roof.
While European tractor-trailer rigs have long been fitted with side underride guards to prevent cars from sliding beneath their trailers, the trucking industry has so far defeated efforts to require them on U.S. trailers.
According to a statement from Tesla shortly after the incident, it occurred because “neither Autopilot nor the driver noticed the white side of the … trailer against a brightly lit sky, so the brake was not applied.”
CHECK OUT: Tesla Autopilot crash: what one Model S owner has to say
During its investigation, a team of five Safety Board investigators traveled to Florida to inspect the highway where the crash took place, even scanning the area with lasers to create a three-dimensional model of the crash site.
The full 76-page NTSB report listed several deficiencies it had identified in Tesla’s implementation of the semi-autonomous driving assist software.
Tesla was not able to ensure that its drivers were paying attention, even when the car was traveling at highway speeds, the report noted, nor did the system effectively monitor whether drivers remained engaged while Autopilot was active.
Tesla Autopilot
This meant that drivers could refrain from steering or watching the road for long periods of time, simply by ignoring the car’s warnings.
Tesla also failed to ensure that Autopilot was used only on limited-access roads or designated highways, the agency said, despite that restriction being recommended in the car’s owner’s manual.
Brown’s family defended Tesla following the release of the report, issuing the following statement:
We heard numerous times that the car killed our son. That is simply not the case. There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.
2016 Tesla Model S
Tesla, too, responded to the report, issuing the following statement attributed to an unnamed spokesperson:
At Tesla, the safety of our customers comes first, and one thing is very clear: Autopilot significantly increases safety, as NHTSA has found that it reduces accident rates by 40 percent.
We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.
When first enabling Autopilot, drivers must accept a warning screen in which they agree to keep their hands on the wheel “at all times” and to “maintain control of your vehicle,” and they are reminded of that agreement at every subsequent use.
Now, if the system senses insufficient driver steering input, an escalating series of audio and visual warnings ends in locking the driver out of Autopilot use for the remainder of that trip.