Writers at the respected Consumer Reports were enthusiastic fans of the Tesla Model S electric car the magazine bought and drove for more than a year, praising it in language rarely seen in its sober reviews.
But as reliability data accumulated, the magazine also reported that the car appeared to have had numerous teething problems—usually rectified by Tesla’s well-respected Service Centers.
Yesterday, however, CR published a bracing editorial piece that castigates Tesla for its Autopilot driver-assistance software, saying it was “too much, too soon,” and recommending that its automatic steering function be disabled.
The crux of its argument is that the name of the system, Autopilot, and some of the marketing around the launch of what Tesla itself still deems beta-test software "create potential for driver confusion."
That led Laura MacCleery, CR’s vice president of consumer policy and mobilization, to recommend that “Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel.”
Referring to a fatal May 7 crash of a Tesla Model S operating on Autopilot, the piece argues:
While the exact cause of the fatal accident is not yet known, the incident has caused safety advocates, including Consumer Reports, to question whether the name Autopilot, as well as the marketing hype of its roll-out, promoted a dangerously premature assumption that the Model S was capable of truly driving on its own.
Tesla’s own press release for the system announced “Your Autopilot has arrived” and promised to relieve drivers “of the most tedious and potentially dangerous aspects of road travel.”
Consumer Reports notes that, at the same time, Tesla’s press release says the driver “is still responsible for, and ultimately in control of, the car.”
Tesla has no intention of disabling the system, however, according to its CEO Elon Musk.
He told The Wall Street Journal in an interview published two days earlier that the company will instead "redouble its efforts to educate consumers on how the system works."
The operation of the Autopilot system and crashes that occurred while it was engaged are now being investigated by both the National Highway Traffic Safety Administration and the National Transportation Safety Board.
Separately, the Securities and Exchange Commission is looking into whether Tesla Motors should have disclosed the May 7 fatal crash as a "material event" that could affect its finances.
In a July 8 letter, the NHTSA gave Tesla two deadlines to respond to a list of 10 questions it has asked about the fatal crash and the general operation of the Autopilot system.
It wants answers to its highest-priority questions by July 29, with responses to the remainder due by August 26.
If Tesla does not do so, it could theoretically face fines of up to $21,000 per day.
Despite widespread coverage of Autopilot as an “autonomous driving” system, it is actually nothing of the kind.
The Autopilot system is broadly similar to bundles of electronic driver-assistance functions offered by other automakers.
They include radar-based adaptive cruise control, lane-keeping assist, blind-spot monitoring, automatic crash braking, and various other functions that use radar, cameras, other sensors, and software to compensate for lapses in driver attention and awareness.