Forget Autopilot mode -- Tesla is in fight-back mode.
The company is pushing back against accusations that it acted unethically in the case of the fatal May 7 accident involving a Model S sedan operating in Autopilot mode. Now it is dealing with the fallout from another accident, this time involving a Model X.
But investigators, and Tesla, still don't know exactly what caused that second accident -- was it Autopilot? Driver error? -- and it doesn't appear that answers are forthcoming.
In IT Blogwatch, we buckle up and go along for the ride.
Another crash already? What happened? Yoni Heisler gives us all the gritty details:
The National Highway Traffic Safety Administration (NHTSA)...will be investigating yet another Tesla accident...a July 1 incident involving a Tesla Model X that veered into a guard rail before ultimately crashing into a concrete median. When the dust settled, the driver was okay but the Model X...was completely turned upside down.
Upside down -- yikes. But that doesn't mean Autopilot was engaged, does it? Turns out, we're not yet sure. Greg Gardner tells us why:
Albert Scaglione and his artist son-in-law, Tim Yanke, both survived [the] crash...The Free Press was not able to reach Scaglione...or Yanke, but Dale Vukovich of the Pennsylvania State Police, who responded to the crash, said Scaglione told him that he had activated the Autopilot feature.
So we're hearing second-hand that Autopilot was involved. Surely Tesla knows what happened, right? As Phil LeBeau explains, that is not exactly the case:
As questions about the accident continued...Tesla's statements became less certain. Initially, spokespeople...said they had "no reason to believe that Autopilot had anything to do with this accident." Later, the company tweaked its statement to say it had "no data at this point to indicate that Autopilot was engaged or not engaged."
...
The company said it's tried to call Scaglione three times and has been unable to reach him.
Seems like we will be trying to figure this one out for a bit. In the meantime, what was all that about the ethics accusations that came to light last week over the fatal Model S crash? Lucas Mearian breaks it down for us:
A magazine article...called out Tesla and the NHTSA for not disclosing the May 7 accident for nearly eight weeks.
...
Musk responded to the...article, calling it "fundamentally incorrect" and a mischaracterization of Tesla's latest filing with the Securities and Exchange Commission. Musk defended...Autopilot technology, stating...the...article made two false assumptions: that the accident was caused by an Autopilot failure, and that "a single accident involving Autopilot" is material to Tesla's investors...Musk [added] there was no evidence to suggest that Autopilot was not operating as designed.
So what does this mean for the future of self-driving cars? Some people are still enthusiastic about the technology but, as LadyMae makes clear, others, not so much:
I don't even like using cruise control so there's no way...I'd use a self-driving car.