Elon Musk's Hubris Runs on Autopilot. Will It Hurt Tesla?

Unlike GM, Tesla doesn’t have other car models to fall back on. “If Tesla loses Tesla, that’s it,” says tech analyst Rob Enderle.

Last Sunday, as federal investigators were probing accidents involving Tesla’s Autopilot self-driving feature, Tesla founder Elon Musk had already moved on, tweeting about the imminent release of his company’s “Top Secret Master Plan, Part 2.”

Then on Wednesday, as consumer safety advocates were calling out Tesla over Autopilot’s safety risks and urging the National Highway Traffic Safety Administration to take a more rigorous look at self-driving features, Musk was tweeting a pro-Autopilot love letter from Road & Track titled "Leave Tesla Alone."

Next, on Thursday, with a federal investigation ongoing, Musk tweeted that the Autopilot feature was not to blame in a July 1 Pennsylvania crash in which a car hit a guardrail, ricocheted across traffic into a concrete median and flipped over.

All this raises the question: Is it a bad idea for the CEO of a technology company that hasn’t been able to deliver on its promised production numbers, and that thus far has enjoyed a “go do your thing and report back with something amazing” attitude from regulators, to brush off safety concerns? (Meanwhile, Ford, GM and FCA are mired in recall after recall.)

Or is Musk’s bold, take-no-prisoners attitude best for a company that’s built its brand on disruption?

Jacob Chapman, a partner with Gelt Capital, a transportation and technology venture capital firm in San Francisco, thinks the hubris is in keeping with Musk’s persona, and that it’s what Musk’s core audience expects.

“I think the biggest mistake Musk made is calling it Autopilot,” Chapman says. Chapman notes that, unlike Google’s self-driving cars, Teslas with the Autopilot software are not self-driving vehicles with the more sophisticated Level 3 LiDAR sensors (technology that uses lasers to measure distance). Teslas carry only Level 2 ultrasonic and video sensors, making Autopilot more like an advanced cruise control where the driver must still keep hands near the wheel.

But Musk’s marketing “mistake” was deliberate, says Chapman. “Elon Musk is a pretty savvy guy. He knew that calling it Autopilot would be good from a marketing perspective. He could have called it a lot of things that are bland and also more accurate than what it is.”

Rob Enderle, technology analyst for the Enderle Group, says that Musk’s approach to Autopilot is “wrong.”

“It reminds me a lot of GM and the Corvair,” he says, referring to General Motors' attempt in the 1960s to dismiss consumer advocate Ralph Nader’s safety concerns after drivers died in crashes when they were impaled by the car’s steering column.

“Instead of just trying to work with Nader and fix the problem, they decided that they’d misdirect and try to put Nader out of business, and instead they made Nader famous,” Enderle says. Similarly, Tesla "called the product Autopilot, people used it inappropriately, and somebody died.

“As far as I know only one person has died so far, but it does open Tesla to liability because if somebody’s led to believe something will do something and it doesn’t and they die as a result, well as the manufacturer of the product you should be liable. And it really isn’t Autopilot. It’s enhanced cruise control; it’s not self-driving. It is being positioned as more than it is and the company should be doing everything it can to make sure it doesn’t continue. And if they continue to fight this, it could kill the company.”

Unlike GM, Tesla doesn’t have other car models to fall back on, says Enderle. “If Tesla loses Tesla, that’s it.”

Enderle sees the Autopilot controversy damaging other self-driving efforts. "People are led to believe that self-driving cars are unsafe, and [Autopilot] isn't self-driving technology, and self-driving cars are not [unsafe]."

In a letter to President Obama on Wednesday, a group of consumer safety advocates, including former NHTSA Administrator Joan Claybrook, called for more regulation of self-driving technology writ large, while framing concerns specifically around Autopilot incidents.

“Technology that cannot sense a white truck in its path, and that fails to brake when a collision is imminent, has no place on the public roads,” the letter stated, referring to an accident in Florida in which Autopilot, on a sunny day, failed to detect a truck making a left turn ahead, and the Tesla crashed into it.

“By releasing Autopilot prematurely in Beta mode, Tesla is unconscionably using our public highways as a test lab and its customers as human guinea pigs,” said the letter.

Appealing to tech enthusiasts is one thing, says Chapman. But Tesla wants to get bigger: it has taken pre-orders for half a million Model 3s. “They’re definitely making the outreach where they want to become a household brand. … I think they’ve managed to get away with a lot because their current customer base is really tech forward. That’s not where they want to be ten years from now.

“I think when the Model 3 starts hitting the road, you’ll start seeing some tempering of what they’re doing. I think they’ll be less aggressive in the technology they put into the mass market car.”
