The cars are definitely electric. That second objective, self-driving, has turned out to be more elusive, according to a new documentary.
Informed by the New York Times reporting of Cade Metz and Neal Boudette, director Emma Schwartz’s “Elon Musk’s Crash Course” raises a skeptical eyebrow at Tesla’s vaunted Autopilot feature, which is sometimes described as self-driving software. The film argues that Autopilot has failed to deliver on its promise and that lives have been put at risk as a result. Here are three main arguments that Schwartz’s film puts forward.
1. Despite Tesla’s claims that its technology would revolutionize car safety, its cars have sometimes failed to recognize hazards while in Autopilot mode, and Tesla drivers have died in crashes while using it.
According to “Elon Musk’s Crash Course,” a 2016 investigation by the National Highway Traffic Safety Administration (NHTSA) found that about 38 Tesla crashes had occurred in the United States while the cars were in Autopilot mode; the film focuses on three in which the driver died.
The first is that of Josh Brown, a bomb-disposal specialist for the US Navy during the Iraq War and the founder of a company that sought to expand Internet service into rural America. Described by his friends as a passionate tech enthusiast, Brown loved his Tesla and often filmed videos from behind the wheel. When Musk retweeted one such video in April 2016, in which the car, in Autopilot mode, steered itself out of the path of a truck that was merging too aggressively, Brown was elated.
The following month, Brown was driving in the same mode through Williston, Florida, after leaving Disney World, when his Tesla ran under a tractor-trailer without slowing down. Brown, 40, was killed in the crash. (Despite rumors that Brown had been watching a movie, the documentary makes clear that no movies were found on his laptop. Still, the NHTSA and the National Transportation Safety Board, or NTSB, faulted Brown for not paying attention to the road.) In the film, Musk can be heard in an audio recording saying that radar upgrades added to the Autopilot software after the accident might have saved Brown’s life.
In March 2018, 38-year-old Apple engineer Walter Huang died when his Tesla, running in Autopilot mode, hit a concrete barrier in Mountain View, California, traveling at more than 70 mph. Former NTSB chairman Robert L. Sumwalt says on screen that Huang was playing a video game.
And in March 2019, Jeremy Banner, 50, was killed in another Florida crash, nearly identical to the one that killed Brown. His Tesla was on Autopilot when a tractor-trailer crossed the road ahead; the car failed to recognize the side of the trailer in the bright sunlight and ran under it, shearing off the roof.
Sumwalt says in “Crash Course” that Tesla ignored the board’s safety recommendations after the crashes. “If innovation is going to be implemented, we have to make sure it happens safely,” he says, “otherwise it becomes the Wild West out there.”
2. Some former Tesla engineers privately harbored doubts about Musk’s public promises that Tesla’s cars could drive themselves.
Despite Musk’s claim in 2015 that autonomous driving was essentially a “solved problem,” several former staffers in “Crash Course” argue that, behind closed doors, this was far from the case.
For example, they say certain decisions were made somewhat arbitrarily, such as the choice to rely on cameras rather than lidar, a laser-based sensor widely used by other autonomous-vehicle developers. “There was no in-depth research phase where different vehicles were equipped with an array of sensors. A lot of team members would have liked that,” said Akshat Patel, Autopilot’s technical program manager from 2014 to 2015. “Instead, the conclusion was drawn first, and the testing and development activities began in order to prove that conclusion correct.”
Others say they were concerned that Autopilot was being sold to, and used by, people who believed it would deliver the elevator-like transportation experience Musk had once described: drivers who thought they could get in, name a destination, and then sit back and relax. When Brown’s crash happened, “I was aware that people were relying on the system to do things it was not designed or capable of doing,” said JT Stukes, a senior project engineer at Tesla from 2014 to 2018. “The fact that that kind of accident happened is of course tragic, but it was going to happen.”
Raven Jiang, an engineer who also worked on Autopilot at Tesla from 2015 to 2016, notes that around the same time, Elizabeth Holmes’s deceptions at Theranos were being revealed to the public. “Some of those stories were in the back of my mind,” Jiang says. “It certainly made me question a lot more what was behind some of this public optimism.”
3. Even so, Tesla continues to enjoy fervent public support.
The most recent footage in “Crash Course” is from last month. Musk, dressed in a black cowboy hat and black aviator sunglasses, grins onstage in front of a whooping, ecstatic crowd at the launch party for Tesla’s new Gigafactory in Austin. Partygoers hold up their phones to film him as he speaks, a stark reminder that Musk is a mega-celebrity and a hero to many.
A Tesla owner, Alex Poulos, points out that Musk superfans sometimes refer to themselves as “Musketeers.” Kim Paquette, another Tesla owner who is part of an elite group testing new versions of the self-driving software, shows off her collection of Hot Wheels-sized Teslas and says she is “honored” to participate in the testing process. “People who buy a Tesla understand that it’s not self-driving yet,” she says. Even Brown’s family says that part of Joshua’s legacy is that the accident that caused his death led to further improvements, making the new technology even safer. “Our family takes comfort and pride in the fact that our son is making such a positive impact on future road safety,” they said in a statement released on their behalf at a dedication held in his honor.
And yet, says Poulos: “Full Self-Driving, I paid for that and I don’t have it. It’s right there in the name, right? And I don’t think that’s fair.”
“Musk, I think he has a huge responsibility,” he adds. “I think he should be a little more careful about what he tells his followers.”