In 2016, Tesla announced to the world the greatest technological innovation the automotive industry had seen in decades: self-driving cars. No warnings, asterisks, or fine print. A promotional video released by the brand that year opened with a disclaimer: “The person in the driver’s seat is only there for legal reasons. He’s not doing anything. The car drives itself.” As it turns out, that wasn’t the case.
The video shows the Model X doing things the system was not programmed to do, Autopilot software director Ashok Elluswamy revealed in court testimony following a fatal crash that led to a lawsuit against the company in 2018.
In the deposition transcripts, to which the Reuters news agency had access, the Tesla engineer explains that his team received orders from CEO and founder Elon Musk to create a video that “demonstrated the capabilities of the system.”
“The intent of the video was not to demonstrate exactly what was available to consumers in 2016, but rather to show what was possible to build into the system,” he explained.
[Video: “Full Self-Driving Hardware on All Teslas,” Tesla on Vimeo]
The entire video was staged, far from a real driving scenario. The route was pre-programmed with the help of a 3D map, from a house in Menlo Park, California, to Tesla’s then-headquarters in Palo Alto, and the driver had to intervene several times.
The system was not even set up for the car to stop at a red light and proceed when it turned green, as shown in the video. And while attempting to demonstrate that the car could park itself, with the driver already out of the vehicle, the test Model X hit a fence in the Tesla parking lot, says Ashok Elluswamy.
However, when the video was posted, Elon Musk described it this way on Twitter: “Tesla drives itself (without any human intervention) through urban streets to highway to streets, then finds a parking spot.”
Tesla drives itself (without any human intervention) through urban streets from highway to streets, then finds a parking spot https://t.co/V2T7KGMBo
-Elon Musk (@elonmusk) October 20, 2016
A more cautious message is conveyed on the company’s website, which states that the Autopilot technology was designed to assist with steering, braking, speed, and lane changes, but “does not make the vehicle autonomous.”
Tesla has been the subject of a criminal investigation since 2021 over the claim that the brand’s cars can “drive themselves,” following numerous accidents, some of them fatal.
In the case that led to Ashok Elluswamy’s court testimony, a 2018 crash that killed Apple engineer Walter Huang, the National Transportation Safety Board concluded the accident was caused by the driver’s distraction and by the limitations of Autopilot, which failed to detect that the driver was not paying attention.
Officially, the brand advises drivers to keep their hands on the wheel when using Autopilot, but there are ways to “cheat the system,” Ashok Elluswamy explained in court, arguing that the technology is safe as long as the car is not, in fact, driving itself.
Source: TSF