
Did Tesla fake its ‘full autopilot’ video?

The EV automaker is in hot water once again over its autonomous driving capabilities

This is the video in question, which has obviously been edited and sped up to make it watchable. SCREENSHOT FROM TESLA

Back in 2016, Tesla published a video on its blog that claimed to demonstrate the company’s ability to build self-driving cars. The blog post, called “Full Self-Driving Hardware on All Teslas,” and the roughly four-minute-long clip are still online at the time of this writing, and the video was even promoted on Twitter by chief executive Elon Musk at the time as evidence that “Tesla drives itself.”

In the video, a Tesla is seen pulling out from a driveway onto public roads and then making its way across an unidentified town. Along the way, it negotiates highways, stops at red lights and intersections, and even parks itself at its final destination.

While a driver can be seen sitting in the driver’s seat, a message at the beginning states that the person in the driver’s seat is “only there for legal reasons” and “he is not doing anything. The car is driving itself.” Now a news report claims that not all was quite as it seemed with the footage.

Autopilot uses cameras and software to identify objects and routes, without redundant sensors like LIDAR. SCREENSHOT FROM TESLA

As news agency Reuters reports, a senior engineer at the carmaker is now apparently on record as saying that the video was staged. The news agency cites a deposition involving Tesla’s director of Autopilot software, Ashok Elluswamy, that was carried out as part of legal action over a driver’s death in a Tesla in 2018.

A deposition is an opportunity for the different parties in a civil lawsuit to obtain testimony from a witness under oath prior to a trial, and the one in this case seemingly recorded Elluswamy as saying that the intent of the video “was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system.”

He goes on to say that the car was driving along a preset route, and that the human driver did intervene during trial runs. Elluswamy also stated that the car crashed into a fence in the Tesla parking lot while trying to park itself.

Tesla's self-driving software has led to fatal crashes in recent years. PHOTO FROM REUTERS

Reuters approached Elluswamy, Musk and Tesla for comment, but none of them had replied by the time the story went to print. The lawsuit in question came about as the result of the death of 38-year-old Apple engineer Walter Huang, who died in 2018 when his Model X crashed into a concrete barrier in California.

His widow is now suing Tesla, claiming that the carmaker is promoting its self-driving systems as being safer than they really are. The complaint also claims that Tesla added safety features after Huang’s death that would have saved him, including the ability to change lanes independently, transition from one highway to another, exit a highway, and activate automatic emergency braking.

The lawsuit is scheduled to go to trial in March, and is one of several filed against the firm by families of killed drivers. At least 35 crashes involving Teslas, resulting in 19 deaths, have been investigated by US traffic safety regulators since 2016. For its part, Tesla states on its website that until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain in control of their cars at all times.



Frank Schuengel

Frank is a German e-commerce executive who loves his wife, a Filipina, so much he decided to base himself in Manila. He has interesting thoughts on Philippine motoring. He writes the aptly named ‘Frankly’ column.
