
Is Tesla heading for more trouble over its self-driving claims?

US senators believe brand is falsely advertising autonomous driving

Tesla cars running on Autopilot have collided with parked emergency vehicles. PHOTO FROM TESLA

Firebrand CEO Elon Musk likes to boast to his 59 million Twitter followers about the full self-driving (FSD) capabilities of his electric cars, but it seems not everyone is taken in by his company’s claims. Two Democratic senators in the United States are currently calling for an investigation by the Federal Trade Commission into the way Tesla advertises its Autopilot and FSD features. Senators Richard Blumenthal of Connecticut and Edward Markey of Massachusetts have written to FTC chair Lina Khan, asking her to investigate what they believe are “potentially deceptive and unfair practices” in the way the carmaker markets these technologies.

It’s the second bit of bad news for the brand within a few days, after the National Highway Traffic Safety Administration launched a formal investigation into a spate of crashes involving Teslas and first-responder vehicles. According to the agency, there have been at least 11 instances since January 2018 in which a Tesla running on Autopilot collided with vehicles that had stopped ahead of it to attend to other incidents. The majority of these collisions happened at night, and many of the crash scenes had even been secured with flares or other warning lights, obstacles that Tesla’s systems apparently still failed to recognize.

Blumenthal and Markey wrote: “We fear that Tesla’s Autopilot and FSD features are not as mature and reliable as the company pitches to the public.” Their targets are, of course, various comments Musk has made on the topic, as well as a YouTube video published by Tesla called “Full Self-Driving.” Their argument is that all of this leads consumers to believe their cars are capable of driving themselves, when in reality, Tesla vehicles are still a long way from achieving that. On the vehicle-autonomy scale that runs from Level 0 to Level 5, Tesla has only reached Level 2 so far, meaning its cars offer partial driving automation but still very much require a human operator behind the wheel. This places the firm behind the likes of Audi and Honda, which both have Level 3 cars on the market today.

Any customer could be fooled into thinking that Teslas can really drive by themselves. SCREENSHOT FROM TESLA

True self-driving at Level 5, where no human driver is required anymore and even traditional driving controls can be removed, is still some time away. A lot of time, actually. The systems needed to make everything work are simply too complex, even in a perfect road environment with standardized signs and orderly traffic. To give you a simple example of just how complicated this is: When we went to Australia to drive the Chevrolet Colorado, the engineers there also talked about autonomous driving and its many challenges. They even had trouble getting their systems to recognize the same traffic signs across different states, because the signage within the country wasn’t standardized enough.

Don’t even start thinking about using this tech on the chaotic roads of Metro Manila, where even the most advanced in-car computer will quickly capitulate (and indeed, some manufacturers switch off part of their driver-assistance features on vehicles for sale in the Philippines). Elon Musk likes to maintain that his expensive EVs can drive themselves, but the truth is that they need humans doing most of the driving to get from A to B. Claiming otherwise is at the very least questionable, and it will be interesting to see what the outcome of any FTC probe will be.



Frank Schuengel

Frank is a German e-commerce executive who loves his wife, a Filipina, so much he decided to base himself in Manila. He has interesting thoughts on Philippine motoring. He writes the aptly named ‘Frankly’ column.


