Tesla CEO Elon Musk is being investigated by the U.S. Securities and Exchange Commission (SEC) over allegedly false self-driving claims.
The SEC's move against Tesla seeks to determine whether the automaker misrepresented the capabilities of its Full Self-Driving (FSD) and Autopilot offerings in a video posted on its website.
Recall that earlier this month, Ashok Elluswamy, a senior Tesla engineer who worked on its Autopilot software, revealed that a video posted by Tesla on its website in 2016, showcasing one of its vehicles driving itself, was staged.
The video, which carried the tagline, “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself,” was used by Tesla as evidence that its vehicles could drive themselves by relying on their many built-in sensors and self-driving software.
In Elluswamy’s July deposition, which was used as evidence in a lawsuit against Tesla over a fatal 2018 crash involving former Apple engineer Walter Huang, he disclosed that Musk had wanted the Autopilot team to record a demonstration of the system’s capabilities.
He further stated that the video was created using 3D mapping on a predetermined route from a house in Menlo Park, California, to Tesla’s office in Palo Alto. Drivers had to intervene to take control during test runs, and footage left on the cutting-room floor included the test car crashing into a fence in Tesla’s parking lot while trying to park itself without a driver.
Musk nonetheless promoted the video at the time, tweeting that Tesla vehicles required “no human input at all” to drive from urban streets to highways and eventually find a parking spot.
Depending on the outcome of the SEC investigation, Musk could face lawsuits or other consequences, including limits on his future activity as an officer of a public company, if the agency chooses to pursue enforcement over any violations it finds.
Meanwhile, Tesla has faced litigation on several occasions over fatal crashes involving its Autopilot system. Its driver-assist technologies, Autopilot and “Full Self-Driving,” have been investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive.”
According to regulators, Tesla vehicles running its Autopilot software were involved in 273 reported crashes in 2021. In its fourth-quarter (Q4) 2021 report, Tesla disclosed that it recorded one crash for every 4.35 million miles driven with Autopilot engaged, compared with one crash for every 1.52 million miles driven by drivers not using the technology.
Tesla’s Autopilot is a suite of systems that allows drivers to cede physical control of their electric vehicles, although they must pay attention at all times.
The vehicles can maintain speed and a safe distance behind other cars, stay within their lane lines, and make lane changes on highways. An expanded set of features, called the “Full Self-Driving” beta, adds the ability to maneuver on city and residential streets, halt at stop signs and traffic lights, and make turns while navigating from point to point.