Federal Regulators Expand Probe into Tesla's Full Self-Driving Over Fatal Crashes, Encompassing 2.4 Million Vehicles


A recent social media post by Adam Lowisz, stating, "This is why we need Tesla FSD. Tesla FSD would prevent this," has reignited public discussion surrounding the safety and capabilities of Tesla's Full Self-Driving (FSD) system. The sentiment comes as federal regulators intensify their scrutiny of the advanced driver-assistance technology, particularly following a series of incidents, including a fatal pedestrian collision.

As of August 2025, Tesla's FSD Beta software, at version 12.4.1, remains classified as SAE Level 2 automation. The system offers features such as urban driving and intersection navigation, but it requires active driver supervision at all times: Tesla explicitly states that drivers must remain attentive and prepared to take over control, as the software does not make the vehicle fully autonomous.
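
For readers unfamiliar with the SAE levels, the short Python sketch below makes the Level 2 contract concrete: the system assists with steering and speed, but it must continuously confirm that the human driver remains engaged. Every name and rule here is an illustrative assumption for this article, not Tesla's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    """Simplified driver-monitoring snapshot (hypothetical fields)."""
    eyes_on_road: bool
    hands_on_wheel: bool

def assistance_may_stay_engaged(driver: DriverState) -> bool:
    """SAE Level 2 in one rule: the feature may assist, but the human
    is the fallback at all times, so engagement is conditional on
    continuous evidence of driver attention."""
    return driver.eyes_on_road and driver.hands_on_wheel

# An inattentive driver should trigger escalating alerts and,
# ultimately, disengagement of the assistance feature.
if not assistance_may_stay_engaged(DriverState(eyes_on_road=False, hands_on_wheel=True)):
    print("Driver attention lost: alert, then disengage assistance")
```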

The National Highway Traffic Safety Administration (NHTSA) initiated a significant investigation in October 2024, covering approximately 2.4 million Tesla vehicles equipped with FSD. This probe was prompted by at least four reported collisions where FSD was engaged, including a fatal incident in November 2023 where a pedestrian was struck by a Tesla Model Y in Arizona. These crashes notably occurred under conditions of reduced roadway visibility, such as sun glare, fog, or airborne dust.

NHTSA's inquiry is assessing FSD's engineering controls and their ability to detect and respond appropriately to challenging visibility conditions. The agency has previously ordered recalls of FSD Beta (February 2023), over the system's handling of traffic safety laws, and of Autopilot (December 2023), over insufficient safeguards against driver misuse. The U.S. Justice Department has also issued subpoenas concerning Tesla's FSD and Autopilot systems.
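
The kind of "engineering control" NHTSA is examining can be pictured with a toy example: a system that scales back, or refuses, assistance as perception confidence drops in glare, fog, or dust. The sketch below is purely hypothetical; the confidence score, thresholds, and behavior are assumptions made for illustration, not a description of how FSD works.

```python
def assist_speed_cap(perception_confidence: float, base_limit_mph: float) -> float:
    """Illustrative control: reduce the assistance feature's maximum
    speed as perception confidence falls, and decline to engage at all
    below a floor. The 0.0-1.0 confidence score and the 0.3 threshold
    are arbitrary placeholders.
    """
    if perception_confidence < 0.3:
        return 0.0  # conditions too degraded: do not engage
    return base_limit_mph * min(1.0, perception_confidence)

print(assist_speed_cap(0.9, 65.0))  # clear conditions: ~58.5 mph cap
print(assist_speed_cap(0.4, 65.0))  # heavy glare or fog: ~26.0 mph cap
print(assist_speed_cap(0.2, 65.0))  # 0.0 -> system declines to engage
```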

Critics and safety experts have raised concerns about Tesla's "camera-only" approach to autonomous driving, arguing it may be less robust in adverse conditions than systems that integrate additional sensors such as lidar and radar. Leaked "Tesla Files" reported in July 2025 further revealed thousands of customer complaints about unintended acceleration and braking issues, along with more than 1,000 documented crashes, underscoring ongoing safety questions and gaps in data transparency during accident investigations. Tesla's internal safety reports have also drawn criticism for counting only crashes in which an airbag deployed, a methodology that can understate overall incident rates.
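
The sensor-redundancy argument has a simple probabilistic core: if two independent sensors can each miss an object, the chance that both miss it is the product of their individual miss probabilities. The toy calculation below shows why a radar fallback matters when a camera is blinded by glare; the confidence numbers are invented for illustration, and the independence assumption is itself a simplification.

```python
from typing import Optional

def fused_detection_confidence(camera: Optional[float], radar: Optional[float]) -> float:
    """Toy redundancy model: assuming independent sensors, the fused
    miss probability is the product of each sensor's miss probability.
    A `None` input means the sensor is absent from the stack."""
    miss = 1.0
    for confidence in (camera, radar):
        if confidence is not None:
            miss *= (1.0 - confidence)
    return 1.0 - miss

print(fused_detection_confidence(0.2, 0.9))   # glare-degraded camera + radar: ~0.92
print(fused_detection_confidence(0.2, None))  # camera-only in the same glare: ~0.2
```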

Despite regulatory warnings and ongoing investigations, Tesla CEO Elon Musk continues to promote FSD's capabilities, including plans for robotaxi services. However, NHTSA has cautioned against public communications that might encourage drivers to disengage from the driving task, reiterating that Tesla vehicles are not self-driving and require constant human oversight. The ongoing federal oversight underscores the significant challenges and safety debates surrounding the widespread deployment of advanced driver-assistance systems.