Tesla finds itself once again at the center of a major safety investigation by U.S. federal regulators, raising serious questions about the reliability of its much-touted Full Self-Driving (FSD) technology. Despite promises of cutting-edge automation, Tesla vehicles operating in Full Self-Driving mode have reportedly run red lights or driven on the wrong side of the road in numerous instances. Some of these incidents resulted in collisions and injuries, prompting concerns about the system's readiness and safety.
According to a filing from the National Highway Traffic Safety Administration (NHTSA) dated Tuesday, the agency has received 58 separate reports of Tesla vehicles violating traffic laws while operating in Full Self-Driving mode. Notably, many drivers reported that the hazardous behavior occurred without any prior warning from the car, leaving them little time to react.
This investigation adds to a series of legal and regulatory challenges Tesla is facing. Just a couple of months ago, a jury in Miami held Tesla partly liable for a fatal 2019 crash linked to its Autopilot driver-assist feature—a system distinct from full self-driving—resulting in a hefty damages award exceeding $240 million to the victims' families. Tesla has announced it will appeal this decision.
The current investigation covers nearly 2.9 million Tesla vehicles equipped with one of two versions of the FSD software: Level 2 "Full Self-Driving (Supervised)," which requires drivers to remain fully attentive and ready to take control at all times, and a more advanced version still in testing that Tesla says will require no driver intervention at all. That latter system has long been promoted by Tesla CEO Elon Musk, who has promised its rollout for years but has yet to fully deliver.
This probe continues a troubling pattern, as earlier investigations have linked Tesla's FSD to multiple crashes and even fatalities. Tesla insists that its system is not truly autonomous and that drivers must always be alert and prepared to intervene. However, the real-world results seem to challenge this narrative.
Beyond FSD, Tesla is also under scrutiny by the NHTSA for its “summon” feature, which allows cars to autonomously navigate to their owners, a technology that has reportedly caused minor collisions in parking lots. Additionally, a separate NHTSA probe launched last year targets driver-assistance functions in 2.4 million Teslas after crashes in conditions with low visibility such as fog, including a tragic case where a pedestrian died.
Moreover, in August, NHTSA opened yet another investigation focusing on Tesla’s apparent delays in reporting crashes promptly to the agency—a legal requirement designed to ensure swift action and public safety.
Federal investigations of this kind often pave the way for vehicle recalls, putting Tesla under intense pressure to demonstrate that its latest software updates genuinely improve safety and reliability, ideally to the point where drivers no longer need to constantly monitor the road.
Elon Musk has set an ambitious target to have hundreds of thousands of fully self-driving Tesla cars and robotaxis operating on public roads by the end of next year. But the big question remains: can Tesla overcome these safety challenges and regulatory hurdles, or are these investigations signaling deeper issues with the technology?
In the meantime, Tesla's stock took a hit, falling 2% on Thursday amid the mounting scrutiny. Whether these investigations prove to be fair warnings that push Tesla to improve, or a case of overly cautious regulators holding the company back, will become clearer as the probes unfold.