Tesla shares fall as Feds launch investigation of nearly 2.9 million vehicles

Auto insurance issues come to the fore as manufacturer's software questioned

Insurance News

Self-driving cars have been billed as one of the major uncertainties facing the auto insurance industry. If the car drives itself, who is at fault in an accident? That question is suddenly front and center as news breaks that Tesla's software systems are under federal scrutiny.

Federal auto safety regulators have opened a sweeping investigation into nearly 2.9 million Tesla vehicles fitted with the company’s Full Self-Driving (FSD) system, following dozens of reports that the software caused cars to run red lights, veer into oncoming traffic, and crash.

The National Highway Traffic Safety Administration (NHTSA) said the new inquiry will review 58 incidents involving traffic violations attributed to FSD, including 14 crashes that left 23 people injured. Officials said they are examining whether Tesla’s driver-assistance technology “induced vehicle behavior that violated traffic safety laws.”

Six of the reported crashes occurred when Teslas drove into intersections against red traffic signals, according to NHTSA. Four of those collisions resulted in injuries.

The probe covers nearly every Tesla in the United States equipped with FSD, a Level 2 driver-assistance system that requires constant human supervision. Regulators said they are also reviewing how FSD performs near railroad crossings, after lawmakers raised concerns about near misses in recent months.

“This is a preliminary evaluation,” the agency said, describing it as the first step toward a possible recall if an unreasonable risk to safety is found.

Expanding Regulatory Scrutiny

The investigation extends a series of federal reviews into Tesla’s automated-driving features. The company has faced criticism for marketing its systems with names such as “Full Self-Driving” and “Autopilot,” which suggest higher levels of automation than the technology actually provides. Tesla maintains that FSD “will drive you almost anywhere with your active supervision, requiring minimal intervention,” but that it does not make the car autonomous.

A driver in Houston told NHTSA in 2024 that their Tesla “is not recognizing traffic signals. This results in the vehicle proceeding through red lights, and stopping at green lights. Tesla doesn’t want to fix it, or even acknowledge the problem, even though they’ve done a test drive with me and seen the issue with their own eyes.”

Tesla issued a software update this week but has not commented publicly on the investigation.

Legal and Financial Pressures Mount

The probe lands amid a wave of litigation and investor concern. In August, a Miami jury found Tesla partly responsible for a 2019 fatal crash involving its Autopilot system — a separate but related driver-assistance feature — awarding more than $240 million in damages. Tesla said it would appeal.

Congress and state regulators have stepped up scrutiny of automated driving systems, questioning whether safety claims have outpaced reality. The confirmation of a new NHTSA administrator has also accelerated enforcement actions across the automotive sector.

Tesla’s other automation features remain under review. Earlier this year, NHTSA began investigating 2.6 million vehicles with a remote “summon” function, which allows owners to direct their cars to drive toward them without a driver inside. That feature has been linked to several parking-lot crashes.

Another inquiry launched in 2024 covers 2.4 million Teslas equipped with FSD after collisions in fog and glare, including one fatality. A separate August investigation is examining whether Tesla has failed to report accidents to regulators as required by federal law.

Implications for the Insurance Industry

The investigation carries significant implications for insurers and the allocation of liability in semi-autonomous driving. Traditionally, insurers have borne the cost of collisions through personal motor coverage, with drivers held legally responsible for their own conduct. But if regulators or courts conclude that crashes stem from defective vehicle software, the balance could shift toward manufacturer liability.

Such a shift would be profound. It could reshape the auto insurance market by moving claims away from personal lines toward product liability, professional indemnity, or manufacturer-level captive insurance. Underwriters would need to model not only driver risk, but also the failure probabilities of complex, AI-driven software systems.

Carriers might have to develop new pricing mechanisms to reflect dual exposure: driver negligence and potential design flaws. Some industry executives already foresee hybrid policies combining motor, technology errors and omissions, and product liability coverage to account for such risk-sharing.
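To make the dual-exposure idea concrete, here is a minimal sketch in Python. All figures and the attribution split are hypothetical, chosen only to illustrate how an expected annual loss might be decomposed into a driver-negligence share (priced into personal motor cover) and a software-defect share (priced into product liability cover); real actuarial pricing would involve far richer frequency and severity models.

```python
# Toy illustration with hypothetical numbers: splitting expected loss
# between driver negligence and a manufacturer software defect.

def hybrid_premium(base_frequency, avg_severity, software_fault_share,
                   expense_load=0.25):
    """Decompose expected annual loss into driver and manufacturer parts.

    base_frequency       -- expected claims per vehicle-year (assumed)
    avg_severity         -- average cost per claim (assumed)
    software_fault_share -- fraction of losses attributed to defective
                            software rather than driver conduct (assumed)
    expense_load         -- insurer expense/profit loading (assumed)
    """
    expected_loss = base_frequency * avg_severity
    driver_component = expected_loss * (1 - software_fault_share)
    manufacturer_component = expected_loss * software_fault_share
    # Personal motor premium covers only the driver-negligence share;
    # the software-defect share would fall to product liability coverage.
    motor_premium = driver_component * (1 + expense_load)
    product_liability_premium = manufacturer_component * (1 + expense_load)
    return motor_premium, product_liability_premium

# Hypothetical example: 0.05 claims/year, $20,000 average claim,
# 30% of losses attributed to software rather than the driver.
motor, product = hybrid_premium(0.05, 20_000, 0.30)
print(f"motor: ${motor:.2f}, product liability: ${product:.2f}")
```

The point of the sketch is that as `software_fault_share` rises (for example, if regulators establish that a defect caused a class of crashes), premium volume migrates from the personal motor line to the product liability line without the total expected loss changing.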

For insurers, Tesla’s FSD probe is more than a question of driver safety—it is a preview of how fault may be assigned in the coming era of semi-autonomous vehicles. A single software update that fails to recognize a red light could create systemic liability exposure across millions of cars.

Market Reaction

Tesla shares fell roughly 2 percent on Thursday. Analysts said investors are increasingly sensitive to reputational and legal risks associated with automation, a technology once viewed as the company’s strongest advantage.

As the NHTSA inquiry progresses, regulators, courts, and insurers will all be watching not only for a potential recall, but also for precedent-setting decisions about responsibility in an age when software, not humans, may be at the wheel.

Stephen Owens
