Elon Musk is no stranger to controversy, and the maverick billionaire’s car company is back in the limelight after releasing a driving mode that allows its vehicles to exceed speed limits and drive themselves more aggressively.
The U.S. National Highway Traffic Safety Administration (NHTSA) is seeking details from Tesla Inc. about a newly reintroduced driver-assistance feature called “Mad Max” — a mode that allows vehicles to operate at higher speeds and make more assertive maneuvers than other versions of its Full Self-Driving (FSD) system.
The regulator confirmed it is “in contact with the manufacturer to gather additional information,” following reports that vehicles equipped with the feature have been seen overtaking aggressively, weaving through traffic, and exceeding posted speed limits. The agency reiterated that drivers remain legally responsible for vehicle operation and adherence to traffic laws, even when driver-assistance systems are engaged.
The inquiry comes just weeks after NHTSA opened a broader investigation into roughly 2.9 million Tesla vehicles equipped with FSD, citing 58 reports of potential safety issues — including 14 crashes and 23 injuries — involving red-light violations and erratic lane behavior.
Tesla has not commented publicly on the new inquiry, but company materials describe the mode as offering “higher speeds and more frequent lane changes” compared to other FSD profiles.
This is not the first time Tesla has faced questions over how its software manages assertive driving behavior. In 2022, the company issued a recall affecting more than 50,000 vehicles to disable a “rolling-stop” feature that allowed cars to move through intersections without a complete halt. Federal regulators at the time said the feature violated traffic laws.
The return of a similarly aggressive driving mode has drawn renewed attention, particularly as Tesla remains under scrutiny from multiple government agencies, including the California Department of Motor Vehicles, over how it markets and names its Autopilot and Full Self-Driving systems.
Earlier this month, Tesla’s latest software release introduced two new driving profiles at opposite ends of the speed spectrum: “Sloth Mode,” designed for slower travel, and “Mad Max,” which emphasizes faster lane changes and higher cruising speeds. The name — borrowed from the dystopian film franchise known for chaotic driving — has raised questions among safety advocates about Tesla’s messaging at a time when its automation systems remain under investigation.
For insurers, the development is significant. Advanced driver-assistance systems (ADAS) like Tesla’s FSD are already reshaping the risk landscape — complicating questions of liability when accidents occur and prompting a rethink of how risk is priced.
The reintroduction of a mode explicitly tied to more assertive driving could influence underwriting assumptions around the frequency and severity of claims. Industry observers note that insurers are closely watching how regulators define “driver responsibility” in semi-automated systems. If an incident occurs while a driver-assist mode is engaged, the balance of liability between the human driver and the manufacturer remains a grey area.
NHTSA has emphasized that Tesla’s FSD remains a supervised system that requires constant driver attention. It does not make vehicles autonomous. Yet, as investigations multiply, insurance professionals are increasingly concerned about how quickly manufacturers are deploying features that blur the line between assistance and autonomy.
The outcome of the federal inquiry into “Mad Max” mode could therefore extend beyond Tesla, influencing how regulators and insurers alike evaluate future automated-driving technologies.
Globally, regulators have tended to take a conservative stance on self-driving features. European and Canadian regulators have placed stricter limits on automated lane changes and speed adjustments without driver input. The U.S., by contrast, has relied largely on post-launch enforcement — a reactive model that puts more emphasis on recalls and investigations after potential safety violations arise.
For insurers, that approach means underwriting automation risk remains an exercise in uncertainty. If Tesla’s “Mad Max” mode is found to induce unsafe or illegal driving behavior, carriers may revisit assumptions about how driver-assistance systems interact with human oversight, and how policy exclusions should be worded to address technology-induced negligence.
While Tesla markets its FSD as capable of navigating complex roads with minimal intervention, the company stresses that human drivers must remain alert and in control at all times. For now, regulators appear focused on ensuring that drivers — not algorithms — bear the ultimate responsibility for compliance with traffic law.