FSD hits the next level: Tesla wants you to text and drive

By Ippolito Visconti

Elon Musk has once again revised the timeline for Tesla’s long-promised, perpetually delayed “unsupervised” Full Self-Driving (FSD). Musk pushed the delivery date, promised every year for the last six, from its imminent end-of-year deadline to the vague realm of “a few months away”, likely sometime in 2026. This move, however, was immediately overshadowed by a much more audacious, if legally questionable, announcement.

Musk declared that Tesla is “almost” ready to allow owners using the supervised FSD system to “text and drive”, and that the feature could arrive within “a month or two”. Texting while driving is illegal in most jurisdictions, including nearly every US state, and carries significant fines and legal penalties. For Tesla to “allow” this behavior, the company would have to assume legal liability for the vehicle while FSD is engaged.


That would, by definition, mean achieving SAE Level 3 to 5 autonomy, the long-promised status in which the car no longer requires human supervision. But there is zero evidence that Tesla has even begun the exhaustive legal and regulatory steps such a shift requires.

For now, the company limits its truly autonomous operations to internal pilot programs, such as the Robotaxi fleet in Austin, where onboard supervisors intervene roughly once every 100,000 kilometers to prevent an accident. That supervised reality stands in stark contrast to the CEO’s fantasy of enabling distracted driving.

Musk’s only defense for this dangerous proposal is that Tesla will “look at the data” before enabling the feature. This is rich, considering Tesla is infamous for its opaque data practices.


The company’s quarterly “Autopilot Safety Report” is a masterpiece of carefully curated information. It’s designed to give the illusion of safety without providing any real answers. The methodology is self-reported, counting only crashes severe enough to deploy airbags or restraint systems. Furthermore, the report exhibits a clear road bias, as Autopilot is primarily used on inherently safer, limited-access highways, while the comparative US average includes all road types, including much more dangerous urban streets.
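To see how that road bias can distort a headline number, here is a minimal sketch with entirely hypothetical figures; the crash rates and mileage mixes below are invented for illustration, not taken from Tesla or NHTSA data. The point: a system exactly as safe as a human on every road type still posts a far better miles-per-crash figure when its miles are concentrated on highways.

```python
# Illustrative sketch only: every number here is hypothetical, not real Tesla data.
# Point: a highway-heavy mileage mix alone can flatter a miles-per-crash figure.

# Assumed crash rates per million miles, by road type.
HUMAN_RATE = {"highway": 0.5, "urban": 2.0}  # humans crash more often on urban roads

# Assume the assist system is exactly as safe as a human on each road type.
SYSTEM_RATE = dict(HUMAN_RATE)

# Assumed mileage mixes: the system runs mostly on highways,
# while the national average spans all road types.
SYSTEM_MILES = {"highway": 9_000_000, "urban": 1_000_000}
NATIONAL_MILES = {"highway": 4_000_000, "urban": 6_000_000}

def miles_per_crash(rates: dict, miles: dict) -> float:
    """Aggregate miles driven per crash for a given road mix."""
    crashes = sum(rates[road] * miles[road] / 1e6 for road in miles)
    return sum(miles.values()) / crashes

print(f"system:   one crash per {miles_per_crash(SYSTEM_RATE, SYSTEM_MILES):,.0f} miles")
print(f"national: one crash per {miles_per_crash(HUMAN_RATE, NATIONAL_MILES):,.0f} miles")
# ~1.5 million vs ~0.7 million miles: the system looks about twice as safe
# purely because of where it is driven, not how well it drives.
```

Under these assumed numbers, identical per-road performance still yields a roughly two-to-one advantage in the headline figure, which is exactly the kind of comparison the safety report invites.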

While Tesla did, for the first time, separate Autopilot and FSD mileage figures, the core issue remains: the data does not prove that FSD crashes once every X million miles. It only proves that the human-plus-FSD combination crashes once every X million miles, and only by Tesla’s own definition of a collision. Yet if the company goes through with this, it will be the first time a major automaker has actively encouraged drivers to break the law.