Elon Musk’s attack on Waymo is a hypocritical projection

By Ippolito Visconti
Musk recently had the gall to dismiss Waymo’s current fleet of 2,500 operational, fully self-driving vehicles as “rookie numbers”.

Year after year since 2018, Elon Musk has delivered his famously reliable prediction: Tesla will finally solve Full Self-Driving “by the end of the year” or, failing that, “next year”. It has never happened. This year’s supposed victory lap was the launch of a Robotaxi service in Austin, Texas, an effort so small-scale and geofenced that it immediately exposed Musk’s hypocrisy, since he has previously criticized Waymo for the exact same limitations.

The situation becomes even more ironic when comparing fleet sizes and safety records. Even though Tesla runs its service with human supervisors onboard, its Robotaxi crash rate is nearly double that of Waymo, which operates its fully autonomous service with no human employees in the vehicle.

Yet Musk recently had the gall to dismiss Waymo’s current fleet of 2,500 operational, fully self-driving vehicles as “rookie numbers”. To put that comment in context, Tesla is believed to have a paltry 30 Robotaxis in Austin. While the company claims to run a service in the Bay Area, it is officially categorized as simple ride-hailing because safety drivers remain behind the wheel, and Tesla hasn’t even applied for the necessary autonomous driving permit in California.

Tesla has resorted to increasingly disingenuous claims that its Full Self-Driving (FSD) system is statistically “safer than humans”. These claims rest on the quarterly “Autopilot Safety Report”, a document notoriously optimized for PR rather than transparency.

The report suffers from three main flaws. First, the methodology is self-defined: Tesla counts only crashes severe enough to trigger an airbag deployment, conveniently omitting minor fender-benders. Second, there is a distinct road-type bias: Autopilot is predominantly used on safer, limited-access highways, which skews the comparison against human drivers, whose data includes dangerous urban and rural roads. Finally, the driver mix is inherently safer: Tesla owners are typically affluent, tech-savvy early adopters who statistically crash less often than the general public.
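
To see how the road-type bias alone can distort a per-mile comparison, consider a minimal sketch with entirely hypothetical numbers (the crash rates and mileage shares below are invented for illustration and do not come from Tesla, Waymo, or any safety report):

```python
# Illustrative sketch of road-type mix bias (all numbers are hypothetical).
# Suppose drivers crash at different per-mile rates depending on road type:
HIGHWAY_RATE = 1.0  # hypothetical crashes per million miles on highways
URBAN_RATE = 4.0    # hypothetical crashes per million miles on urban/rural roads

# Hypothetical mileage shares: humans drive everywhere, while Autopilot
# is engaged mostly on limited-access highways.
human_mix = {"highway": 0.4, "urban": 0.6}
autopilot_mix = {"highway": 0.95, "urban": 0.05}

def blended_rate(mix: dict[str, float]) -> float:
    """Weighted crashes per million miles for a given road-type mix."""
    return mix["highway"] * HIGHWAY_RATE + mix["urban"] * URBAN_RATE

# Assume the system is exactly as safe as a human on every road type.
# Its blended rate still looks far better purely because of where it drives:
print(f"Human blended rate:     {blended_rate(human_mix):.2f}")      # 2.80
print(f"Autopilot blended rate: {blended_rate(autopilot_mix):.2f}")  # 1.15
```

In this toy example, a system that is no safer than a human on any individual road type still posts a headline rate less than half the human average, purely because its miles are concentrated on highways. That is the distortion baked into Tesla’s comparison.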

While Tesla has recently begun separating Autopilot and FSD mileage data, the foundational problem remains a matter of semantics: even taking the flawed data at face value, the correct claim Tesla could make is that FSD with human supervision is safer than human drivers alone. The battle is not “FSD versus humans”; it is “FSD + humans” versus humans.