Raffi Krikorian knows a thing or two about self-driving cars. As the former head of Uber’s autonomous vehicle division, he spent years training safety drivers to intervene when the system failed. During his two-year tenure, Uber’s pilot program recorded zero accidents. Zero. So when his Tesla Model X, running Full Self-Driving (FSD), suddenly lost control on a routine Sunday drive through the Bay Area and slammed into a concrete wall, the irony wasn’t exactly subtle.
Krikorian had his son in the back seat. He walked away with a concussion and days of neck pain. The kid was unharmed. The car was totaled. And on the insurance report, only one name appeared: his.

Tesla’s FSD is classified as a Level 2 driver-assistance system, which in legal terms means the human is always responsible, regardless of what the software decided to do in the half-second before impact. Tesla gets the credit when the system works flawlessly. The driver gets the bill when it doesn’t. Krikorian notes that in a Florida wrongful-death case, the plaintiffs had to hire outside hackers to extract crash data from the vehicle’s chip after Tesla claimed the information wasn’t available.
But the most unsettling part of Krikorian’s account, published in The Atlantic, isn’t the crash itself. It’s what led to it. He describes how FSD’s near-perfect performance across hundreds of drives gradually eroded his alertness. Monitoring a system that almost never fails is, paradoxically, more dangerous than driving a car that’s openly unreliable. An unreliable machine keeps you sharp. A perfect one makes you a passenger. A nearly perfect one gets you killed.

The Insurance Institute for Highway Safety has data on this effect: drivers who use adaptive cruise control for just one month are six times more likely to reach for their phones. And after a system disengages unexpectedly, it takes the average driver five to eight seconds to fully re-engage.
Krikorian intervened. The logs confirm he turned the wheel. It wasn’t enough. He closes with a pointed comparison: in July 2025, BYD announced it would assume liability for accidents caused by its automatic parking feature, with no insurance claims and no impact on the driver’s record. A limited policy, sure, but proof that sharing responsibility between manufacturer and driver is a choice, not an impossibility. Tesla has simply chosen differently.