Waymo, the autonomous-vehicle subsidiary of Alphabet, said on December 5, 2025, that it will voluntarily file a software recall with federal regulators after its self-driving robotaxis repeatedly failed to stop properly for school buses. At least 19 incidents have been documented this school year in Austin, Texas, where Waymo vehicles illegally passed stopped buses with stop-arms extended and red lights flashing. The recall follows an update deployed on November 17 that the company said “meaningfully improved performance,” but regulators and school-district authorities say violations continued afterward, prompting a full safety review overseen by the National Highway Traffic Safety Administration (NHTSA). Waymo insists it remains committed to safety and will continue rolling out software updates as needed.
Sources: Reuters, TechCrunch
Key Takeaways
– Waymo’s robotaxi software failed to respond reliably to stopped school buses with active stop signals, a critical safety scenario, at least 19 times this school year in Texas.
– Despite a patch released mid-November, the issues persisted, prompting Waymo to issue a formal voluntary recall rather than wait for regulatory enforcement.
– The recall deepens federal scrutiny from NHTSA, which had already opened an investigation, and raises fresh doubts about the readiness of fully autonomous vehicles to handle high-stakes, pedestrian-heavy situations.
In-Depth
The latest safety fiasco surrounding Waymo underscores the high stakes involved in deploying fully autonomous vehicles — especially in scenarios involving children, school traffic, and unpredictable pedestrian movement. On December 5, 2025, Waymo publicly committed to issuing a voluntary software recall after acknowledging that its robotaxis failed multiple times to stop for school buses with flashing red lights and extended stop-arms — a serious safety lapse that violates traffic law and puts students at risk. According to data from Texas school-district officials, at least 19 such incidents occurred this school year, with cameras capturing Waymo vehicles illegally overtaking stopped buses as children were boarding or disembarking.
The company says it initially attempted to fix the problem with a mid-November update, which it asserts brought performance “meaningfully” closer to — or even better than — human-driver behavior. But the continued reports after the update made clear the fix wasn’t sufficient. Rather than wait for regulators, Waymo opted for a formal recall, signaling both an acknowledgment of the bug’s severity and a bid to preserve public trust. By doing so, Waymo hopes to stay ahead of forced enforcement by the National Highway Traffic Safety Administration (NHTSA), which has already opened an investigation and demanded detailed information about the company’s “fifth-generation” self-driving system and school-bus handling logic.
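The “school-bus handling logic” NHTSA is probing is proprietary, but the general shape of such a safeguard is easy to illustrate. Below is a minimal, hypothetical Python sketch of the kind of rule-based check a planning layer could apply to perception output; the DetectedVehicle fields, the ManeuverDecision type, and the exemption for oncoming traffic on a divided roadway are illustrative assumptions drawn from common U.S. school-bus stop laws, not Waymo’s actual implementation.

```python
# Illustrative sketch only: a simplified, rule-based guard of the kind a
# planning stack *might* layer on top of perception output. This is not
# Waymo's school-bus handling logic; all names and fields are hypothetical.

from dataclasses import dataclass
from enum import Enum, auto


class ManeuverDecision(Enum):
    PROCEED = auto()
    FULL_STOP = auto()


@dataclass
class DetectedVehicle:
    kind: str                  # e.g. "school_bus", "sedan", "truck"
    is_stopped: bool           # vehicle is stationary
    stop_arm_extended: bool    # mechanical stop-arm deployed (buses only)
    red_lights_flashing: bool  # alternating red warning lights active
    same_roadway: bool         # shares the roadway with the ego vehicle
    divided_median: bool       # physical median separates opposing traffic
    oncoming: bool             # traveling in the opposing direction


def school_bus_guard(detections: list[DetectedVehicle]) -> ManeuverDecision:
    """Return FULL_STOP if any detected school bus requires the ego vehicle
    to stop under typical U.S. state law: a stopped bus signaling with an
    extended stop-arm or flashing red lights on the same roadway, unless the
    bus is oncoming across a divided median.
    """
    for v in detections:
        if v.kind != "school_bus" or not v.is_stopped:
            continue
        signaling = v.stop_arm_extended or v.red_lights_flashing
        if not signaling or not v.same_roadway:
            continue
        # Oncoming traffic separated by a physical median is generally exempt.
        if v.oncoming and v.divided_median:
            continue
        return ManeuverDecision.FULL_STOP
    return ManeuverDecision.PROCEED


if __name__ == "__main__":
    bus = DetectedVehicle(
        kind="school_bus", is_stopped=True, stop_arm_extended=True,
        red_lights_flashing=True, same_roadway=True,
        divided_median=False, oncoming=False,
    )
    print(school_bus_guard([bus]))  # ManeuverDecision.FULL_STOP
```

In practice, the rule itself is the easy part; the harder problem sits upstream, in reliably detecting a stop-arm or flashing lights under occlusion, glare, or unusual bus positioning, which is where a software fix of the kind Waymo describes would typically have to land.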
The recall isn’t like a traditional automotive recall that pulls vehicles off the road — Waymo’s cars remain in service while the company deploys software updates. Still, the announcement stands as a stark reminder that even advanced autonomous systems can misjudge complex real-world driving contexts: school-bus loading zones, pedestrian crossings, and children darting across streets remain among the trickiest scenarios for machine judgment. For critics of robotaxi deployment, the recall confirms what many have feared — that autonomy may not yet be capable of matching, much less exceeding, human judgment in high-stakes environments without robust, context-aware safeguards.
From a broader perspective, the recall and federal scrutiny could slow acceptance of self-driving services. Communities and regulators may demand tighter oversight, stricter testing standards, or even temporary moratoriums on driverless deployment in sensitive zones like school districts. For Waymo, the path forward depends on restoring confidence: hiring more safety monitors, improving sensor logic, or introducing operational constraints during peak school traffic hours might all be on the table. Yet for all the promise of lower accident rates and technological innovation, this episode highlights that even the most sophisticated systems remain vulnerable when real life — with all its unpredictability — gets in the way.

