The National Transportation Safety Board (NTSB) has opened a formal investigation into Alphabet’s autonomous vehicle unit Waymo after multiple reported incidents in which its self-driving robotaxis illegally passed stopped school buses in Austin, Texas. The probe follows earlier scrutiny from the National Highway Traffic Safety Administration (NHTSA) and a voluntary recall of thousands of Waymo vehicles intended to fix the underlying software issue. With reports indicating that violations continued even after software updates, school officials are urging operational restrictions, and federal examination of self-driving safety protocols is intensifying.
Sources:
https://techcrunch.com/2026/01/23/waymo-probed-by-national-transportation-safety-board-over-illegal-school-bus-behavior/
https://www.reuters.com/world/us-safety-board-opens-probe-into-waymo-robotaxis-passing-stopped-school-buses-2026-01-23/
https://www.webpronews.com/autonomous-ambitions-hit-a-stop-sign-ntsb-scrutinizes-waymos-handling-of-school-buses/
Key Takeaways
• Federal authorities have escalated scrutiny of autonomous vehicle safety, with the NTSB investigating Waymo’s compliance with school bus stop laws.
• Waymo previously issued a software recall to address robotaxi behavior, but reports from the Austin Independent School District suggest continued illegal passing incidents post-update.
• The probe adds pressure on an industry already under regulatory review, as local officials call for operational limits when children are present.
In-Depth
In an unfolding transportation safety story that puts autonomous vehicle oversight at center stage, the United States National Transportation Safety Board has opened an official investigation into how Waymo’s robotaxis handle encounters with stopped school buses. According to reporting from multiple outlets, federal safety investigators are examining numerous incidents in which driverless vehicles operated by Waymo allegedly failed to stop for buses with their stop arms extended and lights flashing in Austin, Texas. The NTSB inquiry comes on the heels of a separate probe by the National Highway Traffic Safety Administration, highlighting a deepening concern among regulators about how self-driving systems adhere to fundamental traffic laws designed to protect children.
Waymo, a division of Alphabet and a high-profile player in the autonomous vehicle space, had previously acknowledged issues with its software. In late 2025, the company initiated a voluntary recall of more than 3,000 vehicles to address problems with how its robotaxis respond to stopped school buses. Despite the company’s efforts to push out software updates, local officials in the Austin Independent School District reported that violations persisted, with nearly two dozen incidents recorded since the start of the school year. Those reports have triggered both local calls for restricting Waymo operations during peak school hours and new federal scrutiny.
While Waymo has stated that its data shows improvements and an overall safety advantage over human drivers, the continuation of these violations raises questions about the reliability of autonomous systems in complex real-world traffic environments. Federal investigators are expected to spend months collecting evidence, interviewing stakeholders, and analyzing system behavior to determine whether the robotaxis’ programming sufficiently prioritizes established road safety rules, particularly those designed to protect children. In the broader context, this investigation underscores the challenges self-driving technology faces as it transitions from testing to full public deployment, and it may influence regulatory frameworks that govern autonomous vehicles nationwide.

