One of the purported benefits of self-driving car technology is that every car can learn from any one vehicle’s mistakes. Here’s how Waymo puts it on its website: “The Waymo Driver learns from the collective experience gathered across our fleet, including previous generations of hardware.”
But in Austin, Waymo vehicles struggled for months to learn how to stop for school buses as drivers picked up and dropped off children. An official from the Austin Independent School District (AISD) alleged that the vehicles had, in at least 19 cases, “illegally and dangerously” passed district school buses while their red lights were flashing and their stop arms were extended instead of coming to a complete stop, as required by law.
In early December, Waymo even issued a federal recall related to the incidents, acknowledging at least 12 of them to federal regulators at the National Highway Traffic Safety Administration (NHTSA), which oversees highway safety. According to federal documents, engineers at the autonomous vehicle company had “developed software modifications to address this behavior” weeks before.
But even after the recall, school bus passing incidents continued, according to school officials and a report from the National Transportation Safety Board (NTSB), an independent federal safety oversight agency that is also investigating the situation.
Now, emails and text messages between school officials and Waymo representatives, obtained by WIRED through a public records request, show efforts by the Austin Independent School District and Waymo to try to resolve the issue. AISD even held a half-day “data collection” event in a school parking lot in mid-December, documents show, with district employees rounding up school buses from across the fleet so the self-driving car company could collect information on the vehicles and their flashing stop signals.
Yet by mid-January, more than a month later, the school district reported that at least four more school bus passing incidents had occurred in Austin. “The data we collected from the beginning of the school year to the end of the semester shows that about 98 percent of people who receive a violation do not receive another violation,” a school district police official told the local NBC affiliate that month. “That tells us that the person is learning, but it doesn’t appear that the Waymo automated driving system is learning through its software updates, recall, etc., because we still have violations.”
The situation raises questions about curious blind spots in self-driving technologies and the industry’s ability to compensate for them even after they are spotted.
Autonomous driving software has long struggled to recognize flashing emergency lights and traffic safety devices with long, thin arms, including barriers and stop arms, says Missy Cummings, who researches autonomous vehicles at George Mason University and served as a safety adviser to NHTSA during the Biden administration. “If [the company] didn’t solve this problem a few years ago, the more they drive, the more of a problem it’s going to be,” she says. “That’s exactly what’s happening here.”
Waymo did not respond to WIRED’s requests for comment. A spokesperson for the Austin Independent School District referred WIRED to the NTSB while the incidents are investigated. An NTSB spokesperson declined to answer questions from WIRED while its investigation continues.
Illegal passing
By the middle of winter 2025, AISD officials were frustrated. In one of the 19 incidents alleged by an attorney for the district in a letter later released by federal highway safety regulators, a Waymo passed a school bus letting children off “just moments after a student crossed in front of the vehicle, and while the student was still on the road.”
“Alarmingly,” the attorney wrote, five of the alleged incidents occurred after Waymo assured the district that it had updated its software to resolve the problem. Federal regulators at NHTSA had already launched an investigation into the behavior. “Austin ISD is evaluating all potential legal remedies available to it and intends to take all necessary measures to protect the safety of its students, if necessary,” the attorney warned.
A few days later, on the same day in early December that news of Waymo’s recall made headlines, Rob Patrick, Waymo’s emergency response and outreach manager, called the deputy chief of the Austin ISD Police Department, according to records obtained by WIRED. Patrick and Waymo had a proposition, according to an email Deputy Chief Travis Pickford sent to his district counterparts: The company wanted to collect data on the district’s school buses, presumably so its cars could more easily identify when they shouldn’t be passed.
“Specifically, they wanted to focus their data collection on the orange and red light signals on each of our school buses and the ability of their cars to see them at different distances,” Pickford wrote in the email.
The following Monday, three days later, school transportation officials agreed to gather at least seven buses, representing every model in their fleet of more than 550 vehicles, at a district sports complex. Officials even sent Waymo the configuration specifications for the bus lights. By mid-afternoon the following Wednesday, the data collection event was over, and Waymo told school officials the company had what it needed.
But Waymo’s school bus passing incidents haven’t stopped.
A preliminary report from the NTSB published in early March found that a subsequent incident on January 12 occurred after a Waymo remote assistant, a Michigan-based human tasked with “helping” the software when it runs into difficulties on the road, wrongly told the robotaxi that the school bus in front of it did not have its signals active. Six vehicles passed the school bus while it was stopped, the agency said. The investigation is still ongoing.
Just days after these January incidents, a Waymo struck a child who was crossing in front of the vehicle near a school in Santa Monica, California. The child was reportedly unharmed, and Waymo later said its modeling showed that a human driver would have hit the child at a higher speed than its car did.
Robot learning
Cummings, the autonomous vehicle researcher, says she’s not surprised that Waymo’s data collection event didn’t solve all of its school bus problems. It can take weeks or even months for software developers to use new data to train a self-driving car’s AI-driven systems. Additionally, data collected in the parking lot of a sports facility, outside the context in which Waymo’s cars typically drive, might not be of much help. “Data generated in a parking lot won’t be enough,” she says.
Waymo, Cummings says, “should not be allowed to operate around schools during school pickup and drop-off until they have resolved this issue and can demonstrate it with specific testing.”
Philip Koopman, an autonomous vehicle software and safety researcher at Carnegie Mellon University, says school bus incidents could be particularly challenging for Waymo software because stop signs mean slightly different things in different contexts. There are stop signs at intersections, stop signs held in the hands of construction workers directing traffic, and stop signs attached to school buses. Waymo is trying to teach its software “something very subtle,” he says.
The entire episode highlights the complex challenges of operating a software-driven robot on sometimes unpredictable human roads. Teaching software to drive safely 99 percent of the time is the easy part. But “the last percent is really difficult, because we’re trying to teach an exception,” Koopman says.
“Waymo is struggling to teach its machine learning the lesson Waymo wants it to learn,” Koopman says. “It’s no surprise. This was always going to be a problem.”
