Waymo exec admits harsh truth about company's safety record
Waymo had a particularly rocky end to 2025. And it doesn't help that a recent comment from a Waymo executive, who aimed to ease people's minds, may have fallen a little short.
The latest issue began to surface in November, when the Austin Independent School District noticed a disturbing trend: Waymo vehicles were not stopping for school buses that had their crossing guard and stop sign deployed.
Waymo quick facts:
- Waymo One available 24/7 to customers in Los Angeles, Phoenix, and the San Francisco Bay Area, as of July 2025
- Founded in 2009
- Passed first U.S. state self-driving test in Las Vegas, Nevada, in 2012 (Source: IEEE Spectrum)
- Spun out of Google as a separate Alphabet subsidiary in 2016
Waymo robotaxis had been illegally blowing past city school buses an average of 1.5 times per week during the school year. The Austin ISD initially attempted to address the matter privately, sending a letter to Waymo regarding the violations.
The company assured school officials that a software patch had fixed the issue, but there were five more violations in just the two weeks after Waymo claimed the problem was resolved.
On Dec. 1, after Waymo received its 20th citation from Austin ISD for the current school year, Austin ISD decided to release the video of the previous infractions to the public.
By Dec. 5, the company was forced to issue a voluntary recall to fix the issue.
“Holding the highest safety standards means recognizing when our behavior should be better,” Mauricio Peña, chief safety officer for Waymo, said at the time, while also praising the company's safety record.
But the company's safety record can be pretty misleading, according to some experts.
Waymo safety record isn't what it seems
While the Austin ISD incident was the most high-profile, it wasn't Waymo's only big mistake in December.
The week before Christmas, Waymo was forced to suspend service in San Francisco after its vehicles apparently failed to follow the four-way-stop rule that applies at intersections with inoperable traffic lights.
Related: Waymo is back online in San Francisco, but may struggle after failure
A massive blackout in the city of more than 800,000 residents left Waymo vehicles very confused.
The vehicles were filmed stuck at numerous intersections, unsure of how to navigate the situation, causing even more turmoil on the roads as drivers slowly inched past electricity-less city blocks.
“The sheer scale of the outage led to instances where vehicles remained stationary longer than usual to confirm the state of the affected intersections. This contributed to traffic friction during the height of the congestion,” Waymo said in a statement as it temporarily shut down operations in the city.
While autonomous vehicle advocates call people hesitant to use the vehicles "Luddites," high-profile mishaps like these could be contributing to the lack of public enthusiasm for the technology.
According to AAA, just 13% of U.S. drivers would trust riding in self-driving vehicles, which is actually an increase from the same poll in 2024. Still, six in 10 drivers said they were afraid to ride in a self-driving vehicle.
A San Francisco man went viral in November after filming his first ride with Waymo. Seconds after starting his trip, the autonomous vehicle attempted to pull away from the curb, nearly hitting a vehicle that was about to pass on its left.
The man in the vehicle screamed as the car behind him honked furiously. According to the news report of the incident, the man said he would never take a Waymo ride again.
Waymo’s safety data show that its vehicles are significantly safer than human drivers, but the closer you look at the data, the less convincing they become.
“In like 95% of situations where a disengagement or accident happens with autonomous vehicles, it’s a very regular, routine situation for humans,” Henry Liu, professor of engineering at the University of Michigan, said recently. “These are not challenging situations whatsoever.”
"We have seen many reports from autonomous vehicle developers, and it looks like the numbers are very good and promising,” Liu said. “But I haven’t seen any unbiased, transparent analysis on autonomous vehicle safety. We don’t have the raw data."
Even the data from Waymo are suspect, according to Liu.
Waymo vehicles primarily drive on urban streets with a speed limit of 35 miles per hour or less. "It's not really fair to compare that with human driving," according to Liu.
Waymo admits that fatal crash mitigation data are incomplete
After consistently declining for 30 years, roadway fatalities in the U.S. have risen over the past decade.
Fatalities jumped to nearly 35,000 in 2015, an 8% increase from the year prior, and rose another 6.5% the following year, according to U.S. Transportation Department data. Fatalities peaked at 43,230 in 2021, a 10.8% increase over the previous year.
Related: Waymo customer swears off autonomous driving after close call
Waymo, the most widely used autonomous ride-hailing service, has the most passenger miles under its belt, so it draws more scrutiny than rivals such as Tesla's Robotaxi and Zoox.
“Waymo is already improving road safety in the cities where we operate, achieving more than a tenfold reduction in serious injury or worse crashes,” Trent Victor, Waymo's director of safety research and best practices, recently told Bloomberg.
Waymo has driven approximately 127 million miles across its fleet and has been involved in at least two crashes with fatalities, though its vehicles were not found directly responsible for either of them.
The problem is that this actually represents a higher death-per-mile rate than that of average American drivers: two fatal crashes over roughly 127 million miles works out to about one per 63.5 million miles, while average American drivers travel about 123 million miles for every fatality.
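The comparison above can be checked with back-of-envelope arithmetic. The sketch below uses only the figures quoted in this article (127 million fleet miles, two fatal crashes, and the 123-million-miles-per-fatality benchmark for human drivers); it is an illustration of the article's math, not independent safety data.

```python
# Back-of-envelope check of the fatality-rate comparison, using the
# figures reported in this article (not independent data).
WAYMO_MILES = 127_000_000                # approximate fleet miles driven
WAYMO_FATAL_CRASHES = 2                  # crashes with fatalities
HUMAN_MILES_PER_FATALITY = 123_000_000   # average U.S. driver benchmark

waymo_miles_per_fatality = WAYMO_MILES / WAYMO_FATAL_CRASHES
print(f"Waymo: one fatal crash per {waymo_miles_per_fatality / 1e6:.1f} million miles")

# Fewer miles per fatality means a higher death-per-mile rate.
print(waymo_miles_per_fatality < HUMAN_MILES_PER_FATALITY)
```

As the article's next paragraph notes, two events over 127 million miles is far too small a sample for a statistically meaningful comparison, which is why Waymo itself hedges on this point.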
Victor acknowledged that “there is not yet sufficient mileage to make statistical conclusions about fatal crashes alone,” adding that “as we accumulate more mileage, it will become possible to make statistically significant conclusions on other subsets of data, including fatal crashes as its own category.”
Activists question the safety of autonomous driving on city streets
While Waymo operates in major cities across the country and expands its footprint in some of those same cities, some city dwellers are not fond of the idea of robots operating two-ton vehicles with minimal oversight.
Advocacy groups are making their voices heard in New York City, where Waymo recently received permission to conduct tests.
“This was a pilot initiated with very little public input,” Michael Sutherland, a policy researcher with Open Plans, told Gothamist. “From a safety perspective, this is a technology that hasn’t been tested out in incredibly dense cities like New York City.”
Waymo did not return a request for comment.
Waymo says its autonomous vehicles have been involved in 88% fewer crashes with serious injuries than vehicles with human drivers.
However, groups such as Safe Street Rebel say they have documented hundreds of crashes and failures by autonomous vehicles over the years.
Related: Waymo pumps the brakes as dangerous issue comes to light