2 Sources
[1]
Robot vs. human: Who's the better driver?
Zoom in: A Waymo robotaxi incident outside a Santa Monica elementary school suggests an AI brain would react faster than a human -- but it's not that simple.

Catch up quick: The National Highway Traffic Safety Administration is investigating after a Waymo robotaxi last week struck a child who ran across the street from behind a double-parked SUV near an elementary school.
* Santa Monica Police said first responders evaluated the student, with her parent present, and did not report any injuries.

The intrigue: Waymo claims its driverless vehicle behaved as expected, slamming the brakes as soon as it detected the child and slowing from 17 mph to under 6 mph before making contact.
* A "fully attentive human driver" in the same situation would have hit the child at approximately 14 mph, according to Waymo's computer modeling.
* "This significant reduction in impact speed and severity is a demonstration of the material safety benefit of the Waymo Driver," the company wrote in a Jan. 28 blog post.

The big picture: This incident fits into a much larger debate about whether autonomous vehicles can match -- or exceed -- the safety of human drivers.
* People fear self-driving cars, yet nearly 40,000 people are killed each year in traffic accidents involving human drivers.
* The answer to that question is crucial to winning the public's trust as robotaxis spread quickly across America.

Reality check: Reacting quickly isn't the only way to avoid crashes, safety experts tell Axios. Context, judgment and driving experience matter, too.
* Young drivers have quicker reflexes, for example -- but older drivers have much better safety records, notes AV safety expert Philip Koopman, emeritus professor at Carnegie Mellon University.
* A careful, competent human driver would have avoided a panic stop in the first place by adjusting their driving behavior amid the chaos of school drop-off -- or by taking a different route altogether, he argued.
Waymo should share video of the incident to provide more context, including what the child was doing before she emerged from behind the car, adds Missy Cummings, former senior safety advisor at NHTSA and now head of the autonomy and robotics center at George Mason University.
* NHTSA says it plans to investigate whether the Waymo robotaxi exercised appropriate caution in a school zone.

What we're watching: This larger debate is already playing out in the insurance industry, where autonomous vehicles are testing how risk is priced.
* At least one online provider, Lemonade, examined Tesla's data and concluded its Full Self-Driving (FSD) software was so much safer than human driving that it warranted a 50% rate cut.

The bottom line: Are humans holding robots to a higher standard than themselves? So far, the answer is yes.
[2]
It happened: a Waymo robotaxi has struck a child near a school
The National Highway Traffic Safety Administration (NHTSA) has opened an investigation after a Waymo robotaxi hit a child near an elementary school in Santa Monica, causing minor injuries and fueling renewed skepticism over whether autonomous vehicles are ready for prime time.

NHTSA says the child ran into the street from behind a stopped or double-parked SUV and was struck by the Waymo vehicle during normal school drop-off hours. The agency says there were other children, a crossing guard, and several other double-parked vehicles in the vicinity.

The incident underscores concerns about how Waymo prioritizes the safety of young children in its AI models, and it occurred just days after the Austin Independent School District in Texas reported that the robotaxis were failing to stop for school buses and called for Waymo to cease operations entirely during mornings and afternoons on school days -- and if that sounds familiar, that's because you're paying attention. Waymo was already facing scrutiny from NHTSA after one of its self-driving cars was caught on video illegally passing a stopped school bus that was letting children off in Atlanta, Georgia.

In dealing with its own issues with Waymo, Austin Independent School District (ISD) representatives did not mince words:

Put simply, Waymo's software updates are clearly not working as intended nor as quickly as required. We cannot allow Waymo to continue endangering our students while it attempts to implement a fix.
AUSTIN ISD

Given the strong emotions that robotaxis seem to elicit (and, of course, the fact that this incident seems to have been a long time coming), you'd think Waymo would have had a polished and prepared response. Instead, "Waymo shows no empathy," according to Electrive.

"Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle," reads Waymo's official response.
"The Waymo Driver [software] braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made." Waymo further stated that the company immediately notified local police of the incident and left the vehicle at the scene until it received official clearance to continue on its way. "The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene," Waymo said in its statement.

So, yeah -- the optics are bad, and Waymo seems to be blaming the kid for getting hit, which won't win it many fans. From a business perspective, too, Waymo couldn't have picked a worse time to hit a kid (regardless of who was at fault), as the company's self-driving cars are being deployed in rising numbers across US cities, including San Francisco and Miami, as the company races to show the sort of growth it needs to justify continued investment.

For context, GM has shuttered its own self-driving taxi business, Cruise, after investing more than $12 billion (with a "b") in its development, and Tesla seems to be unable to make its loss-leading, camera-only "Robotaxi" solution work without human safety monitors either in the vehicle or in closely following chase cars. The US Senate's Commerce Committee has scheduled a hearing on self-driving car safety for Feb. 4, 2026, which will include Waymo's Chief Safety Officer, Mauricio Peña.

SOURCES: Electrive, Fox 11 Los Angeles, Reuters.
A Waymo robotaxi struck a child who ran into the street near a Santa Monica elementary school, prompting a federal investigation into autonomous vehicle safety. The incident, which resulted in no reported injuries, has intensified debate over whether self-driving technology can match human judgment in complex scenarios like school zones, even as the company claims its AI driver reacted faster than any human could have.
The National Highway Traffic Safety Administration has opened an investigation after a Waymo robotaxi struck a child near an elementary school in Santa Monica last week [1]. The child ran into the street from behind a double-parked SUV during normal school drop-off hours, when other children, a crossing guard, and several double-parked vehicles were present in the area [2]. Santa Monica Police reported that first responders evaluated the student with her parent present and did not report any injuries [1]. The incident has triggered renewed scrutiny of autonomous vehicle safety and whether self-driving technology is ready for widespread deployment in complex urban environments.

Waymo claims its driverless vehicle performed as expected, with the AI driver immediately detecting the child and braking hard, reducing speed from approximately 17 mph to under 6 mph before contact [1][2]. According to the company's computer modeling, a fully attentive human driver in the same situation would have hit the child at approximately 14 mph [1]. Waymo argued in a Jan. 28 blog post that this significant reduction in impact speed demonstrates the material safety benefit of autonomous vehicles over human drivers [1]. The company's response, however, has been criticized for lacking empathy and appearing to blame the child for the incident [2].
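Waymo's argument hinges on impact speed, and the difference between 6 mph and 14 mph is larger than the raw numbers suggest, because crash energy scales with the square of speed (kinetic energy is ½mv²). A back-of-envelope sketch of the reported figures (illustrative only; the calculation is ours, not from either source):

```python
# Compare impact severity via kinetic energy, which scales with v^2.
# Speeds are the figures reported in the coverage: the robotaxi made
# contact at under 6 mph after braking from ~17 mph; Waymo's modeling
# put a fully attentive human driver's impact at ~14 mph.

def kinetic_energy_fraction(impact_mph: float, initial_mph: float) -> float:
    """Fraction of the initial kinetic energy remaining at impact."""
    return (impact_mph / initial_mph) ** 2

waymo_impact = kinetic_energy_fraction(6, 17)   # ~0.12 of initial energy
human_impact = kinetic_energy_fraction(14, 17)  # ~0.68 of initial energy

print(f"Robotaxi impact: {waymo_impact:.0%} of initial crash energy")
print(f"Modeled human impact: {human_impact:.0%} of initial crash energy")
```

By this rough measure, the reported 6 mph impact carries only about an eighth of the 17 mph crash energy, versus roughly two-thirds at the modeled 14 mph -- which is the "severity" reduction Waymo's blog post is pointing at, though as the experts quoted here note, it says nothing about whether the situation should have been avoided altogether.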
Source: Axios
While faster reaction times offer advantages, safety experts argue that autonomous vehicle safety requires more than quick reflexes. Philip Koopman, an AV safety expert and emeritus professor at Carnegie Mellon University, notes that young drivers have quicker reflexes but older drivers have much better safety records [1]. He suggests that a careful, competent human driver would have avoided a panic stop entirely by adjusting their driving behavior amid the chaos of school drop-off or taking a different route altogether [1]. Missy Cummings, former senior safety advisor at the National Highway Traffic Safety Administration and now head of the autonomy and robotics center at George Mason University, has called for Waymo to share video of the incident to provide more context, including what the child was doing before emerging from behind the car [1]. The investigation will examine whether the robotaxi exercised appropriate caution in a school zone [1].

This incident occurred just days after the Austin Independent School District in Texas reported that Waymo robotaxis were failing to stop for school buses and called for the company to cease operations during morning and afternoon school hours [2]. The district stated bluntly that software updates are clearly not working as intended nor as quickly as required, and that it cannot allow Waymo to continue endangering students while the company attempts to implement fixes [2]. Waymo was already facing scrutiny after one of its self-driving cars was caught on video illegally passing a stopped school bus letting children off in Atlanta, Georgia [2]. These repeated incidents underscore concerns about how Waymo prioritizes child safety in its AI models [2].
Source: Electrek
Related Stories
The incident fits into a larger debate about whether autonomous vehicles can match or exceed the safety of human drivers, a question crucial to winning public trust as robotaxis spread across America [1]. While people fear self-driving cars, nearly 40,000 people are killed each year in traffic accidents involving human drivers [1]. The timing is particularly challenging for Waymo as the company deploys vehicles in rising numbers across US cities including San Francisco and Miami, racing to demonstrate growth that justifies continued investment [2]. GM has already shuttered its own self-driving taxi business, Cruise, after investing more than $12 billion, and Tesla appears unable to make its camera-only robotaxi solution work without human safety monitors [2].

The debate is already playing out in the insurance industry, where autonomous vehicles are testing how risk is priced [1]. At least one online provider, Lemonade, examined Tesla's data and concluded its Full Self-Driving software was safer than human driving, warranting a 50% rate cut [1]. The US Senate's Commerce Committee has scheduled a hearing on self-driving car safety for February 4, 2026, which will include testimony from Waymo's Chief Safety Officer, Mauricio Peña [2]. As the investigation unfolds, the fundamental question remains: Are humans holding robots to a higher standard than themselves? So far, the answer is yes [1].

Summarized by Navi