As Waymo prepares to launch its robotaxis in 20 cities this year, we're likely going to hear more stories of Waymo passengers who become the unwitting victims of anti-AI rage.

At this point in SF, we've all seen stories, or witnessed something live, in which a drunk or just plain angry person lashes out at a Waymo vehicle, getting in its way or otherwise taunting it, taking advantage of the sensors and software that instruct the car not to budge an inch if a person is too close by. In some cases, these incidents turn violent, with the individual jumping on or vandalizing the vehicle.

And, of course, we all remember how Waymo vehicles were targeted for tagging and arson during the anti-ICE protests in LA last June.

The New York Times today delves into the problem of what to do when you're trapped in the backseat of a Waymo that comes under attack on the street, and how the company tends to handle these situations. SF resident Doug Fulop describes a scary situation in January that went on for six minutes in which an unhinged man punched at the windows of a Waymo he was in with two other passengers, screaming at them for "giving money to a robot."

Fulop says he and his friends were trapped and stayed frozen inside while they called Waymo Support for help, but they were told to just stay in the vehicle, and that there was no override system available: no one could hop behind the wheel and take control, for instance, and the company wasn't going to take remote control of the car.

Ultimately, Fulop and the other two passengers waited and watched while the suspect began attracting a crowd of onlookers, some of whom were apparently cheering him on. At one point, when he got distracted from blocking or hitting the vehicle, the car sensed that he had stepped far enough away and began driving off. Police only arrived on the scene later, and confirmed the details of Fulop's account.

Fulop tells the Times that he stopped using Waymo at night after this happened, and he adds, "As passengers, we deserve more safety than that if someone is trying to attack us. This can’t be the policy to be trapped there."

But this raises a novel problem — should the autonomous car have recognized the volatile situation and tried to maneuver away from the attacker, even if it meant potentially striking him or a pedestrian? We know Waymo's software has gotten a bit more aggressive and human-like than it was when the cars originally hit the road in SF — which was over four years ago now, if you can believe it — but the cars remain very cautious around crosswalks in particular if pedestrians are anywhere nearby.

Another rider, Anders Sorman-Nilsson, a tech writer and speaker, tells the Times about an incident he experienced in Los Angeles in which five individuals on e-bikes surrounded the Waymo he was in and tried to get its doors open. They banged on the windows and demanded he open up, which he did not. Interestingly, Sorman-Nilsson says that he felt safer than if there had been a human taxi driver present, both because the car's exterior cameras were recording everything, and because a human driver might have panicked and tried to make him give up his wallet or phone to make the suspects go away.

As Waymos become more prevalent both here and elsewhere, sketchy incidents like these are likely to only multiply.

And, if lots of people start losing their jobs en masse to AI, one can easily imagine these AI cars becoming targets for a whole lot of rage.

Waymo already offers robotaxi services in SF and San Jose, as well as Phoenix, Miami, Austin, and Atlanta — the latter two are through partnerships with Uber. The autonomous cars are testing or in a waitlist phase in 16 other cities, with plans to launch in 20 cities by the end of 2026. Upcoming service has been announced in at least a half dozen other cities as well.

Previously: Waymo Safety Concerns During Emergencies Being Discussed In SF Hearing

Photo by Sam Szuchan