There’s no evidence that the self-driving cars were at fault in these accidents, but there is a very clear pattern that when people realize they’ve been in an accident with a car that has no driver, they just drive off.
The self-driving cars of General Motors’ Cruise division were the first to start operating as driverless taxi cabs in February, but in the tech industry, being first can sometimes come with unexpected burdens.
There was of course the very funny April "Ain't nobody in it!" incident when an SFPD officer tried to pull over a Cruise driverless vehicle that did not have a human driver onboard. But there was also the more ominous late June episode where eight Cruise vehicles went into brick mode and just stopped in the middle of Gough Street blocking traffic “for a couple of hours” — and we would later learn that 60 Cruise cars got disabled that night over a 90-minute period because of a server snafu.
Now we see another unanticipated problem, as The Examiner reports that driverless Cruise cars have been involved in nine hit-and-run accidents this year. This is ostensibly because the drivers of the other vehicles either don’t know how to stop and exchange information when there is no other driver to exchange it with, or they see the car is driverless and figure they can probably get away with it, because what is the robot going to do?
“This year, there have been nine ‘hit and run’ collisions involving Cruise driverless autonomous vehicles in San Francisco, according to reports filed with the DMV,” the Examiner reports. “In fact, almost every accident involving a driverless Cruise has resulted in the human driver leaving the scene.”
We will say again that there is no evidence that any of these accidents were the Cruise vehicle’s fault. In fact, there is plenty of evidence that, in most cases, the other driver was at fault. The Examiner pored over the DMV accident reports and noted that “Cruise’s collision reports from this year contain instances of human-driven cars blowing through stop signs, making left turns from the right lane and backing into stationary cars.”
But the pattern of hit-and-runs suggests that, given the way these autonomous vehicles currently operate, confused drivers have no idea how to respond properly after an accident. This was not a question on my driver’s license test! Leaving the scene is a potential felony, but there has been no real information campaign on what to do if you hit a driverless car.
Well, the Examiner asked the SFPD. “Drivers involved in collisions with autonomous vehicles should stay at the scene, call 911 and wait for police to arrive,” a spokesperson for the SFPD told The Examiner. But come on, this is the SFPD. How many hours do they expect you to wait there before they show up? Are you allowed to leave if you have a medical emergency, or some other extenuating circumstance? These are issues that are still unresolved by regulators and the courts.
And then there’s the issue that the cars themselves are still a little buggy. SFMTA director Jeffrey Tumlin (who is clearly irked that he has no regulatory authority over these things, and that’s a fair complaint) spoke at a recent SFMTA meeting about an accident involving a Cruise car that stopped in the street. That incident “appears to be the kind of crash that autonomous driving technology is supposed to be designed to avoid,” Tumlin said.