Uber's San Francisco self-driving car launch had a rocky first day. First, the California DMV told the company that, despite its protestations to the contrary, it still needed to acquire permits to operate the Volvo SUVs on city streets. Shortly thereafter, video emerged depicting an Uber self-driving car running a red light on 3rd Street — through a crosswalk with a pedestrian in it. And then, to cap it all off, state regulators told the company that it had to immediately cease picking up passengers with its autonomous vehicles. Now, with two more reported instances surfacing of Uber self-driving cars running red lights, the $51 billion company has decided the best path forward is to blame its human drivers.
In addition to the much-publicized video from yesterday morning, embedded below, former Business Times reporter Annie Gaus tweeted that she saw an autonomous Uber run a red light on Van Ness Avenue. In an ironic twist, the Uber nearly struck her Lyft. Gaus also posted a photo of the Uber sitting in the middle of the intersection while cross traffic clearly had the green light.
Just passed a 'self-driving' Uber that lurched into the intersection on Van Ness, on a red, nearly hitting my Lyft.
— Annie Gaus (@AnnieGaus) December 14, 2016
“The Uber car sort of jutted out into the intersection,” she recounted to the Guardian. “It was close enough that we were both kind of like, ‘Woah.’ It’s close enough that you kind of react and are sort of rattled.”
“I don’t think anybody has a good understanding of how this works in a city context,” she added.
(Not enough time to get a good shot, but...whoops!) pic.twitter.com/XK49nMF2Q4
— Annie Gaus (@AnnieGaus) December 14, 2016
Suggesting that this was more than first-day jitters, KRON 4 got its hands on a set of photos that the channel says show an autonomous Uber driving through a red light on Harrison at 4th Street. The pictures were taken on Sunday morning, which means the car was likely being used for testing or mapping purposes and did not carry a paying passenger. Still, it would suggest that the software piloting the autonomous vehicles had problems as recently as three days before the much-publicized launch of the autonomous ride-hail service. That is, unless these incidents are all the result of human error, a.k.a. Uber drivers.
“These incidents were due to human error,” an Uber spokesperson told the Guardian about both the Van Ness incident and the 3rd Street incident. “This is why we believe so much in making the roads safer by building self-driving Ubers. The drivers involved have been suspended while we continue to investigate.”
Isn't that neat? It's the humans, not the unpermitted software, who are at fault, according to Uber. Unfortunately, that argument likely won't sway the DMV.
According to the Business Times, a DMV spokesperson told the tech giant that the department is considering legal action. “It is essential that Uber takes appropriate measures to ensure safety of the public,” the DMV wrote to Uber. “If Uber does not confirm immediately that it will stop its launch and seek a testing permit, DMV will initiate legal action.”
While next steps for both Uber and the DMV remain unclear, one pressing question has finally been answered: We now know what it takes for Uber to admit its drivers are actually fallible.
Previously: State Regulators Tell Uber It Must Immediately Stop Self-Driving Car Program
Uber Says 'FU' To DMV, Rolls Out Self-Driving Cars Without Approval