We are now learning that a Thanksgiving Day multi-car crash that injured nine people is being blamed on a Tesla in “full self-driving” mode, as the Tesla reportedly came to an abrupt stop after a lane change, causing a chain-reaction pileup in the Yerba Buena tunnel.

You might recall that on Thanksgiving Day this year shortly after noon, KGO reported on a massive, multi-car accident on the eastbound lanes of the Bay Bridge, inside the Yerba Buena tunnel. (KGO has a video of the aftermath, seen below). KRON4 reported at the time that “Sixteen people, eight adults and eight juveniles, were treated at the scene for very minor injuries,” and that “Two of the juveniles were taken to the hospital.”

A mere 13 hours before that incident, Tesla co-founder and CEO Elon Musk tweeted “Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen,” and “Congrats to Tesla Autopilot/AI team on achieving a major milestone!” Hmmm, could these two occurrences be somehow related?

It appears these two occurrences may indeed be related. CNN reports that the car that allegedly caused the crash was a Tesla in “full self-driving” mode, according to the California Highway Patrol crash report they obtained. A similar report on KPIX states that “A driver told authorities that their Tesla's ‘full-self-driving’ software braked unexpectedly on Thanksgiving, triggering an eight-car injury accident on the Bay Bridge.”

KTVU adds that the Tesla driver “told police he was in Full-Self Driving mode, which had malfunctioned.” That reporting echoes Reuters, which obtained a police report saying that “The driver of a 2021 Tesla Model S involved in an eight-vehicle crash last month on San Francisco's Bay Bridge told police he was in Full-Self Driving (FSD) mode which had malfunctioned.”

CNN summarizes the report they obtained. “The report states that the Tesla Model S was traveling at about 55 mph and shifted into the far left-hand lane, but then braked abruptly, slowing the car to about 20 mph,” according to CNN. “That led to a chain reaction that ultimately caused eight vehicles to crash, all of which had been traveling at typical highway speeds.”

We should note that none of this is proof that a self-driving Tesla caused the accident. The Tesla driver merely claims the car was in “full self-driving” mode. The claim has not been confirmed. “The Tesla driver told police the FSD malfunctioned but police were unable to determine if the software was in operation or if his statement was accurate,” as Reuters explains. “The police report said the vehicle made an unsafe lane change and was slowing to a stop, which led to another vehicle hitting the Tesla.”

Of course, Tesla could clear all this up right now by releasing information on whether the car was really in “full self-driving” mode, as they certainly have this data stored. The company has not responded to requests for comment from numerous media outlets, though, that said, Elon Musk’s companies pretty much never respond to requests for comment.

No one died in this incident, and none of the injuries appears to have been serious. But that’s just good luck; this thing could have really gone south. And Tesla may be playing a dangerous game with the claim of “full self-driving,” which is a marketing claim, not a technical description. And we know that Tesla drivers are sometimes the kind of people who are more partial to techie marketing claims than concern for their fellow motorists.

Last year on CNBC’s Squawk Box, NTSB chairperson Jennifer Homendy said, “It’s clear that if you’re marketing something as ‘full self-driving’ and it is not full self-driving, and people are misusing the vehicles and the technology, then you have a design flaw and you have to prevent that misuse. And part of that is how you talk about your technology.”

“It is not full self-driving technology,” she added. “It’s misleading.”

Related: Tesla Driver Who Was Behind Wheel On Autopilot When Car Killed Two People Charged With Manslaughter [SFist]

Image: Dogpatch via Reddit