A reporter’s public records request has produced video of the Thanksgiving Day multi-car crash that the driver claims was caused by his Tesla in “full self-driving” mode, and hmmm, the reporter who obtained the video is suddenly shadowbanned on Elon Musk’s Twitter.
An eight-vehicle Bay Bridge crash on Thanksgiving Day took on a whole new look when we learned in late December that the driver whose Tesla caused the pileup claimed his Tesla was in “full self-driving” mode. And it may be no coincidence that the crash, which injured nine people including a 2-year-old, came just 13 hours after Tesla CEO Elon Musk tweeted "Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen, assuming you have bought this option."
We don’t have proof that the Tesla was in full self-driving mode when it caused the crash. But we do now have proof that the Tesla in question did cause the chain reaction pileup, as The Intercept has obtained surveillance video of the accident (seen below), as well as the California Highway Patrol accident report of the incident. And it does appear to the naked eye to be an example of the “phantom braking” phenomenon that has dogged Teslas for the last year or so.
I obtained surveillance footage of the self-driving Tesla that abruptly stopped on the Bay Bridge, resulting in an eight-vehicle crash that injured 9 people including a 2 yr old child just hours after Musk announced the self-driving feature.— Ken Klippenstein (@kenklippenstein) January 10, 2023
Full story: https://t.co/LaEvX9TzxW pic.twitter.com/i75jSh2UpN
The Intercept summarizes the above incident saying “The driver told police that he had been using Tesla’s new ‘Full Self-Driving’ feature, the report notes, before the Tesla’s ‘left signal activated’ and its ‘brakes activated,’ and it moved into the left lane, ‘slowing to a stop directly in [the second vehicle’s] path of travel.’”
But more tellingly, The Intercept has also obtained and published the full CHP Traffic Crash Report. We’ll quote its most revealing segment below, but note that it is written in CHP jargon, where “V-1” refers to the Tesla itself and “P-1” (Person 1) refers to the Tesla’s driver.
“P-1 stated V-1 was in Full Self-Driving mode at the time of the crash, I am unable to verify if V-1’s Full Self-Driving Capability was active at the time of the crash,” a CHP officer writes of the November 24 incident. “On 11/27/2022 at approximately 1330 hours, I contacted P-1 via telephone to clarify his statement. He related to me in essence the following: He was driving V-1 on I-80 eastbound in Full Self-Driving Mode Beta Version traveling at approximately 55 miles per hour. Prior to the Yerba Buena Island tunnel entrance, V-1 moved from the #1 lane to the #2 lane. When V-1 was in the tunnel, V-1 moved from the #2 lane into the #1 lane and started slowing down unaccountably. When V-1 was about 20 miles per hour, he felt a rear impact.”
elon gonna start calling public records requests "doxxing" any minute now https://t.co/k12Z1MhF2h— Alex Shultz (@AlexShultz) January 10, 2023
This matches the Tesla “phantom braking” phenomenon, which has been on the rise; as the Washington Post reported in February 2022, “Owner reports of phantom braking to NHTSA rose to 107 complaints in the past three months, compared with only 34 in the preceding 22 months.”
As we mentioned above, a two-year-old was injured in this multi-car crash. And The Intercept reporter Ken Klippenstein’s public records request also turned up an eerie photo from the scene: a stroller sitting next to all of the potentially deadly wreckage.
Regarding The Intercept reporter Ken Klippenstein: I can personally vouch that the above phenomenon is correctly described. As of today, when you type his name into the search bar on Elon Musk-owned Twitter, Klippenstein’s account will not display in the results! It sure looks like an obvious shadowban of Klippenstein, but I am not exactly holding my breath for some Matt Taibbi-Bari Weiss “Twitter Files” investigation into it.
If Tesla’s vehicles truly have “full self-driving capabilities,” then how do you explain this? Our public roads shouldn’t be a test track for unproven technology. @NHTSAgov must reign in Tesla’s hazardous advanced driving systems. https://t.co/W5ObuSmITd— Jan Schakowsky (@janschakowsky) January 10, 2023
KGO tried to talk to the driver (described as “a 76-year-old lawyer from San Francisco”). The driver did not respond, but Klippenstein did talk to the station.
"With any technology, there's a period of having to test it out and work out the kinks," Klippenstein told KGO. "And if we're all guinea pigs in the system, if they're testing this out, let's say on roads that we're all driving on and not in controlled settings, if something goes wrong, we're going to experience that in real time."
The National Highway Traffic Safety Administration (NHTSA) is still investigating the crash, and all of the claims described above are, for now, just claims. But KRON4 points out a very scary pattern: “The NHTSA has investigated a total of 35 crashes that potentially involved Tesla’s autopilot driving system. A total of 19 people died in the crashes.”
Image: Dogpatch via Reddit