In what's believed to be a first-of-its-kind prosecution, a Tesla owner in Los Angeles has been charged with two counts of manslaughter after his 2016 Tesla Model S blew a red light while on Autopilot and plowed into another car, killing the two people inside.

There have been a number of cases around the country of semi-autonomous vehicles failing to live up to the safety standards their makers claim for them. And just this week, a rival self-driving software company took out a full-page ad in the New York Times warning that Tesla's software remains half-baked, saying, "We did not sign up our families to be crash test dummies for thousands of Tesla cars."

Now a case in Los Angeles County, while not technically against Tesla, is likely to put a spotlight on Tesla's practice of testing its software with actual drivers on actual roads.

As the LA Times reports, the tragic incident happened just after midnight on December 29, 2019, in Gardena, California, in LA's South Bay. Two people, Gilberto Alcazar Lopez and his passenger, Maria Guadalupe Nieves-Lopez, came to an intersection in a Honda Civic and proceeded through it when their light turned green. Just then, a Tesla on Autopilot exited a nearby freeway, blew a red light, and smashed into them, killing them instantly.

The person behind the wheel of the Model S was 27-year-old Kevin George Aziz Riad, and this week, prosecutors filed manslaughter charges against him.

A test driver for Uber's self-driving taxi program in Tempe, Arizona is awaiting trial for negligent homicide over the March 2018 death of a pedestrian; the woman behind the wheel, 46-year-old Rafaela Vasquez, was reportedly watching TV on her phone at the time of the crash. But with that trial yet to take place, experts say this LA County case is the first felony case of its kind: one charging a Tesla owner for deaths that occurred while the car was operating in Autopilot mode in a non-testing environment. (Negligent homicide is a Class 4 felony in Arizona.)

Because Tesla's Autopilot is considered "Level 2" vehicle autonomy, meaning that while the software controls steering and acceleration, the driver can take over at any time, legal scholars have been clear that drivers remain wholly responsible in these scenarios.

"It’s a wake-up call for drivers,” says Alain Kornhauser, director of the self-driving car program at Princeton University, speaking to the LAT. "It certainly makes us, all of a sudden, not become so complacent in the use of these things that we forget about the fact that we’re the ones that are responsible — not only for our own safety but for the safety of others."

And, Kornhauser adds, if Riad ends up being found guilty, “it’s going to send shivers up and down everybody’s spine who has one of these vehicles and realizes, ‘Hey, I’m the one that’s responsible.’”

The California Department of Motor Vehicles is reportedly revisiting its stance on Tesla's self-driving software. While companies like Waymo and Uber currently have to report all their crash data to the DMV, Tesla has been exempt because its software is classified as a Level 2 driver-assistance system, which doesn't fall under the state's autonomous-vehicle testing rules.

Next up in terms of news stories will likely be incidents involving Tesla's newer "Full Self-Driving" mode, which is in beta right now and looks very much not ready for prime time. A CNN review of the software in November showed a car making all kinds of dangerous maneuvers on the streets of Brooklyn, with the driver needing to take over multiple times, including once when the car tried to swerve in front of a UPS truck to avoid hitting a bicyclist.

Photo: Bram Van Oost