A class action lawsuit has been filed against Elon Musk’s xAI by three teenagers who say the company deliberately allowed its Grok software to power third-party apps that created nonconsensual nude and sexually explicit images of them and at least 18 others while they were minors.

According to the lawsuit, nonconsensual nudes and sexually explicit photos and videos were created from the yearbook photos and social media images of three Tennessee teenagers and at least 18 others, some of whom are still minors. As Mother Jones reports, the images were distributed through Discord and Telegram, where they were traded among users and used to obtain additional child sexual abuse material. NPR reports that the lawsuit holds xAI responsible for allowing the images to be created and distributed.

“One of the plaintiffs received a link to a Discord server that had similar images of at least 18 other minor females,” per Mother Jones, “many of whom Jane Doe 1 recognized from her school.”

The lawsuit alleges the person who created the images had a “close and friendly relationship” with one of the plaintiffs.

The plaintiffs said the images appeared highly realistic and were not labeled as AI-generated, with one video depicting a plaintiff “undressing until she was entirely nude,” per NPR.

The perpetrator did not use Grok directly or post the images on X, but instead used an unnamed third-party app powered by Grok, the lawsuit alleges. It claims xAI licensed its technology to outside developers, including some operating overseas, in a way that allowed the company to distance itself from potential liability.

The lawsuit notes that the perpetrator was later arrested.

As Mother Jones reports, the lawsuit follows months of backlash over Grok’s image-generation features, including its “Imagine” tool, which at one point allowed users to generate nonconsensual nude images. The complaint argues that a system capable of producing sexualized images of adults cannot reliably be restricted from generating child sexual abuse material.

The lawsuit also references a February article from the Washington Post reporting that the chatbot’s permissiveness around explicit content was used to drive user growth.

“xAI—and its founder Elon Musk,” the lawsuit reads, via NPR, “saw a business opportunity: an opportunity to profit off the sexual predation of real people, including children.”

Ashley St. Clair, a conservative content creator and the mother of one of Elon Musk’s children, has said that Grok generated nonconsensual sexual images of her, including some dating back to when she was a minor, per Mother Jones.

"Like a rag doll brought to life through the dark arts, this [AI-generated] child can be manipulated into any pose, however sick, however fetishized, however unlawful. To the viewer, the resulting video appears entirely real," says the lawsuit. "For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse."

NPR notes that the three teens say they’re seeking to change how AI companies handle sexually explicit content.

“We want to make it [a business decision] that does not make any business sense anymore,” one of them said.

As SFist reported in January, California Attorney General Rob Bonta announced his office was launching an investigation into the mounting allegations against Grok. Thirty-five state attorneys general also wrote a letter demanding that xAI add protections against creating child sexual abuse material and nonconsensual deepfakes, and the chatbot is under investigation in Europe as well.

Meanwhile, Musk has announced mass layoffs at xAI, saying that the company needs to be "rebuilt from the foundations."

Previously: California Now Investigating xAI Over Grok's Non-Consensual Deepfake Imagery

Top image: MIAMI, FLORIDA - JANUARY 26: In this photo illustration, the Grok Imagine website is seen on an iPad (in the foreground) and a computer screen on January 26, 2026 in Miami, Florida. The European Commission has launched an investigation into Elon Musk's X over concerns its AI tool Grok was used to create sexualized images of real people. (Photo illustration by Joe Raedle/Getty Images)