
Three anonymous plaintiffs alleged in a lawsuit filed Monday in California federal court that Elon Musk's company xAI should be held liable for allowing its AI systems to create abusive sexual imagery of identifiable minors.
The three plaintiffs are seeking to bring a class action on behalf of anyone whose real images taken as a minor have been turned into sexual content by Grok. They allege that xAI failed to adopt basic safety measures used by other leading labs to prevent their image models from generating pornography of real people and of minors.
The case, Jane Doe 1, Jane Doe 2, a minor, and Jane Doe 3, a minor v. x.AI Corp. and x.AI LLC, was filed in the U.S. District Court for the Northern District of California.
Other image-generating AI systems use a range of techniques to prevent the creation of child sexual abuse material from ordinary photographs. The lawsuit alleges that xAI adopted none of these standards.
Notably, if a model can generate nude or erotic content from real pictures at all, it becomes nearly impossible to stop it from producing sexual content depicting children. Musk's public promotion of Grok's ability to generate sexual imagery and to depict real people in revealing clothing is central to the case.
The company did not respond to TechCrunch's request for comment.
One plaintiff, Jane Doe 1, had images from her high school homecoming and yearbook altered by Grok to depict her naked. She learned the images were circulating online from an anonymous tipster who contacted her on Instagram, sending her a link to a Discord server that hosted sexualized pictures of her and of other minors she recognized from school.
A second plaintiff, Jane Doe 2, was notified by law enforcement about altered, sexualized images of her generated by a third-party mobile app that relies on Grok models. A third, Jane Doe 3, was likewise contacted by law enforcement after officers found an altered, pornographic image of her on the device of a suspect they had arrested. Because such third-party use still depends on xAI's code and servers, the plaintiffs' attorneys argue, the company should be held responsible.
All three plaintiffs, two of whom are minors, report severe distress over the circulation of these images and what it could mean for their reputations and social lives. They are seeking civil damages under various laws designed to protect exploited children and to deter corporate negligence.

