General Discussion
Elon Musk's Pornography Machine. (Does the word "deportation" sound familiar?)
https://www.theatlantic.com/technology/2026/01/elon-musks-pornography-machine/685482/?gift=-Cg_YfCqAEDBFSjg_lEgre3VOLzGSitHKRiruaWIYEo (gift link)
Earlier this week, some people on X began replying to photos with a very specific kind of request. "Put her in a bikini," "take her dress off," "spread her legs," and so on, they commanded Grok, the platform's built-in chatbot. Again and again, the bot complied, using photos of real people (celebrities and noncelebrities, including some who appear to be young children) and putting them in bikinis, revealing underwear, or sexual poses. By one estimate, Grok generated one nonconsensual sexual image every minute in a roughly 24-hour stretch.
Although the reach of these posts is hard to measure, some have been liked thousands of times. X appears to have removed a number of these images and suspended at least one user who asked for them, but many, many of them are still visible. xAI, the Elon Musk-owned company that develops Grok, prohibits the sexualization of children in its acceptable-use policy; neither the safety nor child-safety teams at the company responded to a detailed request for comment. When I sent an email to the xAI media team, I received a standard reply: "Legacy Media Lies."
Musk, who also did not reply to my request for comment, does not appear concerned. As all of this was unfolding, he posted several jokes about the problem: requesting a Grok-generated image of himself in a bikini, for instance, and writing "🔥🔥🤣🤣" in response to Kim Jong Un receiving a similar treatment. "I couldn't stop laughing about this one," the world's richest man posted this morning, sharing an image of a toaster in a bikini. On X, in response to a user's post calling out the ability to sexualize children with Grok, an xAI employee wrote that the team is "looking into further tightening our gaurdrails [sic]." As of publication, the bot continues to generate sexualized images of nonconsenting adults and apparent minors on X.
snip
Perhaps most telling of all, as I reported in September, xAI launched a major update to Grok's system prompt, the set of directions that tell the bot how to behave. The update disallowed the chatbot from creating or distributing child sexual abuse material, or CSAM, but it also explicitly said there are "no restrictions" on fictional adult sexual content with dark or violent themes and that "teenage" or "girl" does not necessarily imply underage. The suggestion, in other words, is that the chatbot should err on the side of permissiveness in response to user prompts for erotic material. Meanwhile, in the Grok subreddit, users regularly exchange tips for unlocking Grok for "Nudes and Spicy Shit" and share Grok-generated animations of scantily clad women.
2 replies
Elon Musk's Pornography Machine. (Does the word "deportation" sound familiar?) (Original Post)
usonian, Monday (OP)
RockRaven (18,729 posts)
1. CSAM Machine
"Pornography machine" is a soft-peddling, enabling euphemism (IOW typical of legacy media like The Atlantic).
Mike Nelson (10,916 posts)
2. I was surprised...
... went there to see what people were saying about Venezuela. It used to be very political... mostly political posts. Now it seems mostly sex... like a sexed-up version of TikTok. Countless clips with a sexual theme, most with very attractive people, usually women... how can I say it... well, Al Bundy would shout, "Hooters!" very often.