The Internet Watch Foundation (IWF) says its analysts have found “criminal imagery” of girls aged between 11 and 13 which “appears to have been created” using Grok.
The AI tool is owned by Elon Musk’s company xAI. It can be accessed via its website and app, or through the social media platform X.
The IWF said it found “sexualised and topless imagery of girls” on a “dark web forum” on which users claimed they had used Grok to create the imagery.
The BBC has approached X and xAI for comment.
The IWF’s Ngaire Alexander told the BBC that tools like Grok now risked “bringing sexual AI imagery of children into the mainstream”.
Alexander said the material would be classified as Category C under UK law – the lowest severity of criminal material.
However, the user who uploaded it had then used a different AI tool, not made by xAI, to create a Category A image – the most serious category.
“We’re extremely concerned about the ease and speed with which people can apparently generate photo-realistic child sexual abuse material (CSAM),” Alexander said.
The charity, which aims to remove child sexual abuse material from the web, operates a hotline where suspected CSAM can be reported, and employs analysts who assess the legality and severity of that material.
Its analysts found the material on the dark web – the images were not found on the social media platform X.
X and xAI were previously contacted by Ofcom, following reports that Grok could be used to make “sexualised images of children” and to undress women.
The BBC has seen a number of examples on the social media platform X of people asking the chatbot to alter real photos to make women appear in bikinis without their consent, as well as putting them in sexual situations.
The IWF said it had received reports of such images on X; however, these had not so far been assessed as meeting the legal definition of CSAM.
In a previous statement, X said: “We take action against illegal content on X, including CSAM, by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.
“Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.”