Warning: mentions of child sexual abuse material.
Three years after Musk declared eradicating child exploitation on X 'priority #1', eSafety says CSEM is more accessible on his platform than any other mainstream service, and has launched two investigations into the platform.
Child sexual exploitation material is more prominent on Elon Musk's X than on any other mainstream service, according to Australia's online safety regulator.
The comments, made in response to the controversy over the xAI chatbot Grok being used to mass "undress" images, also state that the efforts of Musk's company to address the problem have been "inadequate and ineffective".
The assessment is contained in documents obtained under freedom of information and tabled in the Senate, which reveal the regulator is running two separate investigations: one into X over the hosting of child sexual exploitation material (CSEM), and a second into xAI, Musk's artificial intelligence company, over its Grok chatbot being used to generate it.
In a letter to X dated January 8, 2026, eSafety's general manager of regulatory operations, Heidi Snell, wrote that the regulator had contacted the company five times since August 2024 about the availability of child sexual exploitation material on the platform.
"The availability of CSEM continues to appear particularly systemic on X," Snell wrote.
"eSafety has not identified CSEM to be as readily accessible on any other mainstream service."
The letter noted Musk's own November 2022 statement that "removing child exploitation is priority #1". More than three years later, eSafety said the problem had not been fixed.
eSafety also found that ordinary hashtags were being co-opted to advertise the material, meaning people using X normally were likely being exposed to it.
An internal eSafety briefing prepared for Communications Minister Anika Wells detailed what external organisations had found about Grok.
Grok responsible
It cited the UK-based Internet Watch Foundation's findings that child sexual exploitation material was likely being generated by the xAI chatbot. AI Forensics, an algorithmic auditing firm, identified imagery of children as young as five in bikinis or "transparent" clothing, as well as content depicting Nazi symbols and ISIS terrorist material, including images of executions and propaganda.
One account on X had used Grok to generate significant amounts of material targeting a redacted individual from eSafety with graphic abuse "depicting violence, including stabbing specific staff, beheadings, other depictions of murder and other extreme content". The account has since been removed.
X released a statement on January 4, 2026, about removing material and suspending accounts. eSafety was unimpressed: the statement "did not refer to any changes X intended to implement on its service to prevent future misuse of Grok on X from occurring".
The briefing noted that while Grok is technically regulated separately through xAI, X remains responsible for how the feature operates on its platform. This means its use can be regulated under the Online Safety Act's industry codes for social media platforms, which require companies to prevent their services from being used to create or distribute child sexual exploitation material.
X challenges safety standards
X is also challenging the validity of the online safety standards themselves, with a Federal Court hearing set for May 2026.
The eSafety briefing flagged a gap in the regulatory framework: current industry codes and standards cover child exploitation material but do not impose systemic obligations around AI-generated image-based abuse of adults.
Additional codes targeting children's access to age-restricted material came into force on March 9, 2026, but eSafety noted these "will not prevent image-based abuse of Australian adults from occurring using AI and being posted online".
An eSafety spokesperson said the agency is "continuing to assess and investigate X's compliance with its obligations under applicable industry codes and standards in relation to child sexual exploitation material. This includes ongoing engagement with X regarding its obligations."
X did not immediately respond to a request for comment.
- Survivors of abuse can find support by calling Bravehearts on 1800 272 831 or the Blue Knot Foundation on 1300 657 380. Kids Helpline is 1800 55 1800. In an emergency, call 000.

