Britain’s media regulator, Ofcom, has launched an investigation into Elon Musk’s X over whether its Grok AI chatbot generated illegal sexually explicit content, including deepfakes of real people and minors.
Ofcom said it had received “deeply concerning reports” that users were prompting Grok to create undressed images of people without their consent. Some outputs may amount to intimate image abuse or pornography; others appear to depict children in sexualized ways, potentially constituting child sexual abuse material (CSAM).
The investigation follows strong criticism from Prime Minister Keir Starmer, who called the images “disgusting” and “unlawful,” urged X to “get a grip” on Grok, and backed Ofcom’s authority to act.
Under UK law, it is illegal to create or share non-consensual intimate imagery, as it is to generate AI-based sexual content involving minors. Platforms must also prevent UK users from encountering such material and remove it as soon as they become aware of it.
X has already faced global backlash over Grok’s image tool. French authorities referred the case to prosecutors, calling the content “manifestly illegal.” Indian regulators have also demanded answers.
In response, X restricted the feature to paying users. The company says it removes all illegal content and permanently bans violators. “Anyone using Grok to make illegal content will face the same consequences as if they uploaded it,” X stated.
When asked for comment, xAI, the team behind Grok, dismissed the concerns with a short reply: “Legacy Media Lies.”
Ofcom will now assess whether X properly evaluated the risks Grok poses to UK users, including whether it considered the specific dangers to children.
If Ofcom finds serious violations, it can go to court to have payment providers or advertisers cut ties with X and, in extreme cases, to force UK internet providers to block the platform.
Beyond xAI’s remark, X did not immediately respond to requests for comment. Either way, the case is a key test of the UK’s Online Safety Act, and of whether tech giants will be held accountable for AI-generated harms.