Ofcom Investigation into X: Legal Implications for Online Platforms
On 12 January 2026, the UK communications regulator Ofcom opened a formal investigation into X, the social media platform formerly known as Twitter, over its compliance with the UK's Online Safety Act 2023. The investigation focuses on whether X has met its legal duties to protect users, particularly children, from illegal and harmful content, following reports that Grok, the AI chatbot integrated into the platform, has been used to generate and distribute sexualised and potentially illegal imagery.
For digital platforms, regulators, and IP practitioners alike, the unfolding Ofcom investigation signals a critical moment in the enforcement of online content regulation. It also underlines the broader trend towards heightened legal scrutiny of generative AI tools and the responsibilities of platforms that host or deploy them.
What Triggered the Investigation?
Ofcom's investigation was prompted by reports that X's AI chatbot, Grok, was being used to create and share:
- Undressed or sexualised images of people, created without their consent, potentially constituting intimate image abuse or pornography offences; and
- Sexualised images of children, which may amount to child sexual abuse material (CSAM) under UK law.
Under the Online Safety Act 2023, providers of regulated user-to-user services, like X, have a legal duty to prevent access to illegal content, assess the risks posed by their services, and take proportionate measures to mitigate harm, including to children.
Ofcom says it first contacted X on 5 January and set a deadline of 9 January for the company to explain what steps it had taken to protect UK users. After reviewing X's response and the available evidence, the regulator determined that a formal investigation was necessary.
Legal Framework: The Online Safety Act 2023
The Online Safety Act represents one of the UK's most significant regulatory efforts to make digital platforms accountable for content circulated on their services. It imposes risk assessment and risk mitigation duties on companies offering regulated user-to-user services, with additional duties where a service is likely to be accessed by children. These duties include:
- Performing a suitable and sufficient illegal content risk assessment;
- Taking proportionate measures to prevent users from encountering illegal content, such as non-consensual intimate imagery and CSAM;
- Implementing highly effective age assurance systems to prevent children from accessing harmful content; and
- Ensuring the swift removal of illegal material once identified.
Ofcom's formal investigation is intended to determine whether X has complied with these duties and whether its existing content moderation and risk mitigation measures meet the legal standard set by the Act.
Enforcement Powers and Potential Consequences
Ofcom has broad enforcement powers under the Online Safety Act. If the regulator concludes that X has failed to comply:
- It can impose substantial fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater (for a platform with £1 billion in qualifying worldwide revenue, for example, the cap would be £100 million);
- It can require specific actions to ensure compliance or remedial steps to reduce harm; and
- In the most serious cases, it can apply to a court for "business disruption measures", which could include blocking access to X in the UK or restricting services such as payments or advertising.
These enforcement tools reflect the UK's robust approach to online safety compliance and send a clear signal to global tech platforms that failure to protect users, especially children, carries serious legal and commercial risks.
Impact on Digital Platforms and IP Law
For legal practitioners advising technology and media clients, the Ofcom investigation into X carries several key implications:
1. Content Safety Compliance Is Non-Negotiable
Platforms cannot rely on voluntary moderation alone. They must demonstrate proactive risk assessments, rigorous age-verification systems, and documented protocols for removing illegal and harmful content.
2. AI Tools Must Be Accountable
Generative AI capabilities now extend to image and video creation. Companies integrating these tools must ensure that safeguards are not only technologically robust but also aligned with applicable legal standards in all markets where the service operates.
3. Global Platforms Face Local Legal Risks
International tech companies operating in the UK must account for domestic laws like the Online Safety Act, which may impose duties that go beyond general community standards or self-regulation.
4. Regulatory Enforcement Is Becoming More Assertive
Ofcom's willingness to investigate and potentially impose significant penalties reflects a broader trend towards more assertive regulatory enforcement, seen also in the EU and in other jurisdictions implementing digital safety legislation.
Looking Ahead
Ofcom's investigation into X is ongoing, and the regulator has not yet set a timeline for its conclusion. However, the case already highlights the increasing legal responsibilities placed on digital platforms, particularly those using advanced AI, and the need for comprehensive compliance strategies.
For legal practitioners, understanding these evolving regulatory frameworks is essential to advising clients on risk management, content governance, and adherence to online safety obligations.
Make an Enquiry Now
For advice relating to the Ofcom investigation into X, call our intellectual property experts on 0131 478 4724 or complete an Online Enquiry.
We have helped hundreds of individuals and businesses across the UK.
Please note that the contents of this blog are given for information only and must not be relied upon. Legal advice should always be sought in relation to your specific circumstances.