UK regulators ICO and Ofcom issue formal demand to xAI over Grok

    Two of the UK's most consequential regulators have jointly turned their attention to Elon Musk's AI company. The Information Commissioner's Office and Ofcom have issued a formal demand to xAI, requesting detailed information about how the Grok AI model handles data and whether it meets transparency obligations under UK law. This is not a routine inquiry. A joint action from both the ICO and Ofcom directed at a single AI company is relatively uncommon, and the combination of data protection and communications oversight in one move gives the regulators significant reach.

xAI launched Grok in late 2023 as a direct competitor to ChatGPT, with access initially tied to X Premium subscriptions. Availability and capabilities have since expanded, but the model has drawn scrutiny at several points over its training data sources and content moderation approach. The UK action appears to be the most formal regulatory pressure xAI has faced since the company was founded.

    UK regulators probe xAI's Grok model over data and transparency concerns

    What the regulators are actually asking for

    The formal demand covers two broad areas. The ICO is focused on data practices, specifically how Grok collects, processes, and stores personal data belonging to UK residents. Under UK GDPR, companies processing personal data of UK users are required to meet lawfulness, transparency, and purpose limitation standards. The ICO wants to understand whether xAI has a documented legal basis for the data processing Grok performs, and whether users are adequately informed about how their data is used.

    Ofcom's angle is different. As the UK's communications regulator, Ofcom has authority under the Online Safety Act, which came into full effect for large platforms in 2024. Ofcom's interest in Grok likely relates to how the model interacts with users on X, which is already subject to Ofcom oversight as a designated service under that legislation. The demand asks xAI to explain how Grok's outputs are governed, particularly where those outputs appear in public-facing contexts on X.

    Why a joint action matters

    The ICO and Ofcom coordinating on a single demand is worth paying attention to. The two bodies have distinct powers, and when they act together it typically means the matter touches both data rights and platform accountability at the same time. For xAI, that creates two parallel compliance tracks rather than one. Failing to satisfy either regulator carries its own set of consequences, and the demands may not have identical response deadlines or documentation requirements.

    The ICO can issue fines of up to 17.5 million pounds or 4 percent of global annual turnover under UK GDPR, whichever is higher. Ofcom's enforcement powers under the Online Safety Act include fines of up to 10 percent of qualifying worldwide revenue for non-compliance. Neither penalty has been threatened yet, but the formal demand is the step that typically precedes an enforcement investigation if responses are inadequate.
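The two penalty caps described above are computed differently: the ICO cap is the higher of a fixed amount and a turnover percentage, while the Ofcom cap is a straight percentage of qualifying revenue. A minimal sketch of the arithmetic, using hypothetical turnover and revenue figures (not xAI's actual financials, which are private):

```python
def ico_max_fine(global_turnover_gbp: float) -> float:
    """UK GDPR cap: the higher of 17.5 million pounds or 4% of
    global annual turnover."""
    return max(17_500_000.0, 0.04 * global_turnover_gbp)

def ofcom_max_fine(qualifying_worldwide_revenue_gbp: float) -> float:
    """Online Safety Act cap: up to 10% of qualifying worldwide revenue,
    as described in the article."""
    return 0.10 * qualifying_worldwide_revenue_gbp

# Hypothetical example: a company with 1 billion pounds in turnover/revenue.
print(ico_max_fine(1_000_000_000))    # 40000000.0 (4% exceeds the 17.5m floor)
print(ofcom_max_fine(1_000_000_000))  # 100000000.0
```

For smaller companies the fixed 17.5 million pound figure dominates the ICO calculation, which is why the "whichever is higher" wording matters in practice.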

    xAI's position in the UK market

    xAI does not have a registered UK entity in the same way that Google, Meta, and Microsoft maintain substantial local presences. The company operates primarily out of the United States, which complicates how UK regulators can enforce compliance. The ICO has previously pursued cases against US companies without a UK establishment, but those cases tend to move slowly and enforcement can depend on international cooperation mechanisms.

    Grok's integration with X does create a clearer jurisdictional hook. X, formerly Twitter, has a UK presence and is already engaged with Ofcom on Online Safety Act compliance. That existing relationship gives Ofcom a practical way to apply pressure on xAI's products that appear within X's platform, even if xAI itself sits outside straightforward UK regulatory reach.

    The broader context of AI regulation in the UK

    The UK has taken a deliberately lighter touch on AI regulation compared to the EU, which passed the AI Act in 2024. The UK government's stated approach has been to task existing regulators, including the ICO and Ofcom, with applying their current powers to AI rather than creating a dedicated AI regulator. The action against xAI is a direct product of that strategy. It means AI companies operating in the UK face a patchwork of overlapping regulatory demands rather than a single compliance framework.

    The ICO has been active on AI since 2023, publishing guidance on generative AI and data protection and opening investigations into several AI-related data scraping incidents. Ofcom's involvement is newer territory. The Online Safety Act gave Ofcom expanded powers over user-to-user services and search, and applying those powers to AI-generated content within social platforms is still being worked out in practice. The Grok inquiry may help define how Ofcom interprets its mandate in that space.

    What xAI is expected to submit

    The formal demand requires xAI to provide documentation on Grok's data processing activities, including training data sources and any use of data from UK users. The company is also expected to outline what transparency measures are in place for users who interact with Grok through X. A response deadline has been set, though the specific date has not been made public in either regulator's announcement.

    If xAI provides adequate responses, the inquiry could close without further action. If the documentation reveals compliance gaps, either regulator could escalate to a formal investigation. The ICO opened a similar process against Snap in 2023 over its My AI feature, which resulted in a preliminary enforcement notice before Snap provided additional information and the ICO updated its assessment. That case gives some indication of how the xAI inquiry might proceed over the coming months.



    Frequently Asked Questions

    Q: What powers do the ICO and Ofcom have to enforce their demands against xAI?

The ICO can fine companies up to 17.5 million pounds or 4 percent of global annual turnover under UK GDPR, whichever is higher. Ofcom can impose fines of up to 10 percent of qualifying worldwide revenue under the Online Safety Act. Neither penalty has been issued yet, but both apply if xAI fails to comply adequately.

    Q: Does xAI have to comply with UK regulators if it has no UK office?

    UK GDPR applies to any company processing personal data of UK residents, regardless of where the company is based. Ofcom's reach is supported by X's existing UK presence, since Grok operates within X's platform, which is already subject to Online Safety Act obligations.

    Q: What specifically is the ICO investigating about Grok's data practices?

    The ICO is asking whether xAI has a documented legal basis for processing UK user data through Grok, and whether users are properly informed about how their data is collected and used during interactions with the model.

    Q: Has xAI faced regulatory action in other countries over Grok?

    Prior to this UK action, xAI drew attention from the Irish Data Protection Commission in 2024 over Grok's use of public posts from X users for training, which resulted in xAI pausing some data processing in the EU. The UK inquiry is separate and operates under UK rather than EU law.

    Q: How is the UK's approach to AI regulation different from the EU's?

    The EU passed a dedicated AI Act in 2024 that classifies AI systems by risk level and sets specific compliance requirements. The UK chose not to create a separate AI regulator, instead directing existing bodies like the ICO and Ofcom to apply their current powers to AI-related issues within their existing mandates.
