EU opens formal investigation into Snapchat over child safety violations

    European Union regulators have launched a formal investigation into Snap's Snapchat platform, alleging that the company is failing to adequately protect minors from grooming and to prevent the sale of illegal goods through its service. The probe is being conducted under the Digital Services Act, the EU's primary law governing how large online platforms manage harmful content. If Snap is found in breach, the company faces fines of up to 6% of its global annual revenue. Snap's shares fell sharply on the news.

    Snap reported total revenue of $5.36 billion in 2024. A 6% fine on that figure would amount to roughly $321 million, which is not a trivial sum for a company that has struggled to reach sustained profitability. The DSA fine ceiling is calculated on global revenue rather than European revenue, which means the penalty exposure is significantly larger than under older, region-scoped regulatory frameworks.
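The ceiling is straightforward arithmetic: a fixed fraction of global annual revenue. The sketch below (the function name is our own, not a regulatory term) reproduces the figure cited above from the article's 2024 revenue number.

```python
def dsa_fine_ceiling(global_revenue_usd: float, rate: float = 0.06) -> float:
    """Upper bound on a DSA fine: a fixed fraction of global annual revenue.

    The 6% rate is the statutory maximum under the DSA; any actual fine
    is set case by case and may be far below this ceiling.
    """
    return global_revenue_usd * rate

# Snap's reported 2024 revenue, as cited in the article
snap_revenue_2024 = 5.36e9

ceiling = dsa_fine_ceiling(snap_revenue_2024)
print(f"Maximum DSA fine exposure: ${ceiling / 1e6:.1f} million")
```

Note that the base is worldwide revenue, not EU revenue, which is why the exposure is larger than under older, region-scoped frameworks.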

    What the investigation is actually about

    The European Commission has identified two specific areas of concern. The first is child grooming: the allegation that Snapchat's design and moderation systems are not doing enough to prevent adults from using the platform to initiate contact with minors for exploitative purposes. The second is the sale of illegal goods, with regulators pointing to evidence that Snapchat is being used as a channel for drug transactions and other prohibited commerce.

    Both concerns have been raised by child safety organizations and law enforcement agencies in multiple EU member states over the past two years. The UK's National Society for the Prevention of Cruelty to Children published a report in 2023 identifying Snapchat as one of the platforms most frequently cited in child sexual abuse cases reported to police. The EU's formal investigation is the first time these concerns have been translated into a regulatory proceeding with direct financial consequences for Snap.


    How the Digital Services Act works

    The Digital Services Act's obligations for very large online platforms took effect in August 2023, and the law came into full effect for all platforms in February 2024. Its strictest tier applies to platforms with more than 45 million monthly active users in the EU, a threshold Snapchat comfortably meets. Under the DSA, such platforms are required to conduct annual risk assessments identifying how their services might be used to cause harm, and to implement mitigation measures proportionate to those risks.

    The law also requires platforms to give researchers and regulators access to data for audit purposes, and to provide users with meaningful ways to report harmful content. Regulators can open a formal investigation when they believe a platform's risk assessment or mitigation measures are inadequate. That is the stage Snapchat is now at. The investigation does not automatically result in a fine; Snap will have the opportunity to respond to the Commission's concerns before any penalty is issued.

    Snapchat's particular vulnerability to these allegations

    Snapchat's core design features create specific moderation challenges. Messages disappear by default after being viewed, which limits the audit trail available to both the platform and law enforcement. The Stories format and the Discover section serve public-facing content to users based on algorithmic recommendations, and Snap Maps allows users to see the approximate location of their contacts in real time.

    Each of these features, while designed for social connection and content discovery, has a documented history of misuse. The ephemeral messaging system is frequently cited in grooming cases because it reduces the likelihood that incriminating conversations will be preserved. Snap introduced a Family Center feature in 2022 that allows parents to see who their children are messaging without seeing the content of those messages, but child safety advocates have argued this is insufficient given the scale of the problem.

    Where this fits in the broader DSA enforcement picture

    Snapchat is not the first platform to face DSA enforcement action. The European Commission opened a formal investigation into X, formerly Twitter, in December 2023 over concerns about illegal content and disinformation. Meta has faced preliminary proceedings related to its advertising practices and content moderation in the EU. TikTok was fined 345 million euros in September 2023 by Ireland's Data Protection Commission over how it handled children's data, though that action was under GDPR rather than the DSA.

    The DSA enforcement machinery is still relatively new, and regulators are building case precedent as they go. The Snapchat investigation will be closely watched by other platforms because the Commission's findings, whatever they are, will effectively define what adequate child safety measures look like under the law. A finding against Snap would create a compliance baseline that Meta, TikTok, and YouTube would need to demonstrate they meet.

    The financial pressure on Snap

    Snap has been working to stabilize its business after a difficult 2022, when its stock fell more than 80% over the course of the year following a series of missed revenue targets. The company returned to modest profitability in late 2023 and reported improving advertising revenue through 2024. A large DSA fine would hit at a moment when the company's financial position is still fragile relative to larger social media peers.

    The share price reaction to the investigation announcement reflected that concern directly. Snap's stock dropped approximately 8% on the day the probe was confirmed, erasing several weeks of gains. Investors in social media companies have grown increasingly sensitive to regulatory risk in Europe, particularly after Meta absorbed a record 1.2 billion euro GDPR fine in May 2023 for transferring European user data to the United States.

    What Snap has said publicly

    Snap responded to the investigation announcement by saying the company is committed to protecting its users and cooperating fully with EU regulators. The company pointed to its existing safety tools, including age verification measures, the Family Center, and its content moderation systems, as evidence of its commitment to child protection. Snap also noted that it regularly removes accounts and content that violate its policies.

    Whether those existing measures satisfy the DSA's requirements is the precise question the investigation will answer. The European Commission typically takes between 12 and 18 months to complete a formal DSA investigation, which means a final decision on whether Snap has breached the law is unlikely before late 2026 at the earliest.


    Frequently Asked Questions

    Q: How much could Snap be fined if found in breach of the Digital Services Act?

    The DSA allows fines of up to 6% of a company's global annual revenue. Based on Snap's 2024 revenue of $5.36 billion, that ceiling works out to roughly $321 million.

    Q: What specific features of Snapchat are regulators concerned about?

    The main concerns center on Snapchat's disappearing messages, which limit audit trails in grooming cases, and its broader content discovery and messaging systems, which regulators say are being used to facilitate contact between adults and minors as well as drug sales.

    Q: Is Snapchat the first platform to face a formal DSA investigation?

    No. The European Commission previously opened a formal DSA investigation into X in December 2023 over illegal content and disinformation concerns. Meta has also faced preliminary DSA proceedings related to advertising and content moderation.

    Q: How long will the EU's investigation into Snapchat take?

    DSA investigations typically take between 12 and 18 months to complete, meaning a final decision on whether Snap has violated the law is unlikely before late 2026.

    Q: What is the Family Center feature Snap introduced for child safety?

    Family Center is a parental supervision tool Snap launched in 2022 that lets parents see who their child is messaging on the platform, without showing the content of those messages. Child safety advocates have argued the feature does not go far enough given the documented scale of grooming activity on the platform.
