Snapchat faces EU investigation under Digital Services Act over child safety concerns

    European Union regulators have opened a formal investigation into Snap Inc., citing concerns that Snapchat is not doing enough to stop child grooming and the sale of illegal goods on its platform. The probe falls under the Digital Services Act, the EU's sweeping content moderation law, which began applying to the largest platforms in August 2023 and took full effect for all platforms in February 2024. If regulators find Snap in breach, the company faces fines of up to 6% of its global annual revenue.

    What the EU is actually investigating

    The investigation centers on two specific failure areas. First, regulators want to know whether Snapchat has adequate systems to detect and prevent adults from using the platform to contact and groom minors. Second, they are examining whether illegal goods, including drugs and counterfeit products, are being sold through Snapchat with insufficient intervention from Snap. These are not new complaints about the platform, but the DSA gives regulators formal teeth to demand answers and impose penalties.

    The DSA requires platforms with at least 45 million monthly active users in the EU to conduct thorough risk assessments and put mitigation measures in place for systemic risks. Snapchat comfortably clears that threshold. The law also gives the European Commission direct enforcement power, bypassing the need to work through individual member states. That matters because it means the investigation will not get bogged down in the kind of jurisdictional delays that plagued earlier EU tech enforcement attempts.

    EU regulators open formal DSA probe into Snapchat over child safety and illegal content concerns

    Why Snapchat is under particular scrutiny

    Snapchat has a younger user base than most major social platforms, and its design has historically made content harder to monitor. Messages disappear after being viewed, stories vanish after 24 hours, and the platform's structure was built around privacy and ephemerality. Those features, popular with teenagers, also make it harder for the company to detect patterns of grooming or illegal commerce. That tension is at the center of what the EU is now scrutinizing.

    Snap has made some public moves on safety. The company launched a Family Center tool in 2022 that lets parents see who their child is messaging, though not the content of those messages. It also introduced a Sensitive Content Control in 2023 to limit the reach of certain posts in search and discovery. Regulators appear unconvinced that these measures are sufficient, given that the investigation was launched despite those features already being live.

    The financial stakes for Snap

    Snap reported full-year 2023 revenue of approximately 4.6 billion dollars. A fine of 6% of global annual revenue would put the potential penalty at roughly 276 million dollars, based on that figure. That is not a company-threatening number on its own, but DSA fines can also be paired with orders requiring Snap to change specific product features or operational practices, which carries a different kind of cost. Rebuilding core functionality to satisfy regulatory requirements takes time and engineering resources that would otherwise go elsewhere.
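The arithmetic behind that ceiling is simple to verify. A minimal sketch, using Snap's reported 2023 revenue figure from above (the variable names are illustrative, not from any official source):

```python
# The DSA caps fines at 6% of a company's global annual revenue.
DSA_MAX_FINE_RATE = 0.06

# Snap's reported full-year 2023 revenue, approximately 4.6 billion USD.
snap_2023_revenue_usd = 4.6e9

max_fine_usd = snap_2023_revenue_usd * DSA_MAX_FINE_RATE
print(f"Maximum potential DSA fine: ${max_fine_usd / 1e6:.0f} million")
# → Maximum potential DSA fine: $276 million
```

The actual penalty, if any, would be set by the Commission and could fall well below this ceiling; the 6% figure is a statutory maximum, not a forecast.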

    Snap's stock has already been under pressure from weak advertising revenue and competition from TikTok and Instagram Reels. A formal EU investigation adds regulatory risk to a company that investors have been watching closely. The timing matters: Snap is in the middle of trying to rebuild advertiser confidence, and regulatory headlines about child safety are not helpful for that effort.

    How this fits into the EU's broader DSA enforcement pattern

    Snap is not the first platform to face a DSA investigation. The European Commission previously opened probes into X (formerly Twitter), TikTok, and Meta over issues ranging from illegal content to algorithmic transparency. TikTok faced a specific investigation over child protection concerns that overlaps with the issues now being raised about Snapchat. The EU has been methodical about working through its list of designated very large online platforms, and Snap was always likely to come under scrutiny given the demographics of its user base.

    What makes the DSA different from previous attempts to regulate platforms is the enforcement mechanism. Under the old framework, national data protection authorities had to act first and their decisions could be appealed and delayed for years. The DSA gives the Commission the ability to act directly and set a timeline for compliance. Platforms cannot simply wait out the process.

    What Snap has said publicly

    Snap has stated it is cooperating with the investigation and that child safety is a priority for the company. The company has not disputed the Commission's authority to investigate and has not publicly challenged the scope of the probe. That approach is in line with how most large platforms have responded to DSA investigations, at least in the early stages, where outright resistance tends to make the regulatory relationship worse without changing the outcome.

    The investigation has no fixed public deadline, but DSA proceedings have generally moved faster than traditional EU competition or privacy cases. The Commission is expected to request detailed documentation from Snap about its content moderation systems, risk assessment methodology, and the specific measures it has taken to address grooming and illegal sales. Snap's responses to those requests will likely determine how long the formal investigation phase lasts before regulators reach a preliminary finding.


    Frequently Asked Questions

    Q: What specific violations is the EU investigating Snapchat for?

    Regulators are examining whether Snapchat has adequate systems to prevent child grooming by adults and whether it is doing enough to stop illegal goods such as drugs and counterfeit products from being sold through the platform.

    Q: How much could Snap be fined if found in breach of the Digital Services Act?

    The DSA allows fines of up to 6% of a company's global annual revenue. Based on Snap's 2023 revenue of around 4.6 billion dollars, that would amount to roughly 276 million dollars.

    Q: Has the EU investigated other social media platforms under the DSA?

    Yes. The European Commission has previously opened formal DSA investigations into X, TikTok, and Meta. TikTok faced a specific probe related to child protection, similar to the current Snapchat case.

    Q: Does the disappearing message feature on Snapchat affect how regulators view its safety practices?

    It is a factor regulators consider. Snapchat's ephemeral design makes it harder to detect patterns of grooming or illegal activity, and that structural challenge is part of why the platform faces greater scrutiny on content moderation.

    Q: What safety tools has Snap already launched before this investigation?

    Snap introduced a Family Center tool in 2022 allowing parents to see who their child messages, and a Sensitive Content Control in 2023 to limit certain content in search and discovery. EU regulators launched the investigation despite these features being active.
