Meta Charged With DSA Breach Over Child Safety Failures

EU Charges Meta With Digital Services Act Breach for Failing to Protect Children on Instagram and Facebook

The European Commission issued preliminary findings on April 29, 2026, that Meta's Instagram and Facebook are in breach of the Digital Services Act (DSA) for failing to prevent children under 13 from accessing their platforms. The Commission concluded that Meta was "failing to diligently identify, assess and mitigate the risks of minors under 13 years old accessing their services" — a significant escalation in EU regulatory pressure on the world's largest social media company.

The preliminary findings do not yet constitute a final ruling. Meta now has the right to review the Commission's investigation files and respond in writing before any final decision is made. If the charge is ultimately confirmed, however, the company could face a fine of up to 6% of its total worldwide annual turnover.

What the Commission Found: Weak Age Checks and a Broken Reporting Tool

The Commission's investigation, launched in May 2024, was based on an in-depth review of Meta's risk assessment reports, internal data and documents, and the platforms' responses to formal requests for information. The findings paint a picture of age-assurance measures that are, in the Commission's assessment, structurally inadequate.

At the most basic level, children under 13 can simply enter a false date of birth when creating an account — claiming to be 13 or older — and Meta does not provide effective controls to verify whether that self-declared age is accurate. This is particularly notable given that Meta's own terms and conditions set the minimum age for both Instagram and Facebook at 13. According to the Commission, the measures the company has put in place to enforce these restrictions do not appear to be effective.

The problems extend beyond account creation. The Commission described Meta's tool for reporting underage users as "difficult to use and not effective, requiring up to seven clicks just to access the reporting form, which is not automatically pre-filled with the user's information." And even when a minor under 13 is successfully reported through this cumbersome process, the outcome is often no action at all: according to the Commission's preliminary findings, "there often is no proper follow-up, and the reported minor can simply continue to use the service without any type of check."

The Commission is also continuing its investigation into additional potential breaches related to the design of Facebook's and Instagram's online interfaces, which it says "may exploit the vulnerabilities and inexperience of minors, leading to addictive behaviour and reinforcing the so-called rabbit hole effects." Those findings have not yet been finalized.

The DSA's Teeth: What a Confirmed Breach Could Mean for Meta

The Digital Services Act, which applies to Very Large Online Platforms (VLOPs) operating in the European Union, requires companies to proactively identify, assess, and mitigate systemic risks — including risks to children's safety, privacy, and mental health. The law was specifically designed to move regulators away from reactive enforcement and toward ongoing platform accountability.

If the European Commission issues a final non-compliance decision against Meta, the financial consequences could be substantial. A confirmed DSA breach carries a maximum penalty of up to 6% of the company's total worldwide annual turnover, and the Commission retains the power to impose periodic penalty payments for continued non-compliance.

The Commission has already demonstrated its willingness to use these powers. In December 2025, it issued the first-ever non-compliance fine under the Digital Services Act, levying €120 million against X (formerly Twitter) for breaching transparency obligations, including the deceptive design of its "blue checkmark" verification system. Meta has also faced prior DSA scrutiny: in October 2025, the Commission issued preliminary findings that both Meta and TikTok had erected barriers preventing researchers from studying how content reaches children, with potential fines in that separate action flagged at nearly $20 billion.

The institutional message is unambiguous. The Commission has stated: "We will have zero tolerance for companies that do not respect our children's rights."

A Global Wave of Child Safety Regulation

The Commission's action against Meta sits within a rapidly expanding global regulatory landscape. According to data from the OECD, the number of countries considering age verification or social media restriction measures for children rose from just one at the end of 2023 to 25 by April 2026 — a striking acceleration that reflects mounting political and public pressure on social media platforms worldwide.

Australia moved first among major economies, becoming the first country to ban children under 16 from social media in December 2024. The results have been complicated: recent studies indicate that 61% of Australian children aged 12 to 15 still have access to restricted social media platforms following the ban, while 70% say it is easy to bypass. Those figures underscore a tension that regulators across the EU are also grappling with — the gap between legislative intent and practical enforcement.

Within Europe, the Commission has been developing a technical solution. On April 15, 2026, the EU announced that its age verification app — which allows users to prove their age using a passport or national ID without sharing personal data directly with platforms — is technically ready for implementation and will be available to citizens soon. EU Commission President von der Leyen stated the new app will have "the highest privacy standards in the world," and noted that platforms will not be required to use the app but must demonstrate they have alternative age checks that are equally effective.

The app's rollout has not been without criticism, however. Reports following the prototype's debut noted it was demonstrated to be bypassable, and some EU lawmakers have described current enforcement timelines as insufficiently urgent given the ongoing harms to children on live platforms.

Reactions: Regulators Beyond the EU Are Watching

The European Commission's findings echo concerns being raised by regulators outside the EU. Melanie Dawes, CEO of Ofcom, the UK's communications regulator, has stated that tech giants are "failing to put children's safety at the heart of their products" and are falling short on promises to keep children safe online.

Paul Arnold, CEO of the UK Information Commissioner's Office (ICO), has raised specific concerns about what inadequate age verification means for data protection, saying it "puts under-13s at risk by allowing their information to be collected and used unlawfully, without the protections they are entitled to."

Neither Ofcom nor the ICO commented specifically on the April 29, 2026 European Commission action. Their statements nonetheless reflect a regulatory consensus that has been building across jurisdictions: self-declared age fields, without verification, are not a meaningful safeguard.

What Happens Next

The April 29, 2026 findings are preliminary. Under DSA procedures, Meta has the right to examine the documents in the Commission's investigation files and submit a written defense before any final ruling is issued. The Commission will then consider Meta's response before deciding whether to issue a formal non-compliance decision.

Should a non-compliance decision be issued, the Commission could impose a fine of up to 6% of Meta's worldwide turnover, and could require the company to implement specific technical or procedural changes to bring its platforms into compliance with the DSA's child protection provisions.

Separately, the Commission's ongoing investigation into the addictive design features of Instagram and Facebook — including interface choices that may reinforce compulsive use patterns among minors — remains open. Those findings, when they come, could widen the scope of Meta's DSA exposure considerably.

For now, the question of whether social media platforms can effectively restrict access by young children — without circumvention tools, false birthdays, or parental workarounds — remains unresolved, both technically and legally, across every jurisdiction wrestling with it.
