The European Commission has issued preliminary findings against Meta, alleging that the US tech company has failed to adequately prevent children under the age of 13 from accessing Instagram and Facebook, in breach of the Digital Services Act (DSA).
The Commission said its investigation found that Meta did not sufficiently “identify, assess and mitigate” the risks of minors bypassing age restrictions on its platforms. While Meta’s own policies set 13 as the minimum age for users, regulators say its enforcement systems are ineffective in practice.
According to the findings, children can easily register by entering false birth dates, with no robust verification mechanism in place to confirm their age. The Commission also criticised Meta’s reporting system for underage accounts, describing it as overly complicated and inefficient, requiring multiple steps before a report can be submitted.
Even when accounts belonging to under-13s are flagged, the Commission said follow-up action is often inconsistent, allowing some accounts to remain active without additional checks.
EU regulators estimate that around 10 to 12 percent of children under 13 are using Instagram or Facebook, a figure that contradicts Meta’s internal assessments. The Commission also accused the company of overlooking scientific research showing that younger users are more vulnerable to harm from social media exposure.
The findings require Meta to revise its risk assessment approach and strengthen safeguards aimed at detecting and removing underage users. Regulators are also calling for improved systems to prevent exposure to inappropriate content and to ensure higher levels of safety, privacy, and protection for minors online.
European Commission President Ursula von der Leyen has recently highlighted plans for an EU-wide age verification application, which would allow users to confirm their age through digital identity tools or official documents. Several EU member states are also considering stricter age limits, including potential bans for users under 15 or 16.
Meta has rejected the Commission’s preliminary conclusions. A company spokesperson said Instagram and Facebook are designed for users aged 13 and above, and that systems are in place to detect and remove underage accounts. The company also said it continues to invest in new technologies to improve age detection and expects to introduce additional measures in the near future.
The investigation, launched in May 2024, is part of the EU’s broader enforcement of the Digital Services Act. Regulators examined internal documents, risk assessments, and platform responses before issuing their findings.
If the Commission ultimately confirms the violations, it could impose fines of up to 6 percent of Meta’s global annual turnover. The company now has the opportunity to review the evidence and respond before a final decision is made.
The case comes amid wider global debate over children’s access to social media, age verification standards, and the responsibilities of major tech platforms in protecting younger users online.