The European Commission has accused Meta of failing to protect children on its services, saying the company did not effectively enforce its own minimum age of 13 on Facebook, Instagram and WhatsApp. According to the Commission's preliminary findings, inadequate age checks allow children under 13 to sign up simply by entering false birthdates.
EU digital commissioner Henna Virkkunen said platform terms and conditions must lead to concrete protections for users, including minors. Meta will have the chance to respond to the Commission’s findings; if breaches are confirmed the company could face substantial fines under EU law.
Investigators also faulted Meta's tools for reporting accounts belonging to under-13s, describing the process as cumbersome and ineffective: reaching a reporting form can take up to seven clicks, and the form is not pre-filled with the account's details.
The case is being pursued under the Digital Services Act, the EU's framework for policing illegal and harmful online content and imposing penalties on platforms and search engines that fail to meet its obligations. The Commission's preliminary view is that Meta has violated provisions of the DSA.
Separately, EU officials are working on measures to better protect children online without blanket bans, including an age-verification app intended to confirm users’ ages while limiting unnecessary data sharing. Internationally, some countries are tightening minimum ages: Australia bans social media use by under-16s, while the UK, France and Denmark have debated similar limits and Germany has signaled support for higher minimum ages on apps like Instagram and TikTok.