The European Commission on Wednesday accused Meta of failing to protect children from harm on its social media services, saying the company breached its own age rules by allowing users under 13 onto platforms including Facebook, Instagram and WhatsApp.
EU digital chief Henna Virkkunen said terms and conditions must be the basis for concrete action to protect users, “including children.” The Commission’s findings are preliminary and Meta will be given the opportunity to respond; if the breaches are confirmed the company could face substantial fines under EU law.
The investigation found that Meta's policy of permitting only users aged 13 and over was not effectively enforced: minors could register with a false birth date, and there were insufficient checks to verify age. The Commission also criticised Meta's reporting tool for under-13 accounts as hard to use and ineffective, noting it can require up to seven clicks to reach a form that is not automatically pre-filled with the account's details.
The action comes under the Digital Services Act (DSA), the EU's regulatory framework for tackling illegal content online and imposing penalties on platforms and search engines that fail to meet its requirements. The Commission's preliminary view is that Meta has violated the DSA.
Separately, the EU has been developing measures to better protect children online rather than banning access outright. Officials recently announced a forthcoming age verification app designed to confirm a user's age before they access certain services, without requiring them to share excessive personal data.
Globally, some countries are considering or adopting stricter minimum ages for social media. Australia has implemented a ban on children under 16 using social media, and the UK, France and Denmark have debated similar policies. Germany has indicated support for raising minimum ages on platforms such as Instagram and TikTok to reduce young people’s screen time.
Edited by: Alex Berry