The tension between Europe and big tech has opened another front, this time over child safety. Meta, which owns Facebook, Instagram, and WhatsApp, is accused of failing to put proper safeguards in place to stop children under 13 from using Instagram and Facebook. 

In a preliminary ruling released April 29, the European Commission said Meta does not have an adequate process for spotting and deleting accounts created by underage users who slip past its age restrictions, which could result in the company facing financial penalties and further enforcement action. 

The case against Meta centers on what regulators describe as basic failures in age enforcement. Officials say the company appears to be violating the Digital Services Act, the EU's landmark 2022 law aimed at holding tech platforms more accountable. 

“Meta’s own general conditions indicate their services are not intended for minors under 13. Yet, our preliminary findings show that Instagram and Facebook are doing very little to prevent children below this age from accessing their services,” Henna Virkkunen, the executive vice president for tech sovereignty, security, and democracy, said in a statement.  

“The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users – including children.” 

Among the concerns is Meta’s handling of age checks. Regulators said the company does not have strong enough systems to verify a user’s self-reported birth date, making it easier for children under 13 to slip through. They also criticized the reporting tool for minors as hard to navigate and ineffective, noting that reported users are often allowed to keep using the service without review. 

The action against Meta comes amid a wider European push to tighten child safety rules online, with regulators moving more aggressively against social media companies. Platforms like Snap and TikTok have also been targeted by regulators in Brussels, while Spain, France, and Denmark are among the countries considering stricter limits on youth social media use.  

Similar pressure is building in the United States, where Meta and other social media companies face growing scrutiny over child safety. In March, a California jury found Meta and YouTube liable for harming the mental health of a young user through addictive design features. 

Beyond enforcement, the European Union and several of its 27 member states are now looking at new age-verification systems to limit children’s access to certain online content. 

Regulators estimate that about 10–12% of children under 13 across the EU are using Instagram and Facebook, raising questions about how well age restrictions are being enforced in practice. 
