EU Tightens Grip on Kids’ Social Media Access
News Desk


Social media is facing stronger scrutiny in Europe as regulators take action against Big Tech platforms over child safety concerns.

Meta Platforms, the parent company of Facebook and Instagram, has been charged by the European Commission with breaching the EU’s Digital Services Act.

The Commission says Facebook and Instagram have not done enough to prevent children under 13 from accessing their platforms, and that Meta’s systems for detecting and removing underage accounts are ineffective. According to EU estimates, around 10% to 12% of children under 13 in Europe are still using these services.

EU tech chief Henna Virkkunen said, “Instagram and Facebook are doing very little to prevent children below this age from accessing their services.” She stressed that rules must lead to real action, adding that “terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users, including children.”

Meta disputes the findings. The company says it already uses tools to identify and remove underage accounts, and it plans to introduce additional measures in the coming weeks. A Meta spokesperson described age verification as “an industry-wide challenge, which requires an industry-wide solution.”

The regulatory stakes are significant. Under the Digital Services Act, companies can face fines of up to 6% of their global annual turnover if violations are confirmed after the investigation process.

Separately, the EU is pushing member states to adopt a new age verification app. The tool would let users prove they meet minimum age requirements without sharing personal identity details, aiming to balance privacy with stronger protection for minors online.

Virkkunen said the system “will allow everybody to keep browsing the internet in full privacy while ensuring that children do not have access to content that is not meant for them.” It is also expected to be integrated into digital identity wallets across EU countries.

The move reflects a broader global push to regulate platforms such as TikTok and Meta more strictly on child safety and design risks. As enforcement intensifies, social media companies are under growing pressure to strengthen safeguards for minors, making child-safety regulation a defining issue for the industry.