Meta Accused of Violating EU Law by Failing to Keep Minors Off Facebook and Instagram

The European Commission has preliminarily found that Meta, parent of Facebook and Instagram, violated EU law by failing to adequately protect minors online. Its investigation suggests Meta did not effectively prevent children under 13 from accessing its platforms, citing easily bypassed age verification and cumbersome tools for reporting underage accounts. Meta disputes these findings, emphasizing its efforts to combat underage usage. The company faces potential fines of up to 6% of its global turnover if the findings are upheld.

The European Commission has taken a significant step in its intensifying scrutiny of Meta Platforms, the parent company of Facebook and Instagram, by concluding that the tech giant has violated EU law regarding the protection of minors online. This preliminary finding centers on Meta’s alleged failure to adequately prevent children under the age of 13 from accessing its flagship social media platforms.

In its announcement on Wednesday, the Commission stated that initial investigations revealed Meta’s non-compliance with the EU’s Digital Services Act (DSA). A key contention is the perceived inadequacy in enforcing the minimum age requirement of 13 for both Instagram and Facebook. The Commission pointed out that users can easily circumvent this rule by entering a false birth date during account creation, with a lack of robust verification mechanisms in place.

Furthermore, the Commission highlighted issues with Meta’s tools for reporting accounts suspected of belonging to minors. The investigation found these tools “difficult to use,” requiring up to seven clicks to reach the reporting form. Even when a minor’s account is flagged, the Commission observed a consistent lack of thorough follow-up or effective measures to remove underage users from the platforms.

“The Commission considers that Instagram and Facebook must change their risk assessment methodology, in order to evaluate which risks arise on Instagram and Facebook in the European Union, and how they manifest,” the Commission stated in its formal announcement, signaling a call for a fundamental shift in Meta’s approach to identifying and mitigating online harms.

A Meta spokesperson responded to the Commission’s preliminary findings, stating, “We disagree with these preliminary findings. We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age.” The spokesperson emphasized Meta’s ongoing commitment to combating underage usage, noting, “We continue to invest in technologies to find and remove underage users and will have more to share next week about additional measures rolling out soon. Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue.” The statement reflects Meta’s position that age verification is a complex responsibility shared across the industry.

Meta is now afforded the opportunity to review the Commission’s preliminary investigation findings and submit a written response. The outcome of this process could have significant financial repercussions for Meta. If the Commission’s preliminary findings are upheld following its final investigation, Meta could face fines amounting to up to 6% of its total worldwide annual turnover, a substantial penalty that underscores the seriousness of the regulatory concerns.

This development comes on the heels of two notable U.S. court rulings in March that have cast a shadow over Meta’s operations and platform design. One ruling determined that certain aspects of its platform design contributed to addiction and mental health issues among teenagers, while another concluded that the company had misled users concerning the safety measures in place for children on its services. These legal challenges, both in the U.S. and now potentially in the EU, signal a growing global trend of increased accountability for social media giants in safeguarding their younger users and mitigating the negative societal impacts of their platforms.

The core of the Commission’s concern, and the broader challenge for Meta, is the balance between user privacy, data security, and effective age-gating. While Meta points to its investment in technological solutions, regulators and courts are demanding concrete, verifiable evidence that those measures work. This pressure, alongside advances in AI-based age estimation and user behavior analysis, marks a pivotal moment for Meta as it navigates an increasingly stringent regulatory environment.

Photo: Meta CEO and Chairman Mark Zuckerberg arrives at Los Angeles Superior Court ahead of the social media trial to determine whether social media giants deliberately designed their platforms to be addictive to children, in Los Angeles, on Feb. 18, 2026.


Original article, Author: Tobias. If you wish to reprint this article, please indicate the source:https://aicnbc.com/21148.html
