EU Takes Aim at TikTok, Instagram for “Addictive Design” Hooking Children

The EU is intensifying its regulation of social media giants like TikTok and Instagram to combat “addictive design” features. Measures target endless scrolling, autoplay, and push notifications. The EU is also scrutinizing Meta for age verification enforcement and investigating platforms for potentially harmful content. An EU-developed age verification app is planned for integration into digital wallets. This push follows a US court ruling and Meta’s violation of the Digital Services Act, reflecting a global trend towards stricter child online safety measures.

The TikTok app logo is seen in this photo illustration taken in Warsaw, Poland, on November 18, 2024.

Nurphoto | Getty Images

The European Union is intensifying its regulatory efforts against social media giants, signaling a proactive stance against what it terms “addictive design” features found on platforms like TikTok and Instagram. This move aligns with a growing global trend of governments seeking to bolster protections for children against the potential harms of social media consumption.

EU Commission President Ursula von der Leyen announced on Tuesday at the European Summit on Artificial Intelligence and Children in Denmark that the bloc would be taking regulatory action against specific platform features later this year. These measures are designed to curb the exploitative design patterns that can foster compulsive usage, particularly among younger users.

The focus areas identified by von der Leyen include features such as endless scrolling, autoplay video content, and persistent push notifications, elements believed to contribute significantly to the addictive nature of these platforms. Furthermore, the EU is scrutinizing Meta's platforms, specifically Instagram and Facebook, over their alleged failure to rigorously enforce their own minimum age requirement of 13. The Commission is also investigating platforms that may lead children into "rabbit holes" of harmful content, including material that promotes disordered eating or self-harm.

In a significant technological development, the EU’s executive arm has developed a proprietary age verification app, which von der Leyen lauded for adhering to “the highest privacy standards in the world.” This app is slated for integration into member states’ digital wallets, enabling straightforward enforcement by online platforms. “No more excuses – the technology for age-verification is available,” von der Leyen asserted, emphasizing the readiness of solutions to address age-related safeguarding concerns.

The European Commission anticipates having a legal proposal ready as early as the summer, contingent upon the advice and findings from its ‘Special Panel of experts on Child Safety Online.’ This expedited timeline underscores the urgency with which the EU is approaching this issue.

Navigating Regulatory Headwinds: The EU’s Stance on Big Tech

The EU’s increased regulatory posture towards major U.S. technology firms is a continuation of its efforts over the past year to enhance accountability among these digital behemoths. This has resulted in a series of substantial fines, which have drawn criticism from some U.S. officials who argue that such stringent enforcement could hinder the bloc’s participation in the burgeoning AI economy.

The escalating penalties levied against U.S. businesses, totaling over $7 billion in the last two years, have become a point of contention. President Donald Trump has voiced opposition to these penalties, viewing them as potentially detrimental to American innovation and economic competitiveness.

Companies such as Apple, Meta, and Google are among those that have faced significant fines for alleged violations of the EU’s antitrust and competition laws. These tech giants have, in turn, contested these rulings, leading to protracted legal battles and ongoing negotiations.

In a significant move to counter foreign regulatory actions perceived as unfair, President Trump signed a memorandum in February. This directive signals an intent to consider implementing tariffs to “combat digital service taxes (DSTs), fines, practices, and policies that foreign governments levy on American companies.” This indicates a potential escalation in trade tensions related to digital regulation.

Earlier this year, the EU Commission initiated an investigation into Elon Musk’s X, formerly Twitter, concerning the proliferation of sexually explicit non-consensual content generated by its AI chatbot, Grok. This investigation highlights the EU’s commitment to addressing the ethical implications and potential misuse of advanced AI technologies, particularly when they intersect with issues of child safety and exploitation.

Tracking Europe's approach to social media bans for teenagers

The heightened scrutiny on child safety within social media platforms arrives on the heels of a significant legal setback for Meta and YouTube in the U.S. In March, a high-profile court ruling determined that certain design features, including infinite scrolling and autoplay, contributed to addiction and adverse mental health outcomes among teenagers. This ruling lends considerable weight to the EU’s arguments regarding the detrimental effects of platform design.

Adding to this regulatory pressure, the EU Commission recently concluded that Meta violated the EU’s Digital Services Act. The preliminary investigation found that Meta failed to adequately prevent individuals under the age of 13 from accessing its platforms, with evidence suggesting that minors can readily circumvent existing age verification mechanisms. This breach of the Digital Services Act underscores the challenges in effectively safeguarding younger users online.

Concurrently, the concept of implementing social media bans for individuals under 16 is gaining momentum across the globe. Australia set a precedent by becoming the first nation to enforce a comprehensive ban in December. Several European countries, including Spain, France, and the U.K., are actively proposing similar legislation. These initiatives reflect a growing international consensus on the need for stricter controls to protect minors from the pervasive influence and potential risks associated with prolonged social media engagement.

Original article, Author: Tobias. If you wish to reprint this article, please indicate the source:https://aicnbc.com/21628.html
