Meta Fined $375M for Violating New Mexico Child Exploitation Laws

A New Mexico jury found Meta, parent company of Facebook and Instagram, liable for nearly $400 million in civil damages. The state accused Meta of failing to protect minors from online predators and violating consumer protection laws. The verdict followed evidence suggesting Meta knew its products harmed children and prioritized profits over safety. Meta plans to appeal, while the case moves to a phase addressing public nuisance and potential product design changes.


A New Mexico state court jury has found Meta, the parent company of Facebook and Instagram, liable for nearly $400 million in civil damages. The verdict stems from a trial where state officials accused the social media giant of failing to adequately protect minors from online predators on its platforms.

The civil trial, which commenced in Santa Fe, focused on allegations that Meta violated New Mexico’s consumer protection laws and misled residents about the safety of its flagship applications. The state’s attorney general initiated the lawsuit in 2023, following an undercover operation that involved creating a fake social media profile of a 13-year-old girl. According to the state, the profile was “inundated with images and targeted solicitations” from individuals engaged in child abuse.

Following deliberations, the jury determined that Meta had willfully violated the state’s unfair practices act, imposing damages calculated at $375 million based on the number of violations found. Attorneys representing New Mexico had urged the jury during closing arguments to consider a civil penalty that could exceed $2 billion, emphasizing the profound impact on vulnerable users.

In response to the verdict, a Meta spokesperson stated, “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.” Meta had previously denied the state’s allegations, asserting its “longstanding commitment to supporting young people.”

The state’s attorney general hailed the verdict as “a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.” He further asserted that “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”

The legal proceedings are set to continue on May 4th with the second phase of the trial, which will be conducted before a judge rather than a jury. This phase will address whether Meta created a public nuisance and should be compelled to fund public programs designed to mitigate the alleged harms. State legal counsel is also advocating for Meta to implement significant operational changes, including robust age verification systems, proactive removal of predators, and enhanced protection for minors engaged in encrypted communications.

During the trial, state attorneys presented internal company documents that reportedly detailed discussions among Meta employees about the implications of CEO Mark Zuckerberg’s 2019 decision to implement end-to-end encryption for Facebook Messenger. The documents suggested that the move could significantly impede the company’s ability to report an estimated 7.5 million instances of child sexual abuse material to law enforcement.

In remarks before the verdict, the state’s attorney general addressed Meta’s defense, which argued that the state had selectively presented information to cast the company in an unfavorable light and highlighted Meta’s ongoing efforts to enhance safety features across its applications. The attorney general expressed skepticism that the jury would be persuaded by Meta’s claims, arguing the company “should be held responsible for it.”

A key focus for the state, as articulated by the attorney general, is the potential for product design changes within New Mexico that could establish a precedent for broader adoption across the nation and globally. “One of the things that I am really focused on is how we can change the design features of these products, at least within New Mexico, and that would create a standard that could then be modeled elsewhere in the country, and, frankly, around the world,” he stated.

The state’s office has also pursued legal action against Snap, another social media company, with a similar lawsuit filed in 2024 that is currently in the discovery phase. Notably, the state’s legal team has successfully navigated Section 230 motions in both the Meta and Snap cases. The tech industry has historically relied on Section 230 of the Communications Decency Act to shield companies from liability for user-generated content. The New Mexico litigation represents a strategic shift, focusing on app design rather than content moderation, in an effort to overcome those protections.

Responding to Meta’s assertion that prosecutors were cherry-picking evidence, the attorney general countered, “What’s interesting is they accuse us of doing that, but all we’re doing is showing the world what they knew behind closed doors and weren’t willing to tell their users.”

This New Mexico case is part of a growing wave of social media-related litigation drawing parallels to the landmark Big Tobacco lawsuits of the 1990s. These cases share common allegations of companies misleading the public regarding the safety and potential harms associated with their products.

In parallel proceedings, a separate personal injury trial involving Meta and Google’s YouTube is underway in Los Angeles Superior Court. Jurors are deliberating whether design features implemented by these companies contributed to the mental distress and alleged addiction to social media apps of a plaintiff who was a minor at the time. Furthermore, a federal trial scheduled for later this year in the Northern District of California will address claims brought by numerous school districts and parents nationwide, alleging that the actions and platforms of Meta, YouTube, TikTok, and Snap have inflicted significant negative mental health consequences on teenagers and children.

Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/20103.html
