Instagram Head Adam Mosseri Navigates “Problematic Use” vs. Addiction Debate in High-Stakes Social Media Trial
In a pivotal moment during a high-profile social media trial, Instagram head Adam Mosseri addressed the complex issue of platform engagement, distinguishing between “problematic use” and clinical addiction. Mosseri said that while he believes social media use can sometimes be detrimental, he does not equate it with addiction, noting that he lacks the medical credentials to make such a distinction.
“I’m sure I said this, but I think it’s important to differentiate between clinical addiction and problematic use,” Mosseri remarked. He elaborated that the casual application of the term “addiction” can be misleading, drawing a parallel to personal habits like binge-watching a show. “So it’s a personal thing, but yeah, I do think it’s possible to use Instagram more than you feel good about. Too much is relative, it’s personal.”
Mosseri’s testimony took place in Los Angeles Superior Court, where Meta Platforms, Instagram’s parent company, and YouTube face allegations that they deliberately misled the public about the safety of their applications, particularly with respect to young users. The lawsuit asserts that specific design choices and features within these platforms have contributed to negative mental health outcomes. Notably, TikTok and Snap had previously settled with a plaintiff in the case, leaving Meta and YouTube as the remaining defendants in this trial.
A spokesperson for Meta commented on the proceedings, stating, “The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles.” The trial centers on a plaintiff identified as “KGM” and her mother, who allege that features such as infinite scroll fostered addictive behaviors and exacerbated her mental health challenges. Meta’s representative countered, “The evidence will show she faced many significant, difficult challenges well before she ever used social media.”
Plaintiff’s attorney Mark Lanier pressed Mosseri on the concept of social media addiction and the company’s decision-making processes. When questioned about problematic Instagram usage, Mosseri reiterated, “I think it depends on the person.” Lanier further probed Mosseri’s role as a “decision maker,” asking whether profit motives took precedence over rigorous product testing, particularly where child safety was concerned. Mosseri responded, “In general, we should be focused on the protection of minors, but I believe protecting minors over the long run is good for business and for profit.”
This trial is part of a broader wave of legal challenges examining the safety protocols of social media platforms and the extent of corporate knowledge regarding potential harms to children.
**Digital Filters and Mental Well-being: A Closer Look**
A key point of contention in the trial involved Instagram’s approach to digital filters, particularly those that mimic plastic surgery. An exhibit presented at trial displayed a November 2019 email exchange in which Meta executives discussed the potential ramifications of allowing filters that could alter facial appearances.
Mosseri indicated that the company ultimately opted against enabling digital effects that could promote cosmetic surgery, following discussions about what should be permissible given advances in digital makeup technology. The internal emails revealed concerns raised by the press and health experts that such filters could negatively affect mental health. Meta Chief Technology Officer Andrew Bosworth had alerted CEO Mark Zuckerberg to the plastic surgery filter, noting Zuckerberg’s concern about whether there was sufficient data on potential harm.
In another communication, former Meta executive John Hegeman suggested that a complete ban on filters that couldn’t be replicated with traditional makeup could impede the company’s competitiveness in Asian markets. He advocated for a “nuanced framework for responsible use” that would still permit the development of desirable products. Mosseri interpreted Hegeman’s sentiment as being driven by a desire for cultural relevance rather than direct financial gain, asserting that Meta does not generate revenue from filters but seeks to maintain engagement through cultural resonance.
Lanier presented an email where Mosseri was asked to choose from three options regarding the plastic surgery filters before a final decision by Zuckerberg. Option one proposed a temporary ban with a re-evaluation based on further well-being data, mitigating PR and regulatory risks but potentially limiting growth. Option two suggested lifting the ban while removing the filters from recommendations, acknowledging a “still notable risk to well-being.” The third option involved lifting the ban entirely, posing the lowest risk to growth but the highest risk to well-being and public perception. Mosseri favored option two, a choice that Lanier highlighted as involving a significant well-being risk. Margaret Stewart, then Vice President of Product Design and Responsible Innovation at Facebook, expressed her disagreement with Mosseri’s choice, advocating for a ban. Mosseri maintained that the company ultimately implemented a more “focused ban” on a subset of digital filters.
Explaining the company’s stance, Mosseri testified that digital filters cater to a minority of users seeking to enhance their posts for entertainment. He reiterated that these filters are not a direct revenue driver, stating, “We want to help people express themselves. But when it comes to revenue, that’s based on how many ads people see on Instagram. I haven’t seen any data that suggests using filters drives content consumption or ads. It’s not a revenue decision.”
Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/17327.html