CSAM

  • Child Safety and Privacy: Meta and Apple Under Fire

    Tech giants Meta and Apple face landmark lawsuits in California, New Mexico, and West Virginia concerning child safety. CEOs Mark Zuckerberg and Tim Cook are being questioned about user privacy, free expression, and platform safety. Internal Meta documents reveal concerns about child sexual abuse material (CSAM) reports following the implementation of end-to-end encryption. West Virginia is also suing Apple over its handling of CSAM on devices and iCloud. These legal battles highlight the growing debate on tech companies’ responsibilities for user welfare.

    February 20, 2026
  • Apple Faces Lawsuit Over Alleged Child Safety Lapses in West Virginia

    West Virginia is suing Apple, alleging the tech giant has failed to prevent child sexual abuse material (CSAM) on its devices and iCloud. The lawsuit claims Apple prioritized privacy over child safety, unlike competitors that use detection systems. Apple previously abandoned plans for CSAM detection after a privacy backlash, and its approach continues to draw criticism. The state seeks damages and mandated CSAM detection measures, while Apple maintains its commitment to child safety through existing features.

    February 19, 2026
  • Malaysia and Indonesia Ban Elon Musk’s Grok Over Obscene, Non-Consensual Content

    Malaysia and Indonesia have temporarily blocked Elon Musk’s AI chatbot, Grok, over concerns about its alleged misuse in generating explicit material and child sexual abuse material. Regulators cited X Corp’s “repeated failures” to address these risks. The AI’s image generation feature, Grok Imagine, has reportedly been used to create non-consensual explicit images, including CSAM, and to digitally alter religious attire. Although xAI restricted image generation to paying subscribers, authorities deemed X’s responses insufficient and imposed the block as a preventive measure. Other nations and bodies are also investigating Grok.

    February 13, 2026