Non-Consensual Content

  • 5 Takeaways from CNBC’s Investigation into Nudify Apps and Sites

    In 2024, women in Minneapolis discovered that a male acquaintance had used the AI platform DeepSwap to create non-consensual, explicit deepfakes from their Facebook photos. The abuse affected more than 80 women and highlighted the growing threat of easily accessible "nudify" apps. Because existing laws proved insufficient, the victims are advocating for Minnesota legislation to curb these services. The deepfakes caused significant psychological distress. The case underscores the need for legal and ethical frameworks that address AI misuse while balancing innovation and individual protection. DeepSwap's origins remain unclear due to inconsistent information about the company.

    September 28, 2025