In June of last year, Jessica Guistolise received a text message that would irrevocably alter her life.
The technology consultant was on a work trip in Oregon, dining with colleagues, when she received an urgent text from an acquaintance, Jenny, claiming to have crucial information about her estranged husband, Ben.
Later that night, Guistolise spoke with Jenny for nearly two hours, a conversation that left her in a state of shock and panic. Jenny revealed she had discovered deepfake pornography on Ben’s computer: images and videos of sexual activity generated by artificial intelligence, merging real photos with explicit content.
The images featured more than 80 women, all residents of the Minneapolis area, their social media photos exploited to create the disturbing deepfakes.
Jenny, using her phone, discreetly photographed the images found on Ben’s computer. These screenshots, some of which were reviewed by CNBC, revealed Ben’s use of a platform called DeepSwap. DeepSwap belongs to a category of “nudify” sites that have seen exponential growth since the advent of generative AI less than three years ago.
CNBC has chosen to omit Jenny’s surname to protect her privacy, and Ben’s surname has been withheld due to his assertion of mental health challenges. The couple is now divorced.
Guistolise described an immediate urge to cut her trip short and return home after speaking with Jenny.
In Minneapolis, the women’s experiences ignited a burgeoning resistance to AI deepfake technologies and their users. The incident has brought to light the ease with which these tools can be misused, and the devastating impact on victims.
Upon her return, Guistolise discovered that one of the manipulated photos utilized an image from a family vacation; another, from her goddaughter’s college graduation. Both images had been sourced from her Facebook page.
“The first time I saw the actual images, I think something inside me shifted, like fundamentally changed,” said Guistolise, 42.
CNBC conducted interviews with over two dozen individuals – victims, family members, legal representatives, sexual-abuse experts, AI and cybersecurity researchers, trust and safety employees within the tech industry, and lawmakers – to thoroughly investigate the workings of nudify websites and apps, and to comprehend their profound impact on individuals’ lives.
“It’s not something that I would wish for on anybody,” Guistolise emphasized.
Jessica Guistolise, Megan Hurley and Molly Kelley discuss the impact of AI-generated deepfake pornography with CNBC in Minneapolis, Minnesota, on July 11, 2025. The images, created from their faces with the AI site DeepSwap, were generated by mutual friend Ben without their consent.
Jordan Wyatt | CNBC
Nudify apps represent a niche, yet rapidly expanding segment within the burgeoning AI landscape, catalyzed by the emergence of OpenAI’s ChatGPT in late 2022. Since then, tech giants like Meta, Alphabet, Microsoft, and Amazon have collectively invested hundreds of billions of dollars in AI development, pushing toward the elusive goal of artificial general intelligence (AGI) – technology capable of surpassing human cognitive abilities.
For consumers, the initial excitement surrounding AI has largely centered on chatbots and image generators, tools that enable complex tasks through simple text prompts. We’ve also witnessed rapid expansion of the AI companion market, alongside a proliferation of AI agents designed to enhance productivity across diverse sectors.
However, victims of nudify apps are confronting the darker side of AI’s proliferation. Generative AI has democratized access to tools like DeepSwap, requiring no specialist technical knowledge. This accessibility fuels concerns that the underlying technology will spread and that more people will ultimately fall victim to its misuse.
Guistolise filed a police report and secured a restraining order against Ben. However, she and her friends soon realized the limitations of this approach.
Ben’s actions, disturbingly, may have been perfectly legal.
The women depicted were not minors. To their knowledge, the deepfakes had not been disseminated, existing solely on Ben’s personal computer. While the fear remained that the videos and images could exist on a server and fall into the hands of malicious actors, no evidence could pinpoint Ben’s role in such distribution.
Molly Kelley, a law student among the victims, assumed the responsibility of navigating the uncharted legal intricacies of AI-related offenses for the group.
“He did not break any laws that we’re aware of,” Kelley stated, highlighting the issue. “And that is problematic.”
Ben admitted to creating the deepfakes, and communicated to CNBC via email that he feels remorseful and ashamed of his conduct.
Jenny described Ben’s behavior as “horrific, inexcusable, and unforgivable” in an emailed statement.
“From the moment I learned the truth, my loyalty has been with the women affected, and my focus remains on how best to support them as they navigate their new reality,” she wrote. “This is not an issue that will resolve itself. We need stronger laws to ensure accountability – not only for the individuals who misuse this technology, but also for the companies that enable its use on their platforms.”
Readily Available
Experts report that numerous nudify services, like many other easy-to-use AI tools, actively advertise on Facebook and are available for download via the Apple App Store and the Google Play Store.
Haley McNamara, Senior Vice President at the National Center on Sexual Exploitation, emphasized the accessibility and ease of use that nudify apps have enabled: “It is now very easy to create realistic sexually explicit, deepfake imagery of a person based off of one photo in less time than it takes to brew a cup of coffee.”
Screenshots from Ben’s computer, shown on July 11, 2025, reveal photos taken from Molly Kelley’s and Megan Hurley’s Facebook profiles. The images were used without their knowledge or consent to generate fake pornographic images and videos on the DeepSwap AI platform.
A Meta spokesperson stated that the company has strict policies prohibiting ads that contain nudity and sexually explicit content. It also partners with other companies through an industry-wide child-safety initiative to share data on nudify services. Meta described the nudify sector as “adversarial” and noted that it continues to improve its underlying technology to prevent malicious actors from running such ads.
Apple communicated to CNBC that it routinely removes apps and rejects those that violate its app store guidelines against content flagged as offensive, misleading, overtly sexual, or pornographic.
Google declined to comment.
This problem is not limited to the U.S.
In June 2024, coinciding with the discovery in Minnesota, an Australian man received a nine-year prison sentence for generating deepfake content of 26 women. Reports also described an investigation of a school incident in which a teenager allegedly created and circulated deepfake content of about 50 female classmates.
“Whatever the worst potential of any technology is, it’s almost always exercised against women and girls first,” remarked Mary Anne Franks, Professor at the George Washington University Law School.
Researchers from the University of Florida and Georgetown University described in a research paper earlier this year how nudify tools are mimicking familiar consumer apps to boost adoption. DeepSwap charges users $19.99 monthly for “premium” services such as increased file sizes, higher resolution deepfakes, and faster processing times.
The researchers found that nudification platforms have gone “fully mainstream” and are being advertised even on Instagram and in app stores.
Guistolise knew that AI could be used to generate nonconsensual porn, but she did not fully appreciate how easy the tools were to use until she saw a synthetic version of herself engaged in explicit activity.
The screenshots taken from Ben’s DeepSwap account showcase rows of Minneapolis women’s faces, arranged like a school yearbook. Clicking on images, Jenny discovered, revealed AI-generated clones engaged in various sexual acts. The women’s faces had been seamlessly added onto the nude bodies of other women.
According to DeepSwap’s privacy policy, users can view their uploaded content for seven days, which is also how long DeepSwap stores the data on its servers in Ireland. The site says it deletes the data after that period but encourages users to download their creations to a personal device in the interim.
The site’s terms of service say users should not upload content that “contains any private or personal information of a third party without such third party’s consent.” The Minnesota women did not consent, and it is unclear whether DeepSwap enforces its terms.
DeepSwap did not respond to CNBC’s requests for comment.
Shown is DeepSwap, the AI service a man used to create fake porn of his friends and associates.
In a press release published in July, DeepSwap, then identifying itself as a Hong Kong company, called itself the best AI video generator. The media contact listed was marketing manager Shawn Banks.
CNBC was unable to find any further information about the individuals named in the release.
DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides a Dublin address, and states that its terms are “governed by and construed in accordance with the laws of Ireland.”
In July, however, the same page made no mention of Mindspark or Ireland, referring instead to Hong Kong.
Psychological trauma
Kelley, too, received an urgent text from Jenny about photos of her in Ben’s AI collection, and Jenny visited her at home that afternoon.
Kelley, six months pregnant at the time, spent much of the night going through the photos on Jenny’s phone, which showed her face “very realistically on someone else’s body, in images and videos.”
Kelley said the stress began to affect her health. Her doctor told her that her cortisol levels had spiked and that her body was struggling to produce insulin.
“I was not enjoying life at all like this,” commented Kelley, who also alerted law enforcement about the incident.
Kelley added that she recognized some good friends in the photos, including women working in the Minnesota service industry. She reached out to inform them, though the identities of some of the women depicted remain unknown.
“It was incredibly time consuming and really stressful because I was trying to work,” she said.
Law professor Ari Ezra Waldman warned that the psychological trauma from such incidents can include suicidal thoughts, self-harm, and a lasting inability to trust others.
Even without public distribution of the photos, Waldman said, women live with the fear that the images could be shared at any time.
“Everyone is subject to being objectified or pornographied by everyone else,” he said.
Three victims showed explicit AI-created deepfake images to CNBC during a July 11, 2025 interview.
Megan Hurley was vacationing last summer when she got a text from Kelley. Her vacation was over after that.
After returning to Minnesota, Hurley grew paranoid. She awkwardly asked a former boyfriend and others she knew to let her know if they ever came across AI imagery of her.
“I don’t know what your porn consumption is like, but if you ever see me, could you please screencap and let me know where it is?” Hurley said, describing the messages she sent. “Because we’d be able to prove dissemination at that point.”
Hurley contacted the FBI but was unable to reach anyone. She also filed an online crime report, which produced no follow-up. The FBI confirmed receipt of CNBC’s inquiry but did not provide a comment.
The women looked to the law for help. They were directed to Minnesota state Sen. Erin Maye Quade, whose earlier bill had criminalized the nonconsensual dissemination of sexual deepfakes in the state.
Kelley and the other women met virtually with Maye Quade in August 2024.
The women explained the legal system’s shortcomings, and in February Maye Quade introduced a bill that would impose financial penalties on AI companies offering nudify services in the state.
The bill, she said, would bring up to date existing laws written before such technology emerged.
“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said.
Minnesota state Sen. Erin Maye Quade discusses her bill, which would impose a financial penalty for generating explicit deepfake content, with CNBC’s Jonathan Vanian and Katie Tarasov in Minneapolis on July 11, 2025.
Jordan Wyatt | CNBC
Enforcement against services based overseas, Maye Quade said, is the immediate problem.
She said a federal response might work better, because national governments can deal with other countries in ways that smaller governments cannot.
Amid the mental and physical stress, Kelley never announced the birth of her child, and her meetings with the other women in late October became a blur.
“I never announced the birth of my second child,” Kelley said. “There’s plenty of people out there who have no idea that I had a baby. I just didn’t want to put it online.”
The early days of deepfake pornography
The ascent of deepfakes can be traced back to 2018, marked by the emergence of videos depicting former President Obama delivering speeches he never gave, and actor Jim Carrey substituting for Jack Nicholson in “The Shining,” both capturing viral attention.
Lawmakers expressed alarm, and sites such as Pornhub and Reddit responded by removing deepfake content.
The community congregated instead in places like MrDeepfakes, an online discussion site for that type of content.
By 2023, the MrDeepfakes site hosted 43,000 sexualized videos depicting 4,000 individuals, according to research from UC San Diego and Stanford.
Though MrDeepfakes centered on celebrity content, it also featured people with little online presence, and a black market emerged where users paid up to $87.50 for a custom sexual video.
Alexios Mantzarlis, an AI security expert at Cornell Tech, said upwards of 8,000 ads for the nudify service CrushAI had been discovered across Meta’s platforms.
AI apps like DeepNude and CrushAI are used to create pornographic content with photos taken from the Internet.
Emily Park | CNBC
One ad for DeepSwap was also detected on Instagram through the social media company’s ad library; it is believed to have come from an affiliate partner.
Affiliate partners such as ThePornDude help promote nudify services, Mantzarlis noted.
Nudify services draw 18.6 million monthly visitors, Mantzarlis reported, and he estimates the sector generates $36 million in annual revenue from AI-generated content.
MrDeepfakes abruptly shut down in May after a joint investigation by Canada’s CBC News, Danish news sites and Bellingcat exposed its key operators.
Since MrDeepfakes’ demise, experts say, the community has moved to the dark web and to Discord, where some servers allow users to share sexual content.
The 2025 University of Florida paper on nudification sites confirms this market’s activity; its authors focused in particular on Discord servers and the how-to guides people share there.
Discord declined to comment.
‘It’s insane to me that this is legal right now’
Congress passed the Take It Down Act, which bans the publication of nonconsensual sexual content, including content generated by AI.
However, experts told CNBC that because none of the material in this case was distributed online, the law does not address the situation the women in this story face.
Experts also worry that the Trump administration’s plan to bolster the AI sector could undermine states’ efforts to regulate the technology, including a previously defeated proposal for a moratorium on state AI laws.
“I would not put it past them trying to resurrect the moratorium,” Waldman said.
A White House official said the administration backed the Take It Down Act and believes federal law should take precedence over conflicting state laws.
The city of San Francisco has sued the operators of 16 nudify websites in civil cases alleging violations of California’s consumer protection laws.
Meta, for its part, sued Joy Timeline HK, the entity behind CrushAI, over deepfake ads that violated its policies.
Still, Mantzarlis’ research shows that companies continue to make money from deepfake videos.
Meta said it has removed thousands of such ads and sent cease-and-desist letters to other entities violating its policies.
Guistolise said she wants people to realize how easily AI can be turned against them, and the power the technology holds.
“It’s so important that people know that this really is out there and it’s really accessible and it’s really easy to do, and it really needs to stop,” said Guistolise. “So here we are.”
Survivors of sexual violence can seek confidential support from the National Sexual Assault Hotline at 1-800-656-4673.