
AI Deepfakes Exploit Creator Content On Reddit

2025-05-17 · Maggie Harrison Dupré · 3-minute read
AI
Deepfakes
Content Theft

"I can't imagine I'm the first, and I'm definitely not the last." These are the chilling words of an OnlyFans creator grappling with a new, insidious form of digital violation.

The Unsettling Discovery of AI-Altered Content

An OnlyFans creator is raising alarms after discovering her photos were stolen and manipulated with deepfake technology to feature a completely new face. These altered images were then widely posted across Reddit.

Bunni, a 25-year-old OnlyFans creator based in the UK, told Mashable that while image theft is a frequent issue in her profession, this instance was uniquely disturbing. Typically, she explained, those who misuse her content share her images without any changes.

However, this particular scam was far more elaborate. Using deepfake tools, a scammer created an entirely new online persona named "Sofía." This fabricated individual, supposedly a 19-year-old in Spain, had Bunni's body but an AI-generated face.

"It was a completely different way of doing it that I've not had happen to me before," Bunni, who posted a video detailing the theft on Instagram back in February, shared with Mashable. "It was just, like, really weird."

A New Wave of Digital Impersonation

This incident is the latest example of a concerning trend in which "virtual influencers" use AI to superimpose fake faces onto the bodies of real models and sex workers. The deception is often used to sell non-existent subscriptions and defraud internet users.

How Scammers Leverage Platforms Like Reddit

Operating under the guise of "Sofía," the scammer inundated various Reddit forums with the faked images and accompanying commentary. Some posts were innocuous; "Sofía" would ask for fashion advice and, according to Mashable, even shared pictures of pets. Disturbingly, "Sofía" also posted images to r/PunkGirls, a subreddit dedicated to pornographic content.

While "Sofía" didn't publicly share links to another OnlyFans page, Bunni suspects the scammer's strategy involved engaging with targets through direct messages. In these private conversations, they might have shared an OnlyFans link or directly solicited money. Bunni managed to get the imposter account removed from Reddit by contacting moderators directly. However, her story highlights the alarming ease with which scammers can merge AI with stolen content to create and distribute convincing fakes.

The Growing Challenge of AI Generated Fakes

"I can't imagine I'm the first, and I'm definitely not the last, because this whole AI thing is kind of blowing out of proportion," Bunni conveyed to Mashable. "So I can't imagine it's going to slow down."

As Mashable points out, Bunni represented a somewhat ideal target: she has a fanbase, yet isn't so famous that her likeness would be instantly or widely recognized. For creators like Bunni, pursuing legal action presents significant hurdles. It is not only expensive, but the legal framework itself is still evolving to address deepfakes.

"I don't feel like it's really worth it," Bunni admitted to Mashable. "The amount you pay for legal action is just ridiculous, and you probably wouldn't really get anywhere anyway, to be honest."

Platform Responsibility and the Path Forward

Reddit did not respond when approached by Mashable for a statement on the matter. The platform's silence raises questions about accountability and about the measures in place to combat this kind of AI-driven exploitation.

More on deepfakes: Gross AI Apps Create Videos of People Kissing Without Their Consent
