
The Tragic Rise of AI-Powered Sextortion Scams

2025-07-18 · AFP · 3 minute read
AI
Cybercrime
Digital Safety

The tragic suicide of a Kentucky teenager earlier this year brought a dark digital threat into the spotlight. After his death, his parents uncovered a series of threatening texts demanding $3,000. The blackmailers' weapon was not a real photo, but a nude image of their son generated by artificial intelligence.

This heartbreaking incident is a stark example of a growing global crisis: sextortion scams targeting children, supercharged by the easy availability of "nudify" apps. These AI tools can digitally create sexualized or nude imagery from normal photos, providing ammunition for criminals.

The Alarming Rise of AI-Fueled Sextortion

Elijah Heacock, 16, was one of thousands of young victims of this digital blackmail. His father, John Burnett, told CBS News that the perpetrators are highly organized and relentless. "They don't need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child," he said.

The FBI has confirmed a "horrific increase" in these cases, noting that they have led to an "alarming number of suicides." The primary targets are often teenage boys between 14 and 17. The scale of the problem is vast, with a survey from the non-profit Thorn revealing that 6% of American teens have been a direct victim of AI-generated deepfake nudes.


A Widespread and Profitable Criminal Enterprise

Experts see a direct link between these AI services and financial sextortion. The British watchdog Internet Watch Foundation (IWF) noted that criminals no longer need to trick children into sending real intimate images. Instead, they can create fake images that are "convincing enough to be harmful" using generative AI. The IWF even uncovered a "pedophile guide" that explicitly teaches predators how to use these tools to blackmail children, with one author claiming to have successfully targeted 13-year-old girls.

This is a lucrative business. An analysis by Indicator, a publication investigating digital deception, found that just 85 websites selling these services could be worth up to $36 million annually. The report estimates that 18 of the most popular sites generated between $2.6 million and $18.4 million in just six months. Alarmingly, most of these sites operate using infrastructure from major tech companies like Google, Amazon, and Cloudflare.


Global Impact: From Schoolyards to Legislation

The abuse has spread worldwide, causing scandals in schools and universities as students use the technology to target their own classmates. A Save the Children survey in Spain found that one in five young people there have been victims of deepfake nudes. This led to Spanish prosecutors investigating minors for creating and distributing AI-generated pornographic content of classmates and teachers.

In response, governments are starting to act. The United Kingdom now criminalizes the creation of sexually explicit deepfakes, with penalties of up to two years in jail. In the U.S., the bipartisan "Take It Down Act" was signed into law, criminalizing the non-consensual publication of intimate images and requiring platforms to remove them.

The Uphill Battle Against Nudify Apps

Tech companies are also taking steps. Meta recently filed a lawsuit against the company behind a nudify app called Crush AI for violating its platform rules. However, researchers say the fight remains a challenging game of "whack-a-mole." Indicator described the operators of these sites as "persistent and malicious adversaries" that are difficult to shut down permanently.

Those in distress or experiencing suicidal thoughts can seek help and counselling by calling these helplines.
