How AI Deepfake Romance Scams Devastate Victims
A Fan's Dream Becomes a Financial Nightmare
For Abigail Ruvalcaba, a message on Facebook from a handsome soap opera actor she had admired for years felt like a dream come true. The actor, with his rugged looks and charming smile, quickly won her over. She dismissed any initial doubts, and soon they were talking on the phone, sharing videos where he professed his love, and making plans to buy a beach house to start their life together.
But this was no fairytale romance. Abigail wasn't communicating with “General Hospital” star Steve Burton. She was talking to a sophisticated scammer who used the actor's likeness to swindle her. The scheme became so convincing that it ultimately led Abigail to sell her home to send money to the fraudster, costing her nearly everything.
The Alarming Rise of AI-Powered Romance Scams
While using promises of love to cheat the lonely is an age-old crime, the rise of artificial intelligence and deepfake technology has given scammers terrifyingly powerful new tools. They can now convincingly impersonate almost anyone with a significant online presence, especially celebrities whose voices and images are widely available.
This trend is growing at an alarming rate. According to the Federal Trade Commission, nearly 65,000 people reported being victims of romance scams in 2023, with total losses hitting an astounding $1.14 billion. Experts confirm that scams involving celebrity impersonations are becoming increasingly common.
“Even if you don’t want a Cinderella story, you can’t deny that a Cinderella story would be nice,” explains Ally Armeson, executive director of the nonprofit FightCybercrime.org. “I would be hard-pressed to point to a person that wouldn’t want to be adored by a celebrity.”
This isn't an isolated problem. Last year, YouTube removed thousands of AI-generated videos showing celebrities like Taylor Swift and Joe Rogan pitching scams. Other high-profile cases have emerged, with victims losing life savings to criminals pretending to be Keanu Reeves and Brad Pitt.
Celebrities Speak Out Against Impersonators
The real Steve Burton was completely unaware of the scam but has encountered numerous fans who believed they were in communication with him. He issued a public warning on Facebook, stating, “I get a thousand messages a day and 100 of them are people who think they’re talking to me on other apps… my agent, my manager, my publicist, nobody will be reaching out to you. Please be careful. You are not speaking to me anywhere unless I message you back from my Instagram @1steveburton.”
Other public figures, like “Family Feud” host Steve Harvey, have expressed deep concern over the misuse of their likeness. “My concern now is the people that it affects,” Harvey told CNN. “I don’t want fans of mine or people who aren’t fans to be hurt by something.” In response to this growing threat, a bipartisan group of lawmakers introduced the NO FAKES Act to protect individuals from unauthorized AI-generated recreations of their voice and likeness.
How a Deepfake Video Sealed the Deception
By the time Abigail Ruvalcaba understood she was trapped in an elaborate AI-bolstered scam, the damage was done. “I was in a fantasy world. He had an answer for everything,” the 66-year-old said. “I’m devastated, obviously, and I feel stupid. I should have known better.”
[Photo: Vivian Ruvalcaba outside her parents' condominium in Harbor City. Her mother, Abigail, was the victim of a celebrity romance scam. (Christina House / Los Angeles Times)]
The scammer began asking for money, first for supposed management fees and then for funds to buy their shared home. Over several months, Abigail sent $81,000 through Bitcoin, gift cards, and cash transfers. To keep the illusion alive, the scammer sent her a deepfake video of Burton professing his love. The 11-second clip, likely created by altering Burton's original warning video, featured manipulated audio and mouth movements.
“I love you so much, darling,” the deepfake video said. “I hope this puts a smile on your heart. Know that nothing will ever make me hurt you or lie to you, my queen.” While the audio was slightly robotic and the actor's mouth looked airbrushed, it was convincing enough to fool someone emotionally invested in the relationship.
How to Spot a Deepfake and Protect Yourself
Experts stress that while AI technology has become incredibly convincing, there are red flags to watch for in deepfake videos:
- Unnatural Movements: Look for strange eye movements, awkward facial expressions, or hair that looks too perfect.
- Visual Inconsistencies: Pay attention to abnormal skin tones, strange lighting, or shadows that don't match the environment.
- Mismatched Audio and Emotion: A disconnect between the words being spoken and the person's facial expression can be a major warning sign.
To protect yourself, run a reverse image search on any photos you receive, or on still frames captured from videos, to see whether they have been altered or lifted from somewhere else online. Above all, maintain a healthy dose of skepticism.
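For the technically inclined, one rough do-it-yourself check is a perceptual hash comparison: if a frame from a video you received looks nearly identical to a celebrity's known public post, the clip may be a doctored copy, as in the Burton case. The short Python sketch below is a similarity heuristic, not a deepfake detector; it assumes the third-party Pillow and imagehash libraries are installed, and the file names are hypothetical placeholders.

```python
# Compare a frame from a suspicious video against a known-authentic image
# using perceptual hashing. Requires two third-party packages:
#   pip install Pillow imagehash
# File names below are hypothetical placeholders.
from PIL import Image
import imagehash

# Perceptual hashes change little under re-encoding or mild edits, so a
# small Hamming distance suggests the suspicious frame was derived from
# (a possibly manipulated copy of) the original post.
original = imagehash.phash(Image.open("celebrity_original_post.jpg"))
suspect = imagehash.phash(Image.open("frame_from_received_video.jpg"))

distance = original - suspect  # Hamming distance between 64-bit hashes
print(f"Hash distance: {distance}")

if distance <= 10:  # threshold chosen loosely; tune for your use
    print("Visually similar: the video may be a doctored copy.")
else:
    print("No close match: inconclusive; verify through official channels.")
```

A small hash distance only means the two images look alike; treat any result as one more reason to verify through official channels, never as proof either way.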
“If a celebrity that you admire slides into your DMs, the first thing to assume is that it’s a scam,” advises Iskander Sanchez-Rola of Norton Research Group. “And if the celebrity sends you a love note asking for money, stop. That’s not love. That’s a deepfake.”
The Heartbreaking Aftermath and a Family's Fight
Abigail's daughter, Vivian, grew suspicious after learning her mother was buying large gift cards for a “friend.” When confronted, Abigail confessed to the affair, convinced she was talking to the real Steve Burton. “She said, ‘I know him. He’s Steve Burton. How are you gonna tell me that’s not him? It’s his voice. It’s his face,’” Vivian recalled.
Tragically, the family soon discovered that the scammer had convinced Abigail to sell her condominium, which she had owned since 1999, for a price far below its market value. Vivian managed to step in and cancel a final $70,000 transfer from the sale, but the home was already gone.
“That home was supposed to be their security in their golden years,” Vivian said. “Now it’s gone.” The family has filed a lawsuit to stop the transfer of the property and is now fighting to keep her parents' home and recover what was lost in this devastating deepfake scam.