AI Tributes Gone Wrong: Rod Stewart's Ozzy Video
The AI Debate Lands on the Concert Stage
Conversations about the ethics of AI often feel unwieldy and impossible to contain. From debates over ChatGPT simplifying our work to fears of a slippery slope, the subject is complex and divisive. That same apprehension has now surfaced in the music world, following a controversial tribute to Ozzy Osbourne by fellow rock star Rod Stewart.
During a recent concert, Stewart faced a wave of online criticism after screening a bizarre AI-generated video honoring the late Black Sabbath frontman, proving that the ethical dilemmas of AI are no longer just theoretical.
What Did Rod Stewart's AI Tribute Show?
The video, shown at a concert in Alpharetta, Georgia, featured an AI-generated Osbourne holding a selfie stick in a heavenly setting. He was depicted posing and laughing alongside other deceased celebrities, including Tupac, Michael Jackson, and Freddie Mercury, all rendered in the uncanny, airbrushed slow-motion typical of AI content. The implication was clear: these music legends were taking selfies together in the afterlife.
The reaction was swift and harsh. One person who shared the clip called it the “craziest, most disrespectful s*** I ever saw in my LIFE!!!” Another user drew a sharp analogy, comparing it to “making a video of your dead grandma breakdancing in heaven with Princess Diana and putting it on a giant screen... without your knowledge or permission.”
A Question of Taste and Technology
Beyond the ethical concerns, many found the tribute aesthetically distasteful. The execution was so poor that the video looked cheap and repetitive, with none of the prestige a tribute demands. It felt less like a heartfelt homage and more like an 80-year-old showing off a shiny new toy. The tackiness raises an obvious question: what happened to the simple, poignant black-and-white “in memoriam” photo? Do we truly need to see our heroes digitally resurrected in a heavenly photo-op?
This sentiment echoes the discomfort felt with other posthumous CGI recreations, such as Peter Cushing in Rogue One: A Star Wars Story and Harold Ramis in Ghostbusters: Afterlife. In many cases, these gimmicky, Frankenstein-like resurrections feel superfluous when more respectful ways to pay homage exist.
The Unsettling Ethics of Digital Resurrection
The core of the issue lies in the ethics of not letting the deceased rest in peace. Seeing these figures put to work again dredges up unsettling questions about star exploitation, which can now continue long after death. AI allows creators to make a digital puppet say or do anything—a dream for executives, but a potential nightmare for the stars themselves. Actor Susan Sarandon has warned against this, fearing AI could make her “say and do things I have no choice about.”
While surviving family members might be able to provide input, there's no way to know whether the person themselves would have consented. Would Ozzy Osbourne, the self-proclaimed Prince of Darkness, really have wanted to be memorialized beaming in the clouds with a selfie stick? The video's inclusion of figures like Whitney Houston and Aaliyah, both of whom suffered traumatic exploitation in the industry, feels particularly cruel. Shouldn't they be allowed to rest?
Including Michael Jackson is also deeply problematic: it gives him a renewed platform and stirs up complicated feelings for those who have accused him of sexual abuse.
A Call for Universal AI Literacy
It seems unlikely that Stewart or his team considered these deep emotional and ethical questions when creating the video. This leads to a crucial conclusion about the AI debate: since these tools are becoming ubiquitous, we must educate ourselves on their responsible use. The classic adage, "just because we can, doesn't mean we should," has never been more relevant.
And while many focus on guiding younger generations, Stewart’s misstep shows that everyone, regardless of age, needs to engage in these complex conversations. Understanding the social, economic, and moral implications of AI is one of the most important challenges of our time.