AI-Generated Wildfire Images Spark Fear and Misinformation
The Deception of AI-Generated Wildfires
Dramatic images have recently circulated in Facebook posts, showing flames engulfing coniferous forests, water bombers overhead, and seemingly powerless, toy-sized firefighters below. The posts, which garnered hundreds of likes and shares, were entirely fake.
As Canada endures its second-worst wildfire year on record, BC’s wildfire service has highlighted these images as clear examples of AI-generated misinformation. This phenomenon has grown as AI image generators become more accessible and financial incentives to create viral content persist.
Jean Strong, a digital communications officer with the BC Wildfire Service, warned that these pictures have real consequences. They fail to accurately represent the terrain, the size of a fire, or its behavior. When people on social media mistake these false images for reality, it can lead to everything from unnecessary panic to dangerous interference with official wildfire suppression efforts.
"We don't appreciate [AI images] stoking fear or anxiety more than it already is in an emergency scenario, where people are nervous," said Strong.
Unmasking the Source of Misinformation
A brief look at the Facebook page where the images appeared reveals that dramatizing disaster is a successful strategy for online engagement and profit. The photos were originally posted on July 31 to a page attributed to "Joemar Sombero," which boasts around 45,000 followers. The page's description states it is dedicated to "breaking down the world’s wonders & dangers with AI-powered storytelling."
Pages on Facebook, unlike personal profiles, are designed for businesses and can be monetized. A page's owner can earn money through engagement and can even set up a subscription service; the Sombero page charges subscribers $1.29 per month for exclusive content. To maximize earnings on Meta's platforms, creators generally need a large following (typically over 10,000) and must post frequently. While content creators are allowed to use AI, they are expected to label AI-generated content as such.
A Broader Pattern of AI-Generated Disasters
The wildfire images are not an isolated incident. The "Joemar Sombero" page frequently posts dramatic, AI-generated images of various natural disasters. For instance, a post about an earthquake in Algeria was illustrated with a fabricated image of a massive crack in the earth with explosions. The creator commented, "Mother Earth keeps reminding us how fragile we are … Stay safe Algeria. Prayers for everyone affected."
Similarly, a post about a mandatory evacuation in Dare County, North Carolina, used an AI image of a flooded highway struck by immense waves. The accompanying comment urged people to evacuate, stating, "this looks like one of the strongest storms we've ever seen…25 ft. waves is beyond terrifying." Although the county was under an evacuation order due to Hurricane Erin, the image was a complete fabrication. The page owner did not respond to requests for comment, but in post comments the creator defended the use of AI images and told followers to consult official sources for information.
The Economics of AI-Powered Misinformation
Phil Newell of Climate Action Against Disinformation (CAAD) explained that this Facebook page exemplifies how AI helps content creators exploit the monetization structures of social media. "It's an example of how Big Tech has just made it so much easier to use lies to convince somebody to click on your thing, and that gets you some ad money," he said. Even if the AI images are not perfect, they offer a cheap and easy way for creators to produce large volumes of emotionally charged content that drives attention and clicks.
Michael Khoo, a co-founder of Upshift Strategies, noted that profiting from misinformation around natural disasters is not new, but AI has drastically changed the scale. It has lowered the cost of producing this type of content to nearly zero, making it easier than ever to spread. "It's just so much more spam on a colossal new level," Khoo concluded.