The Hidden Energy Cost of New AI Models
How much energy does the newest version of ChatGPT consume? No one knows for sure, but one thing is certain: it’s a whole lot. OpenAI, the company behind the popular AI, has not released any official figures on its large language model's energy footprint. That has left it to academics to estimate the energy use per query, and their findings suggest it is considerably higher than that of previous models.
The Black Box of AI Energy Consumption
Currently, there are no mandates forcing AI companies to disclose their energy use or environmental impact. As a result, most do not offer those statistics publicly. As of May of this year, a staggering 84 percent of all large language model traffic was conducted on AI models with zero environmental disclosures.
Sasha Luccioni, climate lead at the AI company Hugging Face, expressed her frustration with this lack of transparency. “It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” she says. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere.”
While OpenAI's CEO, Sam Altman, has shared some figures suggesting a query consumes 0.34 watt-hours of energy, he omitted key details, such as which model these numbers refer to, and has offered no corroboration for his statements.
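Even Altman's unverified figure implies substantial totals at scale. A back-of-the-envelope sketch (the daily query volume below is a hypothetical assumption for illustration, not a disclosed figure):

```python
# Rough scale check on Altman's per-query figure.
wh_per_query = 0.34               # Altman's claimed figure; model unspecified
queries_per_day = 1_000_000_000   # hypothetical volume, for illustration only

daily_kwh = wh_per_query * queries_per_day / 1000  # watt-hours -> kilowatt-hours
print(f"{daily_kwh:,.0f} kWh per day")  # 340,000 kWh per day
```

At a billion queries a day, even that modest per-query number adds up to hundreds of megawatt-hours daily; the real total is unknowable without disclosed volumes and per-model figures.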
Academics Estimate a Massive Power Surge
Experts outside of OpenAI estimate that ChatGPT-5 may use as much as 20 times more energy than the first version of ChatGPT. “A more complex model like GPT-5 consumes more power both during training and during inference... I can safely say that it’s going to consume a lot more power than GPT-4,” Rakesh Kumar, a professor at the University of Illinois, recently told The Guardian.
While a query to ChatGPT in 2023 consumed about 2 watt-hours, researchers at the University of Rhode Island’s AI lab found that ChatGPT-5 can use up to 40 watt-hours of electricity for a medium-length response. On average, they estimate the model uses just over 18 watt-hours for such a response. This places ChatGPT-5’s energy consumption rate higher than nearly all other AI models they track.
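The "20 times" figure can be sanity-checked against these estimates with quick arithmetic (the numbers below are the third-party estimates quoted above, not official OpenAI disclosures):

```python
# Per-query energy estimates (watt-hours), as reported by outside researchers.
chatgpt_2023_wh = 2.0   # typical 2023-era ChatGPT query
gpt5_avg_wh = 18.0      # URI lab estimate: average medium-length response
gpt5_max_wh = 40.0      # URI lab estimate: upper bound

print(f"average increase: {gpt5_avg_wh / chatgpt_2023_wh:.0f}x")  # 9x
print(f"worst case:       {gpt5_max_wh / chatgpt_2023_wh:.0f}x")  # 20x
```

The 20x headline figure corresponds to the upper-bound estimate; the average estimate implies roughly a ninefold increase over the 2023 model.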
Calculating these rates is a significant challenge due to the industry's secrecy. “It’s more critical than ever to address AI’s true environmental cost,” said University of Rhode Island professor Marwan Abdelatti. “We call on OpenAI and other developers to use this moment to commit to full transparency by publicly disclosing GPT-5’s environmental impact.”
Consumers Foot the Bill for AI's Ambitions
As tech companies consume ever-increasing amounts of energy to power their AI ambitions, it's everyday consumers who are footing the bill for skyrocketing energy usage. The New York Times warns that electricity rates for individuals and small businesses could rise sharply as tech giants build more data centers.
Furthermore, Silicon Valley's backtracking on climate pledges to meet this demand will directly impact global communities, regardless of whether they use AI.
“We are witnessing a massive transfer of wealth from residential utility customers to large corporations—data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure,” Maryland People's Counsel David Lapp recently told Business Insider. “Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”