
AI's Surprising Energy Use: A Complex Reality

2025-06-02 · Radhika Rajkumar · 17-minute read
AI Energy
Sustainability
Tech Impact

AI is becoming an inescapable part of our lives, integrated into smartphones, search engines like Google, and various work tools. While AI promises enhanced ease and productivity, a crucial question arises: what is the environmental cost of a simple chatbot query?

With the increasing adoption of AI, its energy demands are also surging. AI systems are built on high-computation frameworks that require vast amounts of data. This data is housed in extensive networks of computers called data centers. Similar to personal computers, these massive data centers consume significant electricity. The process of training AI models also demands more computational power than traditional computing tasks.

[Image: Rob Dobi/Getty Images]

Also, learn about an AI tool that can show how much energy a single chatbot prompt uses.

Considering our daily energy use from office lighting and laptops to social media, how does AI's consumption truly stack up? Can the resource requirements of this technology be reduced or optimized over time? Is the purported time savings worth the additional emissions? And what should you understand about your own AI footprint?

We've consulted with experts and researchers to clarify AI's actual energy usage, address your sustainability concerns, and offer actionable tips.

Understanding Data Centers and AI's Thirst for Power

AI demands more resources to operate than other technologies. The sheer volume of data AI systems process and the computational power needed to run them distinguish AI from simpler computing tasks. An AI system functions like a synthetic brain, needing billions of data points to identify patterns. This is why models trained on more data, with more parameters to absorb it, often perform better; for instance, an image model trained on four billion cat images will likely produce a more realistic cat than one trained on only 100 million.

All this knowledge needs a physical home. What's often referred to as "the cloud" isn't an ethereal storage space but a tangible data center—a large campus housing vast computer networks that process, store, and run complex queries on massive datasets.

Read more about how AI data centers are becoming 'mind-blowingly large'.

While these large computing facilities have always existed, primarily for enterprise cloud services, they are now in higher demand than ever due to the intensifying AI race and the increasing affordability and accessibility of AI tools.

"You have big companies that have been managing those as real estate assets," said John Medina, an SVP at Moody's. "Everyone only needed a little bit; they didn't need a ton of capacity."

Now, he stated, the pressure is on to serve a rapidly expanding customer base.

This demand is escalating energy use. The more parameters a model has, the more compute it requires, explained Vijay Gadepally, a senior staff member at MIT's Lincoln Laboratory and CTO at Radium, an AI infrastructure company. "You need more computing just to even store the model and be able to process it."

With AI investment accelerating, data center growth shows no signs of slowing. Shortly after taking office in January, President Donald Trump announced Project Stargate, a $500-billion initiative backed by companies like OpenAI, Softbank, and Oracle, to construct "colossal," 500,000-square-foot data centers. Most of this infrastructure, however, is being built by hyperscalers, a small but dominant group that includes Microsoft, Google, Meta, and AWS.

Explore why the future of computing must be more sustainable, even as AI demand fuels energy use.

However, Medina pointed out that the hype cycle might be overstating how much data center expansion is specifically for AI. "When we talk about hyperscalers, large data centers, AI data centers, we get confused. Most of it is for the cloud," he said, referring to services like storage and data processing. He mentioned that despite the buzz, data centers are processing a relatively small number of AI-related tasks.

Nevertheless, the AI boom is changing baseline standards, making direct comparisons difficult. "In the past, you didn't have a huge need like this. Four megawatts were considered hyperscale," Medina said. "Now, 50, 100 megawatts is that minimum."

Gauging AI's Energy Consumption: A Growing Concern

As Sasha Luccioni, Ph.D., AI and climate lead at Hugging Face, acknowledged in a recent op-ed, we still lack precise figures on AI's energy consumption because few companies publicize their usage data.

However, several studies indicate a rise in energy consumption driven by increasing AI demand. A 2024 Berkeley Lab analysis found that electricity consumption has grown exponentially alongside AI in recent years. Shipments of GPU-accelerated servers -- hardware built for AI workloads -- began multiplying in 2017. A year later, data centers accounted for nearly 2% of total annual US electricity consumption, a figure growing by 7% annually. By 2023, that growth rate had jumped to 18%, and it is projected to reach as high as 27% by 2028. Even if we can't isolate AI's exact share of data center energy, the correlation between increased consumption and AI expansion is clear.

Learn how your inefficient data center hampers sustainability - and AI adoption.

Boston Consulting Group estimates that data centers will represent 7.5% of all US electricity consumption by 2030, equivalent to 40 million US homes.

Mark James, interim director of the Institute for Energy and the Environment at Vermont Law and Graduate School, provided another comparison. A large facility at full capacity draws 1,000 megawatts -- "the same size as the peak demand of the state of Vermont -- 600,000+ people -- for months," he noted.

Currently, global data centers use about 1.5% of the world's electricity, similar to the entire airline industry. This figure is likely to be surpassed; an April 2025 IEA report found that global data center electricity use has increased by 12% annually since 2017, "more than four times faster than the rate of total electricity consumption." Data centers, directly or indirectly driven by AI, are increasingly prominent in the global energy landscape, even as other energy usage remains relatively stable.

For some, this is a cause for alarm. "This is going to be a carbon problem very quickly if we're scaling up power generation," Gadepally warned.


Others aim to contextualize these figures. While AI is evidently driving up energy costs, research also shows that global energy consumption, in general, is rising. Newer data centers and GPUs are also more energy-efficient than older models, potentially generating relatively less carbon. "These 100, 200-megawatt massive builds are using the most efficient technology -- they're not these old power guzzlers that the older ones are," Medina said. Even as data centers proliferate, their projected consumption curve might level out due to modern technology.

Within AI energy use, not all AI types have the same footprint. While energy consumption data for proprietary models from companies like OpenAI and Anthropic isn't public (unlike open-source models), generative AI—especially image generation—appears to use more compute and thus create more emissions than standard AI systems.

An October 2024 Hugging Face study of 88 models found that generating and summarizing text consumes over 10 times the energy of simpler tasks like image and text classification. It also found that multimodal tasks, where models use image, audio, and video inputs, are "on the highest end of the spectrum" for energy use.

The AI Water Footprint: More Than Just Energy

Regarding specific comparisons, research on AI's resource use varies widely. One study determined that asking ChatGPT to write a 100-word email uses an entire bottle of water—a claim that has rapidly spread on social media.

But is it accurate?

"It's possible," said Gadepally. He noted that GPUs generate substantial heat; even with other cooling methods, they still require water cooling. "You're using something like 16 to 24 GPUs for that model that may be running for 5 to 10 minutes, and the amount of heat that's generated, you can start to kind of do the math," he said.
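Gadepally's invitation to "do the math" can be sketched with illustrative numbers. The per-GPU power draw and the water-usage effectiveness (liters of cooling water per kWh of heat rejected) below are assumptions, not figures from the article; only the GPU count and runtime ranges come from his quote.

```python
# Rough sketch of the heat-to-water math Gadepally describes.
# GPU_POWER_KW and WUE_L_PER_KWH are illustrative assumptions, not article figures.
GPU_POWER_KW = 0.7        # assumed draw of one data-center GPU under load
WUE_L_PER_KWH = 1.8       # assumed water-usage effectiveness (liters per kWh)

gpus = 20                 # midpoint of the "16 to 24 GPUs" range quoted
minutes = 7.5             # midpoint of the "5 to 10 minutes" range quoted

energy_kwh = gpus * GPU_POWER_KW * minutes / 60   # electricity, which becomes heat
water_liters = energy_kwh * WUE_L_PER_KWH         # water to carry that heat away
print(f"{energy_kwh:.2f} kWh -> {water_liters:.2f} L of cooling water")
```

Under these assumed figures the run lands in the low single digits of liters, which is why a bottle-of-water-per-email claim is "possible" but sensitive to the assumptions.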

These systems don't use just any water; they need clean, high-quality, potable water. "These pipes, they don't want to clog them up with anything," Gadepally explained. "Many data centers are in areas with stressed watersheds, so that's something to keep in mind."

New methods like immersion cooling, where processors are submerged in a liquid like mineral oil, show promise for reducing water use and energy consumption compared to methods like fans. However, the technology is still developing and would need widespread adoption to make a significant impact.

Check out the best AI image generators of 2025: Gemini, ChatGPT, Midjourney, and more.

With proprietary data still unclear, several other comparisons exist for chatbot query energy use. Jesse Dodge, a researcher from the nonprofit institute Ai2, has compared one ChatGPT query to the electricity used to power a light bulb for 20 minutes.

The Hugging Face study noted that "charging the average smartphone requires 0.022 kWh of energy, which means that the most efficient text generation model uses as much energy as 9% of a full smartphone charge for 1,000 inferences, whereas the least efficient image generation model uses as much energy as 522 smartphone charges (11.49 kWh), or around half a charge per image generation."
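The Hugging Face arithmetic can be sanity-checked directly from the figures quoted above; the script below just redoes the division using the study's own numbers.

```python
# Sanity-check the Hugging Face study's smartphone-charge comparisons.
PHONE_CHARGE_KWH = 0.022          # energy for one average smartphone charge (study figure)

# Least efficient image-generation model: 11.49 kWh per 1,000 generations.
image_kwh_per_1000 = 11.49
charges_per_1000_images = image_kwh_per_1000 / PHONE_CHARGE_KWH
print(f"{charges_per_1000_images:.0f} charges per 1,000 images")   # ~522
print(f"{charges_per_1000_images / 1000:.2f} charges per image")   # ~0.52, "half a charge"

# Most efficient text-generation model: 9% of one charge per 1,000 inferences.
text_kwh_per_1000 = 0.09 * PHONE_CHARGE_KWH
print(f"{text_kwh_per_1000 * 1000:.2f} Wh per 1,000 text inferences")  # ~1.98 Wh
```

The roughly 5,800-fold gap between the most efficient text model and the least efficient image model is the study's central point.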

According to Gadepally, an AI model processing a million tokens -- roughly a dollar in compute costs -- emits about as much carbon as a gas-powered car driving five to 20 miles. But energy use also varies widely depending on the prompt's complexity. "Saying 'I want a short story about a dog' will likely use less compute than 'I would like a story about a dog that's sitting on a unicorn written in Shakespearean verse,'" he said.

If you're curious about the energy use of your individual chatbot queries, Hugging Face designed a tool that estimates the energy consumption of queries to different open-source models. Green Coding, an organization working with companies to track their tech's environmental impact, designed a similar tool.

AI's Energy Use in Perspective: Comparisons with Other Technologies

While AI investment appears to be increasing overall energy consumption, researchers advise viewing energy use relatively.

The commonly cited metric that one ChatGPT query uses 10 times as much energy as a Google search is based on an outdated 2009 Google estimate that a single search consumes 0.3 watt-hours (Wh). It's hard to say whether that number still holds today, given changes in the complexity of Google searches and improvements in chip efficiency.

Either way, as data scientist and climate researcher Hannah Ritchie points out, that 0.3 Wh of energy is relatively small. She noted that in the US, average daily electricity usage is about 34,000 Wh per person. Using the old Google metric, a ChatGPT prompt is just 3 Wh; even with multiple daily queries, that's still not a huge percentage.
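Ritchie's point is easy to reproduce. Taking the old 0.3 Wh search figure and the 10x multiplier at face value (both caveated above), even an assumed 20 queries a day is a rounding error against per-person daily consumption:

```python
# Back-of-envelope share of daily electricity, using the article's figures.
SEARCH_WH = 0.3                 # 2009 Google estimate per search
CHATGPT_WH = 10 * SEARCH_WH     # commonly cited 10x multiplier -> 3 Wh per prompt
DAILY_WH_PER_PERSON = 34_000    # average US per-person daily electricity use

queries_per_day = 20            # assumed heavy chatbot use
share = queries_per_day * CHATGPT_WH / DAILY_WH_PER_PERSON
print(f"{share:.2%} of daily per-person electricity")  # 0.18%
```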

Plus, tech that doesn't explicitly use AI already consumes lots of data center bandwidth.

"What are the hottest digital applications today? TikTok, Instagram Reels, YouTube searches, streaming, gaming -- all of these things are hosted from the cloud," said Raj Joshi, another analyst and SVP at Moody's.

He and Medina added that as AI features integrate with everything from gaming to enterprise tech, attributing specific energy demands to AI or non-AI applications is becoming increasingly difficult.

Within AI, however, model needs are evolving. "It's quite significant," Gadepally said of the energy increase compared to earlier in the technology's history. He noted that inference—when a model makes predictions after training—now accounts for much more of a model's lifetime cost. "That wasn't the case with some of the original models, where you might spend a lot of your effort training this model, but the inference is actually pretty easy -- there wasn't much compute that needed to happen."

Balancing Act: AI's Environmental Impact vs. Its Benefits

Because AI is now inextricably linked with existing technology, experts say determining its specific impact is challenging. Whether to use it or not may depend more on individual judgment than on hard numbers.

"From a sustainability perspective, you have to balance the output of the AI with the use of the AI," Medina said. "If that output is going to save you time that you would have your lights on, your computer on, and you're writing something that takes you an hour, but [AI] can do it in five minutes, what's the trade-off there? Did you use more energy taking 30 minutes to write something that they can write you in one minute?"

Learn how AI hallucinations could help create life-saving antibiotics.

To Medina's point, AI can also advance research and technology that helps track climate change more quickly and efficiently. Ai2 has launched several AI tools that assist in collecting planetary data, improving climate modeling, preserving endangered species, and restoring oceans. Citing data from the Sustainable Production Alliance, AI video company Synthesia argues that AI-generated video produces less carbon than traditional video production methods, which depend on travel, lighting, and other resource-intensive infrastructure.

Regardless, parts of the industry are responding to concerns. In February, Hugging Face released the AI Energy Score Project, featuring standardized energy ratings and a public leaderboard of each model's estimated consumption.

Towards Greener AI: Innovations and Strategies

Across the industry, organizations are exploring ways to improve AI sustainability over time. At MIT's Lincoln Lab, Gadepally's team is experimenting with "power-capping," or strategically limiting the power each processor draws to below 100% of its capacity, reducing both consumption and GPU temperature. Chinese AI startup DeepSeek achieved a similar outcome by running and training its models more efficiently, though they remain quite large.
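Power-capping works because energy is power multiplied by time: a capped job runs somewhat longer, but the power drop outweighs the slowdown. The 60% cap and 15% slowdown below are illustrative assumptions, not measured figures from the Lincoln Lab work:

```python
# Why power-capping saves net energy: energy = power x time.
# The 60% cap and 15% slowdown are illustrative assumptions, not measured figures.
power_fraction = 0.60      # GPU capped to 60% of its rated power
time_factor = 1.15         # job assumed to run 15% longer under the cap

energy_ratio = power_fraction * time_factor
savings = 1 - energy_ratio
print(f"net energy: {energy_ratio:.0%} of uncapped, i.e. {savings:.0%} saved")
```

So long as the slowdown factor stays below the inverse of the power fraction, the cap saves energy overall.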

That approach has its limits, though. "No one's figured out how to make a smaller model suddenly do better on high-quality image generation at scale," Gadepally said.

Discover what sparsity is: DeepSeek AI's secret, revealed by Apple researchers.

Because he doesn't foresee a decline in AI demand—especially with the proliferation of on-device phone features—Gadepally stated that efficiency and optimization are current solutions. "Can I improve my accuracy by one and a half percent instead of one percent for that same kilowatt hour of energy that I'm pumping into my system?"

He added that switching data centers to run solely on renewable energy isn't straightforward, as these sources don't turn on and off as immediately as natural gas, a necessity for large-scale computing. But by slowing AI's consumption growth curve with tactics like power capping, it becomes easier to eventually replace those energy sources with renewable ones—similar to replacing your home lightbulbs with LEDs.

To move towards sustainability, he suggested companies consider flexibility in compute locations, as some areas may be more energy-efficient, or training models during colder seasons when local energy grid demands are lower. This approach also helps lower processor temperatures without significantly impacting model performance, enhancing output reliability and reducing the need for potable water cooling. Such benefits, along with cost-effectiveness, incentivize companies to make sustainability-forward changes.

Gadepally believes companies are well-intentioned regarding sustainability; he thinks the challenge lies in implementing changes fast enough to mitigate environmental damage.

Your AI Use and the Bigger Climate Picture

If you're concerned about how your AI use impacts your carbon footprint, it's not a simple issue to resolve. Avoiding AI tools might not reduce your carbon footprint as effectively as other lifestyle changes.

Andy Masley, director of advocacy group Effective Altruism DC, compared the impact of asking ChatGPT 50,000 fewer questions (10 questions daily for 14 years) to other climate-forward actions from philanthropic network Founders Pledge.

The results are quite minimal. "If individual emissions are what you're worried about, ChatGPT is hopeless as a way of lowering them," Masley writes. "It's like seeing people who are spending too much money, and saying they should buy one fewer gumball per month."

"It saves less than even the 'small stuff' that we can do, like recycling, reusing plastic bags, and replacing our lightbulbs," Ritchie added in a Substack post referencing Masley. "If we're fretting over a few queries a day while having a beef burger for dinner, heating our homes with a gas boiler, and driving a petrol car, we will get nowhere."
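The "gumball" scale of the savings follows from the same per-query figures used earlier in the article. The 3 Wh per query comes from the old Google-multiplier estimate above; the grid carbon intensity of 0.4 kg CO2 per kWh is an assumption for illustration, not an article figure:

```python
# Scale of skipping ChatGPT entirely, per Masley's scenario:
# ~3 Wh per query (article's estimate) and an assumed US grid
# intensity of 0.4 kg CO2 per kWh (illustrative, not from the article).
QUERY_WH = 3
GRID_KG_CO2_PER_KWH = 0.4

queries = 10 * 365 * 14            # 10 queries a day for 14 years (~51,100)
energy_kwh = queries * QUERY_WH / 1000
co2_kg = energy_kwh * GRID_KG_CO2_PER_KWH
print(f"{queries:,} queries ~ {energy_kwh:.0f} kWh ~ {co2_kg:.0f} kg CO2 over 14 years")
```

A few dozen kilograms of CO2 spread across 14 years is well below the impact of the dietary, heating, and transport choices Ritchie lists.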

Find out about the best AI chatbots of 2025: ChatGPT, Copilot, and notable alternatives.

In the broader context, Masley and Ritchie worry that focusing on AI energy consumption could divert well-intentioned users from larger, more urgent climate stressors.

Gadepally agreed that abstaining from AI offers limited benefits. "In this day and age, it's almost like saying, 'I'm not going to use a computer,'" he said. Still, he has suggestions for improving AI energy use and transparency:

Demand transparency from providers

With accurate data, firms like Gadepally's can estimate AI's energy use. Individuals can advocate for AI companies to publicize this information. As the AI field becomes more competitive, user demand for sustainability could influence the market.

Speak up during procurement processes

Sustainability is often a factor in corporate decisions, especially when evaluating vendors. Gadepally advocates applying this to AI. If your business licenses AI tools, he suggests requesting energy usage and sustainability data during negotiations.

"If large companies demand this on multi-million dollar contracts that are working with account executives, that can get very far," he pointed out, similar to how they handle other items like work travel. "Why wouldn't you ask about this, where it really does add up pretty quickly?"

Use the smallest possible model

Be intentional about the model quality relative to your needs. "Almost every provider has multiple versions of the model -- we tend to use probably the highest quality one that we have access to," which can be wasteful, Gadepally noted. "If you're able to get away with something smaller, do that."

Gadepally also encourages users to accept imperfect results more often. Prompt refinement can be done with a lower-quality model; once perfected, the prompt can be tried with a more expensive, higher-parameter model for the best answer.
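Gadepally's two tips can be combined into a simple smallest-first escalation loop. Everything below is a hypothetical sketch: the model names, relative costs, provider call, and quality check are placeholders, not a real API.

```python
# Hypothetical sketch of "use the smallest model that gets away with it":
# try models smallest-first, escalating only when the answer isn't good enough.
# Model names, costs, call_model, and good_enough are illustrative placeholders.
MODELS = [                      # ordered smallest (cheapest) to largest
    ("small-model", 1),         # (name, relative energy cost per call)
    ("medium-model", 5),
    ("large-model", 25),
]

def call_model(name: str, prompt: str) -> str:
    """Placeholder for a real provider call."""
    return f"[{name}] answer to: {prompt}"

def good_enough(answer: str) -> bool:
    """Placeholder quality check; in practice, a human or heuristic review."""
    return len(answer) > 0

def answer_frugally(prompt: str) -> tuple[str, int]:
    """Return the first acceptable answer and the relative energy spent."""
    spent = 0
    answer = ""
    for name, cost in MODELS:
        spent += cost
        answer = call_model(name, prompt)
        if good_enough(answer):
            break               # smallest acceptable model wins
    return answer, spent

answer, cost = answer_frugally("short story about a dog")
print(cost)   # 1 -- the small model sufficed under this toy quality check
```

The same loop captures his refinement advice: iterate on the prompt while the cheap model is answering, and only pay for the large model on the final pass.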

In addition to these goals, Michelle Thorne, director of strategy at The Green Web Foundation – a nonprofit "working towards a fossil-free internet" – urged tech companies to phase out fossil fuels across their supply chains and take steps to reduce harms when mining for raw materials.

The Road Ahead: AI Energy and Sustainability

The industry is responding to sustainability questions with initiatives like the Frugal AI Challenge, a hackathon at the 2025 AI Action Summit held in Paris this past February. Google stated in its sustainability goals its intent to replenish 120% of the freshwater it consumes across its offices and data centers by 2030.

Some argue that the bigger-is-better approach in AI may not actually yield more value or better performance, citing diminishing returns.

Learn why neglecting AI ethics is such risky business - and how to do AI right.

Ultimately, however, regulation will likely prove more effective in standardizing expectations and requirements for tech companies to manage their environmental impact, within and beyond their use of AI.

Long-term, AI expansion (and its associated costs) shows no signs of stopping. "We have sort of an insatiable appetite for building more and more technology, and the only thing that keeps you limited has been cost," Gadepally said -- a nod to Jevons Paradox, or the idea that efficiency only begets more consumption, rather than satisfaction.

For now, AI's energy future is unclear, but the tech industry at large is an increasingly significant player in a climate landscape marked by skyrocketing demand and very little time.

