AI's Surprising Energy Bill: New MIT Report Details Costs
That's a long time in the microwave.
AI uses a whole lot of energy. Credit: Photo Illustration by Thomas Fuller/SOPA Images/LightRocket via Getty Images
Unpacking AI's Thirst for Energy
You might have encountered the statistic that each ChatGPT query consumes the equivalent of a bottle of water. While this holds some truth, it doesn't capture the full picture. A comprehensive report from the MIT Technology Review now sheds light on how the artificial intelligence industry consumes energy, specifically detailing the power required for services like ChatGPT.
The report found that large language models (LLMs) such as ChatGPT have an energy cost ranging from 114 joules to 6,706 joules per response. To put that in perspective, it's like running a microwave for a mere one-tenth of a second on the lower end, up to a full eight seconds on the higher end. The report notes that models consuming less energy typically use fewer parameters, which can result in less accurate answers.
The Staggering Energy Cost of AI Video
If text generation seems power-intensive, AI-produced video takes it to another level. According to the MIT Technology Review's investigation, creating just a five-second video with a newer AI model demands approximately 3.4 million joules. This is over 700 times the energy needed to generate a high-quality image and is comparable to running a microwave for more than an hour.
Daily AI Use: A Significant Power Drain
To illustrate the cumulative effect, researchers calculated the energy cost for a hypothetical scenario: if someone were to ask an AI chatbot 15 questions, request 10 images, and generate three five-second videos. The total energy consumed would be roughly 2.9 kilowatt-hours of electricity. This is equivalent to keeping a microwave running for over 3.5 hours.
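The microwave comparisons above are simple unit conversions. As a rough sketch, assuming an ordinary 800-watt microwave (the report doesn't specify the wattage behind its comparisons, so that figure is an illustrative guess), the math works out like this:

```python
# Back-of-envelope conversion of the report's energy figures into
# "microwave time," assuming an 800 W microwave. Home units typically
# run roughly 700-1,100 W, so these numbers are illustrative only.

MICROWAVE_WATTS = 800  # assumed wattage, not from the report

def microwave_seconds(joules: float) -> float:
    """Seconds the assumed microwave would run on the given energy."""
    return joules / MICROWAVE_WATTS

# Per-response LLM cost from the report: 114 J to 6,706 J
low = microwave_seconds(114)      # ~0.14 s, i.e. about a tenth of a second
high = microwave_seconds(6_706)   # ~8.4 s

# Hypothetical daily mix from the report: 2.9 kWh total
daily_joules = 2.9 * 3.6e6        # 1 kWh = 3.6 million joules
daily_hours = microwave_seconds(daily_joules) / 3600  # ~3.6 h

print(f"LLM response: {low:.2f} to {high:.1f} seconds of microwave time")
print(f"Daily mix: {daily_hours:.2f} hours of microwave time")
```

At 800 watts the figures line up with the article's comparisons: a fraction of a second to about eight seconds per chatbot response, and over 3.5 hours for the hypothetical daily mix.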
Data Centers Under Strain from AI Boom
The investigation also delved into the escalating energy demands of data centers, the backbone of the AI industry.
Historically, the electricity usage of data centers had remained relatively stable due to advancements in efficiency. However, the advent of energy-hungry AI technology has changed this landscape. The energy consumed by data centers in the United States has reportedly doubled since 2017. Furthermore, government data indicates that by 2028, half of all electricity used by data centers will be dedicated to powering AI tools.
The Broader Context: AI's Pervasive Expansion
This report on AI's energy footprint arrives as generative AI is being integrated into nearly every facet of digital life. Google, for instance, announced at its I/O event a stronger push into AI, with AI integrations planned for Google Search, Gmail, Docs, and Meet. Beyond productivity tools, people are using AI to conduct job interviews, create deepfakes of online personalities, and even cheat in academic settings. As this in-depth new report underscores, all these advancements come at a substantial energy cost.
Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.