ChatGPT's Thirst For Power And Water Revealed
Image credit: dszc/Getty Images
OpenAI CEO Sam Altman recently shared significant insights in a new blog post, sparking widespread discussion. Among his futuristic visions, he disclosed key figures on ChatGPT's resource consumption.
Altman's Vision: The Gentle Singularity
Sam Altman's latest blog post, titled "The Gentle Singularity," offers a look into what he describes as an AI-driven future, following a pattern of optimistic AI manifestos from other tech leaders, such as those at Meta and Anthropic.
Altman's post is filled with accelerationist predictions for the 2030s, including ambitions like colonizing space, developing brain-computer interfaces, and achieving abundant energy. He also envisions a future where AI significantly reshapes labor and production, with robots building more robots, data centers constructing new data centers, and humans potentially engaging in what he terms "fake jobs" that still feel incredibly important and satisfying.
He states, "We are past the event horizon; the takeoff has started."
ChatGPT's Resource Thirst: The Numbers Revealed
One of the most noteworthy revelations in Altman's post concerns the energy and water consumption of a typical ChatGPT query—a topic that has been subject to much external speculation. Now, we have figures directly from OpenAI's CEO.
Altman wrote:
"As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity. (People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.)"
While these individual figures might not seem alarming at first glance, their implications become clearer when considering the massive scale of ChatGPT's operations.
Scaling Up: What This Means Monthly
To put these numbers into a broader context, let's consider recent user statistics. The Information reported that OpenAI is serving approximately 500 million active users per week. The per-user query rate, however, is not public, so the following is an extrapolation: if we assume each user submits an average of 10 queries per week, the monthly resource consumption for ChatGPT could be substantial.
Based on these admittedly rough numbers, one month's worth of ChatGPT queries might use:
- 🏠 Enough electricity to power about 8,200 average American homes (approximately 7.4 gigawatt hours).
- 💧 Enough water to fill about 90 residential swimming pools (approximately 1.8 million gallons).
Of course, these estimates can vary based on the actual number of queries. Feel free to make your own calculations based on different usage assumptions. The key takeaway, however, is that we now have concrete per-query consumption figures to work with.
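For those who want to plug in their own assumptions, the back-of-envelope arithmetic above can be sketched in a few lines of Python. The per-query figures come from Altman's post and the weekly user count from The Information; the 10 queries per user per week, the ~891 kWh monthly consumption of an average American home, and the 20,000-gallon pool volume are illustrative assumptions, not reported figures.

```python
# Rough monthly scale-up of ChatGPT's per-query resource use.
# Per-query figures are from Altman's post; the query rate, home
# consumption, and pool volume are assumptions for illustration.

WH_PER_QUERY = 0.34            # watt-hours per query (Altman)
GAL_PER_QUERY = 0.000085       # gallons of water per query (Altman)
WEEKLY_USERS = 500_000_000     # weekly active users (The Information)
QUERIES_PER_USER_WEEK = 10     # assumed average query rate
WEEKS_PER_MONTH = 52 / 12      # average weeks in a month

monthly_queries = WEEKLY_USERS * QUERIES_PER_USER_WEEK * WEEKS_PER_MONTH

gwh = monthly_queries * WH_PER_QUERY / 1e9   # total energy in gigawatt-hours
gallons = monthly_queries * GAL_PER_QUERY    # total water in gallons

homes = gwh * 1e6 / 891    # assumed ~891 kWh/month per average US home
pools = gallons / 20_000   # assumed 20,000-gallon residential pool

print(f"~{gwh:.1f} GWh (~{homes:,.0f} homes), "
      f"~{gallons/1e6:.1f}M gallons (~{pools:.0f} pools)")
```

Changing `QUERIES_PER_USER_WEEK` is the quickest way to test other scenarios; the totals scale linearly with it.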
The Future Cost of Intelligence
Things are indeed moving at a rapid pace in the AI field, and it remains to be seen how many of Altman's predictions will materialize. He is aware of how ambitious, or even radical, some of his ideas might appear.
He concludes with a forward-looking statement:
"Intelligence too cheap to meter is well within grasp. This may sound crazy to say, but if we told you back in 2020 we were going to be where we are today, it probably sounded more crazy than our current predictions about 2030."
These insights into ChatGPT's resource use provide a valuable data point as discussions around the sustainability and impact of large-scale AI models continue to evolve.