In a recent blog post, OpenAI CEO Sam Altman addressed the energy and water consumption associated with artificial intelligence, offering a rare company-provided look at the technology's environmental impact.
Altman disclosed that an average ChatGPT query uses approximately 0.000085 gallons of water, describing this amount as “roughly one fifteenth of a teaspoon.” On energy, he stated that an average ChatGPT query consumes about 0.34 watt-hours, which he compared to the energy an oven would use in just over one second, or a high-efficiency lightbulb in a couple of minutes.
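Altman's comparisons can be sanity-checked with back-of-the-envelope arithmetic. The sketch below uses illustrative appliance wattages (a roughly 1.2 kW oven element and a 10 W LED bulb) that are assumptions, not figures from the blog post:

```python
# Back-of-the-envelope check of the per-query figures.
# Appliance wattages below are assumptions for illustration only.

GALLON_ML = 3785.41    # millilitres per US gallon
TEASPOON_ML = 4.93     # millilitres per US teaspoon

water_ml = 0.000085 * GALLON_ML      # water per query, in mL (~0.32 mL)
teaspoons = water_ml / TEASPOON_ML   # fraction of a teaspoon (~1/15)

energy_wh = 0.34       # energy per query, in watt-hours
oven_w = 1200.0        # assumed oven draw, in watts
led_w = 10.0           # assumed high-efficiency LED draw, in watts

oven_seconds = energy_wh * 3600 / oven_w   # seconds of oven use for 0.34 Wh
led_minutes = energy_wh * 60 / led_w       # minutes of LED use for 0.34 Wh

print(f"Water: {water_ml:.2f} mL, about 1/{1/teaspoons:.0f} teaspoon")
print(f"Energy: {oven_seconds:.2f} s of oven, {led_minutes:.1f} min of LED")
```

Under these assumed wattages, 0.34 watt-hours works out to roughly one second of oven use and about two minutes of LED runtime, consistent with the comparisons in the post.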
Looking ahead, Altman speculated on the future cost of AI, suggesting that “the cost of intelligence should eventually converge to near the cost of electricity.” This statement reflects his thoughts on the long-term economic implications of advancing AI technology.
When asked about the methodology behind Altman’s figures, OpenAI did not immediately respond to a request for comment. Without a published methodology, it remains unclear how the numbers were calculated and what they include, which has raised questions about their accuracy.
The energy demands of AI technology have come under increasing scrutiny. Researchers have projected that AI could surpass the power consumption of Bitcoin mining by the end of the year, underscoring growing concern over AI’s environmental footprint.
Previous research has presented widely varying figures for AI’s energy and water consumption. An article published last year by The Washington Post, produced in collaboration with researchers, found that generating a 100-word email with an AI chatbot powered by GPT-4 required “a little more than 1 bottle” of water. The publication also noted that water usage varies significantly with the location of the data center, underscoring how difficult AI’s environmental impact is to assess.
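The gap between the two published figures is striking even before accounting for differences in what they measure (a single average query versus generating a full 100-word email). A rough comparison, assuming a typical 500 mL water bottle, which is an assumption rather than a figure from either source:

```python
# Rough comparison of the two published water figures.
# The 500 mL bottle size is an assumption; the two metrics also
# measure different things (one average query vs. a 100-word email).

GALLON_ML = 3785.41                    # millilitres per US gallon
per_query_ml = 0.000085 * GALLON_ML    # Altman's per-query figure, in mL
email_ml = 500.0                       # assumed "a little more than 1 bottle"

ratio = email_ml / per_query_ml
print(f"~{ratio:.0f}x more water per 100-word email than per average query")
```

Under these assumptions the figures differ by three orders of magnitude, which illustrates why the lack of a shared methodology makes such numbers hard to compare directly.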