Sam Altman Reveals Minimal Water Usage of ChatGPT Queries

June 12, 2025

In a blog post published on June 10, 2025, OpenAI CEO Sam Altman claimed that the average ChatGPT query uses approximately 0.000085 gallons of water, which he equates to roughly one fifteenth of a teaspoon. The claim is part of a broader discussion of the energy consumption of artificial intelligence technologies and their environmental impact. Altman added that a typical ChatGPT query consumes around 0.34 watt-hours of energy, roughly the electricity an oven draws in just over one second, or a high-efficiency lightbulb in a couple of minutes.
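The quoted figures are straightforward to sanity-check with unit conversions. The sketch below does so in Python; the oven and lightbulb wattages are assumptions chosen for illustration (Altman's post does not specify them), not figures from OpenAI.

```python
# Back-of-the-envelope check of the figures quoted above.
# Appliance wattages below are assumptions, not from Altman's post.

GALLONS_PER_QUERY = 0.000085
TEASPOONS_PER_GALLON = 768  # 1 US gallon = 768 US teaspoons

teaspoons = GALLONS_PER_QUERY * TEASPOONS_PER_GALLON
print(f"Water per query: {teaspoons:.4f} tsp (~1/{1 / teaspoons:.0f} teaspoon)")

ENERGY_WH_PER_QUERY = 0.34
OVEN_WATTS = 1200       # assumed typical oven draw
LED_BULB_WATTS = 10     # assumed high-efficiency bulb

oven_seconds = ENERGY_WH_PER_QUERY / OVEN_WATTS * 3600
bulb_minutes = ENERGY_WH_PER_QUERY / LED_BULB_WATTS * 60
print(f"Oven equivalent: {oven_seconds:.2f} s")
print(f"LED bulb equivalent: {bulb_minutes:.1f} min")
```

Under these assumptions the numbers line up: about 1/15 of a teaspoon of water, roughly one second of oven use, and about two minutes of LED lightbulb use per query.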

Altman's statement comes as AI companies face increasing scrutiny over their energy consumption and environmental footprint. A report by the International Energy Agency (IEA) indicated that AI technologies could consume more electricity than Bitcoin mining by the end of 2025 (IEA, 2025). Moreover, a collaborative article published by The Washington Post and researchers in 2024 found that generating a 100-word email with an AI chatbot, specifically the GPT-4 model, required slightly more than one bottle of water. The discrepancy with Altman's figure highlights how water usage varies with the location of the data centers handling the queries.

Experts have raised concerns about the sustainability of AI operations. Dr. Emily Carter, a Professor of Environmental Science at Stanford University, remarked, "As AI technologies proliferate, their resource consumption becomes a critical issue that must be addressed to mitigate environmental harm" (Carter, 2025). Furthermore, the World Bank’s 2023 report on sustainable technologies emphasized the need for AI developers to adopt more eco-friendly practices in light of their increasing resource demands.

Altman’s statements come as part of a larger narrative about the future of AI and its role in society. He predicts that, as technology evolves, the cost of intelligence will converge towards the cost of electricity, a claim that aligns with current trends in energy-efficient technologies (Altman, 2025). However, the methods by which Altman derived his water consumption figures remain unclear, as OpenAI has not provided additional details or sources to substantiate these claims.

The implications of Altman's assertions are significant, particularly in light of the growing discourse surrounding the environmental impacts of technological advancements. As AI continues to integrate into various sectors, the pressure to reduce its ecological footprint will likely intensify. A balanced approach, integrating efficiency with sustainability, will be paramount in shaping the future of AI technologies.

In conclusion, while Altman’s estimate of water usage per query might seem trivial at first glance, it serves as a reminder of the broader environmental considerations that the tech industry must confront. As AI continues to evolve and expand its influence, stakeholders will need to prioritize sustainable practices to ensure that technological progress does not come at the expense of environmental integrity.


Tags

Sam Altman, OpenAI, ChatGPT, water usage, energy consumption, artificial intelligence, sustainability, environmental impact, electricity consumption, data centers, International Energy Agency, The Washington Post, Dr. Emily Carter, Stanford University, World Bank, AI technology, eco-friendly practices, resource consumption, AI ethics, sustainable technology, energy efficiency, technology trends, GPT-4, environmental sustainability, AI resources, technological advancements, energy costs, environmental scrutiny, future of AI, AI in society
