---
title: "The Hidden Cost of Your AI Prompts: Energy, Water, and What We Owe the Future"
description: A realistic look at AI's environmental footprint. The actual numbers on energy and water consumption, what drives the impact, and why the conversation about sustainable AI is more complicated than either side admits.
date: February 5, 2026
author: Robert Soares
category: ai-strategy
---

Every prompt costs something. Not just money. When you ask ChatGPT to write an email, data centers consume electricity, cooling systems draw water, and somewhere a power plant burns fuel to keep the lights on. The question is whether that cost matters.

According to [MIT research](https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117), data center electricity consumption reached 460 terawatt-hours globally in 2022, the equivalent of the 11th-largest national consumer, somewhere between Saudi Arabia and France. Projections suggest this could reach 1,050 terawatt-hours by 2026, which would rank fifth globally if data centers were a nation.

Those numbers feel abstract. Let me make them concrete.

[A ChatGPT query consumes roughly ten times the electricity of a Google search](https://research.aimultiple.com/ai-energy-consumption/), according to estimates from Goldman Sachs. Advanced reasoning models like OpenAI's o3 require 7 to 40 watt-hours per query, up to 100 times more than basic text models. Image generation demands 20 to 40 times more energy than text, while video generation requires 1,000 to 3,000 times more.

## Training vs. Inference: Where the Energy Actually Goes

There's a common misconception here. Training is the one-time cost; inference is the ongoing drain. Most discussions focus on training, the dramatic spike of energy required to teach a model, but inference now dominates total consumption because it happens millions of times daily.

Training GPT-4 consumed 50 gigawatt-hours of energy, enough to power San Francisco for three days.

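
Stacking those multipliers gives a rough hierarchy of per-query energy costs. A back-of-envelope sketch, assuming roughly 0.3 watt-hours for a conventional web search as the baseline (a commonly cited but contested figure, and my assumption rather than a number from the sources above; everything else follows from the quoted multipliers):

```python
# Per-query energy, stacked from the multipliers quoted above.
# Baseline assumption (mine, not the article's sources): ~0.3 Wh per web search.
SEARCH_WH = 0.3
TEXT_WH = SEARCH_WH * 10  # ChatGPT-style text query: ~10x a search

estimates_wh = {
    "web search (baseline)": (SEARCH_WH, SEARCH_WH),
    "text query (~10x search)": (TEXT_WH, TEXT_WH),
    "reasoning model (o3-class)": (7.0, 40.0),  # quoted directly above
    "image generation (20-40x text)": (TEXT_WH * 20, TEXT_WH * 40),
    "video generation (1,000-3,000x text)": (TEXT_WH * 1000, TEXT_WH * 3000),
}

for task, (low, high) in estimates_wh.items():
    print(f"{task:38s} {low:7.1f} - {high:7.1f} Wh")
```

Under these assumptions, a single video generation costs as much electricity as thousands of searches. And those are per-query costs; the 50 gigawatt-hour training run sits on top of them as a one-time expense.
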
That sounds enormous, and it is. But here's the thing: [inference now accounts for 80 to 90 percent of AI computing](https://research.aimultiple.com/ai-energy-consumption/), not training. The long, steady curve of everyone using the model eventually surpasses that initial spike.

This matters for how we think about the problem. A one-time training run is a fixed cost you can amortize across the model's lifetime. But inference scales with usage. The more people use AI, the more energy it consumes. No ceiling exists.

As Hacker News user [zekrioca](https://news.ycombinator.com/item?id=42221756) put it: "Electricity and water are today's main issues, but there are many indirect others." The environmental footprint extends beyond what we can easily measure, into hardware production, mining for rare earth elements, and the embodied carbon of building data centers.

## The Water Nobody Talks About

Data centers get hot. Heat requires cooling. Cooling often requires water.

[According to research cited by the Environmental Law Institute](https://www.eli.org/vibrant-environment-blog/ais-cooling-problem-how-data-centers-are-transforming-water-use), a typical data center uses 300,000 gallons of water daily, equivalent to about 1,000 households. Large facilities can require 5 million gallons daily, matching the needs of a town of 50,000 residents.

The per-query impact varies wildly depending on who you ask. OpenAI CEO Sam Altman claims a typical ChatGPT query uses "roughly one fifteenth of a teaspoon" of water. A Washington Post report suggested that writing an AI-generated email consumes an entire bottle of water. [Researchers at UC Riverside estimated 519 milliliters per 100-word prompt](https://undark.org/2025/12/16/ai-data-centers-water/). The truth likely sits somewhere in this range, varying by data center location, cooling technology, and electricity source.

What's not in dispute is the trajectory.
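
Just how wide is the per-query dispute, for the record? Converting the three estimates to a common unit shows they disagree by more than three orders of magnitude. A quick sketch, assuming a US teaspoon of roughly 4.93 milliliters and a standard 500 milliliter bottle (my conversions, not the sources'):

```python
# The three per-query water estimates quoted above, in milliliters.
# Conversion assumptions (mine): US teaspoon ~4.93 mL, bottle ~500 mL.
TEASPOON_ML = 4.93

estimates_ml = {
    "Altman (1/15 teaspoon)": TEASPOON_ML / 15,  # ~0.33 mL
    "Washington Post (one bottle)": 500.0,
    "UC Riverside (per 100-word prompt)": 519.0,
}

spread = max(estimates_ml.values()) / min(estimates_ml.values())
print(f"estimates span a {spread:,.0f}x range")  # over 1,500x
```

When published estimates of the same quantity differ by a factor of more than a thousand, the honest takeaway is that nobody is measuring this consistently yet. With that caveat on the per-query numbers, back to the trajectory.
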
[By 2028, AI in the US could require as much as 720 billion gallons of water annually](https://www.aljazeera.com/opinions/2026/1/21/ais-growing-thirst-for-water-is-becoming-a-public-health-risk) just to cool AI servers, enough to meet the indoor needs of 18.5 million households.

Here's where it gets complicated. Evaporative cooling uses less energy but more water. Air cooling uses more energy but less water. Optimizing for one metric worsens the other. Cornell engineer Fengqi You explained it plainly in [Undark's reporting](https://undark.org/2025/12/16/ai-data-centers-water/): "How much water you need depends on climate, technology used, and energy mix."

## The Human Reaction

People are noticing. On Reddit, users expressed frustration about the casualness of AI usage, with one observing that "people are so nonchalant about it and act as if it's just like Googling something when it actually is horrible for the environment."

But the Hacker News crowd tends toward a different view. User [joegibbs](https://news.ycombinator.com/item?id=39975766) argued: "We should be focusing on generating the energy we use more sustainably, rather than trying to make people spend all day deciding which personal choice to make." This captures something real about the tension between individual responsibility and systemic change.

Another user, [wmf](https://news.ycombinator.com/item?id=44039808), framed it starkly: "Emissions should be fixed on the production side (decarbonization) not on the demand side (guilt/austerity)."

Both perspectives contain truth. Individual choices scale when millions make them. But individual guilt without systemic change just creates anxiety while the problem grows.

## What Drives the Footprint

Understanding the levers helps. Not all AI usage is equal.

**Model size matters.** Larger models require more computation per inference. GPT-4 is vastly more energy-intensive than GPT-3.5, which is more intensive than a small specialized model.
Many tasks that currently use frontier models could work fine with smaller ones.

**Task type matters.** Text generation is relatively cheap. Image generation costs 20 to 40 times more. Video generation costs thousands of times more. A marketing team generating dozens of AI images daily has a dramatically different footprint than one using text models for email drafts.

**Provider matters.** Data centers vary in their power usage effectiveness, their electricity sources, and their cooling systems. [Google reports a 33x reduction in energy and 44x reduction in carbon](https://news.mit.edu/2025/responding-to-generative-ai-climate-impact-0930) for the median prompt compared with 2024, demonstrating that efficiency varies enormously across providers.

**Location matters.** [Research in Nature Sustainability found](https://www.lincolninst.edu/publications/land-lines-magazine/articles/land-water-impacts-data-centers/) that smart siting, faster grid decarbonization, and operational efficiency could cut carbon impacts by approximately 73% and water impacts by 86% compared with worst-case scenarios. The Midwest and "windbelt" states offer the best combined carbon-and-water profile.

## The Efficiency Paradox

Here's where optimism gets complicated. AI is getting more efficient. Dramatically more efficient. The problem is that efficiency gains get swallowed by increased usage.

This is Jevons paradox: more efficient technology makes it cheaper to use, so people use more of it, potentially offsetting the efficiency gains entirely. As one Hacker News commenter, [Teever](https://news.ycombinator.com/item?id=42221756), noted succinctly: "Yeah but Jevons paradox."

[Global AI electricity demand is projected to grow from 415 TWh to nearly 1,000 TWh by 2030](https://research.aimultiple.com/ai-energy-consumption/). Efficiency improvements aren't keeping pace with adoption.

The industry is aware of this.
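
It helps to see what that projection implies. A short sketch of the compound growth rate, assuming the projection window runs roughly 2025 to 2030 (the sources don't pin down the start year, so five years is my assumption):

```python
# Implied compound annual growth of the 415 TWh -> ~1,000 TWh projection
# quoted above. The five-year window is an assumption; the sources are
# not explicit about when the 415 TWh baseline applies.
start_twh, end_twh, years = 415.0, 1000.0, 5

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"implied annual growth: {cagr:.1%}")  # roughly 19% per year
```

Under that assumption, per-query efficiency would need to improve by roughly 19 percent every single year just to hold total consumption flat. That is the Jevons dynamic in numbers.
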
Research continues on more efficient model architectures, better inference optimization, and hardware improvements. [One study proposed algorithms that could reduce energy costs by 95%](https://news.ycombinator.com/item?id=41889414) for certain operations. But extraordinary claims require extraordinary evidence, as one skeptical user noted, and the gap between laboratory results and deployed systems remains wide.

## What You Can Actually Do

The honest answer is complicated.

**Right-size your model choice.** Use the smallest model that accomplishes the task. A quick brainstorm doesn't need GPT-4. Bulk text generation doesn't need frontier reasoning capabilities. Many platforms offer model selection. Use it.

**Write better prompts.** A clear, specific prompt that works the first time beats three vague prompts that need iteration. Every regeneration click has a cost. Templates and proven structures reduce trial and error.

**Question necessity.** Not every task benefits from AI. Before sending a prompt, ask whether you actually need it. Could you accomplish this faster by just doing it yourself? Is the marginal benefit worth the resource consumption?

**Consider providers.** AI companies vary in sustainability commitments. Some publish detailed efficiency data. Some power data centers with renewables. Some do neither. This information influences purchasing decisions at scale.

But here's the uncomfortable truth: individual actions, while meaningful, are insufficient. The scale of the problem is systemic. [About 60 percent of increasing electricity demands from data centers will be met by burning fossil fuels](https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117), increasing global carbon emissions by about 220 million tons. No amount of conscientious prompting offsets that trajectory.

## The Industry's Responsibility

The burden shouldn't rest entirely on users. It can't.

Major AI companies have made sustainability commitments.
Renewable energy pledges. Carbon offset programs. Efficiency research. But sustainability reports consistently show increasing absolute emissions even as efficiency per computation improves. The gap between commitment and outcome is wide.

[Nine major tech companies showed consistent failures in transparency](https://news.cornell.edu/stories/2025/11/roadmap-shows-environmental-impact-ai-data-center-boom), with no company reporting AI-specific environmental metrics despite acknowledging AI as a key driver of increased energy consumption. You can't manage what you don't measure, and the industry isn't measuring.

User [adrianN](https://news.ycombinator.com/item?id=44039808) identified the core issue: "The important part remains internalizing emission costs into the price of electricity." When environmental costs don't appear in the price, they get ignored. Markets optimize what they measure.

## The Question We're Avoiding

Here's what nobody wants to say directly. We don't know if AI's benefits outweigh its environmental costs. We can't know, because we haven't honestly quantified either side.

The benefits are real. Productivity gains. Creative capabilities. Analytical power. Research acceleration. Medicine. Science. These matter.

The costs are also real. Carbon emissions. Water consumption. Resource extraction. Heat generation. Grid strain. These also matter.

As Hacker News user [it_citizen](https://news.ycombinator.com/item?id=42221756) observed: "The jury is still out on the cost benefit of AI on the long run."

The conversation tends toward extremes. AI will save us all. AI will doom us all. Neither position is useful. What's useful is honest accounting, which we don't have, and honest conversation about tradeoffs, which we avoid.

## Where This Leaves Us

I keep coming back to something user [neves](https://news.ycombinator.com/item?id=44039808) wrote on Hacker News: "Society must have information to make decisions about what affects all the world."

That's the core problem. We don't have good information. Energy estimates vary by orders of magnitude. Water calculations contradict each other. Companies don't disclose AI-specific metrics. Researchers work with incomplete data and make conflicting projections.

What we do know suggests that the trajectory is concerning. AI energy consumption is growing faster than efficiency improvements. Water stress is increasing in regions with data center concentration. Carbon commitments are not being met. And adoption is accelerating.

The path forward isn't obvious. It involves some combination of efficiency improvements, renewable energy deployment, smarter data center siting, usage awareness, and probably regulation. It involves tradeoffs we haven't honestly confronted.

I don't have a neat conclusion. The checklist approach feels inadequate here, like arranging deck chairs while arguing about whether the ship is actually sinking. The numbers are real. The uncertainty is also real. The responsibility is distributed across billions of users, dozens of companies, and governments that haven't decided how to think about this.

What I know is that every prompt costs something. The cost may be worth paying. It may not be. But pretending the cost doesn't exist is a choice, and it's one we've been making for too long.

---

*Related reading: [AI Costs Explained: API Pricing](/ai-costs-explained-api-pricing) covers the financial side of AI usage. [AI Data Privacy and Compliance](/ai-data-privacy-compliance) addresses another dimension of responsible AI adoption.*