Your Hamburger Is Worse Than Your Chatbot
AI's environmental footprint is real and growing, but the outrage is wildly misplaced
A single AI query uses about 0.3 Wh of electricity and 0.3 mL of water. A single hamburger requires 1,740 liters of water to produce. You would need to ask an AI chatbot 5.8 million questions to match the water footprint of that one burger. AI's energy use is growing fast and deserves scrutiny, but people who direct their outrage at chatbots while eating beef and booking long-haul flights are worried about the wrong things.
I keep seeing the same argument. Someone on social media posts about how AI is “boiling the oceans” and destroying the planet, and the post gets tens of thousands of likes from people who likely drove to work that morning in an ICE car, had a steak for dinner, and are planning a holiday flight overseas.
That said, this is not a piece arguing that AI’s environmental cost is zero or that we should stop paying attention to it. The growth trajectory is steep and the infrastructure being built right now will lock in energy choices for decades. But if your concern is the planet and you have a finite amount of outrage to spend, the data is very clear about where that outrage should go first.
What a ChatGPT query actually costs
The most repeated claim dates to 2023: a single ChatGPT query supposedly uses about ten times the energy of a Google search. Alex de Vries published an estimate in Joule pegging it at roughly 2.9 Wh per query. That number got picked up everywhere and is still cited today.
It was wrong, or at the very least it went out of date very quickly. The estimate assumed 2,000 output tokens on NVIDIA A100 hardware. In practice, most queries generate 260 to 500 tokens, and by 2025 inference runs on H100 and B200 GPUs that are dramatically more efficient. Epoch AI’s independent analysis [8] and industry-reported disclosures from OpenAI and Google now converge on about 0.3 Wh for a standard text query (these are self-reported figures, not independently verified, and may underestimate actual usage). Google reported that a median Gemini prompt uses 0.24 Wh and 0.26 mL of water [9]. That is no longer ten times a Google search. It is closer to four to seven times, depending on which estimate for Google you use (their own 2009 figure was 0.3 Wh; current estimates are closer to 0.04 Wh).
Image generation costs more, around 0.63 Wh per image. Video generation is expensive: roughly 1 kWh for a 5-second clip. And reasoning models like o1 or o3 can consume significantly more per prompt than a simple text completion. The averages still matter, though: most people, most of the time, are running text queries.
So, 0.3 Wh. What does that mean in terms anyone can feel?
Charging a smartphone takes about 10 Wh – that is 33 AI queries. Boiling a liter of water in an electric kettle takes about 100 Wh, or roughly 330 queries for one cup of tea. An hour of Netflix streaming uses about 80 to 120 Wh when you include the TV, network equipment, and data center share – that is 300 to 400 queries. A washing machine cycle uses about 500 Wh, or 1,700 queries. A dishwasher cycle runs at about 1,800 Wh, which equals 6,000 AI queries.
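These equivalences are simple ratios. A quick sketch, assuming the 0.3 Wh-per-query figure and the typical appliance values above (none of them precise measurements), reproduces them:

```python
WH_PER_QUERY = 0.3  # energy per standard AI text query, in watt-hours

# Everyday energy uses in watt-hours (typical values cited in the text).
everyday_wh = {
    "phone charge": 10,
    "cup of tea (boil ~1 L)": 100,
    "1h Netflix (TV + network + data center)": 100,
    "washing machine cycle": 500,
    "dishwasher cycle": 1800,
}

for activity, wh in everyday_wh.items():
    # How many AI queries consume the same energy as one activity.
    print(f"{activity}: ~{wh / WH_PER_QUERY:,.0f} AI queries")
```

Changing the per-query assumption simply rescales every bar by the same factor, which is why the ranking is robust even if the 0.3 Wh figure is off by a bit.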
xychart-beta
x-axis ["Phone charge", "Cup of tea", "1h Netflix", "Washing machine", "Dishwasher"]
y-axis "AI queries" 0 --> 6500
bar [33, 330, 350, 1700, 6000]

The water comparison is even more absurd
Water is where the discourse gets particularly unhinged. Li et al. (2023), in a paper published in Communications of the ACM [1], estimated that training GPT-3 directly evaporated about 700,000 liters of freshwater for cooling at Microsoft’s data centers in the US. Including the water consumed during electricity generation (Scope-2), the total rises to 5.4 million liters. Those are big numbers and they are worth knowing.
But context matters. Mekonnen and Hoekstra, in a 2012 study published in Ecosystems [3], found that producing one kilogram of beef requires approximately 15,400 liters of water. A single quarter-pound hamburger patty accounts for about 1,740 liters. That means the entire water cost of training GPT-3, the model that launched the current AI era, equals the water footprint of roughly 400 hamburgers. If you include Scope-2 water, you get to about 3,100 hamburgers. The United States alone eats roughly 12 billion hamburgers per year.
At the per-query level, 0.3 mL of water per AI prompt means you need 5.8 million queries to equal one hamburger. You would need about 270,000 queries to match a ten-minute shower. A single cup of coffee, when you account for growing and processing the beans, has a water footprint of roughly 140 liters, which equals about 467,000 AI queries.
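The water ratios work the same way. A minimal sketch, using 0.3 mL per query and the footprint figures cited above (the shower volume is an assumption of roughly 8 L per minute):

```python
ML_PER_QUERY = 0.3  # water per AI text query, in milliliters

# Water footprints in liters (figures cited in the text).
hamburger_l = 1740  # quarter-pound beef patty (Mekonnen & Hoekstra)
coffee_l = 140      # one cup, including growing and processing
shower_l = 80       # ten-minute shower, assuming ~8 L/min

def queries_equivalent(liters: float) -> int:
    """How many AI queries consume the same volume of water."""
    return round(liters * 1000 / ML_PER_QUERY)

print(queries_equivalent(hamburger_l))  # 5,800,000
print(queries_equivalent(coffee_l))     # ~467,000
print(queries_equivalent(shower_l))     # ~267,000
```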
I do not know anyone who asks an AI chatbot 5.8 million questions in a year, or even one percent of that. I do know plenty of people who eat many hamburgers a year.
An important caveat: not all water use is directly comparable. When and where water is drawn matters, and much of agriculture’s water footprint is rainwater. Data center cooling, by contrast, typically draws from municipal freshwater supplies in specific locations, which is why the regional impact can be disproportionate to the volume.
Regional water stress is a real issue. In The Dalles, Oregon, Google’s data centers now consume about a third of the city’s water supply, up from 12% in 2012. That is a legitimate local concern, especially in areas already facing drought. But even here, perspective helps: golf courses in Maricopa County, Arizona use 110 billion liters of water per year. Data centers in the same county are projected to use 3.4 billion liters in 2025. The golf courses use 32 times more.
It is absurd that new data centers are still being built in regions that already have low rainfall or depleted groundwater, and then rely on evaporative cooling anyway. Even so, the contrast between these figures is immense, and sadly, water is not even AI’s biggest environmental problem.
The macro picture: a rounding error next to livestock and petrochemicals
Zooming out from individual queries to industry-level numbers makes the disparity even starker. The IEA’s April 2025 report Energy and AI [7], the most comprehensive analysis available, puts global data center electricity at 415 TWh in 2024. That is about 1.5% of global electricity. In the US it is higher, around 4.4%.
Now compare that to the industries most people do not post outraged threads about.
The FAO’s GLEAM 3.0 model [6] puts the global livestock sector at 6.2 gigatons of CO₂-equivalent per year, about 12% of all anthropogenic greenhouse gases. An earlier FAO assessment using different methodology estimated 7.1 gigatons, or 14.5% [14]. Based on the IEA’s electricity figures and average global grid carbon intensity, all global data centers combined emitted roughly 180 megatons of CO₂ in 2024. That means animal agriculture emits about 35 to 40 times more greenhouse gases than every data center on the planet, including the ones running your email server, your streaming, your banking, and your AI.
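The 35-to-40x figure is just the ratio of those two numbers; a minimal check, treating the two FAO estimates as a range, lands at roughly 34x to 39x:

```python
# Annual greenhouse gas emissions in megatons of CO2-equivalent (figures from the text).
livestock_mt_low = 6200   # FAO GLEAM 3.0
livestock_mt_high = 7100  # earlier FAO assessment [14]
data_centers_mt = 180     # derived from IEA 2024 electricity figures

# Ratio of livestock emissions to all data centers worldwide.
print(f"{livestock_mt_low / data_centers_mt:.0f}x to "
      f"{livestock_mt_high / data_centers_mt:.0f}x")
```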
Levi and Cullen, in a 2018 analysis published in Environmental Science & Technology [4], mapped the chemical and petrochemical sector as the world’s largest industrial energy consumer at approximately 42.5 EJ per year, or about 30% of total industrial energy use. Data centers at 1.5 EJ are roughly 1/28 of that. Even by 2030, when data centers are projected to roughly double to around 945 TWh, petrochemicals will still be about 12 times larger.
A single economy passenger on a New York to London flight generates around 300 to 500 kg of CO₂ one-way [12]. If you include radiative forcing effects from high-altitude emissions, it is 600 to 1,000 kg.
To make the point clear: one transatlantic return flight emits roughly as much CO₂ as a person using an AI chatbot every single day for somewhere between 30 and 90 years, depending on usage patterns and how generous you are with the flight estimates.
xychart-beta
x-axis ["Livestock", "Steel", "Petrochem.", "Aviation", "Shipping", "DCs", "AI"]
y-axis "Mt CO₂e / year" 0 --> 7000
bar [6650, 4400, 2500, 1050, 700, 180, 25]

So why does AI get all the attention?
Part of it is novelty. AI is new, growing fast, and associated with large tech companies that many people already distrust. Meat, cars, and flights are old, familiar, and personally convenient. It is always easier to be outraged about something you do not use than something you do.
Part of it is legitimate concern about the trajectory. Data center electricity demand is growing at roughly 15% annually, which is more than four times faster than overall electricity demand. Goldman Sachs, in a 2024 report on AI and data center power demand [13], projects a 165% increase by 2030. AI workloads currently represent about 14% of data center power, but that share is rising fast as traditional cloud computing grows more slowly. The IEA projects AI could drive 35 to 50% of data center energy by 2030. Under their base case, total data center consumption roughly doubles to 945 TWh, which would be comparable to Japan’s entire electricity use.
There is also a real question about how that electricity gets generated. Natural gas and coal together still supply over 40% of additional data center electricity through 2030 according to the IEA, which risks fossil fuel lock-in. Microsoft’s total emissions rose 23.4% between 2020 and 2024. Google’s went up 13% in 2023 alone. Both companies have pledged net-zero operations by 2030 while simultaneously building fossil-fuel-powered data centers. That contradiction deserves to be called out. But it is largely a business choice: cleaner power sources exist, and these companies can afford them.
Efficiency is improving faster than most people realize
One thing that often gets left out of the doom narrative is how quickly hardware and model efficiency are improving. Google reported a 33x reduction in energy usage per median Gemini text prompt over twelve months [9], though this largely reflects a one-time optimization leap from early deployment to production efficiency, not an annual improvement rate. NVIDIA’s B200 GPU delivers 25 times better inference energy efficiency than the H100, which itself was a massive improvement over the A100. Tech companies are the world’s largest corporate buyers of renewable energy, and renewables plus nuclear are projected to supply 60% of data center electricity by 2030.
Where to actually spend your outrage
If you care about the environment, and you should, the data gives you a clear priority list. Wynes and Nicholas, in a 2017 meta-analysis published in Environmental Research Letters [10], ranked individual climate actions by annual CO₂ savings in developed countries, and I wrote about this before in The Bill Is Coming Due. Going car-free saves about 2.4 tons per year. Avoiding one long-haul return flight saves 1.6 tons. Switching to a plant-based diet saves about 0.8 tons. Animal products deliver 18% of the world’s calories while generating 56% of food-related emissions [5].
Use AI or don’t; that is your choice, and more goes into that decision than the environmental footprint, of course. But abstaining for environmental reasons is, as of today, a minuscule drop in the bucket. Even a heavy user running 100 queries a day would add about 11 kWh per year, equivalent to running a clothes dryer for about four loads. If you really want to offset your AI usage, skip one hamburger a month and you will be in the green for a year.
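The heavy-user arithmetic is easy to verify under the same 0.3 Wh assumption (the dryer figure is a typical value of around 3 kWh per load, not a measurement):

```python
WH_PER_QUERY = 0.3        # energy per AI text query, in watt-hours
QUERIES_PER_DAY = 100     # a heavy user
DRYER_KWH_PER_LOAD = 3.0  # typical electric clothes dryer, assumed

annual_kwh = QUERIES_PER_DAY * WH_PER_QUERY * 365 / 1000
print(f"Heavy user: ~{annual_kwh:.0f} kWh/year")  # ~11 kWh
print(f"Dryer-load equivalent: ~{annual_kwh / DRYER_KWH_PER_LOAD:.0f} loads")  # ~4 loads
```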
AI’s environmental footprint is real, growing, and deserves transparent reporting, accountability to climate commitments, and thoughtful infrastructure planning. Those are reasonable demands. But the narrative that AI is an environmental catastrophe, pushed by people who apparently do not apply the same scrutiny to their diet, their car, or their travel plans, is not environmentalism. It is looking for a convenient target for outrage.
The planet has real, enormous problems: fossil fuels, animal agriculture, aviation, petrochemicals. Those are the big line items. AI is, for now, a fraction of a percent of global emissions, powered increasingly by renewables, and improving in efficiency faster than almost any other technology.
Worry about AI’s environmental impact, by all means. Just take the more meaningful steps first.