Managing AI’s Environmental Footprint: 6 Practical Ways to Reduce Energy, Water, and Carbon Costs

[Chart: AI's growing carbon and energy impact on the environment]

The Reality Check: AI’s footprint is real

During the Q&A of my last three talks on artificial intelligence, one question kept surfacing: AI's environmental impact. Depending on which headlines you read, AI is either an ecological miracle or a planet-melter. The reality lies in between: AI's energy consumption, carbon emissions, and water usage are rising quickly and unevenly. But there are proven, practical levers to reduce AI's carbon footprint and build more sustainable AI systems without resisting its inevitable adoption.

The Hidden Cost of Every Prompt

Two things can be true at once: the per-use footprint of a prompt can be small, and the system-wide footprint can still surge. Google now estimates a median Gemini text prompt consumes roughly 0.24 Wh of electricity (about nine seconds of watching Netflix) and about five drops of water. That’s the micro view. The macro view: data-center electricity demand tied to AI is on a meteoric rise and will keep climbing.

Multiple U.S. utilities are planning big grid investments and, in some cases, proposing higher rates as data-center load explodes. Meanwhile, Google's total emissions have risen since 2019 even as it invests in clean energy, and Microsoft reports overall emissions up versus its 2020 baseline as it builds out AI capacity. This isn't apocalyptic, but it does matter for your costs, compliance exposure, and social license to operate.

The environmental impact doesn't stop at electricity and water usage. Here is where it bridges into the more tangible world we live in: over the last 36 years, the US has invested $300 billion (adjusted for inflation) into building out its interstate highway infrastructure. In the coming year, US companies are expected to spend $600 billion building out data centers to support the predicted compute demand of AI. Let that one sink in.

Inverted Quarantines

I’ve had more than a few conversations about tech's impact on climate change with my uncle, Andrew Szasz, an environmental sociologist at UC Santa Cruz and author of Shopping Our Way to Safety. One of his many big ideas is called “inverted quarantine,” where we often respond to systemic problems with private, feel-good fixes that leave the root issue untouched. In late 2020, when cryptocurrency started to run, I tried to make the case that blockchains could lower humanity's carbon footprint through smart contracts. He wasn’t buying it. And while a single Bitcoin transaction may use the energy of tens of thousands of AI prompts, the reality is that we’ll soon be running billions of prompts for every one Bitcoin transaction, so the aggregate footprint of AI could overtake blockchain much faster than people expect.

Some optimists argue that if we ever achieve superintelligence (AGI), it could end up solving the very problems it’s now worsening. A system far smarter than us might unlock new clean energy sources, optimize grids beyond human capability, or even invent technologies that break today’s carbon bottlenecks. But here’s where my uncle’s “inverted quarantine” framework is useful: placing hope in a future superintelligence can become just another private escape hatch, a way to outsource responsibility instead of tackling systemic reform now. In other words, betting on AGI to fix climate change risks becoming the ultimate inverted quarantine. It's a comforting story that delays the harder work of building sustainable systems today.

Six Practical Levers to Cut AI’s Footprint

1) Right-size by default

Make “small-first” the norm in your product and platform guardrails: route routine Q&A, summarization, and form-filling to compact, efficient models. In other words, the right tool for the job. Escalate to larger, heavier models only when the quality lift justifies the energy and latency tradeoffs, such as when accuracy really matters.

A recent UNESCO-backed analysis shows that simple changes, like shorter outputs and smaller models, can cut energy use by double digits, and in some cases by up to 90 percent. AWS’s GenAI Lens says the same in cloud-architecture terms: distill, prune, quantize, and select the smallest acceptable model. For everyday users, that means choosing the smaller, faster AI option when it gets the job done, like picking HD instead of 4K when you are just watching the news.
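To make the "small-first" idea concrete, here is a minimal routing sketch. The model names, the task list, and the escalation rule are illustrative assumptions, not any specific vendor's API; a real router would also consider measured quality on your own evaluation data.

```python
# Small-first model routing: default to the compact model, escalate
# only when the task genuinely demands it. All names are hypothetical.

ROUTINE_TASKS = {"summarize", "classify", "extract", "qa"}

def pick_model(task: str, input_tokens: int, needs_high_accuracy: bool) -> str:
    """Return the smallest model that is likely good enough."""
    if needs_high_accuracy:
        return "large-model"   # escalate: quality lift justifies the cost
    if task in ROUTINE_TASKS and input_tokens < 2000:
        return "small-model"   # compact, efficient default
    return "medium-model"      # middle ground for everything else

print(pick_model("summarize", 800, needs_high_accuracy=False))  # small-model
```

The point of the guardrail is that the cheap path is the default and the expensive path requires an explicit reason, rather than the other way around.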

2) Put efficiency targets next to quality targets

Set a default limit on how long responses should be for everyday tasks, and let people choose more detail only when they really need it. Shorter answers aren’t just quicker; they also save energy. Show teams the balance between quality, cost, and energy so they can see the tradeoffs in real time. Every extra token has a price tag in time, money, and carbon.
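One lightweight way to implement this is a default output budget with an explicit opt-in for longer answers. The budget values and request shape below are assumptions for illustration, not a real provider's schema.

```python
# Default output budget with opt-in detail. Token limits are illustrative.

DEFAULT_MAX_TOKENS = 256    # concise answers by default
DETAILED_MAX_TOKENS = 1024  # users explicitly ask for more depth

def build_request(prompt: str, detailed: bool = False) -> dict:
    """Assemble a generation request with the appropriate token budget."""
    return {
        "prompt": prompt,
        "max_tokens": DETAILED_MAX_TOKENS if detailed else DEFAULT_MAX_TOKENS,
    }
```

Because the budget lives next to the request, it is easy to log tokens per answer alongside quality scores and show teams the tradeoff curve described above.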

3) Schedule and locate compute where it’s cleaner

Most AI tasks don’t need split-second speed. You can schedule flexible jobs to run when the grid is cleaner or in regions with greener energy. Big cloud providers already do this for their own systems, and you can ask them (or copy them) to cut your footprint too.
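A toy version of carbon-aware placement looks like this: among the regions where a flexible batch job could run, pick the one whose grid is cleanest right now. The intensity numbers are made-up placeholders; in practice you would pull live grid-carbon data from your cloud provider or a grid-data service.

```python
# Carbon-aware region selection for flexible (non-urgent) jobs.
# Intensities are in gCO2 per kWh and are illustrative placeholders.

def pick_region(intensity_by_region: dict) -> str:
    """Return the candidate region with the lowest carbon intensity."""
    return min(intensity_by_region, key=intensity_by_region.get)

regions = {"us-east": 410.0, "eu-north": 35.0, "us-west": 220.0}
print(pick_region(regions))  # eu-north
```

The same idea works in time instead of space: delay a flexible job a few hours until the local grid's intensity dips.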

4) Track more than carbon

Ask your AI vendors for three basics: PUE (power usage effectiveness), WUE (water usage effectiveness), and hourly carbon intensity by region. Put those in contracts or scorecards. Microsoft and Google both publish water and carbon data and are rolling out designs (like direct-to-chip cooling and zero-water cooling) that can dramatically cut water withdrawals in new builds. If you run your own servers, track how much water they use as carefully as you track electricity. The more collective voices there are, the louder they become.
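For readers who haven't met these metrics before, both have simple standard definitions: PUE is total facility energy divided by IT-equipment energy (1.0 is ideal), and WUE is water consumed per unit of IT energy. The sample numbers below are illustrative, not any vendor's reported figures.

```python
# Standard definitions of the two headline data-center metrics.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 means all power reaches IT gear."""
    return total_facility_kwh / it_equipment_kwh

def wue(water_liters: float, it_equipment_kwh: float) -> float:
    """Water Usage Effectiveness, in liters of water per IT kWh."""
    return water_liters / it_equipment_kwh

print(round(pue(1_200_000, 1_000_000), 2))  # 1.2
print(round(wue(180_000, 1_000_000), 2))    # 0.18 L/kWh
```

Putting these numbers in contracts or scorecards only works if you ask for them per region and per hour, since a vendor's global average can hide a very thirsty facility.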

5) Run AI closer to the user when you can

For simple jobs like sorting, speech recognition, or quick text, you don’t always need a giant cloud model. Running them on a phone, laptop, or nearby server can cut energy use dramatically. Think of it like turning the lights off when you leave a room. Like your recycling bin and smart thermostat, it might feel small or inconsequential compared to smoke billowing out of a factory. But at scale, it adds up. Start by testing it in areas where speed, privacy, or user experience make it a good fit.

6) Reuse before you re-run

Don’t make AI do the same heavy lifting twice. For everyday users, that means reusing good prompts, saved outputs, or shared templates instead of starting from scratch each time. For teams, it means caching common answers, sharing fine-tuned models, and building systems that cut down on duplicate requests. Every avoided rerun saves time, money, and carbon, and at scale those savings add up fast.
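The team-level version of "reuse before you re-run" is a response cache: identical (or trivially different) prompts are served from a local store instead of re-running the model. The sketch below is a minimal in-memory version; `call_model` is a stub standing in for a real inference call, and the whitespace/case normalization is a simplifying assumption.

```python
# Minimal prompt-response cache: only run the model on a cache miss.

import hashlib

_cache: dict = {}

def call_model(prompt: str) -> str:
    """Stub for an expensive model call."""
    return f"answer to: {prompt}"

def cached_answer(prompt: str) -> str:
    # Normalize so trivial variations ("Hello" vs "hello ") share a key.
    key = hashlib.sha256(prompt.strip().lower().encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_model(prompt)  # model runs only on a miss
    return _cache[key]

cached_answer("What is PUE?")    # model runs
cached_answer(" what is PUE? ")  # served from cache
```

Production systems usually add an expiry policy and, for open-ended prompts, semantic (embedding-based) matching rather than exact-string keys, but the carbon logic is the same: every avoided rerun is energy not spent.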

A balanced bottom line

AI isn’t a climate savior or a villain. Right now, it may look like the bad guy, but we're on chapter 1 of a very long story with many plot twists yet to come. It could still turn out to be the hero, but it’s just too early to tell.

However, AI is undoubtedly an amplifier. It can drive up grid demand and water use in the wrong places, but it can also optimize traffic lights, shave fuel on 500 million trips a month, and help grid operators match demand to cleaner supply. The smart move is to do both kinds of work: the “lights off” habits (right-sized models, shorter outputs, edge where it fits) and the structural moves (carbon-aware scheduling, vendor transparency, cleaner regions, and contracts that reward efficiency). That’s how you make real progress without relying on wishful thinking or empty gestures.

Personally, I remain optimistic that AI will eventually do more good than harm, for the environment and beyond. For a list of all my information sources, please see the post on our website.

Find your next edge,

Eli

Sources:

1. Measuring the environmental impact of AI inference — Google Cloud Blog

2. “Our approach to energy innovation and AI’s environmental footprint” — Google blog on efficiency gains

3. Measuring the environmental impact of delivering AI at Google scale

4. Making AI Less ‘Thirsty’: Uncovering and addressing the secret water footprint of AI models

5. Microsoft 2024 Environmental Sustainability Report (FY2023 baseline)

6. “Sustainable by design: Next-generation datacenters consume zero water” — Microsoft on zero-water cooling for AI datacenters

7. “Artificial intelligence technology behind ChatGPT was built in Iowa — with a lot of water” — AP News

8. How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference

9. Holistically Evaluating the Environmental Impact of Creating Language Models

10. Andrew Szasz — Shopping Our Way to Safety: How We Changed from Protecting the Environment to Protecting Ourselves (University of California Press, 2007)


Want help applying this to your product or strategy? We’re ready when you are → Let's get started.
