As artificial intelligence transforms our digital landscape, its environmental impact has become impossible to ignore. According to the International Energy Agency (IEA), a typical AI data centre already uses as much power as 100,000 households, and the largest centres now under construction will consume 20 times that amount. With numbers this staggering, the question is not whether we should use AI but how we can use it responsibly.
The scale of the challenge
The environmental cost of our AI revolution is substantial. A single query to an AI-powered chatbot can use up to ten times as much energy as a traditional Google search, and a generative AI system may use 33 times more energy to complete a task than task-specific software would. Major tech companies are already feeling the impact: Google’s emissions in 2023 were almost 50% higher than in 2019, largely due to the energy demands of its data centres.
Looking ahead, the numbers become even more daunting. Electricity demand from data centres worldwide is set to more than double by 2030 to around 945 terawatt-hours (TWh), slightly more than the entire electricity consumption of Japan today. Yet this challenge also presents an opportunity to innovate toward sustainability.
Practical steps for individuals
While the largest impact comes from infrastructure changes, individual users can still make meaningful choices:
Choose your battles: Be strategic about when you use AI tools. Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise. Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram. You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike.
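The scenario above makes a useful back-of-the-envelope budget. The per-task figures below are rough illustrative assumptions rather than measured values, but they show where the energy actually goes:

```python
# Back-of-the-envelope energy budget for the fundraiser scenario.
# Per-task figures are illustrative assumptions, not measurements.
KWH_PER_QUESTION = 0.002   # one chatbot question
KWH_PER_IMAGE = 0.0012     # one generated image
KWH_PER_VIDEO = 0.94       # one five-second generated video

total_kwh = 15 * KWH_PER_QUESTION + 10 * KWH_PER_IMAGE + 3 * KWH_PER_VIDEO
print(f"about {total_kwh:.1f} kWh")  # about 2.9 kWh
```

Even with different per-task estimates, the lesson holds: a handful of video generations can outweigh dozens of text queries.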
Optimize your prompts: Craft clear, specific prompts to reduce the need for multiple attempts. As one organization found when it audited its AI usage, “most of our consumption was not in the training of the model, it was actually in the prompting.” Better prompts mean fewer iterations and less wasted energy.
Consider alternatives: For simple tasks that don’t require AI’s sophisticated capabilities, stick to traditional tools. Reserve AI for complex problems where its unique capabilities provide genuine value.
Technical solutions for organizations
Power capping and optimization: Research shows immediate opportunities for energy reduction. “We studied the effects of capping power and found that we could reduce energy consumption by about 12 to 15%, depending on the model,” says Siddharth Samsi, a researcher at the Lincoln Laboratory Supercomputing Center (LLSC). The trade-off is increased task time: GPUs take about 3% longer to complete a task, an increase Gadepally says is “barely noticeable” given that models are often trained over days or even months.
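Since energy is power multiplied by runtime, the reported trade-off is easy to sanity-check. Only the 15% power reduction and 3% slowdown come from the text; the wattage and runtime below are invented for illustration:

```python
# Energy = average power x runtime. Sanity check of the power-capping
# trade-off; the 300 W draw and 100 h runtime are illustrative values.
baseline_power_w = 300.0
baseline_hours = 100.0

capped_power_w = baseline_power_w * 0.85  # power capped by 15%
capped_hours = baseline_hours * 1.03      # task runs about 3% longer

baseline_kwh = baseline_power_w * baseline_hours / 1000
capped_kwh = capped_power_w * capped_hours / 1000
savings = 1 - capped_kwh / baseline_kwh
print(f"net energy saved: {savings:.1%}")  # about 12%, within the quoted range
```

The slowdown eats into the savings only slightly, which is why the cap pays off.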
Smart scheduling: Timing matters. Shifting AI workloads to times of lower energy demand, such as running shorter tasks overnight or scheduling larger projects for the cooler months in places where air-conditioning use is widespread, can also yield substantial energy savings.
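As a sketch, a scheduler only needs to check whether the current time falls in an assumed low-demand window and defer the job otherwise. The 10 p.m. to 6 a.m. window here is an assumption; real deployments would consult grid demand or carbon-intensity data:

```python
# Minimal off-peak scheduling sketch. The overnight window is an
# assumed low-demand period, not a universal rule.
from datetime import datetime

OFF_PEAK_START, OFF_PEAK_END = 22, 6  # 10 p.m. to 6 a.m.

def next_off_peak_slot(now: datetime) -> datetime:
    """Earliest time at or after `now` that falls in the off-peak window."""
    if now.hour >= OFF_PEAK_START or now.hour < OFF_PEAK_END:
        return now  # already off-peak: run immediately
    return now.replace(hour=OFF_PEAK_START, minute=0, second=0, microsecond=0)

print(next_off_peak_slot(datetime(2025, 1, 1, 14, 30)))  # deferred to 22:00
```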
Model selection and training efficiency: Organizations can dramatically reduce energy consumption by making smarter training choices. For a drug discovery application, Gadepally’s team chose to forgo the usual routine of training thousands of models to completion, since most of those models would ultimately never be used. Instead, the team built a training speed estimation tool that tracks the loss curve of models in training, allowing them to predict end-state accuracy after only 20% of a computation is complete. “That allows us to quickly shave off about 80% of the compute, with no impact to the end model,” he says.
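The idea can be sketched as a fit-and-extrapolate step. Nothing below is the LLSC tool itself; it simply assumes loss follows a rough power law in training steps, fits the first 20% of the curve, and projects the final value:

```python
# Illustrative sketch of early loss-curve extrapolation (not the actual
# LLSC tool): fit log(loss) vs. log(step) on the first 20% of training,
# then project the loss at the final step.
import math

def predict_final_loss(losses, total_steps):
    """Least-squares fit of log(loss) = a + b*log(step), then extrapolate."""
    xs = [math.log(step + 1) for step in range(len(losses))]
    ys = [math.log(loss) for loss in losses]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return math.exp(a + b * math.log(total_steps))

# Synthetic run: true loss is 5 / sqrt(step); observe only the first 20%.
observed = [5.0 / math.sqrt(step + 1) for step in range(20)]
print(round(predict_final_loss(observed, 100), 3))  # 0.5, the true final loss
```

Runs whose projected accuracy looks poor can then be stopped early, which is where the reported compute savings come from.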
The promise of emerging technologies
On-device processing: Perhaps the most promising development is the shift toward on-device AI. AI chips designed for on-device processing prioritize energy efficiency over sheer computing power, resulting in a 100 to 1,000-fold reduction in energy consumption per AI task compared to cloud-based AI. This approach eliminates energy-intensive data transmission while providing faster, more private computing.
Hardware innovation: New chip designs offer hope for dramatic efficiency improvements. “We’ve known for decades that analog compute can be much more efficient—orders of magnitude more efficient—than digital,” says Verma. Companies are developing specialized hardware that performs AI computations more efficiently than traditional processors.
Parallel computing and small models: By splitting a task into pieces and running them at the same time, parallel computing can generate results more quickly, and it can save energy by making more efficient use of available hardware. Interest is also growing in small models: versions of large language models that have been distilled into compact packages and can run on far less powerful hardware.
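A toy example of the first idea, splitting one computation across worker processes so it finishes sooner on the same hardware; the workload and chunk boundaries are arbitrary illustrations:

```python
# Toy parallel-computing sketch: the same sum, split across processes.
# The workload and chunk boundaries are arbitrary illustrations.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

CHUNKS = [(0, 250_000), (250_000, 500_000),
          (500_000, 750_000), (750_000, 1_000_000)]

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(partial_sum, CHUNKS))
    print(total)  # identical to the serial answer, computed in parallel
```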
Policy and industry initiatives
The regulatory landscape is evolving to address AI’s environmental impact. More than 190 countries in the UN system have adopted the UNESCO Recommendation on the Ethics of Artificial Intelligence, which addresses AI’s ethical application, including its environmental impact. The European Union has passed the AI Act, a broad legislative framework whose provisions include transparency around the energy consumption of AI systems.
Some experts propose market-based solutions. A global “energy credit trading system” could provide financial incentives for companies that adopt low-power AI solutions. Under this system, businesses implementing energy-saving AI could trade energy usage credits, financially benefiting while reducing their environmental footprint.
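In toy form, such a system reduces to giving each firm an energy allowance and letting the difference trade. The allowance and credit price below are invented purely for illustration, not part of any real scheme:

```python
# Toy sketch of the proposed energy-credit trading idea. The allowance
# and credit price are invented numbers for illustration only.
ALLOWANCE_MWH = 100.0       # assumed per-firm energy allowance
PRICE_PER_CREDIT = 50.0     # assumed price per MWh credit

def credit_balance(used_mwh: float) -> float:
    """Positive: credits to sell. Negative: credits that must be bought."""
    return ALLOWANCE_MWH - used_mwh

efficient = credit_balance(80.0)     # low-power AI: 20 credits to sell
inefficient = credit_balance(130.0)  # over allowance: must buy 30 credits
print(efficient * PRICE_PER_CREDIT, inefficient * PRICE_PER_CREDIT)
```

The financial incentive is the point: under-consumers are paid by over-consumers, rewarding low-power AI adoption.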
Looking forward: balance and responsibility
The path forward requires nuanced thinking about trade-offs. Some reports estimate that AI could help mitigate 5–10% of global greenhouse gas emissions by 2030. If AI improves energy efficiency across the broader economy, even at just one-tenth of the rate of its adoption, the net effect could be energy-neutral or even slightly positive.
The key is ensuring that AI’s energy consumption aligns with genuine value creation. “If I’m using significant AI computing power to solve a hard problem—say, removing millions of tons of carbon from the atmosphere—is that worth it? I’d argue yes. It’s a cost-benefit analysis.”
As we navigate this transformation, transparency becomes crucial. Someday, Dodge says, an AI might be able, or even be legally obligated, to inform users about the water and carbon impact of each distinct request they make. “That would be a fantastic tool that would help the environment.”
The future of AI needn’t be one of exponentially growing energy consumption. Through thoughtful design, policy intervention, and conscious usage, we can harness AI’s transformative potential while keeping its environmental footprint manageable. The choices we make today—from the prompts we craft to the infrastructure we build—will determine whether AI becomes a catalyst for sustainability or a barrier to it.
