
AI Could Soon Offset Its Own Environmental Impact By Improving Energy Efficiency, Report Finds

Concerns around the environmental cost of building and running artificial intelligence (AI) are growing. A new study takes the conversation in a different direction, asking whether the technology will eventually save as much energy as it consumes.
An open laptop displays the ChatGPT homepage, which reads, "What can I help with?" (Image: Aerps.com/Unsplash)

There’s growing tension between the innovation artificial intelligence (AI) promises and the potential damage it can cause. A new study by PwC, in collaboration with Microsoft and Oxford University, offers a more hopeful angle on what may seem like impending environmental doom. The central question: Can AI save as much energy as it consumes?

While AI can play an impactful role in climate action — forecasting climate trends, detecting disasters early, and optimizing energy use via smart grids and automation — there are growing concerns around the environmental cost of building and running these powerful systems, which rely heavily on vast data centers that require enormous amounts of electricity.

Data center electricity consumption is set to more than double to around 945 terawatt-hours by 2030, according to the International Energy Agency. That is slightly more than Japan’s total electricity consumption today, and AI is the most important driver of this growth, alongside growing demand for other digital services.

Each ChatGPT query emits over 4 grams of carbon dioxide, around 10 to 20 times more than a Google search, according to the sustainability reporting startup Plan Be Eco. Multiply that by over 400 million daily users of the platform, and the tech’s carbon footprint starts to look like that of small nations. 
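For a rough sense of that scale, here is a back-of-envelope calculation using the figures above, assuming (purely for illustration) one query per user per day:

    # Back-of-envelope estimate from the figures cited above.
    # The one-query-per-user-per-day assumption is illustrative only.
    GRAMS_CO2_PER_QUERY = 4              # Plan Be Eco estimate
    DAILY_USERS = 400_000_000            # cited user figure
    QUERIES_PER_USER_PER_DAY = 1         # hypothetical simplification

    grams_per_day = GRAMS_CO2_PER_QUERY * DAILY_USERS * QUERIES_PER_USER_PER_DAY
    tonnes_per_day = grams_per_day / 1_000_000
    tonnes_per_year = tonnes_per_day * 365

    print(f"~{tonnes_per_day:,.0f} tonnes of CO2 per day")    # ~1,600
    print(f"~{tonnes_per_year:,.0f} tonnes of CO2 per year")  # ~584,000

Even under that conservative assumption, the annual total lands in the hundreds of thousands of tonnes, on the order of the yearly emissions of some of the world's smallest nations.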

“I wouldn’t use the worry to paralyze,” said Sammy Lakshmanan, a PwC sustainability partner and an author of the report. “As we're not really accounting for the efficiency gains that we will see with AI empowering AI. We absolutely should be thoughtful. But we’re already learning how to deploy AI more sustainably — and use it to reduce emissions elsewhere.”

It comes down to how we evaluate our consumption, Lakshmanan said. “We're accounting for the carbon intensity of AI developing a code, but we're not counting for the carbon intensity of a developer spending the next week writing that same code, right?”

To measure AI's energy consumption more effectively, PwC developed a simulation that models two major impacts of its adoption: the rise in energy use from the expansion of data centers, and the corresponding drop in energy demand in other sectors due to AI-driven efficiency.

The PwC study found that if AI use keeps growing at the current pace, data centers could use 13 to 16 percent more energy by 2035 compared to a future where use stays at today’s levels. Between 2024 and 2035, total energy use in data centers could be 18 to 21 percent higher because of AI.

If AI improves energy efficiency across the broader economy, even by just one-tenth of the rate of its adoption, the net effect could be energy-neutral or even slightly positive, according to the study. How is that possible? 

“I don't think it's a net for net, but we can definitely use AI to have a wider sustainability impact, as it can significantly reduce emissions in other places, not just carbon,” Lakshmanan said. “It can help us reduce the impact on water resources, nature, and packaging, for example.”
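To see how modest economy-wide gains could offset the extra data center demand described in the study, consider a toy calculation. This is not PwC's simulation; the share and percentage figures below are assumptions chosen only to show the mechanism, which is that a small efficiency gain on a very large base can outweigh a large increase on a small one.

    # Toy illustration, not PwC's model. All numbers are assumptions.
    total_energy = 100_000           # arbitrary units of final energy use
    datacenter_share = 0.015         # assume data centers are ~1.5% of the total

    datacenter_energy = total_energy * datacenter_share
    rest_of_economy = total_energy - datacenter_energy

    ai_growth = 0.15                 # ~15% more data center energy from AI
    efficiency_gain = 0.015          # one-tenth of that rate, applied economy-wide

    extra_demand = datacenter_energy * ai_growth
    savings_elsewhere = rest_of_economy * efficiency_gain

    print(f"extra data center demand: {extra_demand:.0f}")        # 225
    print(f"savings across the rest:  {savings_elsewhere:.0f}")   # 1478
    print(f"net change:               {extra_demand - savings_elsewhere:.0f}")

In this simplified setup the savings more than cover the added demand; the study's own framing is more cautious, putting the net effect at roughly energy-neutral to slightly positive.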

Lakshmanan pointed to three factors that can keep AI’s footprint in check. “One is teaching AI to be more thoughtful, more sustainable, and teaching our developers to be more thoughtful and sustainable,” he said. “Then, being more efficient with how we design our data centers by looking at things like nuclear and wind and solar. And finally, when we use AI, how is that reducing emissions in other sectors?”

Companies could collectively save $2 trillion annually by 2030 by leveraging existing digital energy tools, according to the World Economic Forum. These tools — including energy management, smart grids, and automation — can optimize energy use, reduce waste, and cut costs. For instance, AI can automatically adjust electric vehicle charging times, manage heating and cooling, and refine manufacturing schedules to trim both costs and emissions.
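As a concrete sketch of that kind of load shifting, the snippet below picks the lowest-carbon hours of the day to charge an electric vehicle. The hourly grid-intensity values and the four-hour charging window are made up for illustration; real tools pull live grid data and respect user constraints.

    # Sketch of carbon-aware load shifting. Hourly grid intensity values
    # (grams of CO2 per kWh) are hypothetical.
    hourly_intensity = [
        420, 410, 400, 390, 380, 370, 360, 350,   # overnight and early morning
        300, 250, 200, 180, 170, 180, 200, 250,   # midday solar dip
        320, 400, 450, 470, 460, 450, 440, 430,   # evening peak
    ]

    def pick_charging_hours(intensity, hours_needed):
        """Return the hours of the day with the lowest carbon intensity."""
        ranked = sorted(range(len(intensity)), key=lambda hour: intensity[hour])
        return sorted(ranked[:hours_needed])

    print(pick_charging_hours(hourly_intensity, hours_needed=4))   # [10, 11, 12, 13]

The same pattern, shifting flexible demand to cleaner or cheaper hours, underlies the heating, cooling, and manufacturing scheduling examples above.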

Cloud providers now offer carbon tracking tools, too. “The hyperscalers [big tech companies] have made it easier for businesses to see how much energy they are using when they run AI or cloud services,” Lakshmanan said. “This helps companies understand both the cost and the carbon emissions linked to their digital activity.”

Companies also need to match the models they build and use to the task at hand. Large language models like ChatGPT are powerful, but not always efficient.

"We learned that most of our consumption was not in the training of the model, it was actually in the prompting,” Lakshmanan said. “Now we’ve learned how to teach our AI to disregard bad prompts, to not waste energy on low-value tasks. That’s part of the evolution.”

So, what does sustainable AI look like five years from now? “I don’t think we’ll even use the term ‘sustainable AI,’” Lakshmanan said. “Because AI will, by definition, be sustainable. It will be the norm, not the exception.”

In this vision, AI models will be smarter about how they use energy, just as search engines today guide users away from nonsense queries to save computing power. Large-scale models will improve over time, reducing the need for redundant training. And both large enterprises and agile startups will find ways to scale it sustainably.

“Smaller companies may actually find it easier,” Lakshmanan said. “As they can adopt sustainable practices faster and leapfrog the competition.”

In the attempt to balance innovation with governance, the conversation needs to broaden from energy consumption alone to include the benefits the technology can deliver. “Of course, it's a tough balance,” Lakshmanan said. “Governance shouldn't stifle innovation, but innovation also needs guardrails.”

It all boils down to the value exchange, Lakshmanan added. “If I'm using significant AI computing power to solve a hard problem — say, removing millions of tons of carbon from the atmosphere — is that worth it? I'd argue yes. It's a cost-benefit analysis.”  

Ultimately, the technology’s environmental story isn’t written yet. 

“We're still in an exploratory phase with AI, but enterprises are starting to focus on the value case, including sustainability,” Lakshmanan said. “The challenge is, not every organization includes sustainability in their value metrics. I believe it should be a core part of performance measurement, not something off to the side. And while we're seeing progress, not all companies are evaluating sustainability with the same rigor as financial performance — at least, not yet.”


Abha Malpani Naismith is a writer and communications professional who helps businesses grow in Dubai. She is a strong believer in the triple bottom line and keen to make a difference. She is also a new mum, working out the balance between thriving at work and being a mum. In that endeavor, she founded the Working Mums Club, a newsletter for mums who want to build better careers and be better mums.

Read more stories by Abha Malpani Naismith