Big Tech is racing to build large AI data centers. What is the environmental impact?

by Andrea

Last week, as a summer heat wave blanketed New Jersey, it seemed like the perfect moment to come across a new and worrying Accenture forecast: carbon emissions from AI data centers are on track to increase 11-fold by 2030.

The report estimates that over the next five years, AI data centers could consume 612 terawatt-hours of electricity, roughly equivalent to Canada's total annual energy consumption, driving a 3.4% increase in global carbon emissions.

And the pressure does not stop at the power grid. At a time when freshwater resources are already under severe strain, AI data centers are projected to consume more than 3 billion cubic meters of water per year, a volume that exceeds the annual freshwater withdrawals of entire countries such as Norway or Sweden.


Not surprisingly, the report, titled Powering Sustainable AI, offers recommendations to contain the problem and keep these numbers from becoming reality. But with almost daily headlines about Big Tech's massive AI data center build-outs in the US and around the world, it is hard not to be skeptical. The urgency of the AI race against China does not seem to leave much space, or time, for thinking seriously about sustainability.

Race against China

This week, OpenAI agreed to rent an enormous amount of computing capacity from Oracle's data centers as part of its Stargate initiative, which aims to invest $500 billion over the next four years to build AI infrastructure for OpenAI in the United States.

Oracle's additional capacity totals about 4.5 gigawatts of data center power in the US, according to a report by Bloomberg. A gigawatt is roughly the capacity of a nuclear reactor and can supply electricity to about 750,000 homes.


Meta, meanwhile, is reportedly seeking to raise $29 billion from private capital firms to build AI data centers in the US, while already constructing a $10 billion center in northeastern Louisiana. As part of that deal, the local utility, Entergy, will provide three new power plants.

Meta CEO Mark Zuckerberg has made his intentions clear: the US should rapidly expand the construction of AI data centers or risk falling behind China in the race for AI dominance. Speaking on the Dwarkesh Podcast in May, he warned that the American advantage in artificial intelligence could erode unless the country keeps pace with China's aggressive expansion of data centers and hardware factories.

"The US really needs to focus on speeding up data centers and energy production," said Zuckerberg. "Otherwise, we will be at a significant disadvantage."


The US government seems aligned with this sense of urgency. David Sacks, currently the White House AI and crypto czar, has also argued that expanding energy and data centers is central to American AI strategy, leaving little room for sustainability concerns.

On the All-In podcast in February, Sacks argued that Washington's "slow" approach to AI could stifle the industry. He emphasized that the US needs to clear the way for infrastructure and energy development, including AI data centers, in order to keep pace with China.

In late May, he went further, saying that speeding up permits and expanding power generation are essential to AI's future, something he said was "effectively impossible under the Biden administration." His message: the US needs to accelerate construction.


And what about environmental responsibility?

Meanwhile, Accenture is encouraging its clients to build and design their AI data centers responsibly, balancing growth with environmental responsibility.

The company proposes a new metric, the Sustainable AI Quotient (SAIQ), to measure the true cost of AI in terms of money invested, megawatt-hours of energy consumed, tons of CO₂ emitted, and cubic meters of water used.

The report says the metric will help organizations answer a basic question: "What are we actually getting from the resources we invest in AI?" It will also allow companies to track their performance over time.

Matthew Robinson, managing director of Accenture Research and co-author of the report, emphasized that he hopes Accenture's worrying forecasts turn out to be wrong. "They kind of take off," he said, explaining that Accenture modeled future energy consumption from the expected number of installed AI chips, adjusted for utilization and the additional energy requirements of data centers.

That data was combined with regional information on electricity generation, energy mix, and emissions, while water use was estimated from the energy consumption of AI data centers and the amount of water consumed per unit of electricity generated.
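Accenture's exact model is not public, but the bottom-up approach described above can be illustrated with a rough sketch. Every input figure below is a hypothetical placeholder, not a number from the report:

```python
# Hypothetical bottom-up estimate of an AI data center fleet's footprint,
# in the spirit of the methodology described above. All inputs are
# illustrative placeholders, not Accenture's figures.

chips = 10_000_000       # installed AI accelerators
chip_kw = 0.7            # average power draw per chip (kW)
utilization = 0.6        # fraction of time chips are busy
pue_overhead = 1.2       # data center overhead factor (cooling, etc.)
hours_per_year = 8760

# Electricity: chip power x utilization x overhead x hours, in TWh
energy_twh = chips * chip_kw * utilization * pue_overhead * hours_per_year / 1e9

grid_kgco2_per_kwh = 0.4  # regional grid emissions intensity
emissions_mt = energy_twh * 1e9 * grid_kgco2_per_kwh / 1e9  # megatonnes CO2

water_l_per_kwh = 1.8     # litres of water consumed per kWh generated
water_bcm = energy_twh * 1e9 * water_l_per_kwh / 1e12  # billion cubic meters

print(f"{energy_twh:.1f} TWh, {emissions_mt:.1f} Mt CO2, {water_bcm:.3f} bn m^3")
```

Scaling the chip count, utilization, and grid intensity up or down is what drives the wide range of possible outcomes the report warns about.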

"The point really is to open up the conversation about the actions available to avoid this path; we don't want to be right here," he said. He did not comment on the specific actions of companies such as OpenAI or Meta, but said that, in general, clearly more effort is needed to curb rising carbon emissions driven by AI data centers while still allowing growth.

Optimization makes sense

Accenture's recommendations make sense: optimize the energy efficiency of AI workloads and data centers with everything from low-carbon energy options to cooling innovations. Use AI deliberately, choosing smaller AI models and better pricing models that encourage efficiency. And ensure better governance of AI sustainability initiatives.

It's hard to imagine that the biggest players in the AI race, the Big Tech giants and heavily funded startups, will hit the brakes long enough to seriously address these growing concerns. Not that it is impossible. Take Google, for example: in its latest sustainability report, released this week, the company revealed that its data centers are consuming more energy than ever. In 2024, Google used approximately 32.1 million megawatt-hours (MWh) of electricity, with an impressive 95.8%, about 30.8 million MWh, consumed by its data centers. That is more than double the energy its data centers used in 2020, just before the consumer AI boom.

Still, Google emphasized that it is making significant progress in cleaning up its energy mix even as demand soars. The company said it reduced its data center energy emissions by 12% in 2024, thanks to clean energy projects and efficiency improvements. And it is extracting more from each watt.

Google reported that the amount of computing per unit of electricity has increased roughly sixfold over the last five years. Its power usage effectiveness (PUE), a key data center efficiency metric, is now approaching the theoretical minimum of 1.0, with a reported PUE of 1.09 for 2024.
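PUE is simply total facility energy divided by the energy delivered to the IT equipment itself; a PUE of 1.0 would mean every watt goes to compute, with nothing lost to cooling or power conversion. A minimal illustration (the energy split below is hypothetical, not Google's actual breakdown):

```python
# Power Usage Effectiveness: total facility energy / IT equipment energy.
# 1.0 is the theoretical floor; the excess over 1.0 is cooling, power
# conversion, lighting, and other overhead.

def pue(total_facility_mwh: float, it_equipment_mwh: float) -> float:
    return total_facility_mwh / it_equipment_mwh

# Illustrative: a facility drawing 1.09 MWh overall for every
# 1.00 MWh of IT load has a PUE of 1.09.
print(round(pue(1.09, 1.00), 2))  # 1.09
```

At a reported PUE of 1.09, only about 8% of Google's data center electricity goes to anything other than the computing hardware itself.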

"Speaking personally, I would be optimistic," said Robinson.

This article was originally published on Fortune.com

2025 Fortune Media IP Limited
