Sunday, February 25, 2024

Energy Demand in the AI Industry Threatens to Match the Netherlands


A recent study has warned that by 2027 the artificial intelligence industry could be consuming as much energy as an entire country, on roughly the scale of the Netherlands. The rapid addition of AI-powered services was triggered in part by the introduction of ChatGPT late last year.

One of the key drivers of this consumption is that AI applications are far more power-hungry than conventional software, making everyday online activities considerably more energy-intensive. The study does indicate that AI’s environmental impact could be less severe if its current growth slows, but these predictions are inherently speculative: tech giants have not been forthcoming with the data needed for precise assessments.

Apart from this, it’s undeniable that AI demands considerably more potent hardware than traditional computing tasks. The study, conducted by Alex De Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, rests on several assumptions holding steady: the growth rate of AI, the continued availability of AI chips, and servers running consistently at full capacity.

One of the main players in the AI hardware ecosystem, Nvidia, is estimated to supply roughly 95% of the AI processing equipment used by the industry. By estimating how many of these powerful servers are likely to be operational by 2027, De Vries predicted that AI could consume anywhere from 85 to 134 terawatt-hours (TWh) of electricity each year. At the upper end of that range, AI’s energy consumption would be roughly equivalent to the annual electricity usage of a small country such as the Netherlands.
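The arithmetic behind an estimate of this kind is simple to sketch. The constants below (about 1.5 million AI servers by 2027, each drawing 6.5–10.2 kW around the clock) are illustrative assumptions in the spirit of the study, not figures quoted from it:

```python
# Back-of-the-envelope estimate of annual AI electricity consumption.
# All constants are illustrative assumptions, not the study's exact inputs.

HOURS_PER_YEAR = 8760  # 24 hours * 365 days

def annual_twh(server_count: int, kw_per_server: float) -> float:
    """Annual consumption in terawatt-hours, assuming every server
    runs at the given power draw continuously."""
    kwh = server_count * kw_per_server * HOURS_PER_YEAR
    return kwh / 1e9  # 1 TWh = 1e9 kWh

servers = 1_500_000               # assumed AI servers in operation by 2027
low = annual_twh(servers, 6.5)    # modest per-server power draw
high = annual_twh(servers, 10.2)  # heavier per-server power draw

print(f"{low:.0f}-{high:.0f} TWh per year")  # prints "85-134 TWh per year"
```

Because the total scales linearly with both the server count and the per-server draw, small changes in either assumption move the estimate substantially, which is why the study's range is wide.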

Mr. De Vries, the study’s author, emphasized the importance of deploying AI only where it’s genuinely needed. His peer-reviewed study, which raises crucial concerns about the energy-intensive nature of AI, has been published in the journal Joule. In response to the findings, Nvidia declined to comment, leaving the industry and stakeholders contemplating the implications of AI’s voracious energy appetite.

Do you know how much energy and power AI consumes?

The research didn’t consider the energy needed to cool AI equipment, which accounts for a significant share of overall data-centre energy use. Many big tech companies don’t disclose how much energy their cooling consumes or how much water it requires, making it hard to gauge the full environmental impact of their AI systems. Researchers such as Mr. De Vries argue that tech companies should be more transparent about this.

What is clear is that demand for powerful AI computers is growing rapidly, and with it the energy needed not only to run them but also to keep them cool. Cooling is essential to stop these machines from overheating, which can cause failures and permanent damage.
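Data-centre engineers capture this cooling overhead with a standard metric, power usage effectiveness (PUE): total facility energy divided by the energy that actually reaches the IT equipment. The PUE values and IT load below are illustrative assumptions, not figures from the study:

```python
# Total facility energy from IT energy and PUE (power usage effectiveness).
# PUE = total facility energy / IT equipment energy, so (PUE - 1) is the
# fraction of IT energy spent again on cooling, power conversion, etc.
# All figures here are illustrative assumptions.

def total_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Facility-wide energy implied by an IT load and a PUE ratio."""
    return it_energy_kwh * pue

it_load = 1_000_000  # assumed annual IT load: 1 GWh

for pue in (1.1, 1.5, 2.0):  # efficient, typical, and poor facilities
    overhead = total_energy_kwh(it_load, pue) - it_load
    print(f"PUE {pue}: {overhead:,.0f} kWh of cooling and other overhead")
```

The same IT load can thus cost anywhere from 10% to 100% extra in cooling and infrastructure, which is why undisclosed cooling figures leave such a large gap in the public estimates.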

For example, DataVita, a data-centre operator in Scotland, has seen a huge increase in inquiries from businesses wanting to house AI equipment in its facility. At the beginning of 2023 it was receiving only a couple of inquiries a week; now it receives hundreds, a sign of the growing need for specialised facilities to support AI technology.

AI processors, the chips that power AI, draw far more energy and generate far more heat than regular computer servers, so they need more capable cooling to keep working well. Managing the energy and cooling needs of AI equipment is therefore becoming increasingly important as the industry grows.

Danny Quinn, DataVita’s managing director, explained:

“A standard rack full of normal kit is about 4kW of power, which is equivalent to a family house. Whereas an AI kit rack would be about 20 times that, so about 80kW of power. And you could have hundreds, if not thousands, of these within a single data centre.”

Mr. Quinn added that Scotland’s colder, wetter climate naturally helps data centres keep their equipment cool, but that cooling remains a significant challenge.
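Mr. Quinn’s rack figures scale up quickly. A hypothetical illustration, taking the quoted 4kW conventional rack and the roughly 20-times figure for an AI rack, with an assumed rack count:

```python
# How rack-level power adds up at data-centre scale.
# Per-rack figures follow the quote above (4 kW conventional, ~20x for AI);
# the rack count is a hypothetical assumption.

CONVENTIONAL_KW = 4.0
AI_KW = CONVENTIONAL_KW * 20  # roughly 80 kW per AI rack

def facility_mw(rack_count: int, kw_per_rack: float) -> float:
    """Total facility power in megawatts for a given number of racks."""
    return rack_count * kw_per_rack / 1000  # kW -> MW

racks = 500  # assumed racks in a single facility
print(f"{racks} conventional racks: {facility_mw(racks, CONVENTIONAL_KW):.0f} MW")
print(f"{racks} AI racks: {facility_mw(racks, AI_KW):.0f} MW")
```

Five hundred conventional racks draw about 2 MW; the same number of AI racks draw about 40 MW, the difference between a large office building and a small power station.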

In its latest sustainability report, Microsoft, a company making substantial investments in AI development, disclosed a 34% increase in its water consumption between 2021 and 2022, amounting to 6.4 million cubic meters, roughly equivalent to the volume of 2,500 Olympic swimming pools.

Prof. Kate Crawford, the author of a book on AI and its environmental consequences, emphasized that this issue causes her sleepless nights.

Speaking to the BBC in July, she said:

“These energy-intensive systems take enormous amounts of electricity and energy, but also enormous amounts of water to cool these gigantic AI supercomputers. So we are really looking at an enormous extractive industry for the 21st Century.”

Exposing the Real Costs of AI in the Tech World

AI offers potential solutions to some of the planet’s pressing environmental challenges. For instance, Google and American Airlines have demonstrated that an experimental AI tool can assist pilots in reducing the creation of contrails (vapour trails) by up to 50% through better altitude selection. Contrails are known contributors to global warming.

The U.S. government, along with other entities, is investing millions of dollars in the pursuit of replicating nuclear fusion, the process that powers the Sun. A breakthrough here could revolutionize the world by providing an essentially limitless, eco-friendly energy source. AI has the capacity to accelerate research in this field, which has progressed slowly since the 1960s.

In a noteworthy development this year, Brian Spears, a researcher at the Lawrence Livermore National Laboratory in the US, harnessed AI to predict an experimental outcome, ultimately contributing to a significant breakthrough.

“For 100 trillionths of a second, we produced ten petawatts of power. It was the brightest thing in the solar system,” he wrote.
