Tech giants scramble to meet AI's looming energy crisis


Agence France-Presse


People walk past a banner with an AI (artificial intelligence) sign at the Frankfurt book fair on October 16, 2024, on the first day of the world's biggest book fair in Frankfurt am Main, western Germany. Kirill Kudryavtsev, AFP/File


NEW YORK, United States — The artificial intelligence industry is scrambling to reduce its massive energy consumption through better cooling systems, more efficient computer chips, and smarter programming — all while AI usage explodes worldwide.

AI depends entirely on data centers, which could consume three percent of the world's electricity by 2030, according to the International Energy Agency. That's double what they use today.

Experts at McKinsey, a US consulting firm, describe a race to build enough data centers to keep up with AI's rapid growth, while warning that the world is heading toward an electricity shortage.

"There are several ways of solving the problem," explained Mosharaf Chowdhury, a University of Michigan professor of computer science.


Companies can either build more energy supply -- which takes time, and which the AI giants are already scouring the globe to do -- or figure out how to consume less energy for the same computing power.

Chowdhury believes the challenge can be met with "clever" solutions at every level, from the physical hardware to the AI software itself.

For example, his lab has developed algorithms that calculate exactly how much electricity each AI chip needs, reducing energy use by 20-30 percent.
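As a rough illustration of the general idea -- per-chip power capping, not the Michigan lab's actual algorithm -- the sketch below uses Nvidia's NVML Python bindings to tighten a GPU's power limit when the chip is lightly used. The 60 percent utilization threshold and the 70 percent cap are invented for the example.

    # Illustrative sketch only: cap each GPU's power draw according to its current
    # utilization, via Nvidia's NVML bindings (the pynvml module). The thresholds are
    # hypothetical example values, not figures from the research described above.
    import pynvml

    pynvml.nvmlInit()
    try:
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu  # percent busy
            min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
            # Lightly loaded GPUs get a tighter power cap; busy ones keep full headroom.
            if util > 60:
                target_mw = max_mw
            else:
                target_mw = int(min_mw + 0.7 * (max_mw - min_mw))
            pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs admin rights
    finally:
        pynvml.nvmlShutdown()

Approaches like the one the article describes go further, estimating what each workload actually needs rather than relying on a fixed threshold.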


'CLEVER' SOLUTIONS


Twenty years ago, operating a data center -- encompassing cooling systems and other infrastructure -- required as much energy as running the servers themselves.

Today, operations use just 10 percent of what the servers consume, says Gareth Williams from consulting firm Arup.


This is largely the result of the industry's focus on energy efficiency.

Many data centers now use AI-powered sensors to control temperature in specific zones rather than cooling entire buildings uniformly.

This allows them to optimize water and electricity use in real-time, according to McKinsey's Pankaj Sachdeva.
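As a purely hypothetical sketch of what zone-level control means in practice -- not any operator's real system -- the snippet below adds cooling effort only in zones whose sensors read above a target inlet temperature; every name and number here is invented for illustration.

    # Hypothetical sketch of zone-based cooling: only zones whose sensors exceed
    # the setpoint get extra cooling effort. All values are made up for illustration.
    SETPOINT_C = 27.0  # example target inlet temperature, in degrees Celsius

    def adjust_cooling(zone_temps_c: dict[str, float]) -> dict[str, float]:
        """Map each zone's sensor reading to a cooling-effort fraction (0 to 1)."""
        commands = {}
        for zone, temp in zone_temps_c.items():
            overshoot = max(0.0, temp - SETPOINT_C)
            # Proportional response: each degree over the setpoint adds 20% more effort.
            commands[zone] = min(1.0, 0.2 * overshoot)
        return commands

    print(adjust_cooling({"row-A": 26.5, "row-B": 29.1, "row-C": 31.0}))
    # row-A gets no extra cooling; row-B and row-C get progressively more.

Cooling the hot rows harder and the cooler rows less is what saves water and electricity compared with chilling the whole building uniformly.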

For many, the game-changer will be liquid cooling, which replaces the roar of energy-hungry air conditioners with a coolant that circulates directly through the servers.

"All the big players are looking at it," Williams said.


This matters because modern AI chips from companies like Nvidia consume 100 times more power than servers did two decades ago.

Amazon's world-leading cloud computing business, AWS, said last week that it had developed its own liquid cooling method for the Nvidia GPUs in its servers -- avoiding having to rebuild existing data centers.

"There simply wouldn't be enough liquid-cooling capacity to support our scale," Dave Brown, vice president of compute and machine learning services at AWS, said in a YouTube video.


US VS CHINA


For McKinsey's Sachdeva, a reassuring factor is that each new generation of computer chips is more energy-efficient than the last.

Research by Purdue University's Yi Ding has shown that AI chips can last longer without losing performance.


"But it's hard to convince semiconductor companies to make less money" by encouraging customers to keep using the same equipment longer, Ding added.

Yet even though greater efficiency in chips and in their energy consumption is likely to make AI cheaper, it won't reduce total energy consumption.

"Energy consumption will keep rising," Ding predicted, despite all efforts to limit it. "But maybe not as quickly."

In the United States, energy is now seen as key to keeping the country's competitive edge over China in AI.

In January, Chinese startup DeepSeek unveiled an AI model that performed as well as top US systems despite using less powerful chips -- and by extension, less energy.


DeepSeek's engineers achieved this by programming their GPUs more precisely and skipping an energy-intensive training step that was previously considered essential.

China is also feared to be leagues ahead of the US in available energy sources, including renewables and nuclear power.

—By Thomas Urbain, Agence France-Presse

