Electricity grids creak as AI demand soars

Image source, Getty Images

Image caption, The electricity demand of data centers is expected to double between 2022 and 2026

  • Author, Chris Baraniuk
  • Role, Technology reporter

There’s a big problem with generative AI, says Sasha Luccioni of Hugging Face, a machine learning company. Generative AI is an energy hog.

“Every time you query the model the whole thing is activated, so from a computational perspective it’s hugely inefficient,” she says.

Take the Large Language Models (LLMs) that are at the heart of many generative AI systems. They are trained on vast amounts of written information, which allows them to produce text in response to almost any question.

“When you use generative AI… it generates content from scratch, it essentially makes up answers,” Dr. Luccioni explains. That means the computer has to work quite hard.

According to a recent study by Dr. Luccioni and colleagues, a generative AI system could consume about 33 times more energy than machines running task-specific software. The work has been peer-reviewed but has yet to be published in a journal.

However, it’s not your PC that’s using all this energy. Or your smartphone. The calculations we increasingly rely on happen in gigantic data centers that are out of sight and out of mind for most people.

“The cloud,” says Dr. Luccioni. “You don’t think about these huge boxes of metal that get hot and use so much energy.”

The world’s data centers are consuming more and more electricity. In 2022, they consumed 460 terawatt hours of electricity, and the International Energy Agency (IEA) expects this to double in just four years. Data centers could use a total of 1,000 terawatt hours per year by 2026. “This demand is roughly equivalent to the electricity consumption of Japan,” the IEA said. Japan has 125 million inhabitants.

Data centers store vast amounts of information so it can be retrieved from anywhere in the world – from your emails to Hollywood movies. The computers in those faceless buildings also power AI and cryptocurrency. They form the basis of life as we know it.

Image caption, AI can be “extremely inefficient” in using computing resources, says Sasha Luccioni

National Grid’s boss said in a speech in March that demand for electricity from data centers in Britain will increase sixfold in just a decade, largely fueled by the rise of AI. However, National Grid expects that the energy required to electrify transport and heat will be much greater in total.

Utilities in the US are starting to feel the pressure, says Chris Seiple of consultancy Wood Mackenzie.

“They are facing data center demands at exactly the same time that – thanks to government policy – there is a renaissance in domestic manufacturing,” he explains. Lawmakers in some states are now reconsidering the tax breaks offered to data center developers due to the enormous strain these facilities place on local energy infrastructure, according to reports in the US.

Mr Seiple says there is a ‘land grab’ for data center locations near power stations or renewable energy hubs: “Iowa is a hotbed of data center development, there is a lot of wind energy generated there.”

Some data centers today can afford to move to more remote locations, because latency – the delay, usually measured in milliseconds, between information leaving a data center and reaching the user – is not a major problem for increasingly popular generative AI systems. In the past, data centers handling, for example, emergency communications or financial trading algorithms were placed in or near major population centers for the absolute best response times.

Image source, Getty Images

Image caption, Nvidia CEO Jensen Huang shows off the new Blackwell chip in March

There is little doubt that data center energy demands will increase in the coming years, but there is great uncertainty about how much, Mr. Seiple points out.

Part of that uncertainty is due to the fact that the hardware behind generative AI is constantly evolving.

Tony Grayson is a managing director at Compass Quantum, a data center company, and he points to Nvidia’s recently launched Grace Blackwell supercomputer chips (named after a computer scientist and a mathematician), which are specifically designed to power high-performance processes, including generative AI, quantum computing and computer-aided drug design.

Nvidia says that, using 8,000 previous-generation Nvidia chips, a company could train AIs many times larger than the largest AI systems currently available in 90 days. This would require an electricity supply of 15 megawatts.

But the same work could be done in the same time by just 2,000 Grace Blackwell chips, and according to Nvidia, they would require a four-megawatt power supply.

That equates to a consumption of 8.6 gigawatt hours of electricity – about as much as the entire city of Belfast uses in a week.
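The arithmetic behind those figures can be checked directly. A minimal sketch, using only the numbers quoted in the article (the 90-day training run, the chip counts, and the 15 MW and 4 MW power draws):

```python
# Back-of-the-envelope check of Nvidia's figures as quoted above.
# Energy (MWh) = power (MW) * time (hours); divide by 1,000 for GWh.

HOURS = 90 * 24  # a 90-day training run

# Previous generation: 8,000 chips drawing 15 MW in total
old_energy_gwh = 15 * HOURS / 1000

# Grace Blackwell: 2,000 chips drawing 4 MW in total
new_energy_gwh = 4 * HOURS / 1000

print(f"previous generation: {old_energy_gwh:.1f} GWh")  # 32.4 GWh
print(f"Grace Blackwell:     {new_energy_gwh:.1f} GWh")  # 8.6 GWh
```

The 4 MW figure over 90 days works out to roughly 8.6 gigawatt hours, matching the Belfast comparison, and roughly a quarter of the previous generation's energy for the same job.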

“The performance increases so much that your overall energy savings are significant,” says Mr Grayson. But he agrees that energy demand determines where data center operators locate their facilities: “People are going to places where there is cheap power.”

Dr. Luccioni notes that the energy and resources required to manufacture the latest computer chips are significant.


Still, it’s true that data centers have become more energy efficient over time, argues Dale Sartor, a consultant and affiliate at the Lawrence Berkeley National Laboratory in the US. Their efficiency is often measured in terms of power usage effectiveness, or PUE. The lower the number, the better. State-of-the-art data centers have a PUE of around 1.1, he notes.
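PUE is simply the ratio of everything a facility draws from the grid to what its computing equipment alone consumes. A small illustration of the metric (the kWh figures below are hypothetical, chosen only to show what a PUE of 1.1 means):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by
    IT equipment energy. 1.0 would mean every kWh goes to computing."""
    return total_facility_kwh / it_equipment_kwh

# A state-of-the-art facility at PUE ~1.1: for every 100 kWh of IT load,
# only about 10 kWh more go to cooling, power distribution and lighting.
print(pue(110, 100))  # 1.1
```

Older facilities with PUE values closer to 2.0 spend nearly as much energy on overhead, largely cooling, as on the computing itself, which is the waste heat Mr Sartor goes on to discuss.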

These facilities still create significant amounts of waste heat and Europe is ahead of the US in finding ways to use that waste heat – such as heating swimming pools – says Mr Sartor.

Bruce Owen, UK managing director at Equinix, a data center company, says: “I still think demand will grow beyond the efficiency gains we’re seeing.” He predicts more data centers will be built with on-site power generation facilities. Equinix was denied planning permission for a gas-powered data center in Dublin last year.

Mr. Sartor adds that cost may ultimately determine whether generative AI is worth it for certain applications: “If the old way is cheaper and easier, there won’t be much of a market for the new way.”

However, Dr. Luccioni emphasizes that people need to clearly understand how the options before them differ in terms of energy efficiency. She is working on a project to develop energy ratings for AI.

“Instead of choosing this GPT-derived model that is very clunky and uses a lot of energy, you can choose this A+ Energy Star model that will be a lot lighter and more efficient,” she says.
