There’s a big problem with generative AI, says Sasha Luccioni at Hugging Face, a machine-learning company. Generative AI is an energy hog.

“Every time you query the model, the whole thing gets activated, so it’s wildly inefficient from a computational perspective,” she says.

Take the large language models (LLMs) at the heart of many generative AI systems. They have been trained on vast stores of written information, which helps them to churn out text in response to practically any query.

“When you use generative AI… it’s generating content from scratch, it’s essentially making up answers,” Dr Luccioni explains. That means the computer has to work pretty hard.

A generative AI system might use around 33 times more energy than machines running task-specific software, according to a recent study by Dr Luccioni and colleagues. The work has been peer-reviewed but is yet to be published in a journal.

It’s not your personal computer that uses all this energy, though. Or your smartphone. The computations we increasingly rely on happen in giant data centres that are, for most people, out of sight and out of mind.

“The cloud,” says Dr Luccioni. “You don’t think about these huge boxes of metal that heat up and use so much energy.”

The world’s data centres are using ever more electricity. In 2022, they gobbled up 460 terawatt hours of electricity, and the International Energy Agency (IEA) expects this to double in just four years. Data centres could be using a total of 1,000 terawatt hours annually by 2026. “This demand is roughly equivalent to the electricity consumption of Japan,” says the IEA. Japan has a population of 125 million people.

At data centres, huge volumes of information are stored for retrieval anywhere in the world – everything from your emails to Hollywood movies. The computers in those faceless buildings also power AI and cryptocurrency. They underpin life as we know it.

But some countries know all too well how energy hungry these facilities are. There is currently a moratorium preventing the construction of new data centres in Dublin. Nearly a fifth of Ireland’s electricity is used up by data centres, and this figure is expected to grow significantly in the next few years – meanwhile Irish households are reducing their consumption.

The boss of National Grid said in a speech in March that data centre electricity demand in the UK will rise six-fold in just 10 years, fuelled largely by the rise of AI. National Grid expects that the energy required for electrifying transport and heat will be much larger in total, however.

Utilities firms in the US are beginning to feel the pressure, says Chris Seiple at Wood Mackenzie, a consultancy.

“They’re getting hit with data centre demands at the exact same time as we have a renaissance taking place – thanks to government policy – in domestic manufacturing,” he explains. Lawmakers in some states are now rethinking tax breaks offered to data centre developers because of the sheer strain these facilities are putting on local energy infrastructure, according to reports in the US.

Mr Seiple says there is a “land grab” going on for data centre locations near to power stations or renewable energy hubs: “Iowa is a hotbed of data centre development, there’s a lot of wind generation there.”

Some data centres can afford to go to more remote locations these days because latency – the delay, usually measured in milliseconds, between sending information out from a data centre and the user receiving it – is not a major concern for increasingly popular generative AI systems. In the past, data centres handling emergency communications or financial trading algorithms, for example, have been sited within or very near to large population centres, for the absolute best response times.

There is little doubt that the energy demands of data centres will rise in the coming years, but there is huge uncertainty over how much, stresses Mr Seiple.

Part of that uncertainty is down to the fact that the hardware behind generative AI is evolving all the time.

Tony Grayson is general manager at Compass Quantum, a data-centre business, and he points to Nvidia’s recently launched Grace Blackwell supercomputer chips (named after a computer scientist and a mathematician), which are designed specifically to power high-end processes including generative AI, quantum computing and computer-aided drug design.

Nvidia says that, in the future, 8,000 of its previous generation of chips could train, in 90 days, AI systems several times larger than the largest currently available. Doing so would need a 15 megawatt electricity supply.

But the same work could be carried out in the same time by just 2,000 Grace Blackwell chips, and they would need a four megawatt supply, according to Nvidia.

That still ends up as 8.6 gigawatt hours of electricity consumed – roughly the same amount that the entire city of Belfast uses in a week.
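
That figure follows from simple arithmetic: energy is power multiplied by time, and a four megawatt draw sustained for 90 days comes to roughly 8.6 gigawatt hours. A minimal sketch of the calculation, using only the figures Nvidia quotes above (the function and variable names are illustrative, not Nvidia's):

```python
# Back-of-envelope check of the training-run figures quoted above.
# Energy (GWh) = power (MW) x hours / 1,000.
HOURS = 90 * 24  # a 90-day training run

def training_energy_gwh(power_mw: float, hours: float = HOURS) -> float:
    """Energy consumed by a constant power draw over the whole run."""
    return power_mw * hours / 1_000

previous_gen = training_energy_gwh(15)  # 8,000 previous-generation chips, 15 MW
blackwell = training_energy_gwh(4)      # 2,000 Grace Blackwell chips, 4 MW

print(f"Previous generation: {previous_gen:.1f} GWh")  # 32.4 GWh
print(f"Grace Blackwell:     {blackwell:.1f} GWh")     # 8.6 GWh, as quoted
```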

“The performance is going up so much that your overall energy savings are big,” says Mr Grayson. But he agrees that power demands are shaping where data centre operators site their facilities: “People are going to where cheap power’s at.”

Dr Luccioni notes that the energy and resources required to manufacture the latest computer chips are significant.

Still, it is true that data centres have got more energy efficient over time, argues Dale Sartor, a consultant and affiliate of Lawrence Berkeley National Laboratory in the US. Their efficiency is often measured in terms of power usage effectiveness, or PUE. The lower the number, the better. State-of-the-art data centres have a PUE of around 1.1, he notes.
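
PUE is the ratio of a facility's total energy draw to the energy consumed by its IT equipment alone, so 1.0 is the theoretical ideal. A minimal sketch of the calculation, with illustrative figures rather than data from any real facility:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy.

    1.0 is the theoretical ideal, where every kilowatt-hour reaches the
    computers; anything above that is overhead such as cooling and power
    distribution. Lower is better.
    """
    return total_facility_kwh / it_equipment_kwh

# Illustrative numbers only: a facility drawing 1.1 kWh from the grid for
# every 1.0 kWh its servers consume has a PUE of 1.1 - the state of the
# art Mr Sartor describes - i.e. roughly 9% of its power is overhead.
print(pue(total_facility_kwh=1.1, it_equipment_kwh=1.0))  # 1.1
```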

These facilities do still create significant amounts of waste heat, and Europe is ahead of the US in finding ways to use it – such as warming up swimming pools – says Mr Sartor.

Bruce Owen, UK managing director at Equinix, a data centre firm, says, “I still think that the demand is going to grow further than that efficiency gain that we see.” He predicts that more data centres will be built with on-site power-generating facilities included. Equinix was denied planning permission for a gas-powered data centre in Dublin last year.

Mr Sartor adds that costs may ultimately determine whether generative AI is worth it for certain applications: “If the old way is cheaper and easier then there’s not going to be much of a market for the new way.”

Dr Luccioni stresses, though, that people will need to clearly understand how the options in front of them differ in terms of energy efficiency. She is working on a project to develop energy ratings for AI.

“Instead of picking this GPT-derivative model that is very clunky and uses a lot of energy, you can pick this A+ energy star model that will be a lot more lightweight and efficient,” she says.

  • flan [they/them] · 6 months ago

    putting the computers in one big room is probably more efficient than having them spread out all over the place though

    • RyanGosling [none/use name] · 6 months ago

      What if we just put all the servers in outer space where it’s naturally cold, then connect an extremely long cable from space to earth?

      • Owl [he/him] · 6 months ago

        Outer space has bad cooling, spaceships need tons of radiators.

        If you want an absurd scifi-sounding way to do data centers though, it's theoretically possible for a really big one to put all the computers under an open pool of water, let the water evaporate and condense into clouds at the top of the building, and let the giant building's internal rain cycle do all the cooling.

      • fox [comrade/them] · 6 months ago

        Sadly the cable would wrap around Earth and cause a kind of hourglass thing to happen. This is bad for international shipping

      • GrouchyGrouse [he/him] · 6 months ago

        You'd have to keep it in the earth's shadow, the sun would cook the fuck out of a big box in space

        • flan [they/them] · 6 months ago

          even running them would cook them - there's no air to take the heat away!

    • EcoMaowist · edit-2 · 4 months ago

      deleted by creator

      • flan [they/them] · edit-2 · 6 months ago

        Regardless of who owns the servers the servers exist and a distributed network is going to need more infrastructure than a centralized one. There will be additional inefficiencies caused by the distances data needs to travel, the number of times it needs to be resent, and the total capacity of the internet.

        I understand your argument from the ecological perspective, it makes sense and we should obviously be taking that into consideration. But I really disagree with you that the total system would be more efficient than a more centralized one that uses datacenters.