A single person with a serious AI habit may chew through enough electricity each day to keep a microwave running for more than three hours. And the actual toll may be even worse, as many companies keep details about their AI models secret.
After speaking to two dozen researchers tracking AI energy consumption and running its own experiments, MIT Technology Review concluded that the total energy and climate toll is incredibly difficult to get a handle on. But that didn't stop the publication from trying.
Working with researchers from Hugging Face, the authors of the report determined that a single query to the open-source Llama 3.1 8B model consumes around 57 joules of energy to generate a response. (The 8B means the model has 8 billion parameters.) When accounting for cooling and other energy demands, the report said that number should be doubled, bringing a single query on that model to around 114 joules - equivalent to running a microwave for around a tenth of a second. A larger model, like Llama 3.1 405B, needs around 6,706 joules per response - eight seconds of microwave usage.
In other words, the size of a particular model plays a huge role in how much energy it uses. Although its true size is a mystery, OpenAI's GPT-4 is estimated to have well over a trillion parameters, meaning its per-query energy footprint is likely far higher than the Llama queries tested.
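The microwave math behind those comparisons is simple division: a watt is one joule per second, so a query's energy cost divided by a microwave's power rating gives its run-time equivalent. The report doesn't state the wattage it assumed; the ~800 W figure in this sketch is our own assumption.

```python
# Converts per-query energy figures into microwave run-time equivalents.
# MICROWAVE_WATTS is an assumption -- the report doesn't give a wattage.
MICROWAVE_WATTS = 800  # a typical household microwave's power draw

def microwave_seconds(joules: float) -> float:
    """Seconds an 800 W microwave could run on `joules` of energy."""
    return joules / MICROWAVE_WATTS  # watts = joules per second

for model, joules in [("Llama 3.1 8B (doubled for cooling)", 114),
                      ("Llama 3.1 405B", 6_706)]:
    print(f"{model}: {joules} J = {microwave_seconds(joules):.1f} s of microwave time")
```

Plugging in the figures above yields roughly 0.1 seconds for the 8B model and 8.4 seconds for the 405B model, in line with the report's comparisons.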
It's also worth pointing out that those figures are for text-based responses. AI-generated images actually use considerably less energy than text responses, thanks to the image models' smaller size and the fact that diffusion-based image generation is less computationally demanding than an LLM's token-by-token text generation, MIT TR noted.
AI video generation, on the other hand, is an energy sinkhole.
In order to generate a five-second video at 16 frames per second, the CogVideoX AI video generation model consumes a whopping 3.4 million joules of energy - equivalent to running a microwave for an hour or riding 38 miles on an e-bike, Hugging Face researchers told the Tech Review.
"It's fair to say that the leading AI video generators, creating dazzling and hyperrealistic videos up to 30 seconds long, will use significantly more energy," the report noted.
Using that data, the authors estimated the daily AI energy consumption of someone with a habit of leaning on generative models for various tasks. Fifteen questions, ten attempts at generating an image, and three tries at making an Instagram-ready five-second video would eat up around 2.9 kWh of electricity - the three and a half hours of microwave usage mentioned at the top of this story.
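Here's a rough reconstruction of that daily tally. The text and video figures come from the numbers quoted above; the article doesn't give a per-image cost, so the 2,000-joule value below is purely a placeholder assumption.

```python
# Rough reconstruction of the daily-habit estimate described above.
TEXT_QUERY_J = 6_706     # Llama 3.1 405B text response, per the report
IMAGE_J = 2_000          # placeholder assumption -- not given in this piece
VIDEO_J = 3_400_000      # CogVideoX five-second video, per the report

daily_joules = 15 * TEXT_QUERY_J + 10 * IMAGE_J + 3 * VIDEO_J
daily_kwh = daily_joules / 3_600_000  # one kWh is 3.6 million joules

print(f"{daily_kwh:.1f} kWh per day")  # ~2.9 kWh
```

Notably, the three video generations account for nearly the entire total; the text and image queries are a rounding error by comparison.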
Hundreds of millions of people around the world use ChatGPT every week, OpenAI estimates.
The researchers focused on open-source LLMs that we know a lot about. But companies like OpenAI and Google keep the size and reach of their models hidden from the public, which seriously hampers accurate energy usage estimates.
When it comes to measuring CO2 emissions, the AI picture gets even more complicated, the Tech Review article notes. The mixture of renewable and non-renewable energy sources varies widely by location and time of day (solar power isn't generated at night, for instance).
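To see why that matters, consider how the same daily energy use translates into wildly different emissions depending on the grid powering the datacenter. The carbon-intensity values in this sketch are illustrative assumptions, not figures from the report.

```python
# Emissions = energy used x grid carbon intensity at the time of use.
# All intensity values below are illustrative assumptions.
DAILY_KWH = 2.9  # the daily AI habit estimated above

grid_intensity = {  # grams of CO2 emitted per kWh generated (assumed)
    "hydro/nuclear-heavy grid": 50,
    "mixed grid at midday (solar online)": 250,
    "fossil-heavy grid overnight": 700,
}

for grid, g_per_kwh in grid_intensity.items():
    print(f"{grid}: {DAILY_KWH * g_per_kwh:,.0f} g CO2 per day")
```

The same habit could therefore account for anywhere from about 145 grams to over two kilograms of CO2 per day, which is part of why a single emissions number for AI is so hard to pin down.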
The report also didn't touch on prompt caching, a technique in which providers store computed results and reuse them for users asking the same or similar questions rather than regenerating the answer from scratch, which can reduce energy consumption for AI models.
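For illustration, here's a minimal sketch of that caching idea: serve a stored answer for a repeated prompt instead of re-running the model. Production systems are far more sophisticated (reusing computed prompt prefixes, semantic matching, expiry policies), and run_model below is a hypothetical stand-in, but the energy-saving principle is the same.

```python
from functools import lru_cache

def run_model(prompt: str) -> str:
    """Hypothetical stand-in for an expensive, energy-hungry model call."""
    return f"response to: {prompt}"

@lru_cache(maxsize=10_000)
def answer(prompt: str) -> str:
    # Repeat prompts are served from memory, skipping the model call
    # (and its energy cost) entirely.
    return run_model(prompt)

answer("What is the capital of France?")  # first time: runs the model
answer("What is the capital of France?")  # cache hit: no recompute
print(answer.cache_info())  # hits=1, misses=1
```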
Regardless of those caveats, one thing is for sure: a lot of energy is being consumed to power the world's growing AI habit. Not only that, but a good portion of it is generated by carbon-spewing sources, for purposes of arguably questionable usefulness.
As the Tech Review report pointed out, the current spike in datacenter energy usage follows years of relatively flat consumption, thanks to steady workloads and ever-increasing efficiency. Datacenters ate up more than a fifth of Ireland's electricity in 2023. Global datacenter energy consumption is predicted to more than double from its current rate by 2030, surpassing the energy consumption of the entire nation of Japan by the start of the next decade. AI, naturally, is the largest driver of that increase.
Tech companies have paid plenty of lip service to going green over the years, long assuring the public that their bit barns aren't an environmental threat. But now that AI's in the picture, the professed net-zero goals of tech giants like Microsoft and Google are receding into the distance.
We've covered this a lot at The Register of late, and our reporting largely aligns with what the MIT Tech Review report concluded: The energy behind AI is far dirtier than tech companies would like us to believe.
Overall, datacenters are predicted to emit 2.5 billion tons of greenhouse gases by the end of the decade - three times more than they would have emitted if generative AI hadn't become the latest craze.
To add insult to apocalypse, those numbers rest on a shaky data foundation, as the Tech Review report noted.
"This leaves even those whose job it is to predict energy demands forced to assemble a puzzle with countless missing pieces, making it nearly impossible to plan for AI's future impact on energy grids and emissions," they said. ®