Nearly all leading artificial intelligence developers are focused on building AI models that mimic the way humans
reason, but new research shows these cutting-edge systems can be far more energy intensive, adding to concerns about
AI’s strain on power grids.
AI reasoning models used, on average, 100 times more energy to respond to 1,000 written prompts than alternatives that lack this reasoning capability or had it disabled, according to a study released Thursday.
The work was carried out by the AI Energy Score project, led by Hugging Face research scientist Sasha Luccioni and
Salesforce Inc. head of AI sustainability Boris Gamazaychikov.
The researchers evaluated 40 open, freely available AI models, including software from OpenAI, Alphabet Inc.’s Google and Microsoft Corp. Some models showed a far wider disparity in energy consumption, including one from Chinese upstart DeepSeek.
A slimmed-down version of DeepSeek’s R1 model used just 50 watt-hours to respond to the prompts when reasoning was turned off, or about as much energy as a 50-watt lightbulb consumes in an hour. With the reasoning feature enabled, the same model required 308,186 watt-hours to complete the tasks.
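Those figures imply a gap of more than 6,000 times for this model. A quick back-of-the-envelope check in Python, using only the numbers quoted above (the script is illustrative arithmetic, not part of the study):

    # Figures reported for the slimmed-down DeepSeek R1 model
    energy_off_wh = 50        # watt-hours over 1,000 prompts, reasoning off
    energy_on_wh = 308_186    # watt-hours over the same prompts, reasoning on

    ratio = energy_on_wh / energy_off_wh
    print(f"reasoning used {ratio:,.0f}x more energy")  # ~6,164x

    # Lightbulb terms: energy (Wh) divided by power (W) gives hours of runtime
    bulb_hours = energy_on_wh / 50  # hours a 50-watt bulb could run
    print(f"about {bulb_hours:,.0f} bulb-hours, or ~{bulb_hours / 24:.0f} days")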
The soaring energy needs of AI have increasingly come under scrutiny. As tech companies race to build more and bigger data centres to support AI, industry watchers have raised concerns about strain on power grids and rising energy costs. A Bloomberg investigation in September found that wholesale electricity prices rose as much as 267 per cent over the past five years in areas near data centres.
There are also environmental drawbacks: Microsoft, Google and Amazon.com Inc. have previously acknowledged that the data-centre buildout could complicate their long-term climate objectives.
More than a year ago, OpenAI released its first reasoning model, called o1. Where its prior software replied almost
instantly to queries, o1 spent more time computing an answer before responding.
Many other AI companies have since released similar systems, with the goal of solving more complex, multistep problems in fields like science, math and coding.
Though reasoning systems have quickly become the industry norm for carrying out more complicated tasks, there has been
little research into their energy demands.
Much of the increase in power consumption is due to reasoning models generating much more text when responding, the researchers found.
The new report aims to better understand how AI energy needs are evolving, Luccioni said. She also hopes it helps people recognise that different types of AI models are suited to different tasks. Not every query requires tapping the most computationally intensive AI reasoning systems.
“We should be smarter about the way that we use AI,” Luccioni said, adding that choosing the right model for the right task matters.
To test the difference in power use, the researchers ran all the models on the same computer hardware. They used the same prompts for each, ranging from simple questions, such as which team won the Super Bowl in a particular year, to more complex math problems. They also used a software tool called CodeCarbon to track how much energy was being used.
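CodeCarbon is an open-source Python library that estimates the energy drawn by the host machine while a piece of code runs. The article does not detail the study’s exact harness, so the sketch below only shows how a CodeCarbon measurement typically looks; run_model_on_prompts and the prompt list are hypothetical stand-ins for the benchmark step.

    # Minimal sketch of energy tracking with CodeCarbon, not the study's harness.
    from codecarbon import EmissionsTracker

    def run_model_on_prompts(model_name: str, prompts: list[str]) -> None:
        ...  # placeholder: load the model and generate a response per prompt

    prompts = [
        "Which team won the Super Bowl in 2015?",          # simple factual question
        "Prove that the square root of 2 is irrational.",  # harder problem
    ]

    tracker = EmissionsTracker(project_name="reasoning-energy-sketch")
    tracker.start()
    try:
        run_model_on_prompts("some-open-model", prompts)
    finally:
        emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent

    # CodeCarbon also logs its underlying energy estimate (in kWh) to a local
    # emissions.csv file alongside the CO2 figure.
    print(f"estimated emissions: {emissions_kg:.6f} kg CO2eq")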
The results varied considerably. The researchers found one of Microsoft’s Phi 4 reasoning models used 9,462 watt-hours with reasoning turned on, compared with about 18 watt-hours with it off.
OpenAI’s largest gpt-oss model, meanwhile, showed a less stark difference. It used 8,504 watt-hours with reasoning on the most computationally intensive “high” setting and 5,313 watt-hours with the setting turned down to “low.”
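Dividing those totals by the study’s 1,000-prompt workload gives rough per-query figures, which are easier to compare across models (simple arithmetic on the numbers quoted above, assuming each total covers the same 1,000 prompts):

    # Reported totals in watt-hours; the divisor assumes the study's 1,000 prompts
    totals_wh = {
        "Phi 4, reasoning on": 9_462,
        "Phi 4, reasoning off": 18,
        "gpt-oss, high effort": 8_504,
        "gpt-oss, low effort": 5_313,
    }

    N_PROMPTS = 1_000
    for name, wh in totals_wh.items():
        print(f"{name}: {wh / N_PROMPTS:.3f} Wh per prompt")
    # Phi 4 lands near 9.5 Wh per prompt with reasoning versus ~0.02 Wh without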
OpenAI, Microsoft, Google and DeepSeek did not immediately respond to a request for comment.
Google released internal research in August that estimated the median text prompt for its Gemini AI service used 0.24
watt-hours of energy, roughly equal to watching TV for less than nine seconds. Google said that figure was
“substantially lower than many public estimates.”
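That comparison is easy to sanity-check. Assuming a television that draws roughly 100 watts, which is our assumption rather than a figure from Google’s report, 0.24 watt-hours works out to just under nine seconds:

    # Sanity check of the "less than nine seconds of TV" comparison.
    # The 100 W television draw is an assumed typical value, not Google's figure.
    prompt_energy_wh = 0.24
    tv_power_w = 100

    seconds = prompt_energy_wh / tv_power_w * 3600  # Wh / W = hours; x3600 -> s
    print(f"{seconds:.1f} seconds of TV")  # 8.6 seconds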
Much of the discussion about AI power consumption has focused on large-scale facilities set up to train artificial intelligence models. Increasingly, however, tech firms are shifting more resources to inference, or the process of running AI systems after they’ve been trained. The push toward reasoning models is a big piece of that, as these systems rely more heavily on computation at inference time.
Recently, some tech leaders have acknowledged that AI’s power draw needs to be reckoned with. In a November interview, Microsoft CEO Satya Nadella said the industry must earn the “social permission to consume energy” for AI data centres. To do that, he argued, tech companies must use AI to do good and foster broad economic growth.