As artificial intelligence transforms our computing systems, economies, and energy footprints, the question of who should regulate the technology, and how, has become a pressing and incredibly complex issue. A political battle is currently unfolding in Washington over governance of the industry, alongside larger questions about what limits should be placed on such a fast-evolving sector. Answering these questions will be extremely difficult thanks to high levels of secrecy and opacity within and around the industry.
One of the biggest concerns related to AI is its massive and ever-expanding energy footprint and associated greenhouse gas emissions. The projected growth of the sector’s energy needs is so significant that many world leaders are starting to prioritize it as an imminent threat to energy security. Energy demand from data centers is set to double by just 2030.
“In the past few years, AI has gone from an academic pursuit to an industry with trillions of dollars of market capitalisation and venture capital at stake,” reports the International Energy Agency. The scale of the energy needed to power this growth means that “the energy sector is therefore at the heart of one of the most important technological revolutions today.”
The thing is, no one knows exactly to what extent that’s true. Our best estimates of how much energy AI will use are fuzzy at best, because we don’t even know how much energy AI is already using. Researchers are trying hard to quantify and track exactly how much energy AI is consuming, but “this effort is complicated by the fact that major players like OpenAI disclose little environmental information,” according to a recent report from Wired.
As of May 2025, a whopping 84 percent of all large language model traffic was conducted on AI models operating with zero environmental disclosure. “It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” says Sasha Luccioni, climate lead at an AI company called Hugging Face. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere,” she went on to say.
Luccioni is leading a research team attempting to analyze exactly how much energy AI models are using, work that could inform better-targeted and more appropriate policy actions. Concerningly, the numbers around AI energy consumption currently being repeated in the media are generally derived from statements with no empirical backing.
For example, a widely circulated “stat” holds that the average ChatGPT query uses ten times more energy than a Google search, but this figure actually originates from a potentially unfounded public remark made by Alphabet’s John Hennessy in 2023. Referring to this example, Luccioni says that “people have taken an off-the-cuff remark and turned it into an actual statistic that’s informing policy and the way people look at these things.”
The reality is that putting a number to the energy footprint of a single AI query is all but impossible. Queries vary widely in complexity and in the corresponding computing power they require. And while some companies have sophisticated systems that route simpler questions to simpler models (which use less energy), others do not.
“The real core issue is that we have no numbers,” she went on to tell Wired. “So even the back-of-the-napkin calculations that people can find, they tend to take them as the gold standard, but that’s not the case.”
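To see why those back-of-the-napkin figures are so shaky, consider a minimal sketch in Python. Every input below is an illustrative assumption, not a disclosed or measured value, which is precisely the disclosure gap the article describes: change the assumptions within plausible ranges and the answer swings by roughly two orders of magnitude.

```python
# Illustrative back-of-the-envelope estimate of energy per AI query.
# All figures are assumptions for demonstration only; none of these
# values are publicly disclosed by major AI providers.

def energy_per_query_wh(gpu_power_w, gpus_per_request,
                        seconds_per_request, datacenter_pue):
    """Rough energy (watt-hours) for one query under assumed parameters."""
    gpu_energy_wh = gpu_power_w * gpus_per_request * seconds_per_request / 3600
    return gpu_energy_wh * datacenter_pue  # scale up for cooling/overhead

# Two equally plausible sets of assumptions give very different answers.
low = energy_per_query_wh(gpu_power_w=300, gpus_per_request=1,
                          seconds_per_request=1, datacenter_pue=1.1)
high = energy_per_query_wh(gpu_power_w=700, gpus_per_request=8,
                           seconds_per_request=5, datacenter_pue=1.5)

print(f"Low-end guess:  {low:.2f} Wh per query")   # ~0.09 Wh
print(f"High-end guess: {high:.2f} Wh per query")  # ~11.7 Wh, over 100x more
```

Without disclosed data on hardware, request complexity, and data-center overhead, any single per-query number is a guess dressed up as a statistic.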
We should all be concerned about the opacity of AI’s energy consumption, and the lack of government regulation to rein it in. In all likelihood, the cost of increased energy demand driven by AI will fall to consumers in the form of higher energy bills. And that will do little to incentivize AI companies to employ less energy-intensive models.
By Haley Zaremba for Oilprice.com