{"id":505568,"date":"2025-10-17T01:20:31","date_gmt":"2025-10-17T01:20:31","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/505568\/"},"modified":"2025-10-17T01:20:31","modified_gmt":"2025-10-17T01:20:31","slug":"theres-a-simple-way-we-could-drastically-cut-ai-energy-use","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/505568\/","title":{"rendered":"There&#8217;s a simple way we could drastically cut AI energy use"},"content":{"rendered":"<p><img decoding=\"async\" class=\"Image\" alt=\"\" width=\"1350\" height=\"900\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/SEI_270214644.jpg\" loading=\"eager\" fetchpriority=\"high\" data-image-context=\"Article\" data-image-id=\"2500414\" data-caption=\"AIs rely on data centres that use vast amounts of energy\" data-credit=\"Jason Alden\/Bloomberg\/Getty\"\/><\/p>\n<p class=\"ArticleImageCaption__Title\">AIs rely on data centres that use vast amounts of energy<\/p>\n<p class=\"ArticleImageCaption__Credit\">Jason Alden\/Bloomberg\/Getty<\/p>\n<p>Being more judicious about which AI models we use for tasks could save 31.9 terawatt-hours of energy this year alone \u2013 equivalent to the output of five nuclear reactors.<\/p>\n<p><a href=\"https:\/\/scholar.google.com\/citations?user=yvaWpXEAAAAJ&amp;hl=pt-BR\" target=\"_blank\" rel=\"noopener\">Tiago da Silva Barros<\/a> at the University of C\u00f4te d\u2019Azur in France and his colleagues looked at 14 tasks that people use generative AI tools for, ranging from text generation to speech recognition and image classification.<\/p>\n<p>They then examined public leaderboards, including those hosted by the machine learning hub Hugging Face, to see how different models perform. 
The energy efficiency of the models during inference \u2013 when an AI model produces an answer \u2013 was measured by a tool called CarbonTracker, and the total energy use of each model was calculated by tracking user downloads.<\/p>\n<p>\u201cBased on the size of the model, we estimated the energy consumption, and based on this, we can try to do our estimations,\u201d says da Silva Barros.<\/p>\n<p>The researchers found that, across all 14 tasks, switching from the best-performing to the most energy-efficient model for each task reduced energy use by 65.8 per cent, while making the output only 3.9 per cent less useful \u2013 a trade-off they suggest could be acceptable to the public.<\/p>\n<p>Because some people already use the most economical models, the researchers estimate that a real-world switch from high-performance models to the most energy-efficient ones would cut overall energy consumption by 27.8 per cent. \u201cWe were surprised by how much can be saved,\u201d says team member <a href=\"https:\/\/www-sop.inria.fr\/members\/Frederic.Giroire\/\" target=\"_blank\" rel=\"noopener\">Fr\u00e9d\u00e9ric Giroire<\/a> at the French National Centre for Scientific Research.<\/p>\n<p>However, that would require change from both users and AI companies, says da Silva Barros. \u201cWe have to think in the direction of running small models, even if we lose some of the performance,\u201d he says. \u201cAnd companies, when they develop models, it\u2019s important they share some information on the model which allows the users to understand and evaluate if the model is very energy consuming or not.\u201d<\/p>\n<p>Some AI companies are reducing the energy consumption of their products through a process called model distillation, where large models are used to train smaller models. 
This is already having a significant impact, says <a href=\"https:\/\/www.bristol.ac.uk\/people\/person\/Chris-Preist-b86f2bb7-446b-4dff-be53-861fa07cbcfd\/\" target=\"_blank\" rel=\"noopener\">Chris Preist<\/a> at the University of Bristol in the UK. For example, Google recently claimed <a href=\"https:\/\/blog.google\/outreach-initiatives\/sustainability\/google-ai-energy-efficiency\/\" target=\"_blank\" rel=\"noopener\">a 33-fold energy-efficiency improvement<\/a> in Gemini over the past year.<\/p>\n<p>However, getting users to pick the most efficient models \u201cis unlikely to result in limiting the energy increase from\u00a0data centres as the authors suggest, at least in the current AI bubble,\u201d says Preist. \u201cReducing energy per prompt will simply allow more customers to be served more rapidly with more sophisticated reasoning options,\u201d he says.<\/p>\n<p>\u201cUsing smaller models can definitely result in less energy usage in the short term, but there are so many other factors that need to be considered when making any kind of meaningful projections into the future,\u201d says <a href=\"https:\/\/www.sashaluccioni.com\/\" target=\"_blank\" rel=\"noopener\">Sasha Luccioni<\/a> at Hugging Face. She cautions that rebound effects like increased use \u201chave to be taken into account, as well as the broader impacts on society and the economy\u201d.<\/p>\n<p>Luccioni points out that any research in this space relies on external estimates and analysis because of a lack of transparency from individual companies. \u201cWhat we need, to do these kinds of more complex analyses, is more transparency from AI companies, data centre operators and even governments,\u201d she says. 
\u201cThis will allow researchers and policy-makers to make informed projections and decisions.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"AIs rely on data centres that use vast amounts of energy Jason Alden\/Bloomberg\/Getty Being more judicious in which&hellip;\n","protected":false},"author":2,"featured_media":505569,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3163],"tags":[323,1942,35,53,16,15],"class_list":{"0":"post-505568","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-energy","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/115386884010520532","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/505568","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=505568"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/505568\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/505569"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=505568"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=505568"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=505568"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}