Beware the energy costs of using AI
By Christian Doherty | Artificial intelligence | 16 Jul 2024

Generative artificial intelligence has great promise, but it also comes with worrying energy demands.

The past two years have seen an explosion in the widespread use of artificial intelligence (AI). From writing emails to booking a flight, AI has become embedded in many areas of our lives – both in the office and the wider world. And, as is usually the case, evangelists for the technology have been out in force to explain why AI can solve many of society and industry's problems – from climate change and burnout, to cancer diagnosis and legal disputes.

However, alongside existing concerns over privacy, intellectual property and security, a new worry has emerged around AI: how much energy it uses. And the numbers are scary.

Consider this: a query run through the AI chatbot tool ChatGPT needs nearly 10 times as much electricity to process as a Google search, according to estimates by Goldman Sachs. Meanwhile, Cornell University estimated that training one large language model (LLM) emits about 300,000 kg of carbon dioxide. For context, that's roughly equivalent to making 125 round-trip flights between New York and Beijing.

Estimates of the magnitude vary, but there's no doubt about the direction of travel – more AI, more energy consumed. John Pettigrew, CEO of the National Grid, has stated that the power consumed by AI and quantum computing would increase six-fold in the next decade.

Even sooner, data centres' electricity consumption is projected to reach 1,000 terawatt-hours in 2026 – roughly Japan's total annual consumption – according to a Yale study, while another industry analyst projects that data-centre energy use could triple between 2023 and 2030, with AI accounting for 90 per cent of the growth.

Even the companies investing in and promoting AI admit this is a significant issue. Google says that its emissions have increased by 50% since 2019, largely due to the boom in GenAI. The firm estimates that its electricity consumption grew by 17% in 2023 and that its data centres accounted for as much as 10% of all data centre electricity consumption globally, while its water usage also increased by 17%. Meanwhile, Microsoft says its emissions are 30% higher than 2020 levels.

Under the hood

So why do AI-related tasks demand so much more energy? "AI models demand high energy usage for a couple of reasons, such as training and constant processing," explains Ramprakash Ramamoorthy, Director of Research at ManageEngine.

"The AI models require GPUs to process massive datasets efficiently. GPUs excel at processing these datasets in parallel due to their thousands of cores, and their data-optimised architecture makes them ideal for the task. Once trained, the AI models need to be up and running to perform their tasks, whether recognising faces in photos, predicting anomalies, generating content, or recommending products online; these computations require ongoing power."

Naturally, many of the tech giants driving the development and adoption of AI have downplayed the concerns. For instance, Bill Gates has said that AI, rather than posing a problem for the environment, is far more likely to drive the solution by helping create more efficient energy systems. It's worth noting that Microsoft has recently announced a $3.3bn investment in Wisconsin to spur AI innovation and economic growth – just the latest in a series of big-dollar bets on the technology.

So are these concerns justified? "Yes, they certainly are," says Peter Wood, Chief Technical Officer at Spectrum Search. "Just consider how much computing horsepower is needed to train and deploy large-scale AI models – it's quite staggering.

"We're not just talking about powering up servers here, there's also cooling systems, network infrastructures, and the non-stop operation to keep these high-performing environments going. The carbon emissions from all this energy use can't be brushed aside. And as we move further into the AI era, the energy demand will only increase, underlining the urgent need for greener practices."
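To put Wood's point about computing horsepower into numbers, here is a minimal back-of-envelope sketch of how the electricity and emissions of a single training run can be estimated. Every figure in it is an illustrative assumption rather than data from any real model or provider; the point is simply that GPU count, power draw, training time, data-centre overhead and grid carbon intensity multiply together very quickly.

```python
# Back-of-envelope estimate of the electricity and emissions of an AI training run.
# All figures are illustrative assumptions, not measurements of any real model.

NUM_GPUS = 1_000            # assumed number of accelerators used for training
GPU_POWER_KW = 0.7          # assumed average draw per GPU, in kilowatts
TRAINING_HOURS = 30 * 24    # assumed training duration: 30 days of continuous running
PUE = 1.2                   # assumed data-centre overhead (cooling, networking, etc.)
GRID_KG_CO2_PER_KWH = 0.4   # assumed grid carbon intensity, kg CO2e per kWh

energy_kwh = NUM_GPUS * GPU_POWER_KW * TRAINING_HOURS * PUE
emissions_tonnes = energy_kwh * GRID_KG_CO2_PER_KWH / 1_000

print(f"Estimated electricity: {energy_kwh:,.0f} kWh")
print(f"Estimated emissions:   {emissions_tonnes:,.0f} tonnes CO2e")
```

With these assumptions, a single run comes to roughly 600,000 kWh and around 240 tonnes of CO2e – and that is before the ongoing cost of serving the model to users, which, as Ramamoorthy notes, never switches off.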
Grappling for control

Not surprisingly, calls are growing for greater regulation, both here and in the US. "Unchecked and unregulated, AI use has the potential for disastrous environmental consequences, not to mention the additional ethics, accuracy and privacy risks," says Maxime Vermeir, Senior Director of AI Strategy at ABBYY, a company that specialises in developing small language models (SLMs), a specialised subset of AI that requires less computing power.

With greenwashing a continuing problem, Vermeir says we need to make sure businesses are held accountable for the environmental impact of developing their own AI. "An important part of this is proper legislation and regulation. Early AI legislation has mostly focused on privacy and other ethical concerns, which are essential for responsible AI use, but the environmental implications are largely overlooked."

Even with improved regulation, legislators can only go so far. "Organisations must take responsibility for how they leverage AI themselves. One thing they should consider is pivoting to purpose-built small language models, which specialise AI for specific tasks and goals."

How to fix it

As to what else companies using AI should do about this, core accounting skills will be vital. "Businesses should start by defining a strategy to make sustainable decisions across the entirety of their IT estate to reduce their digital carbon emissions," says Luis Rosenthal, Innovation Lead for KPMG's Ignition innovation lab.

"When it comes to AI, measuring and then managing the environmental impacts of the technology is key. This can be achieved by creating a baseline using a common set of measures and then monitoring the change associated with the implemented AI models." (A worked sketch of this approach appears at the end of this section.)

Spectrum Search's Peter Wood says businesses should also be considering how to harness GenAI to bring down their energy use, and points to several strategies. One is to invest in energy-efficient infrastructure: "This could take the form of upgrading to more capable hardware, tweaking software to cut energy use, and incorporating advanced cooling technologies," he suggests.

"Another strategy is to switch to renewable energy sources, which could drastically shrink the carbon footprint of data centres. Not only would this be good for the environment, but it could also make financial sense as the cost of renewable energy continues to drop."

Businesses could also use AI itself to manage and reduce their energy usage. For instance, AI-powered energy management systems can predict and adjust power consumption on the fly, making sure that energy is used in the most efficient way possible. "It's also worth considering the life cycle of AI models," says Wood. "By regularly updating and enhancing models to be more efficient, businesses can cut the amount of energy needed for training and deployment."

ManageEngine's Ramamoorthy suggests that businesses should consider developing domain-specific GenAI models and training them with smaller datasets and efficient algorithms. This not only reduces energy consumption but also eases the financial strain on businesses – and, powered by contextual intelligence, domain-specific models also deliver promising results.

"Businesses can also use cloud platforms that prioritise efficiency and sustainability," Ramamoorthy says. "They can look for providers who power their data centres with renewable energy sources, like solar or wind energy, to reduce their carbon footprints."
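Rosenthal's "measure, then manage" advice lends itself to a simple worked sketch of what a baseline comparison might look like. The figures and the single grid-wide emissions factor below are placeholder assumptions, not a reporting methodology; in practice the inputs would come from metered consumption data and recognised emissions factors for each energy source.

```python
# Illustrative baseline comparison: how much has electricity use (and the resulting
# CO2e) changed since AI workloads were introduced? All numbers are placeholders.

BASELINE_KWH = 2_000_000    # assumed annual IT-estate electricity use before AI adoption
CURRENT_KWH = 2_600_000     # assumed annual electricity use after deploying AI workloads
GRID_KG_CO2_PER_KWH = 0.4   # assumed grid carbon intensity, kg CO2e per kWh

baseline_tonnes = BASELINE_KWH * GRID_KG_CO2_PER_KWH / 1_000
current_tonnes = CURRENT_KWH * GRID_KG_CO2_PER_KWH / 1_000
increase_tonnes = current_tonnes - baseline_tonnes

print(f"Baseline emissions: {baseline_tonnes:,.0f} t CO2e")
print(f"Current emissions:  {current_tonnes:,.0f} t CO2e")
print(f"Increase:           {increase_tonnes:,.0f} t CO2e ({increase_tonnes / baseline_tonnes:.0%})")
```

Tracked over time, that simple delta is the kind of figure that feeds into the sustainability reporting discussed below.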
Why might this matter to you?

The ISSB's Global Sustainability Disclosure Standards (which AAT members can read about on Knowledge Hub) are a key milestone in creating a global baseline for disclosing the effect of climate-related risks on a company's prospects. They include reporting across Scope 1, 2 and 3 emissions – that is, calculating the emissions a company generates both directly and indirectly. That means accountants need to keep an eye on their companies' energy usage.

Increased energy use that is directly attributable to AI-related tasks will be difficult to allocate across Scopes 1-3. The example of Microsoft is instructive: the company has attributed the increase in its emissions largely to indirect emissions (Scope 3) from the construction and provisioning of more data centres to meet customer demand for cloud services. According to The Register, "Scope 3 accounts for more than 96% of Microsoft's total emissions, which includes those from its supply chain, the life cycle of its hardware and devices and other indirect sources." For the record, Microsoft has said that it "aims to address the Scope 3 issue through steps such as getting suppliers to use renewable energy."

As this issue shows, companies will need to take more care to track energy use throughout the supply chain.

Christian Doherty is a business journalist and freelance writer for AAT.