Alaa Bawab, General Manager, Lenovo Infrastructure Solutions Group (ISG), META
Artificial intelligence (AI) holds the potential to solve some of the thorniest problems facing humanity, including the challenges of climate change. But at the same time, the technology – in particular, generative AI – uses a vast amount of computational power, and consequently a huge amount of energy. This is a problem, and one which is only going to grow. The amount of computing power required for cutting-edge AI models is doubling every five or six months, and it’s reasonable to imagine that it will continue to increase as demand for the technology booms. Data centres already consume up to 1.5% of the world’s electricity supply, and energy consumption is responsible for around 75% of man-made greenhouse gas emissions in the EU.
Recent research by Gartner® predicts that, “by 2030, AI could help reduce global GHG emissions by 5% to 10%”. However, by the same year, Gartner predicts that “AI could consume up to 3.5% of the world’s electricity”. The UAE, for example, is one of five GCC countries to have set active net zero targets. It aims for net zero emissions by 2050, a strategic decision that makes it the first country in the Middle East and North Africa to do so. The UAE also believes that AI has a crucial role to play in achieving these targets, as well as in driving the transition towards a green economy.
The tech industry is facing a clear challenge: to find solutions to curb the energy demands of AI, and thus unlock the technology’s full potential to help the human race.
How AI consumes power
AI consumes power in two ways: when models are trained, and during inference, when live data is run through a trained model to solve tasks. Research published in the journal Joule suggests that inference can account for at least 60% of generative AI’s energy consumption, and that adding AI capabilities to web searches can multiply energy demands tenfold. Users also tend to submit more queries to a generative model than to a search engine, because of the back-and-forth dialogue involved in reaching the desired result.
As new use cases for generative AI emerge across text, images and video, more large models will be trained, retrained and fine-tuned on a daily basis. The recent class of generative AI models requires more than a 200-fold increase in computing power to train compared with previous generations. Every new generation of models demands more computing power for inference and more energy to train: a constant cycle that keeps adding load to the underlying infrastructure.
In terms of hardware, the graphics processing units (GPUs) used for AI can draw many times the energy of a traditional CPU system. Today’s GPUs can consume up to 700 watts, and a typical installation puts eight GPUs in a server. This means a single server could be drawing nearly six kilowatts, compared with around one kilowatt for the traditional two-socket server that enterprises use for virtualisation. So the big question is: how can we make this more sustainable?
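The arithmetic behind those figures can be sketched as a quick back-of-the-envelope calculation. The values below are the indicative numbers quoted above, not measured specifications for any particular product:

```python
# Back-of-the-envelope comparison of AI server vs traditional server power draw,
# using the indicative figures from the text (assumed, not measured).
GPU_WATTS = 700          # upper bound quoted for a current AI GPU
GPUS_PER_SERVER = 8      # typical AI server configuration
CPU_SERVER_WATTS = 1000  # traditional two-socket virtualisation server

ai_server_watts = GPU_WATTS * GPUS_PER_SERVER  # GPUs alone: 5,600 W
ratio = ai_server_watts / CPU_SERVER_WATTS

print(f"AI server (GPUs only): {ai_server_watts / 1000:.1f} kW")
print(f"Roughly {ratio:.1f}x a traditional two-socket server")
```

Even before counting CPUs, memory and cooling overhead, the GPUs alone put such a server at nearly six kilowatts, several times the draw of a conventional enterprise server.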
Finding answers
The first step is to understand that sustainability is a journey: there is no singular action that can ‘fix’ it when it comes to AI. But small steps can make a big difference. The computing industry is being sent a loud, clear message to create better products that use fewer resources. This call is coming from consumers and investors, but also increasingly from governments. Being energy efficient will in future be a legal requirement for organisations in the AI space. Recent amendments to the EU AI Act will mandate that operators adopt state-of-the-art methods to cut energy consumption and enhance the efficiency of their AI platforms.
This can be achieved in three specific technical areas: the chips that generate the computational power, the computers built around those chips, and the data centre. Sustainability is becoming a competitive differentiator for both chip makers and PC makers, and will become more so as companies work towards their ESG goals. In the coming decades, new advances such as analogue chips could offer an energy-efficient alternative well suited to neural networks, according to research published in the journal Nature.
In the data centre, older air-cooling technologies are already struggling with the high energy demands of AI, and customers are turning to liquid cooling to minimise energy consumption. By efficiently transferring the heat generated by generative AI into water, customers can save 30-40% on electricity. Data centres powered by green energy sources will be key to reducing AI’s carbon footprint. ‘As a service’ approaches to AI technology can also help to minimise waste and ensure that organisations are using the newest, most sustainable hardware without up-front capital outlay.
AI for good
There is a trade-off around AI and its energy demands that needs to be discussed. Some are using AI for the benefit of humankind, by improving medicine or tackling climate change, for example, while others are using it to generate entertainment. This raises questions around whether we should view those different energy demands differently.
It is certain that AI has enormous potential to do good, already having an impact in many areas. There are dozens of examples of how AI holds the potential to mitigate the impacts of climate change, with the UN pointing out that it is not only helping to better forecast and understand extreme weather, but also offering direct help to communities impacted by this.
In addition, AI can offer new understanding of the world around us, which could in turn help to curb greenhouse gas emissions. In smart cities, it has enormous potential to cut emissions by saving minutes or hours of heating and air conditioning at city scale, learning residents’ habits and turning systems down gradually in the hour before they leave their homes. The technology can also regulate traffic across a city so that vehicles drive efficiently and traffic jams are prevented. Norwegian start-up Oceanbox.io is harnessing predictive AI in its mission to understand the depths of the ocean, forecasting the movement of currents, which can help to combat the spread of pollution and help vessels reduce their fuel use.
AI could also revolutionise medicine, accelerating the analysis of the human genome and potentially ushering in a new era of precision healthcare, where treatments are tailored to individual DNA. Tools such as GOAST (Genomics Optimization and Scalability Tool) are speeding up genetic analysis to the point where personalised medicine could work at population scale. With specially tuned hardware and optimised workflows supported by AI, GOAST can process a human genome in as little as 24 minutes, a task that used to take up to 150 hours.
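To put those figures in perspective, the implied speedup can be worked out directly. The numbers are the indicative ones quoted above (up to 150 hours conventionally versus 24 minutes with GOAST), not benchmark results:

```python
# Rough speedup implied by the figures quoted in the text (indicative only).
legacy_minutes = 150 * 60  # up to 150 hours of conventional analysis, in minutes
goast_minutes = 24         # quoted GOAST throughput per human genome

speedup = legacy_minutes / goast_minutes
print(f"Implied speedup: ~{speedup:.0f}x")
```

That works out to a speedup of several hundred times, which is what moves genomic analysis from a research exercise to something feasible at population scale.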
AI’s contribution to a Net Zero world
There is no question that AI uses a lot of power, but we can tackle this step by step – by using warm water cooling instead of air cooling, harnessing sustainable energy sources to drive data centres, and through innovations in chip and computer design.
In so many ways, AI can also offer positives for humanity and become a powerful force driving the world toward the UN’s Sustainable Development Goals. It can help to understand and combat climate change, reduce inequality, and preserve our oceans and forests. Used responsibly, AI can go hand in hand with sustainable objectives. As the world comes together to drive towards net zero, AI will increasingly play an important part.