AI and generative AI (GenAI) are driving a rapid increase in electricity consumption, with data center power demand projected to grow by as much as 160% over the next two years, according to Gartner, Inc. The research firm warns that by 2027, 40% of existing AI data centers will face operational constraints due to power availability. The explosive growth of hyperscale data centers needed to support GenAI applications is creating demand for energy that is expected to outpace utility providers' ability to expand capacity. Gartner predicts the resulting power shortages will limit the growth of new data centers from 2026 onward.
Gartner estimates that the power required to run AI-optimized servers will reach 500 terawatt-hours (TWh) per year by 2027, more than double the consumption in 2023. As demand grows for ever-larger data centers to handle the vast amounts of data behind the large language models (LLMs) driving GenAI, near-term power shortages are likely to persist for years: new power generation, transmission, and distribution capacity takes years to bring online and will not relieve the current shortfall any time soon.
The surge in electricity demand is also expected to drive up power prices, raising the cost of operating data centers. Major power users are already securing long-term, guaranteed power contracts to lock in a stable energy supply. Data center operators, in turn, will pass these increased costs on to AI and GenAI service providers, who will need to factor the higher energy costs into their business models.
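As a rough illustration of how that pass-through might look, the sketch below estimates the annual electricity cost of a single AI-optimized server at two electricity prices. Every figure in it (server power draw, PUE, and the $/kWh prices) is a hypothetical assumption chosen for the example, not a Gartner number.

```python
# Back-of-envelope estimate of how a higher electricity price feeds into the
# cost of running one AI-optimized server. All constants are illustrative
# assumptions, not Gartner data.

SERVER_POWER_KW = 10.0   # assumed draw of one AI-optimized (multi-GPU) server
PUE = 1.3                # assumed power usage effectiveness (cooling/overhead)
HOURS_PER_YEAR = 24 * 365

def annual_energy_cost(price_per_kwh: float) -> float:
    """Yearly electricity cost in dollars for one server at a given $/kWh price."""
    return SERVER_POWER_KW * PUE * HOURS_PER_YEAR * price_per_kwh

baseline = annual_energy_cost(0.10)   # assumed baseline price: $0.10/kWh
elevated = annual_energy_cost(0.15)   # assumed post-shortage price: $0.15/kWh

print(f"Baseline:  ${baseline:,.0f}/year per server")
print(f"Elevated:  ${elevated:,.0f}/year per server")
print(f"Increase passed on to GenAI providers: ${elevated - baseline:,.0f}/year per server")
```

Scaled across the thousands of servers in a hyperscale facility, even a modest per-kWh increase of this kind becomes a cost that operators are unlikely to absorb themselves.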
In addition, the extra power demand could set back sustainability goals, as energy suppliers may have to keep fossil fuel plants running longer than planned to meet it, undermining efforts to reduce CO2 emissions from data centers. Renewable sources such as wind and solar cannot supply power around the clock, making it difficult for data centers to meet demand without relying on fossil fuels or nuclear power. In the longer term, technologies like improved battery storage or small nuclear reactors may help achieve sustainability goals, but the short-term workarounds are likely to push CO2 emissions higher.
Gartner recommends that organizations reassess their sustainability goals and factor the growing power demands of data centers into plans for future AI initiatives. Companies should look for ways to minimize power consumption, for example through more efficient approaches such as edge computing or smaller language models, and should negotiate long-term contracts for data center services to secure power at reasonable rates. By planning ahead, organizations can manage the risks of rising power demand and higher electricity costs while ensuring the continued growth of AI and GenAI applications.
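To make the "smaller models use less power" point concrete, here is a minimal sketch comparing estimated daily inference energy for a large and a small language model. The per-token energy figures and the workload size are purely hypothetical placeholders used to show the shape of the calculation; they are not published benchmarks or Gartner figures.

```python
# Illustrative comparison of inference energy for a large vs. a small LLM.
# The per-token energy values below are hypothetical assumptions.

JOULES_PER_TOKEN = {
    "large_llm": 4.0,   # assumed energy per generated token, large model
    "small_llm": 0.5,   # assumed energy per generated token, small model
}

def inference_energy_kwh(model: str, tokens_per_day: int) -> float:
    """Daily inference energy in kWh for the assumed per-token cost."""
    joules = JOULES_PER_TOKEN[model] * tokens_per_day
    return joules / 3.6e6  # 1 kWh = 3.6 million joules

daily_tokens = 1_000_000_000  # assumed workload: 1 billion generated tokens/day
for model in JOULES_PER_TOKEN:
    print(f"{model}: ~{inference_energy_kwh(model, daily_tokens):,.0f} kWh/day")
```

Under these assumed numbers the smaller model serves the same workload for roughly an eighth of the energy, which is the kind of trade-off Gartner is suggesting organizations weigh when choosing between model sizes.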