The rapid advancement of artificial intelligence, particularly the training of the large-scale models that power many of today's widely used applications, is driving renewed growth in electricity demand, according to a new report from the Electric Power Research Institute and Epoch AI, which forecasts that training a leading model could require more than 4 gigawatts of power by 2030.

The training of large-scale AI models has historically required large, localized power supplies. Despite rapid efficiency gains, the power demands of training a leading model have more than doubled every year for the past decade, according to the report.

AI companies have found that increasing model size and complexity can provide better performance, which in turn drives the need for additional compute and electrical power. 

The report finds that the AI industry will likely continue to scale up its models in the coming years, despite recent computational efficiency breakthroughs.

Overall power demand for AI extends well beyond large-scale training. Significant power capacity will be allocated toward deploying AI to serve users, as well as toward training smaller models and conducting AI research.

Total AI power capacity in the U.S. is estimated at around 5 GW today and could reach more than 50 GW by 2030 -- matching the total global demand from data centers today and comprising a rapidly growing share of overall data center power demand.
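The projections above rest on simple exponential-growth arithmetic. The following sketch is illustrative only: the 2x-per-year training growth rate and the ~5 GW to ~50 GW U.S. capacity figures come from the report as summarized here, while the 2025 training-power baseline is a back-calculated assumption, not a number the report states.

```python
def extrapolate(power_gw: float, annual_growth: float, years: int) -> float:
    """Project power demand forward assuming a constant exponential growth rate."""
    return power_gw * annual_growth ** years

# If leading-model training power doubles annually and exceeds 4 GW by 2030,
# the implied (assumed, back-calculated) 2025 starting point is 4 / 2**5 = 0.125 GW.
train_2030 = extrapolate(0.125, 2.0, 5)   # 4.0 GW

# Total U.S. AI capacity growing from ~5 GW to ~50 GW over five years
# implies a compound annual growth rate of (50 / 5) ** (1/5), about 1.58x.
implied_rate = (50 / 5) ** (1 / 5)
```

The same compound-growth formula underlies both figures; only the rates differ (2.0x per year for frontier training versus roughly 1.58x per year for total U.S. AI capacity).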

"The energy demands of training cutting-edge AI models are doubling annually, soon rivaling the output of the largest nuclear power plants," said Jaime Sevilla, director of Epoch AI. "This report offers a rigorous, data-driven look at these trends and where they’re headed. Epoch AI will continue investigating the energy demand of AI and related topics."

"AI applications are becoming prevalent in our daily lives and will likely play a key role in the energy system of the future," said EPRI President and CEO Arshad Mansoor. "To meet these rising energy demands, data center developers and power providers are embracing innovative solutions in a build-to-balance approach. Building new infrastructure along with balancing through flexibility in data center design will be critical to accelerate grid connections, while minimizing costs and enhancing system reliability," he added.

EPRI launched the DCFlex collaborative last year to demonstrate the technologies, policies, and tools to make data center flexibility a reality. 

Data center flexibility, including geographically distributed training data centers, could transform data centers from passive customers to grid assets to improve reliability, lower costs, and speed connection. 

The effort, which brings together over 45 companies, including founding members Google, Meta, NVIDIA, and various utilities, recently launched its first real-world field demonstrations in Lenoir, N.C., Phoenix, Ariz., and Paris, France.

The New York Power Authority is one of the founding members of DCFlex.

Emerald AI on July 1 released results of a first-of-its-kind demonstration as part of EPRI's DCFlex Initiative in Phoenix, Ariz., alongside partners including Oracle Cloud Infrastructure (OCI), NVIDIA, EPRI, and public power utility Salt River Project (SRP).