AI Data Centers Require Vast Amounts Of Energy

By Liam Moore

New artificial intelligence (AI) data centers are being built worldwide to keep up with the rapid growth of AI usage, and they require a great deal of electrical power. By 2026, global data center electricity use is projected to grow from 460 terawatt-hours (TWh) in 2022 to roughly 1,000 TWh, about the annual electricity consumption of the entire country of Japan, making data centers one of the most energy-intensive industries in the world. A single AI data center can use 10 to 50 times more energy per unit of floor space than a typical commercial building. If data centers cannot be made more energy-efficient, electricity prices near them will rise sharply, according to Ray Zuckerman of Server Lift, an IT company.

Data centers in Ashburn, Virginia. Photo credit: Hugh Kenny, Virginia Mercury.

Data centers are large buildings that house and maintain the servers and GPUs needed to train and run AI systems like ChatGPT. Because of the increasing demand for AI, data centers have begun to pop up across the country and around the world, especially in Northern Virginia, where close to 70% of the world’s data centers are located, according to the Virginia Mercury.

With the number of data centers increasing, the US Department of Energy says it has been “anticipating and planning for rising electricity demand,” according to its website. A similar phenomenon occurred in the early 2000s, when the mass adoption of energy-inefficient computers drove a large boom in electricity use. Demand grew by 15 to 20% because of the new technology, but as computers became more efficient, usage fell again. The department expects a similar pattern with AI data centers.

Data center servers. Photo credit: Department of Energy.

As of 2023, data centers use up to 4% of all electricity worldwide, according to the Department of Energy. AI is increasingly present in people’s everyday lives. ChatGPT, run by OpenAI, averages 100 million users per week and has begun to replace search engines like Google. For businesses, AI’s ability to speed up time-intensive work, such as creating personalized experiences for consumers, streamlining workflows, and improving decision-making, has made it popular with companies like Amazon, Walmart, and Microsoft.

AI systems require far more power to answer a query than older search engines like Google and Bing. The World Economic Forum says that AI could use around “33 times more energy to complete a task” than a regular search engine.

The energy consumption of these complex computing systems has pushed electricity demand in data centers to unprecedented levels.

Rising electricity demand in data centers. Image credit: US Department of Energy.

The construction of data centers has also disrupted the communities where they are planned and built. An October article in The New York Times focused on Peculiar, a rural Missouri town, and the effects of adding a data center to a small community. Large corporations like Google, Meta, and Microsoft are drawn to these towns by cheap land, but the massive facilities threaten residents’ way of life. Data centers do bring new jobs to the area, but they are specialized roles focused mainly on maintaining the facility and would likely be filled by outside workers rather than townspeople. In Peculiar, residents signed a petition to block the construction of a five-hundred-acre data center, concerned that it would take away from the town’s small-town character.

These data centers make AI-driven innovation possible. But the rising cost of energy, and the many small communities affected by new construction, are worth weighing when deciding how AI systems fit into our everyday lives.

Featured image credit: Teson Solutions.

About the author

Liam Moore is a member of the class of 2025.