Data centers are at the heart of the modern digital economy, powering everything from cloud storage to streaming services to the rapidly growing field of artificial intelligence. As demand for AI computing power has surged, data centers have proliferated, ushering in a new era of energy use across the globe. In 2024 alone, global installed data center capacity increased by an estimated 20%, and demand is projected to grow at a 15% CAGR between 2023 and 2030.
The AI boom has set off an arms race among developers building new (“greenfield”) facilities to win hyperscale contracts from major tech companies like Microsoft and Meta. However, uncertain macroeconomic forces and tightening resources have created a critical need for innovative solutions that drive efficiency within existing (or “brownfield”) facilities. “Most investment is flowing towards securing land, power and equipment for new data center builds, but there's a massive opportunity for owner/operators to invest in efficiency at currently-deployed sites to improve their competitive positions and plan for higher utilization across their portfolios,” says Ben Tacka of Phaidra, which builds AI-enabled industrial control systems.
Data centers are facing a watershed moment in resource use and management – and we believe that digital solutions won’t just be helpful, but essential to meeting technological, financial, and sustainability goals across the economy.
According to a 2024 report by Berkeley Lab, data centers consumed about 4.4% of total U.S. electricity in 2023 and are expected to consume approximately 6.7-12% of total U.S. electricity by 2028. Their total electricity usage jumped from 58 TWh in 2014 to 176 TWh in 2023. At this scale, even a 1% improvement in energy efficiency can have massive ripple effects.
Assuming an installed data center capacity of approximately 29 GW across North America and Europe and a total energy cost of up to $20B per year, a 1% efficiency improvement would represent $200M in savings every year across these two regions alone[1]. Until now, this marginal difference wasn't a priority: data center developers benefited from cheap power and land, which reduced the focus on optimizing resource usage. But with increasing power constraints and ballooning project scale, efficiency improvements are now essential.
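As a quick back-of-envelope check on those figures, here is a minimal sketch using only the Berkeley Lab numbers and our own capacity and cost estimates cited above; no other assumptions are added.

```python
# Back-of-envelope estimate of what a 1% efficiency gain is worth,
# using the figures cited above (illustrative only).

US_USAGE_2014_TWH = 58            # U.S. data center electricity use, 2014
US_USAGE_2023_TWH = 176           # U.S. data center electricity use, 2023
ANNUAL_ENERGY_COST_USD = 20e9     # ~$20B/yr across North America + Europe
EFFICIENCY_GAIN = 0.01            # a 1% improvement

growth_multiple = US_USAGE_2023_TWH / US_USAGE_2014_TWH
energy_saved_twh = US_USAGE_2023_TWH * EFFICIENCY_GAIN
dollars_saved = ANNUAL_ENERGY_COST_USD * EFFICIENCY_GAIN

print(f"2014->2023 growth: {growth_multiple:.1f}x")             # ~3.0x
print(f"1% of 2023 U.S. usage: {energy_saved_twh:.2f} TWh")     # ~1.76 TWh
print(f"1% of annual energy spend: ${dollars_saved/1e6:.0f}M")  # ~$200M
```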
Other key factors are shaping the market as well.
Operators monitor key metrics that drive both energy and water efficiency in data centers, including Power Usage Effectiveness (PUE), Water Usage Effectiveness (WUE), and Power Capacity Effectiveness (PCE). Balancing these metrics is a complex task, says Jasper de Vries of CoolGradient, a startup providing AI-driven optimization recommendations to data center operators. “There is a clear tradeoff between water and power usage. This optimization challenge is where software solutions can step in to provide visibility and support decision-making.” So far, hyperscale data centers have been at the forefront of efficiency improvements thanks to in-house expertise and resources, but plateauing gains have opened the field to emerging software tools and models, particularly for colocation data centers.
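For readers less familiar with these metrics: PUE is total facility energy divided by IT equipment energy, and WUE is water consumed per unit of IT energy (liters per kWh). The sketch below computes both from hypothetical meter readings (all values are made up for illustration) and shows why the two can pull in opposite directions.

```python
# Hypothetical daily meter readings for one facility (illustrative values).
total_facility_kwh = 1_320_000   # everything: IT + cooling + power losses
it_equipment_kwh = 1_000_000     # servers, storage, network
water_used_liters = 1_800_000    # evaporative/adiabatic cooling makeup water

pue = total_facility_kwh / it_equipment_kwh   # Power Usage Effectiveness
wue = water_used_liters / it_equipment_kwh    # Water Usage Effectiveness, L/kWh

print(f"PUE: {pue:.2f}")        # 1.32 -- lower is better; 1.0 is the ideal floor
print(f"WUE: {wue:.2f} L/kWh")  # 1.80 -- the tradeoff: evaporative cooling
                                # often lowers PUE while raising WUE
```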
While energy usage depends on the type, size, and age of a data center, two major areas consistently drive the greatest power and water inefficiencies: (1) the servers and storage devices responsible for compute, and (2) the HVAC and cooling systems that maintain the temperatures those servers need to run efficiently. In most cases, these processes are managed through statically programmed controls with human monitoring, but operators are beginning to turn to new tools that use cutting-edge technology to automate and fine-tune them. “Data centers are among the most complex buildings to optimize, with diverse equipment, changing parameters, and siloed data systems,” says Abhishek Sastri, the founder and CEO of FLUIX AI, a pre-seed startup building data center O&M automation software. “Software can bridge the gap between HVAC, water, and IT systems, enabling autonomous control based on environmental factors and IT load data.”
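To make the “siloed systems” point concrete, here is a minimal sketch (the field names and fan rating are illustrative choices of ours, not any product's data model) of what bridging those silos can look like: pulling IT load, weather, and BMS readings into one structure that control logic can reason over, plus a toy estimate of why small setpoint changes matter, since fan power scales roughly with the cube of fan speed.

```python
from dataclasses import dataclass

@dataclass
class FacilitySnapshot:
    """One unified view across normally siloed systems (fields are illustrative)."""
    it_load_kw: float          # from DCIM / IT monitoring
    outdoor_temp_c: float      # from a weather feed
    outdoor_rh_pct: float      # relative humidity, %
    chw_supply_temp_c: float   # from the BMS chilled-water loop
    crah_fan_speed_pct: float  # from the cooling units

def crah_fan_power_kw(snapshot: FacilitySnapshot, rated_fan_kw: float = 15.0) -> float:
    """Toy estimate: per the fan affinity laws, power scales with the cube of speed."""
    return rated_fan_kw * (snapshot.crah_fan_speed_pct / 100) ** 3

snap = FacilitySnapshot(it_load_kw=3200, outdoor_temp_c=18.0, outdoor_rh_pct=55,
                        chw_supply_temp_c=11.0, crah_fan_speed_pct=80)
print(f"{crah_fan_power_kw(snap):.1f} kW")  # 7.7 kW -- slowing fans from 100% to 80%
                                            # roughly halves their power draw
```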
Here are a few key challenges facing data centers today and how we see software stepping in to drive solutions:
Cooling accounts for ~40% of a data center’s energy consumption on average. Traditional cooling systems lack intelligence and often run with unnecessary redundancy to avoid breaching thermal limits and triggering alarms that might impact tenant service level agreements (SLAs). The result is wasted energy and inflated operational costs.
How can software help? Software-powered cooling management systems leverage artificial intelligence to optimize temperature and airflow dynamically. These solutions analyze historical data, real-time sensor inputs, and predictive models to fine-tune cooling operations. They can drive efficiency by switching cooling units from mechanical mode to free cooling mode when outside temperatures are low, using adiabatic assistance on chillers when conditions allow, or optimizing water pump setpoints. Combining several of these actions can reduce energy usage by 10-30%[1].
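As a simplified illustration of what one such action can look like in code (the mode names, thresholds, and setpoint below are hypothetical, not any vendor's actual control logic), a supervisory routine might pick a cooling mode from outdoor conditions:

```python
def choose_cooling_mode(outdoor_temp_c: float,
                        outdoor_rh_pct: float,
                        supply_setpoint_c: float = 24.0) -> str:
    """Pick a cooling mode from outdoor conditions (illustrative thresholds)."""
    if outdoor_temp_c <= supply_setpoint_c - 6:
        # Cold enough outside to reject heat without running compressors.
        return "free_cooling"
    if outdoor_temp_c <= supply_setpoint_c + 8 and outdoor_rh_pct < 40:
        # Warm but dry: evaporative assist trims compressor load,
        # at the cost of additional water use.
        return "adiabatic_assist"
    return "mechanical"

print(choose_cooling_mode(10, 60))  # free_cooling
print(choose_cooling_mode(27, 30))  # adiabatic_assist
print(choose_cooling_mode(35, 70))  # mechanical
```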
Many data centers operate with minimal real-time insight into their energy usage, leading to inefficiencies. Without proactive monitoring and preventative maintenance, equipment often consumes more energy than necessary, operating below optimal efficiency. Over time, this not only drives up energy costs but also increases the risk of premature equipment failure, leading to unexpected and costly replacements.
How can software help? By leveraging predictive maintenance software, operators can anticipate equipment failures, optimize energy consumption, and proactively address inefficiencies before they escalate. Predictive maintenance software identifies equipment consuming excessive energy or operating outside its normal temperature range based on historical performance, then flags these issues for operators, allowing them to address potential problems (e.g., cleaning a clogged filter) before they impact operations.
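A very simple version of that flagging logic might compare an asset's current power draw against its own historical baseline. This is a statistical sketch with made-up readings; production tools use richer models and many more signals.

```python
from statistics import mean, stdev

def flag_energy_anomaly(history_kw: list[float],
                        current_kw: float,
                        z_threshold: float = 3.0) -> bool:
    """Flag an asset whose power draw deviates sharply above its own baseline."""
    baseline, spread = mean(history_kw), stdev(history_kw)
    if spread == 0:
        return current_kw > baseline
    z = (current_kw - baseline) / spread
    return z > z_threshold   # only excess consumption is flagged here

# A cooling unit that normally draws ~50 kW now pulls 62 kW -- for example,
# a clogged filter forcing fans to work harder.
history = [49.5, 50.2, 50.8, 49.9, 50.4, 50.1, 49.7, 50.3]
print(flag_energy_anomaly(history, 62.0))  # True -> raise a work order
```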
While energy costs are the primary concern for data center operators, water usage for cooling is a critical sustainability factor. Some cooling methods can consume vast amounts of water, contributing to resource depletion in water-scarce regions.
How can software help? Software can play a crucial role in balancing the tradeoff between energy and water usage in data centers by dynamically adjusting HVAC controls and temperature setpoints. For instance, it can intelligently decide when to use methods like adiabatic cooling (which leverages water evaporation to reduce energy consumption) rather than purely mechanical cooling, depending on temperature, humidity, and workload. These real-time adjustments help optimize both energy and water efficiency while maintaining peak performance.
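One way to reason about that decision in code is to weigh the electricity an evaporative stage would save over an hour against the water it would consume. This is a toy cost comparison; the prices, savings factor, and flow rate below are placeholders we've invented, not real benchmarks.

```python
def prefer_adiabatic(compressor_kw: float,
                     expected_energy_savings_pct: float,
                     water_liters_per_hour: float,
                     electricity_usd_per_kwh: float = 0.08,
                     water_usd_per_liter: float = 0.002) -> bool:
    """Return True if the hourly electricity saved outweighs the water cost.

    All prices and the savings factor are illustrative placeholders; a real
    controller would also weigh site water scarcity and WUE targets.
    """
    energy_saved_usd = compressor_kw * expected_energy_savings_pct * electricity_usd_per_kwh
    water_cost_usd = water_liters_per_hour * water_usd_per_liter
    return energy_saved_usd > water_cost_usd

# Hot, dry afternoon: evaporative assist might cut compressor load ~20%.
print(prefer_adiabatic(compressor_kw=800, expected_energy_savings_pct=0.20,
                       water_liters_per_hour=1500))   # True  (~$12.8 saved vs ~$3 of water)
# Humid day with little evaporative benefit.
print(prefer_adiabatic(compressor_kw=800, expected_energy_savings_pct=0.02,
                       water_liters_per_hour=1500))   # False
```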
Historically, data centers have been run on DCIM (Data Center Infrastructure Management) solutions, a market largely dominated by incumbents. In most cases, these are all-in-one platforms incorporating energy tracking, capacity management, asset planning, and more. However, new players have entered the space, leveraging AI to reduce energy consumption. It’s important to note that underlying infrastructure systems (e.g., BMS, DCIM, SCADA) remain a key component of a data center’s tech stack, and new players are usually building on top of them.
The rapid expansion of data centers presents a generational challenge for the grid, and a massive opportunity for innovative solutions. By integrating advanced software solutions, operators can drive substantial efficiency gains, reduce environmental impact, and lower operational costs. Learn more by downloading our Deep Dive report and stay tuned for more research on data center energy use. Working in the space? Reach out!
[1] Energize Analysis