The Overlooked Cost of Data Center Power
The public conversation around data centers focuses on two resources: water and power. More specifically, on how much water and how much power a data center will use.
That conversation misses a critical part of the resource equation.
Yes, data centers consume power and water. But where does that power actually come from, and what resources are used to create it before it ever reaches a server?
To understand the full impact, it is worth starting with a simple primer on how electricity is generated.
At its core, most large-scale power generation is based on a straightforward process. Water is heated until it becomes steam. That steam spins a turbine. The turbine turns a generator. The generator produces electricity.
That is the foundation of the global power system.
When people picture this, they often think of the large cooling towers familiar from popular culture, most iconically the opening scenes of The Simpsons. Those towers releasing white plumes into the sky are not just visual markers of power generation. They represent water being used, heated, converted to steam, and ultimately dissipated as part of the process.
This is where the conversation around data centers becomes incomplete.
A data center does not just use water at the facility level. It also relies on water that is used upstream in the generation of the electricity it consumes.
In effect, the system uses water twice. Once at the point of generation and again at the point of consumption.
There is another layer to this inefficiency that is even less visible.
After electricity is generated, it does not move directly into a data center. It travels. Often over long distances. Through transmission lines, substations, and distribution networks before it reaches its destination.
At each step, energy is lost.
According to the U.S. Energy Information Administration, approximately 5 percent of all electricity generated in the United States is lost during transmission and distribution.
That number may appear small, but at scale it becomes significant.
For large industrial users such as data centers, electricity often travels hundreds of miles from the point of generation. Losses accumulate across high voltage transmission, voltage step-down, and local distribution. The result is that a meaningful portion of the energy that was created never arrives at the facility.
It is generated, it consumes water, and it is paid for, but it is never used.
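The way losses accumulate across these stages can be sketched with a few lines of arithmetic. The per-stage loss fractions below are illustrative assumptions chosen to sum to roughly the EIA's ~5 percent total; they are not measured values for any particular grid.

```python
# Sketch: how losses compound multiplicatively across grid stages
# before power reaches a facility. Per-stage figures are illustrative
# assumptions; only the ~5% combined T&D figure comes from the EIA.

STAGE_LOSSES = {
    "high-voltage transmission": 0.02,
    "voltage step-down": 0.01,
    "local distribution": 0.02,
}

def delivered_fraction(stage_losses):
    """Fraction of generated energy that survives every stage."""
    fraction = 1.0
    for loss in stage_losses.values():
        fraction *= 1.0 - loss
    return fraction

frac = delivered_fraction(STAGE_LOSSES)
print(f"Delivered: {frac:.1%} of generation")   # 95.1%
print(f"Lost in transit: {1 - frac:.1%}")       # 4.9%
```

The point of the multiplication is that stages compound: three small losses of 2, 1, and 2 percent leave about 95 percent of the original generation, not 95 percent plus rounding.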
At hyperscale, the implications are substantial.
A 1 gigawatt data center campus relying on remote generation can require tens of megawatts of additional power production simply to offset transmission losses. That additional generation requires additional water at the plant level, further increasing the total resource footprint.
This is the hidden cost of distance.
When power and compute are separated, inefficiencies compound. Water is used to generate electricity. A portion of that electricity is lost in transit. More water is consumed to produce replacement power. All before a single workload is processed.
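The 1 gigawatt example above can be made concrete with a back-of-the-envelope calculation. The 5 percent loss figure is the EIA number cited earlier; the water intensity is a rough placeholder assumption (about 0.5 gallons consumed per kWh at a thermoelectric plant), not a sourced value, so the water total should be read as an order-of-magnitude sketch.

```python
# Sketch: extra generation and upstream water needed to deliver 1 GW
# through a grid with ~5% T&D losses (EIA figure). The water-per-MWh
# intensity below is an assumed placeholder, not a sourced value.

DELIVERED_MW = 1_000          # 1 GW data center campus
TD_LOSS = 0.05                # ~5% transmission & distribution loss
WATER_GAL_PER_MWH = 500       # assumed plant-level water consumption
HOURS_PER_YEAR = 8_760

required_mw = DELIVERED_MW / (1 - TD_LOSS)   # must generate more than delivered
extra_mw = required_mw - DELIVERED_MW        # generation that never arrives
extra_mwh_year = extra_mw * HOURS_PER_YEAR
extra_water_gal = extra_mwh_year * WATER_GAL_PER_MWH

print(f"Generation required: {required_mw:,.0f} MW")          # ~1,053 MW
print(f"Extra generation for losses: {extra_mw:,.0f} MW")     # ~53 MW
print(f"Upstream water for lost power: {extra_water_gal/1e6:,.0f} M gal/yr")
```

Under these assumptions, delivering a steady gigawatt requires generating roughly 53 megawatts that never reach the campus, which in turn consumes water at the plant on the order of hundreds of millions of gallons per year. That is the compounding the paragraph above describes.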
The traditional model has treated power as something that can be generated anywhere and delivered everywhere. That assumption is now being challenged by scale, by infrastructure constraints, and by the growing demands of AI-driven compute.
A more efficient model is emerging.
Bringing power generation closer to where it is consumed reduces the need for long-distance transmission. It minimizes losses across the grid. It reduces the amount of excess generation required to deliver reliable capacity. And it directly lowers the upstream water footprint associated with power production.
This shift is not just about efficiency. It is about precision in how resources are used.
Every megawatt that is lost in transmission represents energy that had to be created but never delivered. Every unit of water used to generate that lost energy represents a resource that provided no computational value.
As data center demand continues to scale, these inefficiencies become increasingly important.
The future of infrastructure will not be defined only by how much power is consumed. It will be defined by how efficiently that power is generated, delivered, and ultimately converted into compute.
Because the true cost of power is not just measured at the data center.
It begins at the source.

