In areas near data centers, wholesale electricity costs as much as 267% more than it did five years ago. That number comes from Bloomberg's analysis of grid pricing data across the United States, and it is not an abstraction. In the mid-Atlantic region served by the PJM grid, data center demand contributed to an estimated $9.3 billion price increase in the 2025-26 capacity market. The average residential bill in western Maryland rose by $18 a month; in Ohio, by $16. Nobody voted on that. Nobody signed up for it. It just appeared on a utility statement, somewhere beneath the line items for distribution and transmission.

This is not a tech story. It is a power story, in both senses of the word.

The Scale Nobody Is Accounting For

The numbers are genuinely staggering if you let yourself sit with them. Global data centers consumed roughly 415 terawatt-hours of electricity in 2024, about 1.5% of all electricity used on Earth. The IEA projects that figure more than doubling to 945 TWh by 2030. In the United States specifically, data centers used 183 TWh in 2024, more than 4% of the country's total electricity consumption; by 2030, that figure is projected to hit 426 TWh, a 133% increase. To put that in concrete terms: a typical AI-focused hyperscale data center already consumes as much electricity annually as 100,000 households, and the largest facilities currently under construction are expected to use twenty times that.
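A quick back-of-the-envelope check makes those figures concrete. The sketch below assumes an average U.S. household uses about 10,500 kWh per year, a rough ballpark rather than a figure from the IEA report:

```python
# Back-of-the-envelope check on the consumption figures above.
# Assumption: an average U.S. household uses ~10,500 kWh/year
# (a rough ballpark, not a figure from the IEA report).

KWH_PER_HOUSEHOLD_YEAR = 10_500

def pct_increase(old_twh: float, new_twh: float) -> float:
    """Percent increase from old to new."""
    return (new_twh / old_twh - 1) * 100

# Global: 415 TWh (2024) -> 945 TWh (2030)
print(f"Global growth: {pct_increase(415, 945):.0f}%")  # ~128%, i.e. more than double

# U.S.: 183 TWh (2024) -> 426 TWh (2030)
print(f"U.S. growth: {pct_increase(183, 426):.0f}%")    # ~133%, matching the projection

# What 100,000 households use in a year, converted to TWh
twh = 100_000 * KWH_PER_HOUSEHOLD_YEAR / 1e9  # kWh -> TWh
print(f"100,000 households = {twh:.2f} TWh/year")       # ~1 TWh/year per large facility
```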

Meanwhile, Amazon, Microsoft, Google, and Meta collectively spent over $200 billion on capital expenditures in 2024, a 62% year-over-year increase, all of it pointed at building more of this infrastructure. Amazon's CapEx alone hit $85.8 billion, up 78% from the year prior. The Stargate initiative, announced with considerable fanfare, aims to spend $500 billion on as many as ten data centers, each potentially requiring five gigawatts of power, more than the total electricity demand of New Hampshire.

Follow the incentives here. The companies building this infrastructure are not paying for the grid upgrades it requires. A Carnegie Mellon study estimates that data centers and cryptocurrency mining could lead to an 8% increase in the average U.S. electricity bill by 2030, potentially exceeding 25% in northern Virginia; Virginia hosts more data centers than any other state in the country. Dominion Energy, which serves that region, proposed its first base-rate increase since 1992 in February 2025, adding roughly $8.51 per month for a typical household in 2026. These costs diffuse outward. The benefits concentrate upward.
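To translate those percentages into dollars, here is a minimal sketch. The $140 average monthly bill is an assumption for illustration, not a number from the Carnegie Mellon study:

```python
# What an 8% (or 25%) increase means in dollars, assuming an average
# U.S. residential bill of ~$140/month (an illustrative ballpark; actual
# bills vary widely by state and season).

AVG_MONTHLY_BILL = 140.0  # assumption, not from the CMU study

for label, pct in [("U.S. average (+8%)", 0.08), ("Northern Virginia (+25%)", 0.25)]:
    print(f"{label}: +${AVG_MONTHLY_BILL * pct:.2f}/month, "
          f"+${AVG_MONTHLY_BILL * pct * 12:.0f}/year")

# U.S. average (+8%): +$11.20/month, +$134/year
# Northern Virginia (+25%): +$35.00/month, +$420/year
```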

The Grid Was Not Built for This

Nobody is asking the obvious question at the scale it deserves: who authorized the transformation of the American electric grid into AI infrastructure? Because that transformation is happening, whether anyone authorized it or not.

AI-specific servers drove electricity consumption from roughly 2 TWh in 2017 to 40 TWh by 2023, and the growth is far from over. The IEA projects electricity consumption by AI-accelerated servers growing at 30% annually through 2030, more than three times faster than conventional server growth. Advanced reasoning models like OpenAI's o3 require between 7 and 40 watt-hours per query, up to 100 times more than basic models. Video generation requires 1,000 to 3,000 times more energy than text generation. Each time a company rolls out a flashier, more compute-hungry product feature, the grid absorbs the difference.
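For readers who want to check the compounding, the sketch below works through the implied growth rates. The forward extrapolation and the one-billion-queries-per-day volume are illustrative assumptions, not IEA or OpenAI figures:

```python
# Rough compounding math behind the server-growth figures.
# The extrapolations below are illustrative, not official projections.

# Historical: 2 TWh (2017) -> 40 TWh (2023) implies this annual growth rate:
cagr = (40 / 2) ** (1 / 6) - 1
print(f"2017-2023 implied growth: {cagr:.0%}/year")  # ~65%/year

# Forward: 30%/year compounding from 40 TWh in 2023 through 2030:
projected = 40 * 1.30 ** 7
print(f"40 TWh at +30%/yr for 7 years = {projected:.0f} TWh")  # ~251 TWh

# Per-query energy at scale (hypothetical volume, for illustration only):
wh_per_query = 20        # within the 7-40 Wh range cited above
queries_per_day = 1e9    # assumed volume, not a reported figure
twh_per_year = wh_per_query * queries_per_day * 365 / 1e12  # Wh -> TWh
print(f"1B queries/day at {wh_per_query} Wh = {twh_per_year:.1f} TWh/year")  # ~7.3 TWh
```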

The grid is visibly struggling. In July 2024, a voltage fluctuation in northern Virginia triggered the simultaneous disconnection of 60 data centers, producing a sudden 1,500-megawatt power surplus that forced emergency grid adjustments to prevent cascading outages. In some parts of the country, AI-driven energy demand is already outpacing available capacity, forcing companies to install multiple inefficient natural gas generators on-site. As of 2024, natural gas supplied over 40% of electricity for U.S. data centers. Coal supplied around 15%. The tech companies pledging to run on clean energy are, in practice, running on fossil fuels while they wait for nuclear deals to materialize sometime in the next decade.

Some states are beginning to push back. Ohio and Georgia have moved to put the cost of grid expansion on data centers rather than on consumers. Texas enacted Senate Bill 6 in June 2025, requiring large data center loads to pay transmission infrastructure costs directly. The Netherlands and Ireland have imposed moratoriums on new data center construction in key regions until their grids stabilize. These are the right instincts. They are also, at best, the first five pages of a regulatory response to a problem that has been compounding for eight years without meaningful federal oversight.

Who This Works For

The question is not whether this technology works. It clearly does, at least for some things, for some people. The question is who it works for, and who is quietly funding the infrastructure that makes it possible.

McKinsey estimates companies will need to invest $5.2 trillion into data centers by 2030 to meet AI demand. That capital comes from somewhere. Some of it comes from equity markets. Some from government subsidies. And a quietly significant portion comes from the millions of households whose utility rates are rising in the background, not because they chose to fund AI development, but because the grid is a shared resource and the costs of straining it get socialized while the profits get privatized.

The tech industry is fond of the word "democratization." Access to AI tools, yes. Access to AI-generated images and code and summaries. But the electricity bill for generating all of it lands disproportionately on the people in Maryland and Ohio and Virginia who live near the server farms, many of whom will never use the products those server farms power. That gap between who benefits and who pays the cost is not a policy detail. It is the policy question. And right now, nobody with the authority to answer it is being asked to do so.