Somewhere in Northern Virginia, 6 data center facilities are drawing 781 MW of power right now. That is roughly the residential electricity demand of 780,000 households, concentrated in a state where electricity prices in data center corridors have climbed 267% over the past 5 years. Jerome Powell, in remarks on March 19, named data centers as a contributing factor to inflation and household energy costs. When the Federal Reserve chair and 75% of Virginia voters in a January 2026 survey are pointing at the same thing, the question stops being whether there is a problem and starts being who is supposed to fix it.

The Cost Transfer Hidden in Your Rate Increase

Goldman Sachs projects a 6% consumer electricity price increase from 2026 to 2027, driven substantially by grid demand from AI infrastructure. Utilities filed a record $31 billion in rate increase requests in 2025, double the prior year. Those requests do not show up as line items labeled "Big Tech's server costs." They show up as base rate adjustments, infrastructure surcharges, and transmission upgrades that every residential customer absorbs equally, regardless of whether they use AI products at all.

The mechanism deserves attention. A hyperscale data center can draw roughly 100 MW, the equivalent of 100,000 homes. When one connects to a regional grid, the utility must upgrade transmission infrastructure to handle the load. Under most state regulatory frameworks, those upgrade costs are distributed across the rate base. The tech company pays for its own electricity consumption; everyone else pays for the grid expansion that made that consumption possible.
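The arithmetic of that cost-spreading is simple enough to sketch. The figures below are illustrative assumptions, not numbers from any actual rate case: an assumed one-time upgrade cost, amortization period, and customer count, chosen only to show how a nine-figure project dissolves into a small monthly line item.

```python
# Illustrative sketch of how a socialized transmission upgrade lands on
# residential bills. All numbers are hypothetical assumptions for scale,
# not figures from any utility filing.

upgrade_cost = 500_000_000         # assumed one-time transmission upgrade, in dollars
recovery_years = 20                # assumed amortization period
residential_customers = 2_500_000  # assumed customers in the rate base

annual_recovery = upgrade_cost / recovery_years
monthly_per_household = annual_recovery / residential_customers / 12

print(f"${monthly_per_household:.2f} per household per month")  # $0.83
```

Under these assumptions the charge is under a dollar a month, which is exactly why it draws little scrutiny per project; the bill impact comes from dozens of such projects stacking, not from any single one.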

Bloom Energy projected in January 2026 that U.S. data center demand will nearly double, from 80 GW in 2025 to 150 GW by 2028, adding energy needs equivalent to an entire country the size of Spain. Lawrence Berkeley Lab puts data centers at 12% of U.S. electricity by 2028. Meta's planned Louisiana facility alone requires 5 GW, triple the total power consumption of New Orleans. Each of those expansions triggers infrastructure investment. Each triggers rate cases.
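The implied pace of that buildout is worth making explicit. The endpoints are from the Bloom Energy projection above; the compound annual growth rate is my own back-of-envelope calculation from them:

```python
# Implied growth rate from the cited projection: 80 GW (2025) to 150 GW (2028).
# The endpoints come from the projection; the CAGR is derived, not quoted.
start_gw, end_gw, years = 80, 150, 3

cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # 23.3%
```

Grid planning cycles run on decade timescales; load growing at over 20% a year is what forces the rapid-fire rate cases the paragraph above describes.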

The Equity Problem Nobody Is Pricing In

To be fair to the industry's defenders: 46 planned data centers representing 56 GW are designed to operate off-grid, which would genuinely reduce the cross-subsidy problem. If that buildout materializes at scale, the grid strain argument weakens somewhat. That is a real mitigation path worth watching.

But the households that cannot wait for a buildout that may take a decade to materialize are the ones already stretched by energy costs. Powell explicitly flagged affordability concerns for lower-income households in the same breath as data centers. When a fixed-income renter in Loudoun County sees her electricity bill climb because Microsoft or Amazon needed another facility nearby, she is not a stakeholder in the AI economy whose costs are being fairly distributed. She is an involuntary creditor to it.

The technology generating this demand is profitable. The companies building these facilities are among the highest-capitalized entities in human history. There is no economic case for socializing their infrastructure costs through residential rate bases, and the fact that current utility regulation permits it is a policy failure, not an inevitability.

State utility commissions should require that data center operators, not ratepayers, fund the grid interconnection and transmission upgrades their facilities necessitate. Congress should condition any federal AI infrastructure incentives on that same cost-assignment principle. The question of who benefits from AI is already settled. The question of who pays for the wires that carry it should not be this hard to answer.