Hendricks County, Indiana ran a quiet experiment between 2020 and 2026. Distribution costs tripled, from $0.10 to $0.30 per kilowatt-hour, while data centers moved in and demand surged. Nobody held a press conference about it. It just showed up on bills.
That's the story the AI infrastructure boom doesn't tell. The headline version is $770 billion in global data center capital spending in 2025, vacancy rates at 1.4%, and private sector plans to triple capacity by 2030. The footnote version is who pays for the transmission lines, substations, and generation capacity that make all of it possible. In too many states right now, the answer is residential ratepayers.
The Tariff Gap Is the Real Risk
Xcel Energy, which serves 1.6 million Colorado customers, projects data centers will account for roughly two-thirds of all new electricity demand over the next five years, requiring 950 MW of new generation. On April 2, Xcel proposed a tariff that would require new data centers to cover 80% of power costs, pay for new generation and transmission, sign 15-year contracts, and put up a $600,000 upfront fee. The Colorado Office of the Utility Consumer Advocate called it a good framework. It is pending PUC approval.
That's the right structure. The problem is that 77 similar tariffs are still pending across 36 states, while only 29 have been approved. The gap between pending and approved is where ratepayers get squeezed. Botetourt County, Virginia saw generation costs rise 45%, transmission 29%, and distribution 90% between 2020 and 2026. New Jersey absorbed a 20% electricity cost jump linked to data center load growth. These aren't projections. They already happened.
The counterargument from the industry side is worth taking seriously: Berkeley Lab's 2025 research found that states with the highest load growth actually saw real electricity prices fall, because large industrial customers spread fixed grid costs across more users. Mississippi's rates sit 16% below the national average, and Entergy Mississippi projects $2 billion in customer savings from data center revenues. Load growth, done right, can be a subsidy flowing toward consumers rather than away from them.
The operative phrase is "done right." Mississippi's model works because its cost allocation is structured. Virginia's and Indiana's failed because theirs wasn't. The difference is a tariff, not a technology.
Pennsylvania Showed You the Political Problem
Pennsylvania House Democrats advanced legislation in late March requiring data centers to fund infrastructure upgrades, curtail usage during grid strain, and pair demand with renewable generation. The Senate stalled it under industry pressure. That's the pattern: the policy solution exists, the political will doesn't survive contact with a well-funded lobby.
Google committed 1 GW of demand response capacity at Southern and Midwestern data centers, which is a meaningful concession. But demand response is a pressure valve, not a cost solution. It reduces strain during peak hours; it doesn't retroactively fund the transmission infrastructure that had to be built to serve the load in the first place.
The math here is simple enough to be uncomfortable. The private sector needs $6.7 trillion in data center investment to hit 2030 capacity targets. That capital will flow. The question regulators in 36 states need to answer before it arrives is whether the infrastructure bill follows the capital or lands on the family paying $180 a month for electricity in a county that didn't ask to become an AI hub.
State PUCs should approve large-load tariffs before new data center contracts are signed, not after the substations are already built. The Xcel model is the template. Every state with tariffs still sitting in the pending pile is running the same experiment Hendricks County ran. They already know how it ends.