The Hidden Cost of AI: Civic Infrastructure
The future of AI won’t be decided in labs or boardrooms. It’s being decided in utility commission hearings.
This is a column about technology. See my full ethics disclosure here.
Silicon Valley doesn’t like to admit this, but chip makers and research labs are no longer deciding the future of AI. That decision is being made in utility commission hearings.
Meta just cleared a major hurdle for its new $10 billion data center in Richland Parish: Entergy has won approval from Louisiana regulators to build the power infrastructure for what will be one of the largest data centers in the world. That’s the new bottleneck. Not silicon. Not venture funding. It’s megawatts and transmission capacity.
Access to electricity is now the differentiator. The computing arms race used to be about faster GPUs (graphics processing units) and who could buy the most of them. Now the constraints are industrial: power and water are the new limiting factors. ERCOT, the Texas grid operator, is already warning that power demand could double by 2030. Georgia regulators admit that data centers are consuming 80% of the new supply.
This isn’t confined to a few areas, either. According to Goldman Sachs:
“The AI revolution has triggered an unprecedented power demand surge. AI datacenters could drive the equivalent of 75 million American homes (100 GW) of incremental power demand around the world by 2030.”
In the first internet age, leverage belonged to whoever could build computing power; in the Web 4.0 era, it belongs to whoever can secure megawatts of electricity. And just like oil drilling, this energy rush has clear winners and losers. The winners are easy to spot: tech giants locking in 15-year supply deals at favorable rates, utilities expanding their portfolios under the banner of economic development, and politicians eager to cut ribbons on billion-dollar campuses. The losers are just as clear: households and small businesses that inherit the transmission costs and overruns once the contracts expire, and communities left with strained water systems, heavier emissions, and only a handful of long-term jobs to show for it.
We’ve seen this before. Oil booms built towns overnight, then left them hollowed out when the wells dried up. Company towns touted stability, but the company always won while communities carried the risk. Today’s hyperscale data centers aren’t pumping crude or smelting steel. Still, the dynamics appear to be the same: extraction disguised as progress, with profits freighted out while costs are absorbed locally. The only difference is the resource. Instead of barrels and tons, the metric is gigawatts, and the stakes reach beyond the town border, straight into the grid that keeps everyone’s lights on.
Is there a way out? A few utilities and developers are testing different approaches. Some are pairing new campuses with on-site renewables and battery storage so that data center demand doesn’t land on ratepayers’ bills.
Others are re-engineering cooling with closed-loop systems that recycle water instead of draining local supplies. A handful are even experimenting with “flexible compute,” shifting workloads to the hours when the grid has surplus wind or solar power.
These aren’t silver bullets, but they’re real levers. The problem is they’re voluntary, and in the U.S., “voluntary” usually means they show up in press releases more often than they show up in practice.
The uncomfortable truth is this: AI doesn’t just need faster computers; it’s consuming civic infrastructure at a discounted rate, leaving local communities at risk.
This is Part 1 of 4. Read Part 2 here.