A new analysis from BloombergNEF forecasts that power demand from U.S. data centers will soar to 106 gigawatts by 2035 — nearly three times the current 40 gigawatts. That growth is fueled in part by the rapid expansion of AI-driven infrastructure, which relies on increasingly large and power-hungry facilities. Much of the buildout is expected in rural or suburban areas as urban sites become scarce and average facility size climbs well beyond today’s norms.
Sources: Latitude Media, TechCrunch
Key Takeaways
– U.S. data-center electricity demand could reach 106 GW by 2035, nearly triple the roughly 40 GW drawn today.
– The surge is driven by new, larger-scale data centers — many tied to AI workloads — increasingly being built in rural/suburban areas.
– Even with gains in efficiency, the growth of data centers will meaningfully raise national power consumption and strain existing energy infrastructure.
In-Depth
The future energy needs of data centers across the U.S. are poised for an unprecedented surge. According to a fresh report from BloombergNEF — cited in recent media coverage — data centers could draw as much as 106 gigawatts of power by 2035, a dramatic jump from the roughly 40 gigawatts consumed today. That translates to nearly triple the current load, a stark indicator of how rapidly demand for digital infrastructure is rising, especially amid the AI boom.
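As a quick sanity check on those figures: a short back-of-the-envelope calculation shows how the article's "nearly triple" framing follows from the 40 GW and 106 GW numbers, and what annual growth rate that implies. The ten-year horizon (roughly 2025 to 2035) is an assumption for the rate calculation, not a figure from the report.

```python
# Back-of-the-envelope check of the BloombergNEF figures cited above.
# 40 GW today and 106 GW by 2035 come from the article; the 10-year
# horizon is an assumption used only to derive an implied growth rate.
current_gw = 40.0
forecast_gw = 106.0
years = 10

multiple = forecast_gw / current_gw     # ~2.65x, i.e. "nearly triple"
cagr = multiple ** (1 / years) - 1      # implied compound annual growth rate

print(f"growth multiple: {multiple:.2f}x")
print(f"implied annual growth over {years} years: {cagr:.1%}")
```

On these assumptions, the forecast works out to roughly 10% compound annual growth in data-center power demand, which is why the report frames the buildout as a grid-scale challenge rather than incremental expansion.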
What’s driving this massive uptick? For one, data center projects announced in just the past year have more than doubled compared with earlier estimates, particularly for large-scale, energy-intensive facilities designed for AI workloads. Many of these new centers are planned for rural or suburban areas — locations where land and power-grid access remain more available than in crowded urban zones. That marks a shift from the past, as operators increasingly build “hyperscale” facilities optimized for AI compute and storage, with power demands well above conventional data centers.
Compounding the problem, data centers remain inherently power-hungry even as efficiency improves. It’s not only the servers themselves but also the cooling, networking, and supporting infrastructure that drive up consumption. As a result, the aggregate electricity draw of all U.S. data centers is projected to climb sharply — challenging the national grid and raising the stakes for power suppliers, regulators, and state governments.
The implications reach far beyond tech. Higher demand means utilities will need to ramp up generation capacity — whether from traditional sources like natural gas or by expanding renewables — to avoid blackouts, price shocks, or energy shortages. For taxpayers and ratepayers, it could translate into higher bills; for regulators, a tougher balancing act between promoting innovation and ensuring grid stability.
At the same time, this expansion could further entrench digital infrastructure’s footprint on energy consumption. As AI becomes embedded in more industries — healthcare, logistics, finance — and as cloud computing adoption deepens, data centers will likely occupy an ever-growing share of U.S. electricity demand. Without careful planning and sizable investments in grid capacity, storage, and clean generation, the rush to scale up data centers could create serious bottlenecks — presenting one of the largest infrastructure challenges the country has faced in decades.

