A Return to the Powerhouse Tradition
How Onsite Power Generation at Data Centers Will Lead to a Once-in-a-Generation Energy Revolution
Before generative artificial intelligence, or GenAI, became the word of the decade, many energy observers and grid operators were already worried about rising electricity demand, especially from data center growth. Energy consumption in Western developed countries is expected to double by 2050 (and that is without the optimistic assumption that we would all be driving EVs by 2030). As we all face an uncertain future for uninterrupted energy supply, it may be useful to look back at how industries secured their supplies in the past.
The word “Powerhouse” today tends to refer to a strong or dominant leader, an unbeatable sports team or a gym. In history though, the powerhouse was part of an industrial facility where energy was harnessed onsite and transmitted for production purposes.
Onsite energy generation, from watermills to furnaces to steam generators, was the norm until the development of energy grids in the late 1800s. Even then, factories with high energy demands continued to produce their own electricity in their powerhouses, which eventually fed energy back onto the grid through later efficiencies such as cogeneration. From the 1940s through the 1960s, large hydroelectric power plants in the Northwest United States and in Norway were developed hand in hand with the growth of energy-intensive aluminum smelters (and it could be argued that the growth of Chinese aluminum production mirrors the development of the Three Gorges Dam project). With nuclear power and expanded coal generation providing a reliable baseload for the grid in the 1970s and 80s, the risk of unreliable energy access for industrial purposes faded, and the onsite powerhouses became museums.
Until today, that is.
A series of policy and planning catastrophes, inspired by a dogmatic environmentalist ideology, has threatened energy security in many industrialized countries. The premature decommissioning of existing nuclear power plants and the decarbonization and energy-transition strategies steering many advanced economies away from fossil fuels have put unnecessary strains on energy grids and on global energy prices. It has not helped that industries themselves have been bullied into adopting ESG strategies for sourcing and funding “green energy” options that further destabilize energy security.
The situation today is dire (even without the recent geopolitical stresses on energy supplies). Companies like BASF and Yara are no longer manufacturing energy-intensive products like ammonia in Europe, while many on the political left try to turn this crisis of their own making into an opportunity to deindustrialize and transition into a degrowth economy. They seem to romanticize the era before the powerhouse.
So what are the new, emerging energy-intensive industries (i.e., GenAI technology) doing to ensure a secure energy supply for their servers and data centers? It seems they are reverting to the 19th-century powerhouse model.
The New Powerhouse
With activist environmentalists meddling in the energy mix and a lack of political leadership or common sense to resist their simplistic alternatives, many advanced Western economies are facing energy security issues just as their economies transition to more electrified sources and rely far more on energy-intensive data centers. With burgeoning GenAI demand on the horizon, it falls on the technology industry to find a solution that ensures certainty of abundant energy supplies.
Many might assume that, given the tech generation’s rather granola, left-liberal tradition, they would all be building data centers powered by green renewable energy (mainly wind and solar). But these billionaires did not make their money chasing rainbows and butterflies, and while they preach from the green gospel when the cameras are on them, they are not idiots. They have done their research and concluded that renewables are not a viable option for providing sufficient, reliable energy to power their servers (no matter how much the environmental activists the tech industry funds through its foundations might wish and pray). Renewables like wind and solar are also not very environmentally friendly once production costs, mineral resources and battery demands are factored into the lifecycle assessments.
The technology experts have chosen other routes to provide inexpensive, reliable, green energy solutions (something I wish Western leaders, especially in the European Union, had considered). This is a crossroads moment for global economies, and the energy decisions now being weighed by the main industrial energy consumers for the coming decade should become the benchmark for all. If not renewables, then how will the tech industry power its GenAI and data center expansion?
Two options have been put on the table for green, reliable energy expansion that could double our energy output without harming the environment. Both, mind you, are options that most environmental activist groups have been campaigning against for decades (…go figure).
The Nuclear Option
Microsoft surprised many when it announced a 20-year deal to recommission a closed reactor at the Three Mile Island nuclear facility to produce abundant, cheap and clean energy for its data centers. Several other decommissioned nuclear facilities could be brought back online (because, unlike the Germans, Americans did not vengefully destroy their nuclear sites at the first chance they had).
There is also a new generation of small modular reactors (SMRs) that can be sited next to energy-intensive industrial facilities. Amazon has invested $500 million to explore this alternative at three sites to see whether SMRs are viable. Developing this technology for onsite data center powerhouses would create a catalyst for an innovation leap that could also provide cities with cheap, clean energy from small nuclear generators.
Gas-fired Generators with Carbon Capture and Storage
Chevron and ExxonMobil recently announced their plans for the future of data center powerhouses: a new generation of integrated natural gas generators with built-in CCUS (carbon capture, utilization and storage) capacity. With investment from the tech sector, the oil giants will be able to accelerate the development of carbon capture technology. Midstream companies are already using their pipelines to transport both natural gas and captured CO2 (if only the environmental activists would stop trying to block their development).
At first glance, the new gas-powered clean energy systems seem more attractive (given the amount of energy required and the time needed for project realization) than the next-generation small modular nuclear facilities. (Wind and solar are cute options, but they were never seriously considered for anything more than floor lighting.)
We are at that moment in time when both powerhouse alternatives show the promise of innovative research and technology to solve important problems. As with steam and hydroelectricity, the opportunity created by developing a new generation of abundant energy, and the economic benefits that will ensue, will be enormous. This is a seminal moment in energy development.
That is … unless the narrow-minded green activists use their influence over policy-makers, media and foundations to relentlessly impose their dogmatic renewable ideology, pushing our economies back to pre-industrial times.
This article was written and stored in the cloud, using servers and data centers powered mostly, at the moment, by coal-based energy. We are at a once-in-a-generation moment where future Firebreak articles will rely on other energy sources. One thing is certain: it won’t be renewables (unless our leadership is irretrievably stupid).