AI May Bring Back Three Mile Island: Microsoft plans to reopen the 1979 disaster site. Proliferating data centers need even more energy.

https://www.wsj.com/opinion/artificial-intelligence-may-bring-back-three-mile-island-microsoft-data-center-energy-c125eef3

By Mark P. Mills

The news that Microsoft plans to fund the reopening of the undamaged reactor at Pennsylvania’s Three Mile Island nuclear plant spread almost as quickly as news about the nuclear accident at that same site in 1979. Microsoft’s decision was animated, as the Journal reported, by the “gargantuan amount of power needed for data centers for AI.” During the nuclear industry’s long winter following the 1979 accident, it would never have occurred to anyone that algorithms and not Congress would revive nuclear energy.

Three Mile Island’s city-scale power generator and massive concrete cooling towers now epitomize digital energy realities. But all the hullabaloo over the plant’s resurrection is about the kind of power that a single hyperscale data center uses. Hundreds more data centers are under construction or planned. Thousands more will be built. There aren’t enough nuclear reactors operating or planned to meet that kind of demand, even if the few retired ones are resurrected. How much electricity will digital innovations need?

The arrival of useful, affordable artificial intelligence is a wild card. It’s hard to predict how much AI will boost electric demand, since consumers and entrepreneurs are still sorting out where and how to use it.

Using electric vehicles for calibration: A single AI card—a kitchen-drawer-size piece of hardware containing eight AI chips—uses as much electricity each year as 10 EVs do. Nvidia alone, a leader in AI, is already on track to ship more cards a year than Tesla does cars. AI algorithms feed on stupendous quantities of data, which creates a greater need for conventional digital hardware to collect, manage and store bytes.
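
For calibration, here is a hedged back-of-envelope version of that comparison. Every number in the sketch is an assumed, illustrative figure rather than one taken from the article:

```python
# Hedged back-of-envelope for the "one AI card ~ 10 EVs" comparison.
# Every figure below is an assumption chosen for illustration; none comes
# from the article itself.

HOURS_PER_YEAR = 8_760

# Assumed: eight AI accelerators at roughly 700 W apiece, running
# near-continuously (chips only, excluding the rest of the server).
ai_card_kw = 8 * 0.7
ai_card_kwh_per_year = ai_card_kw * HOURS_PER_YEAR   # ~49,000 kWh

# Assumed: a typical EV driven ~12,000 miles a year at ~0.35 kWh per mile.
ev_kwh_per_year = 12_000 * 0.35                      # ~4,200 kWh

print(ai_card_kwh_per_year / ev_kwh_per_year)        # roughly 11-12
```

Under those assumptions the ratio comes out on the order of ten, consistent with the comparison above; different duty cycles or vehicle efficiencies would move it somewhat in either direction.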

Still, AI hardware is responsible, so far, for only 10% to 20% of data-center electricity demand. The real story is the prodigious expansion of existing cloud infrastructure.

Over a decade, every $1 billion of data centers will use about $700 million of electricity. For comparison, over that same period, every $1 billion of EVs sold will result in about $150 million of electricity consumption. Today, U.S. spending on new data centers is greater and growing faster than total spending on all EVs. Environmentalists worry that the electricity needs of data centers and AI threaten the energy transition. But never mind finding enough low-carbon energy sources: the question is whether enough power of any kind will be available.
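
Restated as a simple ratio (the dollar figures are the article's own; the calculation is only a sketch):

```python
# Restating the article's spending comparison as an energy-intensity ratio.
# The dollar figures are the article's; the arithmetic is only illustrative.

datacenter_capex = 1_000_000_000        # $1 billion of data centers
datacenter_electricity = 700_000_000    # ~$700 million of electricity over a decade

ev_capex = 1_000_000_000                # $1 billion of EVs sold
ev_electricity = 150_000_000            # ~$150 million of electricity over a decade

print(datacenter_electricity / datacenter_capex)   # 0.70
print(ev_electricity / ev_capex)                   # 0.15
# Per dollar of capital, data centers imply roughly 4-5x the electricity
# spending of EVs over the same ten-year span.
```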

Electricity forecasters today face a challenge similar to trying to guess future air traffic circa 1960, when the development of affordable jet-powered aircraft revolutionized commercial air travel. Global air traffic soared tenfold. There was also a massive expansion of hardware, infrastructure—and fuel use.

Even so, estimating future demand for such things as air traffic, food, homes or cars can be bounded by reasonably knowable constraints: the number of people on earth, levels of wealth and the number of hours people are willing to devote to doing or using something. Estimating future data traffic is uncharted territory for a simple reason: There is no limit to demand for and supply of data.

What isn’t different: Everything uses energy, and advances in efficiency lower costs and drive greater data usage. Just as the tenfold boom in aviation traffic led to a fivefold rise in energy use, today’s growth in digital traffic will be driven by efficiency gains and will outstrip them.

 
