The United States spent years with practically stagnant electricity consumption. Then artificial intelligence came into play and changed the picture. The Department of Energy estimates that data centers, which already consumed about 4.4% of the country's electricity in 2023, could reach between 6.7% and 12% by 2028, equivalent to 325 to 580 TWh. That is a lot in a very short time. The problem is that you cannot spin up a power transmission line the way you update an application.
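The shares and TWh figures above imply a total-consumption baseline that the article does not state. A quick back-of-envelope check recovers it (the absolute totals here are derived from the cited numbers, not sourced independently):

```python
# Sanity check on the DOE figures cited above: each (share, TWh) pair
# implies a total US consumption baseline. The baseline is derived,
# not stated in the article.
low_share, high_share = 0.067, 0.12   # 6.7% and 12% by 2028
low_twh, high_twh = 325, 580          # corresponding TWh range

implied_base_low = low_twh / low_share     # ~4,851 TWh total
implied_base_high = high_twh / high_share  # ~4,833 TWh total
print(round(implied_base_low), round(implied_base_high))
```

Both ends of the range point to a consistent total of roughly 4,800–4,900 TWh, which is what makes the 6.7%–12% and 325–580 TWh figures line up.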
At PJM, the operator of the largest grid in the country, stretching from the Mid-Atlantic to the Midwest, load projections have jumped. Its 2025 report projects average growth of 3.8% per year in the winter peak over the next decade, a very rare pace by US standards, driven by giant new loads. No wonder PJM itself opened a fast-track process to create specific interconnection rules for mega-loads, data centers in particular. It is the bureaucracy trying to catch up with the cloud.
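To see what 3.8% per year really means, it helps to compound it over the decade the report covers (illustrative arithmetic only; no actual peak-load base is assumed):

```python
# What an average 3.8% annual growth rate compounds to over ten years.
# Purely illustrative: the multiplier, not any absolute MW figure.
annual_growth = 1.038
decade_factor = annual_growth ** 10  # ~1.45
print(round(decade_factor, 3))
```

In other words, a winter peak roughly 45% higher in ten years, on a grid accustomed to near-zero growth.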
The economics have also changed. For years, Big Tech bought certificates attesting that a certain amount of electricity had been generated from renewable sources (renewable energy certificates, RECs), or signed long-term virtual power purchase agreements, financial contracts with no physical delivery; the commitment served to lock in an energy price and ensure financial predictability.
Now the discussion is about physical backing: long-term physical energy contracts in which the buyer guarantees the purchase of a given amount of energy or the output of a renewable park. Microsoft, for example, signed a 20-year power purchase agreement with Constellation, a low-carbon energy producer, to enable the restart of the Three Mile Island nuclear plant. Meta signed another 20-year contract with the same company for the Clinton nuclear complex in Illinois. These arrangements are not just sustainability marketing; they are the way to secure 24/7 gigawatts for operations that cannot blink.
Not everything, however, is plug and play. The Amazon–Talen case in Pennsylvania, where a data center was built right next to the Susquehanna nuclear plant, became a regulatory saga. FERC twice rejected the interconnection agreement that sought to expand direct "behind-the-meter" supply, over concerns about costs and impacts on the shared grid. The moral of the story is simple: being close to a plant helps, but it does not override the rules about who pays for the wires. That message has already been heard in other states.
And who pays, after all? Some states have begun to define specific hyperscale rates and tariff classes. In Virginia, Dominion proposed a new tariff category for very large data centers and, in parallel, won permission to build a transmission line serving a single hyperscaler in Alexandria. The decision drew protests from nearby neighborhoods and revealed the obvious: the "cloud" runs at 230 kV and passes through someone's backyard.
Real and Phantom Demand
On the planners' side, there is another tough knot: what is real demand and what is "phantom demand"? With the AI race, developers file multiple interconnection requests at the same time, often for the same project. The result is an artificial inflation of the numbers, which can lead to overbuilt and then underused grids, with the bill landing on consumers. The Wall Street Journal told this story of the data centers that do not even exist yet and already haunt the grid.
In the South, Georgia Power redrew its resource plan to keep coal online longer, invest in batteries and additional gas, and expand solar, all with an eye on data centers. It is a good synthesis of the moment: the energy transition continues, but the sequence of the pieces has changed because of AI.
And the new generation of nuclear technologies? SMRs, small modular reactors, which promise lower upfront investment and greater safety, are on the radar, but HALEU fuel, for which Russia is the only supplier at scale, is still the bottleneck. Centrus, the American nuclear fuel supplier, reached the 900 kg production mark in June, a historic milestone in the US. But the US Department of Energy (DOE) projects a need for 50 tons per year by 2035. In other words: promising, but not on the data centers' 2026–2028 timeline. Until then, the nuclear option "on the shelf" is extending the useful life and uprating the licensed power of existing reactors without building a new plant.
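The scale of the HALEU gap is easy to miss when the figures are given in kilograms and tons. A simple ratio, using only the numbers cited above, makes it concrete:

```python
# The gap between current HALEU output and projected demand,
# using only the figures cited in the text (a simple ratio,
# not a supply-chain model).
produced_kg = 900                # Centrus milestone reached in June
needed_kg_per_year = 50 * 1000   # DOE projection for 2035: 50 tons/yr
share = produced_kg / needed_kg_per_year
print(share)  # 0.018, i.e. 1.8% of one year's projected need
```

The cumulative milestone covers less than 2% of a single projected year of demand, which is why SMRs remain off the table for this decade's data center buildout.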
But there are realistic ways out of this situation. The first is speed versus governance. The American grid was designed to grow slowly, but AI brought industrial rhythm to the service sector. The institutional response, from FERC, the US federal agency that regulates the electricity sector, from grid operators, and from state commissions, is to try to align construction timelines with the cloud's investment cycles. The regional operator PJM is creating specific tracks for large loads. And that is precisely the point: admitting that the data center is a regulatory animal of its own.
The second is additionality, not just "green" energy. PPAs that keep real plants from closing, such as Three Mile Island, or that finance license extensions, as in the Clinton case, have a much larger systemic impact than generic certificates. It is an important shift: corporate decarbonization starts to shore up the system's reliability, not just the annual clean-MWh balance.
The third is who pays for the wires. Special tariffs and investment obligations for large new consumers need to stop being theory and become practice. What is required: stricter interconnection contracts, curtailment of load at peak hours and, where it makes sense, direct contributions to lines and substations. Without this, spreading costs indiscriminately breeds political backlash, which has already begun at public hearings.
The fourth point is efficiency and flexibility. Not every watt of AI is the same. Model training can be scheduled; inference, much less so. The companies that learn to shift training into low-demand windows, or to off-peak hours, will reduce grid capex and gain regulatory bargaining power. This is the less glamorous B-side of AI: orchestration software and contracts with electrical SLAs, which guarantee that the delivered service meets minimum quality and performance standards.
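The scheduling idea above can be sketched in a few lines. This is a hypothetical toy, not any operator's actual software: given a day of hourly grid prices (invented numbers), it places deferrable training hours in the cheapest windows, leaving peak hours to inference.

```python
# Hypothetical sketch of load-shifting for deferrable AI training.
# Prices and the function name are illustrative, not a real system.
def schedule_training(hourly_price, hours_needed):
    """Pick the cheapest hours of the day for flexible training load."""
    ranked = sorted(range(len(hourly_price)), key=lambda h: hourly_price[h])
    return sorted(ranked[:hours_needed])

# Invented $/MWh profile: overnight trough, evening peak at hours 17-19.
prices = [30, 28, 25, 24, 26, 35, 60, 90, 95, 80, 70, 65,
          60, 55, 50, 55, 70, 95, 110, 100, 85, 60, 45, 35]
print(schedule_training(prices, 6))  # training lands in the overnight trough
```

Even this greedy version captures the economics the article describes: the training fleet consumes when the grid is slack, which is exactly the flexibility that becomes regulatory bargaining power.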
In the short term, the picture will inevitably be mixed: a little more gas to hold the peak, old nuclear plants monetizing their firm capacity via Big Tech, solar and batteries scaling up, and a lot of transmission construction. The deeper view, however, is one of reinvention. The American grid has always been a great public work, implicitly funded by the entire consumer base. AI is forcing an experiment in private responsibility for physical backing, and that can be healthy if well regulated.
The risk, of course, is the opposite: overbuilding for phantom demand and socializing too much of the cost. The warning signs already appear in the DOE projections, which speak of up to 12% of electricity going to data centers by 2028, and in the seasonal assessments of NERC, the entity responsible for ensuring the reliability and security of the North American grid, which have been flagging rising risk from higher peaks and aging plants. The challenge is to find the middle ground between underestimating the cloud and building for mirages. In the end, the "AI electricity bill" arrives in the same mailbox as ours.