
Meta Acquires 6.6 Gigawatts of Nuclear Energy for Artificial Intelligence

Meta secures long-term nuclear supply agreements with Vistra, TerraPower, and Oklo to support its Prometheus supercluster and forthcoming data centres.


Justin Sullivan/Getty Images

Meta has announced a series of nuclear energy supply and development agreements aimed at securing up to 6.6 GW of reliable electricity for its AI data centre expansion through 2035, with deliveries directed to the PJM grid region, which serves large parts of the US Mid-Atlantic and Midwest. The company frames the initiative as a response to the power constraints facing hyperscale computing; for context, independent reporting notes that this scale is enough to supply millions of households.



Rather than relying on a single project, the package combines three strategies: long-term offtake from existing reactors, funding for uprates that raise the output of operating plants, and support for advanced reactors still moving from development to licensing. The mix matters because it spreads delivery risk: uprates and life-extension support can come online sooner than greenfield builds, but new plants are the only way to add multi-gigawatt capacity at the pace AI operators are now projecting.



Meta nuclear energy: existing plants, life extension, and uprates


For near-term capacity, Meta has agreements covering three operating nuclear plants: Perry and Davis-Besse in Ohio, and Beaver Valley in Pennsylvania. The company says it will purchase more than 2.1 GW from the two Ohio plants over 20 years and will help fund uprates that raise their output. Meta puts the expected uprates across the three sites at 433 MW, with the additional capacity likely to arrive in the early 2030s.



In essence, this is a strategy to keep the existing fleet running while gradually growing it. Uprates lack the glamour of new reactors, but they are one of the few ways to add firm capacity without the delays of permitting, financing, building, and commissioning a new site. For data centre operators, the appeal is getting more round-the-clock generation onto the grid sooner, which can reduce reliance on fossil-fuel-heavy marginal supply during peak demand.



Meta nuclear energy: new-build investments into the 2030s


The second strand is new capacity from advanced reactor programmes. Meta says it is backing the construction of two TerraPower Natrium units delivering up to 690 MW of firm power, expected as early as 2032, and has secured rights to power from up to six further Natrium units totalling 2.1 GW, targeted for 2035. It describes the broader Natrium pipeline as eight prospective units combining 2.8 GW of baseload output with 1.2 GW of energy storage capacity.
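The Natrium figures quoted above line up with per-unit arithmetic. As a rough consistency check, the sketch below assumes TerraPower's publicly stated design values of roughly 345 MW of baseload output per unit and about 500 MW at peak when the molten-salt storage system discharges; these per-unit numbers are an assumption drawn from the vendor's published specification, not from Meta's announcement.

```python
# Back-of-the-envelope check of the Natrium capacity figures cited in this article.
# Per-unit values are assumed from TerraPower's published design (~345 MW baseload,
# ~500 MW peak with thermal storage); they are not part of Meta's announcement.

BASELOAD_MW_PER_UNIT = 345   # assumed baseload output per Natrium unit
PEAK_MW_PER_UNIT = 500       # assumed peak output when storage discharges

def natrium_totals(units: int) -> tuple[float, float]:
    """Return (baseload GW, storage-boost GW) for a given number of units."""
    baseload_gw = units * BASELOAD_MW_PER_UNIT / 1000
    storage_gw = units * (PEAK_MW_PER_UNIT - BASELOAD_MW_PER_UNIT) / 1000
    return baseload_gw, storage_gw

print(natrium_totals(2))   # ~0.69 GW baseload -> the "up to 690 MW" first tranche
print(natrium_totals(6))   # ~2.07 GW baseload -> the "2.1 GW" follow-on option
print(natrium_totals(8))   # ~2.76 GW baseload, ~1.24 GW storage -> "2.8 GW + 1.2 GW"
```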



Meta has also announced a proposed nuclear campus in Pike County, Ohio, that could deliver up to 1.2 GW of clean baseload power to PJM via multiple Oklo Aurora Powerhouse reactors, with the site potentially operational by 2030. Together, the two tracks (Natrium and the Ohio campus) are intended to create a second wave of firm supply timed to the steepest phase of the AI compute build-out in the early to mid-2030s.
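For readers tallying the headline number, an illustrative sum of the tranches named in this article is sketched below. The Beaver Valley offtake is not quantified here and all figures are "up to" values, so the named components land slightly under the 6.6 GW headline rather than matching it exactly.

```python
# Illustrative tally of the tranches named in this article against the 6.6 GW headline.
# The Beaver Valley offtake is not quantified in the piece, so the listed components
# do not sum exactly to 6.6 GW; all values are "up to" figures from Meta's announcement.

tranches_gw = {
    "Ohio offtake (Perry + Davis-Besse, 20 years)": 2.1,
    "Uprates across the three existing plants": 0.433,
    "Natrium pipeline baseload (up to 8 units)": 2.8,
    "Oklo Aurora campus, Pike County, Ohio": 1.2,
}

for name, gw in tranches_gw.items():
    print(f"{name}: {gw:.3f} GW")

total = sum(tranches_gw.values())
print(f"Named tranches total: {total:.2f} GW (headline figure: up to 6.6 GW)")
```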



The implications for AI data centres and the PJM grid


Power procurement is becoming a primary design constraint for AI infrastructure. Interconnection queues are long, transmission upgrades are slow, and even with land and capital in place, the bottleneck is often firm generation that can run around the clock. Nuclear is one of the few low-carbon options that meets that "always-on" requirement natively, which explains why hyperscalers are increasingly treating generation agreements as part of their platform strategy rather than as a sustainability add-on. We have previously examined how the broader shift in the AI power curve is pushing hyperscalers toward extreme measures.



There is also a tension in the messaging. Meta says it will pay the full cost of its data centres' energy use and argues that adding capacity improves reliability and price stability. Critics tend to ask whether such deals add enough new generation to match load growth, and whether the benefits spread across the grid or accrue mainly to the largest new customers. That tension will play out in how contracts are structured, how uprates are approved, and how transmission is expanded to deliver power to new compute campuses.



Timelines and risks to watch


The proposed timeline is ambitious. Uprates and life-extension support are the "safer" end of the portfolio, but they still depend on outage windows, regulatory approvals, and supply-chain performance. The new-build side carries bigger uncertainties: first-of-a-kind deployments are routinely delayed, and financing conditions can shift markedly between announcement and steel in the ground.



Still, the direction is clear: AI's appetite for energy is forcing the industry to think in gigawatts and decade-long horizons. Meta's 6.6 GW figure is less about any single facility and more about assembling a diversified portfolio of firm power that can scale with AI demand without depending on one technology, one site, or one timeline.


