What the last gas boom (and bust) says about today’s rush to build

Twenty-five years ago, a data center boom helped fuel a race to build gas-fired power plants, with the energy secretary, utilities, politicians and experts warning of blackouts and economic stagnation if the country didn’t meet surging demand for electricity.

By 2001, however, the dot-com bubble had burst, the economy was in recession and the huge demand increase never materialized. Efficiency and productivity improved rapidly, and demand remained more or less level for the next two decades, leaving many utilities with excess capacity and ratepayers footing the bill.

Some analysts and industry sources see parallels between then and now. Once again, headlines are warning of imminent energy shortfalls due largely to the power needs of artificial intelligence. Leading figures in government and industry are promoting more firm generation, and particularly gas, as a matter of economic and national security. 

“Can the same thing happen? Definitely,” said Eugene Kim, Wood Mackenzie’s Americas Gas Research team director. “The utilities and anyone planning for power demand is forecasting unprecedented and, in some cases, even exponential growth. Whether that materializes or not – huge degree of uncertainty.”

Gas investment reaches new heights

Investment firms, utilities, tech giants, energy companies and others are pouring billions into acquiring existing gas plants or developing new ones to serve data centers. Gas power merger and acquisition valuations have doubled since 2024, reaching up to $1.93 million/MW in some markets, according to energy analytics firm Enverus. 

While today's boom carries echoes of the turn of the millennium, there are a few important differences. 

The first is that the U.S. is producing and consuming more gas than ever before — driven largely by the rise of fracking — with production concentrated in Texas and Louisiana to the south, and Pennsylvania and West Virginia farther north. 

Second, as the U.S. became a gas-producing powerhouse, the gas and electric power sectors grew much more interdependent. In 2000, the electric power sector accounted for about 22% of U.S. gas consumption, while gas accounted for around 16% of electricity produced, according to the U.S. Energy Information Administration.

By 2023, the electric power sector accounted for about 40% of total U.S. natural gas consumption, and gas accounted for about 42% to 43% of utility-scale electricity generation, making it the single largest fuel source, the EIA says. Over the years, the role of gas has grown, largely displacing coal as the latter became uneconomical.

[Photo: DataBank’s IAD4 data center under construction in Ashburn, Virginia. Diana DiGangi/Utility Dive]

The third key difference is the rise of renewables and storage on a large scale. While gas is the dominant source of fuel for U.S. power generation, it makes up a small fraction of new generation coming online this year, and energy storage is taking a small but growing share of daily peak demand when the sun goes down. Utility-scale wind and solar account for 83% of FERC’s “high probability” additions through July 2028, while gas makes up about 16%. 

There is some evidence that could change. 

The outlook for renewables has dimmed since President Donald Trump took office this year and largely made good on his promises to roll back tax credits, permitting and other government support for wind and solar while throwing his weight behind fossil fuels. 

Scott Wilmot, an energy analyst at Enverus, said before the passage of the One Big Beautiful Bill Act in July, he would have said the quality and size of the renewable project pipeline was more than enough to meet projected energy demands. Now, he’s not so sure. 

“The reality is the [levelized cost of electricity] for these renewable projects has gone up without those tax credits,” he said. “A lot of folks who are developing these projects are probably reconsidering a lot of their pipeline.”

PJM Interconnection, the nation’s largest grid operator, is fast-tracking 11.8 GW of generation, mostly from gas. Gas also dominated MISO‘s fast-track interconnection review, accounting for 19 GW out of 26 GW. In ERCOT, gas went from 6.8% of the interconnection queue in August 2024 to 9.1% as of August 2025. 

At the same time, major gas pipeline expansions are planned or proposed.

Amy Andryszak, president and CEO of the Interstate Natural Gas Association of America, said existing pipelines are currently running at capacity, and her organization’s members are reporting a “record number” of inquiries for new pipelines from potential customers. 

“They indicate they have not seen this level of interest in building since around 2010 when the fracking boom drove interest in pipeline development,” Andryszak said in an emailed statement. “Many members are already announcing new or revived pipeline projects, and we expect many more will file for certificates at [the Federal Energy Regulatory Commission] over the next 18 months.”

FERC is expected to issue guidance soon on colocation rules, which could potentially make it easier to route gas directly to data centers and other large loads.

The largest gas-fired power plant in the country, if built, would be Knighthead Capital Management’s 4.5-GW project at the former Homer City coal plant in Pennsylvania, part of a 3,200-acre data center campus. 

[Photo: A Google data center in Ashburn, Virginia, located down the street from the Potomac Energy Center, a 774-MW natural gas plant. Diana DiGangi/Utility Dive]

Some developers say even without data centers, electricity demand is rising and gas, which is firm, dispatchable and proven, will be an important part of the energy mix for decades to come, especially as more coal generation retires.

Bilal Khan is an executive at Blackstone who oversaw the firm’s recent acquisition of the 620-MW natural gas power plant Hill Top Energy Center in Pennsylvania for about $1,600/kW. The state is part of PJM, which is already setting record capacity auction prices, making it an attractive market for producers. The grid operator has said that is due to supply and demand conditions, but its independent market monitor says data centers are the “primary reason” for the price surge.

Khan said that even putting aside data centers, he’s bullish on natural gas thanks to the other factors driving increased power demand, including manufacturing and electrification. 

“We still need new power supply that’s reliable, consistent and affordable,” Khan said. “This moment is unique and in contrast to the last approximately 20 years when power demand was flat.”

A risky investment

Khan acknowledged, however, that new natural gas plants face significant obstacles. The backlog for new turbines, for example, can stretch up to seven years amid global competition for those and other components like transformers.

“There’s an equipment shortage, there’s a labor shortage and it’s more challenging to obtain all necessary permits,” Khan said.

And the cost of deploying new gas generation is rising fast.

According to Enverus, capital costs for new natural gas power plants now average $2,200/kW to $3,000/kW. An analysis by GridLab adds another $1,500/kW levelized for the 20-year cost of fuel and $500/kW for pipeline infrastructure, putting the cost of new gas generation at $4,000/kW to $4,500/kW.
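
The arithmetic behind that all-in figure is a simple stacking of components, sketched below in Python. The inputs are the figures reported above; summing them directly gives $4,200/kW to $5,000/kW, so the quoted $4,000/kW to $4,500/kW range evidently pairs the adders with somewhat different capital assumptions. The totals here are illustrative only.

```python
# Stack the reported cost components for new gas generation ($/kW).
# Component values are as cited above (Enverus capex range; GridLab
# fuel and pipeline adders); the pairing is illustrative, not GridLab's.

capex_range = (2_200, 3_000)   # $/kW, new-plant capital cost (Enverus)
fuel_levelized = 1_500         # $/kW, 20-year levelized fuel cost (GridLab)
pipeline = 500                 # $/kW, pipeline infrastructure (GridLab)

for label, capex in zip(("low end", "high end"), capex_range):
    total = capex + fuel_levelized + pipeline
    print(f"{label}: ${total:,}/kW all-in")
# low end: $4,200/kW all-in
# high end: $5,000/kW all-in
```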

Kevin Smith, CEO of Arevon Energy, spent years working on natural gas projects alongside nuclear and renewables before transitioning to focus on renewables exclusively. He suggested there was a ceiling to how much gas could be built to meet near-term demand given the prices, pipeline constraints and interconnection queues. 

“Tens of gigawatts of new generation from natural gas is likely at least five years away, if not more,” Smith said. 

Changing market dynamics also mean new gas-powered generators face tighter conditions under which they would be economical to run, according to a recent report on thermal generation from Ascend Analytics.

Their competitiveness depends on the type of gas generation technology, which varies by efficiency, availability, ramp rate, cold-start cost and other factors. But overall, Ascend said, the value of gas assets will become increasingly concentrated in “narrow, infrequent windows of time,” increasing the importance of availability, secure fuel supply and strategically scheduled preventive maintenance.
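
One way to see what “narrow, infrequent windows” means in practice is a simple dispatch screen: a plant is economical in a given hour only when the power price clears its short-run marginal cost, roughly heat rate times gas price plus variable O&M. The Python sketch below uses hypothetical prices, heat rates and fuel costs chosen for illustration, not figures from the Ascend report.

```python
# Hypothetical dispatch screen: count the hours of a toy price day in
# which each gas technology's short-run marginal cost (SRMC) is in the
# money. All inputs are illustrative placeholders.

GAS_PRICE = 3.50  # $/MMBtu, assumed delivered fuel price
VOM = 2.00        # $/MWh, assumed variable O&M

# Indicative heat rates (MMBtu/MWh) by technology class (assumed).
HEAT_RATES = {
    "combined cycle (NGCC)": 6.5,
    "combustion turbine (NGCT)": 10.0,
    "aeroderivative CT": 9.0,
    "reciprocating engine (RICE)": 8.5,
}

# A toy day of hourly power prices ($/MWh) with a short evening spike,
# mimicking the narrow windows the report describes.
prices = [25.0] * 16 + [45.0, 80.0, 140.0, 90.0] + [30.0] * 4

for tech, heat_rate in HEAT_RATES.items():
    srmc = heat_rate * GAS_PRICE + VOM
    run_hours = [p for p in prices if p > srmc]
    margin = sum(p - srmc for p in run_hours)
    print(f"{tech}: SRMC ${srmc:.2f}/MWh, in the money "
          f"{len(run_hours)}/24 hours, gross margin ${margin:.0f}/MW-day")
```

Under these assumptions, the peaking technologies earn nearly all of their margin in a four-hour evening window, the concentration-of-value dynamic Ascend describes, while the efficient combined cycle clears in many more hours but at thin margins.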

[Chart: Typical cost and flexibility characteristics of natural gas combined cycle, natural gas combustion turbine, aeroderivative NGCT and reciprocating internal combustion engine plants. From Ascend Analytics’ report on thermal generation; permission granted by Ascend Analytics]

“In the face of rising electricity demand and declining capacity accreditation for renewables and storage, the need for dispatchable, long-duration capacity resources will persist. To meet this need, new gas capacity will be built,” it concluded. “However, investments in new gas should be made cautiously, prudently, and strategically. Thermal generation will remain a risky investment, with stranded asset risks not going away.”

Have utilities learned their lesson?

Against a backdrop of rising residential power bills, utilities and regulators say they are taking precautions to protect ratepayers from the possibility of a bubble.

NRG Energy’s vice president of regulatory affairs, Travis Kavulla, said past experience has informed the industry’s current approach, in which competitive generators carry the risk of load forecasts, at least in deregulated markets.

“All of that has ramifications for the extent to which people are comfortable rapidly building out generation,” he said during a recent panel discussion hosted by the Heritage Foundation. “That’s why it’s so important for long-term offtake agreements to be formalized in this industry.”

Some regulators and utilities are moving to place more of the risk on hyperscalers by creating new rate classes for large loads. At least 30 states have proposed or approved large-load tariffs over the past several years to manage growth and protect existing ratepayers, according to a database from the North Carolina Clean Energy Technology Center and the Smart Electric Power Alliance.

The Public Utilities Commission of Ohio recently approved AEP Ohio’s plan to make data centers pay for at least 85% of the energy they request, even if they use less, to cover the cost of infrastructure. It also requires data centers to show they are financially viable and to pay an exit fee if their project is canceled or they can’t meet obligations.
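
For illustration, a minimum-take provision of this kind reduces to a one-line billing rule: charge the greater of actual usage and the contractual floor. The Python sketch below assumes a flat 85% floor on contracted energy, per the plan described above; the quantities and the simplification to a single billed figure are hypothetical, and the actual tariff is far more detailed.

```python
# Minimal sketch of a minimum-take billing rule like the one approved
# for AEP Ohio: a data center pays for at least 85% of the energy it
# requests, even if it consumes less. Quantities are hypothetical.

MIN_TAKE = 0.85  # contractual floor as a share of requested energy

def billed_energy_mwh(contracted_mwh: float, actual_mwh: float) -> float:
    """Bill the greater of actual usage and the 85% floor."""
    return max(actual_mwh, MIN_TAKE * contracted_mwh)

# A customer that contracts for 100,000 MWh but uses only 60,000 MWh
# is still billed for 85,000 MWh; above the floor, billing tracks usage.
print(billed_energy_mwh(100_000, 60_000))  # 85000.0
print(billed_energy_mwh(100_000, 95_000))  # 95000.0
```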

Since then, the utility said its data center pipeline has shrunk by half — a positive development, said Wilmot from Enverus. 

“We just can’t have absolutely unmitigated data center demand growth,” he said. “Otherwise, things are going to be very challenging for the ratepayer.”

Dominion Energy, which serves the area around Ashburn, Virginia, home to the highest concentration of data centers in the world, has proposed a similar tariff. Under its proposal, large-load customers would be required to make a 14-year commitment to pay for the power they request regardless of how much they actually use.

Aaron Ruby, spokesperson for Dominion, said the utility is focused on making sure residential customers aren’t subsidizing the cost of infrastructure needed by data center customers and has yet to encounter delays as it pursues 5.9 GW of new gas generation.

He suggested, however, that there was little danger of overbuilding. In the last 20 years, he could only recall one example of Dominion developing infrastructure for a data center that fell through.

“Within a year or two, another data center customer emerged and fully utilized the infrastructure,” he said. 

Others expressed skepticism.

Abe Silverman, an assistant research scholar with Johns Hopkins University’s Ralph O’Connor Sustainable Energy Institute, said it was still too early to know what effects AI will have on power consumption.

“This is a baby industry, and we are building and paying for infrastructure today for data centers that don’t yet exist,” he said. “We are making an investment to build out that grid infrastructure based on these very frothy assumptions.”

[Photo: DataBank’s IAD4 data center under construction in Ashburn, Virginia. Diana DiGangi/Utility Dive]

Kim, the Wood Mackenzie gas researcher, suggested that the power industry has matured since the last bubble burst. 

Back then, the three dominant gas turbine manufacturers — GE (now GE Vernova), Mitsubishi and Siemens — were “burned significantly,” he said. “The utilities were burned significantly by overbuilding, and now they’re more cautious.”

Kim predicted gas generation will likely continue to increase due to coal retirements and growing demand for electricity, with or without data centers. Advanced manufacturing facilities require more energy. Summers are getting hotter as the climate changes, increasing demand for cooling, and larger renewable portfolios will require dispatchable generation that cannot currently be replaced by storage, he added.

But an economic slump, technological innovation, improved efficiency and any number of unforeseen factors could impact what is still a highly speculative growth scenario, he said.

“Nevertheless, as pipelines could take two years to build, four years to build, you then have to start building those pipelines if you’re going to build that power plant and support it,” Kim said. “Whether those power plants or the gas pipeline go underutilized because that boom didn’t realize is yet to be determined.”
