
Can nuclear power really fuel the rise of AI?


In the AI arms race, all the major players say they want to go nuclear.  

Over the past year, the likes of Meta, Amazon, Microsoft, and Google have sent out a flurry of announcements related to nuclear energy. Some are about agreements to purchase power from existing plants, while others are about investments looking to boost unproven advanced technologies.

These somewhat unlikely partnerships could be a win for both the nuclear power industry and large tech companies. Tech giants need guaranteed sources of energy, and many are looking for low-emissions ones to hit their climate goals. For nuclear plant operators and nuclear technology developers, the financial support of massive established customers could help keep old nuclear power plants open and push new technologies forward.

“There [are] a lot of advantages to nuclear,” says Michael Terrell, senior director of clean energy and carbon reduction at Google. Among them, he says, are that it’s “clean, firm, carbon-free, and can be sited just about anywhere.” (Firm energy sources are those that provide constant power.) 

But there’s one glaring potential roadblock: timing. “There are needs on different time scales,” says Patrick White, former research director at the Nuclear Innovation Alliance. Many of these tech companies will require large amounts of power in the next three to five years, White says, but building new nuclear plants can take close to a decade. 

Some next-generation nuclear technologies, especially small modular reactors, could take less time to build, but the companies promising speed have yet to build their first reactors—and in some cases they are still years away from even modestly sized demonstrations. 

This timing mismatch means that even as tech companies tout plans for nuclear power, they’ll actually be relying largely on fossil fuels, keeping coal plants open, and even building new natural gas plants that could stay open for decades. AI and nuclear could genuinely help each other grow, but the reality is that the growth could be much slower than headlines suggest. 

AI’s need for speed

The US alone has roughly 3,000 data centers, and current projections say the AI boom could add thousands more by the end of the decade. The rush could increase global data center power demand by as much as 165% by 2030, according to one recent analysis from Goldman Sachs. In the US, estimates from industry and academia suggest energy demand for data centers could be as high as 400 terawatt-hours by 2030—up from fewer than 100 terawatt-hours in 2020 and higher than the total electricity demand from the entire country of Mexico.
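Those US figures imply a striking growth rate. A quick back-of-the-envelope calculation (the figures are from the estimates above; the compound-growth framing is ours, not the analysts'):

```python
# Implied average annual growth behind the US data-center demand estimates:
# from under 100 TWh in 2020 to as much as 400 TWh by 2030.
twh_2020 = 100   # upper bound of 2020 demand, TWh/year
twh_2030 = 400   # high-end 2030 estimate, TWh/year
years = 10

# Compound annual growth rate needed to quadruple demand in a decade
cagr = (twh_2030 / twh_2020) ** (1 / years) - 1
print(f"Implied average annual growth: {cagr:.1%}")  # ~14.9% per year
```

In other words, the high-end estimate assumes demand quadruples in a decade, growing roughly 15% every year.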

There are indications that the data center boom might be decelerating, with some companies slowing or pausing some projects in recent weeks. But even the most measured projections, in analyses like one recent report from the International Energy Agency, predict that energy demand will increase. The only question is by how much.  

Many of the same tech giants currently scrambling to build data centers have also set climate goals, vowing to reach net-zero emissions or carbon-free energy within the next couple of decades. So they have a vested interest in where that electricity comes from. 

Nuclear power has emerged as a strong candidate for companies looking to power data centers while cutting emissions. Unlike wind turbines and solar arrays that generate electricity intermittently, nuclear power plants typically put out a constant supply of energy to the grid, which aligns well with what data centers need. “Data center companies pretty much want to run full out, 24/7,” says Rob Gramlich, president of Grid Strategies, a consultancy focused on electricity and transmission.

It also doesn’t hurt that, while renewables are increasingly politicized and under attack by the current administration in the US, nuclear has broad support on both sides of the aisle. 

The problem is how to build up nuclear capacity—existing facilities are limited, and new technologies will take time to build. In 2022, all the nuclear reactors in the US together provided around 800 terawatt-hours of electricity to the power grid, a number that’s been basically steady for the past two decades. To meet the data-center electricity demand expected in 2030 with nuclear power alone, the US would need to expand its reactor fleet by half.
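The fleet-expansion claim is easy to sanity-check. A rough sketch (the TWh figures are from the article; the one-gigawatt reactor size and 90% capacity factor are typical industry values we assume here for illustration):

```python
# Sanity check: how much new nuclear would 2030 data-center demand require?
US_NUCLEAR_TWH_2022 = 800    # annual US nuclear generation, TWh (article figure)
DATA_CENTER_TWH_2030 = 400   # high-end US data-center demand estimate, TWh

# Share of today's nuclear output that 2030 data-center demand represents
expansion = DATA_CENTER_TWH_2030 / US_NUCLEAR_TWH_2022
print(f"Fleet expansion needed: {expansion:.0%}")  # 50%

# Rough reactor count, assuming ~1 GW reactors running at ~90% capacity factor
HOURS_PER_YEAR = 8760
twh_per_reactor = 1.0 * 0.90 * HOURS_PER_YEAR / 1000  # ~7.9 TWh per reactor-year
print(f"Roughly {DATA_CENTER_TWH_2030 / twh_per_reactor:.0f} large reactors")
```

Under those assumptions, meeting the high-end 2030 estimate would take on the order of fifty full-size reactors—more than any country has built in a decade.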

New nuclear news 

Some of the most exciting headlines regarding the burgeoning relationship between AI and nuclear technology involve large, established companies jumping in to support innovations that could bring nuclear power into the 21st century. 

In October 2024, Google signed a deal with Kairos Power, a next-generation nuclear company that recently received construction approval for two demonstration reactors from the US Nuclear Regulatory Commission (NRC). The company is working to build small, molten-salt-cooled reactors, which it says will be safer and more efficient than conventional technology. The Google deal is a long-term power-purchase agreement: The tech giant will buy up to 500 megawatts of electricity by 2035 from whatever plants Kairos manages to build, with the first one scheduled to come online by 2030. 

Amazon is also getting involved with next-generation nuclear technology with a direct investment in Maryland-based X-energy. The startup is among those working to create smaller, more-standardized reactors that can be built more quickly and with less expense.

In October, Amazon signed a deal with Energy Northwest, a utility in Washington state, that will see Amazon fund the initial phase of a planned X-energy small modular reactor project in the state. The tech giant will have a right to buy electricity from one of the modules in the first project, which could generate 320 megawatts of electricity and be expanded to generate as much as 960 megawatts. Many new AI-focused data centers under construction will require 500 megawatts of power or more, so this project might be just large enough to power a single site. 

The project will help meet energy needs “beginning in the early 2030s,” according to Amazon’s website. X-energy is currently in the pre-application process with the NRC, which must grant approval before the Washington project can move forward.

Solid, long-term plans could be a major help in getting next-generation technologies off the ground. “It’s going to be important in the next couple [of] years to see more firm commitments and actual money going out for these projects,” says Jessica Lovering, who cofounded the Good Energy Collective, a policy research organization that advocates for the use of nuclear energy. 

However, these early projects won’t be enough to make a dent in demand. The next-generation reactors Amazon and Google are supporting are modestly sized demonstrations—the first commercial installations of new technologies. They won’t be close to the scale needed to meet the energy demand expected from new data centers by 2030. 

To provide a significant fraction of the terawatt-hours of electricity large tech companies use each year, nuclear companies will likely need to build dozens of new plants, not just a couple of reactors. 

Purchasing power 

One approach to get around this mismatch is to target existing reactors. 

Microsoft made headlines in this area last year when it signed a long-term power purchase agreement with Constellation, the owner of the Three Mile Island Unit 1 nuclear plant in Pennsylvania. Constellation plans to reopen one of the reactors at that site and rename it the Crane Clean Energy Center. The deal with Microsoft ensures that there will be a customer for the electricity from the plant, if it successfully comes back online. (It’s currently on track to do so in 2028.)

“If you don’t want to wait a decade for new technology, one of the biggest tools that we have in our tool kit today is to support relicensing of operating power plants,” says Urvi Parekh, head of global energy for Meta. Older facilities can apply for 20-year extensions from the NRC, a process that customers buying the energy can help support as it tends to be expensive and lengthy, Parekh says. 

While these existing reactors provide some opportunity for Big Tech to snap up nuclear energy now, a limited number are in good enough shape to extend or reopen. 

In the US, 24 reactors have licenses that will be up for renewal before 2035, roughly a quarter of those in operation today. A handful of plants could potentially be reopened in addition to Three Mile Island, White says. Palisades Nuclear Plant in Michigan has received a $1.52 billion loan guarantee from the US Department of Energy to reopen, and the owner of the Duane Arnold Energy Center in Iowa has filed a request with regulators that could begin the reopening process.

Some sites have reactors that could be upgraded to produce more power without building new infrastructure, adding a total of between two and eight gigawatts, according to a recent report from the Department of Energy. That could power a handful of moderately sized data centers, but power demand is growing for individual projects—OpenAI has suggested the need for data centers that would require at least five gigawatts of power. 
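The gap between uprate capacity and per-project demand is worth making concrete. A small sketch (the gigawatt figures are from the article; applying the 500-megawatt "moderate" data-center size from earlier in the piece is our illustration):

```python
# Comparing potential reactor-uprate capacity with data-center demand.
uprate_gw_low, uprate_gw_high = 2, 8  # DOE-reported range for uprates, GW
moderate_dc_gw = 0.5                  # a moderately sized AI data center
big_campus_gw = 5                     # scale OpenAI has suggested for one site

print(f"Uprates could supply {uprate_gw_low / moderate_dc_gw:.0f} to "
      f"{uprate_gw_high / moderate_dc_gw:.0f} moderately sized data centers")
print(f"A single {big_campus_gw} GW campus alone would need "
      f"{big_campus_gw / uprate_gw_high:.0%} to "
      f"{big_campus_gw / uprate_gw_low:.0%} of that entire range")
```

That is, even the optimistic end of the uprate range could be consumed by one or two of the largest proposed projects.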

Ultimately, new reactors will be needed to expand capacity significantly, whether they use established technology or next-generation designs. Experts tend to agree that neither route could be built out at scale until at least the early 2030s. 

In the meantime, decisions made today in response to this energy demand boom will have ripple effects for years. Most power plants can last for several decades or more, so what gets built today will likely stay on the grid through 2040 and beyond. Whether the AI boom will entrench nuclear energy, fossil fuels, or other sources of electricity on the grid will depend on what is introduced to meet demand now. 

No individual technology, including nuclear power, is likely to be the one true solution. As Google’s Terrell puts it, everything from wind and solar, energy storage, geothermal, and yes, nuclear, will be needed to meet both energy demand and climate goals. “I think nuclear gets a lot of love,” he says. “But all of this is equally as important.”

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Survey: AMD continues to take server share from Intel

Dean McCarron, president of Mercury, said it’s not AMD stealing Intel business but mostly a case of AMD growing faster than Intel. “AMD’s growth rate in the quarter was multiples of Intel’s, resulting in significant server share gains,” he said in a research note. “Server processor shipments were definitively the

Read More »

Charging Forward: Centrica flags Rough shutdown in blow to hydrogen storage hopes

In this week’s Charging Forward, Centrica has warned it could shut down its Rough offshore gas storage facility without UK government support. Meanwhile, Harmony Energy is in the midst of a bidding war for its battery storage portfolio with competing offers from Drax Group and Foresight Group. In addition, Field Energy, SAE, Balance Power and EOS are all progressing various battery energy storage system (BESS) projects across the UK. This week’s UK energy storage headlines: Centrica warns it may close Rough gas storage facility Harmony Energy Income Trust to auction battery storage portfolio SAE forms joint venture with Econergy for 250 MW Uskmouth BESS EOS gets approval for £62m Teesside BESS Field secures £42m loan to accelerate three UK BESS projects Balance Power secures approval for 29.9 MW Ayrshire BESS Devon council rejects Clearstone Enrgy 150 MW BESS plans International energy storage news: Noon Energy targets ultra long duration storage and China’s Sungrow installs BESS near Arctic Circle Centrica warns it may close Rough offshore gas storage facility British Gas owner Centrica has warned it may shut down and decommission its Rough offshore gas storage facility without additional UK government support. In an interview with the BBC, Centrica chief executive Chris O’Shea said operating the Rough facility will lead to a £100 million loss for the company this year. “If we were to simply spend £2 billion redeveloping this field and the summer-winter gas price stays the same, then we will lose that £2bn and we’ll lose the cost of operating the facility,” O’Shea said. “It’s just not sustainable.” Centrica initially shut down Rough in 2017, but moved to reopen the site following Russia’s invasion of Ukraine in 2022. © Supplied by CentricaCentrica has plans to revamp its Rough site into a storage base for hydrogen. The company is investigating converting

Read More »

Transmission charging adds £1bn to Scottish offshore wind farm costs

Transmission charging is making Scottish offshore wind farms £1 billion more expensive than their English counterpart. A report from Aurora Energy Research found that the potential changes, currently being considered by the UK government, would add the £1bn figure to a 1GW wind farm over the course of its operating life. The report added that reforming the transmission charging model could save billpayers £16bn and protect investments in offshore wind projects. Transmission charging is a levy placed on generators, having originally been designed to incentivise building asset near major cities in England. However, Scotland is looking to build over 45GW of offshore wind in the country’s waters, driven through the ScotWind and INTOG leasing rounds. Transmission charging is now effectively penalising generation in these areas. Scottish Renewables chief executive Claire Mack said: “Scotland’s abundant natural resources should make it the home of the UK’s biggest and most productive renewable energy projects but our outdated transmission charging rules, designed over 30 years ago, are unbalancing how the modern-day electricity network should be paid for which is negatively impacting the development of major sites. “These charges are both volatile and unpredictable, unfairly penalising Scottish projects by tens of millions of pounds every year.” © Supplied by Scottish RenewablesScottish Renewables chief executive Claire Mack. She added that the UK government cannot meet its 2030 clean power targets without Scottish offshore wind. Instead, she urged the UK government and Ofgem to implement a cap and floor model for transmission charging that alleviates these costs and keeps projects on track. “Delivering this meaningful reform will provide a stable, investment-friendly environment – one that protects the clean power projects vital to creating green jobs at scale and delivering a secure, sustainable energy system for the future,” Mack said. 
Transmission charging reforms According to the report, transmission

Read More »

Petrofac restructuring approved by courts as it looks to ‘healthy pipeline of future opportunities’

Aberdeen-based oil services firm Petrofac has had its restructuring plan approved by the High Court of England and Wales. The business has been pursuing a cost-cutting restructuring plan with lenders in order to bring its books back into the black and has now received the green light. In February, Petrofac announced plans to raise $355 million (£280m) in funding as part of the deal that has further reduced shareholder allocation. Existing shareholders were to be allocated 2.2% firm’s total share capital as part of the deal, a downgrade compared to the 2.5% outlined in plans announced just before Christmas. Petrofac appeared in court on 30 April and witnessed “overwhelming support of shareholders and the majority of creditor classes,” it claimed in a stock market update. The firm’s chief executive, Tareq Kawash said: “The sanctioning of the restructuring plan marks a significant milestone for our group. “Implementation of the plan will allow our talented team, clients, suppliers, shareholders and investors, to move forward with renewed confidence, bolstered by a sustainable financial platform, a robust backlog of existing contracts, and a healthy pipeline of future opportunities.” Petrofac CEO Tareq Kawash. The firm has faced financial hardship in recent years as it became entangled in a corruption scandal, as the UK’s Serious Fraud Office (SFO) charged two former Petrofac executives with paying bribes in the United Arab Emirates in 2024.  Additionally, Petrofac’s books have not been as healthy as they once were, and share prices slumped following the COVD pandemic. A decision to defer publishing its audited annual results for 2024 has also resulted the halt of shares trading for the company as well, adding to its list of concerns. This forms an ongoing saga of financial uncertainty for the services giant that has been ongoing since 2023.   However, now that a restructuring

Read More »

FERC rejects MISO plan to speed generation interconnection

Federal regulators on Friday rejected the Midcontinent Independent System Operator’s plan to fast-track new generation interconnections, arguing the proposal lacked limits on the number of projects that could be considered. MISO filed its Expedited Resource Addition Study process, or ERAS, in March, to provide a framework for the accelerated study of generation it said could address urgent resource adequacy and reliability needs. Under the proposal, projects entering the ERAS process would have been studied serially each quarter and granted an expedited interconnection agreement within 90 days. But renewable energy stakeholders argued the proposal could add chaos “to an already complex process.” In a 2-1 vote, FERC commissioners denied the proposal. The ERAS proposal “places no limit on the number of projects that could be entered in the ERAS process, which could result in an ERAS queue with processing times for interconnection requests that are too lengthy to meet MISO’s stated resource adequacy and reliability needs,” Commissioners David Rosner and Lindsay See wrote in their decision. MISO’s proposed tariff language, for example, did not limit the number of interconnection requests or total megawatts of interconnection requests that are eligible to enter the ERAS queue, they noted. “MISO proposes up to 14 opportunities to enter the ERAS process through 2028, which could further impede MISO’s ability to process ERAS requests on an expedited basis,” according to the decision. FERC Chairman Mark Christie dissented, though he said he did not disagree with the majority’s criticism. “I am willing, however, to extend to both the states and MISO a trust that they would implement the ERAS proposal in a manner that would promote the construction of badly needed generation capacity that serves resource adequacy and reliability,” Christie wrote. “One thing we know with no need for further proof: This country, including MISO, is heading for a

Read More »

Stop work order on Empire Wind 1 lifted, Equinor resumes construction

The Trump administration has lifted its stop work order on the 810-MW Empire Wind 1 project offshore New York, allowing construction to resume, developer Equinor said Monday. New York Governor Kathy Hochul, D, said in a Monday release that “countless conversations with Equinor and White House officials” had led to the lifting of the stop work order. “Now, Equinor will resume the construction of this fully-permitted project that had already received the necessary federal approvals,” she said. “I also reaffirmed that New York will work with the Administration and private entities on new energy projects that meet the legal requirements under New York law.” Interior Secretary Doug Burgum said in a Monday X post that he was “encouraged by Governor Hochul’s comments about her willingness to move forward on critical pipeline capacity” for natural gas.  The stop work order was issued April 16, and last week Equinor said the situation would force the company to terminate the project entirely if the situation wasn’t resolved within days, as the stoppage cost around $50 million a week. “I am grateful to Governor Hochul for her constructive collaboration with the Trump Administration, without which we would not have been able to advance this project and secure energy for 500,000 homes in New York,” Anders Opedal, president and CEO of Equinor said in a release. Opedal also thanked the president, New York City Mayor Eric Adams, and several New York lawmakers “as well as labour groups and other advocates that have maintained their steadfast support for the project.” When issuing the order, Burgum said in a letter to the Bureau of Ocean Energy Management that the project was “rushed through by the prior administration without sufficient analysis or consultation among the relevant agencies as relates to the potential effects from the project” and that

Read More »

GOP supporters of IRA clean energy credits outline reconciliation bill recommendations

Dive Brief: A group of Congressional Republicans are asking their party to further tweak proposed changes to clean energy tax credits that the House Ways and Means Committee included in its reconciliation bill text. The proposed budget reconciliation bill would decrease the credits for a number of clean energy technologies after 2028 and phase them out entirely after 2031, as well as additional requirements to qualify for the credits and an early sunset for their transferability. Rep. Jen Kiggans, a Virginia Republican, led 13 other House members in a joint statement Wednesday seeking to extend the transferability mechanism through the phase-out period and make other changes. The GOP tax bill, as currently constructed, passed the House Budget Committee late on Sunday, after the committee failed to clear the bill on Friday. However, as Kiggans and the other lawmakers seek changes to the bill to further protect the clean energy tax provisions, others in the party are looking to speed up the phaseout. Dive Insight: While the original Inflation Reduction Act had technology-neutral clean energy production and investment credits available until 2032, the House Ways and Means Committee’s bill text would move the sunset dates up and phase them out over a four-year period. Companies would be able to receive 80% of the credit values in 2029, 60% in 2030, 40% in 2031, with the credits phasing out entirely in 2032. Another nuclear power production credit would be phased out on the same schedule. Kiggans and her colleagues said they commended the committee “for including reasonable phase-out schedules” for the credits, but said in the May 14 joint statement that a few more changes are needed “to fully realize the intent of these phase-out schedules.” “While many of these provisions reflect a commitment to American energy dominance through an all-of-the-above energy

Read More »

Tariff uncertainty weighs on networking vendors

“Our guide assumes current tariffs and exemptions remain in place through the quarter. These include the following: China at 30%, partially offset by an exemption for semiconductors and certain electronic components; Mexico and Canada at 25% for the components and products that are not eligible for the current exemptions,” Cisco CFO Scott Herron told Wall Street analysts in the company’s quarterly earnings report on May 14. At this time, Cisco expects little impact from tariffs on steel and aluminum and retaliatory tariffs, Herron said. “We’ll continue to leverage our world-class supply chain team to help mitigate the impact,” he said, adding that “the flexibility and agility we have built into our operations over the last few years, the size and scale of our supply chain, provides us some unique advantages as we support our customers globally.” “Once the tariff scenario stabilizes, there [are] steps that we can take to mitigate it, as you’ve seen us do with China from the first Trump administration. And only after that would we consider price [increases],” Herron said. Similarly, Extreme Networks noted the changing tariff conditions during its earnings call on April 30. “The tariff situation is very dynamic, I think, as everybody knows and can appreciate, and it’s kind of hard to call. Yes, there was concern initially given the magnitude of tariffs,” said Extreme Networks CEO Ed Meyercord on the earnings call. “The larger question is, will all of the changes globally in trade and tariff policy have an impact on demand? And that’s hard to call at this point. And we’re going to hold as far as providing guidance or judgment on that until we have finality come July.”

Financial news

Meanwhile, AI is fueling high expectations and influencing investments in enterprise campus and data center environments.
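To make the rate structure Herron describes concrete, here is a minimal sketch of how a blended tariff rate on a bill of materials could be estimated under those cited rates (China at 30% with a semiconductor exemption; Mexico and Canada at 25% for non-exempt items). The components, cost figures, and exemption flags below are invented for illustration, not Cisco data.

```python
# Hypothetical bill of materials: (origin, cost in USD, exempt?).
# Exempt parts (e.g. semiconductors from China) take a 0% rate.
TARIFFS = {"china": 0.30, "mexico": 0.25, "canada": 0.25, "other": 0.0}

bom = [
    ("china", 400.0, True),    # ASICs: exempt semiconductors
    ("china", 100.0, False),   # enclosures, cabling
    ("mexico", 300.0, False),  # assembled boards
    ("other", 200.0, False),   # rest of world, untariffed
]

# Duty applies only to non-exempt items, at the origin's rate.
duty = sum(0.0 if exempt else cost * TARIFFS[origin]
           for origin, cost, exempt in bom)
total = sum(cost for _, cost, _ in bom)
print(f"duty: ${duty:.2f} on ${total:.2f} ({duty / total:.1%} blended rate)")
```

Under these made-up numbers the headline 25–30% rates translate into a much lower blended rate, which is one reason exemptions dominate the guidance discussion.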

Read More »

Liquid cooling becoming essential as AI servers proliferate

“Facility water loops sometimes have good water quality, sometimes bad,” says My Troung, CTO at ZutaCore, a liquid cooling company. “Sometimes you have organics you don’t want to have inside the technical loop.” So there’s one set of pipes that goes around the data center, collecting the heat from the server racks, and another set of smaller pipes that lives inside individual racks or servers. “That inner loop is some sort of technical fluid, and the two loops exchange heat across a heat exchanger,” says Troung. The most common approach today, he says, is to use a single-phase liquid — one that stays in liquid form and never evaporates into a gas — such as water or propylene glycol. But it’s not the most efficient option. Evaporation is a great way to dissipate heat; that’s what our bodies do when we sweat. When water goes from a liquid to a gas, it undergoes a phase change, which absorbs energy and makes everything around it slightly cooler. Of course, few servers run hot enough to boil water — but they can boil other liquids. “Two phase is the most efficient cooling technology,” says Xianming (Simon) Dai, a professor at the University of Texas at Dallas. And it might be here sooner than you think. In a March keynote at Nvidia GTC, Nvidia CEO Jensen Huang unveiled the Rubin Ultra NVL576, due in the second half of 2027, with 600 kilowatts per rack. “With the 600 kilowatt racks that Nvidia is announcing, the industry will have to shift very soon from single-phase approaches to two-phase,” says ZutaCore’s Troung. Another highly efficient cooling approach is immersion cooling. According to a Castrol survey released in March, 90% of 600 data center industry leaders say that they are considering switching to immersion
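A back-of-the-envelope calculation shows why phase change is so attractive at these power densities. The sketch below uses textbook properties of water (specific heat about 4,186 J/(kg·K), latent heat of vaporization about 2.26 MJ/kg) and an assumed 10 K coolant temperature rise; real two-phase systems use dielectric fluids with much lower latent heat, so treat the numbers as illustrative only.

```python
# Coolant mass flow needed to remove 600 kW (the per-rack figure cited
# for Nvidia's Rubin Ultra NVL576), single-phase vs two-phase, using
# water properties for illustration.
Q = 600_000.0        # heat load, W (J/s)
CP_WATER = 4186.0    # specific heat of water, J/(kg*K)
DELTA_T = 10.0       # assumed coolant temperature rise, K
H_FG_WATER = 2.256e6 # latent heat of vaporization of water, J/kg

# Single-phase: all heat goes into raising the coolant's temperature.
flow_single = Q / (CP_WATER * DELTA_T)   # kg/s

# Two-phase: heat is absorbed by evaporating the coolant instead.
flow_two_phase = Q / H_FG_WATER          # kg/s

print(f"single-phase flow: {flow_single:.2f} kg/s")
print(f"two-phase flow:    {flow_two_phase:.3f} kg/s")
print(f"ratio:             {flow_single / flow_two_phase:.0f}x")
```

With these assumptions, evaporating the coolant moves the same heat with roughly fifty times less mass flow than warming it by 10 K, which is the efficiency gap Dai and Troung are pointing at.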

Read More »

Cisco taps OpenAI’s Codex for AI-driven network coding

“If you want to ask Codex a question about your codebase, click ‘Ask.’ Each task is processed independently in a separate, isolated environment preloaded with your codebase. Codex can read and edit files, as well as run commands including test harnesses, linters, and type checkers. Task completion typically takes between 1 and 30 minutes, depending on complexity, and you can monitor Codex’s progress in real time,” according to OpenAI. “Once Codex completes a task, it commits its changes in its environment. Codex provides verifiable evidence of its actions through citations of terminal logs and test outputs, allowing you to trace each step taken during task completion,” OpenAI wrote. “You can then review the results, request further revisions, open a GitHub pull request, or directly integrate the changes into your local environment. In the product, you can configure the Codex environment to match your real development environment as closely as possible.” OpenAI is releasing Codex as a research preview: “We prioritized security and transparency when designing Codex so users can verify its outputs – a safeguard that grows increasingly more important as AI models handle more complex coding tasks independently and safety considerations evolve. Users can check Codex’s work through citations, terminal logs and test results,” OpenAI wrote. Internally, technical teams at OpenAI have started using Codex. “It is most often used by OpenAI engineers to offload repetitive, well-scoped tasks, like refactoring, renaming, and writing tests, that would otherwise break focus. It’s equally useful for scaffolding new features, wiring components, fixing bugs, and drafting documentation,” OpenAI stated.

Cisco’s view of agentic AI

Patel stated that Codex is part of the developing AI agent world, in which Cisco envisions billions of AI agents working together to transform and redefine the architectural assumptions the industry has relied on. Agents will communicate within and

Read More »

US companies are helping Saudi Arabia to build an AI powerhouse

AMD announced a five-year, $10 billion collaboration with Humain to deploy up to 500 megawatts of AI compute in Saudi Arabia and the US, aiming for “multi-exaflop capacity by early 2026.” AWS, too, is expanding its data centers in Saudi Arabia to bolster Humain’s cloud infrastructure. Saudi Arabia has abundant oil and gas to power those data centers, and it is growing its renewable energy resources with the goal of supplying 50% of the country’s power by 2030. “Commercial electricity rates, nearly 50% lower than in the US, offer potential cost savings for AI model training, though high local hosting costs due to land, talent, and infrastructure limit total savings,” said Eric Samuel, associate director of research at IDC. Located near Middle Eastern population centers and fiber optic cables to Asia, these data centers will offer enterprises low-latency cloud computing for real-time AI applications.

Late is great

There’s an advantage to being a relative latecomer to the technology industry, Samuel said. “Saudi Arabia’s greenfield tech landscape offers a unique opportunity for rapid, ground-up AI integration, unburdened by legacy systems,” he said.

Read More »

AMD, Nvidia partner with Saudi startup to build multi-billion dollar AI service centers

Humain will deploy the Nvidia Omniverse platform as a multi-tenant system to accelerate physical AI and robotics through the simulation, optimization, and operation of physical environments. The AMD deal did not disclose the number of chips involved, but it is valued at $10 billion. AMD and Humain plan to develop a comprehensive AI infrastructure through a network of AMD-based AI data centers that will extend from Saudi Arabia to the US and support a wide range of AI workloads across corporate, start-up, and government markets. Think of it as AWS, but offering only AI as a service. AMD will provide its AI compute portfolio (Epyc, Instinct, and FPGA networking) and the AMD ROCm open software ecosystem, while Humain will manage the delivery of the hyperscale data center, sustainable power systems, and global fiber interconnects. The partners expect to activate a multi-exaflop network by early 2026, supported by next-generation AI silicon, modular data center zones, and a software platform stack focused on developer enablement, open standards, and interoperability. Amazon Web Services also got a piece of the action, announcing a more than $5 billion investment to build an “AI zone” in the Kingdom. The zone is the first of its kind and will bring together multiple capabilities, including dedicated AWS AI infrastructure and servers, UltraCluster networks for faster AI training and inference, AWS services like SageMaker and Bedrock, and AI application services such as Amazon Q. Like the AMD project, the zone will be available in 2026. Humain only emerged this month, so little is known about it. But given that it is backed by Crown Prince Salman and has the full weight of the Kingdom’s Public Investment Fund (PIF), which ranks among the world’s largest and

Read More »

Check Point CISO: Network segregation can prevent blackouts, disruptions

Fischbein agrees 100% with his colleague’s analysis and adds that education and training can help prevent such incidents from occurring. “Simulating such a blackout is impossible; it has never been done,” he acknowledges, but he is committed to strengthening personal and team training and risk awareness.

Increased defense and cybersecurity budgets

In 2025, industry watchers expect there will be an increase in the public budget allocated to defense. In Spain, one-third of the budget will be allocated to increasing cybersecurity. But for Fischbein, training teams is much more important than the budget. “The challenge is to distribute the budget in a way that can be managed,” he notes, and to leverage intuitive and easy-to-use platforms, so that organizations don’t have to invest all the money in training. “When you have information, management, users, devices, mobiles, data centers, clouds, cameras, printers… the security challenge is very complex. You have to look for a security platform that makes things easier, faster, and simpler,” he says. “Today there are excellent tools that can stop all kinds of attacks.” “Since 2010, there have been cybersecurity systems, also from Check Point, that help prevent this type of incident from happening, but I’m not sure that [Spain’s electricity blackout] was a cyberattack.”

Leading the way in email security

According to Gartner’s Magic Quadrant, Check Point is the leader in email security platforms. Today email is still responsible for 88% of all malicious file distributions. These attacks, as Fischbein explains, enter through phishing, spam, SMS, or QR codes. “There are two challenges: to stop the threats and not to disturb, because if the security tool is a nuisance it causes more harm than good. It is very important that the solution does not annoy [users],” he stresses. “As almost all attacks enter via e-mail, it is

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction, and commercial landscaping. Moline, Illinois-based John Deere has been in business for 187 years, yet the non-tech company has become a regular at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. That makes it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for businesses and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
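The multi-judge idea the excerpt describes can be sketched in a few lines: score an agent's output with several independent judges and keep the majority verdict. The judges below are stand-in heuristic functions invented for illustration; in a real system each would wrap a call to a different, cheaper model.

```python
from collections import Counter
from typing import Callable, Sequence

Judge = Callable[[str], str]  # maps an agent's answer to "pass" or "fail"

def majority_verdict(answer: str, judges: Sequence[Judge]) -> str:
    """Poll every judge on the answer and return the most common verdict."""
    votes = Counter(judge(answer) for judge in judges)
    return votes.most_common(1)[0][0]

# Hypothetical stand-in judges (fixed heuristics, no API access needed):
strict = lambda a: "pass" if a.strip().endswith(".") else "fail"
lenient = lambda a: "pass" if len(a) > 10 else "fail"
length_cap = lambda a: "pass" if len(a) < 500 else "fail"

print(majority_verdict("Paris is the capital of France.",
                       [strict, lenient, length_cap]))  # -> pass
```

An odd number of judges avoids ties, and because each judge sees the answer independently, a single model's blind spot is outvoted, which is what makes the pattern attractive once per-call costs fall.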

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability, and safety of AI models using these techniques. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model, because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see whether knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases, and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »