How the Pentagon is adapting to China’s technological rise

It’s been just over two months since Kathleen Hicks stepped down as US deputy secretary of defense. As the highest-ranking woman in Pentagon history, Hicks shaped US military posture through an era defined by renewed competition between powerful countries and a scramble to modernize defense technology.  

She’s currently taking a break before jumping into her (still unannounced) next act. “It’s been refreshing,” she says—but disconnecting isn’t easy. She continues to monitor defense developments closely and expresses concern over potential setbacks: “New administrations have new priorities, and that’s completely expected, but I do worry about just stalling out on progress that we’ve built over a number of administrations.”

Over the past three decades, Hicks has watched the Pentagon transform—politically, strategically, and technologically. She entered government in the 1990s at the tail end of the Cold War, when optimism and a belief in global cooperation still dominated US foreign policy. But that optimism dimmed. After 9/11, the focus shifted to counterterrorism and nonstate actors. Then came Russia’s resurgence and China’s growing assertiveness. Hicks took two previous breaks from government work—the first to complete a PhD at MIT and the second to join the think tank Center for Strategic and International Studies (CSIS), where she focused on defense strategy. “By the time I returned in 2021,” she says, “there was one actor—the PRC (People’s Republic of China)—that had the capability and the will to really contest the international system as it’s set up.”

In this conversation with MIT Technology Review, Hicks reflects on how the Pentagon is adapting—or failing to adapt—to a new era of geopolitical competition. She discusses China’s technological rise, the future of AI in warfare, and her signature initiative, Replicator, a Pentagon effort to rapidly field thousands of low-cost autonomous systems such as drones.

You’ve described China as a “talented fast follower.” Do you still believe that, especially given recent developments in AI and other technologies?

Yes, I do. China is the biggest pacing challenge we face, which means it sets the pace for most capability areas for what we need to be able to defeat to deter them. For example, surface maritime capability, missile capability, stealth fighter capability. They set their minds to achieving a certain capability, they tend to get there, and they tend to get there even faster.

That said, they have a substantial amount of corruption, and they haven’t been engaged in a real conflict or combat operation in the way that Western militaries have trained for or been involved in, and that is a huge X factor in how effective they would be.

China has made major technological strides, and the old narrative of its being a follower is breaking down—not just in commercial tech, but more broadly. Do you think the US still holds a strategic advantage?

I would never want to underestimate their ability—or any nation’s ability—to innovate organically when they put their minds to it. But I still think it’s a helpful comparison to look at the US model. Because we’re a system of free minds, free people, and free markets, we have the potential to generate much more innovation culturally and organically than a statist model does. That’s our advantage—if we can realize it.

China is ahead in manufacturing, especially when it comes to drones and other unmanned systems. How big a problem is that for US defense, and can the US catch up?

I do think it’s a massive problem. When we were conceiving Replicator, one of the big concerns was that DJI had just jumped way out ahead on the manufacturing side, and the US had been left behind. A lot of manufacturers here believe they can catch up if given the right contracts—and I agree with that.

We also spent time identifying broader supply-chain vulnerabilities. Microelectronics was a big one. Critical minerals. Batteries. People sometimes think batteries are just about electrification, but they’re fundamental across our systems—even on ships in the Navy.

When it comes to drones specifically, I actually think it’s a solvable problem. The issue isn’t complexity. It’s just about getting enough mass of contracts to scale up manufacturing. If we do that, I believe the US can absolutely compete.

The Replicator drone program was one of your key initiatives. It promised a very fast timeline—especially compared with the typical defense acquisition cycle. Was that achievable? How is that progressing?

When I left in January, we had still lined up for proving out this summer, and I still believe we should see some completion this year. I hope Congress will stay very engaged in trying to ensure that the capability, in fact, comes to fruition. Even just this week with Secretary [Pete] Hegseth out in the Indo-Pacific, he made some passing reference to the [US Indo-Pacific Command] commander, Admiral [Samuel] Paparo, having the flexibility to create the capability needed, and that gives me a lot of confidence of consistency.

Can you talk about how Replicator fits into broader efforts to speed up defense innovation? What’s actually changing inside the system?

Traditionally, defense acquisition is slow and serial—one step after another, which works for massive, long-term systems like submarines. But for things like drones, that just doesn’t cut it. With Replicator, we aimed to shift to a parallel model: integrating hardware, software, policy, and testing all at once. That’s how you get speed—by breaking down silos and running things simultaneously.

It’s not about “Move fast and break things.” You still have to test and evaluate responsibly. But this approach shows we can move faster without sacrificing accountability—and that’s a big cultural shift.

How important is AI to the future of national defense?

It’s central. The future of warfare will be about speed and precision—decision advantage. AI helps enable that. It’s about integrating capabilities to create faster, more accurate decision-making: for achieving military objectives, for reducing civilian casualties, and for being able to deter effectively. But we’ve also emphasized responsible AI. If it’s not safe, it’s not going to be effective. That’s been a key focus across administrations.

What about generative AI specifically? Does it have real strategic significance yet, or is it still in the experimental phase?

It does have significance, especially for decision-making and efficiency. We had an effort called Project Lima where we looked at use cases for generative AI—where it might be most useful, and what the rules for responsible use should look like. Some of the biggest use may come first in the back office—human resources, auditing, logistics. But the ability to use generative AI to create a network of capability around unmanned systems or information exchange, either in Replicator or JADC2? That’s where it becomes a real advantage. But those back-office areas are where I would anticipate seeing big gains first.

[Editor’s note: JADC2 is Joint All-Domain Command and Control, a DOD initiative to connect sensors from all branches of the armed forces into a unified network powered by artificial intelligence.]

In recent years, we’ve seen more tech industry figures stepping into national defense conversations—sometimes pushing strong political views or advocating for deregulation. How do you see Silicon Valley’s growing influence on US defense strategy?

There’s a long history of innovation in this country coming from outside the government—people who look at big national problems and want to help solve them. That kind of engagement is good, especially when their technical expertise lines up with real national security needs.

But that’s not just one stakeholder group. A healthy democracy includes others, too—workers, environmental voices, allies. We need to reconcile all of that through a functioning democratic process. That’s the only way this works.

How do you view the involvement of prominent tech entrepreneurs, such as Elon Musk, in shaping national defense policies?

I believe it’s not healthy for any democracy when a single individual wields more power than their technical expertise or official role justifies. We need strong institutions, not just strong personalities.

The US has long attracted top STEM talent from around the world, including many researchers from China. But in recent years, immigration hurdles and heightened scrutiny have made it harder for foreign-born scientists to stay. Do you see this as a threat to US innovation?

I think you have to be confident that you have a secure research community to do secure work. But much of the STEM-related research that underpins national defense doesn’t need to be tightly secured in that way, and it really is dependent on a diverse ecosystem of talent. Cutting off talent pipelines is like eating our seed corn. Programs like H-1B visas are really important.

And it’s not just about international talent—we need to make sure people from underrepresented communities here in the US see national security as a space where they can contribute. If they don’t feel valued or trusted, they’re less likely to come in and stay.

What do you see as the biggest challenge the Department of Defense faces today?

I do think trust—or the lack of it—is a big challenge. Whether it’s trust in government broadly or specific concerns like military spending, audits, or politicization of the uniformed military, that issue manifests in everything DOD is trying to get done. It affects our ability to work with Congress, with allies, with industry, and with the American people. If people don’t believe you’re working in their interest, it’s hard to get anything done.

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin, and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


NetBox Labs embraces intersection of network management and AI

“NetBox is intent,” Beevers explained. “This is where network teams are documenting ‘Here is what the network and the infrastructure should look like.’ Think of intent as what is in NetBox.” With the general availability of NetBox Assurance announced this week, the platform now extends beyond documentation to address the

Read More »

9 steps to take to prepare for a quantum future

“If you’re in pharma or chemical industry, they’re using it already,” says Constellation’s Mueller. “You have to look into it,” Mueller warns. And quantum computers are already playing an important role in protein folding, he says. “Quantum qubits are taking over traditional architectures for protein folding and mapping,” he says.

Read More »

Trump tariffs bite down on UK energy shares

Global share prices have fallen after US president Donald Trump introduced sweeping tariffs on a vast range of the country’s trading partners. Fears about the impact that these tariffs could have on the global economy meant that shares on stock markets around the world started to dip when they were first announced on Thursday 3 April. Friday 4 April saw these losses grow, and when markets opened after the weekend, shares fell even further. Energy stocks have shown that they aren’t immune to these losses, with companies across the size and supply chain spectrum all being affected. The supermajors have been hard hit on both sides of the Atlantic, with BP and Shell, ExxonMobil and Chevron, and TotalEnergies and Equinor all taking hits. On 2 April, before the full announcement of the Trump tariffs, BP closed the day on 432.75p. It saw an immediate drop when the market opened on 3 April, falling to 422.20p. By the time the first two days were over, its share price was 371.28p. And when markets opened Monday, shares fell again, hitting a low of 347.85p before regaining some ground to go above 350p. Shell suffered a similar fate, closing 2 April at 2,775.50p and immediately dropping to 2,707p the next day. By market close on 4 April, it was down to 2,491.50p, and fell again Monday to 2,294.50p. Like BP, it regained a little ground to hover around 2,340p. US stocks were similarly hit. ExxonMobil lost $10 over the first two days of the tariff-inspired selloff, going from over $115 to close the week at $105. Monday’s market open cut another $5 off its share price, and the day has seen it go just below the $100 level. Meanwhile, Chevron went from its pre-tariff level of $163.47 down to $143.11 at market close on Friday.

Read More »

Trump Team Proposes Ending Clean Energy Office, Cutting Billions

The US Energy Department is proposing to shut down its Office of Clean Energy Demonstrations and cut some $9 billion in awards for programs regarding carbon capture, direct air capture, solar and hydrogen, according to documents seen by Bloomberg. Under the plan, which isn’t final, the $27 billion agency’s staff would be reduced to 35 employees, and about $10 billion in projects, including $3 billion for so-called hydrogen hubs, would be kept “as is” and transferred to other parts of the Energy Department. The office employed about 250 people before President Donald Trump, a climate change skeptic, returned to office. A decision on the fate of the agency, which was created in 2021 in bipartisan infrastructure legislation, could come as soon as next week, according to Axios, which earlier reported on the proposal. The Energy Department didn’t immediately respond to a request for comment on Friday evening. Money for hydrogen hubs, in which the gas can both be produced and used, is intended to highlight its potential as a carbon-free fuel to run factories and power plants. Direct air capture involves technology that sucks planet-warming carbon dioxide out of the air. Among the funding identified for potential elimination are billions of dollars for hubs in California, the Pacific Northwest, the Mid-Atlantic and the Midwest, while hubs in Texas, Appalachia, and Minnesota, North Dakota and South Dakota would be spared, according to the documents. The plan also recommends keeping $3.5 billion for advanced nuclear reactor projects and $1.9 billion in industrial demonstration program projects. Also slated for termination are carbon capture projects, battery storage projects, and a direct air capture project by a subsidiary of Occidental Petroleum Corp. that was selected for an award of as much as $1.2 billion by the Biden administration in 2023. The plan comes as the Energy Department, which has a

Read More »

US electricity demand will grow 50% by 2050, electrical manufacturer study finds

Driven by data centers and transportation electrification, U.S. electricity demand will increase 2% annually and 50% by 2050, the National Electrical Manufacturers Association said in a study published Monday. Improvements to energy efficiency will mute some potential demand gains, leaving the electrical manufacturing group’s projections “somewhere in the middle” compared with other recent studies, NEMA President and CEO Debra Phillips said in a Friday discussion of the report. The analysis, completed for NEMA by PA Consulting, anticipates 300% growth in data center energy consumption over the next 10 years and 9,000% projected growth in e-mobility power consumption through 2050. The electric vehicle deployment curve has “flattened” in recent years but “we still think the promise is there, for the consumer,” Phillips said, pointing to advances in battery technology establishing greater range and improving prices. The electricity demand expected in the next quarter century “is fairly remarkable,” Phillips said, adding, “our grid wasn’t designed really to meet demand growth at this rate … and so we’re going to have to get creative around the technology and policy solutions that are going to help us meet the demand.” On the policy front, NEMA’s report calls for: permitting and siting reform, including for generation, transmission, distribution and critical minerals development, improved interregional electric transmission and the adoption of grid-enhancing technologies; tax certainty around incentives for grid technologies and domestic manufacturing of critical grid infrastructure, and incentives that enable utilities to make significant smart grid, distributed energy, and resiliency investments; and an all-of-the-above approach to energy resources, including natural gas, small modular reactors and geothermal. In the face of “unprecedented” growth in energy consumption and challenges posed by an aging grid, “utility operators will require cutting-edge solutions that enhance the resilience and reliability of our grid,” Michael Plaster, executive vice president
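As a quick sanity check on those headline figures, simple compounding shows why NEMA’s projection lands “somewhere in the middle.” This is a back-of-the-envelope sketch that assumes a 2025 base year, which the excerpt does not state:

```python
# Compound-growth check on the NEMA headline figures.
# Assumption: base year 2025 (the excerpt does not specify it).
years = 2050 - 2025
print(f"2% per year for {years} years: {1.02**years - 1:+.0%}")             # ~ +64%
print(f"rate implied by +50% by 2050: {1.5**(1/years) - 1:+.2%} per year")  # ~ +1.64%
```

A steady 2% per year would overshoot +50% by 2050, so taken together the two figures suggest growth that is front-loaded in the near-term data center buildout and tapers later, consistent with the report’s note that efficiency gains mute some demand.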

Read More »

FERC urged to reject proposed ROE, incentives for $3B Valley Link transmission project

The Federal Energy Regulatory Commission should reduce the proposed return on equity and deny various incentives for a roughly $3 billion transmission project planned by Transource Energy, Dominion Energy and FirstEnergy, according to Friday filings at the agency. The Valley Link transmission project, which includes two 765-kV backbone transmission lines, is part of the PJM Interconnection’s latest Regional Transmission Expansion Plan, which was approved by the grid operator’s board in late February. The project is slated to come online in December 2029. The project is set to be built by Valley Link Transmission, a joint venture between Transource (36%), which is owned by American Electric Power and Evergy, FirstEnergy (34%) and Dominion (30%). Valley Link in mid-March asked FERC to approve formula rates and transmission incentives for the project, which the company said will “ensure reliability and brings innovative collaboration to large-scale infrastructure development in PJM at a time when efficient and cost-effective regional transmission development is essential.” Valley Link is seeking a 10.9% base ROE, plus an additional 0.5% for belonging to PJM. It also asked for a suite of incentives for the project, including “construction work in progress,” the ability to recoup any prudent costs it may have incurred even if the project is abandoned and a hypothetical capital structure of 60% equity and 40% debt. The proposed ROE and incentives will ease the risks that Valley Link faces as a startup transmission developer, the company told FERC. “Valley Link will face significant risks and challenges in financing, developing, and constructing the project portfolio,” the company said. The incentive package results in “an impermissible transfer of risk onto ratepayers,” the Maryland Office of People’s Counsel, which represents residential ratepayers, told FERC on Friday. Also, the proposed base ROE appears to be too high, according to the OPC, which
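To see what those percentages mean in dollar terms, here is a minimal sketch of how an allowed return is typically computed from ROE and capital structure; the 4.9% cost of debt is an illustrative assumption, not a figure from the filing:

```python
# Rough annual-return math for the Valley Link request (illustrative only).
rate_base = 3.0e9                        # ~$3B project cost
equity_share, debt_share = 0.60, 0.40    # requested hypothetical capital structure
roe = 0.109 + 0.005                      # 10.9% base ROE + 0.5% adder for PJM membership
cost_of_debt = 0.049                     # assumed, not from the filing

weighted_return = equity_share * roe + debt_share * cost_of_debt
print(f"weighted return: {weighted_return:.2%}")                                   # ~8.80%
print(f"annual return on rate base: ${rate_base * weighted_return / 1e6:,.0f}M")   # ~$264M
```

On this math, each extra half-point of ROE applied to a 60% equity layer is worth roughly $9 million a year on a project this size, which is why the ROE and the hypothetical capital structure are precisely what ratepayer advocates contest.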

Read More »

Industry Bodies Examine Texas Upstream Employment for February

According to the Texas Independent Producers and Royalty Owners Association’s (TIPRO) analysis, direct Texas upstream employment for February totaled 205,400. That’s what TIPRO said in a statement sent to Rigzone recently, which cited the latest Current Employment Statistics (CES) report from the U.S. Bureau of Labor Statistics (BLS). In the statement, TIPRO highlighted that the February figure was “an increase of 1,900 industry positions from January employment numbers, subject to revisions”. The industry body noted that this represented an increase of 2,500 jobs in the services sector and a decline of 600 jobs in oil and gas extraction. TIPRO said in its statement that its new workforce data indicated strong job postings for the Texas oil and natural gas industry, adding that there were 10,172 active unique job postings for the Texas oil and natural gas industry in February, including 3,337 new postings. Among the 19 specific industry sectors TIPRO uses to define the Texas oil and natural gas industry, Gasoline Stations with Convenience Stores led in the ranking for unique job listings in February with 2,541 postings, the organization highlighted in its statement. This sector was followed by Support Activities for Oil and Gas Operations, with 2,389 postings, and Petroleum Refineries, with 870 postings, TIPRO outlined. The leading three cities by total unique oil and natural gas job postings were Houston, with 2,368, Midland, with 669, and Odessa, with 449, TIPRO pointed out in the statement. The top three companies ranked by unique job postings in February were Cefco, with 927, Love’s, with 680, and John Wood Group, with 308, according to TIPRO. In its statement, TIPRO noted that, of the top ten companies listed by unique job postings in February, four companies were in the services sector, three were in the gasoline stations with convenience stores category, two were midstream companies, and one was an oil and gas operator. Top posted industry occupations for February included

Read More »

US DOE Awards Operating Contract for Emergency Oil Reserve

The Department of Energy (DOE) has awarded Strategic Storage Partners LLC a $1.4-billion contract to manage and operate the United States’ Strategic Petroleum Reserve for five years, with options to extend. “After a transition period, Strategic Storage Partners, LLC, will assume responsibility for management and operation of the SPR on June 15, 2025”, the DOE said in an online statement. The federally owned oil stocks are stored in underground salt caverns at four sites in Louisiana and Texas. The DOE said the SPR has run under a series of contracts awarded by the agency since 1985. “Strategic Storage Partners submitted their proposal in partnership with two primary joint venture partners, Aptim Federal Services, LLC, headquartered in Baton Rouge, Louisiana and BWXT Technical Services Group, Inc., headquartered in Lynchburg, Virginia”, the statement said. The partnership was selected after a “rigorous competitive selection process”, the DOE said. The previous contractor was Fluor Federal Petroleum Operations LLC. In 2013 this consortium won a $1.46 billion award to operate the SPR, according to a DOE press release dated September 18 of that year. The contract was awarded for an initial 5 years, extendable for another 5, the DOE said then. Based on information on government website USAspending.gov, the contract started April 2014 and is expected to end June 2025. The SPR, mandated by the Energy Policy and Conservation Act to be used only during severe supply disruptions, stood at 396.434 million barrels as of the fourth week of March, according to an online dashboard by the Energy Information Administration. In November 2024 the DOE made the final awards for a reinjection campaign to compensate for withdrawals induced by Russia’s invasion of Ukraine. The replenishment program exceeded the 2022 sales from the SPR by 20 million barrels, according to the DOE. Purchases under the refill program were made “at a good

Read More »

DARPA backs multiple quantum paths in benchmarking initiative

Nord Quantique plans to use the money to expand its team, says Julien Camirand Lemyre, the company’s president, CTO and co-founder. That’s an opportunity to accelerate the development of the technology, he says. “By extension, what this will mean for enterprise users is that quantum solutions to real-world business problems will be available sooner, due to that acceleration,” he says. “And so enterprise customers need to also accelerate how they are thinking about adoption because the advantages quantum will provide will be tangible.” Lemyre predicts that useful quantum computers will be available for enterprises before the end of the decade. “In fact, there has been tremendous progress across the entire quantum sector in recent years,” he says. “This means industry needs to begin thinking seriously about how they will integrate quantum computing into their operations over the medium term.” “We’re seeing, with the deployment of programs like the QBI in the US and investments of billions of dollars from public and private investors globally, an increasing maturity of quantum technologies,” said Paul Terry, CEO at Photonic, which is betting on optically linked silicon spin qubits. “Our architecture has been designed from day one to build modular, scalable, fault-tolerant quantum systems able to be deployed in data centers,” he said. He’s not the only one to mention fault tolerance. DARPA stressed fault tolerance in its announcement, and its selections point to the importance of error correction for the future of quantum computing. The biggest problem with today’s quantum computers is that the number of errors increases faster than the number of qubits, making them impossible to scale up. Quantum companies are working on a variety of approaches to get error rates low enough that quantum computers can grow big enough to actually do real work.
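To make that scaling argument concrete, here is a small sketch using the textbook surface-code approximation for how logical error rates fall with code distance. The threshold and physical error rates are illustrative assumptions, not figures from any of the DARPA awardees:

```python
# Why fault tolerance is the gating factor: below a threshold error rate,
# adding qubits (larger code distance d) suppresses logical errors
# exponentially; above it, more qubits make things worse.
# Standard surface-code approximation:
#   p_logical ~ (p_physical / p_threshold) ** ((d + 1) // 2)
# Threshold and physical rates below are illustrative assumptions.

def logical_error_rate(p_phys: float, d: int, p_threshold: float = 1e-2) -> float:
    return (p_phys / p_threshold) ** ((d + 1) // 2)

for p_phys in (5e-3, 1e-3):
    for d in (3, 7, 15):
        qubits = 2 * d * d   # rough physical-qubit cost per logical qubit
        print(f"p={p_phys:.0e}, d={d:2d} (~{qubits:3d} qubits/logical): "
              f"p_L ~ {logical_error_rate(p_phys, d):.1e}")
```

The gap between the two physical error rates is the whole game: in this toy model, a 5x improvement in hardware turns an encoding that barely helps into one where each step up in code distance buys orders of magnitude, which is what “modular, scalable, fault-tolerant” architectures are chasing.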

Read More »

Zayo’s Fiber Bet: Scaling Long-Haul and Metro Networks for AI Data Centers

Zayo Group Holdings Inc. has emerged as one of the most aggressive fiber infrastructure players in North America, particularly in the context of AI-driven growth. With a $4 billion investment in AI-related long-haul fiber expansion, Zayo is positioning itself as a critical enabler of the AI and cloud computing boom. The company is aggressively expanding its long-haul fiber network, adding over 5,000 route miles to accommodate the anticipated 2-6X increase in AI-driven data center capacity by 2030. This initiative comes as AI workloads continue to push the limits of existing network infrastructure, particularly in long-haul connectivity. New Fiber Routes The new routes include critical connections between key AI data center hubs, such as Chicago-Columbus, Las Vegas-Reno, Atlanta-Ashburn, and Columbus-Indianapolis, among others. Additionally, Zayo is overbuilding seven existing routes to further enhance network performance, resiliency, and low-latency connectivity. This new development is a follow-on to 15 new long-haul routes representing over 5,300 route miles of new and expanded capacity deployed over the last five years. These route locations were selected based on expected data center growth, power availability, existing capacity constraints, and specific regional considerations. The AI Data Center Sector: A Significant Driver of Fiber Infrastructure The exponential growth of AI-driven data center demand means that the U.S. faces a potential bandwidth shortage. Zayo’s investments look to ensure that long-haul fiber capacity keeps pace with this growth, allowing AI data centers to efficiently transmit data between key markets. This is especially important as data center development locations are being driven more by land and power availability than by proximity to market. Emerging AI data center markets get the high-speed fiber they need, especially as they are moving away from expensive power regions (e.g., California, Virginia) to lower-cost locations (e.g., Ohio, Nevada, Midwest). Without the high-speed networking capabilities offered by

Read More »

Crusoe Adds 4.5 GW Natural Gas to Fuel AI, Expands Abilene Data Center to 1.2 GW

Crusoe and the Lancium Clean Campus: A New Model for Power-Optimized Compute Crusoe Energy’s 300-megawatt deployment at the Lancium Clean Campus in Abilene is a significant marker of how data center strategies are evolving to integrate more deeply with energy markets. By leveraging demand flexibility, stranded power, and renewable energy, Crusoe is following a path similar to some of the most forward-thinking projects in the data center industry. But it’s also pushing the model further—fusing AI and high-performance computing (HPC) with the next generation of power-responsive infrastructure. Here’s how Crusoe’s strategy compares to some of the industry’s most notable power-driven data center deployments: Google’s Oklahoma Data Center: Proximity to Renewable Growth A close parallel to Crusoe’s energy-centric site selection strategy is Google’s Mayes County data center in Oklahoma. Google sited its facility there to take advantage of abundant wind energy, aligning with the local power grid’s renewable capacity. Similarly, Crusoe is tapping into Texas’s deregulated energy market, optimizing for low-cost renewable power and the ability to flexibly scale compute operations in response to grid conditions. Google has also been an industry leader in time-matching workloads to renewable energy availability, something that Crusoe is enabling in real time through grid-responsive compute orchestration. Sabey Data Centers in Quincy: Low-Cost Power as a Foundation Another instructive comparison is Sabey Data Centers’ Quincy, Washington, campus, which was built around one of the most cost-effective power sources in the U.S.—abundant hydroelectric energy. Sabey’s long-term strategy has been to co-locate power-intensive compute infrastructure near predictable, low-cost energy sources. Crusoe’s project applies a similar logic but adapts it for a variable grid environment. Instead of relying on a fixed low-cost power source like hydro, Crusoe dynamically adjusts to real-time energy availability, a strategy that could become a model for future power-aware, AI-driven workloads. Compass and Aligned: Modular, Energy-Adaptive
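A minimal sketch of the grid-responsive orchestration idea described above; the threshold, price feed, and job model are all assumptions for illustration, not Crusoe’s actual system:

```python
# Toy power-aware scheduler: run deferrable AI/HPC jobs when real-time power
# is cheap or being curtailed, and hold them when the grid is tight.
# All names and thresholds here are illustrative assumptions.
from collections import deque

CHEAP_PRICE = 25.0   # $/MWh: below this, treat power as surplus

def schedule_tick(price_per_mwh: float, queue: deque, run_job) -> None:
    """One scheduling step against the current real-time price signal."""
    if price_per_mwh <= CHEAP_PRICE:
        while queue:
            run_job(queue.popleft())   # soak up cheap/stranded power
    # otherwise: defer flexible work; latency-sensitive jobs run elsewhere
```

This is the “demand flexibility” lever in miniature: compute absorbs stranded or curtailed energy when it is abundant and steps aside when the grid needs the power back.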

Read More »

Executive Roundtable: Data Center Site Selection and Market Evolution in a Constrained Environment

For the third installment of our Executive Roundtable for the First Quarter of 2025, we asked our panel of seasoned industry experts about how the dynamics of data center site selection have never been more complex—or more critical to long-term success. In an industry where speed to market is paramount, operators must now navigate an increasingly constrained landscape in the age of AI, ultra cloud and hyperscale expansion, marked by fierce competition for land, tightening power availability, and evolving local regulations.  Traditional core markets such as Northern Virginia, Dallas, and Phoenix remain essential, but supply constraints and permitting challenges are prompting developers to rethink their approach. As hyperscalers and colocation providers push the boundaries of site selection strategy, secondary and edge markets are emerging as viable alternatives, driven by favorable energy economics, infrastructure investment, and shifting customer demand.  At the same time, power procurement is now reshaping the equation. With grid limitations and interconnection delays creating uncertainty in major hubs, operators are exploring new solutions, from direct utility partnerships to on-site generation with renewables, natural gas, and burgeoning modular nuclear concepts. The question now is not just where to build but how to ensure long-term operational resilience. As data center demand accelerates, operators face mounting challenges in securing suitable land, reliable power, and regulatory approvals in both established and emerging markets.  And so we asked our distinguished executive panel for the First Quarter of 2025, with grid capacity constraints, zoning complexities, and heightened competition shaping development decisions, how are companies refining their site selection strategies in Q1 2025 to balance speed to market, scalability, and sustainability? And, which North American regions are showing the greatest potential as the next wave of data center expansion takes shape?

Read More »

Podcast: iMasons CEO Santiago Suinaga on the Future of Sustainable AI Data Centers

For this episode of the DCF Show podcast, host Matt Vincent, Editor in Chief of Data Center Frontier, is joined by Santiago Suinaga, CEO of Infrastructure Masons (iMasons), to explore the urgent challenges of scaling data center construction while maintaining sustainability commitments, among other pertinent industry topics. The AI Race and Responsible Construction “Balancing scale and sustainability is key because the AI race is real,” Suinaga emphasizes. “Forecasted capacities have skyrocketed to meet AI demand. Hyperscale end users and data center developers are deploying high volumes to secure capacity in an increasingly constrained global market.” This surge in demand pressures the industry to build faster than ever before. Yet, as Suinaga notes, speed and sustainability must go hand in hand. “The industry must embrace a build fast, build smart mentality. Leveraging digital twin technology, AI-driven design optimization, and circular economy principles is critical.” Sustainability, he argues, should be embedded at every stage of new builds, from integrating low-carbon materials to optimizing energy efficiency from the outset. “We can’t afford to compromise sustainability for speed. Instead, we must integrate renewable energy sources and partner with local governments, utilities, and energy providers to accelerate responsible construction.” A key example of this thinking is peak shaving—using redundant infrastructure and idle capacities to power the grid when data center demand is low. “99.99% of the time, this excess capacity can support local communities, while ensuring the data center retains prioritized energy supply when needed.” Addressing Embodied Carbon and Supply Chain Accountability Decarbonization is a cornerstone of iMasons’ efforts, particularly through the iMasons Climate Accord. Suinaga highlights the importance of tackling embodied carbon—the emissions embedded in data center construction materials and IT hardware. “We need standardized reporting metrics and supplier accountability to drive meaningful change,” he says. “Greater transparency across the supply chain can be

Read More »

Executive Roundtable: The Changing Economics of Data Center Development

For the final installment of our Executive Roundtable for the First Quarter of 2025, we asked our panel of seasoned industry experts about how the economics of data center development are shifting as rising construction costs, supply chain volatility, and evolving enterprise workloads reshape the industry’s growth trajectory.  Operators are under pressure to deliver capacity at scale while managing higher capital expenditures, extended lead times for critical infrastructure, and increasing complexity in customer requirements. At the same time, demand remains strong, driven by AI workloads, cloud expansion, and heightened enterprise reliance on digital infrastructure. To remain competitive, data center providers must rethink everything from procurement strategies to financing models, balancing long-term investment with the need for speed and flexibility in deployment.  Energy procurement and real estate dynamics are also in flux. As power availability becomes a gating factor in core markets, operators and energy specialists are forging new partnerships to secure sustainable, cost-effective solutions. Meanwhile, the real estate landscape is evolving, with developers looking at creative approaches to land acquisition, modular construction, and even repurposing existing assets.  With construction costs escalating, supply chain disruptions extending project timelines, and enterprise IT requirements shifting toward AI and high-density workloads, data center operators, energy providers, and real estate firms are reevaluating their business models. And so we asked our expert panel: From innovative procurement strategies and new financing approaches to evolving power solutions and creative site selection methods, what adjustments are being made to sustain growth and profitability in 2025? And how are industry leaders balancing cost efficiency, scalability, and customer needs in an increasingly complex and competitive market?

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote $200 billion between them to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet as a non-tech company it has become a regular at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) [Image: John Deere’s autonomous 9RX tractor; farmers can oversee it using an app.] While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
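To illustrate the “LLM as a judge” pattern described above, here is a minimal sketch; `call_model` stands in for whatever chat-completion client you use, and the rubric, scoring scale, and model list are assumptions for illustration:

```python
# Minimal LLM-as-judge sketch: one model drafts candidates, one or more
# (cheaper) judge models grade them, and the highest-scoring candidate wins.
# `call_model(model_name, prompt) -> str` is a placeholder for your client.
from statistics import mean
from typing import Callable

def judge_score(call_model: Callable[[str, str], str],
                judge_model: str, task: str, answer: str) -> float:
    """Ask a judge model for a 0-10 score; unparseable replies count as zero."""
    prompt = (f"Task: {task}\nCandidate answer: {answer}\n"
              "Score 0-10 for correctness and completeness. "
              "Reply with the number only.")
    try:
        return float(call_model(judge_model, prompt).strip())
    except ValueError:
        return 0.0

def best_of_n(call_model, worker: str, judges: list[str],
              task: str, n: int = 3) -> str:
    """Generate n candidates; keep the one with the highest mean judge score."""
    candidates = [call_model(worker, task) for _ in range(n)]
    return max(candidates,
               key=lambda c: mean(judge_score(call_model, j, task, c)
                                  for j in judges))
```

One plausible reading of the point being made: as inference prices fall, averaging the scores of three or more judge models becomes cheap enough to use routinely, trading a few extra calls for more reliable selection.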

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
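The excerpt doesn’t reproduce the paper’s method, but the shape of automated red teaming with auto-generated rewards and iterative reinforcement learning looks roughly like the loop below; every component is an illustrative stand-in rather than OpenAI’s actual code:

```python
# Schematic red-teaming loop: an attacker policy proposes prompts, the reward
# combines attack success with novelty (diversity of attacks), and the policy
# updates iteratively. All components are illustrative stand-ins.
def red_team_loop(attacker, target, success_scorer, similarity, steps=1000):
    found = []   # successful attacks discovered so far
    for _ in range(steps):
        attack = attacker.propose()                 # candidate adversarial prompt
        response = target(attack)
        success = success_scorer(attack, response)  # 0..1, rule- or LLM-graded
        novelty = 1.0 - max((similarity(attack, a) for a in found), default=0.0)
        attacker.update(attack, reward=success * novelty)  # RL update step
        if success > 0.5:
            found.append(attack)
    return found
```

The multiplicative reward is the key design idea in this sketch: an attack scores highly only if it both works and differs from what the loop has already found, which pushes the search toward a broad spectrum of novel attacks rather than variations on a single jailbreak.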

Read More »