The Tokamak Interview: Warrick Matthews, chief executive

Speaking at Hampton Court Palace on the sidelines of the Terra Carta Sustainable Markets Initiative sustainable transition summit, Tokamak Energy chief executive Warrick Matthews described how his job involves “myth busting” around nuclear fusion.

In an exclusive interview, he explains how the company is on a path to commercialising nuclear fusion technology after raising $125 million from existing and strategic investors in November.

Energy Voice: Can you tell me any more about Tokamak’s tech scale-up prototype for wind power?

Warrick Matthews: We’re 15 years old as a company; our founders spun out of the UKAEA (the UK Atomic Energy Authority), where they had operated a conventional tokamak. A tokamak is a device that holds plasma in a strong magnetic field. A conventional tokamak looks like a doughnut – a torus. Our device looks a bit more like a cored apple.

Future machines have to have superconductors. You put energy into them and, at their operating temperature, they have zero resistance. They can carry very high currents, which is why they’re suddenly interesting for power transmission.

Or you can wind them into coil packs and produce very high-field magnets – the magnets that go in a big D-shape, or in a ring around the plasma.

You mentioned how tokamaks can be used for power generation. Please elaborate.

We’re engaged across numerous verticals [with] strategic partners now in propulsion – in water, on land for rail, in the air for new hybrid propulsion and then in space.

When you look at the biggest, highest power-output offshore wind turbines, they’re enormous and expensive, and a lot of the design is based around the nacelle at the top – which is extremely heavy because it’s got hundreds of tonnes of rare-earth permanent magnets installed in the machine.

If you replace that with the technology that we now use, you would literally take out 99.9% of the rare-earth magnet material. You would produce a higher magnetic-field device, so it can be much lighter.

You would also decouple your supply chain from certain countries, specifically those rare-earth magnets coming from China. It gives you more flexibility.

If you looked at it, it looks like a metal ribbon, but within that ribbon there is a layer of superconductor between one and three microns thick – about a twentieth of the width of a human hair.

You’re really minimising the amount of copper or aluminium you would use in a data centre.

In November you secured $125 million (£115m) of financing. What will that be put towards?  

We’ve raised a total of $330 million historically, and a lot of that went into developing technology and, of course, building ST40, the machine we rely upon for our development.

As for the money we’ve raised in the Series C round: firstly, we’ve had some very loyal shareholders and investors across our history, and they followed their investment into this series. But we also wanted to add new investors.

So, we’re really pleased that in this round we added the likes of Lingotto, which is part of the Exor and Agnelli group. We also added Furukawa Electric, one of our supply chain partners, who now invests in us; as a result, we’ve got two unique opportunities in Japan.

We are also rapidly developing this magnet technology within the company, which is revenue generating already. And as we identify new verticals, that gives us the ability to really grow a business within the business.

Is it true that you have a demonstrator already in place specifically for nuclear fusion?

Controlling burning plasma within a device naturally sounds really quite hard. It’s not quite the power of a volcano, but it is containing the fourth state of matter – it’s never going to be easy, right? Machine learning and AI genuinely do hold huge promise for that.

In ST40, we’ve taken plasma temperatures to seven to eight times the temperature of the sun – over 100 million degrees – which is one of the thresholds you need for fusion. Essentially, fusion on Earth is just trying to replicate what nature does really well in the sun, or in a star…

Get plasma really hot, have very strong gravitational fields – which we replicate with a magnet – and then force isotopes of hydrogen that would naturally repel each other to fuse together, make helium and shed a neutron; and there’s your energy.
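Written out, the deuterium-tritium reaction he is describing is the standard one, with the energy split between the helium nucleus and the neutron:

```latex
{}^{2}_{1}\mathrm{D} + {}^{3}_{1}\mathrm{T} \;\rightarrow\; {}^{4}_{2}\mathrm{He}\,(3.5\ \mathrm{MeV}) + \mathrm{n}\,(14.1\ \mathrm{MeV})
```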

How does fusion deal with the different risks around nuclear waste and hydrogen?

The reason why so many companies and governments are driven to develop fusion is that it’s limitless energy. It’s incredibly energy dense, and it’s an inherently fail-safe reaction. So, to unpack a few of those things: fuel-wise, it’s limitless.

Effectively you’re using deuterium, which comes from seawater, and tritium, which a commercial device needs to manufacture a surplus of – and we know how you can do that. You impact lithium with the neutrons, you create a surplus of tritium, and governments like the UK’s are very proactive in working on that part of the plan.
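The lithium step he refers to is the usual tritium-breeding reaction in the blanket (the lithium-6 branch shown here; lithium-7 also contributes at higher neutron energies):

```latex
{}^{6}_{3}\mathrm{Li} + \mathrm{n} \;\rightarrow\; {}^{3}_{1}\mathrm{T} + {}^{4}_{2}\mathrm{He} + 4.8\ \mathrm{MeV}
```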

You’ve got fuel that you don’t have to go to the far corners of the earth to find; it’s sea water.

The energy-dense bit is probably one of my favourite ever stats for fusion: a one-litre bottle of fusion fuel is equivalent to 10 million litres of oil, which is a heck of a prize to go after – and much cleaner, with no CO2.

Back to the inherent fail-safe. Unlike a fission device – and the first thing I’d say is there are lots of fission devices around the world that are incredibly safe, so I’m not knocking fission as a bad thing – it’s very different. It’s the opposite of fusion.
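As a rough sanity check on that energy-density stat: comparing the D-T reaction's 17.6 MeV per fusion event with a typical heating value for crude oil gives a per-kilogram ratio of several million to one, broadly consistent with the quoted claim. The constants below are textbook values; the exact litre-for-litre figure also depends on density assumptions the interview does not state.

```python
# Back-of-envelope comparison of D-T fusion fuel vs oil, per kilogram.
MEV_J = 1.602176634e-13            # joules per MeV
AMU_KG = 1.66053906660e-27         # atomic mass unit in kg

e_per_reaction = 17.6 * MEV_J                # energy released per D-T fusion event
m_per_reaction = (2.014 + 3.016) * AMU_KG    # mass of one deuteron + one triton

fusion_j_per_kg = e_per_reaction / m_per_reaction   # ~3.4e14 J/kg
oil_j_per_kg = 42e6                                 # typical crude oil, ~42 MJ/kg

ratio = fusion_j_per_kg / oil_j_per_kg
print(f"fusion/oil energy density ratio: {ratio:.1e}")  # roughly eight million to one
```

A factor of roughly ten million per kilogram is why even a modest fuel inventory translates into an enormous energy output.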

Clearly, in fission you are splitting a heavy atom and you’re containing a chain reaction, and your fuel is right there in the reactor. Go over to a fusion device, and you are forcing fusion to occur with these incredible parameters around temperature and pressure. And if any of those dropped – if your temperature dropped below that threshold of a hundred million degrees – then everything stops, because it always wants to stop.

In the case of nuclear fusion, there’s no need to get rid of nuclear waste, is that right?

I think, looking at nuclear fission as an industry, given the amount of materials you’re having to deal with versus the upside of non-emitting baseload power, I would still say it’s worth the equation – but you have very long-lived nuclear waste. You’re into activated materials at the thousands or tens of thousands of years, which is quite a hard thing to comprehend.

Fusion is nothing like fission. You do activate some components, like the first wall of the device, but you’re into tens of years, not tens of thousands of years, of having to take care of those components and store them. It’s a very, very different prospect: low-to-intermediate-level waste on a number of key components in the device.

The whole safety point leads us to another, which is regulation, because one of the things that holds nuclear back is the cost and speed of deployment, a lot of which loops back to it being nuclear regulated.

Fusion – whether that’s our own device or the one up the road at UKAEA, which has operated for 40 years – none of it is nuclear regulated. It’s regulated by the Environment Agency and the Health and Safety Executive. It’s equivalent to operating a big hospital.

Do you think the UK is doing enough in terms of funding for fusion right now? 

The UK can look at itself and say we are a global leader in fusion, which is a good start.

We have 40 years of operating the Joint European Torus at Culham in Oxfordshire, the most powerful tokamak that has existed. There are 3,000 people who work around fusion, and it has spun out companies like ours to drive fusion forward. So we’re in a very good place right now.

But the global competition is hot. The US, partner countries and now, increasingly, China are really driving both financial and people resources into fusion. They want to advance quickly.

That’s good for the world, but we don’t want to lose the position that we’ve got, so we’ve really got to keep up the intensity of the activity.

And how far away are we from the first nuclear fusion-generated power, potentially in homes or in industrial works, at commercial scale?

The consensus view of those in the US milestone programme, which includes Tokamak Energy, is that you can design and build a fusion pilot plant generating net energy – which could be going to the grid, or could be heat for an industrial process – by the mid-2030s, a decade away.

The UK’s big project that follows JET (Joint European Torus) in Oxfordshire is called STEP (Spherical Tokamak for Energy Production), which is being built at the site of an old coal-fired power station in Nottinghamshire.

It really does become the power for humanity after that – not just green electrons, but also industrial applications that need heat rather than electricity. It’s great that we can talk about SAF (sustainable aviation fuel) and hydrogen and ammonia powering everything, but you’ve got to produce them.

How do you think fusion might compete with or complement small modular reactors (SMRs) in future? 

SMR technology is more available and nearer-term than fusion. It’s out there right now, so we’re big supporters of getting going with that. But we see it as an important step on the way to fusion.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Cisco, Google Cloud offer enterprises new way to connect SD-WANs

Looking back, the rapid adoption of SaaS and cloud applications led to a WAN transformation and the emergence of SD-WAN via direct internet access, Sambi asserted. “Then, to enhance application performance, enterprises built colocation-based cloud on-ramps, which, while improving latency, introduced complexity and costs. This evolution led to a proliferation

Read More »

Fortinet embeds AI capabilities across Security Fabric platform

“By embedding FortiAI across the Fortinet Security Fabric platform, including new agentic AI capabilities, we’re empowering our customers to reduce the workload on their security and network analysts while improving the efficiency, speed, and accuracy of their security and networking operations,” said Michael Xie, founder, president, and chief technology officer

Read More »

EU States Set to Back More Flexibility for Filling Gas Storage

European Union member states are set to back more flexible rules for filling gas storage before winter, amid criticism that current targets artificially raise prices. Ambassadors from the bloc’s 27 member states will meet in Brussels on Friday to sign off on a joint push for a 10 percentage-point deviation until 2027 from rules that require tanks to be 90 percent full by winter. If their position is agreed soon with the European Parliament in upcoming talks, the new regulations could come into effect before the next heating season. The targets were brought in at the height of the energy crisis, when a drop in Russian flows sparked concern that Europe might not have enough gas to make it through a cold winter. But countries like Germany have said the rules helped inflate prices as speculators bet on expected purchases. The regulations have also been criticized for distorting the market by pushing up prices in summer, when they’d normally be cheapest. The plan to secure more leeway to fill storage, combined with the fallout of the trade war, has helped spark a sharp drop in prices. European gas futures this week hit the lowest since September, extending a retreat from February’s two-year high. Energy costs are a key concern for EU officials and governments.  In parliament, lawmakers in the industry committee will on April 24 vote on their position on the storage regulation. Changes proposed by the center-right European People’s Party, the largest group in the assembly, are broadly similar to what’s likely to be agreed by member states on Friday. “The EPP is calling for a more balanced approach that maintains energy security but urgently returns to market-based mechanisms,” Andrea Wechsler, an EPP negotiator, said at a committee meeting this week. Storage Proposals Under the proposals, the Nov. 1 deadline

Read More »

National Subsea Centre and AI firm partner on decom scheduling tool

A piece of software to optimise subsea decommissioning operations has been developed by Aberdeen’s National Subsea Centre (NSC) and marine AI firm PlanSea. The new tool aims to slash emissions from operations while giving users cheaper and “resource-efficient” outcomes. The duo behind the software have said that it will address the challenges facing a sector which is set to see an estimated £21 billion in spending within the UK over the next decade. The project was funded in part by a grant awarded by the Scottish Funding Council in partnership with Scottish Enterprise. Jim Cargill, chief executive of PlanSea said: “The NSC and PlanSea have successfully collaborated for many years to utilise AI to optimise marine logistics, and this logical extension of our joint technological capabilities to cover subsea decommissioning evidences a UK centre of excellence in this sector. “This is another step in our quest to digitalise and optimise offshore operations whilst at the same time reducing costs and emissions.” © Supplied by RGUAI Dashboard for NSC and PlanSea’s decommissioning software. The tech takes in data from private and public sources to produce “task-based activities” which are compatible with current PlanSea visualisation and optimisation software. It then creates standalone or campaign plans, which show vessel schedules and task allocations. The tool also simulates varying fleet composition and tooling or resource functionality for alternative options for decommissioning plans Dr. Ciprian Zavoianu, NSC Research programme lead and principal investigator for this project, added: “Together with my NSC colleagues, Dr. Fyvie and Dr. Santos, I’ve had another great experience working with PlanSea on the timely delivery of a challenging piece of marine logistics applied research. 
“Dr Fyvie has been instrumental in developing a robust software demonstrator that can ingest decommissioning data from multiple sources and subsequently leverage a bespoke AI-enhanced scheduler to

Read More »

NEO and Repsol to save ‘hundreds of millions in tax’ following merger

Tax savings worth hundreds of millions for North Sea oil and gas companies are the “main driver” of deals like Repsol and Neo Energy. CMS partner Norman Wisely, who worked on the tie-up between Repsol and Neo, said the firms could save “hundreds of millions in tax” following the deal. The two businesses will combine their North Sea oil and gas producing assets, with Neo claiming 55% ownership of the new Neo Next Energy, while Repsol takes ownership of the remaining 45% stake. As the Spanish energy giant retains $1.8 billion in decommissioning liability, the Aberdeen-based operator stands to gain, Wisely explained. Norman Wisely Meanwhile, the Madrid-headquartered company will no longer have to report carbon emissions from the North Sea in the same way, due to having relinquished its status as an operator to that of a minority shareholder in the firm, which is expected to have a combined production of more than 130,000 boe/d this year. Wisely explained the structure of the deal means the new entity will be able to take advantage of tax relief on the costs associated with decommissioning ageing North Sea rigs and infrastructure. Decom discounts Repsol is thought to have one of the largest decommissioning liability portfolios in the North Sea, including the Fulmar, Saltire and Tartan platforms. The combined group will target “synergies” from the combination worth over $1 billion, a law firm acting for Repsol has said. Slaughter and May, which acted for Repsol‘s E&P division that will own the stake in the new North Sea business, said the Spanish firm will commit to spend up to $1.8bn, which represents around 40% of the decommissioning liabilities. “Repsol will retain $1.8bn of decom liability. So that is a debt effectively that is being paid into the company [Neo Next],” Wisely told Energy Voice.

Read More »

In an uncertain world, you need a forward-thinking recruitment partner

The energy industry is gripped by uncertainty. Market volatility, energy security, shifting geopolitics and pressure to expand into cleaner energy are forcing companies to rethink their focus. That includes planning for tomorrow’s workforce.   About partnership content Some Energy Voice online content is funded by outside parties. The revenue from this helps to sustain our independent news gathering. You will always know if you are reading paid-for material as it will be clearly labelled as “Partnership” on the site and on social media channels, This can take two different forms. “Presented by”This means the content has been paid for and produced by the named advertiser. “In partnership with”This means the content has been paid for and approved by the named advertiser but written and edited by our own commercial content team. Securing the best talent is critical. Companies can’t afford to be left behind. Advanced technology represents the latest evolution of recruitment. Its expert use can provide faster, more reliable and broader-reaching recruitment solutions. For employers and candidates, it pays to work with a recruitment partner at the forefront of technology, who understands the industry and retains its human touch. How technology is transforming energy recruitment Technology has reshaped talent acquisition. Artificial intelligence (AI) and machine learning have moved beyond traditional automation to bring measurable improvements to hiring efficiency. For recruitment companies, these technologies open up the world of talent globally. It means they can access the best people, from contingent workers to executives, regardless of where they are. This article explores how AI specifically is revolutionising energy recruitment. Advanced algorithms can assess candidate skills, experience and compatibility, connecting with the right opportunities faster than ever before. In some cases, that can mean cutting the recruitment cycle from weeks to days or even minutes. 
Everybody wants the best people. Being

Read More »

Testing Starts on First Higher Enriched Fuel in U.S. Commercial Reactor

Southern Nuclear recently loaded a new higher enriched nuclear fuel into a commercial reactor for irradiation testing.  This is the first time fuel enriched above 5 percent will be irradiated in a U.S. commercial reactor.  The higher enrichment levels allow the fuel to last longer and operate at increased power levels — potentially leading to additional reliable power production at nuclear power plants across the country.   The advanced fuel was developed through the U.S. Department of Energy (DOE) Accident Tolerant Fuel Program to help improve fuel cycle safety and lower operational costs.  Preparing for Testing  Southern Nuclear recently loaded four lead test assemblies containing Westinghouse Electric Company’s ADOPT® fuel pellets into the Vogtle Unit 2 reactor in Waynesboro, GA.   The new fuel is enriched up to 6 weight percent of uranium-235 – the main fissile isotope that produces energy during a chain reaction and could be a game-changer for the industry.   Commercial reactors currently operate on fuel that typically ranges between 3 and 5 percent enrichment.   The higher enriched fuel could help extend operation cycles from 18 to 24 months, allow for higher power output, and lead to less waste generated over the life of the reactor. The pellets, which also include additives expected to enhance safety performance, were derived from higher enriched uranium oxide powder that was first prepared by Idaho National Laboratory.   The powder was further processed into fuel pellets and fabricated into pins at before they were shipped to Vogtle Unit 2 for commercial testing.  “This achievement is a significant step forward for not only the resiliency of the entire U.S. operating fleet, but future nuclear technologies,” said Pete Sena, Southern Nuclear Chairman, President and CEO. “Our goal is to operate our units for longer periods with higher output, and with higher enriched fuel, we’re even better positioned

Read More »

Trump Signs Order to Protect Big Oil From State Emissions Fines

US President Donald Trump is ordering the Justice Department to take legal action against states that impose fines on fossil fuel companies for their greenhouse gas emissions. Trump signed an executive order Tuesday directing US Attorney Pam Bondi to identify state laws or regulations that could impede the use of domestic oil and gas. The president told the Justice Department to prioritize targeting state efforts to address climate change and environmental justice. Oil executives had identified efforts by New York and other states to penalize the industry for its greenhouse gas emissions and contributions to climate change as a top concern during a meeting with the president at the White House last month, according to people familiar with the matter. It’s an example of how the industry is getting much of what it wants from the administration, even as Trump’s global tariffs have triggered a sharp drop in crude prices over the past week. The White House didn’t immediately comment.  Oil and gas companies regularly face fines and in some cases lawsuits from states pushing to cut emissions and hold the industry responsible for its contributions to climate change. Trump’s executive order singled out New York’s climate superfund law and California‘s cap-and-trade system for scrutiny. “Directing the Department of Justice to address this state overreach will help restore the rule of law and ensure activist-driven campaigns do not stand in the way of ensuring the nation has access to an affordable and reliable energy supply,” Ryan Meyers, senior vice president and general counsel for the American Petroleum Institute, said in a statement.  WHAT DO YOU THINK? Generated by readers, the comments included herein do not reflect the views and opinions of Rigzone. All comments are subject to editorial review. Off-topic, inappropriate or insulting comments will be removed. MORE FROM THIS AUTHOR Bloomberg

Read More »

Podcast: Nomads at the Frontier – AI, Infrastructure, and Data Center Workforce Evolution at DCD Connect New York

The 25th anniversary of the latest Data Center Dynamics event in New York City last month (DCD Connect NY 2025) brought record-breaking attendance, underscoring the accelerating pace of change in the digital infrastructure sector. At the heart of the discussions were evolving AI workloads, power and cooling challenges, and the crucial role of workforce development. Welcoming Data Center Frontier at their show booth were Phill Lawson-Shanks of Aligned Data Centers and Phillip Koblence of NYI, who are respectively managing director and co-founder of the Nomad Futurist Foundation. Our conversation spanned the pressing issues shaping the industry, from the feasibility of AI factories to the importance of community-driven talent pipelines. AI Factories: Power, Cooling, and the Road Ahead One of the hottest topics in the industry is how to support the staggering energy demands of AI workloads. Reflecting on NVIDIA’s latest announcements at GTC, including the potential of a 600-kilowatt rack, Lawson-Shanks described the challenges of accommodating such density. While 120-130 kW racks are manageable today, scaling beyond 300 kW will require rethinking power distribution methods—perhaps moving power sleds outside of cabinets or shifting to medium-voltage delivery. Cooling is another major concern. Beyond direct-to-chip liquid cooling, air cooling still plays a role, particularly for DIMMs, NICs, and interconnects. However, advances in photonics, such as shared laser fiber interconnects, could reduce switch power consumption, marking a potential turning point in energy efficiency. “From our perspective, AI factories are highly conceivable,” said Lawson-Shanks. “But we’re going to see hybridization for a while—clients will want to run cloud infrastructure alongside inference workloads. The market needs flexibility.” Connectivity and the Role of Tier-1 Cities Koblence emphasized the continuing relevance of major connectivity hubs like New York City in an AI-driven world. 
While some speculate that dense urban markets may struggle to accommodate hyperscale AI workloads,

Read More »

2025 Data Center Power Poll

@import url(‘/fonts/fira_sans.css’); a { color: #0074c7; } .ebm-page__main h1, .ebm-page__main h2, .ebm-page__main h3, .ebm-page__main h4, .ebm-page__main h5, .ebm-page__main h6 { font-family: “Fira Sans”, Arial, sans-serif; } body { letter-spacing: 0.025em; font-family: “Fira Sans”, Arial, sans-serif; } button, .ebm-button-wrapper { font-family: “Fira Sans”, Arial, sans-serif; } .label-style { text-transform: uppercase; color: var(–color-grey); font-weight: 600; font-size: 0.75rem; } .caption-style { font-size: 0.75rem; opacity: .6; } #onetrust-pc-sdk [id*=btn-handler], #onetrust-pc-sdk [class*=btn-handler] { background-color: #005ea0 !important; border-color: #005ea0 !important; } #onetrust-policy a, #onetrust-pc-sdk a, #ot-pc-content a { color: #005ea0 !important; } #onetrust-consent-sdk #onetrust-pc-sdk .ot-active-menu { border-color: #005ea0 !important; } #onetrust-consent-sdk #onetrust-accept-btn-handler, #onetrust-banner-sdk #onetrust-reject-all-handler, #onetrust-consent-sdk #onetrust-pc-btn-handler.cookie-setting-link { background-color: #005ea0 !important; border-color: #005ea0 !important; } #onetrust-consent-sdk .onetrust-pc-btn-handler { color: #005ea0 !important; border-color: #005ea0 !important; background-color: undefined !important; }

Read More »

How Microgrids and DERs Could Solve the Data Center Power Crisis

Microgrid Knowledge’s annual conference will be held in Dallas, Texas this year. Energy industry leaders and microgrid developers, customers and enthusiasts will gather April 15-17 at the Sheraton Dallas, to learn from each other and discuss a wide variety of microgrid related topics. There will be sessions exploring the role microgrids can play in healthcare, military, aviation and transportation, as well as other sectors of the economy. Experts will share insights on fuels, creating flexible microgrids, integrating electric vehicle charging stations and more.  “Powering Data Centers: Collaborative Microgrid Solutions for a Growing Market” is expected to be one of the most popular sessions at the conference. Starting at 10:45am on April 16, industry experts will tackle the biggest question facing data center operators and the energy industry – how can we solve the data center energy crisis? During the session, the panelists will discuss how private entities, developers and utilities can work together to deploy microgrids and distributed energy technologies that address the data center industry’s rapidly growing power needs. They’ll share solutions, technologies and strategies to favorably position data centers in the energy queue. In advance of the conference, we sat down with two of the featured panelists to learn more about the challenges facing the data center industry and how microgrids can address the sector’s growing energy needs. We spoke with session chair Samantha Reifer, director of strategic alliances at Scale Microgrids and Elham Akhavan, senior microgrid research analyst at Wood Mackenzie. Here’s what Reifer and Akhavan had to say: The data center industry is growing rapidly. What are the critical challenges facing the sector as it expands? Samantha Reifer: The biggest barrier we’ve been hearing about from our customers and partners is whether these data centers can get power where they want to build? For a colocation

Read More »

Data Center Jobs: Engineering, Construction, Commissioning, Sales, Field Service and Facility Tech Jobs Available in Major Data Center Hotspots

Each month Data Center Frontier, in partnership with Pkaza, posts some of the hottest data center career opportunities in the market. Here’s a look at some of the latest data center jobs posted on the Data Center Frontier jobs board, powered by Pkaza Critical Facilities Recruiting. Data Center Facility Technician (All Shifts Available) Impact, TXThis position is also available in: Tacoma, WA; Ashburn, VA; New Albany, OH; Needham, MA and New York, NY.  Also working on lead electrical and lead mechanical CFTs in Chantilly, VA. Navy nuke / military vets leaving service accepted! This opportunity is working with a leading mission-critical data center provider. This firm provides data center solutions custom-fit to the requirements of their client’s mission-critical operational facilities. They provide reliability of mission-critical facilities for many of the world’s largest organizations facilities supporting enterprise clients, colo providers and hyperscale companies. This opportunity provides a career-growth minded role with exciting projects with leading-edge technology and innovation as well as competitive salaries and benefits. Electrical Commissioning EngineerPhoenix, AZThis traveling position is also available in: Boydton, VA; Richmond, VA; Ashburn, VA; Charlotte, NC; Atlanta, GA; Hampton, GA; Fayetteville, GA; St Louis, MO; New Albany, OH; Dallas, TX; Chicago, IL or Toronto, ON. *** ALSO looking for a LEAD EE and ME CxA Agents.*** Our client is an engineering design and commissioning company that has a national footprint and specializes in MEP critical facilities design. They provide design, commissioning, consulting and management expertise in the critical facilities space. They have a mindset to provide reliability, energy efficiency, sustainable design and LEED expertise when providing these consulting services for enterprise, colocation and hyperscale companies. 
This career-growth minded opportunity offers exciting projects with leading-edge technology and innovation as well as competitive salaries and benefits. Switchgear Field Service Technician – Critical FacilitiesNationwide Travel This position is also

Read More »

How Tariffs Could Impact Data Centers, AI, and Energy Amid Supply Chain Shifts

The present imposition of sweeping tariffs by the U.S. government has sent ripples through various sectors of the economy, with the data center industry poised to experience significant ramifications. These tariffs, encompassing a baseline 10% duty on all imports and escalating to higher percentages for specific countries—such as 54% on Chinese imports—are set to influence data center construction, hardware manufacturing, software development, supply chains, user demand, and energy consumption.​ Impact on Data Center Construction, Energy Access and Site Seletion Data center construction has long been dependent on key materials such as steel and aluminum, which are essential for everything from structural frameworks to power infrastructure. The newly enacted 25% tariff on steel imports represents a significant escalation in material costs, with analysts predicting a ripple effect throughout the entire data center ecosystem. For the construction industry, this price hike means an immediate increase in the cost per square foot of building new facilities—costs that are likely to be passed on to developers and ultimately to end users. More concerning, however, is the potential for delayed project timelines. The data center industry, already operating under tight deadlines to meet surging demand for digital infrastructure, could see construction timelines stretched as a result of both rising material costs and the limited availability of key components. Steel and aluminum are used in not just the physical building, but in critical power systems—transformers, switchgear, and cooling equipment. A shortage of these materials could, therefore, exacerbate ongoing supply chain bottlenecks, pushing back go-live dates for new facilities and forcing operators to reevaluate their development strategies. Furthermore, analysts are particularly worried about the compounded impact of these tariffs on already-strained energy access. 
In regions like Northern Virginia, Silicon Valley, and parts of Texas, data center growth has been stifled by grid congestion, making it difficult to

Read More »

With Ampere Deal, SoftBank Tightens Its Grip on AI Data Centers

From Silicon to Stargate: Aligning with OpenAI, Oracle, and the Future of AI Infrastructure

The Ampere acquisition doesn’t stand alone. It is the latest and perhaps most strategic move in a broader chess game SoftBank is playing across the AI and data infrastructure landscape. To understand its full impact, the deal must be seen in context with SoftBank’s recent alignment with two other heavyweight players: OpenAI and Oracle. As you might’ve heard, earlier this year, OpenAI unveiled plans for its Stargate project—a massive, multi-billion-dollar supercomputing campus set to come online by 2028. Stargate is expected to be one of the largest AI infrastructure builds in history, and Oracle will be the primary cloud provider for the project. Behind the scenes, SoftBank is playing a key financial and strategic role, helping OpenAI secure capital and compute resources for the long-term training and deployment of advanced AI models. Oracle, in turn, is both an investor in Ampere and a major customer—one of the first hyperscale operators to go all-in on Ampere’s Arm-based CPUs for powering cloud services. With SoftBank now controlling Ampere outright, it gains a stronger seat at the table with both Oracle and OpenAI—positioning itself as an essential enabler of the AI supply chain from silicon to software.

The Ampere deal gives SoftBank direct access to a custom silicon pipeline purpose-built for the kind of high-efficiency, high-throughput compute that AI inference and model serving demand at scale. Combine this with SoftBank’s ownership of Arm, the bedrock of energy-efficient chip design, and its portfolio now spans everything from the instruction set to the cloud instance. More importantly, it gives SoftBank leverage. In a world where NVIDIA dominates AI training workloads, there’s growing appetite for alternatives in inference, especially at scale where power, cost, and flexibility become deciding factors. Ampere’s CPU roadmap,

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for clients and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »