
NGOs win critical seabed minerals skirmish with Norwegian government


A year ago, Norway became the first country to back deep sea mineral prospecting in its waters, with a government plan to launch an exploration licence bidding round this year (2025).

Barely 11 months later, last December, the Norwegians suspended activity indefinitely, having been sued by a non-governmental organisation (NGO) – the widely respected Worldwide Fund for Nature (WWF).

It was January 2024 when the Storting (Norwegian parliament) voted in favour of opening about 280,000 sq. km (108,000 square miles) of sea space between Jan Mayen island and the Svalbard archipelago for seabed minerals exploration.

It argued that the world needed minerals for the green transition, and that it was necessary to explore the possibility of extracting seabed minerals in a big way from the Norwegian Continental Shelf.

Despite the size of the prospective prize, the minor political party SV (Socialist Left) tabled a demand that the Oslo government scrap its first licensing round, comprising 386 blocks, in return for its support for the 2025 budget – 2025 also being an election year.

Clearly spooked, Prime Minister Jonas Gahr Stoere claimed it was a “postponement”.

But even before the Storting’s January vote, Norway’s PM had come under pressure from the EU.

On November 9 2023, 119 parliamentarians from 16 European countries called for a halt to Norway’s plans to start deep sea mining in the Arctic – a call that will have stung in Oslo.

An open letter was issued, signed by Members of the European Parliament, as well as national and regional parliaments.

It emphasised that the green transition could not be used to justify harming marine biodiversity and the world’s largest natural carbon sink (the ocean), especially when alternatives already exist.

Norway’s decision to proceed with deep sea mining, the letter said, could also set a dangerous precedent in international waters.

© Supplied by WWF
Karoline Andaur, CEO, WWF Norway.

This move forward, without a comprehensive international legal framework for deep sea mining, could open doors to similar ventures in other parts of the world, posing a risk to global ocean biodiversity.

WWF-Norway chief executive Karoline Andaur described the suspension as “a major and important environmental victory”.

However, a potentially major problem for Stoere is that parliamentary elections are due in September.

According to Norwegian media, the Conservative and Progress parties, which lead in the polls, are in favour of deep sea mining.

The blocking by the SV party “has given the next Storting a chance to halt the hasty process,” according to Andaur.

A court decision was expected last month but no determination was evident at the time of going to press.

WWF’s court action was launched in May and is based on the impact assessment that the Stoere Administration used to guide decision making – which, by its own admission, contained scant information to help evaluate the potential impact of Arctic seafloor mining on the marine environment.

This means that, for 99% of the area intended for offer to marine minerals prospectors, there is zero available environmental data.

But there is an estimate of the prize out there.

Two years ago, the Norwegian Petroleum Directorate, on behalf of the country’s Ministry of Petroleum and Energy (MPE), published an assessment.

The Norwegian Continental Shelf survey identified “substantial” metal and mineral resources – millions of tonnes – ranging from copper to rare earth metals.

The list of ‘in-place’ reserve estimates includes:

  • 38 million tonnes of copper
  • 45 million tonnes of zinc
  • 2,317 tonnes of gold
  • 85,000 tonnes of silver
  • 4.1 million tonnes of cobalt
  • 230,000 tonnes of lithium
  • 24 million tonnes of magnesium
  • 8.4 million tonnes of titanium
  • 1.9 million tonnes of vanadium
  • 185 million tonnes of manganese
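For readers who want the figures above in machine-readable form, the in-place estimates can be captured as a simple table. This is a sketch only – the tonnages are transcribed from the NPD figures as quoted in this article, and the helper name `largest_estimates` is illustrative, not part of any published dataset:

```python
# In-place resource estimates for the Norwegian Continental Shelf study area,
# in tonnes, as quoted from the NPD assessment in this article.
ESTIMATES_TONNES = {
    "copper": 38_000_000,
    "zinc": 45_000_000,
    "gold": 2_317,
    "silver": 85_000,
    "cobalt": 4_100_000,
    "lithium": 230_000,
    "magnesium": 24_000_000,
    "titanium": 8_400_000,
    "vanadium": 1_900_000,
    "manganese": 185_000_000,
}

def largest_estimates(estimates, n=3):
    """Return the n largest in-place estimates as (name, tonnes) pairs."""
    return sorted(estimates.items(), key=lambda kv: kv[1], reverse=True)[:n]

if __name__ == "__main__":
    # Manganese, zinc and copper dominate the estimated in-place tonnage.
    for name, tonnes in largest_estimates(ESTIMATES_TONNES):
        print(f"{name:>10}: {tonnes:,} t")
```

As the ranking makes clear, the bulk of the estimated tonnage sits in manganese, zinc and copper, while the precious metals (gold, silver) are comparatively tiny in volume terms.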

Of the metals found on the seabed in the study area, magnesium, niobium, cobalt and rare earth minerals appear on the European Commission’s list of critical minerals.

The WWF has accumulated significant knowledge of proposals around the world to establish an industry to mine minerals from deep ocean seafloors, typically resources such as manganese nodules.

Since 2019, the organisation has worked to ensure a global moratorium on deep seabed mining. Such a moratorium is considered necessary, at least until there is enough science available to make informed decisions about whether to go ahead with this allegedly destructive industry.

Without doubt, 2024 was a frustrating year for Norway as it sought to establish a lead in the seafloor minerals harvesting dash, claiming the energy transition as justification for wanting to be first out of the exploration gate.

On the one hand, research by a team of academics at Exeter University (published in the scientific journal Nature Sustainability in April) came down against deep sea mining.

On the other, the assessment A Deadly Moratorium by the Critical Ocean Mineral Research Center (COMRC) launched mid-October is deeply critical of WWF’s campaign against deep sea resources exploitation.

© Supplied by NCS
Arctic map.

The Exeter report

The Exeter scientists want a blanket worldwide moratorium, insisting the controversial emerging industry currently poses an “unjustifiable environmental risk”.

They insist that the arguments put forward for deep sea mining fail to hold true from both an environmental and economic geology perspective.

They state in a summary: “Crucially, there is currently no coherent ‘net-zero carbon’ argument for the practice because the metals which deep sea mining could potentially source – including copper, nickel and cobalt, which are urgently needed to build renewable energy technology and thereby help decarbonise our society – remain widespread on land.”

They warn too: “Whilst on-land mining is also, by definition, environmentally destructive it is a relatively mature ‘tried and tested’ technology. In contrast, deep sea mining is highly novel and the environmental risk of such activity therefore remains largely unknown and could be extremely severe – and irreversible on human timescales.”

They point to an urgent need to scale up current on-land mining to source the metals needed to tackle the Climate Emergency – mining that should ultimately be displaced by a fully circular ‘recycle and reuse’ economy.

Exeter co-author Dr Kate Littler warned: “The deep sea is the largest biome on Earth, home to unique and vulnerable organisms, many of which are still unknown to science.

“Human activity continues to severely disturb the biology and biogeochemistry of the global ocean through fishing, shipping and pollution; it is imperative we take the utmost care before deciding to decimate the deep sea for transient economic returns.”

Exeter’s Professor James Scourse, who is closely involved in the so-called Convex Seascape Survey – an ambitious five-year project examining ocean carbon storage – said: “If allowed to go ahead, deep sea mining would potentially result in global impacts that transcend national jurisdictions.

“For once, we have an opportunity to prevent catastrophic exploitation and to support responsible onshore mining to the benefit of local communities.”

The COMRC report

Turning to A Deadly Moratorium, amorphous, multi-stakeholder-owned COMRC is unequivocal in its condemnation of WWF’s efforts to put an end to deep sea mineral mining before it has even started.

COMRC claims the moratorium push is forcing countries and mining corporations to double down on some of the deadliest mineral extraction practices known, in the most biodiverse ecosystems on the planet, directly adjacent to human settlements.

The moratorium, it claims, brings avoidable death, disease and displacement to many vulnerable indigenous people each year, while also impeding efforts to decarbonise and increasing greenhouse gas emissions.

Other claims laid include:

  • Increased threat to Western national security and strategic industries – China dominates the world’s production and processing of critical minerals and has introduced restrictions on these minerals nine times from 2009-2020 (Coyne, 2024).
  • Reduction in opportunity for breakthrough medical therapies – investment in nodule exploration has driven increased access to deep sea biological data, creating the opportunity for medical breakthroughs. Yet, the moratorium is claimed to impede this research by cutting off industry funding.

Basically, A Deadly Moratorium is a call for signatories to the moratorium to “reconsider their stance in the name of a more just energy transition and a healthier planet.”

COMRC claims too that several environmental NGOs have disclosed that they are in favour of careful nodule collection.

“While most of these groups are reluctant to publish anything that would undermine the fundraising campaigns of big, corporate NGOs such as WWF and Greenpeace, at least one, The Breakthrough Institute, has already broken ranks (Wang, 2024).

“We know that others have followed, and we are confident that more will come along as they critically analyse the scientific research and the data.”

Meanwhile, the UN’s International Seabed Authority (ISA), which oversees areas of the marine floor that do not belong to national territories, has been working on rules for deep sea mining for years. But they are not yet complete.

The ISA has, to date, granted exploration licenses in various deep sea regions, including in the Pacific Ocean. Some countries such as China, Japan and Russia would like to start mining the seabed as soon as possible.

However, according to the WWF, 32 other states are now calling for a precautionary pause or a moratorium on deep sea mining to allow for more research.

And it has also been claimed that more than 50 international companies, including Apple, Google, Microsoft and BMW, have stated they will not source components made with minerals from deep sea mining. They are publicly listed by WWF.

ISA told Energy Voice: “A total of 31 exploration contracts have been issued to 22 contractors.

“In December 2021, CPRM (the Geological Survey of Brazil) renounced its rights in relation to its exploration contract.

“As of today, the number of contracts in effect is 30 with 21 contractors.”

In the case of CPRM, its contract covered polymetallic nodules, polymetallic sulphides, and cobalt-rich crusts in the Clarion-Clipperton Fracture Zone, Central Indian Ocean Basin and Western Pacific Ocean.

It has taken decades for deep sea mining to get this far – and the pace seems unlikely to quicken anytime soon.

Recommended for you

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

MOL’s Tiszaújváros steam cracker processes first circular feedstock

MOL Group has completed its first certified production trial using circular feedstock at subsidiary MOL Petrochemicals Co. Ltd. complex in Tiszaújváros, Hungary, advancing the company’s strategic push toward circular economy integration in petrochemical production. Confirmed completed as of Sept. 15, the pilot marked MOL Group’s first use of post-consumer plastic

Read More »

Network jobs watch: Hiring, skills and certification trends

Desire for higher compensation Improve career prospects Want more interesting work “A robust and engaged tech workforce is essential to keeping enterprises operating at the highest level,” said Julia Kanouse, Chief Membership Officer at ISACA, in a statement. “In better understanding IT professionals’ motivations and pain points, including how these

Read More »

F5 to acquire CalypsoAI for advanced AI security capabilities

CalypsoAI’s platform creates what the company calls an Inference Perimeter that protects across models, vendors, and environments. The offers several products including Inference Red Team, Inference Defend, and Inference Observe, which deliver adversarial testing, threat detection and prevention, and enterprise oversight, respectively, among other capabilities. CalypsoAI says its platform proactively

Read More »

Chord Announces ‘Strategic Acquisition of Williston Basin Assets’

Chord Energy Corporation announced a “strategic acquisition of Williston Basin assets” in a statement posted on its website recently. In the statement, Chord said a wholly owned subsidiary of the company has entered into a definitive agreement to acquire assets in the Williston Basin from XTO Energy Inc. and affiliates for a total cash consideration of $550 million, subject to customary purchase price adjustments. The consideration is expected to be funded through a combination of cash on hand and borrowings, Chord noted in the statement, which highlighted that the effective date for the transaction is September 1, 2025, and that the deal is expected to close by year-end. Chord outlined in the statement that the deal includes 48,000 net acres in the Williston core, noting that “90 net 10,000 foot equivalent locations (72 net operated) extend Chord’s inventory life”. Pointing out “inventory quality” in the statement, Chord highlighted that “low average NYMEX WTI breakeven economics ($40s) compete at the front-end of Chord’s program and lower the weighted-average breakeven of Chord’s portfolio”. The company outlined that the deal is “expected to be accretive to all key metrics including cash flow, free cash flow and NAV in both near and long-term”. “We are excited to announce the acquisition of these high-quality assets,” Danny Brown, Chord Energy’s President and Chief Executive Officer, said in the statement. “The acquired assets are in one of the best areas of the Williston Basin and have significant overlap with Chord’s existing footprint, setting the stage for long-lateral development. The assets have a low average NYMEX WTI breakeven and are immediately competitive for capital,” he added. “We expect that the transaction will create significant accretion for shareholders across all key metrics, while maintaining pro forma leverage below the peer group and supporting sustainable FCF generation and return of capital,” he continued.

Read More »

AI can aid building energy retrofit decisions, but faces limitations: study

Listen to the article 4 min This audio is auto-generated. Please let us know if you have feedback. Generative AI models are able to produce effective retrofit decisions but do less well identifying which ones can produce the best result most quickly and at the least cost, according to analysis by researchers at Michigan State University. The study, “Can AI Make Energy Retrofit Decisions? An Evaluation of Large Language Models,” is one of the first to examine how large language models, or LLMs, perform in determining efficient and effective building energy retrofits.  Identifying the optimal retrofit solution can be critical from a cost standpoint. Light to medium retrofits can unlock between 10% and 40% in energy savings, or $0.49 to $1.94 per square foot of savings on average, according to JLL research published last September. Despite these savings, these actions aren’t being implemented at the scale required to meet decarbonization targets because of their capital-intensive nature, the report says. Decision-making complexity and the inadequacy of data and tools are also problem, according to the report. To determine the potential of generative AI in addressing these limitations, MSU researchers tasked seven LLMs with generating energy retrofit decisions under two contexts: a technical context focused on maximum CO2 reduction and a sociotechnical context focused on minimum packback period.  The AI-generated retrofit decisions were evaluated based on whether they matched the top-ranked retrofit measure or fell within the top three or the top five measures. The researchers then used a sample of 400 homes from ResStock 2024.2 data, spanning 49 states, to evaluate LLM performance based on accuracy, consistency, sensitivity and reasoning.   Researchers evaluated each LLM by issuing prompts, which included an overview of 16 potential retrofit packages and building-specific information. 
The overview described each retrofit measure’s features like heat pump efficiency, whether

Read More »

CPS Energy to Acquire Nearly 2 GW Gas Plants from ProEnergy

CPS Energy has signed an agreement to acquire four natural gas power plants operated by ProEnergy in the ERCOT area for $1.387 billion. The facilities have an aggregate capacity of 1.632 gigawatts, according to a joint statement. “Located in the Electric Reliability Council of Texas (ERCOT) market, the acquired assets include state-of-the-art, recently constructed peaking natural gas plants in Harris, Brazoria and Galveston Counties”, the companies said. “The acquired assets are dual-fuel capable, providing CPS Energy future optionality to transition to a hydrogen fuel blend that would enable reduced carbon emissions”. San Antonio, Texas-based CPS Energy has a prior agreement with Modern Hydrogen, announced July 22, to use the latter’s technology to convert natural gas into hydrogen. CPS Energy president and chief executive Rudy D. Garza said of the agreement with ProEnergy, “By acquiring recently constructed, currently operating modern power plants that utilize proven technology already in use by CPS Energy, we avoid higher construction costs, inflationary risk, and long timelines associated with building new facilities – while also enhancing the reliability and affordability of the CPS Energy generation portfolio”. “As we add resources to meet the needs of our fast-growing communities, we will continue to look to a diverse balance of energy sources that complement our portfolio, including natural gas, solar, wind and storage, keeping our community powered and growing”, Garza added. CPS Energy earlier issued a request for proposals (RFP) to secure up to 400 MW of wind generation capacity through one or more PPAs (power purchase agreements). “The RFP marks the first time in over a decade that CPS Energy has specifically sought proposals for wind projects. CPS Energy’s target to contract up to 400 MW of wind capacity would bring the utility’s total wind generation to 1,467 MW”, it said in a press release July 31. “The

Read More »

New Fortress Energy Gets New Deal to Supply Gas to Puerto Rico

New Fortress Energy Inc. (NFE) said Tuesday it had agreed contract terms with local authorities to continue delivering natural gas for Puerto Rico’s power system for seven more years. The agreement with the Third-Party Procurement Office and the Public-Private Partnership Authority provides for the supply of up to 75 trillion British thermal units a year (TBtu) “with minimum annual take-or-pay volumes of 40 TBtu, increasing to up to 50 TBtu if certain conditions are met”, New York City-based NFE said in a statement on its website. “This landmark agreement provides two critical benefits to the island. First, it establishes security of supply in San Juan for the next seven years for power plants currently running on LNG”, said NFE chair and chief executive Wes Edens. “Second, it provides for incremental LNG volumes to be delivered, allowing for the conversion of additional gas-ready plants currently burning diesel, resulting in hundreds of millions of dollars in energy savings for Puerto Ricans”. “Puerto Ricans pay far too much for electricity today and this long-term agreement provides cheaper and cleaner fuel for existing power plants for years to come”, Edens added, noting talks with the Puerto Rican government for a long-term fuel supply had been ongoing since April. “This contract complements our existing long-term 25-year supply contract with Energiza and the new 550-megawatt power plant they are developing”. NFE said, “Pricing of the volumes supplied through the GSA [Gas Supply Agreement] is set at a blend of 115 percent of Henry Hub plus $7.95/million Btu, excluding natural gas supplied to the units at San Juan 5 & 6 (which has historically consumed ~20 TBtu per year). Instead, these volumes are priced at 115 percent of Henry Hub plus $6.50/MMBtu”. NFE expects to source the LNG under the new GSA from its Fast LNG facility in Altamira, Mexico. With a capacity of 1.4 million metric

Read More »

USA EIA Sees USA Crude Oil Production Dropping in 2026

The U.S. Energy Information Administration (EIA) expects U.S. crude oil production to drop next year, according to its latest short term energy outlook (STEO), which was released on September 9. In its September STEO, the EIA projected that total U.S. crude oil output, including lease condensate, will average 13.44 million barrels per day in 2025 and 13.30 million barrels per day in 2026. This output came in at 13.23 million barrels per day in 2024, the STEO highlighted. The STEO showed that Lower 48 states, excluding the Gulf of America, will provide 11.18 million barrels per day of the projected total in 2025 and 10.96 million barrels per day of the projected total in 2026. Of this 11.18 million barrel per day figure, the STEO expects 6.52 million barrels per day to come from the Permian region, 1.20 million barrels per day to come from the Bakken region, 1.12 million barrels per day to come from the Eagle Ford region, 0.19 million barrels per day to come from the Appalachia region, 0.03 million barrels per day to come from the Haynesville region, and 2.12 million barrels per day to come from the “rest of [the] Lower 48 states”. In 2026, the STEO sees 6.41 million barrels per day coming from the Permian region, 1.20 million barrels per day coming from the Bakken region, 1.10 million barrels per day coming from the Eagle Ford region, 0.17 million barrels per day coming from the Appalachia region, 0.03 million barrels per day to come from the Haynesville region, and 2.05 million barrels per day to come from the “rest of [the] Lower 48 states”. The EIA’s latest STEO showed that, in 2024, the Permian region produced 6.30 million barrels per day, the Bakken region produced 1.23 million barrels per day, the Eagle Ford

Read More »

Aramco Signs $11B Jafurah Deal

Aramco announced, in a statement posted on its site recently, that it has signed an $11 billion lease and leaseback deal involving its Jafurah gas processing facilities with a consortium of international investors. The consortium is led by funds managed by Global Infrastructure Partners (GIP), a part of BlackRock, Aramco highlighted in the statement. As part of the transaction, a newly formed subsidiary, Jafurah Midstream Gas Company (JMGC), will lease development and usage rights for the Jafurah Field Gas Plant and the Riyas NGL Fractionation Facility, and lease them back to Aramco for a period of 20 years, the statement noted. JMGC will receive a tariff payable by Aramco in exchange for granting Aramco the exclusive right to receive, process, and treat raw gas from Jafurah, Aramco said in the statement, which highlighted that the company will hold a 51 percent majority stake in JMGC, with the remaining 49 percent held by investors led by GIP. The transaction will not impose any restrictions on Aramco’s production volumes, according to the statement. Aramco noted in the statement that the deal is “expected to close as soon as practicable, subject to customary closing conditions”. Jafurah is the largest non-associated gas development in the Kingdom of Saudi Arabia, Aramco highlighted in the statement, pointing out that it is estimated to contain 229 trillion standard cubic feet of raw gas and 75 billion Stock Tank Barrels of condensate. In the statement, Aramco described Jafurah as a “key component in Aramco’s plans to increase gas production capacity by 60 percent between 2021 and 2030, to meet rising demand”. Aramco President and CEO Amin H. Nasser said in the statement, “Jafurah is a cornerstone of our ambitious gas expansion program, and the GIP-led consortium’s participation as investors in a key component of our unconventional gas operations demonstrates

Read More »

Power shortages are the only thing slowing the data center market

Another major shortage – which should not be news to anyone – is power. Lynch said that it is the primary reason many data centers are moving out of the heavily congested areas, like Northern Virginia and Santa Clara, and into secondary markets. Power is more available in smaller markets than larger ones. “If our client needs multi-megawatt capacity in Silicon Valley, we’re being told by the utility providers that that capacity will not be available for up to 10 years from now,” so out of necessity, many have moved to secondary markets, such as Hillsborough, Oregon, Reno, Nevada, and Columbus, Ohio. The growth of hyperscalers as well as AI is driving up the power requirements of facilities further into the multi-megawatt range. The power industry moves at a very different pace than the IT world, much slower and more deliberate. Lynch said the lead time for equipment makes it difficult to predict when some large scale, ambitious data centers can be completed. A multi-megawatt facility may even require new transmission lines to be built out as well. This translates into longer build times for new data centers. CBRE found that the average data center now takes about three years to complete, up from 2 years just a short time ago. Intel, AMD, and Nvidia haven’t even laid out a road map for three years, but with new architectures coming every year, a data center risks being obsolete by the time it’s completed. However, what’s the alternative? To wait? Customers will never catch up at that rate, Lynch said.   That is simply not a viable option, so development and construction must go on even with short supplies of everything from concrete and steel to servers and power transformers.

Read More »

Arista continues to defy expectations, build enterprise momentum

During her keynote, Ullal noted Arista is not only selling high-speed switches for AI data centers but also leveraging its own technology to create a new category of “AI centers” that simplify network management and operations, with a goal of 60% to 80% growth in the AI market. Arista has its sights set on enterprise expansion Arista hired Todd Nightingale as its new president a couple of month ago, and the reason should be obvious to industry watchers: to grow the enterprise business. Nightingale recently served as CEO of Fastly, but he is best known for his tenure as Cisco. He joined when Cisco acquired Meraki, where he was the CEO. Ullal indicated the campus and WAN business would grow from the current $750 million to $800 million run rate to $1.25 billion, which is a whopping 60% growth. Some of this will come from VeloCloud being added to Arista’s numbers, but not all of it. Arista’s opportunity in campus and WAN is in bringing its high performance, resilient networking to this audience. In a survey I conducted last year, 93% of respondents stated the network is more important to business operations than it was two years ago. During his presentation, Nightingale talked about this shift when he said: “There is no longer such a thing as a network that is not mission critical. We think of mission critical networks for military sites and tier one hospitals, but every hotel and retailer who has their Wi-Fi go down and can’t transact business will say the network is critical.” Also, with AI, inferencing traffic is expected to put a steady load on the network, and any kind of performance hiccup will have negative business ramifications. Historically, Arista’s value proposition for companies outside the Fortune 2000 was a bit of a solution

Read More »

Arista touts liquid cooling, optical tech to reduce power consumption for AI networking

Both technologies will likely find a role in future AI and optical networks, experts say, as both promise to reduce power consumption and support improved bandwidth density. Both have advantages and disadvantages as well – CPOs are more complex to deploy given the amount of technology included in a CPO package, whereas LPOs promise more simplicity.  Bechtolsheim said that LPO can provide an additional 20% power savings over other optical forms. Early tests show good receiver performance even under degraded conditions, though transmit paths remain sensitive to reflections and crosstalk at the connector level, Bechtolsheim added. At the recent Hot Interconnects conference, he said: “The path to energy-efficient optics is constrained by high-volume manufacturing,” stressing that advanced optics packaging remains difficult and risky without proven production scale.  “We are nonreligious about CPO, LPO, whatever it is. But we are religious about one thing, which is the ability to ship very high volumes in a very predictable fashion,” Bechtolsheim said at the investor event. “So, to put this in quantity numbers here, the industry expects to ship something like 50 million OSFP modules next calendar year. The current shipment rate of CPO is zero, okay? So going from zero to 50 million is just not possible. The supply chain doesn’t exist. So, even if the technology works and can be demonstrated in a lab, to get to the volume required to meet the needs of the industry is just an incredible effort.” “We’re all in on liquid cooling to reduce power, eliminating fan power, supporting the linear pluggable optics to reduce power and cost, increasing rack density, which reduces data center footprint and related costs, and most importantly, optimizing these fabrics for the AI data center use case,” Bechtolsheim added. “So what we call the ‘purpose-built AI data center fabric’ around Ethernet

Read More »

Network and cloud implications of agentic AI

The chain analogy is critical here. Realistic uses of AI agents will require core database access; what can possibly make an AI business case that isn’t tied to a company’s critical data? The four critical elements of these applications—the agent, the MCP server, the tools, and the data— are all dragged along with each other, and traffic on the network is the linkage in the chain. How much traffic is generated? Here, enterprises had another surprise. Enterprises told me that their initial view of their AI hosting was an “AI cluster” with a casual data link to their main data center network. With AI agents, they now see smaller AI servers actually installed within their primary data centers, and all the traffic AI creates, within the model and to and from it, now flows on the data center network. Vendors who told enterprises that AI networking would have a profound impact are proving correct. You can run a query or perform a task with an agent and have that task parse an entire database of thousands or millions of records. Someone not aware of what an agent application implies in terms of data usage can easily create as much traffic as a whole week’s normal access-and-update would create. Enough, they say, to impact network capacity and the QoE of other applications. And, enterprises remind us, if that traffic crosses in/out of the cloud, the cloud costs could skyrocket. About a third of the enterprises said that issues with AI agents generated enough traffic to create local congestion on the network or a blip in cloud costs large enough to trigger a financial review. MCP tool use by agents is also a major security and governance headache. Enterprises point out that MCP standards haven’t always required strong authentication, and they also

Read More »

There are 121 AI processor companies. How many will succeed?

The US currently leads in AI hardware and software, but China’s DeepSeek and Huawei continue to push advanced chips, India has announced an indigenous GPU program targeting production by 2029, and policy shifts in Washington are reshaping the playing field. In Q2, the rollback of export restrictions allowed US companies like Nvidia and AMD to strike multibillion-dollar deals in Saudi Arabia. JPR categorizes vendors into five segments: IoT (ultra-low-power inference in microcontrollers or small SoCs); Edge (on-device or near-device inference in the 1–100W range, used outside data centers); Automotive (distinct enough to break out from Edge); data center training; and data center inference. There is some overlap between segments, as many vendors play in multiple segments. Of the five categories, inference has the most startups, with 90. Peddie says the inference application list is “humongous,” with everything from wearable health monitors to smart vehicle sensor arrays, to personal items in the home, and every imaginable machine in every imaginable manufacturing and production line, plus robotic box movers and surgeons. Inference also offers the most versatility. “Smart devices” in the past, like washing machines or coffee makers, could do basically one thing and couldn’t adapt to any changes. “Inference-based systems will be able to duck and weave, adjust in real time, and find alternative solutions, quickly,” said Peddie. Peddie said that despite his apparent cynicism, this is an exciting time. “There are really novel ideas being tried, like analog neuron processors and in-memory processors,” he said.
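JPR’s five-way segmentation can be sketched as a simple lookup; the descriptions paraphrase the excerpt, while the classifier helper and its thresholds are illustrative assumptions (the excerpt only pins Edge to the 1–100 W range).

```python
# JPR's five segments, paraphrased from the excerpt.
SEGMENTS = {
    "IoT": "ultra-low-power inference in microcontrollers or small SoCs",
    "Edge": "on/near-device inference in the 1-100 W range, outside data centers",
    "Automotive": "in-vehicle inference, distinct enough to break out from Edge",
    "DC training": "data center model training",
    "DC inference": "data center inference serving",
}

def segments_for(power_watts: float, in_data_center: bool,
                 trains: bool = False, automotive: bool = False) -> list[str]:
    """Rough, illustrative classifier; real vendors often span several segments."""
    if in_data_center:
        return ["DC training"] if trains else ["DC inference"]
    if automotive:
        return ["Automotive"]
    return ["IoT"] if power_watts < 1 else ["Edge"]

print(segments_for(15, in_data_center=False))  # ['Edge']
```

The overlap the excerpt notes is why the helper returns a list: a real vendor’s parts can land in several of these buckets at once.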

Read More »

Data Center Jobs: Engineering, Construction, Commissioning, Sales, Field Service and Facility Tech Jobs Available in Major Data Center Hotspots

Each month Data Center Frontier, in partnership with Pkaza, posts some of the hottest data center career opportunities in the market. Here’s a look at some of the latest data center jobs posted on the Data Center Frontier jobs board, powered by Pkaza Critical Facilities Recruiting. Looking for Data Center Candidates? Check out Pkaza’s Active Candidate / Featured Candidate Hotlist (and, coming soon, a free Data Center Intern listing). Data Center Critical Facility Manager Impact, TX This position is also available in: Cheyenne, WY; Ashburn, VA; or Manassas, VA. This opportunity is working directly with a leading mission-critical data center developer / wholesaler / colo provider. This firm provides data center solutions custom-fit to the requirements of their clients’ mission-critical operational facilities. They provide reliability of mission-critical facilities for many of the world’s largest organizations (enterprise and hyperscale customers). This career-growth minded opportunity offers exciting projects with leading-edge technology and innovation as well as competitive salaries and benefits. Electrical Commissioning Engineer New Albany, OH This traveling position is also available in: Richmond, VA; Ashburn, VA; Charlotte, NC; Atlanta, GA; Hampton, GA; Fayetteville, GA; Cedar Rapids, IA; Phoenix, AZ; Dallas, TX or Chicago, IL. *** ALSO looking for LEAD EE and ME CxA Agents and CxA PMs. *** Our client is an engineering design and commissioning company that has a national footprint and specializes in MEP critical facilities design. They provide design, commissioning, consulting and management expertise in the critical facilities space. They have a mindset to provide reliability, energy efficiency, sustainable design and LEED expertise when providing these consulting services for enterprise, colocation and hyperscale companies.
This career-growth minded opportunity offers exciting projects with leading-edge technology and innovation as well as competitive salaries and benefits. Data Center Engineering Design Manager Ashburn, VA This opportunity is working directly with a leading mission-critical data center developer /

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
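The spending figures quoted above imply some striking growth multiples; a quick check of the arithmetic (all inputs are the article’s figures, in USD billions):

```python
# Capex figures quoted in the excerpt (USD billions).
capex_2023_combined = 110.0   # Microsoft, AWS, Google, Oracle, Meta, Apple in 2023
capex_2025_combined = 200.0   # Bloomberg Intelligence estimate for 2025
msft_2020 = 17.6              # Microsoft's 2020 capital expenditure
msft_2025_bi = 62.4           # Bloomberg Intelligence, calendar 2025
msft_2025_smith = 80.0        # Smith's figure, fiscal year to June 30, 2025

# Combined hyperscaler capex grows ~82% from 2023 to 2025.
growth = (capex_2025_combined - capex_2023_combined) / capex_2023_combined
print(round(growth * 100))  # 82

# Microsoft's own spend is roughly 3.5-4.5x its 2020 level, depending on estimate.
print(round(msft_2025_bi / msft_2020, 1))    # 3.5
print(round(msft_2025_smith / msft_2020, 1)) # 4.5
```

Either way the estimates are read, Microsoft’s AI-era capex is several times its pre-pandemic baseline.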

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it has been a regular, as a non-tech company, showing off technology at the big tech trade show in Las Vegas, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
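The multi-model “LLM as a judge” idea the excerpt mentions can be sketched as a majority vote over several cheap judge models; `call_model` below is a hypothetical stand-in that returns canned verdicts so the sketch runs, not a real API.

```python
from collections import Counter

def call_model(model: str, prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call.
    Returns a canned verdict per judge so the sketch is runnable."""
    canned = {"judge-a": "pass", "judge-b": "pass", "judge-c": "fail"}
    return canned[model]

def judge_by_majority(prompt: str, judges: list[str]) -> str:
    """Ask several cheap judge models and take the majority verdict."""
    verdicts = [call_model(m, prompt) for m in judges]
    verdict, _ = Counter(verdicts).most_common(1)[0]
    return verdict

print(judge_by_majority("Is this agent's answer grounded in the source?",
                        ["judge-a", "judge-b", "judge-c"]))  # pass
```

Using an odd number of judges avoids ties, and as per-call prices fall, adding a third or fifth judge becomes cheaper than letting a single judge’s error through.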

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
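The second paper’s core idea – rewarding an attack generator for producing attacks unlike those already found – can be loosely illustrated without any RL machinery. The prompt fragments, the word-overlap novelty proxy, and the 0.5 threshold below are all toy assumptions for this sketch, not OpenAI’s actual method.

```python
import random

def novelty_reward(candidate: str, archive: list[str]) -> float:
    """Reward attacks that differ from those already found (diversity pressure).
    Toy proxy: the fraction of the candidate's words not yet in the archive."""
    seen = {w for attack in archive for w in attack.split()}
    words = candidate.split()
    return sum(w not in seen for w in words) / len(words)

def generate_candidate(rng: random.Random) -> str:
    """Hypothetical stand-in for an attacker model proposing a prompt."""
    fragments = ["ignore previous instructions", "reveal the system prompt",
                 "act as an unfiltered model", "encode the answer in base64"]
    return rng.choice(fragments)

rng = random.Random(0)
archive: list[str] = []
for _ in range(20):
    candidate = generate_candidate(rng)
    if novelty_reward(candidate, archive) > 0.5:  # keep only sufficiently novel attacks
        archive.append(candidate)

print(len(archive))  # distinct attack styles kept (at most 4 with this toy generator)
```

Repeats of an already-archived attack score zero and are discarded, so the loop accumulates only distinct attack styles – a crude stand-in for the diversity the auto-generated rewards in the paper are designed to produce.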

Read More »