Moomba CCS in Australia on Track to Achieve Declared Work Rate, Says Santos

The Moomba carbon capture and storage (CCS) project onshore South Australia stored 340,000 metric tons of carbon dioxide equivalent (CO2e) at yearend after starting service October 2024, operator Santos Ltd. has said, touting the facility as a showcase of Australia’s potential for the technology.

“The technology and reservoir is [sic] performing as expected, putting Moomba CCS on track to safely and permanently sequester up to 1.7 million tonnes per annum of CO2e, depending on CO2 availability”, the local gas and oil company said in an online statement.

A recent analysis by the Institute for Energy Economics and Financial Analysis (IEEFA) of another Australian CCS project found underperformance and cast doubt about the technology’s viability for abating emissions.

The Chevron Corp.-led Gorgon CCS injection system captured just 30 percent of the CO2 emitted from natural gas extraction by the Gorgon LNG and domestic gas project in the last Australian fiscal year (July 2023-June 2024), IEEFA reported November 28, 2024.

Gorgon CCS had been approved on the condition it captures, on a five-year rolling average from 2016, at least 80 percent of CO2 emissions from wells drilled for the gas facility, according to information published online by Chevron Australia Pty. Ltd.

Santos assured its project “is delivering immediate and real large-scale emissions reduction for the company and for Australia at a very competitive lifecycle cost”.

“The project is providing a real confidence boost for the potential of CCS technology to help Australia reach net zero and decarbonize faster, at scale and affordably”, the Adelaide-based exploration and production company added.

At full injection rate, Moomba CCS avoids more CO2 in four days than 10,000 electric vehicles save in a year, according to Santos.
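Santos' comparison can be sanity-checked with back-of-envelope arithmetic. The sketch below uses the 1.7 million tonnes per annum full injection rate cited in the article; the per-vehicle figure of roughly 1.8 tonnes of CO2 avoided per EV per year is an assumed illustrative value, not one given by Santos.

```python
# Rough check of the "four days vs 10,000 EVs" comparison.
# Assumption: ~1.8 t CO2 avoided per EV per year (illustrative, not from the article).
full_rate_tpa = 1.7e6                    # t CO2e/yr at full injection (from article)
four_days = full_rate_tpa * 4 / 365      # t CO2e stored in four days

ev_saving_tpa = 1.8                      # assumed t CO2 avoided per EV per year
fleet_saving = 10_000 * ev_saving_tpa    # 10,000 EVs over one year

print(round(four_days), fleet_saving)    # ~18,630 t vs 18,000 t: comparable magnitudes
```

Under these assumed figures, four days of full-rate injection does indeed exceed the annual savings of a 10,000-vehicle EV fleet, though the margin depends heavily on the per-EV assumption.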

“And in just one year, Moomba CCS will achieve around 28 percent of the total emissions reduction achieved by Australia’s entire electricity sector in 2023”, it said.

The project injects into depleted reservoirs near the Moomba oil and gas gathering and processing complex, which serves the onshore Cooper and Eromanga basins. Santos operates Moomba CCS with a 66.7 percent stake. Beach Energy Ltd., also a local oil and gas exploration and production player, holds the remaining interest.

Santos chief executive and managing director Kevin Gallagher commented, “In bringing this project to fruition, I believe we have also started an incredible new chapter in Australia’s energy transition, which will lead us to become a carbon capture and storage superpower”.

“Policymakers should seize the opportunity to deploy CCS to reduce emissions faster, at scale and cost competitively – particularly when Australia has a unique and natural advantage in carbon capture and storage that is complemented by a well-established, world-class regulatory regime administered by the Clean Energy Regulator”, Gallagher added.

“CCS is the one technology with real potential to abate emissions at scale and that’s why projects like Moomba CCS are so important to help make net zero a reality,” Gallagher said.

Australian Potential

Santos aims to establish a carbon storage business with a capacity to permanently store 14 million metric tons per annum (MMtpa) of third-party CO2e emissions by 2040, it declared November 19, 2024.

The Cooper and Eromanga basins alone hold the potential for injection of up to 20 MMtpa of CO2e for up to 50 years, according to Santos.

“Australia has a natural competitive advantage in CCS with known high-quality, stable geological storage basins capable of injection at a rate of 300 million tonnes per annum for at least 100 years”, it said in the Moomba CCS update.

Third-Party Emissions

Santos has secured agreements with potential domestic and overseas clients for its first CCS project. This interest “provides strong momentum for Moomba CCS phase two as a commercial service”, Gallagher said October 17, 2024, as the company announced start-up.

In 2023, two Japanese companies came on board Moomba CCS after Australia passed controversial legislation allowing foreign companies to ship CO2 through Australian waters.

“The signing of a memorandum of understanding between Santos, JX Nippon Oil & Gas Exploration Corporation and ENEOS Corporation paves the way for a joint feasibility study that will evaluate the potential to capture, transport and sequester emissions from Japan, supporting expansion of the Moomba CCS project”, Santos said December 18, 2023.

“The aggregation and management of carbon at Moomba would also support Santos’ Energy Solutions low-carbon fuels ambitions and complement current studies with Tokyo Gas and Osaka Gas for potential low-carbon e-methane production in the Cooper Basin.

“This would facilitate the export of e-methane, made by combining green hydrogen with CO2 obtained from industrial emissions or direct air capture in a circular economy”.

The companies agreed to study the potential of importing CO2 from Japan to Moomba CCS via either Gladstone, Queensland, or Port Bonython, South Australia. The import targets are 5 MMtpa by 2030, 10 MMtpa by 2035 and 20 MMtpa by 2040.

“The CO2 target set for 2040 in this study is equivalent to delivering – every year – triple the total emissions reduction achieved in Australia’s electricity sector last year [2022]”, Gallagher said then. “It also equates to around two-thirds of Santos’ total annual Scope 3 emissions today or almost four times our current annual Scope 1 and 2 emissions – it’s huge”.

Santos had signed a similar agreement with Australian power producer and gas distributor APA Group. “The collaboration will include an assessment of CCS pipeline transport routes from key emission sources in Gladstone, Port Bonython and Greater Sydney to the Moomba CCS facility in the Cooper Basin”, Santos said November 22, 2023.

Santos also announced at the time an agreement with Abu Dhabi National Oil Co. for potential cooperation in the development of CCS technologies and the provision of CCS services such as shipping, as well as potential joint investments in CCS projects.

Gorgon CCS

The IEEFA report last November said there has been a trend of underperformance among CCS projects across the globe and that the Gorgon CCS project calls into question “the financial viability of ambitious CCS plans by Australian governments and companies”.

Gorgon CCS captured 44 percent of CO2 removed during gas extraction between FY 2019-20 and FY 2023-24, according to the Cleveland, Ohio-based think tank.

Chevron Australia did not respond to a request for comment emailed by Rigzone about the IEEFA analysis. However, it says in an online project fact sheet that it is "committed to increasing carbon dioxide injection rates at Gorgon CCS in accordance with its environmental approvals".

“A project has commenced that aims to expand the system’s capacity to manage water found within the reservoir where carbon dioxide is stored, thereby reducing reservoir pressure and enabling increased carbon dioxide injection rates”, the fact sheet says.

“In addition to this project, Chevron Australia continues to explore options to further increase carbon dioxide injection rates within the system”, it adds.

To contact the author, email [email protected]

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Yea or nay: Will Nvidia H200 chips go to China?

He noted, “the broader implications and potential impacts may signal to enterprise customers of Nvidia that perhaps they don’t need the latest and greatest GPUs from [them] either to achieve acceptable results across select AI workloads. It is doubtful that Nvidia would commission additional production issues for H200 without China

Read More »

Chinese AI firm trains state-of-the-art model entirely on Huawei chips

The pricing positions GLM-Image as a cost-effective option for enterprises generating marketing materials, presentations, and other text-heavy visual content at scale. Technical approach and benchmark performance GLM-Image employs a hybrid architecture combining a 9-billion-parameter autoregressive model with a 7-billion-parameter diffusion decoder, according to Zhipu’s technical report. The autoregressive component handles

Read More »

Analyst Explains Why Feb NatGas Contract Collapsed Wednesday

In an EBW Analytics Group report sent to Rigzone by the EBW team on Thursday, Eli Rubin, an energy analyst at the company, highlighted that the February natural gas contract “collapsed” yesterday. Rubin outlined in the report that the February natural gas contract fell to $3.068 per million British thermal units (MMBtu) on Wednesday “on (i) chances for a dissipating Alaska ridge opening milder February risks and (ii) a Webber Research report that Golden Pass LNG Trains 2-3 may be delayed until 2027”. “Weakness was compounded by volatility: yesterday’s $3.120 close is within 1.1 cents of Friday’s low,” Rubin added. In the report, Rubin pointed out that daily LNG demand “dropped to a two month low” yesterday, “mitigating weather driven Henry Hub spot price upside to clear at $3.12 per MMBtu”. He also noted that “LNG could jump 3.5 billion cubic feet per day – adding to a 12.4 billion cubic foot per day increase in weather-driven demand into Tuesday”. Rubin went on to outline in the report that “consensus projections” for the U.S. Energy Information Administration’s (EIA) next weekly natural gas storage report – which is scheduled to be released later today and will include data for the week ending January 9 – “are for an 87-91 billion cubic foot draw”. “The bigger story is likely to be rising physical market strength into a cold Martin Luther King holiday weekend,” Rubin added. “Healthy storage surpluses suggest NYMEX futures may try to continue to look past near term cold, however,” he continued. The EBW report highlighted that the February natural gas contract closed at $3.120 per MMBtu on Wednesday. It outlined that this marked a 29.9 cent, or 8.7 percent, decrease from Tuesday’s close. In Thursday’s report, EBW predicted a “mixed signals” trend for the NYMEX front-month natural gas contract

Read More »

ENAP Secures Deal to Use Oxiquim Terminal

Chile’s National Petroleum Company (ENAP) has signed an agreement with Oxiquim allowing ENAP to transfer and store fuels at Oxiquim’s terminal in the municipality of Coronel, Región del Biobío. The agreement allows ENAP to use the Escuadrón Maritime Terminal starting this quarter, ENAP said in a press release. Oxiquim general manager Cecilia Pardo, in comments about the deal, noted of “limited infrastructure available for fuel storage and difficulty in executing new projects”.  Escuadrón is one of three terminals operated by Oxiquim for bulk liquid raw materials and chemicals, fuels and liquefied petroleum gas in Chile, Oxiquim says on its website. The other two are the Mejillones and Quintero terminals. “The agreement is consistent with ENAP’s recent announcement to consolidate its logistics business as a key pillar of the company, incorporating strategic partners and high service standards”, ENAP said. “This is based on the fact that ENAP has logistics assets exceeding $3 billion and that its operations span from Arica to Punta Arenas”. ENAP general manager Julio Friedmann said, “[S]ome of the main areas of work in the immediate term are to generate greater storage capacity and to develop future partnerships with third parties, always for the purpose of strengthening fuel supply throughout the territory”. Last year ENAP consolidated its logistics assets under a “Corporate Logistics Management” and announced an investment plan of $540 million over the next five years for its logistics business. “Among the main areas of focus of this new management are operational excellence, efficiency and profitability of the logistics chain, and the adaptation of infrastructure to meet current and future needs”, ENAP said in an online statement September 5, 2025. “Currently, ENAP manages logistics assets valued at over $3 billion, including a large land fleet, 11 ships, 830 kilometers [515.74 miles] of pipelines, three million cubic meters [105.94 million

Read More »

Petrofac Urges Creditors to Agree ‘Compromise’ to Clear CBI Sale

Petrofac on Wednesday called on certain creditors to sign a “voluntary arrangement” regarding their claims, to allow the British energy engineering company to proceed with a recent agreement to sell its Asset Solutions business to CB&I. Petrofac said in an online statement it has scheduled a meeting of Petrofac Facilities Management Ltd’s creditors on January 30 for a vote on “a proposed Company Voluntary Arrangement (CVA) related to the sale”. “The CVA is intended to compromise certain creditor claims to enable the completion of the sale”, the statement said. “Trade creditors, employees and certain other parties are not affected by the CVA. “Operations will continue as normal during the CVA process and Asset Solutions will continue to serve its customers and work with suppliers as usual”. Petrofac did not disclose the terms of the voluntary arrangement. “The agreed sale of Asset Solutions to CB&I is a strong outcome for the business and for around 3,000 colleagues who are expected to move across on completion”, said John Pearson, chief operating officer for Petrofac’s Asset Solutions business. “After more than two years of restructuring, this CVA is the final step to deliver that outcome, and we are asking creditors to support it so the sale can complete”. Petrofac and CB&I, a maker of storage facilities, tanks and terminals, announced their agreement December 24, 2025. “The consideration for the sale has been agreed on a debt-free cash-free basis”, Petrofac said then. “The ultimate net proceeds from the transaction will depend on the quantum of various deductions which will only be confirmed closer to completion. However, the administrators of the company expect the net proceeds to be in the range of $45m to $55m”. “The net proceeds will be distributed to the secured creditors in accordance with the intercreditor agreement entered into between the company and the secured creditors on 26

Read More »

Oil Prices Retreat After 5 Session Rebound

Oil prices retreated after a five-session rebound as concerns about the geopolitical situation in the Middle East abated to some extent. That’s what Daniel Takieddine, Co-founder and CEO of Sky Links Capital Group, said in a market analysis sent to Rigzone today, adding that “the announcement that the U.S. administration has put its latest move in the area on hold tempered immediate fears of supply disruption in an important oil producing region”. Takieddine warned, however, that geopolitical risk has not disappeared. “Tensions across major producing regions remain elevated, and developments in Eastern Europe and Latin America continue to inject uncertainty into energy supply chains,” he said in the analysis. “In parallel, U.S. crude inventory data showed a strong increase in stockpiles, reinforcing the perception that physical inventory remains plentiful and that recent gains were running ahead of underlying demand conditions,” he added. Looking ahead, Takieddine stated in the analysis that any renewed escalation in the Middle East or other producing regions would quickly push crude prices back to the upside. “However, over the long term, the concerns over an oversupplied market remain a weight on prices and could pull oil toward new lows if geopolitical risks abate more consistently,” he said. In a market comment sent to Rigzone on Thursday, Naeem Aslam, Chief Analyst at Zaye Capital Markets, noted that “crude oil prices remain volatile as markets continuously reprice the geopolitical risk premium against underlying supply-demand fundamentals”. “The latest price swings have largely been driven by shifting Middle East dynamics, including troop movements, Iran-related tension, and broader security posturing, which lifted risk premium earlier, while recent signs of de-escalation have triggered a pullback,” Aslam said in the comment. 
“Meanwhile, recent economic data reinforced resilient consumption and steady housing activity, supporting the view that demand remains intact but not overheating enough to

Read More »

Woodside, JERA Finalize Winter LNG Deal for Japan

JERA Co Inc and Woodside Energy Group Ltd have finalized an agreement for the Australian company to export liquefied natural gas (LNG) to the Japanese power utility in the winter months for five years. The companies already have an existing contract under which Woodside is to deliver 400,000 metric tons a year of LNG to Japan. The 10-year agreement is to fulfill its first cargo this year on a delivered basis, as announced September 18, 2024. Under the new deal, Woodside will deliver the volumes on an ex-ship basis between the months of December and February from 2027. Three cargoes totaling about 200,000 metric tons will be delivered per year, JERA said in a press release Wednesday. “Gas-fired power generation plays a critical role in meeting peak energy demand and balancing seasonal fluctuations – challenges that are becoming more pronounced as renewable energy expands”, JERA said. “Securing sufficient LNG supply during the winter months, when global gas demand surges, is essential to maintaining a stable electricity supply in Japan”. “The volumes supplied to JERA will be sourced from Woodside’s global LNG portfolio, leveraging assets such as Scarborough, North West Shelf, Pluto LNG and LALNG, once it is operational”, Woodside said in a separate online statement about the new agreement. JERA owns 15.1 percent in the Scarborough field offshore Western Australia, part of the broader Scarborough Energy Project. Scarborough Energy is on track to start exporting LNG in the second half of this year, Woodside affirmed Tuesday as it announced the arrival of the project’s floating production unit. Scarborough Energy includes the development of the Scarborough field off the coast of Karratha, the construction of a second gas processing train for Pluto LNG with a capacity of five million metric tons per annum (MMtpa) and modifications to Pluto Train 1, according to

Read More »

OEUK Highlights ‘Key to Delivering North Sea Future Plan’

Cross-sector collaboration and close consultation of industry experts will be key to delivering the North Sea Future Plan (NSFP). That’s what Mark Wilson, HSE and Operations Director at industry body Offshore Energies UK (OEUK), said in a release sent to Rigzone recently, which highlighted that OEUK will host its 2026 HSE conference on February 25, “bringing together leaders from across sectors to shape the future of the North Sea as a multi-purpose energy basin”. “Under the theme ‘Trust. Transparency. Transformation’, the conference will kick off with a pivotal session based on the UK government’s North Sea Future Plan, which will explore how co-location between offshore wind, oil and gas, and carbon capture and storage (CCUS) can deliver a safer, cleaner basin,” OEUK said in its release. The UK Department for Energy Security and Net Zero (DESNZ) announced the NSFP in a statement posted on its site back in November 2025. “We look forward to welcoming industry leaders, technical experts, regulators, and innovators to this one-day event,” Wilson said in the OEUK release. “The first session of the day is a great opportunity to hear how improved guidance, aligned standards, better data sharing, and early engagement can enable offshore energies to achieve safe and sustainable co-location across the North Sea and support the plan’s integrated vision,” he added. OEUK noted in its release that its event explores the themes of trust, transparency, and transformation across all areas of health, safety, and environment, including emissions performance. It added that the conference features full audience events as well as a series of parallel breakout sessions throughout the day. In its statement published in November last year, DESNZ announced the publication of a “landmark plan to protect existing jobs and deliver the next generation of good, new jobs”.  “The government is charting a new course for the North Sea, providing a

Read More »

Google warns transmission delays are now the biggest threat to data center expansion

The delays stem from aging transmission infrastructure unable to handle concentrated power demands. Building regional transmission lines currently takes seven to eleven years just for permitting, Hanna told the gathering. Southwest Power Pool has projected 115 days of potential loss of load if transmission infrastructure isn’t built to match demand growth, he added. These systemic delays are forcing enterprises to reconsider fundamental assumptions about cloud capacity. Regions including Northern Virginia and Santa Clara that were prime locations for hyperscale builds are running out of power capacity. The infrastructure constraints are also reshaping cloud competition around power access rather than technical capabilities. “This is no longer about who gets to market with the most GPU instances,” Gogia said. “It’s about who gets to the grid first.” Co-location emerges as a faster alternative to grid delays Unable to wait years for traditional grid connections, hyperscalers are pursuing co-location arrangements that place data centers directly adjacent to power plants, bypassing the transmission system entirely. Pricing for these arrangements has jumped 20% in power-constrained markets as demand outstrips availability, with costs flowing through to cloud customers via regional pricing differences, Gogia said. Google is exploring such arrangements, though Hanna said the company’s “strong preference is grid-connected load.” “This is a speed to power play for us,” he said, noting Google wants facilities to remain “front of the meter” to serve the broader grid rather than operating as isolated power sources. Other hyperscalers are negotiating directly with utilities, acquiring land near power plants, and exploring ownership stakes in power infrastructure from batteries to small modular nuclear reactors, Hanna said.

Read More »

OpenAI turns to Cerebras in a mega deal to scale AI inference infrastructure

Analysts expect AI workloads to grow more varied and more demanding in the coming years, driving the need for architectures tuned for inference performance and putting added pressure on data center networks. “This is prompting hyperscalers to diversify their computing systems, using Nvidia GPUs for general-purpose AI workloads, in-house AI accelerators for highly optimized tasks, and systems such as Cerebras for specialized low-latency workloads,” said Neil Shah, vice president for research at Counterpoint Research. As a result, AI platforms operating at hyperscale are pushing infrastructure providers away from monolithic, general-purpose clusters toward more tiered and heterogeneous infrastructure strategies. “OpenAI’s move toward Cerebras inference capacity reflects a broader shift in how AI data centers are being designed,” said Prabhu Ram, VP of the industry research group at Cybermedia Research. “This move is less about replacing Nvidia and more about diversification as inference scales.” At this level, infrastructure begins to resemble an AI factory, where city-scale power delivery, dense east–west networking, and low-latency interconnects matter more than peak FLOPS, Ram added. “At this magnitude, conventional rack density, cooling models, and hierarchical networks become impractical,” said Manish Rawat, semiconductor analyst at TechInsights. “Inference workloads generate continuous, latency-sensitive traffic rather than episodic training bursts, pushing architectures toward flatter network topologies, higher-radix switching, and tighter integration of compute, memory, and interconnect.”

Read More »

Cisco’s 2026 agenda prioritizes AI-ready infrastructure, connectivity

While most of the demand for AI data center capacity today comes from hyperscalers and neocloud providers, that will change as enterprise customers delve more into the AI networking world. “The other ecosystem members and enterprises themselves are becoming responsible for an increasing proportion of the AI infrastructure buildout as inferencing and agentic AI, sovereign cloud, and edge AI become more mainstream,” Katz wrote. More enterprises will move to host AI on premises via the introduction of AI agents that are designed to inject intelligent insight into applications and help improve operations. That’s where the AI impact on enterprise network traffic will appear, suggests Nolle. “Enterprises need to host AI to create AI network impact. Just accessing it doesn’t do much to traffic. Having cloud agents access local data center resources (RAG etc.) creates a governance issue for most corporate data, so that won’t go too far either,” Nolle said.  “Enterprises are looking at AI agents, not the way hyperscalers tout agentic AI, but agents running on small models, often open-source, and are locally hosted. This is where real AI traffic will develop, and Cisco could be vulnerable if they don’t understand this point and at least raise it in dialogs where AI hosting comes up,” Nolle said. “I don’t expect they’d go too far, because the real market for enterprise AI networking is probably a couple years out.” Meanwhile, observers expect Cisco to continue bolstering AI networking capabilities for enterprise branch, campus and data centers as well as hyperscalers, including through optical support and other gear.

Read More »

Microsoft tells communities it will ‘pay its way’ as AI data center resource usage sparks backlash

It will work with utilities and public commissions to set the rates it pays high enough to cover data center electricity costs (including build-outs, additions, and active use). “Our goal is straightforward: To ensure that the electricity cost of serving our data centers is not passed on to residential customers,” Smith emphasized. For example, the company is supporting a new rate structure Wisconsin that would charge a class of “very large customers,” including data centers, the true cost of the electricity required to serve them. It will collaborate “early, closely, and transparently” with local utilities to add electricity and supporting infrastructure to existing grids when needed. For instance, Microsoft has contracted with the Midcontinent Independent System Operator (MISO) to add 7.9GW of new electricity generation to the grid, “more than double our current consumption,” Smith noted. It will pursue ways to make data centers more efficient. For example, it is already experimenting with AI to improve planning, extract more electricity from existing infrastructure, improve system resilience, and speed development of new infrastructure and technologies (like nuclear energy). It will advocate for state and national public policies that ensure electricity access that is affordable, reliable, and sustainable in neighboring communities. Microsoft previously established priorities for electricity policy advocacy, Smith noted, but “progress has been uneven. This needs to change.” Microsoft is similarly committed when it comes to data center water use, promising four actions: Reducing the overall amount of water its data centers use, initially improving it by 40% by 2030. The company is exploring innovations in cooling, including closed-loop systems that recirculate cooling liquids. It will collaborate with local utilities to map out water, wastewater, and pressure needs, and will “fully fund” infrastructure required for growth. 
For instance, in Quincy, Washington, Microsoft helped construct a water reuse utility that recirculates

Read More »

Can retired naval power plants solve the data center power crunch?

HGP’s plan includes a revenue share with the government, and the company would create a decommissioning fund, according to Bloomberg. The alternative? After a lengthy decommissioning process, the reactors are shipped to a remote storage facility in Washington state together dust along with dozens of other retired nuclear reactors. So the carrier itself isn’t going to be turned into a data center, but its power plants are being proposed for a data center on land. And even with the lengthening decommissioning process, that’s still faster than building a nuclear power plant from scratch. Don’t hold your breath, says Kristen Vosmaer, managing director, JLL Work Dynamics Data Center team. The idea of converting USS Nimitz’s nuclear reactors to power AI data centers sounds compelling but faces insurmountable obstacles, he argues. “Naval reactors use weapons-grade uranium that civilian entities cannot legally possess, and the Nuclear Regulatory Commission has no pathway to license such facilities. Even setting aside the fuel issue, these military-designed systems would require complete reconstruction to meet civilian safety standards, eliminating any cost advantages over purpose-built nuclear plants,” Vosmaer said. The maritime concept itself, however, does have some merit, said Vosmaer. “Ocean cooling can reduce energy consumption compared to land-based data centers, and floating platforms offer positioning flexibility that fixed facilities cannot match,” Vosmaer said.

Read More »

What exactly is an AI factory?

Others, however, seem to use the word to mean something smaller than a data center, referring more to the servers, software, and other systems used to run AI. For example, the AWS AI Factory is a combination of hardware and software that runs on-premises but is managed by AWS and comes with AWS services such as Bedrock, networking, storage and databases, and security.  At Lenovo, AI factories appear to be packaged servers designed to be used for AI. “We’re looking at the architecture being a fixed number of racks, all working together as one design,” said Scott Tease, vice president and general manager of AI and high-performance computing at Lenovo’s infrastructure solutions group. That number of racks? Anything from a single rack to hundreds, he told Computerworld. Each rack is a little bigger than a refrigerator, comes fully assembled, and is often fully preconfigured for the customer’s use case. “Once it arrives at the customer site, we’ll have service personnel connect power and networking,” Tease said. For others, the AI factory concept is more about the software.

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs).  In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple would between them devote $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it has become a regular among non-tech companies showing off technology at CES, the big tech trade show in Las Vegas, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as the frontier LLMs themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for enterprises and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
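The LLM-as-judge idea mentioned above can be sketched in a few lines. This is an illustrative pattern only, not any provider's official API: `call_model` is a hypothetical stand-in you would wire to your own chat-completion client, and the model names are placeholders.

```python
# Minimal sketch of the LLM-as-judge ensemble pattern: several cheap models
# vote on whether a candidate answer passes, so no single model's mistake
# decides the outcome. `call_model` is a hypothetical placeholder.

def call_model(model: str, prompt: str) -> str:
    """Hypothetical wrapper around a chat-completion API; wire to your SDK."""
    raise NotImplementedError("connect this to your provider's client")

def judge_with_ensemble(question: str, candidate_answer: str,
                        judge_models: list[str], caller=call_model) -> bool:
    """Ask each judge model to grade the answer; accept on a majority vote."""
    votes = 0
    for model in judge_models:
        prompt = (
            f"Question: {question}\n"
            f"Answer: {candidate_answer}\n"
            "Is this answer correct and safe? Reply with exactly PASS or FAIL."
        )
        if caller(model, prompt).strip().upper().startswith("PASS"):
            votes += 1
    # Majority vote across three or more judges dilutes any one model's
    # hallucination, which is why falling model prices make this practical.
    return votes > len(judge_models) / 2
```

In practice you would also randomize answer ordering and log each judge's verdict, but the majority-vote core is the part the passage describes.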

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. 
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
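The second paper's emphasis on rewarding *diverse* attacks can be illustrated with a toy filter. This is not OpenAI's actual framework: it is a minimal sketch of the underlying idea, with novelty approximated by simple word-overlap distance from attacks already kept.

```python
# Toy sketch of diversity-driven red teaming: keep a candidate attack prompt
# only if it is sufficiently different from attacks already collected.
# Real systems use learned rewards; word overlap stands in for them here.

def novelty(candidate: str, kept: list[str]) -> float:
    """Fraction of the candidate's words not seen in any kept attack."""
    words = set(candidate.lower().split())
    if not words:
        return 0.0
    seen: set[str] = set()
    for attack in kept:
        seen |= set(attack.lower().split())
    return len(words - seen) / len(words)

def select_diverse_attacks(candidates: list[str],
                           threshold: float = 0.5) -> list[str]:
    """Greedily keep candidates whose novelty score clears the threshold."""
    kept: list[str] = []
    for cand in candidates:
        if not kept or novelty(cand, kept) >= threshold:
            kept.append(cand)
    return kept
```

A near-duplicate of an earlier attack scores low novelty and is dropped, which mirrors why a reward that pays only for *effective* attacks, without a diversity term, tends to collapse onto one attack family.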

Read More »