
Energy firm Hunting to axe Aberdeen jobs in restructure

Precision engineering group Hunting is making changes in a bid to restore profitability to its Europe, Middle East and Africa (EMEA) business and adapt to challenges in the market.

On March 18, Hunting released an update on restructuring plans for the EMEA segment. This followed a separate announcement earlier in the month about the acquisition of organic oil recovery (OOR) technology from its founding shareholders for $17.5m, in a move expected to benefit Hunting’s global operations.

Hunting announced in January that it had decided to restructure its EMEA operating segment.

This move was based on expectations of low levels of future drilling activity in the North Sea, among other factors.

The latest update said a review of the segment, including an analysis of the medium-term outlook for the European region, had led to a series of proposals that were now under consideration by Hunting’s senior leadership.

These include the proposed closure of the oil country tubular goods (OCTG) operating site in the Netherlands, with any future orders in relation to geothermal activity being fulfilled by Hunting’s UK sites; the consolidation of OCTG threading and accessories manufacturing activity into the Fordoun operating site in Aberdeen, UK; the consolidation and transfer of all well intervention manufacturing activities into Hunting’s Dubai operating site; and a reduction in the total headcount of the EMEA segment.

Hunting’s regional managing director, Graham Goodall, told Energy Voice the future orders being shifted from the Netherlands to the UK represented “some positivity for the UK side of the business”.

He added that the new Dubai facility was expected to enter service within the next few weeks. And while numbers for the anticipated headcount reductions are not being disclosed at this time, Goodall said the reductions would mainly affect the Badentoy facility in Aberdeen.

Consultation processes with employees have now started.

Goodall described the “one and only goal” of the restructuring as restoring the profitability of the EMEA segment.

“It’s no secret that there’s a challenging market out there, particularly being driven by government policy as well as a lack of spending,” he said.

This has been the case for a number of years among Hunting’s customer base, Goodall continued, “which has resulted in us getting to this point that we had to take a look at our business and then restructure to right-size our business and get us to a profitable state in the future”.

Hunting anticipates realising annualised cost savings of around $10m once the restructuring is complete and is targeting profitability for the EMEA segment by the first quarter of 2026.

Meanwhile, Goodall described the OOR technology acquisition as “positive news”, including for the UK and the North Sea.

The acquisition comprises the entire portfolio of intellectual property, including over 25 discrete patents, the distribution rights for the technology, and a laboratory located in California, US.

Once the acquisition closes, Hunting will hold the global rights for the OOR technology, putting it on a path to further accelerate commercialisation across North America and elsewhere.

OOR is an enhanced oil recovery (EOR) technology aimed at improving the ultimate recovery of oil reserves from wells, thereby extending field life and boosting economic returns. Hunting has touted the technology’s cost-efficiency, saying it reduces capital expenditure requirements while being simple to deploy. Reduced water cut during end-of-life production and lower hydrogen sulphide (H2S) levels in production offtake have also been cited as benefits of OOR.

Field trials of the technology are currently underway with numerous exploration and production companies across North America, Europe, the Middle East and Asia Pacific, Hunting said. This comes after the company announced in August 2024 that it had secured up to $60m worth of OOR contracts to support operators in the North Sea.

“Demand remains strong, with the fastest growing areas being Africa, India, APAC and North America,” Hunting’s finance director, Bruce Ferguson, told Energy Voice. “We have around 70 engaged customers looking at testing the technology, with five customers using it at full-field level. With the long sales cycle it will take 12-18 months to convert the other customers to this stage.”

Ferguson cited a previous trial of the OOR technology at CNOOC International’s Scott platform in the North Sea, which saw a 140% increase in oil production and a 50-80% reduction in H2S. This led to a full-field treatment, though it is currently on hold owing to power issues at the platform, according to Ferguson.

“Following that pilot we have now analysed over 20 oilfields in the UKCS, with demand in the North Sea continuing to grow,” he said.

“Results from the ongoing field trials are constantly being received and are positive. To date, full-field applications are 99% successful in delivering a return on investment within three months of breakthrough,” Ferguson continued.

“However, these results can usually only be made public following the publication of a peer-reviewed scientific paper. A highly successful application will be published next month which demonstrates the results from two full-field applications on two reservoirs in South Oman. These will be released as appropriate going forward.”

