Stay Ahead, Stay ONMINE

NEO Energy seeks contractors for Donan, Balloch and Lochranza decommissioning

NEO Energy has released five tenders seeking contractors to help decommission its Donan, Balloch and Lochranza fields, along with the Global Producer III floating production, storage and offloading (FPSO) vessel.

According to data from the North Sea Transition Authority’s (NSTA’s) Pathfinder database, the decommissioning campaign is expected to start in the second quarter of 2026 at the earliest, when work to disconnect the subsea infrastructure is due to commence.

This will also see the FPSO unmoored and towed to an unspecified location.

By 2027, NEO plans to begin recovering the subsea infrastructure, followed by plugging and abandoning a total of 19 wells in 2028.

To help with this, NEO Energy is looking for a contractor to perform plug and abandonment (P&A) activities on the wells. The tender is expected to be issued on 31 December 2025 and has a value of over £25 million.

The company also announced four additional tenders, each with a value of less than £25m, covering recycling the FPSO; flushing, isolating and disconnecting the subsea infrastructure from the FPSO; disconnecting the moorings and towing the FPSO; and bulk seabed clearance.

NEO Energy recently announced plans to merge its North Sea operations with Repsol Resources UK’s.

The deal will see Repsol retain $1.8 billion (£1.4bn) in decommissioning liabilities related to its legacy assets, which NEO said will enhance the cash flows of the merged business.

NEO said it expects to complete the deal during the third quarter of 2025, subject to regulatory approvals.

Ninian South.

CNRL

Canadian Natural Resources Ltd (CNRL) has issued two tenders to assist with decommissioning its Ninian field in the Northern North Sea, located east of Shetland.

The decommissioning scope consists of three areas, covering the Ninian South Platform, Ninian Central Platform and the Ninian subsea infrastructure, which includes the Strathspey, Lyell and Columba E subsea tie-backs.

While the project schedule is not fully defined, completion of the Ninian South well plug and abandonment work is anticipated in the fourth quarter of 2026, with Ninian Central to follow in the first quarter of 2029.

The two tenders have a value of over £25m and cover subsea well abandonment services, due for release on 1 July 2025, and subsea infrastructure decommissioning, removal and disposal, due for release on 1 October 2025.

CNRL began tendering for Ninian decommissioning work a year ago, seeking contractors to cover topsides and jacket removal.

© Supplied by Scottish Power
A ScottishPower offshore wind turbine.

Inch Cape

The developers of the 1.1GW Inch Cape offshore wind farm issued two tenders last month as the project looks to start construction this year.

Both contracts are sub-£25m and cover hiring subject matter experts.

The first covers a specialist to help with explosive ordnance disposal (EOD), including reviewing documentation and consulting on improvements to reduce risk.

The specialist would also advise the principal contractor and monitor the EOD contractor to ensure compliance with risk mitigation plans and procedures.

The second contract covers hiring a temporary works coordinator to assist the principal contractor in keeping temporary works compliant with health and safety standards.

The deal also covers designs and procedures for the works, which could include scaffolding and cable pull-in quadrants. The service start date was set for mid-2025.

The £3.5 billion Inch Cape offshore wind farm is being developed by Irish energy firm ESB and Edinburgh-based, Chinese-owned Red Rock Renewables.

The project will install up to 72 wind turbines and a single offshore substation off the coast of Angus, with the export cable making landfall near the Cockenzie Power Plant.

Offshore construction is due to begin in the second quarter of this year with the installation of the export cables, followed by the installation of the offshore platform. First power is expected in late 2026, with a commercial operation date in 2027.

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Linux Foundation Networking shares new AI projects, milestone releases

Beyond blueprints that provide best practices and deployment guidance, there are specific innovations inside a series of LF Networking projects. Joshipura noted that projects like Nephio simplify the deployment of cloud-native network functions with a declarative approach to service description and intent-based automation that converts operators’ desired state of

Read More »

CenterPoint Energy completes $1.2B sale of Louisiana, Mississippi gas systems

CenterPoint Energy has completed the sale of its natural gas distribution systems in Louisiana and Mississippi to Delta Utilities for $1.2 billion, the companies announced Tuesday. Sale proceeds will “support the efficient funding of what we believe is one of the most tangible long-term growth plans in the industry,” CenterPoint President and CEO Jason Wells said in a statement. “We will continue to optimize the funding of our capital investments to support safety, reliability and resiliency for the benefit of our customers and communities.” CenterPoint now has electric transmission and distribution, power generation and natural gas distribution operations that serve approximately 7 million customers in Indiana, Minnesota, Ohio and Texas. Assets sold to Delta include approximately 12,000 miles of main pipeline serving approximately 380,000 metered customers. “Completing this acquisition furthers our vision to establish modern, multi-state natural gas utilities that build stronger, more resilient communities,” Delta CEO Tim Poché said in a statement. Delta is an affiliate of private equity firm Bernhard Capital Partners. Delta is also in the final stages of acquiring Entergy’s two regulated natural gas local distribution companies in New Orleans and Baton Rouge, with the sale expected to close this summer, the company said. Entergy Louisiana’s gas business serves approximately 95,000 homes and businesses in the Baton Rouge area, and Entergy New Orleans’ gas business serves approximately 109,000 homes and businesses in New Orleans, the utility said in a February announcement of the deal. While private equity firms have been known to squeeze profit from companies before reselling, Poché told the Shreveport-Bossier City Advocate that the gas system acquisition “is not a short-term investment at all.” “We’re very attracted to natural gas, in the resiliency that it has as a very significant transition fuel within our country,” he said. For CenterPoint, the utility said in 2024 that the

Read More »

PacifiCorp aims to add 4.7 GW renewables, 1.7 GW storage by 2031

Dive Brief: PacifiCorp expects to procure through requests for proposals about 2,400 MW of solar, 2,270 MW of wind, 1,680 MW of four-hour storage and 510 MW of 100-hour iron air battery storage by the end of 2030, according to the utility company’s integrated resource plan, filed with state regulators on Monday. By the end of this year, PacifiCorp also plans to finalize a 500-MW agreement to buy electricity from TerraPower’s Natrium advanced nuclear project in Wyoming, which is set to be operating by the end of 2031, the utility company said. The cost of the project is confidential. Also, if, as expected, the Biden administration’s greenhouse gas emissions rule for power plants doesn’t take effect, PacifiCorp plans to continue operating some coal-fired generation it had previously expected to retire. The plan calls for converting 562 MW of coal-fired generation to gas, exiting 386 MW of minority-owned coal-fired capacity, and adding carbon capture equipment to two coal-fired units at the Jim Bridger power plant in Wyoming that total 700 MW. Dive Insight: PacifiCorp utilities serve about 2 million customers in California, Idaho, Oregon, Utah, Washington and Wyoming. The IRP, filed every two years with stakeholder input, provides a roadmap for supplying power to those customers over the next 20 years. The utility owns or contracts for about 5,150 MW of wind, 4,530 MW of coal, 3,850 MW of gas, 3,620 MW of solar coupled with 550 MW of storage, 1,200 MW of hydroelectric capacity and 525 MW of stand-alone storage, according to the IRP. Under its resource plan, PacifiCorp expects that coal- and gas-fired generation will account for 16% and 10% of its energy mix in 2031, down from 35% and 19% this year, respectively. Wind and solar will make up 32% and 25%, up from 24% and 10%, respectively, in

Read More »

PG&E announces microgrid awards for $43M as Sunrun joins its 2025 VPP

Dive Brief: Pacific Gas & Electric intends to award up to $43 million in grants for community microgrid projects in Northern California, the gas and electric utility said on March 26. Issued under California’s Microgrid Incentive Program, a $200 million statewide competitive grant program, the funds will support projects serving nearly 9,000 customers in Humboldt, Lake and Marin counties. Four of the projects are in tribal communities, PG&E said. Also this week, distributed energy provider Sunrun announced a “first-of-its-kind program” with PG&E to harness approximately 600 home solar-and-storage systems to provide “targeted load relief to neighborhoods identified with highly constrained electric grids,” potentially avoiding or deferring distribution grid investments. Dive Insight: In 2023, the California Public Utilities Commission approved $200 million in funding for the Microgrid Incentive Program, including $79.2 million for PG&E, $83.3 million for Southern California Edison and $17.5 million for San Diego Gas & Electric. The nine awards announced March 26 represent the first tranche of PG&E’s MIP grants, selected from a pool of 22 applicants. Applications for the second tranche open on April 3 and run through May 30, PG&E said. First-tranche projects will receive a combined $34 million for front-of-the-meter generating resources and other project costs, plus $1 million each to cover grid interconnection costs, the utility said. Eligible microgrid projects must be able to provide at least 24 consecutive hours of energy in “island mode,” interconnect on distribution lines at or below 50 kV, and have “island mode” emissions no greater than the surrounding grid, according to a PG&E fact sheet. Projects must also be located in areas vulnerable to outages due to high wildfire or earthquake risk or lower historical reliability, and serve “disadvantaged or vulnerable communities” in rural, tribal or low-income areas. The awardees will join several active microgrids in Northern and

Read More »

Energy profits hit half a trillion as consumer price cap rises

Energy profits have hit half a trillion pounds since the energy crisis, even as consumer bills rise, according to an independent analysis of company reports. Researchers at the End Fuel Poverty Coalition counted the profits declared by UK energy producers such as Shell and Equinor, as well as suppliers including British Gas and grid operators such as National Grid. Nearly half of the £500 billion of profits generated by the industry since 2020, or £207bn, were made by companies involved in the gas industry. According to Scottish Renewables chief executive Claire Mack, bill payers are ‘paying the price’ for exposure to global gas markets, with renewable power likely to provide the best value for consumers. © Supplied by Scottish Renewables. Scottish Renewables chief executive Claire Mack. The energy price cap, the ceiling on energy prices designed to protect consumers from price spikes and enforced by energy regulator Ofgem, rose by 6.4% this week. Trade union Unite protested against high energy prices on Tuesday, arguing that nobody should have to choose between ‘heating or eating’. Union members will campaign across 40 locations in the UK, with further protests planned in the coming two weeks. Prime minister Keir Starmer has meanwhile promised that clean power will lower energy bills, a pledge Kemi Badenoch, leader of the Conservatives, wants to abolish – while right-wing party Reform is also trying to leverage high bills in its political campaign. Politicians have begun to engage energy companies as they seek to invest in clean technologies and lower prices. Energy secretary Ed Miliband broached talks with Centrica this month to extend the Rough gas storage facility in the North Sea beyond 2030. A spokesperson for the Department for Energy Security and Net Zero (DESNZ) said: “We’re open to discussing proposals, just as long as it provides value for

Read More »

Hydrogen sector seeks policy certainty as UK government pledges revamped strategy

The UK hydrogen industry is calling on the government to deliver “pragmatism” and policy certainty to help the nascent sector overcome development challenges. It comes as the UK Labour government today pledged to unveil an updated hydrogen strategy later this year. The new hydrogen strategy will build on one introduced under the previous Conservative government in 2021. Addressing the Hydrogen UK annual conference in Birmingham, industry minister Sarah Jones said the updated strategy follows “a great deal of change” in the industry in recent years. “New evidence has emerged on costs, demand and expected operating patterns, and our understanding has evolved in time, both in terms of how we can best use hydrogen in energy systems, and how we can expect the hydrogen economy to develop over time,” Jones said. Clean power and economic growth: The government believes hydrogen will play a central role in two of Labour’s “guiding missions”, Jones said, delivering its clean power by 2030 target and securing economic growth. Jones said the hydrogen strategy will set out the government’s plans to “build on the progress made in recent years and seize the opportunities ahead”. In addition, Jones said the Department for Energy Security and Net Zero (DESNZ) will announce the successful projects within the second hydrogen allocation round (HAR2) “very shortly”. © Supplied by Voltalis. Picture shows industry minister Sarah Jones, Royal Society of Arts, 29/01/2025. The first allocation round (HAR1) saw 11 green hydrogen projects secure close to £2 billion in UK government funding in 2023 as part of the revenue support scheme. However, the first round fell short of securing its 250 MW capacity target, and there have been lengthy delays in securing final investment decisions from HAR1 developers. While several HAR1 developers have now signed government contracts, Jones said

Read More »

May NatGas Contract Looking to Establish New Post-Winter Trading Range

It appears that the May natural gas contract is looking to establish a new post-winter trading range into the shoulder season. That’s what Eli Rubin, an energy analyst at EBW Analytics Group, said in an EBW report sent to Rigzone by the EBW team on Wednesday, adding that, over the next seven to 10 days, normalizing weather after a bearish March and a likely bounce in LNG feedgas intake may help reinforce technical support. “Over the next 30-45 days, however, a continued erosion of storage deficits may bias late spring NYMEX contract price risks to the downside,” Rubin warned. In the EBW report, Rubin noted that the May contract “remains volatile as the NYMEX front-month contract returned recent gains to close sub-$4.00 per million British thermal units (MMBtu) [on Tuesday] for only the fourth time since February”. “Henry Hub spot prices dipped to $3.96 amid falling LNG feedgas demand readings at Sabine Pass and Corpus Christi reaching as deep as 2.0 billion cubic feet per day below late March record highs,” Rubin said in the report. “Still, while early-month pipeline nominations remain noisy, production readings suggest weakness into April – particularly in the Permian and Haynesville – to reinforce physical support,” Rubin added. “Heating demand may also bottom tomorrow before climbing into early next week,” Rubin continued. In a separate EBW report sent to Rigzone by the EBW team on Tuesday, Rubin highlighted that, on an intraday basis, the May contract “skyrocketed as much as 52.1¢ per MMBtu from Thursday’s intraday low at $3.732 to yesterday’s intraday high at $4.253”. “Still, notwithstanding Henry Hub spot rising to $4.13 per MMBtu amid stronger daily heating demand, the NYMEX front-month returned the majority of gains yesterday afternoon,” Rubin pointed out. In that report, the EBW analyst said a bearish weather shift over the past 24 hours

Read More »

European cloud group invests to create what it dubs “Trump-proof cloud services”

But analysts have questioned whether the Microsoft move truly addresses those European business concerns. Phil Brunkard, executive counselor at Info-Tech Research Group UK, said, commenting on last month’s announcement of the EU Data Boundary for the Microsoft Cloud,  “Microsoft says that customer data will remain stored and processed in the EU and EFTA, but doesn’t guarantee true data sovereignty.” And European companies are now rethinking what data sovereignty means to them. They are moving beyond having it refer to where the data sits to focusing on which vendors control it, and who controls them. Responding to the new Euro cloud plan, another analyst, IDC VP Dave McCarthy, saw the effort as “signaling a growing European push for data control and independence.” “US providers could face tougher competition from EU companies that leverage this tech to offer sovereignty-friendly alternatives. Although €1 million isn’t a game-changer on its own, it’s a clear sign Europe wants to build its own cloud ecosystem—potentially at the expense of US market share,” McCarthy said. “For US providers, this could mean investing in more EU-based data centers or reconfiguring systems to ensure European customers’ data stays within the region. This isn’t just a compliance checkbox. It’s a shift that could hike operational costs and complexity, especially for companies used to running centralized setups.” Adding to the potential bad news for US hyperscalers, McCarthy said that there was little reason to believe that this trend would be limited to Europe. “If Europe pulls this off, other regions might take note and push for similar sovereignty rules. US providers could find themselves adapting to a patchwork of regulations worldwide, forcing a rethink of their global strategies,” McCarthy said. “This isn’t just a European headache, it’s a preview of what could become a broader challenge.”

Read More »

Talent gap complicates cost-conscious cloud planning

The top strategy so far is what one enterprise calls the “Cloud Team.” You assemble all your people with cloud skills, and your own best software architect, and have the team examine current and proposed cloud applications, looking for a high-level approach that meets business goals. In this process, the team tries to avoid implementation specifics, focusing instead on the notion that a hybrid application has an agile cloud side and a governance-and-sovereignty data center side, and what has to be done is push functionality into the right place. The Cloud Team supporters say that an experienced application architect can deal with the cloud in abstract, without detailed knowledge of cloud tools and costs. For example, the architect can assess the value of using an event-driven versus transactional model without fixating on how either could be done. The idea is to first come up with approaches. Then, developers could work with cloud providers to map each approach to an implementation, and assess the costs, benefits, and risks. Ok, I lied about this being the top strategy—sort of, at least. It’s the only strategy that’s making much sense. The enterprises all start their cloud-reassessment journey on a different tack, but they agree it doesn’t work. The knee-jerk approach to cloud costs is to attack the implementation, not the design. What cloud features did you pick? Could you find ones that cost less? Could you perhaps shed all the special features and just host containers or VMs with no web services at all? Enterprises who try this, meaning almost all of them, report that they save less than 15% on cloud costs, a rate of savings that means roughly a five-year payback on the costs of making the application changes…if they can make them at all. Enterprises used to build all of

Read More »
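As a rough illustration of the payback arithmetic in the cloud-cost excerpt above (savings of under 15% implying roughly a five-year payback on the cost of the application changes), the short Python sketch below works through that ratio. All figures used here, including the annual cloud bill, savings rate and refactoring cost, are hypothetical placeholders and do not come from the article.

```python
# Hypothetical payback sketch for implementation-level cloud savings.
# None of these figures come from the article; they only illustrate the ratio.

def payback_years(annual_cloud_spend: float,
                  savings_rate: float,
                  refactor_cost: float) -> float:
    """Years for the annual savings to repay the one-off cost of the changes."""
    annual_savings = annual_cloud_spend * savings_rate
    return refactor_cost / annual_savings

# Example: a $2M/year cloud bill, 15% savings from implementation tweaks,
# and $1.5M of engineering effort to make those changes.
print(round(payback_years(2_000_000, 0.15, 1_500_000), 1))  # 5.0 years
```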

Lightmatter launches photonic chips to eliminate GPU idle time in AI data centers

“Silicon photonics can transform HPC, data centers, and networking by providing greater scalability, better energy efficiency, and seamless integration with existing semiconductor manufacturing and packaging technologies,” Jagadeesan added. “Lightmatter’s recent announcement of the Passage L200 co-packaged optics and M1000 reference platform demonstrates an important step toward addressing the interconnect bandwidth and latency between accelerators in AI data centers.” The market timing appears strategic, as enterprises worldwide face increasing computational demands from AI workloads while simultaneously confronting the physical limitations of traditional semiconductor scaling. Silicon photonics offers a potential path forward as conventional approaches reach their limits. Practical applications For enterprise IT leaders, Lightmatter’s technology could impact several key areas of infrastructure planning. AI development teams could see significantly reduced training times for complex models, enabling faster iteration and deployment of AI solutions. Real-time AI applications could benefit from lower latency between processing units, improving responsiveness for time-sensitive operations. Data centers could potentially achieve higher computational density with fewer networking bottlenecks, allowing more efficient use of physical space and resources. Infrastructure costs might be optimized by more efficient utilization of expensive GPU resources, as processors spend less time waiting for data and more time computing. These benefits would be particularly valuable for financial services, healthcare, research institutions, and technology companies working with large-scale AI deployments. Organizations that rely on real-time analysis of large datasets or require rapid training and deployment of complex AI models stand to gain the most from the technology. “Silicon photonics will be a key technology for interconnects across accelerators, racks, and data center fabrics,” Jagadeesan pointed out. “Chiplets and advanced packaging will coexist and dominate intra-package communication. The key aspect is integration, that is companies who have the potential to combine photonics, chiplets, and packaging in a more efficient way will gain competitive advantage.”

Read More »

Silicon Motion rolls SSD kit to bolster AI workload performance

The kit utilizes the PCIe Dual Ported enterprise-grade SM8366 controller with support for PCIe Gen 5 x4 NVMe 2.0 and OCP 2.5 data center specifications. The 128TB SSD RDK also supports NVMe 2.0 Flexible Data Placement (FDP), a feature that allows advanced data management and improved SSD write efficiency and endurance. “Silicon Motion’s MonTitan SSD RDK offers a comprehensive solution for our customers, enabling them to rapidly develop and deploy enterprise-class SSDs tailored for AI data center and edge server applications,” said Alex Chou, senior vice president of the enterprise storage & display interface solution business at Silicon Motion. Silicon Motion doesn’t make drives; rather, it makes reference design kits in different form factors that its customers use to build their own products. Its kits come in E1.S, E3.S, and U.2 form factors. The E1.S and U.2 forms mirror the M.2, which looks like a stick of gum and installs on the motherboard. There are PCI Express enclosures that hold four to six of those drives, plug into one card slot, and appear to the system as a single drive.

Read More »

Executive Roundtable: Cooling Imperatives for Managing High-Density AI Workloads

Michael Lahoud, Stream Data Centers: For the past two years, Stream Data Centers has been developing a modular, configurable air and liquid cooling system that can handle the highest densities in both mediums. Based on our collaboration with customers, we see a future that still requires both cooling mediums, but with the flexibility to deploy either type as the IT stack destined for that space demands. With this necessity as a backdrop, we saw a need to develop a scalable mix-and-match front-end thermal solution that gives us the ability to late bind the equipment we need to meet our customers’ changing cooling needs. It’s well understood that liquid far outperforms air in its ability to transport heat, but further to this, with the right IT configuration, cooling fluid temperatures can also be raised, and this affords operators the ability to use economization for a greater number of hours a year. These key properties can help reduce the energy needed for the mechanical part of a data center’s operations substantially.  It should also be noted that as servers are redesigned for liquid cooling and the onboard server fans get removed or reduced in quantity, more of the critical power delivered to the server is being used for compute. This means that liquid cooling also drives an improvement in overall compute productivity despite not being noted in facility PUE metrics.  Counter to air cooling, liquid cooling certainly has some added management challenges related to fluid cleanliness, concurrent maintainability and resiliency/redundancy, but once those are accounted for, the clusters become stable, efficient and more sustainable with improved overall productivity.

Read More »

Airtel connects India with 100Tbps submarine cable

“Businesses are becoming increasingly global and digital-first, with industries such as financial services, data centers, and social media platforms relying heavily on real-time, uninterrupted data flow,” Sinha added. The 2Africa Pearls submarine cable system spans 45,000 kilometers, involving a consortium of global telecommunications leaders including Bayobab, China Mobile International, Meta, Orange, Telecom Egypt, Vodafone Group, and WIOCC. Alcatel Submarine Networks is responsible for the cable’s manufacturing and installation, the statement added. This cable system is part of a broader global effort to enhance international digital connectivity. Unlike traditional telecommunications infrastructure, the 2Africa Pearls project represents a collaborative approach to solving complex global communication challenges. “The 100 Tbps capacity of the 2Africa Pearls cable significantly surpasses most existing submarine cable systems, positioning India as a key hub for high-speed connectivity between Africa, Europe, and Asia,” said Prabhu Ram, VP for Industry Research Group at CyberMedia Research. According to Sinha, Airtel’s infrastructure now spans “over 400,000 route kilometers across 34+ cables, connecting 50 countries across five continents. This expansive infrastructure ensures businesses and individuals stay seamlessly connected, wherever they are.” Gogia further emphasizes the broader implications, noting, “What also stands out is the partnership behind this — Airtel working with Meta and center3 signals a broader shift. India is no longer just a consumer of global connectivity. We’re finally shaping the routes, not just using them.”

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote $200 billion between them to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation. AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends. It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »