OPEC+ Damocles Sword Hanging Over Shale Oil Producers

In a report sent to Rigzone by the Skandinaviska Enskilda Banken AB (SEB) team on Tuesday morning, Bjarne Schieldrop, chief commodities analyst at the company, said “the Damocles Sword of OPEC+” is “hanging over U.S. shale oil producers”.

“OPEC+ decided yesterday [Monday] to stick with its plan – to lift production by 120,000 barrels per day every month for 18 months starting April,” Schieldrop noted in the report.

“Again and again, it has pushed the start of the production increase further into the future. It could do it yet again. That will depend on circumstances of one – global oil demand growth and two – non-OPEC+ supply growth,” he added.

In the report, Schieldrop said “all oil producers in the world know that OPEC+ has … five to six million barrels per day of reserve capacity at hand” and noted that the group “wants to return two to three million barrels per day of this reserve to the market to get back to a more normal reserve level”.

“The now increasingly standing threat of OPEC+ to increase production in ‘just a couple of months’ is hanging over the world’s oil producers like a Damocles Sword. OPEC+ is essentially saying: ‘produce much more and we will do too, and you will get a much lower price’,” Schieldrop noted.   

The chief commodities analyst at SEB went on to state in the report that, “if U.S. shale oil producers embarked on a strong supply growth path, heeding calls from Donald Trump for more production and a lower oil price, then OPEC+ would have no other choice than to lift production and let the oil price fall”.

“Trump would get a lower oil price as he wishes for, but he would not get higher U.S. oil production. U.S. shale oil producers would get a lower oil price, lower income, and no higher production,” he added.

“U.S. oil production might even fall in the face of a lower oil price with lower price and volume hurting U.S. trade balance as well as producers,” he continued.

Schieldrop said in the report that lower taxes on U.S. oil producers could lead to higher oil production but added that “no growth equals lots of profits”.

“Trump could reduce taxes on U.S. oil production to lower their marginal cost by up to $10 per barrel,” Schieldrop noted in the report.

“It could be seen as a four-year time-limited option to produce more oil at a lower cost as such tax-measures could be reversed by the next president in four years. It would be very tempting for them to produce more,” he added.

Rigzone has contacted OPEC, the American Petroleum Institute (API), the International Association of Oil & Gas Producers (IOGP), the Trump transition team, the White House, and the U.S. Department of Energy (DOE) for comment on the SEB report. At the time of writing, none of the above have responded to Rigzone’s request yet.

A statement posted on OPEC’s website on December 5 highlighted that OPEC+’s required production level for 2025 and 2026 is 39.725 million barrels per day. That statement pointed out that the required production level for Saudi Arabia, Russia, Iraq, United Arab Emirates, Kuwait, Kazakhstan, Algeria, and Oman “is before applying any additional production adjustments”. It also noted that UAE required production has been increased by 300,000 barrels per day and added that this increase will be phased in gradually starting April 2025 until the end of September 2026.

A separate statement posted on the OPEC site on the same day revealed that Saudi Arabia, Russia, Iraq, United Arab Emirates, Kuwait, Kazakhstan, Algeria, and Oman decided to extend additional voluntary adjustments of 1.65 million barrels per day, which were announced in April 2023, until the end of December 2026.

That statement also revealed that those countries will extend additional voluntary adjustments of 2.2 million barrels per day, which were announced in November 2023, until the end of March 2025. These will be “gradually phased out on a monthly basis until the end of September 2026”, that statement highlighted. Spread evenly over the 18 months from April 2025 through the end of September 2026, unwinding that 2.2 million barrels per day works out to roughly 120,000 barrels per day each month, in line with the schedule Schieldrop described. The statement also noted that “this monthly increase can be paused or reversed subject to market conditions”.

To contact the author, email [email protected]

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, Bitcoin, and AI solutions. Explore these articles to uncover strategies and insights shaping the future of these industries.

Equinor starts production at Bacalhau field offshore Brazil

Read More »

AI’s dark side shows in Gartner’s top predictions for IT orgs

Increasing legal claims over AI-induced safety problems related to autonomous vehicle or medical accidents are also a mounting concern, Plummer stated. By the end of 2026, “death by AI” legal claims will exceed 1,000 globally due to insufficient AI risk guardrails, Plummer stated. “As regulatory scrutiny intensifies, organizations will face pressure

Read More »

AI dominates Gartner’s top strategic technology trends for 2026

“AI supercomputing platforms integrate CPUs, GPUs, AI ASICs, neuromorphic and alternative computing paradigms, enabling organizations to orchestrate complex workloads while unlocking new levels of performance, efficiency and innovation. These systems combine powerful processors, massive memory, specialized hardware, and orchestration software to tackle data-intensive workloads in areas like machine learning, simulation,

Read More »

IBM signs up Groq for speedy AI inferencing option

The technology involved in the partnership will let customers use watsonx capabilities in a familiar way and allow them to use their preferred tools while accelerating inference with GroqCloud, IBM stated. “This integration will address key AI developer needs, including inference orchestration, load balancing, and hardware acceleration, ultimately streamlining the

Read More »

IEA: Global oil market to see huge oversupply

In its most recent Oil Market Monthly Report, the International Energy Agency (IEA) lowered its projections for oil demand growth for this year, while increasing its expectations for supply growth, indicating a significant supply overhang. IEA highlights that global oil inventories are already surging, particularly oil stored in tankers on water. IEA data shows that global oil demand actually expanded by 750,000 b/d year-on-year (y-o-y) in third-quarter 2025, led by a rebound in petrochemical feedstock use following the tariff-induced slowdown in second-quarter 2025. However, the agency expects oil consumption to stay subdued through the rest of 2025 and in 2026, as weaker macroeconomic conditions and rising transport electrification weigh on demand. Annual demand growth is now projected at around 700,000 b/d for both years, down from 740,000 b/d expected for 2025 in IEA’s September report. This growth is well below the 980,000 b/d pace seen in 2024 and markedly lower than the 1.3 million b/d average during the 2010s. Demand for 2025 and 2026 is now projected to be 103.8 million b/d and 104.5 million b/d, respectively. Total global oil supply increased by 760,000 b/d month-on-month (m-o-m) in September to reach 108 million b/d, driven by a 1 million b/d surge in OPEC+ output, primarily from the Middle East. In this month’s report, IEA forecasts that global oil supply will increase by 3 million b/d to reach 106.1 million b/d in 2025, followed by an additional rise of 2.4 million b/d in 2026. This projection exceeds the September forecast of 2.7 million b/d supply growth for 2025 and 2.1 million b/d growth next year.  Non-OPEC+ producers account for 1.6 million b/d of this year’s growth and 1.2 million b/d next year, led by the US, Brazil, Canada, Guyana, and Argentina. OPEC+ output is expected to add 1.4 million b/d in 2025 and

Read More »
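
The IEA item above quotes the demand and supply projections separately without stating the implied balance. As a rough illustration only, here is a minimal back-of-envelope sketch; it assumes the 2026 supply level is simply the 2025 figure plus the forecast growth, and the computed surplus is an inference from the quoted numbers, not a figure stated in the IEA report.

```python
# Back-of-envelope check of the supply-demand balance implied by the figures
# quoted in the IEA excerpt above (all values in million b/d).
# The surplus printed here is an inference from those numbers, not an IEA-stated figure.

demand = {"2025": 103.8, "2026": 104.5}        # projected global oil demand
supply = {"2025": 106.1, "2026": 106.1 + 2.4}  # assumes 2026 = 2025 level plus forecast growth

for year in ("2025", "2026"):
    surplus = supply[year] - demand[year]
    print(f"{year}: implied surplus of roughly {surplus:.1f} million b/d")
```

On those quoted figures, the implied overhang is roughly 2.3 million b/d in 2025 and about 4 million b/d in 2026, consistent with the “huge oversupply” characterization in the headline.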

bp-led Shah Deniz consortium lets $700 million in compression project contracts

The bp plc-led Shah Deniz consortium has awarded three offshore contracts to advance the Shah Deniz compression (SDC) project in Azerbaijan. The $2.9-billion SDC project was sanctioned earlier this year. The aim is to access and produce low-pressure gas reserves in Shah Deniz gas field and maximize resource recovery. In first-half 2025, the field produced around 14 billion cu m of gas and about 16 million bbl of condensate in total from the Shah Deniz Alpha and Shah Deniz Bravo platforms. Production capacity of existing Shah Deniz infrastructure is about 77.2 million cu m/day of gas. The contracts, with a combined value of about $700 million, have been awarded to the Saipem-BOS Shelf joint venture. The scope of work under the contracts includes:

• Transportation and installation of the entire SDC platform – a new 19,000-tonne platform to be installed in the Caspian Sea.

• Engineering, procurement, construction, and installation of subsea structures, including about 26 km of new offshore pipelines, to connect the SDC platform with existing Shah Deniz infrastructure.

All onshore construction activities will be carried out at Baku Deep Water Jacket Factory, operated by BOS Shelf. Offshore construction and installation will be executed using the Khankendi subsea construction vessel owned by the Shah Deniz consortium, and the Israfil Huseynov pipelay barge owned by the Azerbaijan Caspian Shipping Co. (ASCO). Both vessels will be operated by Saipem. Offshore activities are expected to begin with pin pile installation in third-quarter 2026, with completion targeted for 2029.

Shah Deniz compression project

The SDC project is expected to enable around 50 billion cu m of additional gas and about 25 million bbl of additional condensate production and export from Shah Deniz field, bp said in a project overview on its website. The project involves installation of an electrically-powered unmanned compression platform in 85

Read More »

Valeura Energy finds JV partner for Thrace deep gas play testing

Valeura Energy Inc. has entered into a new joint venture to explore for and develop hydrocarbons in the deep rights formations of the Thrace basin of northwest Türkiye. The agreement was entered into through a Valeura subsidiary, together with partner Pinnacle Turkey Inc., with a subsidiary of Transatlantic Petroleum LLC. “Despite our strategic pivot toward the Asia-Pacific region, we have maintained our conviction that the deep gas play we discovered in northwest Türkiye offers significant potential to add value to the company,” said Sean Guest, president and chief executive officer of Valeura, in a release Oct. 15. “Our drilling program from 2017 to 2019 demonstrated that there are multiple tcf of gas in place across Valeura’s lands in a deep tight gas play. We drilled three wells into this play and tested 12 separate zones – every one of which flowed gas. It is my hope that with a reinvigorated push to test the play, we will see this evolve into a commercial success,” he continued.

Thrace deep gas play, Türkiye

Valeura has held various blocks and operated in Türkiye for almost 15 years. The company continues to hold the deep rights (below 2,500 m or a pressure gradient of 0.6 psi/ft, whichever is shallower) in various exploration licenses and production leases covering a total of 955 sq km (gross) in the Thrace basin, just west of Istanbul. The current exploration phase for most of the acreage (lands held under exploration license) expires June 27, 2026, but discussions are under way with the government for a 2-year appraisal period extension, the company said.

Read More »

EIA: US crude inventories up 3.5 million bbl

US crude oil inventories for the week ended Oct. 10, excluding the Strategic Petroleum Reserve, increased by 3.5 million bbl from the previous week, according to data from the US Energy Information Administration (EIA). While the federal government remains shut down, the data was released 1 day later than usual due to the EIA’s regular holiday schedule. At 423.8 million bbl, US crude oil inventories are about 4% below the 5-year average for this time of year, the EIA report indicated. EIA said total motor gasoline inventories decreased by 300,000 bbl from last week and are slightly below the 5-year average for this time of year. Finished gasoline inventories and blending components inventories both decreased last week. Distillate fuel inventories decreased by 4.5 million bbl last week and are about 7% below the 5-year average for this time of year. Propane-propylene inventories increased by 1.9 million bbl from last week and are 11% above the 5-year average for this time of year, EIA said. US crude oil refinery inputs averaged 15.1 million b/d for the week ended Oct. 10, about 1.2 million b/d less than the previous week’s average. Refineries operated at 85.7% of capacity. Gasoline production decreased, averaging 9.4 million b/d. Distillate fuel production decreased by 577,000 b/d, averaging 4.6 million b/d. US crude oil imports averaged 5.5 million b/d, down 878,000 b/d from the previous week. Over the last 4 weeks, crude oil imports averaged 6.1 million b/d, 2.4% less than the same 4-week period last year. Total motor gasoline imports averaged 532,000 b/d. Distillate fuel imports averaged 160,000 b/d.

Read More »

Falling Saudi oil demand highlights power generation progress

During the first 7 months of 2025, combined demand for direct crude burn, fuel oil, and gasoil declined by nearly 100,000 b/d year on year, most noticeably in the summer months, when electricity demand typically peaks. The reduction came despite a 1.6% rise in cooling degree days (CDDs) and continued demographic pressures, with the working-age population expected to grow by roughly 6% this year. “While the availability of prompt data relating to electricity and natural gas is comparatively limited, the most likely driver of the fall in oil use is rising power output from other sources, especially gas. Increasing natural gas supply and utilization has long been a focus for the Saudi energy sector and the Jafurah project, with production beginning later this year, is expected to significantly boost gas (and NGLs) output during the rest of this decade,” IEA said. IEA expects this to enable a major reduction in oil use for electricity production, in a resumption of the substantial declines achieved during the late 2010s. “While monthly data can be volatile, the figures reported for June and July suggest that this progress may be outpacing the medium-term trajectory included in our Oil 2025 report, which already saw Saudi Arabian oil demand dropping by more than any country by 2030.” August and September temperatures were broadly consistent with recent seasonal norms, and CDDs were essentially flat year on year. In recent years, use of power plant input products has been less than half as responsive to underlying cooling requirements as it was during 2010-2016, and ‘base load’ winter deliveries appear to have fallen by more than 100,000 b/d since the pandemic. “Barring an unusually hot October and November, it is likely that total 2025 Saudi oil consumption will drop slightly, despite strong rises in GDP and population. With accelerating

Read More »

ExxonMobil Guyana advances to Phase 2 for Hammerhead FPSO

ExxonMobil Guyana Ltd. has let a Phase 2 contract to MODEC Inc. for a floating production, storage, and offloading (FPSO) vessel for the Hammerhead project. The contract is for a full engineering, procurement, construction, and installation (EPCI) scope of work and follows the Phase One front-end engineering and design (FEED) contract. In April 2025, MODEC received a limited notice to proceed (LNTP) enabling it to commence FPSO design activities to support the earliest possible startup in 2029, subject to required government approvals. Phase One has since been completed and MODEC is advancing Phase Two. The Hammerhead FPSO will have the capacity to produce 150,000 b/d of oil, along with associated gas and water. It will be moored at a water depth of about 1,025 m using a spread mooring system. Hammerhead will be MODEC’s second FPSO for use in Guyana, following Errea Wittu, which is being built for ExxonMobil Guyana’s Uaru project. As with Uaru, MODEC will provide ExxonMobil with operations and maintenance services for the FPSO for 10 years from first oil.

Read More »

Storage constraints add to AI data center bottleneck

AI deployment uses multiple storage layers, and each one has different requirements, says Dell’Oro’s Fung. For storing massive amounts of unstructured, raw data, cold storage on HDDs makes more sense, he says. SSDs make sense for warm storage, such as for pre-processing data and for post-training and inference. “There’s a place for each type of storage,” he says.

Planning ahead

According to Constellation’s Mehta, data center managers and other storage buyers should prepare by treating SSD procurement like they do GPUs. “Multi-source, lock in lanes early, and engineer to standards so vendor swaps don’t break your data path.” He recommends qualifying at least two vendors for both QLC and TLC and starting early. TrendForce’s Ao agrees. “It is better to build inventory now,” he says. “It is difficult to lock in long-term deals with suppliers now due to tight supply in 2026.” Based on suppliers’ availability, Kioxia, SanDisk, and Micron are in the best position to support 128-terabyte QLC enterprise SSD solutions, Ao says. “But in the longer term, some module houses may be able to provide similar solutions at a lower cost,” Ao adds. “We are seeing more module houses, such as Phison and Pure Storage, supporting these solutions.” And it’s not just SSD for fast storage and HDD for slow storage. Memory solutions are becoming more complex in the AI era, says Ao. “For enterprise players with smaller-scale business models, it is important to keep an eye on Z-NAND and XL-Flash for AI inference demand,” he says. These are memory technologies that sit somewhere between the SSDs and the RAM working memory. “These solutions will be more cost-effective compared to HBM or even HBF [high bandwidth flash],” he says.

Read More »

AI gold rush sparks backlash against Core Scientific acquisition

Meanwhile, in a release issued last week, CoreWeave stated, “it has been unequivocal — to Core Scientific and publicly — that we will not modify our offer. Our offer is best and final.” Alvin Nguyen, senior analyst at Forrester Research, said what happens next with the overall data center market “depends on when AI demand slows down (when the AI bubble bursts).” He added, “if AI demand continues, prices continue to go up, and data centers change in terms of preferred locations (cooler climates, access to water, lots of space, more remote), use of microgrids/energy production, expect [major] players to continue to dominate.” However, said Nguyen, “if that slowdown is soon, then prices will drop, and the key players will need to either unload property or hold onto them until AI demand builds back up.”

Generational shift occurring

Asked what the overall effect of AI will be on CIOs in need of data center capacity, he said, “the new AI mega-factories alter data center placement: you don’t put them near existing communities because they demand too much power, water, land, you build them somewhere remote, and communities will pop up around them.” Smaller data centers, said Nguyen, “will still consume power and water in contention with their neighbors (industrial, commercial, and residential), potentially limiting their access or causing costs to rise. CIOs and Network World readers should evaluate the trade-offs/ROI of not just competing for data center services, but also for being located near a new data center.”

Read More »

Why cloud and AI projects take longer and how to fix the holdups

No. 2 problem: Unrealistic expectations lead to problematic requirements

Early planning and business case validation show that the requirements set for the project can’t be met, which then requires a period of redefinition before real work can start. This situation – reported by 69% of enterprises – leads to an obvious question: Is it the requirements or the project that’s the problem? Enterprises that cite this issue say it’s the former, and that it’s how the requirements are set that’s usually the cause. In the case of the cloud, the problem is that senior management thinks that the cloud is always cheaper, that you can always cut costs by moving to the cloud. This is despite the recent stories on “repatriation,” or moving cloud applications back into the data center. In the case of cloud projects, most enterprise IT organizations now understand how to assess a cloud project for cost/benefit, so most of the cases where impossible cost savings are promised are caught in the planning phase. For AI, both senior management and line department management have high expectations with respect to the technology, and in the latter case may also have some experience with AI in the form of as-a-service generative AI models available online. About a quarter of these proposals quickly run afoul of governance policies because of problems with data security, and half of this group dies at this point. For the remaining proposals, there is a whole set of problems that emerge. Most enterprises admit that they really don’t understand what AI can do, which obviously makes it hard to frame a realistic AI project. The biggest gap identified is between an AI business goal and a specific path leading to it. One CIO describes the projects offered by user organizations as “invitations to AI fishing

Read More »

Riverbed tackles AI data bottleneck with new Oracle-based service

“Customers are looking for faster, more secure ways to move massive datasets so they can bring AI initiatives to life,” said Sachin Menon, Oracle’s vice president of cloud engineering, in a statement. “With Riverbed Data Express Service deployed on OCI, organizations will be able to accelerate time to value, reduce costs, and help ensure that their data remains protected.” Riverbed’s Aras explains that its Data Express Service uses post-quantum cryptography (PQC) to move petabyte-scale datasets through secure VPN tunnels to ensure that customer data remains protected during the transfer process. The technology is based on Riverbed’s SteelHead acceleration platform running RiOS 10 software. “Our cloud-optimized technology design delivers much higher data retrieval, data movement across the network, and data write rates, through highly performant data mover instances, instance parallelization and matched network fabric configurations. The design is tailored for each cloud, to ensure maximal performance can be achieved using cloud-specific product adjustments,” Aras says. “The time for preventing harvest-now, decrypt-later is now,” Aras says, referring to the security threat where encrypted data is intercepted and stored for decryption once quantum computers become powerful enough. The Riverbed service addresses use cases spanning AI model training, inference operations, and emerging agentic AI applications. Data Express is initially deployed on Oracle Cloud Infrastructure, but Riverbed said the service will orchestrate data movement across AWS, Azure, and Google Cloud Platform, as well as on-premises data centers. General availability is planned for Q4 2025.

Read More »

Roundup: Digital Realty Marks Major Milestones in AI, Quantum Computing, Data Center Development

Key features of the DRIL include:

• High-Density AI and HPC Testing. The DRIL supports AI and high-performance computing (HPC) workloads with high-density colocation, accommodating workloads up to 150 kW per cabinet.

• AI Infrastructure Optimization. The ePlus AI Experience Center lets businesses explore AI-specific power, cooling, and GPU resource requirements in an environment optimized for AI infrastructure.

• Hybrid Cloud Validation. With direct cloud connectivity, users can refine hybrid strategies and onboard through cross connects.

• AI Workload Orchestration. Customers can orchestrate AI workloads across Digital Realty’s Private AI Exchange (AIPx) for seamless integration and performance.

• Latency Testing Across Locations. Enterprises can test latency scenarios for seamless performance across multiple locations and cloud destinations.

The firm’s Northern Virginia campus is the primary DRIL location, but companies can also test latency scenarios between there and other remote locations. DRIL rollout to other global locations is already in progress, and London is scheduled to go live in early 2026.

Digital Realty, Redeployable Launch Pathway for Veteran Technical Careers

As new data centers are created, they need talented workers. To that end, Digital Realty has partnered with Redeployable, an AI-powered career platform for veterans, to expand access to technical careers in the United Kingdom and United States. The collaboration launched a Site Engineer Pathway, now live on the Redeployable platform. It helps veterans explore, prepare for, and transition into roles at Digital Realty. Nearly half of veterans leave their first civilian role within a year, often due to unclear expectations, poor skill translation, and limited support, according to Redeployable. The Site Engineer Pathway uses real-world relevance and replaces vague job descriptions with an experience-based view of technical careers. Veterans can engage in scenario-based “job drops” simulating real facility and system challenges so they can assess their fit for the role before applying. They

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »