
US Aims to Cut Iranian Oil Flows by Over 90 Pct


Treasury Secretary Scott Bessent said the US aims to squeeze Iran’s oil exports to less than 10 percent of current levels as President Donald Trump renews a campaign of “maximum pressure” on Tehran’s nuclear program.

“We are committed to bringing the Iranians to going back to the 100,000 barrels-a-day of oil exports” shipped during Trump’s first term, Bessent said in a Friday interview with Fox Business. The country is exporting 1.5 million-1.6 million barrels per day now, he said.

Bessent also said he would escalate sanctions on Russian energy if asked to do so by the president. Oil futures jumped after his remarks, with Brent crude trading as much as 1.1 percent higher at $75.85 a barrel.

Last week, Trump signed a directive intended to ramp up economic pressure on Iran, renewing the tough stance taken during his first term, in an effort to prevent the Islamic Republic from obtaining a nuclear weapon.

Oil revenues are enabling Iran to fund “terrorist activities around the world,” Bessent said.

“The Chinese, perhaps Indians, are buying the sanctioned Iranian oil, and that is unacceptable,” he told Fox Business.

Squeezing exports to 100,000 barrels per day will leave Iran in “severe economic distress,” Bessent said. The country’s internal numbers show its economy is “quite fragile right now” as a result of “massive inflation” and a “gigantic budget deficit,” he said.

The treasury secretary’s comments struck a slightly harder tone than that adopted by Trump, who said last week he was reluctant to sign the order targeting Iran and hoped for an agreement to avoid its implementation.

In the days leading up to Trump’s inauguration on Jan. 20, oil refiners in Asia scoured the market for alternative supplies in case his administration crimped access to Iranian or Russian barrels. But some of those concerns ebbed after the announcement of his first measures, which concentrated on just three tankers.

For the time being, the OPEC member’s production remains resilient. Iran pumped an average of 3.32 million barrels a day of crude in January, close to the highest level since Trump launched his previous crackdown in 2018, according to data compiled by Bloomberg.








Bloomberg



Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Fortinet speeds threat detection with improved FortiAnalyzer

The package also now integrates with FortiAI, the vendor’s genAI assistant, to better support analytics and telemetry to help security teams speed threat investigation and response, the vendor stated. “FortiAI identifies the threats that need analysis from the data collected by FortiAnalyzer, primarily collected from FortiGates. By automating the collection,


Aryaka adds AI-powered observability to SASE platform

Nadkarni explained that Aryaka runs unsupervised machine learning models on the data to identify anomalies and outliers. For example, the models may detect a sudden spike in traffic to a domain that has not been seen before. This unsupervised analysis helps surface potential issues or areas of
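Aryaka has not published the specifics of its models, but the kind of unsupervised spike detection described here can be sketched in a few lines. The function and traffic numbers below are hypothetical illustrations, not the vendor’s algorithm; the sketch uses a robust median/MAD test so that a single large spike does not distort its own baseline.

```python
from statistics import median

def detect_spikes(counts, threshold=5.0):
    """Flag indices whose value deviates from the median by more than
    `threshold` median-absolute-deviations -- a toy stand-in for the
    unsupervised outlier detection described above."""
    med = median(counts)
    mad = median(abs(c - med) for c in counts)
    if mad == 0:
        # All baseline values identical: anything different is an outlier.
        return [i for i, c in enumerate(counts) if c != med]
    return [i for i, c in enumerate(counts) if abs(c - med) / mad > threshold]

# Hypothetical hourly request counts to one domain; the last value is a spike.
traffic = [120, 118, 125, 119, 122, 121, 117, 123, 120, 900]
print(detect_spikes(traffic))  # -> [9]
```

Using the median rather than the mean matters in small windows: a mean/standard-deviation z-score over ten points can never exceed about 2.85, so a classic "3-sigma" rule would miss even this obvious spike.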


BKV to Develop Texas CCS Facility with Midstream Firm

BKV Corporation said it is partnering with an undisclosed “leading diversified midstream company” to develop a new carbon capture and sequestration (CCS) facility at a currently operating natural gas plant in South Texas. The natural gas processing plant, located near the city of Freer in South Texas, processes natural gas produced in the Eagle Ford Shale. Under the announced partnership, BKV will purchase the carbon dioxide (CO2) waste stream from the plant, which will then be compressed, transported and permanently sequestered via BKV’s injection well at an adjacent site, the company said in a news release. BKV noted that it will retain the environmental attributes associated with the CCS project. The project reached a final investment decision (FID) in December 2024. The Texas Railroad Commission has approved the project’s Class II injection well, and a monitoring, reporting and verification (MRV) plan has been submitted to the United States Environmental Protection Agency for approval, BKV said. BKV expects the project to be fully operational in the first quarter of 2026, subject to receipt of all required permits. The facility is forecast to achieve an average sequestration rate of approximately 90,000 metric tons per year of CO2 equivalent. The new CCS facility adds to BKV’s carbon capture, utilization and sequestration (CCUS) portfolio, which includes the operational CCS facility at the company’s Barnett Zero site in North Texas and the Cotton Cove CCS project, which has reached FID and is expected to begin initial sequestration operations in the first half of 2026, according to the release. “This new CCS project addition to our portfolio further demonstrates our commitment to achieving net-zero emissions from our owned and operated upstream and natural gas midstream businesses and showcases the execution of our differentiated business model to make this vision a reality,” BKV CEO Chris


Why sub-surface gas storage will be key to Scotland’s hydrogen ambitions

Green hydrogen is set to be a major component of low-carbon energy systems of the future. Although the debate is still open on the role this zero-emissions gas will play in domestic settings, it is clear that green hydrogen – produced via the electrolysis of water using renewable energy – will have a significant role in industry. Hydrogen is already a major component in an array of industrial processes, from ammonia production for use in fertilisers to refining, and myriad other uses. But in the vast majority of cases we still rely on what is known as ‘grey’ hydrogen, produced via the highly polluting steam-reforming process using methane or natural gas. If we are going to have any hope of reaching net zero, our first step should be to replace this high-carbon feedstock with its low-carbon alternative. In parallel, we should look to use green H2 as a high-temperature fuel – for things that cannot be electrified or decarbonised in other ways – and again many of these will be industrial processes, such as the production of green steel. But if this is the case, we will need to find ways to store it – safely and affordably. With natural gas, we are able to rely on the vast natural reserves beneath the sea, and on ‘linepack’ – basically storing gas within our existing pipe network. However, hydrogen is different. Green hydrogen is a wonder gas – it can be created from water and renewable energy and produces zero emissions (when used in fuel cells). But on the flip side it is highly flammable, with much smaller molecules than the natural gas on which we rely – and it cannot be stored in the same way without significant modifications to the current network. But as green hydrogen production ramps up,


Bab el-Mandeb Strait Maritime Traffic Sees Significant Downturn

In an analysis posted on its website earlier this month, Dryad Global said maritime traffic through the Bab el-Mandeb Strait “has experienced a significant downturn following the series of Houthi attacks”. “Before the attacks started in November 2023, the strait saw an average of approximately 459 cargo-carrying vessels over 10,000 deadweight tonnage (dwt) per week,” Dryad noted in the analysis. “However, current data reflects a stark reduction, with only 252 transits recorded, marking a 48 percent decrease,” it added. “Analyzing by vessel type, bulk carriers have seen the most dramatic decline, dropping from an average of 540 transits per week to just 79, an 85 percent reduction. Crude oil tankers have also decreased, from 78 transits to 45, indicating a 42 percent fall,” it continued. “Product tankers, with specifics less detailed pre-attack, are now averaging 43 transits per week. Containerships have plummeted by 70 percent, from a ‘normal’ average of 130 transits to only 40. General cargo ships have decreased by 24 percent,” Dryad went on to state. The company noted in the analysis that LNG and LPG carriers “have shown the most severe impact”. “No LNG carriers have transited since mid-March 2024, with LPG carrier traffic dropping by 92 percent,” Dryad said. Dryad also stated in the analysis that the total deadweight tonnage of vessels passing through the strait had “fallen by 67 percent by March 2024, suggesting not only fewer ships but potentially smaller vessels or reduced cargo loads”. The traffic through this region has not recovered since March of 2024, Dryad noted in the analysis. “This situation underscores the broader economic implications, including increased shipping times and costs due to rerouting around Africa via the Cape of Good Hope,” Dryad said. “The ongoing instability in the region has led to a rise in insurance premiums for ships
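Dryad’s percentages follow from simple percent-change arithmetic on the weekly transit counts it reports. As a quick check against two of the figures quoted above (the helper function below is a hypothetical illustration, not from the analysis):

```python
def pct_decline(before, after):
    """Percent decrease from `before` to `after`, rounded to the nearest whole percent."""
    return round((before - after) / before * 100)

print(pct_decline(78, 45))   # crude oil tankers: matches the stated 42 percent fall
print(pct_decline(540, 79))  # bulk carriers: matches the stated 85 percent reduction
```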


Ecopetrol Boosts Investment in Colombian Eastern Plains

State-controlled oil driller Ecopetrol SA sees “great potential” for development and exploration in Colombia’s crude-producing heartland alongside bets on US fracking and offshore gas. The company said this month that it now has full ownership of the CPO-09 block in the nation’s oil-rich eastern plains after completing its $452 million acquisition of Repsol SA’s 45 percent stake. That investment’s strategic location near Castilla and Chichimene, among the country’s top five fields, allows the company to benefit from synergies, a top executive said in an interview. “We’re looking to increase our reserves and output,” Rafael Guzman, Ecopetrol’s vice president of hydrocarbons, said Tuesday in Barrancabermeja. With reserves of oil and natural gas falling, and President Gustavo Petro refusing to grant new exploration licenses, Ecopetrol is focused on contracts signed before the environmentalist leader took power in 2022. The company’s priorities are boosting output in blocks such as CPO-09 in the cattle-ranching lowlands of central Colombia, finding and producing natural gas in the nation’s Caribbean waters, and increasing output at its international assets, Guzman said. Ecopetrol’s recent production growth has come from the Permian Basin — which accounted for 14 percent of output in the third quarter — helping offset declines at home. Earlier this month, Petro asked the company to sell its operations in the US, citing his government’s stance against fracking, which he says is destructive to nature and humanity. However, one of the joint ventures the company has in the southwestern US with Occidental Petroleum Corp. runs through 2027, and the other was extended recently until mid-2026. “We already have a commitment,” Guzman said. Even after Colombia recently started importing liquefied natural gas to supply factories and homes, Petro is determined to block the use of fracking to boost output and close the deficit.
The environment ministry this week sent an anti-fracking bill to lawmakers. 


Biogas project investment increased 40% in 2024, industry group says

Dive Brief: The amount of capital invested in U.S. projects across sectors that use biogas grew by 40% year over year in 2024, according to an analysis released last week by industry group American Biogas Council. The number of new facilities that came online also increased by 17% year over year, the group revealed, including at landfills, farms and wastewater treatment plants. There are now nearly 2,500 facilities that capture and use biogas in the United States. The report also reflects how policies that are favorable to renewable natural gas have dominated the industry, as 95% of the 125 projects that came online last year produced RNG. Dive Insight: Those trends reflect growing interest in using biogas for energy or to convert into renewable natural gas, a fuel that can be used in place of petroleum-based natural gas. The use of biogas for energy has a long history, particularly for wastewater treatment facilities and landfills. The first modern anaerobic digestion plant was built at a sewage treatment site in India in the 1800s, according to the National Renewable Energy Laboratory. Biogas capture and use has also become common for U.S. landfills above a certain size due to federal programs like the U.S. EPA’s Landfill Methane Outreach Program, which began in the 1990s. Today, 47% of all biogas capture facilities in the country are at wastewater facilities. An additional 25% are at agricultural facilities, 23% are at landfills and 5% are built to process food waste alone. Codigestion, where multiple feedstocks like food waste, agricultural waste or sewage are processed at the same time, has also become more common. A growing number of public incentives have made refining biogas a more profitable enterprise as demand increases for fuel not derived from petroleum. Policies like the federal Renewable Fuel Standard and state clean


Texas regulators move virtual power plant pilot development to ERCOT

Texas utility regulators have shifted management of the aggregated distributed energy resource, or ADER, pilot program into the Electric Reliability Council of Texas stakeholder process in order for the burgeoning resource to “engage with a larger community of ERCOT market participants,” according to a staff memo recommending the change. ADER launched more than two years ago, with the Public Utility Commission of Texas organizing the governing task force. “I’m comfortable with this moving to ERCOT now that we’re looking at more technical aspects,” PUCT Chairman Thomas Gleeson said at the commission’s Thursday open meeting. The ADER pilot has yet to hit its phase one capacity targets, but as of February there were three virtual power plants providing 25.5 MW of energy and almost 20 MW of other reserve services, according to staff’s memo. The program launched in 2022 with a goal to harness 80 MW of flexible resources, but experts say initial limits on the program’s size dissuaded participation. Commissioner Kathleen Jackson said shifting the program to ERCOT is “the logical next step.” PUCT staff will continue to monitor the project and recommended that ERCOT report progress on the pilot every six months.  Former Commissioner Jimmy Glotfelty led the ADER effort from the commission, working as a liaison to the task force which developed the initial rules for the pilot with the grid operator. Glotfelty retired from the PUCT last year but told Utility Dive “success, to me, is if in three years, or two years, [ADER has] 300 MW or 500 MW and it’s a general part of the market system.” Virtual power plants, deployed at scale, can reduce the need for long-term power system investments and are being considered across the U.S., experts say. A 60-GW nationwide deployment could help meet the country’s future resource adequacy needs while avoiding $15


Ireland says there will be no computation without generation

Stanish said that, in 2023, she wrote a paper that predicted “by 2028, more than 70% of multinational enterprises will alter their data center strategies due to limited energy supplies and data center moratoriums, up from only about 5% in 2023. It has been interesting watching this trend evolve as expected, with Ireland being a major force in this conversation since the boycotts against data center growth started a few years ago.” Fair, equitable, and stable electricity allocation, she said, “means that the availability of electricity for digital services is not guaranteed in the future, and I expect these policies, data center moratoriums, and regional rejections will only continue and expand moving forward.” Stanish pointed out that this trend is not just occurring in Ireland. “Many studies show that, globally, enterprises’ digital technologies are consuming energy at a faster rate than overall growth in energy supply (though, to be clear, these studies mostly assume a static position on energy efficiency of current technologies, and don’t take into account potential for nuclear or hydrogen to assuage some of these supply issues).” If taken at face value, she said, this means that a lack of resources could cause widespread electricity shortages in data centers over the next several years. To mitigate this, Stanish said, “so far, data center moratoriums and related constraints (including reduced tax incentives) have been enacted in the US (specifically Virginia and Georgia), Denmark, Singapore, and other countries, in response to concerns about the excessive energy consumption of IT, particularly regarding compute-intense AI workloads and concerns regarding an IT energy monopoly in certain regions. As a result, governments (federal, state, county, etc.) are working to ensure that consumption does not outpace capacity.” Changes needed In its report, the CRU stated, “a safe and secure supply of energy is essential


Perspective: Can We Solve the AI Data Center Power Crisis with Microgrids?

President Trump announced a $500 billion private sector investment in the nation’s artificial intelligence (AI) infrastructure last month. The investment will come from The Stargate Project, a joint venture between OpenAI, SoftBank, Oracle and MGX, which intends to build 20 new AI data centers in the U.S. in the next four to five years. The Stargate Project committed $100 billion for immediate deployment, and construction has already begun on its first data center in Texas. At approximately half a million square feet each, the partners say these new facilities will cement America’s leadership in AI, create jobs and stimulate economic growth. Stargate is not the only game in town, either. Microsoft is expected to invest $80 billion in AI data center development in 2025, with Google, AWS and Meta also spending big. While all this investment in AI infrastructure is certainly exciting, experts say there’s one lingering question that’s yet to be answered, and it’s a big one: How are we going to power all these AI data centers? This will be one of the many questions tackled during Microgrid Knowledge’s annual conference, which will be held in Texas April 15-17 at the Sheraton Dallas. “Powering Data Centers: Collaborative Microgrid Solutions for a Growing Market” will be one of the key sessions on April 16. Industry experts will gather to discuss how private entities, developers and utilities can work together to deploy microgrids and distributed energy technologies that address the data center industry’s power needs. The panel will share solutions, technologies and strategies that will favorably position data centers in the energy queue. In advance of this session, we sat down with two microgrid experts to learn more about the challenges facing the data center industry and how microgrids can address the sector’s growing energy needs. We spoke with Michael Stadler, co-founder and


Data Center Tours: Iron Mountain VA-1, Manassas, Virginia

Iron Mountain Northern Virginia Overview Iron Mountain’s Northern Virginia data centers VA-1 through VA-7 are situated on a 142-acre highly secure campus in Prince William County, Virginia. Located at 11680 Hayden Road in Manassas, Iron Mountain VA-1 spans 167,958 sq. ft. and houses 12.4 MW of total capacity to meet colocation needs. The 36 MW VA-2 facility stands nearby. The total campus features a mixture of single and multi-tenant facilities which together provide more than 2,000,000 SF of highly efficient green colocation space for enterprises, federal agencies, service providers and hyperscale clouds. The company notes that its Manassas campus offers tax savings compared to Ashburn and exceptional levels of energy efficiency, as well as a diverse and accessible ecosystem of cloud, network and other service providers. Iron Mountain’s Virginia campus has nine planned data centers in total, with five operational facilities to date and two more coming soon. VA-2 recently became the first data center in the United States to achieve DCOS Maturity Level 3. As we continued the tour, Kinra led the way toward the break room, an area where customers can grab coffee or catch up on work. Unlike the high-end aesthetic of some other colocation providers, Iron Mountain’s approach is more practical and focused on functionality. At the secure shipping and receiving area, Kinra explained the process for handling customer equipment. “This is where our customers ship their equipment into,” he said. “They submit a ticket, send their shipments in, and we’ll take it, put it aside for them, and let them know when it’s here. Sometimes they ask us to take it to their environment, which we’ll do for them via a smart hands ticket.” Power Infrastructure and Security Measures The VA-1 campus is supported by a single substation, providing the necessary power for its growing


Land and Expand: DPO, Microsoft, JLL and BlackChamber, Prologis, Core Scientific, Overwatch Capital

Land and Expand is a periodic feature at Data Center Frontier highlighting the latest data center development news, including new sites, land acquisitions and campus expansions. Here are some of the new and notable developments from hyperscale and colocation data center developers and operators about which we’ve been reading lately. DPO to Develop $200 Million AI Data Center in Wisconsin Rapids; Strategic Partnership with Billerud’s CWPCo Unlocks Hydroelectric Power for High-Density AI Compute Digital Power Optimization (DPO) is moving forward with plans to build a $200 million high-performance computing (HPC) data center in Wisconsin Rapids, Wisconsin. The project, designed to support up to 20 megawatts (MW) of artificial intelligence (AI) computing, leverages an innovative partnership with Consolidated Water Power Company (CWPCo), a subsidiary of global packaging leader Billerud. DPO specializes in developing and operating data centers optimized for power-dense computing. By partnering with utilities and independent power producers, DPO colocates its facilities at energy generation sites, ensuring direct access to sustainable power for AI, HPC, and blockchain computing. The company is privately held. Leveraging Power Infrastructure for Speed-to-Energization CWPCo, a regulated utility subsidiary, has operated hydroelectric generation assets since 1894, reliably serving industrial and commercial customers in Wisconsin Rapids, Biron, and Stevens Point. Parent company Billerud is a global leader in high-performance packaging materials, committed to sustainability and innovation. The company operates nine production facilities across Sweden, the USA, and Finland, employing 5,800 people in over 19 countries.  The data center will be powered by CWPCo’s renewable hydroelectric assets, tapping into the utility’s existing 32 megawatts of generation capacity. 
The partnership grants DPO a long-term land lease—extending up to 50 years—alongside interconnection rights to an already-energized substation and a firm, reliable power supply. “AI infrastructure is evolving at an unprecedented pace, and access to power-dense sites is critical,” said Andrew


Data center spending to top $1 trillion by 2029 as AI transforms infrastructure

His projections account for recent advances in AI and data center efficiency, he says. For example, the open-source AI model from Chinese company DeepSeek seems to have shown that an LLM can produce very high-quality results at a very low cost with some clever architectural changes to how the models work. These improvements are likely to be quickly replicated by other AI companies. “A lot of these companies are trying to push out more efficient models,” says Fung. “There’s a lot of effort to reduce costs and to make it more efficient.” In addition, hyperscalers are designing and building their own chips, optimized for their AI workloads. Just the accelerator market alone is projected to reach $392 billion by 2029, Dell’Oro predicts. By that time, custom accelerators will outpace commercially available accelerators such as GPUs. The deployment of dedicated AI servers also has an impact on networking, power and cooling. As a result, spending on data center physical infrastructure (DCPI) will also increase, though at a more moderate pace, growing by 14% annually to $61 billion in 2029.  “DCPI deployments are a prerequisite to support AI workloads,” says Tam Dell’Oro, founder of Dell’Oro Group, in the report. The research firm raised its outlook in this area due to the fact that actual 2024 results exceeded its expectations, and demand is spreading from tier one to tier two cloud service providers. In addition, governments and tier one telecom operators are getting involved in data center expansion, making it a long-term trend.


The Future of Property Values and Power in Virginia’s Loudoun County and ‘Data Center Alley’

Loudoun County’s FY 2026 Proposed Budget Is Released This week, Virginia’s Loudoun County released its FY 2026 Proposed Budget. The document notes how data centers are a major driver of revenue growth in Loudoun County, contributing significantly to both personal and real property tax revenues. As noted above, data centers generate almost 50% of Loudoun County property tax revenues. Importantly, Loudoun County has now implemented measures such as a Revenue Stabilization Fund (RSF) to manage the risks associated with this revenue dependency. The FY 2026 budget reflects the strong growth in data center-related revenue, allowing for tax rate reductions while still funding critical services and infrastructure projects. But the county is mindful of the potential volatility in data center revenue and is planning for long-term fiscal sustainability. The FY 2026 Proposed Budget notes how Loudoun County’s revenue from personal property taxes, particularly from data centers, has grown significantly. From FY 2013 to FY 2026, revenue from this source has increased from $60 million to over $800 million. Additionally, the county said its FY 2026 Proposed Budget benefits from $150 million in new revenue from the personal property tax portfolio, with $133 million generated specifically from computer equipment (primarily data centers). The county said data centers have also significantly impacted the real property tax portfolio. In Tax Year (TY) 2025, 73% of the county’s commercial portfolio is composed of data centers. The county said its overall commercial portfolio experienced a 50% increase in value between TY 2024 and TY 2025, largely driven by the appreciation of data center properties. RSF Meets Positive Economic Outlook The Loudoun County Board of Supervisors created the aforementioned Revenue Stabilization Fund (RSF) to manage the risks associated with the county’s reliance on data center-related revenue.
The RSF targets 10% of data center-related real and personal property tax


Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.


John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it has been a regular at the big tech trade show in Las Vegas as a non-tech company showing off technology, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do


2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to


OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
