
About 700 Bcf of Gas Matched in 2nd Midterm Round of AggregateEU


The European Commission has matched almost 20 billion cubic meters (706.29 billion cubic feet) of demand from European Union gas buyers with offers from potential suppliers under the second midterm round of AggregateEU.
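The headline conversion is easy to verify. Below is a quick arithmetic check, assuming the standard factor of roughly 35.3147 cubic feet per cubic meter; the matched volume is reported only as "almost 20 Bcm", so 20 is used as the illustrative input.

```python
# Sanity check of the unit conversion quoted in the article:
# 1 cubic meter ≈ 35.3147 cubic feet, so 1 Bcm ≈ 35.3147 Bcf.
BCF_PER_BCM = 35.3147  # approximate conversion factor

matched_bcm = 20.0  # "almost 20 Bcm" is used here as an illustrative input
matched_bcf = matched_bcm * BCF_PER_BCM
print(f"{matched_bcm} Bcm ≈ {matched_bcf:.2f} Bcf")  # ≈ 706.29 Bcf, as quoted
```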

Vendors offered 31 Bcm, exceeding the 29 Bcm of demand pooled during the matchmaking round opened this month, according to an online statement Wednesday from the Commission’s Directorate-General for Energy.

“All participants have been informed about the matching results and will now be able to negotiate contracts bilaterally”, the Directorate said.

Energy and Housing Commissioner Dan Jørgensen commented, “As we fast track our decarbonization efforts in the EU, it is also key that European buyers are able to secure competitive gas offers from reliable international suppliers”.

“The positive results of this second matching round on joint gas purchasing show the strong interest from the market and the value in providing increased transparency to European gas users and buyers”, Jørgensen added.

Announcing the second midterm round on March 12, 2025, the Directorate said LNG buyers and sellers can not only name their preferred delivery terminal, as before, but can now also express a preference to have the LNG delivered free-on-board. This option has been added “to better reflect LNG trade practices and attract additional international suppliers”, the Directorate said.

AggregateEU, a mechanism in which gas suppliers compete to book demand placed by companies in the EU and its Energy Community partner countries, was initially only meant for the 2023-24 winter season. However, citing lessons from the prolonged effects of the energy crisis, the EU has made it a permanent mechanism under “Regulation (EU) 2024/1789 on the internal markets for renewable gas, natural gas and hydrogen”, adopted June 13, 2024.

Midterm rounds let buyers and suppliers contract in six-month periods within a trading partnership of up to five years.

“In early 2024, with the effects of the energy crisis still not over, AggregateEU is introducing a different concept of mid-term tenders in order to address the growing demand for stability and predictability from buyers and sellers of natural gas”, the Directorate said February 1, 2024, announcing the first midterm tender.

“Under such tenders, buyers will be able to submit their demand for seasonal 6-month periods (for a minimum 1,800,000 MWh for LNG and 30,000 for NBP per period), going from April 2024 to October 2029. This is intended to support sellers in identifying buyers who might be interested in a longer trading partnership – i.e. up to 5 years.

“Mid-term tenders will not only increase security of supply but also help European industrial players increase their competitiveness”.

NBP gas, or National Balancing Point gas, refers to gas from the national transmission systems of EU states.

The first midterm round aggregated 34 Bcm of demand from 19 companies including industrial players. Offers totaled 97.4 Bcm, almost triple the demand, the Commission said February 28, 2024.

A total of seven rounds have been conducted under AggregateEU, pooling over 119 Bcm of demand and attracting 191 Bcm of offers. Nearly 100 Bcm has been matched, according to Thursday’s results announcement.

AggregateEU, created under Council Regulation 2022/2576 of December 19, 2022, is part of the broader EU Energy Platform for coordinated purchases of gas and hydrogen. The Energy Platform was formed in 2022 as part of the REPowerEU strategy for achieving energy independence from Russia.

To contact the author, email [email protected]


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Zscaler, café-inspired branch networks, and mobile security

In Japan, I met with Nathan Howe, senior vice president of innovation and product management for Zscaler, and talked to him about the Zscaler Cellular service. Without getting into the technical nuances, the service works by integrating zero trust into the mobile network. This makes it ideally suited to secure

Read More »

Newsom signs 1 bill to speed geothermal approvals, vetoes another

Gov. Gavin Newsom, D-Calif., signed into law Monday a bill that will expand the California Energy Commission’s streamlined certification program to allow it to speed approvals for geothermal energy, but he vetoed a bill to speed approvals of well-drilling for geothermal exploratory projects. “In addition to delaying much-needed regulations that are already in process, the [Geologic Energy Management] Division would need to substantially increase fees on geothermal operators to implement the new requirements imposed by the bill,” Newsom wrote in his veto message for the second bill, AB527. Newsom said in the message that while he supports the expansion of geothermal energy in California “as a much-needed source of baseload clean power,” he thinks that the increased fees would disincentivize geothermal development “beyond any incentive provided by a [California Environmental Quality Act] exemption for one part of a project’s permitting process.” The governor also noted that he signed AB1359 last year, which reformed the approval process for geothermal exploration, and he thinks it is “prudent that we understand the effects of these changes before granting wholesale CEQA exemptions with costly and complicated conditions.” AB527 would have allowed geothermal exploratory projects that GEMD deems to meet certain criteria to also be presumed to “have satisfied the requirements of CEQA for the geothermal exploratory project, including to support the issuance of any permit, funding, or other approval by a state or local agency, as provided.” AB531, which Newsom signed, makes geothermal projects one of the types of projects that — once the state Energy Commission certifies them as an “environmental leadership development project” — can benefit from a streamlined approval process under CEQA. The other projects that can receive that certification are solar or wind farms generating 50 MW or more, energy storage systems capable of storing 200 MWh or more, power plants 50 MW or

Read More »

Citigroup Flags Oil Market’s Bearish Consensus

The broad mood in the oil market remains bearish, although there are discrepancies as to how gloomy crude’s prospects are, according to Citigroup Inc., summarizing views from clients in North America and Europe. “Conviction differs on the depth of downside,” analysts including Francesco Martoccia said in a note. “Some clients doubt that a price floor at $60 a barrel for Brent crude oil would be enough to induce a supply-and-demand reaction to balance a global liquids market generally seen heading for a surplus.” Oil prices have shed more than 10% this year, with global benchmark Brent posting back-to-back monthly losses in August and September. The weakness has been driven largely by expectations that supplies will run ahead of demand as OPEC+ loosens output curbs and rival drillers also step up production. Still, stockpiling by China has acted to support the market, with inventory builds so far seen concentrated away from the market’s main pricing centers. “Other clients expect a more moderate, orderly price correction, arguing that projected stock builds could continue to accumulate outside of key pricing hubs, certainly ex-Cushing,” the analysts said, referring to the storage hub in Oklahoma that’s the physical delivery point for West Texas Intermediate. The Organization of the Petroleum Exporting Countries and its allies endorsed another quota hike last weekend, although the increment — 137,000 barrels a day for November’s production — was smaller than some of the sums that had been reported in the run-up to the gathering. “Today’s slower non-OPEC+ growth and greater OPEC+ optionality, along with heightened geopolitical risks looming on large producers” such as Russia and Iran, could temper the pace of price adjustment, the analysts said. Brent futures — which tumbled 8% last week ahead of the OPEC+ supply decision — traded slightly lower at $65.80 a barrel on Thursday. “Within the energy complex, consensus

Read More »

USA Crude Oil Stocks Rise by Almost 4 Million Barrels WoW

U.S. commercial crude oil inventories, excluding those in the Strategic Petroleum Reserve (SPR), increased by 3.7 million barrels from the week ending September 26 to the week ending October 3, the U.S. Energy Information Administration (EIA) highlighted in its latest weekly petroleum status report. That report, which was released on October 8 and included data for the week ending October 3, showed that crude oil stocks, not including the SPR, stood at 420.3 million barrels on October 3, 416.5 million barrels on September 26, and 422.7 million barrels on October 4, 2024. The report highlighted that data may not add up to totals due to independent rounding. Crude oil in the SPR stood at 407.0 million barrels on October 3, 406.7 million barrels on September 26, and 382.9 million barrels on October 4, 2024, the report revealed. Total petroleum stocks – including crude oil, total motor gasoline, fuel ethanol, kerosene type jet fuel, distillate fuel oil, residual fuel oil, propane/propylene, and other oils – stood at 1.694 billion barrels on October 3, the report highlighted. Total petroleum stocks were down 0.9 million barrels week on week and up 52.2 million barrels year on year, the report showed. “At 420.3 million barrels, U.S. crude oil inventories are about four percent below the five year average for this time of year,” the EIA said in its latest weekly petroleum status report. “Total motor gasoline inventories decreased by 1.6 million barrels from last week and are about one percent below the five year average for this time of year. Finished gasoline inventories increased, while blending components inventories decreased last week,” it added. “Distillate fuel inventories decreased by 2.0 million barrels last week and are about six percent below the five year average for this time of year. Propane/propylene inventories decreased by 2.9 million
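The week-on-week and year-on-year changes follow directly from the stock levels quoted above. A quick check (assuming the current-year weeks are 2025, consistent with the year-ago comparison date) also illustrates the report's independent-rounding caveat:

```python
# Week-on-week and year-on-year changes implied by the EIA figures quoted above.
# Note the report's caveat that data may not add to totals due to independent rounding:
# 420.3 - 416.5 = 3.8, while the headline change is reported as 3.7 million barrels.
stocks = {
    "2025-10-03": 420.3,   # million barrels, excluding SPR
    "2025-09-26": 416.5,
    "2024-10-04": 422.7,
}

wow = stocks["2025-10-03"] - stocks["2025-09-26"]
yoy = stocks["2025-10-03"] - stocks["2024-10-04"]
print(f"Week on week: {wow:+.1f} million barrels")  # +3.8 (reported as +3.7 after rounding)
print(f"Year on year: {yoy:+.1f} million barrels")  # -2.4
```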

Read More »

To keep energy affordable, Virginia must embrace power line innovation

Jeff Dennis is executive director of the Electricity Customers Alliance, a coalition that advocates for customer-centric solutions to grid modernization and energy affordability challenges. A modern economy requires modern energy solutions to meet the growing energy demand created by reshored manufacturing, increased electrification of homes, businesses, and vehicles, and — of course — data centers. Virginia and the Mid-Atlantic house the data center capital of the Western world, and the Commonwealth is positioned to lead America’s push to win the global artificial intelligence race. Data center growth can deliver many economic benefits to communities. However, this increased electricity demand requires smartly planned power infrastructure and a focus on short- and long-term solutions to address consumers’ concerns about their power bills. Planning and building electricity infrastructure to meet rapidly growing demand while keeping electricity affordable for all consumers will take time. However, data centers and manufacturers need power now to win the AI race and capture new economic opportunities in manufacturing and the digital economy. Virginia must capitalize on short-term opportunities to expand and maximize the use of our existing grid assets to deliver more power while setting the foundation for smart, long-term buildout of our shared electricity grid. To address this challenge, the General Assembly passed, and Gov. Glenn Youngkin signed, legislation to encourage deployment of advanced transmission technologies that maximize the amount of power that can be delivered from our existing lines. Beginning in 2026, the legislation requires the State Corporation Commission to consider the use of new power line technologies called “advanced conductors” when deciding how to bring more power supply onto the grid. Many other Mid-Atlantic states have followed Virginia’s lead in encouraging adoption of these technologies to help meet our needs today while establishing a foundation for an efficient future electricity system. That’s great news for consumers —

Read More »

Investor-owned utilities could spend $1.1T between 2025 and 2029: EEI

Dive Brief: Electric utilities are on pace to spend nearly $208 billion on grid upgrades and expansions this year, the highest amount ever, the Edison Electric Institute said Tuesday. The group represents investor-owned utilities. And more growth in capital expenditures is on the way, as the sector rushes to meet growing demand, according to EEI’s 2024 financial review. The group’s members are projected to make capital expenditures of more than $1.1 trillion between 2025 and 2029. U.S. electricity generation rose 3% in 2024 “and is expected to rise for the foreseeable future,” EEI said. Generation investments as a share of the industry’s total capital expenditures have risen for four straight years, it said. Dive Insight: After years of relatively stagnant growth, the electric sector is moving quickly to meet new demand from AI data centers, industrial expansion, electrification and other sources. The U.S. generated 4.3 million GWh in 2024, “the largest annual jump in five years,” EEI said. The group anticipates an annual growth rate of 1.7% through 2040, when domestic generation could surpass 5.4 million GWh. Meeting the new demand requires significant grid investments. IOU capital expenditures grew more than 16% from 2024 to 2025, based on anticipated investments, EEI said. “Our capital expenditures are higher than any other sector in the U.S. economy, outpacing transportation, retail, and other capital-intensive industries,” EEI President and CEO Drew Maloney wrote in a letter accompanying the report. “As always, we remain committed to keeping customer bills as low as possible as we work to deliver the reliable, secure electricity that is enabling innovation and enhancing the energy leadership of the United States.” But some experts say retail consumers are footing the bill
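EEI's 2040 figure is consistent with simple compound growth from the 2024 baseline; a quick check using the numbers quoted above:

```python
# Rough check of EEI's projection quoted above: ~4.3 million GWh generated in 2024,
# growing about 1.7% per year through 2040.
generation_2024_gwh = 4.3e6
annual_growth = 0.017
years = 2040 - 2024

generation_2040_gwh = generation_2024_gwh * (1 + annual_growth) ** years
print(f"Projected 2040 generation: {generation_2040_gwh / 1e6:.2f} million GWh")
# ≈ 5.63 million GWh, consistent with the "could surpass 5.4 million GWh" figure.
```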

Read More »

Grid planners and experts on why markets keep choosing renewables

As electricity demand grows, alongside wind and solar’s share in the U.S. energy mix, concerns about renewables’ reliability are being raised more frequently — including at the highest levels of the federal government. “With the electricity grid, you have to match supply and demand in every moment in time,” Energy Secretary Chris Wright said recently on Fox News. “With wind and solar, you don’t know when they’re going to be there, and you don’t know when they’re going to go away.” Wright went on to say the development of renewables has led to an “extra distribution grid” that has raised energy prices. But utility planners, grid operators and analysts say wind, solar and batteries are an important part of an evolving power system in which intermittent resources can be reliably scheduled and called upon using sophisticated software and other tools. They also point to both the levelized cost of electricity for renewables and their competitiveness in automated energy markets that select the least cost units to run in each hour. “System operators don’t decide whether resources bidding into the market are good or bad,” said Rob Gramlich, president of energy sector consultant Grid Strategies, in an interview. “There is no central decision maker,” he said. “Markets don’t play favorites.” [Image: ERCOT’s multi-resource price-selected portfolio for August 20, 2025, at 5:50 PM CDT, from the ERCOT dashboard.] With generators retiring, demand rising, and construction, financing, permitting and supply chain challenges growing, operators and analysts acknowledged concerns over the future of the U.S. power system. Many called for diversification of resources, including renewables and storage, to protect electricity reliability and affordability. “The objective of planning is a portfolio of diverse resources at the least cost
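The "least cost units to run in each hour" point describes merit-order dispatch. The sketch below is a toy illustration of that selection logic, not any grid operator's actual market engine; the generator names, offer prices and demand figure are invented.

```python
# Toy merit-order dispatch: sort offers by marginal cost and commit them until
# the hour's demand is met. Unit names, costs and demand are invented for illustration.
offers = [
    ("solar_1", 0.0, 300),      # (unit, $/MWh offer, available MW)
    ("wind_1", 0.0, 250),
    ("ccgt_1", 38.0, 400),
    ("ct_peaker_1", 95.0, 150),
]

def dispatch(offers, demand_mw):
    """Select the cheapest offers until demand is covered; return the schedule."""
    schedule = []
    remaining = demand_mw
    for unit, price, capacity in sorted(offers, key=lambda o: o[1]):
        if remaining <= 0:
            break
        take = min(capacity, remaining)
        schedule.append((unit, take, price))
        remaining -= take
    return schedule

for unit, mw, price in dispatch(offers, demand_mw=800):
    print(f"{unit}: {mw} MW at ${price}/MWh")
# Zero-marginal-cost wind and solar clear first whenever they are available,
# which is the sense in which "markets don't play favorites".
```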

Read More »

Cisco seriously amps-up Silicon One chip, router for AI data center connectivity

Some say deep buffers shouldn’t be used to handle this type of traffic; the contention is that these buffers fill and drain, creating jitter in the workloads, and that slows things down, Chopra told Network World. “But the real source of that challenge is not the buffers. It’s a poor congestion management scheme and poor load balancing with AI workloads, which are completely deterministic and predictable. You can actually proactively figure out how to place flows across the network and avoid the congestion,” he said. The 8223’s deep-buffer design provides ample memory to temporarily store packets during congestion or traffic bursts, an essential feature for AI networks where inter-GPU communication can create unpredictable, high-volume data flows, according to Gurudatt Shenoy, vice president of Cisco Provider Connectivity. “Combined with its high-radix architecture, the 8223 allows more devices to connect directly, reducing latency, saving rack space, and further lowering power consumption. The result is a flatter, more efficient network topology supporting high-bandwidth, low-latency communication that is critical for AI workloads,” Shenoy wrote in a blog post. NOS options Notably, the first operating systems that the 8223 supports are the Linux Foundation’s Software for Open Networking in the Cloud (SONiC) and Facebook open switching system (FBOSS) – not Cisco’s own IOS XR. IOS XR will be supported, too, but at a later date, according to Cisco. SONiC decouples network software from the underlying hardware and lets it run on hundreds of switches and ASICs from multiple vendors while supporting a full suite of network features such as Border Gateway Protocol (BGP), remote direct memory access (RDMA), QoS, and Ethernet/IP. One of the keys to SONiC is its switch-abstraction interface, which defines an API to provide a vendor-independent way of controlling forwarding elements such as a switching ASIC, an NPU, or a software switch in a uniform
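Chopra's argument is that deterministic AI flows can be placed proactively rather than managed reactively. The sketch below is a toy illustration of that idea, not Cisco's algorithm: because flow sizes are known in advance, each flow is assigned up front to the least-loaded of several equal-cost paths.

```python
# Toy illustration of proactive flow placement for deterministic AI traffic
# (not Cisco's actual scheme): since flow sizes are known ahead of time, assign
# each flow to the least-loaded path up front instead of reacting to congestion.
import heapq

def place_flows(flow_sizes_gbps, num_paths):
    """Greedy placement of known flows onto equal-cost paths, largest flows first."""
    paths = [(0.0, p) for p in range(num_paths)]  # (load, path_id) min-heap
    heapq.heapify(paths)
    placement = {}
    for i, size in sorted(enumerate(flow_sizes_gbps), key=lambda x: -x[1]):
        load, path = heapq.heappop(paths)
        placement[i] = path
        heapq.heappush(paths, (load + size, path))
    return placement

# Example: eight known inter-GPU flows spread across four equal-cost paths.
print(place_flows([400, 400, 200, 200, 100, 100, 50, 50], num_paths=4))
```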

Read More »

Utilities Race to Meet Surging Data Center Demand With New Power Models

Over the last 18 months or so, the energy generation industry and its public utilities have been significantly impacted by the AI data center boom. Across North America, the increase in demand for power driven by hyperscale and AI data centers greatly exceeds the industry’s ability to generate and deliver power to meet it. We have covered many of the efforts being made to control the availability of power. In response, utilities and regulators have begun rethinking how to manage power availability through means such as: temporary moratoriums on new data center interconnections; the creation of new rate classes; cogeneration and load-sharing agreements; renewable integration; and power-driven site selection strategies. But the bottom line is that in many locations utilities will need to change the way they work and how and where they spend their CAPEX budgets. Most utilities now acknowledge that their demand forecasting models have fallen behind reality, triggering revisions to Integrated Resource Plans (IRPs) and transmission buildouts nationwide. This mismatch between forecast and actual demand is forcing a fundamental rethink of capital expenditure priorities and long-term grid planning. Spend More, Build Faster Utilities are sharply increasing CAPEX and rebalancing their resource portfolios—not just for decarbonization, but to keep pace with multi-hundred-megawatt data center interconnects. This trend is spreading across the industry, not confined to a few isolated utilities. Notable examples include: Duke Energy raised its five-year CAPEX plan to $83 billion (a 13.7% increase) and plans to add roughly 5 GW of natural gas capacity

Read More »

Duos Pairs Mobile Power and Modular Edge Data Centers for Rapid Texas Rollout

Duos Technology Group has launched the fifth of its AI edge data centers, part of a plan to deploy 15 units by the end of 2025. The projects are executed through Duos Edge AI, a subsidiary focused on modular, rapidly installed edge data centers (EDCs) in underserved markets, beginning with school districts and regional carrier hubs across Texas. The newest site is being deployed on-premises with the Dumas Independent School District in Dumas, Texas. High-Density Edge Design Duos’ EDCs emphasize very high rack densities (100 kW+ per rack), SOC 2 Type II compliance, N+1 power with dual generators, and a 90-day build/turn-up cycle. Each site is positioned approximately 12 miles from end users, cutting latency for real-time workloads. To meet the power demands of these edge deployments, Duos formed Duos Energy and partnered with Fortress/APR Energy to deliver behind-the-meter mobile gas turbines. This approach allows compute to go live in 90 days without waiting years for utility interconnection upgrades. The goal is straightforward: move power and compute close to demand, with rapid deployment. Duos’ modular pods are designed for exurban and rural locations as localized compute hubs for carriers, schools, healthcare systems, and municipal users. The rugged design pairs high-density racks with the short deployment cycle and proximity targeting, enabling a wide range of applications. With Dumas ISD now live, Duos has five sites in Texas, including Amarillo/Region 16, Victoria/Region 3, Dumas ISD, and multiple Corpus Christi locations. Mobile Power vs. Modular Compute While Duos doesn’t consistently describe its data center units as “mobile,” they are modular and containerized, engineered for rapid, site-agnostic deployment. The “mobile” label more explicitly applies to Duos’ power strategy—a turbine fleet that can be fielded or re-fielded to match demand. From an operator’s perspective, the combined proposition functions like a mobile platform: pre-integrated compute pods
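For a rough sense of why the roughly 12-mile siting target matters, the fiber propagation delay alone over that distance is under a tenth of a millisecond each way. A back-of-the-envelope check, assuming a signal speed of about two-thirds the speed of light in fiber (real latency adds switching, serialization and routing overhead on top of this floor):

```python
# Back-of-the-envelope propagation delay for the ~12-mile proximity target mentioned above.
# Assumes ~2/3 the speed of light in optical fiber; actual latency is higher once
# switching, serialization and routing overhead are included.
KM_PER_MILE = 1.609
FIBER_SPEED_KM_S = 2e5  # roughly 2/3 of c

distance_km = 12 * KM_PER_MILE
one_way_ms = distance_km / FIBER_SPEED_KM_S * 1000
print(f"One-way propagation delay over {distance_km:.1f} km: {one_way_ms:.3f} ms")
# ≈ 0.097 ms, i.e. well under a tenth of a millisecond each way
```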

Read More »

Report: AMD could be Intel’s next foundry customer

AMD has lagged behind Nvidia in the AI business but has done well in the federal supercomputing business, holding numerous top spots with supercomputers like El Capitan and Frontier. Manufacturing its chips in the United States would be a good way to get the Trump administration off its back, given its push for domestic manufacturing of semiconductors. The Trump administration is pushing for 50% of chips sold in America to be manufactured domestically, with tariffs on chips that are not. AMD also faces outbound restrictions. Earlier this year, it faced export restrictions on GPUs meant for China as part of U.S. export controls against China’s AI business. “I believe this is a smart move by AMD to secure capacity in the local market without fighting against Nvidia and Apple and their deeper pockets for the limited capacity at TSMC,” said Alvi Nguyen, senior analyst with Forrester Research. “With the US investment in Intel, followed by Nvidia, this can be seen as diversifying their supply chain and providing cheaper, locally sourced parts.” For Intel, this would continue a streak of good news it has enjoyed recently. “Having customers take up capacity at their foundries will go a long way in legitimizing their semiconductor processes and hopefully create the snowball effect of getting even more US-based customers,” said Nguyen. In recent weeks, Intel has partnered with Nvidia to jointly make PC and data center chips. Nvidia also took a $5B stake in Intel. Earlier, the Trump administration took an $11.1B, or 10%, stake in Intel.

Read More »

AI Infra Summit 2025 Insights: AI Factories at the Core of the Fifth Industrial Revolution

NVIDIA’s AI Factory: Engineering the Future of Compute NVIDIA’s keynote illuminated the invisible but indispensable heart of the AI revolution—the AI factory. This factory blends hardware and software innovation to achieve performance breakthroughs that transcend traditional limits. Technologies such as disaggregated rack-scale GPUs and the novel 4-bit floating point numerical representation move beyond incremental improvements; they redefine what is achievable in energy efficiency and cost-effectiveness. The software ecosystem NVIDIA fosters, including open-source frameworks like Dynamo, enables unprecedented flexibility in managing inference workloads across thousands of GPUs. This adaptability is crucial given the diverse, dynamic demands of modern AI, where workloads can fluctuate dramatically in scale and complexity. The continuous leap in benchmark performance, often quadrupling hardware capabilities through software alone, reinforces NVIDIA’s accelerated innovation cycle. NVIDIA’s framing of AI factories as both technology platforms and business enablers highlights an important shift. The value created is not merely in processing raw data but in generating economic streams through optimizing speed, reducing costs, and creating new AI services. This paradigm is central to understanding how the new industrial revolution will operate through highly efficient AI factories uniting production and innovation. AWS and the Cloud’s Role in Democratizing AI Power Amazon Web Services (AWS) represents a key pillar in making AI capabilities accessible across the innovation spectrum. AWS’ focus on security and fault tolerance reflects maturity in cloud design, ensuring trust and resilience are priorities alongside raw compute power. The evolution towards AI agents capable of specification-driven operations signifies a move beyond traditional computing paradigms towards contextual, autonomous AI services embedded deeply in workflows. Their launch of EC2 P6-B200 instances with next-generation Blackwell GPUs and specialized Trainium chips represents a continual drive to optimize AI training and inference at scale, with out-of-the-box performance improvements of an 85% reduction in training time;

Read More »

Data Center Jobs: Engineering, Construction, Commissioning, Sales, Field Service and Facility Tech Jobs Available in Major Data Center Hotspots

Each month Data Center Frontier, in partnership with Pkaza, posts some of the hottest data center career opportunities in the market. Here’s a look at some of the latest data center jobs posted on the Data Center Frontier jobs board, powered by Pkaza Critical Facilities Recruiting. Looking for Data Center Candidates? Check out Pkaza’s Active Candidate / Featured Candidate Hotlist Business Development Manager – Mechanical Data Center Solutions Remote / traveler This position can be located anywhere in the U.S. as long as the candidate can travel and will need to be located near a major airport in the U.S. Key for this role is data center mechanical experience selling engineered products and solutions with a rolodex of either colo or hyperscale contacts. This opportunity is working directly with a successful global OEM looking to expand market share in the critical facilities industry. They help data centers reduce energy and operating costs by providing mechanical solutions that modernize their infrastructure. By ensuring high-reliability mission-critical facilities for some of the world’s largest hyperscale, colocation, and enterprise customers, this company offers a career-growth-minded opportunity with exciting projects, cutting-edge technology, competitive salaries, and benefits. Engineering Design Director – Data Center Denver, CO / Dallas, TX / Remote This position is a remote position in the Midwest supporting HPC / AI colo design projects being built in TX. Previous A/E experience with data center hyperscale design a must! We are seeking an experienced director of data center engineering design who will lead the development and oversight of critical power and mechanical infrastructure design for new data center builds and existing data center facilities. The ideal candidate will bring strong technical acumen, leadership skills, and extensive experience in mission-critical environments. You will oversee all aspects of engineering design, guide project teams, interface with stakeholders, and ensure best practices are upheld in quality,

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular at the big tech trade show in Las Vegas as a non-tech company showing off technology, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). [Image: John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.] While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
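The "LLM as a judge" idea mentioned above can be sketched in a few lines. The snippet below is illustrative only, with a hypothetical call_model() helper standing in for whatever provider client is actually used.

```python
# Minimal sketch of the "LLM as a judge" pattern described above: one or more cheaper
# models score a candidate answer and the scores are aggregated. call_model() is a
# hypothetical stand-in for a real API client, not any specific provider's SDK.
from statistics import mean

def call_model(model: str, prompt: str) -> str:
    """Hypothetical helper; replace with an actual API client."""
    raise NotImplementedError

def judge(question: str, answer: str, judges: list[str]) -> float:
    """Ask several judge models to rate an answer from 1-10 and average the scores."""
    rubric = (
        f"Question: {question}\nAnswer: {answer}\n"
        "Rate the answer's accuracy and helpfulness from 1 to 10. Reply with only the number."
    )
    scores = []
    for model in judges:
        reply = call_model(model, rubric)
        try:
            scores.append(float(reply.strip()))
        except ValueError:
            continue  # skip judges that don't return a parseable score
    return mean(scores) if scores else 0.0
```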

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »