EU States Set to Back More Flexibility for Filling Gas Storage

European Union member states are set to back more flexible rules for filling gas storage before winter, amid criticism that current targets artificially raise prices.

Ambassadors from the bloc’s 27 member states will meet in Brussels on Friday to sign off on a joint push for a 10 percentage-point deviation until 2027 from rules that require tanks to be 90 percent full by winter. If their position is agreed soon with the European Parliament in upcoming talks, the new regulations could come into effect before the next heating season.

The targets were brought in at the height of the energy crisis, when a drop in Russian flows sparked concern that Europe might not have enough gas to make it through a cold winter. But countries like Germany have said the rules helped inflate prices as speculators bet on expected purchases. The regulations have also been criticized for distorting the market by pushing up prices in summer, when they’d normally be cheapest.

The plan to secure more leeway to fill storage, combined with the fallout of the trade war, has helped spark a sharp drop in prices. European gas futures this week hit the lowest since September, extending a retreat from February’s two-year high. Energy costs are a key concern for EU officials and governments. 

In parliament, lawmakers on the industry committee will vote on April 24 on their position on the storage regulation. Changes proposed by the center-right European People’s Party, the largest group in the assembly, are broadly similar to what’s likely to be agreed by member states on Friday.

“The EPP is calling for a more balanced approach that maintains energy security but urgently returns to market-based mechanisms,” Andrea Wechsler, an EPP negotiator, said at a committee meeting this week.

Storage Proposals

Under the proposals, the Nov. 1 deadline would be replaced with a broader range of Oct. 1 to Dec. 1, and countries would be able to deviate from the 90 percent target depending on market conditions. If the regulation is agreed to before the start of October, the flexibilities would apply to this year’s targets.

Under the EU council’s draft plan, some nations would have the option to use an additional 5-percentage-point deviation from the target, but only if doing so doesn’t harm Europe’s gas market or affect supplies to neighboring countries.

Poland, which holds the EU’s rotating presidency, told lawyers working on the regulation that the council intends for the rules to start before the end of September, Bloomberg reported earlier this week. It will represent member states in talks with the European Parliament about the final version of the rules.

With the extra flexibility and derogations, the European Commission told member states this week that the storage goal would effectively become 67 percent before winter in the years to 2027, a person familiar with the matter has said.
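As a rough illustration of how the draft’s headline numbers stack up: the 90 percent baseline, the 10-point flexibility and the optional 5-point deviation are taken from the text above, while treating them as straight subtractions is an illustrative assumption; the sketch does not reproduce the commission’s 67 percent figure, which reflects further derogations in the regulation.

```python
# Rough sketch of the minimum fill level implied by the council draft's headline numbers.
# The 90% baseline, 10-point flexibility and optional 5-point deviation come from the
# article; treating them as simple subtractions is an illustrative assumption. The
# commission's ~67% estimate cited above reflects additional derogations not modeled here.

BASELINE_TARGET = 90.0      # percent full required by the current rules
GENERAL_FLEXIBILITY = 10.0  # percentage-point deviation proposed until 2027
CONDITIONAL_EXTRA = 5.0     # extra deviation, only if markets and neighbors are unharmed

def effective_minimum(use_extra: bool) -> float:
    """Lowest fill level (percent) a country could target under the draft's headline numbers."""
    floor = BASELINE_TARGET - GENERAL_FLEXIBILITY
    if use_extra:
        floor -= CONDITIONAL_EXTRA
    return floor

print(effective_minimum(use_extra=False))  # 80.0
print(effective_minimum(use_extra=True))   # 75.0
```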

Europe’s underground gas-storage facilities are currently about 35 percent full, near the lowest since 2022.

Trade lobby Eurogas on Thursday called for clarity on storage rules before summer, saying that lingering uncertainty could jeopardize refilling efforts. Once agreed by the council, parliament and the commission, the regulation will need to be translated into the bloc’s official languages before publication. 

“More and more we’ve started to intervene in the market due to the energy crisis,” Eurogas Secretary General Andreas Guth said. “More trust in the market and less intervention is the top line.”



Bloomberg



Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


VMware (quietly) brings back its free ESXi hypervisor

By many accounts, Broadcom’s handling of the VMware acquisition was clumsy and caused many enterprises to reevaluate their relationship with the vendor. The move to subscription models was tilted in favor of larger customers and longer, three-year licenses. Because of the string of bad publicity, VMware’s competitors pounced, offering migration

Read More »

CoreWeave offers cloud-based Grace Blackwell GPUs for AI training

Cloud services provider CoreWeave has announced it is offering Nvidia’s GB200 NVL72 systems, otherwise known as “Grace Blackwell,” to customers looking to do intensive AI training. CoreWeave said its portfolio of cloud services is optimized for the GB200 NVL72, including CoreWeave’s Kubernetes Service, Slurm on Kubernetes (SUNK), Mission Control, and

Read More »

Kyndryl launches private cloud services for enterprise AI deployments

Kyndryl’s AI Private Cloud environment includes services and capabilities around containerization, data science tools, and microservices to deploy and manage AI applications on the private cloud. The service supports AI data foundations and MLOps/LLMOps services, letting customers manage their AI data pipelines and machine learning operations, Kyndryl stated. These tools facilitate

Read More »

Where Will the USA Gasoline Price Land in 2025?

According to the U.S. Energy Information Administration’s (EIA) latest Short-Term Energy Outlook (STEO), which was released on April 10, the U.S. regular gasoline price will average $3.09 per gallon in 2025. In its latest STEO, the EIA projects that the U.S. regular gasoline price will average $3.10 per gallon in the second quarter of this year, $3.14 per gallon in the third quarter, and $3.01 per gallon in the fourth quarter. This STEO highlights that the gasoline price averaged $3.10 per gallon in the first quarter of this year and $3.31 per gallon overall in 2024.

In its previous STEO, which was released in March, the EIA projected that the U.S. regular gasoline price would average $3.22 per gallon this year. The EIA forecast in that STEO that the gasoline price would come in at $3.10 per gallon in the first quarter of 2025, $3.24 per gallon in the second quarter, $3.39 per gallon in the third quarter, and $3.16 per gallon in the fourth quarter.

The EIA projected in its April STEO that the U.S. regular gasoline price will average $3.11 per gallon in 2026. The organization’s March STEO saw the gasoline price averaging $3.21 per gallon in 2026. In its latest STEO, the EIA projected that the U.S. regular gasoline price will average $2.99 per gallon in the first quarter of next year, $3.21 per gallon in the second quarter, $3.22 per gallon in the third quarter, and $3.00 per gallon in the fourth quarter. In its March STEO, the EIA forecast that the gasoline price would come in at $3.11 per gallon in the first quarter of 2026, $3.30 per gallon in the second quarter, $3.33 per gallon in the third quarter, and $3.10 per gallon in the fourth quarter. “We forecast that this summer’s inflation
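As a quick consistency check on the forecasts above, the annual averages the EIA quotes are roughly the simple mean of the four quarterly figures; a short sketch, assuming a plain average (the EIA’s published annual numbers are presumably volume-weighted, so small gaps are expected):

```python
# Cross-check: the annual averages quoted above versus a simple mean of the quarterly
# forecasts. Assumption: a plain (unweighted) average; EIA's own annual figures may be
# weighted by consumption volumes, so small differences are expected.

quarterly = {
    2025: [3.10, 3.10, 3.14, 3.01],  # Q1 average plus Q2-Q4 forecasts, $/gal (April STEO)
    2026: [2.99, 3.21, 3.22, 3.00],  # Q1-Q4 forecasts, $/gal (April STEO)
}

for year, prices in quarterly.items():
    annual = sum(prices) / len(prices)
    print(f"{year}: ${annual:.3f} per gallon")
# 2025: ~$3.088, matching the $3.09 annual figure
# 2026: ~$3.105, close to the $3.11 annual figure
```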

Read More »

Schneider Electric debuts data center consulting service

Dive Brief: Schneider Electric is rolling out a consulting service for data center operators looking to boost efficiency and maximize uptime, the company said Tuesday. Schneider says its EcoConsult for Data Centers draws on a global network of more than 250 consultants and more than 6,500 service representatives to assess customers’ data center infrastructure, including power and cooling systems, and develop asset modernization plans that increase reliability while reducing costs. “In today’s fast-paced digital environment, maintaining peak uptime while minimizing costs and extending the life of critical infrastructure is paramount,” Schneider Electric Senior Vice President for Services Jerome Soltani said in a statement.

Dive Insight: About 36% of U.S. data centers are older than 10 years and lack a “facility-wide proactive asset management strategy,” Schneider said. Schneider intends for EcoConsult for Data Centers to support those aging facilities with a medium- to long-term roadmap for increased uptime; longer IT and server infrastructure lifespans; achievable sustainability initiatives; and lower total cost of ownership, the company said. EcoConsult is future-ready, addressing both current demands and those created by future growth and technology adoption, it added. “Bringing this new level of expertise and innovation to market represents a significant step forward in our commitment to data center managers,” Soltani said.

To assess client data centers’ performance and risk, Schneider consultants conduct comprehensive site visits, checking the physical condition and operational status of power distribution systems, cooling infrastructure, cybersecurity posture and IT and server assets. Then, they develop digital maps of power and cooling systems to uncover critical issues and analyze power usage effectiveness and cooling system efficiency, identifying potential energy savings, according to a Schneider video overview. The initial consultation concludes with a “detailed, site-specific report with insights and guidance for modernizing and maintaining your data center,” with asset models and specific recommendations ranked

Read More »

EIA outlook points to more renewables, near-term decline in energy consumption

Total energy consumption in the United States will decline through 2040 before beginning to increase again, the Energy Information Administration said Tuesday in its Annual Energy Outlook. Efficiency and the report’s methodology both play a role in the decline. The report also indicates significant growth in the production of renewable electricity through 2050, alongside a decline in coal generation.

On the efficiency side, corporate average fuel economy and tailpipe emissions standards have a significant effect in the transportation sector, EIA spokesperson Chris Higginbotham said in an email. “Updated end-use equipment standards are a big factor for buildings, most notably for residential electric water heaters and natural gas furnaces,” Higginbotham added. “In the industrial sector, some increased adoption of electric boilers and industrial heat pumps contributes to flattening energy use; electricity-based equipment is typically more efficient than natural gas boilers.”

On the methodology side, Higginbotham said EIA is now using a “captured energy approach” for noncombustible renewable energy for electricity generation, which applies a constant conversion factor of 3,412 British thermal units per kilowatt-hour. “We used a fossil fuel equivalency approach in previous long-term projections,” he said. “It’s important to note the methodology change because it affects the accounting of total energy use. With the projected adoption of more renewable energy sources, electricity losses do not increase as quickly using the captured energy approach as with the fossil fuel equivalency approach we previously used.”

The annual report sees renewable energy production, including wind and solar, growing from 2.57 quads in 2024 to 13.92 quads in 2050. A quad is equal to one quadrillion British thermal units. U.S. coal production will decline from 10.26 quads in 2024 to 3.78 quads in 2050. Natural gas production will rise modestly, to about 43.5 quads in 2050, up from about
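To make the accounting change concrete, here is a minimal sketch of the two approaches described above. The 3,412 Btu/kWh captured-energy factor comes from the EIA statement; the roughly 10,000 Btu/kWh fossil-fuel-equivalency heat rate and the 750 TWh of generation are illustrative assumptions, not figures from the report.

```python
# Illustrative comparison of the two accounting methods described above.
# The 3,412 Btu/kWh captured-energy factor is from the EIA statement quoted in the text;
# the ~10,000 Btu/kWh fossil-fuel-equivalency heat rate and the 750 TWh of generation
# are illustrative assumptions.

BTU_PER_QUAD = 1e15  # one quad = one quadrillion Btu

def kwh_to_quads(kwh: float, btu_per_kwh: float) -> float:
    """Convert electricity generation in kWh to quads using a given conversion factor."""
    return kwh * btu_per_kwh / BTU_PER_QUAD

wind_solar_kwh = 750e9  # hypothetical annual noncombustible renewable generation (750 TWh)

captured = kwh_to_quads(wind_solar_kwh, 3_412)     # captured-energy approach
equivalent = kwh_to_quads(wind_solar_kwh, 10_000)  # fossil-fuel-equivalency approach

print(f"Captured energy:        {captured:.2f} quads")   # ~2.56 quads
print(f"Fossil-fuel equivalent: {equivalent:.2f} quads")  # ~7.50 quads
# The same generation counts for roughly a third as many quads under the captured-energy
# approach, which is why total energy use grows more slowly in the new projections.
```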

Read More »

Cold-weather grid performance improves, plus 4 other FERC open meeting takeaways

The power grid performed well during extremely cold weather in January, a sign that steps taken since previous winter storms are paying off with improved electric reliability, according to a report released Thursday by the Federal Energy Regulatory Commission and the North American Electric Reliability Corp. About 71 GW of generation was unexpectedly offline during winter storms in January that covered most of the United States, according to the report. About 91 GW was unexpectedly offline during Winter Storm Elliott, which mainly affected the Eastern Interconnection in December 2022.

There were no power outages in January, while grid operators instituted rolling blackouts totaling 5,400 MW during Elliott and 23,418 MW during Winter Storm Uri, which mainly affected Texas and the Southcentral U.S. in February 2021. The Texas Department of State Health Services estimates that 246 people died as a result of Uri, mainly from hypothermia as well as the use of alternate heating sources while their power was out.

Several factors appear to have contributed to the improved performance during the most recent bout of extremely cold weather, including improved communication and coordination between the natural gas and electric industries, according to the report. Among the positives during January’s bitter cold, the PJM Interconnection exported 7,650 MW during a peak demand period, according to the report. Several electric entities declared “conservative operations” earlier than in past events to delay or cancel planned transmission outages to reduce grid congestion and enhance transfer capability, FERC and NERC staff said in the report. Also, several entities surveyed by staff said battery storage helped their performance during the January arctic events, according to the report. “ERCOT stated that the rapid deployment of battery storage resulted in batteries providing 3,800 MW at peak times, alleviating stress on the grid during critical demand hours,” staff said.

Read More »

BP Ships First Greater Tortue Ahmeyim LNG Cargo

BP p.l.c. marked a major milestone at its Greater Tortue Ahmeyim liquefied natural gas (LNG) Phase 1 project offshore Mauritania and Senegal. The oil and gas major said in a media release that it loaded the first LNG cargo from the GTA project following first gas earlier this year. The initial LNG shipment at GTA marks BP’s third significant upstream project launch of the year. This is the first of ten anticipated by the conclusion of 2027, aligning with BP’s strategy to expand its upstream oil and gas operations.

“This first cargo from Mauritania and Senegal marks a significant new supply for global energy markets. Starting exports from GTA Phase 1 is an important step for BP and our oil and gas business as we celebrate the creation of a new production hub within our global portfolio”, Gordon Birrell, EVP production & operations, said. “This is the culmination of years of work from the entire project and operations teams – congratulations to all who were involved in safely reaching this landmark. I would also like to thank the governments of Mauritania and Senegal, and our partners – Kosmos Energy, PETROSEN, and SMH – for their ongoing support and collaboration”, he said.

The initial delivery of LNG was transferred from the project’s floating liquefied natural gas (FLNG) vessel situated 10 kilometers (6.2 miles) offshore, where the natural gas was cooled to a cryogenic state, liquefied, and stored, BP said. The company added that GTA stands as one of the deepest offshore projects in Africa, with gas reserves found at water depths reaching up to 2,850 meters, and has been recognized as “a project of strategic national importance” by the governments of Mauritania and Senegal. Once fully operational, GTA Phase 1 is anticipated to generate approximately 2.4 million tonnes of LNG

Read More »

Atlas Professionals Secures Crew Management Contract from Noble Corporation

Atlas Professionals B.V., a global recruitment and HR services company, has secured a Crew Management contract from Noble Corporation. Under the contract, Atlas Professionals will support the Noble Developer’s upcoming drilling campaign in Suriname. Atlas Professionals said in a media release that the award marks a significant milestone in a trusted relationship spanning over 15 years and reinforces its commitment to delivering tailored workforce solutions in complex and emerging markets. Under this agreement, Atlas will supply junior and expatriate crew for the campaign, as well as extensive logistics, training, and immigration support, all managed from its Paramaribo, Suriname office.

“It’s great to be able to support Noble Corporation with this project in Suriname”, Lourdes Landa, Global Network Development Director, said. “Our long history of working together means we understand exactly what Noble needs in emerging markets like Suriname. Thanks to our global infrastructure and local set-up in the region, we can deliver a truly local solution backed by international standards”.

Under the contract, Atlas said it will engage local Surinamese crew and focus on cultivating homegrown talent via its acclaimed ‘Greenhand Offshore’ Program, offering organized onboarding and career opportunities for newcomers in the offshore sector.

“Atlas has invested significantly in establishing legal entities and offices in Guyana, Colombia, Mexico, Suriname, and Trinidad”, Chris Boardman, General Manager at Atlas Professionals, said. “We’ve made it a priority to support our clients’ increasing activity in this region, while also ensuring that local communities benefit from the growth in the offshore industry. We thrive in environments that others may find too difficult or too complex – that’s where we add the most value”.

Read More »

Intel sells off majority stake in its FPGA business

Altera will continue offering field-programmable gate array (FPGA) products across a wide range of use cases, including automotive, communications, data centers, embedded systems, industrial, and aerospace. “People were a bit surprised at Intel’s sale of the majority stake in Altera, but they shouldn’t have been. Lip-Bu indicated that shoring up Intel’s balance sheet was important,” said Jim McGregor, chief analyst with Tirias Research.

The Altera sale has been in the works for a while and is a relic of past mistakes by Intel to try to acquire its way into AI, whether it was through FPGAs or other accelerators like Habana or Nervana, noted Anshel Sag, principal analyst with Moor Insights & Strategy. “Ultimately, the 50% haircut on the valuation of Altera is unfortunate, but again is a demonstration of Intel’s past mistakes. I do believe that finishing the process of spinning it out does give Intel back some capital and narrows the company’s focus,” he said.

So where did it go wrong? It wasn’t with FPGAs, because AMD is making a good run of it with its Xilinx acquisition. The fault, analysts say, lies with Intel, which has a terrible track record when it comes to acquisitions. “Altera could have been a great asset to Intel, just as Xilinx has become a valuable asset to AMD. However, like most of its acquisitions, Intel did not manage Altera well,” said McGregor.

Read More »

Intelligence at the edge opens up more risks: how unified SASE can solve it

In an increasingly mobile and modern workforce, smart technologies such as AI-driven edge solutions and the Internet of Things (IoT) can help enterprises improve productivity and efficiency—whether to address operational roadblocks or respond faster to market demands. However, new solutions also come with new challenges, mainly in cybersecurity. The decentralized nature of edge computing—where data is processed, transmitted, and secured closer to the source rather than in a data center—has presented new risks for businesses and their everyday operations. This shift to the edge increases the number of exposed endpoints and creates new vulnerabilities as the attack surface expands. Enterprises will need to ensure their security is watertight in today’s threat landscape if they want to reap the full benefits of smart technologies at the edge.

Bypassing the limitations of traditional network security

For the longest time, enterprises have relied on traditional network security approaches to protect their edge solutions. However, these methods are becoming increasingly insufficient as they typically rely on static rules and assumptions, making them inflexible and predictable for malicious actors to circumvent. While effective in centralized infrastructures like data centers, traditional network security models fall short when applied to the distributed nature of edge computing. Instead, organizations need to adopt more adaptive, decentralized, and intelligent security frameworks built with edge deployments in mind.

Traditional network security typically focuses on keeping out external threats. But today’s threat landscape has evolved significantly, with threat actors leveraging AI to launch advanced attacks such as genAI-driven phishing, sophisticated social engineering attacks, and malicious GPTs. Combined with the lack of visibility with traditional network security, a cybersecurity breach could remain undetected until it’s too late, resulting in consequences extending far beyond IT infrastructures.

Next generation of enterprise security with SASE

As organizations look into implementing new technologies to spearhead their business, they

Read More »

Keysight tools tackle data center deployment efficiency

Test and performance measurement vendor Keysight Technologies has developed Keysight Artificial Intelligence (KAI) to identify performance inhibitors affecting large GPU deployments. It emulates workload profiles, rather than using actual resources, to pinpoint performance bottlenecks. Scaling AI data centers requires testing throughout the design and build process – every chip, cable, interconnect, switch, server, and GPU needs to be validated, Keysight says. From the physical layer through the application layer, KAI is designed to identify weak links that degrade the performance of AI data centers, and it validates and optimizes system-level performance for optimal scaling and throughput. AI providers, semiconductor fabricators, and network equipment manufacturers can use KAI to accelerate design, development, deployment, and operations by pinpointing performance issues before deploying in production.

Read More »

U.S. Advances AI Data Center Push with RFI for Infrastructure on DOE Lands

ORNL is also the home of the Center for Artificial Intelligence Security Research (CAISER), which Edmon Begoli, CAISER founding director, described as being in place to build the necessary security by defining a new field of AI research targeted at fighting future AI security risks. Also, at the end of 2024, Google partner Kairos Power started construction of its Hermes demonstration SMR in Oak Ridge. Hermes is a high-temperature reactor that uses triso-fueled pebbles and a molten fluoride salt coolant (specifically Flibe, a mix of lithium fluoride and beryllium fluoride). This demonstration reactor is expected to be online by 2027, with a production-level system becoming available in the 2030 timeframe. Also located in a remote area of Oak Ridge is the Tennessee Valley Authority’s Clinch River project, where the TVA announced a signed agreement with GE-Hitachi to plan and license a BWRX-300 small modular reactor (SMR).

On Integrating AI and Energy Production

The foregoing are just examples of ongoing projects at the sites named by the DOE’s RFI. Presuming that additional industry power, utility, and data center providers get on board with these locations, any of the 16 could be the future home of AI data centers and on-site power generation. The RFI marks a pivotal step in the U.S. government’s strategy to solidify its global dominance in AI development and energy innovation. By leveraging the vast resources and infrastructure of its national labs and research sites, the DOE is positioning the country to meet the enormous power and security demands of next-generation AI technologies. The selected locations, already home to critical energy research and cutting-edge supercomputing, present a compelling opportunity for industry stakeholders to collaborate on building integrated, sustainable AI data centers with dedicated energy production capabilities. With projects like Oak Ridge’s pioneering SMRs and advanced AI security

Read More »

Generac Sharpens Focus on Data Center Power with Scalable Diesel and Natural Gas Generators

In a digital economy defined by constant uptime and explosive compute demand, power reliability is more than a design criterion—it’s a strategic imperative. In response to such demand, Generac Power Systems, a company long associated with residential backup and industrial emergency power, is making an assertive move into the heart of the digital infrastructure sector with a new portfolio of high-capacity generators engineered for the data center market. Unveiled this week, Generac’s new lineup includes five generators ranging from 2.25 MW to 3.25 MW. These units are available in both diesel and natural gas configurations, and form part of a broader suite of multi-asset energy systems tailored to hyperscale, colocation, enterprise, and edge environments. The product introductions expand Generac’s commercial and industrial capabilities, building on decades of experience with mission-critical power in hospitals, telecom, and manufacturing, now optimized for the scale and complexity of modern data centers. “Coupled with our expertise in designing generators specific to a wide variety of industries and uses, this new line of generators is designed to meet the most rigorous standards for performance, packaging, and after-treatment specific to the data center market,” said Ricardo Navarro, SVP & GM, Global Telecom and Data Centers, Generac. Engineering for the Demands of Digital Infrastructure Each of the five new generators is designed for seamless integration into complex energy ecosystems. Generac is emphasizing modularity, emissions compliance, and high-ambient operability as central to the offering, reflecting a deep understanding of the real-world challenges facing data center operators today. The systems are built around the Baudouin M55 engine platform, which is engineered for fast transient response and high operating temperatures—key for data center loads that swing sharply under AI and cloud workloads. The M55’s high-pressure common rail fuel system supports low NOx emissions and Tier 4 readiness, aligning with the most

Read More »

CoolIT and Accelsius Push Data Center Liquid Cooling Limits Amid Soaring Rack Densities

The CHx1500’s construction reflects CoolIT’s 24 years of DLC experience, using stainless-steel piping and high-grade wetted materials to meet the rigors of enterprise and hyperscale data centers. It’s also designed to scale: not just for today’s most power-hungry processors, but for future platforms expected to surpass today’s limits. Now available for global orders, the CHx1500 comes with full lifecycle support from CoolIT in over 75 countries, including system design, installation, CDU-to-server certification, and maintenance services—critical ingredients as liquid cooling shifts from high-performance niche to a requirement for AI infrastructure at scale.

Capex Follows Thermals: Dell’Oro Forecast Signals Surge In Cooling and Rack Power Infrastructure

Between Accelsius and CoolIT, the message is clear: direct liquid cooling is stepping into its maturity phase, with products engineered not just for performance, but for mass deployment. Still, technology alone doesn’t determine the pace of adoption. The surge in thermal innovation from Accelsius and CoolIT isn’t happening in a vacuum. As the capital demands of AI infrastructure rise, the industry is turning a sharper eye toward how data center operators account for, prioritize, and report their AI-driven investments.

To wit: According to new market data from Dell’Oro Group, the transition toward high-power, high-density AI racks is now translating into long-term investment shifts across the data center physical layer. Dell’Oro has raised its forecast for the Data Center Physical Infrastructure (DCPI) market, predicting a 14% CAGR through 2029, with total revenue reaching $61 billion. That revision stems from stronger-than-expected 2024 results, particularly in the adoption of accelerated computing by both Tier 1 and Tier 2 cloud service providers. The research firm cited three catalysts for the upward adjustment: Accelerated server shipments outpaced expectations. Demand for high-power infrastructure is spreading to smaller hyperscalers and regional clouds. Governments and Tier 1 telecoms are joining the buildout effort, reinforcing AI as a
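For context on what the Dell’Oro forecast implies, a quick back-of-the-envelope calculation, assuming the 14% CAGR compounds annually from a 2024 base through 2029 (the excerpt gives only the 2029 endpoint, so the base year and compounding convention are assumptions):

```python
# Back-of-the-envelope: what a 14% CAGR reaching $61B by 2029 implies for the DCPI market.
# Assumption: growth compounds annually from a 2024 base, which the excerpt does not state.

CAGR = 0.14
TARGET_2029_BILLION = 61.0  # Dell'Oro forecast cited above, $ billions
YEARS = 2029 - 2024

implied_2024_base = TARGET_2029_BILLION / (1 + CAGR) ** YEARS
print(f"Implied 2024 DCPI revenue: ~${implied_2024_base:.1f}B")  # roughly $31.7B

# Year-by-year trajectory under the same assumption
for year in range(2024, 2030):
    revenue = implied_2024_base * (1 + CAGR) ** (year - 2024)
    print(year, f"${revenue:.1f}B")
```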

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023.

Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads.

Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech.

The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd).

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved.

“Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail.

Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more.

The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »