
USA Natural Gas Hits Record Weekly Gain

(Update) January 23, 2026, 8:33 PM GMT: Article updated.

US natural gas futures for February delivery surged by 70% this week as the market braced for a historic winter storm that’s poised to send temperatures plummeting and boost demand for the heating fuel.

Futures settled up 4.6%, or by 23 cents, to $5.275 per million British thermal units. That settlement was up by $2.172 per million Btu from the end of last week and capped the largest weekly percentage gain by far in records going back to 1990.
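As a quick arithmetic cross-check of those figures (our sketch, not part of the original report), the settlement and the weekly change imply last week's closing level and confirm the roughly 70% move:

```rust
fn main() {
    // Figures from the article, in dollars per million Btu.
    let settle = 5.275_f64; // Friday settlement
    let weekly_change = 2.172; // gain since the end of last week

    // Implied close at the end of last week.
    let prior_close = settle - weekly_change; // ≈ 3.103
    let weekly_gain_pct = weekly_change / prior_close * 100.0;

    println!("prior week close: ${prior_close:.3}/MMBtu");
    println!("weekly gain: {weekly_gain_pct:.0}%"); // ≈ 70%
}
```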

Natural gas prices for near-term delivery at regional trading hubs across the US also jumped. So-called cash prices for gas at the benchmark Henry Hub in Louisiana to be delivered over the weekend surged on Friday to higher than $28 per million British thermal units, according to traders. That compares with $8.42 on Thursday.

Spot prices at the SoCal Citygate hub in California traded as high as $8 per million Btu as gas volumes delivered via pipeline from the Permian Basin in West Texas to the West Coast have likely been reduced, traders said. That’s up from $4.42 on Thursday. Spot prices at the Houston Ship Channel were trading at $30 per million Btu on Friday morning, traders said. The Transco zone 6 non-New York index, an indicator for prices from Baltimore to New Jersey, traded Friday afternoon at $58 per million Btu, traders said.

This week’s surge has been driven by forecasts for below-normal temperatures across most of the country, threatening to boost gas consumption and drain inventories. The freeze — particularly in the southern gas-producing states — has raised concerns about water icing in pipelines, potentially disrupting output starting this weekend. One energy consultancy, Energy Aspects, raised its forecast for how much production will be lost to so-called freeze-offs over the next 14 days to 86.4 billion cubic feet, equivalent to 5.5% of US gas production, from 63.7 bcf on Thursday.
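Those freeze-off figures can be sanity-checked the same way; in the sketch below, the implied total US production is an inference from the two stated numbers, not a figure from the article:

```rust
fn main() {
    let lost_bcf = 86.4_f64; // forecast freeze-off loss over the next 14 days
    let days = 14.0;
    let share = 0.055; // stated as 5.5% of US gas production

    let daily_loss = lost_bcf / days;
    // Total production consistent with both stated figures (an inference).
    let implied_production = daily_loss / share;

    println!("average loss: {daily_loss:.1} bcf/day"); // ≈ 6.2
    println!("implied US output: {implied_production:.0} bcf/day"); // ≈ 112
}
```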

The shift in US weather forecasts came days after hedge funds turned more bearish on gas at the end of last week, leaving the market poised for a rally as traders rushed to close out those wagers. Gas prices briefly climbed above $5.50 per million Btu on Thursday, a level that a Citigroup Inc. analysis the same day showed would wipe out all shorts.




Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Nvidia is still working with suppliers on RAM chips for Rubin

Nvidia changed its requirements for suppliers of the next generation of high-bandwidth memory, HBM4, but is close to certifying revised chips from Samsung Electronics for use in its AI systems, according to reports. Nvidia revised its specifications for memory chips for its Rubin platform in the third quarter of 2025,

Read More »

Storage shortage may cause AI delays for enterprises

Higher prices ahead: All indicators point to a steep price increase for memory and storage in 2026. Brad Gastwirth, for example, says he met with many of the most important players in the market at CES earlier this month, and his analysis suggests there will be a 50% or more

Read More »

Rust 1.93 updates bundled musl library to boost networking

The Rust team has unveiled Rust 1.93, the latest version of the programming language designed to create fast and safe system-level software. This release improves operations involving the DNS resolver for the musl implementation of the C standard library. Linux binaries are expected to be more reliable for networking as
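As an illustration of where such a change surfaces (a minimal sketch of our own, not from the Rust release notes), a plain standard-library hostname lookup on a musl target exercises the bundled resolver:

```rust
use std::net::ToSocketAddrs;

fn main() -> std::io::Result<()> {
    // On a musl target (e.g. x86_64-unknown-linux-musl), this lookup
    // runs through the libc resolver bundled with the Rust toolchain,
    // so an update to the bundled musl changes DNS behavior without
    // any change to this code.
    for addr in "example.com:443".to_socket_addrs()? {
        println!("resolved: {addr}");
    }
    Ok(())
}
```

Compiled with `--target x86_64-unknown-linux-musl`, the static binary carries the toolchain's bundled musl, so a rebuild picks up the updated resolver with no source changes.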

Read More »

EIA Sees Glut Widening in 2026

World petroleum and other liquid fuels production will outweigh consumption by 2.83 million barrels per day in 2026. That’s according to the U.S. Energy Information Administration’s (EIA) January Short-Term Energy Outlook (STEO), which projected that global petroleum and other liquid fuels production and consumption will average 107.65 million barrels per day and 104.82 million barrels per day, respectively, this year. A quarterly breakdown included in the EIA’s latest STEO projected that production will average 106.93 million barrels per day in the first quarter of 2026, 107.52 million barrels per day in the second quarter, 107.92 million barrels per day in the third quarter, and 108.24 million barrels per day in the fourth quarter. Another quarterly breakdown in the EIA’s January STEO forecast that consumption will come in at 103.36 million barrels per day in the first quarter of this year, 104.86 million barrels per day in the second quarter, 105.66 million barrels per day in the third quarter, and 105.38 million barrels per day in the fourth quarter. The EIA’s latest STEO showed that world petroleum and other liquid fuels production outweighed consumption by 2.59 million barrels per day in 2025. In this STEO, the EIA highlighted that output averaged 103.67 million barrels per day in the first quarter of 2025, 105.21 million barrels per day in the second quarter, 107.88 million barrels per day in the third quarter, and 108.30 million barrels per day in the fourth quarter. The STEO showed that consumption came in at 101.96 million barrels per day in the first quarter of last year, 103.68 million barrels per day in the second quarter, 104.55 million barrels per day in the third quarter, and 104.52 million barrels per day in the fourth quarter. Looking ahead to 2027 in its January STEO, the EIA
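The quarterly breakdowns are consistent with the annual averages; a short sketch of the arithmetic (ours, with the final digit subject to rounding):

```rust
fn main() {
    // EIA January STEO forecasts for 2026, million barrels per day.
    let production = [106.93_f64, 107.52, 107.92, 108.24];
    let consumption = [103.36_f64, 104.86, 105.66, 105.38];

    let avg = |q: &[f64]| q.iter().sum::<f64>() / q.len() as f64;
    let (p, c) = (avg(&production), avg(&consumption));

    println!("production average: {p:.2}");  // ≈ 107.65, as reported
    println!("consumption average: {c:.2}"); // ≈ 104.82, as reported
    // Matches the reported 2.83 million bpd glut within rounding.
    println!("implied surplus: {:.2}", p - c);
}
```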

Read More »

The new equation for energy security in 2026

Ken West is president and CEO of Honeywell Process Technology. Over the past year, the global energy landscape has shifted in meaningful ways. Liquefied natural gas, and natural gas overall, gained momentum, while growth in areas like petrochemicals and renewables slowed. Global policy developments shaped how governments and companies make investment and innovation decisions, reinforcing the need for stable, long-term policy frameworks to support next-generation energy solutions. What became clear in 2025 is that the future will require more energy supply from all forms of energy, which includes making traditional fuels more efficient while adding innovative alternative fuels to the mix. Growing energy demand and evolving regulations will continue to shape 2026. Meeting the global need for more power means expanding energy infrastructure worldwide, which is becoming more challenging. As these systems become increasingly complex, digitalization is emerging as the connective tissue that links infrastructure, data and people. The next phase of progress will be defined by how technology is deployed — not only to cut costs and boost efficiency, but to enable intelligent operations, advance modular solutions and strengthen energy security for responsible growth. Here are four strategic shifts that will shape the energy sector in 2026. Digital solutions help meet rising energy demand: Looking ahead, connected solutions — where AI-powered tools are integrated with digitally connected infrastructure — will accelerate the shift of energy companies toward intelligent operations. These tools help strengthen and augment workforce capabilities, extract more value from existing assets, and position energy leaders to meet rising demand with greater confidence. Early adopters are already seeing measurable gains in reliability and workforce efficiency. Skilled labor constraints remain a challenge as experienced workers retire and turnover stays high. To address this, energy companies are turning to digital solutions that upskill the workforce, maintain reliability and scale operations.

Read More »

Enverus Releases ‘Top Public Onshore E&P Cos in USA’ List

In a statement sent to Rigzone by the Enverus team recently, Enverus announced that it has released its annual list of “the top public onshore exploration and production companies in the U.S. based on gross operated production last year”. According to the list, which was seen by Rigzone and is titled Top Public Lower 48 Onshore E&P Companies of 1H25 by Gross Operated Production, ExxonMobil ranked first with 1.951 million barrels of oil equivalent per day, followed by Expand Energy, with 1.754 million barrels of oil equivalent per day, and ConocoPhillips, with 1.420 million barrels of oil equivalent per day. Occidental Petroleum was ranked fourth, with 1.296 million barrels of oil equivalent per day, and EOG Resources came in fifth, with 1.250 million barrels of oil equivalent per day, the list showed. ExxonMobil’s production had an oil weighting of 53 percent and its active rig count came in at 37, according to Enverus’ list, which highlighted that the company ranked first in 2024. Expand Energy’s production had an oil weighting of one percent and its active rig count came in at eight, the list highlighted. The company ranked second in 2024, the list pointed out. In terms of oil weighted production, ConocoPhillips came in at 62 percent, Occidental Petroleum came in at 57 percent, and EOG Resources came in at 54 percent, the list revealed. In terms of active rig count, ConocoPhillips had 33, Occidental Petroleum had 30, and EOG Resources had 23, the list showed. ConocoPhillips ranked sixth in last year’s list, Occidental Petroleum was third, and EOG Resources was fourth, Enverus’ list highlighted. The list showed the Permian basin as the “primary Enverus region” of ExxonMobil, ConocoPhillips, and Occidental Petroleum. It pointed out that the “primary Enverus region” of Expand Energy was the Eastern United States and that

Read More »


ContourGlobal Enters Greek Renewables Market

ContourGlobal said it has acquired producing solar farms with a combined capacity of about 37 megawatts (MW) and an under-development battery energy storage portfolio of 500 MW/2,000 megawatt hours (MWh) in Greece, marking the London-based green power developer’s entry into the country. “Although Greece is a relatively small power market in Europe with 24 GW [gigawatts] of installed capacity, it is the second-largest in the Balkan region and is experiencing rapid growth in renewables”, ContourGlobal said in a press release. “While historically dominated by lignite and gas, the country is accelerating renewable development, with solar and wind capacity projected to reach around 60 GW by 2060. “At the end of the current decade, nearly 70 percent of installed capacity and electricity generation are expected to come from renewable sources, increasing the need for flexible, grid-scale storage. Moreover, the country is projected to become a net exporter of electricity from 2026”. The solar purchase from QuestEnergy of Greece’s Quest Holdings comprised 26 plants commissioned between 2011 and 2022 across the Mediterranean country, ContourGlobal said. “[T]he assets are fully contracted under FiT (Feed-in Tariff) and FiP (Feed-in Premium) supporting schemes, providing both predictable and premium revenues”, it said. “The portfolio’s expected yearly generation is approximately 51 GWh, enough electricity to power up to 15,000 Greek households per year”. “The acquisition of the Quest portfolio was one of the first transactions to be filed and approved unconditionally under the new Foreign Direct Investment (FDI) directive in Greece”, ContourGlobal added. “The FDI approval process places strong emphasis on the review of acquisitions of critical assets in energy and infrastructure through foreign investors”. Additionally, ContourGlobal acquired battery energy storage system (BESS) projects across the country through transactions with Greece’s Zephiros and Spain’s Fotowatio Renewable Ventures. “The first project, Taxiarches, is a 100 MW/400 MWh ready-to-build

Read More »

Analyst Explains ‘Massive Rise’ in USA NatGas Prices Today

In an exclusive interview with Rigzone on Monday, Ole R. Hvalbye, a commodities analyst at Skandinaviska Enskilda Banken AB (SEB), said the “massive rise” in U.S. natural gas prices today is “primarily driven by weather risk and thus tightening short-term balances, hence a continuation of the market movements we witnessed last week”. Hvalbye highlighted in the interview that “weather has re-entered the equation in force”, noting that “forecasts show a broad cold spell across large parts of the U.S., particularly the Midwest and East, with temperatures as low as 10 degrees below the 30-year normal for the end of this week”. “That has materially lifted heating demand expectations at a time when the market is already sensitive,” he added. “This is clearly reflected in Henry Hub, which has jumped to another level, and is now trading around $6.2 per MMBtu [million British thermal units], up more than 100 percent from … [around] $3 per MMBtu on January 16”, he continued. Citing Bloomberg data, Hvalbye highlighted to Rigzone that around “10 percent of U.S. gas production is estimated to be offline”. “At the same time, LNG terminal feedgas flows dropped largely to 11.1 Bcfpd [billion cubic feet per day] yesterday and are estimated at … [around] 12.5 Bcfpd today, which is well below the 18.58 Bcfpd average last week,” he said. Hvalbye went on to tell Rigzone that storage is adding some fuel to the move higher in U.S. natural gas. “U.S. inventories have been drawing faster than normal lately, and while stocks are not critically low, the drawdown rate has increased fast,” he noted. Finally, Hvalbye told Rigzone that positioning is amplifying the price action. “The market was leaning bearish after a long period of low prices, and today’s move has been intensified by short-covering (just like the spike last
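The percentages in the interview line up with the quoted levels; as a rough check (our arithmetic on the stated figures, not the analyst's):

```rust
fn main() {
    // Henry Hub prices, $ per MMBtu, per the interview.
    let now = 6.2_f64;
    let jan_16 = 3.0;
    let rise_pct = (now - jan_16) / jan_16 * 100.0;
    println!("rise since Jan 16: {rise_pct:.0}%"); // ≈ 107%, i.e. more than 100 percent

    // LNG terminal feedgas flows, bcf per day.
    let today = 12.5_f64;
    let last_week_avg = 18.58;
    let shortfall_pct = (last_week_avg - today) / last_week_avg * 100.0;
    println!("feedgas below last week's average by {shortfall_pct:.0}%"); // ≈ 33%
}
```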

Read More »

Intel’s AI pivot could make lower-end PCs scarce in 2026

However, he noted, “CPUs are not being cannibalized by GPUs. Instead, they have become ‘chokepoints’ in AI infrastructure.” For instance, CPUs such as Granite Rapids are essential in GPU clusters, and for handling agentic AI workloads and orchestrating distributed inference. How pricing might increase for enterprises: Ultimately, a rapid rise in demand for higher-end offerings resulted in foundry shortages of Intel 10/7 nodes, Bickley noted, which represent the bulk of the company’s production volume. He pointed out that it can take up to three quarters for new server wafers to move through the fab process, so Intel will be “under the gun” until at least Q2 2026, when it projects an increase in chip production. Meanwhile, manufacturing capacity for Xeon is currently sold out for 2026, with varying lead times by distributor, while custom silicon programs are seeing lead times of 6 to 8 months, with some orders rolling into 2027, Bickley said. In the data center, memory is the key bottleneck, with expected price increases of more than 65% year over year in 2026 and up to 25% for NAND Flash, he noted. Some specific products have already seen price inflation of over 1,000% since 2025, and new greenfield capacity for memory is not expected until 2027 or 2028. Moor’s Sag was a little more optimistic, forecasting that, on the client side, “memory prices will probably stabilize this year until more capacity comes online in 2027.” How enterprises can prepare: Supplier diversification is the best solution for enterprises right now, Sag noted. While it might make things more complex, it also allows data center operators to better absorb price shocks because they can rebalance against suppliers who have either planned better or have more resilient supply chains.

Read More »

Reports of SATA’s demise are overblown, but the technology is aging fast

The SATA 1.0 interface made its debut in 2003. It was developed by a consortium consisting of Intel, Dell, and storage vendors like Seagate and Maxtor. It quickly advanced to SATA III in 2009, but there never was a SATA IV. There was just nibbling around the edges with incremental updates as momentum and emphasis shifted to PCI Express and NVMe. So is there any life to be had in the venerable SATA interface? Surprisingly, yes, say the analysts. “At a high level, yes, SATA for consumer is pretty much a dead end, although if you’re storing TB of photos and videos, it is still the least expensive option,” said Bob O’Donnell, president and chief analyst with TECHnalysis Research. Similarly, for enterprises with massive storage demands, the 20 and 30 TB SATA drives from companies like Seagate and WD are apparently still in wide use in cloud data centers for things like cold storage. “In fact, both of those companies are seeing record revenues based, in part, on the demand for these huge, high-capacity low-cost drives,” he said. “SATA doesn’t make much sense anymore. It underperforms NVMe significantly,” said Rob Enderle, principal analyst with The Enderle Group. “It really doesn’t make much sense to continue to make it given Samsung allegedly makes three to four times more margin on NVMe.” And like O’Donnell, Enderle sees continued life for SATA-based high-capacity hard drives. “There will likely be legacy makers doing SATA for some time. IT doesn’t flip technology quickly and SATA drives do wear out, so there will likely be those producing legacy SATA products for some time,” he said.

Read More »

DCN becoming the new WAN for AI-era applications

“DCN is increasingly treated as an end-to-end operating model that standardizes connectivity, security policy enforcement, and telemetry across users, the middle mile, and cloud/application edges,” Sanchez said. Dell’Oro defines DCN as platforms and services that deliver consistent connectivity, policy enforcement, and telemetry from users, across the WAN, to distributed cloud and application edges spanning branch sites, data centers and public clouds. The category is gaining relevance as hybrid architectures and AI-era traffic patterns increase the operational penalty for fragmented control planes. DCN buyers are moving beyond isolated upgrades and are prioritizing architectures that reduce operational seams across connectivity, security and telemetry so that incident response and change control can follow a single thread, according to Dell’Oro’s research. What makes DCN distinct is that it links user-to-application experience with where policy and visibility are enforced. This matters as application delivery paths become more dynamic and workloads shift between on-premises data centers, public cloud, and edge locations. The architectural requirement is eliminating handoffs between networking and security teams rather than optimizing individual network segments. Where DCN is growing the fastest: Cloud/application edge is the fastest-growing DCN pillar. This segment deploys policy enforcement and telemetry collection points adjacent to workloads rather than backhauling traffic to centralized security stacks. “Multi-cloud remains a reality, but it is no longer the durable driver by itself,” Sanchez said. “Cloud/application edge is accelerating because enterprises are trying to make application paths predictable and secure across hybrid environments, and that requires pushing application-aware steering, policy enforcement, and unified telemetry closer to workloads.”

Read More »

Edged US Builds Waterless, High-Density AI Data Center Campuses at Scale

Edged US is targeting a narrow but increasingly valuable lane of the hyperscale AI infrastructure market: high-density compute delivered at speed, paired with a sustainability posture centered on waterless, closed-loop cooling and a portfolio-wide design PUE target of roughly 1.15. Two recent announcements illustrate the model. In Aurora, Illinois, Edged is developing a 72-MW facility purpose-built for AI training and inference, with liquid-to-chip cooling designed to support rack densities exceeding 200 kW. In Irving, Texas, a 24-MW campus expansion combines air-cooled densities above 120 kW per rack with liquid-to-chip capability reaching 400 kW. Taken together, the projects point to a consistent strategy: standardized, multi-building campuses in major markets; a vertically integrated technical stack with cooling at its core; and an operating model built around repeatable designs, modular systems, and readiness for rapidly escalating AI densities. A Campus-First Platform Strategy: Edged US’s platform strategy is built around campus-scale expansion rather than one-off facilities. The company positions itself as a gigawatt-scale, AI-ready portfolio expanding across major U.S. metros through repeatable design targets and multi-building campuses: an emphasis that is deliberate and increasingly consequential. In Chicago/Aurora, Edged is developing a multi-building campus with an initial facility already online and a second 72-MW building under construction. Dallas/Irving follows the same playbook: the first facility opened in January 2025, with a second 24-MW building approved unanimously by the city. Taken together with developments in Atlanta, Chicago, Columbus, Dallas, Des Moines, Kansas City, and Phoenix, the footprint reflects a portfolio-first mindset rather than a collection of bespoke sites. This focus on campus-based expansion matters because the AI factory era increasingly rewards developers that can execute three things at once: lock down power and land at scale; standardize delivery across markets; and operate efficiently while staying aligned with community and regulatory expectations. Edged is explicitly selling the second

Read More »

CBRE’s 2026 Data Center Outlook: Demand Surges as Delivery Becomes the Constraint

The U.S. data center market is entering 2026 with fundamentals that remain unmatched across commercial real estate, but the nature of the dominant constraint has shifted. Demand is no longer gated by capital, connectivity, or even land. It is gated by the ability to deliver very large blocks of power, on aggressive timelines, at a predictable cost. According to the CBRE 2026 U.S. Real Estate Market Outlook, overseen by Gordon Dolven and Pat Lynch, the sector is on track to post another record year for leasing activity, even as vacancy remains at historic lows and pricing reaches all-time highs. What has changed is the scale at which demand now presents itself, and the difficulty of meeting it. Large-Block Leasing Rewrites the Economics: AI-driven workloads are reshaping leasing dynamics in ways that break from prior hyperscale norms. Where 10-MW-plus deployments once commanded pricing concessions, CBRE now observes the opposite behavior: large, contiguous blocks of capacity are commanding premiums. Neocloud providers, GPU-as-a-service platforms and AI startups, many backed by aggressive capital deployment strategies, are actively competing for full-building and campus-scale capacity. For operators, this is altering development and merchandising strategies. Rather than subdividing shells for flexibility, owners increasingly face a strategic choice: hold buildings intact to preserve optionality for single-tenant, high-density users who are willing to pay for scale. In effect, scale itself has become the scarce asset. Behind-the-Meter Power Moves to the Foreground: For data centers, power availability, meaning not just access but certainty of delivery, is now the defining variable in the market. CBRE notes accelerating adoption of behind-the-meter strategies as operators seek to bypass increasingly constrained utility timelines. On-site generation using natural gas, solar, wind, and battery storage is gaining traction, particularly in deregulated electricity markets where operators have more latitude to structure BYOP (bring your own power) solutions.

Read More »

Blue Origin targets enterprise networks with a multi-terabit satellite connectivity plan

“It’s ideal for remote, sparse, or sensitive regions,” said Manish Rawat, analyst at TechInsights. “Key use cases include cloud-to-cloud links, data center replication, government, defense, and disaster recovery workloads. It supports rapid or temporary deployments and prioritizes fewer customers with high capacity, strict SLAs, and deep carrier integration.” Adoption, however, is expected to largely depend on the sector. For governments and organizations operating highly critical or sensitive infrastructure, where reliability and security outweigh cost considerations, this could be attractive as a redundancy option. “Banks, national security agencies, and other mission-critical operators may consider it as an alternate routing path,” Jain said. “For most enterprises, however, it is unlikely to replace terrestrial connectivity and would instead function as a supplementary layer.” Real-world performance: Although satellite connectivity offers potential advantages, analysts note that questions remain around real-world performance. “TeraWave’s 6 Tbps refers to total constellation capacity, not per-user throughput, achieved via multiple optical inter-satellite links and ground gateways,” Rawat said. “Optical crosslinks provide high aggregate bandwidth but not a single terabit-class pipe. Performance lies between fiber and GEO satellites, with lower intercontinental latency than GEO but higher than fiber.” Operational factors could also affect network stability. Jitter is generally low, but handovers, rerouting, and weather conditions can introduce intermittent performance spikes. Packet loss is expected to remain modest but episodic, Rawat added.

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
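For a rough sense of scale (our arithmetic on the figures above; note that the $62.4 billion calendar-year estimate and the $80 billion fiscal-year figure cover different 12-month windows, so they are not directly comparable):

```rust
fn main() {
    // Dollar figures from the article, in billions.
    let combined_2023 = 110.0_f64; // six big tech firms, per Bloomberg Intelligence
    let combined_2025 = 200.0;
    let growth_pct = (combined_2025 - combined_2023) / combined_2023 * 100.0;
    println!("combined capex growth, 2023 to 2025: {growth_pct:.0}%"); // ≈ 82%

    let msft_2020 = 17.6_f64; // Microsoft capex, 2020
    let msft_fy2025 = 80.0; // fiscal year ending June 30, 2025
    println!("Microsoft capex vs 2020: {:.1}x", msft_fy2025 / msft_2020); // ≈ 4.5x
}
```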

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based company has been in business for 187 years, yet it has been a regular at the big tech trade show in Las Vegas as a non-tech company showing off technology, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation. AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends: It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »