How retail energy suppliers can navigate the new federal energy policies

Nainish Gupta is director of REC portfolio and regulatory compliance at POWWR, an energy software company.

After years of momentum toward widespread renewable energy adoption, presidential executive orders are pausing federal investments in new wind and solar projects, slowing the development of clean energy supply chains and manufacturing. The leasing of federal lands for renewables has also been rolled back, limiting the expansion of large-scale projects. In the first quarter of 2025, clean energy manufacturers canceled, closed or downsized nearly $8 billion in projects. Tariffs are also making solar panel installations more expensive, since about 75% of panels come from China.

At the same time, the policy changes benefit the expansion of fossil fuels: they lift restrictions on oil, gas and mineral production in Alaska and expand drilling in places like the Arctic National Wildlife Refuge. These moves open new areas for development and revive the natural gas industry, at a potential cost to wildlife and natural habitats. Streamlining the approval process for natural gas pipelines could also speed up construction, but it risks shortchanging environmental assessments.

Retail energy suppliers are already feeling the effects of increased price volatility and rising compliance costs. To navigate this new reality, suppliers need to focus on three smart strategies: buy renewable energy certificates (RECs) now, leverage data and predictive analytics to make more informed decisions, and grow market share.

Getting ahead of REC price increases

With fewer renewable projects on the horizon, the supply of RECs is tightening. Yet state renewable portfolio standards are not going anywhere, and will continue to require retail suppliers to acquire a certain percentage of RECs based on their region and portfolio size. That supply imbalance, paired with increased demand, all but guarantees higher prices. The best way for suppliers to mitigate risk is to begin buying RECs now, or to set money aside to start that process, ahead of the projected increases. Accumulating certificates gradually spreads financial exposure and ensures inventory is available when needed, at a lower average price.

Some states allow suppliers to hold RECs for months or years before they must be retired, giving them a chance to build up a stock. Buying RECs strategically in advance, capitalizing on periods of lower pricing, will also help suppliers blunt the rate increases passed on to customers, since prices are expected to rise by the end of this year or in early 2026.

It’s also essential to track RECs more proactively than in the past. That doesn’t necessarily require new systems, but visibility into pricing trends lets suppliers act when opportunities emerge. Implementing systems to manage and track RECs ensures that every purchase and retirement decision accounts for all of the information available.
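
To make that concrete, here is a minimal sketch of the kind of REC position tracking this implies, assuming a supplier records each purchase as a lot with its quantity, price, vintage and retirement deadline. The schema, prices and obligation figure are hypothetical, not any particular compliance system’s design:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RECLot:
    """One purchased block of renewable energy certificates."""
    quantity: int    # number of certificates
    price: float     # $ per certificate at purchase
    vintage: int     # generation year
    retire_by: date  # compliance deadline for retirement

def position_summary(lots: list[RECLot], obligation: int) -> dict:
    """Compare banked RECs against a compliance obligation."""
    held = sum(lot.quantity for lot in lots)
    spend = sum(lot.quantity * lot.price for lot in lots)
    return {
        "held": held,
        "avg_cost": spend / held if held else 0.0,
        "shortfall": max(obligation - held, 0),
        "next_deadline": min((lot.retire_by for lot in lots), default=None),
    }

# Two staggered purchases banked against a 10,000-REC obligation.
lots = [
    RECLot(4000, 6.50, 2024, date(2026, 6, 1)),
    RECLot(3000, 7.25, 2025, date(2027, 6, 1)),
]
print(position_summary(lots, obligation=10_000))
```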

Leverage predictive analytics and data

Policy changes will also increase price volatility in the natural gas market, making hedging decisions more critical. Suppliers should leverage predictive analytics and data to drive decision-making. Historical trends alone aren’t enough; predictive analytics is valuable for anticipating swings tied to weather patterns and regulatory shifts. Third-party software and managed services have modernized energy transactions with real-time market data, weather forecasting and load analysis to identify optimal purchase windows and lock in natural gas prices amid volatility. Access to data at this level, with the software to track it, makes the process more transparent and helps suppliers make better-informed hedging decisions, adjusting market positions efficiently as conditions change.
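
As an illustration of the underlying idea, not any vendor’s actual product: even a simple rolling statistic over daily gas prices can flag windows where the market sits unusually far below recent norms, the kind of raw signal a hedging desk would weigh alongside weather and load forecasts. All prices below are invented:

```python
import statistics

def purchase_windows(prices: list[float], window: int = 10,
                     z_thresh: float = -1.0) -> list[int]:
    """Flag days where price sits well below its trailing mean.

    Returns indices whose z-score against the prior `window` days
    falls below `z_thresh`, a crude 'buy window' signal.
    """
    flagged = []
    for i in range(window, len(prices)):
        hist = prices[i - window:i]
        mu = statistics.fmean(hist)
        sigma = statistics.stdev(hist)
        if sigma > 0 and (prices[i] - mu) / sigma < z_thresh:
            flagged.append(i)
    return flagged

# Hypothetical daily forward prices ($/MMBtu); the dip on the last
# day is flagged as a candidate purchase window.
prices = [3.4, 3.5, 3.6, 3.5, 3.7, 3.6, 3.5, 3.6, 3.7, 3.6, 3.5, 3.6, 3.1]
print(purchase_windows(prices))  # -> [12]
```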

As policy shifts increase natural gas generation and pull support away from renewables, suppliers will also need to evaluate their portfolio mix and where they are supplying customers. Under current regulation, finding the most cost-effective and reliable sourcing strategy may require greater diversity in, or a reallocation of, the energy sources in a portfolio. The more granular the insights suppliers can draw on, the easier it is to optimize portfolios and secure the best mix at the best price.
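
A toy version of that portfolio arithmetic, with invented sources and prices, shows why granular cost data matters: reallocating shares across sources visibly moves the blended cost of serving load.

```python
def blended_cost(mix: dict[str, float], prices: dict[str, float]) -> float:
    """Weighted-average supply cost ($/MWh) for a portfolio mix.

    `mix` maps source -> share of load; shares must sum to 1.
    """
    assert abs(sum(mix.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(share * prices[src] for src, share in mix.items())

# Hypothetical wholesale costs, $/MWh.
prices = {"gas": 45.0, "wind_ppa": 38.0, "solar_ppa": 42.0, "spot": 55.0}

current = {"gas": 0.50, "wind_ppa": 0.20, "solar_ppa": 0.10, "spot": 0.20}
rebalanced = {"gas": 0.40, "wind_ppa": 0.30, "solar_ppa": 0.20, "spot": 0.10}

print(f"current:    ${blended_cost(current, prices):.2f}/MWh")    # $45.30
print(f"rebalanced: ${blended_cost(rebalanced, prices):.2f}/MWh") # $43.30
```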

Increase market share early

Customer-facing technology is another area of opportunity. Smart apps and meters that deliver real-time usage data help customers manage their consumption more effectively. Features like time-of-use pricing give customers the transparency to see exactly what they are paying at a given hour and to shift usage accordingly, reducing their costs. That matters all the more with rate increases looming as REC prices and compliance costs show up on customer bills. These technologies offer customers valuable insight into their consumption habits and put some of the power to change them in their own hands. New product offerings that improve the customer experience have staying power beyond any future policy or market change, and can help suppliers keep their customer share.
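
The arithmetic behind that claim is simple to sketch. With made-up rates, a customer who shifts consumption off-peak pays less under time-of-use pricing than under a flat rate; the transparency is what makes the shift possible:

```python
def bill(usage_kwh: dict[str, float], rates: dict[str, float]) -> float:
    """Total monthly cost given kWh usage and $/kWh rates per period."""
    return sum(usage_kwh[period] * rates[period] for period in usage_kwh)

# Hypothetical rates: a flat tariff vs. a time-of-use tariff.
flat_rates = {"peak": 0.16, "off_peak": 0.16}
tou_rates = {"peak": 0.28, "off_peak": 0.09}

before = {"peak": 300.0, "off_peak": 500.0}  # kWh before shifting load
after = {"peak": 150.0, "off_peak": 650.0}   # kWh after shifting off-peak

print(f"flat:        ${bill(before, flat_rates):.2f}")  # $128.00
print(f"TOU, before: ${bill(before, tou_rates):.2f}")   # $129.00
print(f"TOU, after:  ${bill(after, tou_rates):.2f}")    # $100.50
```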

As compliance and sourcing costs increase, long-term customer contracts offer stability. One proven approach is structuring power purchase agreements (PPAs) with large, consistent users like data centers. Big technology companies, like Microsoft and Meta, are investing in massive data centers to meet the rapidly growing demands of AI development. While some retail suppliers serve a wide variety of residential and commercial customers, shifting focus toward large commercial technology users can balance a portfolio and reduce risk. These customers have predictable, around-the-clock demand. Agreements with them offer fixed or indexed pricing over multiple years, giving suppliers a predictable load to hedge against and revenue to plan around. With these long-term PPAs, market share is solidified for years, regardless of any uncertainty in federal regulations.

Building long-term resilience

While some of the current federal orders could be reversed under future administrations, these shifts will take years to play out. Restarting shuttered coal or nuclear plants takes time and money, and in many cases the talent to do so has already left the workforce. It would take years for restarted plants to meet current standards and come up to speed.

Even if clean energy incentives return in a few years, supply chains, project pipelines and market confidence will take years to recover, and the impact on energy infrastructure and investment won’t be reversed overnight. The actions suppliers take now need to build long-term resilience through continued volatility. Investing in technology that gives suppliers better insight into the market, clearer data for decision-making and transparency for customers is a smart strategy regardless of federal changes.
