Stay Ahead, Stay ONMINE

Analysts Look at Upcoming OPEC+ Meeting


In an oil report sent to Rigzone by the Skandinaviska Enskilda Banken AB (SEB) team recently, Ole R. Hvalbye, a commodities analyst at the company, noted that the next OPEC+ meeting is set for May 5, “with the proposed June output hike expected to top the agenda”.

“The group will likely choose between a scheduled, incremental increase of 138,000 barrels per day, or a more aggressive jump of 411,000 barrels per day – equivalent to … three months’ worth of increases rolled into one,” Hvalbye predicted in the report.

“The latter scenario would put downward pressure on oil prices and highlight deepening tensions within OPEC+, while also exacerbating concerns in a market already clouded by weak demand expectations,” he warned.

“Although the final decision on volumes remains unclear, OPEC+ has demonstrated it still has pricing power, and that it can pull prices lower quickly if it chooses to do so,” Hvalbye went on to state in the report.
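The two options Hvalbye describes can be sanity-checked with quick arithmetic (a sketch using only the figures quoted above; the small mismatch reflects rounding in the reported numbers):

```python
# Figures quoted in the SEB report (barrels per day).
scheduled_increase = 138_000   # the scheduled incremental monthly step
aggressive_jump = 411_000      # the "three months rolled into one" option

# Dividing the jump over three months recovers roughly one scheduled step.
implied_monthly = aggressive_jump / 3
print(implied_monthly)  # 137000.0 -- close to the 138,000 bpd schedule
```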

In a Stratas Advisors report sent to Rigzone by the Stratas team late Monday, which also highlighted the upcoming OPEC+ meeting, the company said it will be essential for OPEC+ to communicate that its members will maintain discipline and proactively align supply with demand, so as not to undermine the long-term viability of the alliance, which Stratas described in the report as “increasingly critical to the stability of the oil market”.

In a market analysis sent to Rigzone on Tuesday, Osama Al Saifi, Managing Director for MENA at Traze, said several OPEC+ members are likely to advocate for additional output hikes in the coming months, potentially exacerbating supply imbalances.

In another market analysis sent to Rigzone on Monday, Konstantinos Chrysikos, Head of Customer Relationship Management at Kudotrade, said the prospect of OPEC+ considering further increases in oil output at their upcoming meeting raises concerns over potential oversupply in the market.

A research note sent to Rigzone by Natasha Kaneva, Head of Global Commodities Strategy at J.P. Morgan, on Tuesday warned that OPEC+ “will continue to face challenges due to the growth in non-OPEC supply and capacity expansion among some alliance members”.

Rigzone has contacted OPEC for comment on the SEB report, the Stratas report, Al Saifi’s comments, Chrysikos’ comments, and the research note sent by Kaneva. At the time of writing, OPEC has not responded to Rigzone.

A release posted on OPEC’s website on April 3 announced that Saudi Arabia, Russia, Iraq, UAE, Kuwait, Kazakhstan, Algeria, and Oman met virtually on that day “to review global market conditions and outlook”.

“In view of the continuing healthy market fundamentals and the positive market outlook, and in accordance with the decision agreed upon on 5 December 2024, subsequently reaffirmed on 3 March 2025, to start a gradual and flexible return of the 2.2 million barrels per day voluntary adjustments starting from 1 April 2025, the eight participating countries will implement a production adjustment of 411,000 barrels per day, equivalent to three monthly increments, in May 2025,” the release noted.

“This comprises the increment originally planned for May in addition to two monthly increments. The gradual increases may be paused or reversed subject to evolving market conditions,” it added.

“This flexibility will allow the group to continue to support oil market stability. The eight OPEC+ countries also noted that this measure will provide an opportunity for the participating countries to accelerate their compensation,” it continued.

The release went on to state that the eight countries reaffirmed their commitment to the voluntary production adjustments agreed at the 53rd JMMC meeting on April 3, 2024.

“They also confirmed their intention to fully compensate any overproduced volume since January 2024 and to submit updated front-loaded compensation plans to the OPEC Secretariat by 15 April 2025 which will be posted on the Secretariat’s website,” it added.

“The eight OPEC+ countries will hold monthly meetings to review market conditions, conformity, and compensation,” it went on to note.

A release posted on OPEC’s site on April 16 announced that the OPEC Secretariat had received updated compensation plans from Saudi Arabia, Russia, Iraq, the United Arab Emirates, Kuwait, Kazakhstan, and Oman.

The updated compensation plans amount to 222,000 barrels per day in April, 378,000 barrels per day in May, and 431,000 barrels per day in June, a table accompanying that release outlined.

To contact the author, email [email protected]


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Pantheon of college football gets a Wi-Fi upgrade

Notre Dame has fully adopted mobile ticketing and introduced grab-and-go concession stands, with plans to expand them further. Alcohol sales were recently approved, prompting efforts to support new services like mobile carts. In premium areas, fans can stream various games during events. Notre Dame also tested mobile ordering for concessions

Read More »

The U.S. leads the world in AI (job) anxiety

The Americans have the highest search volume with a population-adjusted value of 440,000 search queries on the topic of AI job loss, while their attitude towards AI is moderately positive at 54.5%. The intensity score of 3 for the U.S. shows that the concern of losing jobs to AI is

Read More »

Tigera extends cloud-native networking with Calico 3.30

This logging capability is exposed through two new components:

Goldmane: A gRPC-based API endpoint that aggregates flow logs from Calico’s Felix component, which runs on each node.

Whisker: A web-based visualization tool built with React and TypeScript that connects to the Goldmane API.

The combination of these components provides detailed

Read More »

Northeast states to seek transmission proposals between PJM, ISO-NE, New York

Dive Brief: A group of nine Mid-Atlantic and Northeast states plans to issue a request for information to gauge the potential for building “no regrets” transmission projects between New England, New York and the PJM Interconnection, according to an action plan released on Monday.

No-regrets interregional transmission needs grow to 4 GW between PJM and New York and to 3 GW between New York and New England by 2040 — and could be higher depending on load growth and decarbonization developments, the Brattle Group said in the action plan, which the consulting firm wrote for the Northeast States Collaborative on Interregional Transmission. “We believe that there are likely low-hanging fruit, no-regrets interregional transmission projects that current transmission planning approaches are failing to identify,” Katie Dykes, commissioner of Connecticut’s Department of Energy and Environmental Protection, said during a presentation Tuesday on the action plan.

Dive Insight: The collaboration intends to break down barriers to interregional transmission development, but the effort may require state and federal legislation, according to Dykes. The states in the collaborative, which was formed in 2023, are: Connecticut, Delaware, Massachusetts, Maine, Maryland, New Jersey, New York, Rhode Island and Vermont.

“The focus is really on … identifying beneficial transmission that states can collectively pursue in order to reduce costs for our consumers and increase the security of our system, because each region going it alone … is inherently more expensive in terms of the overall build of resources, but also less secure because you do not have the benefit of the increased resource diversity that interregional transmission can help provide,” said John Bernecker, director of the Transmission Center of Excellence at the New York State Energy Research and Development Authority.

The Brattle Group recommended that the Northeast and Mid-Atlantic states work with PJM, the New York Independent System Operator

Read More »

Why customer engagement is the key to VPP success

Hannah Bascom is chief growth officer for Uplight.

As electricity demand soars — driven by AI-powered data centers, the rise of electric vehicles and beyond — the U.S. grid is feeling the strain. Experts estimate that AI alone could drive a 67 GW increase in energy demand over the next five years, a challenge the U.S. grid is not currently equipped to handle. And while new generation is in the works, building supply fast enough to keep up is a near impossibility. Fortunately, there is a massive but largely untapped resource at utilities’ disposal that doesn’t require years of lead time to start producing: customers and their growing number of distributed energy resources (DERs), including everything from residential solar to home and commercial batteries to combined heat and power systems, are at the ready. Utilities currently tap these DERs to deliver load flexibility through a variety of demand response programs, but there is significant room to expand the impact of these assets through the creation of virtual power plants (VPPs). VPPs operate a portfolio of DERs, comprising multiple technology types, as a cohesive resource to accurately predict and deliver grid flexibility. Aggregating DERs and programs into a coordinated portfolio allows VPPs to unlock substantially more flexibility and economic potential from customer-sited DERs compared to single-technology demand response programs.

By now, many are familiar with the promise of VPPs: the Department of Energy estimates that deploying 80 GW to 160 GW of VPP capacity, enough to serve 10% to 20% of peak load, by 2030 could support rapid load growth while also reducing the cost required to serve new load. Unlocking flexibility and economic potential from customer-sited DERs through VPPs will transform a grid challenge into a grid opportunity, create firm, clean capacity, reduce customer and utility spend and support our journey
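The DOE sizing cited above is internally consistent, as a quick arithmetic sketch shows (figures are the ones quoted in the passage; both ends of the range imply the same U.S. peak load of roughly 800 GW):

```python
# DOE estimate quoted above: 80-160 GW of VPP capacity would serve
# 10-20% of peak load by 2030. Back out the implied peak load.
vpp_low_gw, vpp_high_gw = 80, 160
peak_share_low, peak_share_high = 0.10, 0.20

implied_peak_low = vpp_low_gw / peak_share_low    # ~800 GW
implied_peak_high = vpp_high_gw / peak_share_high  # ~800 GW
print(implied_peak_low, implied_peak_high)
```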

Read More »

Wood misses annual results deadline, shares suspended

Aberdeen-headquartered oilfield services firm Wood is facing a temporary suspension from the London stock market after missing a deadline to publish its 2024 financial results. Wood announced in a statement today that it would miss a 30 April deadline to publish its results due to the “extensive work needed” to complete an audit of its accounts. As a result, Wood confirmed its shares will be temporarily suspended from listing and trading from 1 May until it publishes the 2024 results.

Wood reported a $983 million half-year loss last August amid a slump in revenues. It comes after an independent review commissioned by the firm identified “material weaknesses and failures” in Wood’s financial culture. The review came as further embarrassment for the embattled company after its former chief financial officer Arvind Balan resigned and its share price plummeted.

The missed deadline also comes amid a £240 million takeover attempt from Dubai-based rival Sidara, which Wood’s board said it “would be minded to recommend” to shareholders. Sidara walked away from a previous £1.6bn takeover attempt in August last year. In 2023, American private equity group Apollo Global Management also backed away from its own attempts to buy Wood after making five separate offers. Earlier this year, Apollo acquired Aberdeen-based offshore energy services group OEG for $1bn (£770m).

Who is Sidara?

Formerly known as Dar Group, the company rebranded to Sidara in December 2023 to coincide with the COP28 conference in Dubai. Jordanian businessman Kamal Al-Shair founded the company as Dar Al-Handasah in Lebanon in 1956 alongside three engineering colleagues at the American University of Beirut. The firm gradually expanded operations in the Middle East and North Africa in the 60s and 70s, and made its first acquisition of US firm Perkins & Will in 1986. Sidara expanded into the UK

Read More »

Britain’s electricity system is ‘highly resilient’, says minister

Britain’s electricity system is “highly resilient”, energy minister Michael Shanks has said in a bid to reassure MPs following Spain and Portugal’s widespread blackouts. Power has largely been restored across the Iberian Peninsula, but not before the outage grounded flights, paralysed train systems and disrupted mobile communication. Speaking in the Commons, Mr Shanks said the UK had not been affected by the power cuts but the Government continues to prepare for “all eventualities”. In a statement on energy grid resilience, he said: “I would like to reassure the House that Great Britain has a highly resilient energy network, and the incident in Iberia has not impacted Great Britain. “The Secretary of State has been in regular contact with the National Energy System Operator, the Neso, over the past two days, and they have provided reassurance that there is no increase in risk to our energy supplies or system stability from this incident.” Mr Shanks added: “An event similar in impact to this incident in Great Britain would be the equivalent of a national power outage, which is the total loss of power across the whole of GB. This risk is listed on the national risk register as a high-impact but low-likelihood event.” He said the Government is prepared “for all eventualities no matter how unlikely”, adding: “This includes having robust emergency plans, summarised in the national emergency plan for downstream gas and electricity, and regularly exercising emergency plans with the energy industry and Ofgem (Office of Gas and Electricity Markets).” Shadow energy minister Andrew Bowie said: “We cannot get away from the fact that this Government’s plans to rush ahead to build a grid that is entirely dependent on the wind and the sun, in just five years’ time, makes our electricity grid significantly less reliable.” He added: “Data from

Read More »

Charities call for Berwick Bank wind farm to be blocked as thousands of bird deaths feared

Nature charities are urging the Scottish Government to refuse consent for one of the world’s largest offshore wind farms, warning it could kill and displace tens of thousands of seabirds. They warn the Berwick Bank wind farm would be “catastrophic” for Scotland’s seabirds and call for developments in more “appropriate” locations. Berwick Bank would have up to 307 turbines located in the outer Firth of Forth, around 40km off the Scottish coast near seabird sites like Bass Rock, St Abb’s Head and the Isle of May – which is home to puffins. RSPB Scotland, the Marine Conservation Society, National Trust for Scotland, Scottish Wildlife Trust and Scottish Seabird Centre have united to call for the development to be refused and have written to Scotland’s First Minister John Swinney. SSE Renewables said it has amended the development designs to reduce, as much as practical, any potential risks to Scottish seabirds. Anne McCall, director of RSPB Scotland, said: “Offshore wind has a key role in tackling the climate and nature emergency. We fully support well-sited wind farms which work with nature. But they are being held back by a major spanner in the works. “Berwick Bank would be catastrophic for Scotland’s globally important seabirds. Its impacts are so damaging it is blocking plans for new, less-harmful wind farms. “As the First Minister said earlier this year, nature should be at the heart of climate action. Our message to the Scottish Government is clear: Refuse Berwick Bank to unlock the full potential of offshore wind while protecting our outstanding wildlife.” The charities claim new offshore wind developments are being held back due to their combined impacts with Berwick Bank on Scotland’s seabird populations, and suggest refusing it could open up opportunities for other such

Read More »

Innovation Zero: In conversation with Ian Hunter, MD of Net Zero Teesside Power

Teesside can set an example for the rest of the world with its innovative carbon capture plans – while also creating thousands of jobs – the Managing Director of Net Zero Teesside Power (NZTP) has said. Talking to Energy Voice at the Innovation Zero conference, Ian Hunter heralded plans to build the planet’s first major gas-fired power station with carbon capture. Some 742MW of “low carbon flexible power” will be provided, he explained. NZTP is working closely with East Coast Cluster and the Northern Endurance Partnership on the project, which will involve transporting CO2 and storing it “permanently and safely” deep beneath the North Sea. Mr Hunter said the proposals had been drawn up amid “global agreement” around the need for a “massive amount” of CCS in the future. “This is exciting as it would be the first gas-fired power station in the world with this technology,” he continued. “It is really important for the UK…we need a power source that can meet our energy demands when it is not windy or sunny. “We want to take flexible energy and decarbonise it.”

An international blueprint for scale

Mr Hunter said the “real beauty” of the scheme was that, if successful, it could be copied globally. NZTP hopes to have the project up and running by 2028. “This is really happening,” he said. “Contracts are all signed, we are now in the execution phase. “The UK is really trying to develop a system where big projects get online and smaller projects tie in later. “That is how you achieve scale and variety in your carbon capture.” Describing the scheme as important on a national

Read More »

Nvidia AI supercluster targets agents, reasoning models on Oracle Cloud

Oracle has previously built an OCI Supercluster with 65,536 Nvidia H200 GPUs using the older Hopper GPU technology and no CPU that offers up to 260 exaflops of peak FP8 performance. According to the blog post announcing the availability, the Blackwell GPUs are available via Oracle’s public, government, and sovereign clouds, as well as in customer-owned data centers through its OCI Dedicated Region and Alloy offerings. Oracle joins a growing list of cloud providers that have made the GB200 NVL72 system available, including Google, CoreWeave and Lambda. In addition, Microsoft offers the GB200 GPUs, though they are not deployed as an NVL72 machine.

Read More »

Deep Data Center: Neoclouds as the ‘Picks and Shovels’ of the AI Gold Rush

In 1849, the discovery of gold in California ignited a frenzy, drawing prospectors from around the world in pursuit of quick fortune. While few struck it rich digging and sifting dirt, a different class of entrepreneurs quietly prospered: those who supplied the miners with the tools of the trade. From picks and shovels to tents and provisions, these providers became indispensable to the gold rush, profiting handsomely regardless of who found gold. Today, a new gold rush is underway, in pursuit of artificial intelligence. And just like the days of yore, the real fortunes may lie not in the gold itself, but in the infrastructure and equipment that enable its extraction. This is where neocloud players and chipmakers are positioned, presenting themselves as the fundamental enablers of the AI revolution.

Neoclouds: The Essential Tools and Implements of AI Innovation

The AI boom has sparked a frenzy of innovation, investment, and competition. From generative AI applications like ChatGPT to autonomous systems and personalized recommendations, AI is rapidly transforming industries. Yet, behind every groundbreaking AI model lies an unsung hero: the infrastructure powering it. Enter neocloud providers—the specialized cloud platforms delivering the GPU horsepower that fuels AI’s meteoric rise. Let’s examine how neoclouds serve as the “picks and shovels” of the AI gold rush, providing the essential backbone of AI innovation. Neoclouds are emerging as indispensable players in the AI ecosystem, offering tailored solutions for compute-intensive workloads such as training large language models (LLMs) and performing high-speed inference. Unlike traditional hyperscalers (e.g., AWS, Azure, Google Cloud), which cater to a broad range of use cases, neoclouds focus exclusively on optimizing infrastructure for AI and machine learning applications.
This specialization allows them to deliver superior performance at a lower cost, making them the go-to choice for startups, enterprises, and research institutions alike.

Read More »

Soluna Computing: Innovating Renewable Computing for Sustainable Data Centers

Dorothy 1A & 1B (Texas): These twin 25 MW facilities are powered by wind and serve Bitcoin hosting and mining workloads. Together, they consumed over 112,000 MWh of curtailed energy in 2024, demonstrating the impact of Soluna’s model.

Dorothy 2 (Texas): Currently under construction and scheduled for energization in Q4 2025, this 48 MW site will increase Soluna’s hosting and mining capacity by 64%.

Sophie (Kentucky): A 25 MW grid- and hydro-powered hosting center with a strong cost profile and consistent output.

Project Grace (Texas): A 2 MW AI pilot project in development, part of Soluna’s transition into HPC and machine learning.

Project Kati (Texas): With 166 MW split between Bitcoin and AI hosting, this project recently exited the Electric Reliability Council of Texas (ERCOT) planning phase and is expected to energize between 2025 and 2027.

Project Rosa (Texas): A 187 MW flagship project co-located with wind assets, aimed at both Bitcoin and AI workloads. Land and power agreements were secured by the company in early 2025.

These developments are part of the company’s broader effort to tackle both energy waste and infrastructure bottlenecks. Soluna’s behind-the-meter design enables flexibility to draw from the grid or directly from renewable sources, maximizing energy value while minimizing emissions.

Competition is Fierce and a Narrower Focus Better Serves the Business

In 2024, Soluna tested the waters of AI services with a GPU-as-a-Service offering through a partnership with HPE, branded as Project Ada. The pilot aimed to rent out cloud GPUs for AI developers and LLM training. However, due to oversupply in the GPU market, delayed product rollouts (like NVIDIA’s H200), and poor demand economics, Soluna terminated the contract in March 2025. The cancellation of the contract with HPE frees up resources for Soluna to focus on what it believes the company does best: designing

Read More »

Quiet Genius at the Neutral Line: How Onics Filters Are Reshaping the Future of Data Center Power Efficiency

Why Harmonics Matter

In a typical data center, nonlinear loads—like servers, UPS systems, and switch-mode power supplies—introduce harmonic distortion into the electrical system. These harmonics travel along the neutral and ground conductors, where they can increase current flow, cause overheating in transformers, and shorten the lifespan of critical power infrastructure. More subtly, they waste power through reactive losses that don’t show up on a basic utility bill, but do show up in heat, inefficiency, and increased infrastructure stress. Traditional mitigation approaches—like active harmonic filters or isolation transformers—are complex, expensive, and often require custom integration and ongoing maintenance. That’s where Onics’ solution stands out. It’s engineered as a shunt-style, low-pass filter: a passive device that sits in parallel with the circuit, quietly siphoning off problematic harmonics without interrupting operations. The result? Lower apparent power demand, reduced electrical losses, and a quieter, more stable current environment—especially on the neutral line, where cumulative harmonic effects often peak.

Behind the Numbers: Real-World Impact

While the Onics filters offer a passive complement to traditional mitigation strategies, they aren’t intended to replace active harmonic filters or isolation transformers in systems that require them—they work best as a low-complexity enhancement to existing power quality designs. LoPilato says Onics has deployed its filters in mission-critical environments ranging from enterprise edge to large colos, and the data is consistent. In one example, a 6 MW data center saw a verified 9.2% reduction in energy consumption after deploying Onics filters at key electrical junctures. Another facility clocked in at 17.8% savings across its lighting and support loads, thanks in part to improved power factor and reduced transformer strain.

The filters work by targeting high-frequency distortion—typically above the 3rd harmonic and up through the 35th. By passively attenuating this range, the system reduces reactive current on the neutral and helps stabilize
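To make the mechanism concrete, here is a small illustrative calculation of total harmonic distortion (THD) before and after attenuating the 3rd–35th harmonic band. The amplitudes and the 80% attenuation figure are hypothetical, chosen for illustration — none of these values are Onics data:

```python
import math

# Hypothetical harmonic current amplitudes (A rms) on a feeder;
# keys are harmonic orders. Illustrative values only, not Onics data.
fundamental = 100.0
harmonics = {3: 30.0, 5: 18.0, 7: 12.0, 9: 6.0}

def thd(fund, harm):
    """Total harmonic distortion: RMS of the harmonics over the fundamental."""
    return math.sqrt(sum(a * a for a in harm.values())) / fund

before = thd(fundamental, harmonics)

# Assume the shunt filter attenuates each harmonic in its band by 80%
# (again, an assumed figure for illustration).
filtered = {n: a * 0.2 for n, a in harmonics.items()}
after = thd(fundamental, filtered)

print(f"THD before: {before:.1%}, after: {after:.1%}")
```

Because THD scales linearly with a uniform attenuation, cutting every targeted harmonic by 80% cuts THD by the same factor — which is why even a simple passive shunt can noticeably reduce reactive current on the neutral.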

Read More »

New IEA Report Contrasts Energy Bottlenecks with Opportunities for AI and Data Center Growth

Artificial intelligence has, without question, crossed the threshold—from a speculative academic pursuit into the defining infrastructure of 21st-century commerce, governance, and innovation. What began in the realm of research labs and open-source models is now embedded in the capital stack of every major hyperscaler, semiconductor roadmap, and national industrial strategy. But as AI scales, so does its energy footprint. From Nvidia-powered GPU clusters to exascale training farms, the conversation across boardrooms and site selection teams has fundamentally shifted. It’s no longer just about compute density, thermal loads, or software frameworks. It’s about power—how to find it, finance it, future-proof it, and increasingly, how to generate it onsite. That refrain—“It’s all about power now”—has moved from a whisper to a full-throated consensus across the data center industry. The latest report from the International Energy Agency (IEA) gives this refrain global context and hard numbers, affirming what developers, utilities, and infrastructure operators have already sensed on the ground: the AI revolution will be throttled or propelled by the availability of scalable, sustainable, and dispatchable electricity.

Why Energy Is the Real Bottleneck to Intelligence at Scale

The major new IEA report puts it plainly: The transformative promise of AI will be throttled—or unleashed—by the world’s ability to deliver scalable, reliable, and sustainable electricity. The stakes are enormous. Countries that can supply the power AI craves will shape the future. Those that can’t may find themselves sidelined. Importantly, while AI poses clear challenges, the report emphasizes how it also offers solutions: from optimizing energy grids and reducing emissions in industrial sectors to enhancing energy security by supporting infrastructure defenses against cyberattacks. The report calls for immediate investments in both energy generation and grid capabilities, as well as stronger collaboration between the tech and energy sectors to avoid critical bottlenecks. The IEA advises that, for countries

Read More »

Colorado Eyes the AI Data Center Boom with Bold Incentive Push

Even as states work on legislation to limit data center development, it is clear that some locations are looking to get a bigger piece of the huge data center spending that the AI wave has created. It appears that politicians in Colorado took a look around and thought to themselves “Why is all that data center building going to Texas and Arizona? What’s wrong with the Rocky Mountain State?” Taking a page from the proven playbook that has gotten data centers built all over the country, Colorado is trying to jump on the financial-incentives bandwagon for data center development.

SB 24-085: A Statewide Strategy to Attract Data Center Investment

Looking to significantly boost its appeal as a data center hub, Colorado is now considering Senate Bill 24-085, currently making its way through the state legislature. Sponsored by Senators Priola and Buckner and Representatives Parenti and Weinberg, this legislation promises substantial economic incentives in the form of state sales and use tax rebates for new data centers established within the state from fiscal year 2026 through 2033. Colorado hopes to position itself strategically to compete with neighboring states in attracting lucrative tech investments and high-skilled jobs. According to DataCenterMap.com, there are currently 53 data centers in the state, almost all located in the Denver area, but they are predominantly smaller facilities. In today’s era of massive AI-driven hyperscale expansion, Colorado is rarely mentioned in the same breath as major AI data center markets. Some local communities have passed their own incentive packages, but SB 24-085 aims to offer a unified, statewide framework that can also help mitigate growing NIMBY (Not In My Backyard) sentiment around new developments.

The Details: How SB 24-085 Works

The bill, titled “Concerning a rebate of the state sales and use tax paid on new digital infrastructure

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as the frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for enterprises and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
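The “LLM as a judge” pattern mentioned above can be sketched in a few lines. This is a minimal illustration, not any provider’s actual API: `call_model` is a hypothetical stand-in returning canned answers, and the judge here is a simple majority vote over three models, where a production setup would typically prompt another LLM to score each candidate.

```python
from collections import Counter

def call_model(name: str, prompt: str) -> str:
    """Hypothetical stand-in for an LLM call; returns a canned answer.
    In practice each name would map to a different model endpoint."""
    canned = {
        "model-a": "Paris",
        "model-b": "Paris",
        "model-c": "Lyon",
    }
    return canned[name]

def judge(candidates: dict) -> str:
    """Cheap 'judge' step: majority vote over the candidate answers.
    A real judge would itself be an LLM asked to rank the candidates."""
    votes = Counter(candidates.values())
    answer, _ = votes.most_common(1)[0]
    return answer

def ensemble_answer(prompt: str, models: list) -> str:
    """Query several (cheap) models, then let the judge pick one answer."""
    candidates = {m: call_model(m, prompt) for m in models}
    return judge(candidates)

print(ensemble_answer("Capital of France?", ["model-a", "model-b", "model-c"]))
# prints "Paris"
```

As models get cheaper, running three or more of them per query and judging the outputs becomes an affordable way to reduce hallucinations, which is the trade-off the article is pointing at.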

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »