
European Commission Allows Belgium to Extend Life of Two Nuclear Reactors

The European Commission has approved a revised Belgian support measure that will see the lifespan of two nuclear reactors, Doel 4 and Tihange 3, extended.

The Commission said Belgium’s nuclear phase-out law of 2003 required all seven nuclear reactors in the country to be closed by 2025. However, due to energy security concerns amid the war in Ukraine, the Belgian federal government decided to keep the two newest Belgian nuclear power plants, Doel 4 and Tihange 3, open for an additional 10 years, notifying the Commission of its plan.

In July 2024, the Commission launched an investigation to evaluate the need and proportionality of a measure related to a contract-for-difference (CfD) design. Concerns were raised about whether the financial arrangements overly relieved the beneficiaries of risk and the appropriateness of transferred nuclear waste liabilities, the Commission said.

The beneficiaries include Electrabel (a subsidiary of Engie S.A.), Luminus (a subsidiary of EDF S.A.), and BE-NUC, a new joint venture between the Belgian state and Electrabel. After the measure, BE-NUC will own 89.8 percent of both reactors, while Luminus will hold 10.2 percent, the Commission added.

The Commission said that the support package for the nuclear reactor extensions consists of three key components: financial and structural arrangements, including the creation of BE-NUC, a contract-for-difference for stable revenues, and further financial guarantees; the transfer of nuclear waste liabilities from Electrabel to the state for a EUR 15 billion ($15.7 billion) lump sum; and risk-sharing and legal protections against future legislative changes impacting nuclear operators. These elements are considered a single intervention.

To address the Commission’s concerns, Belgium adjusted the public support package for its nuclear project. It confirmed that its reactors use older technology, limiting their ability to frequently adjust power levels, as set by the nuclear safety authority.

Belgium told the Commission the additional financial support mechanisms, including BE-NUC and various loans, are essential to cover different risks and ensure the project’s long-term viability.

To prevent market distortion, the Commission said that Belgium transferred decision-making on economic modulations from BE-NUC to an independent energy manager. This manager will independently sell BE-NUC’s nuclear electricity, with incentives reviewed every 3.5 years, and will conduct a competitive tender process, ensuring fairness, especially if Engie’s trading entity is involved.

The European Commission noted that to ensure proportionality, Belgium set the contract-for-difference strike price based on a discounted cash flow model, intensified market price risk adjustment, capped the operating cashflow guarantee, and implemented strict conditions for the transfer of nuclear waste liabilities, including volume limits, conditioning criteria, and dedicated fund management.
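The contract-for-difference at the heart of the package stabilizes the operators' revenue around the strike price: when the market price falls below the strike, the state tops up the difference, and when it rises above, the generator pays the surplus back. A minimal sketch of a two-sided CfD settlement, using hypothetical numbers (the actual Belgian strike price and volumes are not disclosed in this article):

```python
def cfd_settlement(strike_eur_mwh, market_price_eur_mwh, volume_mwh):
    """Payment to the generator under a two-sided CfD.

    Positive: the state tops up the generator when the market price
    is below the strike. Negative: the generator refunds the surplus
    when the market price is above the strike.
    """
    return (strike_eur_mwh - market_price_eur_mwh) * volume_mwh

# Hypothetical strike of EUR 80/MWh over 1,000 MWh of output:
print(cfd_settlement(80.0, 65.0, 1_000))  # 15000.0  -> state pays generator
print(cfd_settlement(80.0, 95.0, 1_000))  # -15000.0 -> generator refunds
```

Either way the generator's effective revenue is pinned at the strike price, which is why the Commission scrutinized how that price was derived from the discounted cash flow model.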

The Commission concluded the aid is necessary, appropriate, and proportionate, minimizing competition distortions.

To contact the author, email [email protected]




Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, Bitcoin, and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Fortinet speeds threat detection with improved FortiAnalyzer

The package also now integrates with FortiAI, the vendor’s genAI assistant, to better support analytics and telemetry to help security teams speed threat investigation and response, the vendor stated. “FortiAI identifies the threats that need analysis from the data collected by FortiAnalyzer, primarily collected from FortiGates. By automating the collection,

Read More »

North America Breaks Rig Loss Streak

North America added three rigs week on week, according to Baker Hughes’ latest North America rotary rig count, which was published on February 21. The U.S. added four rigs week on week, and Canada dropped one rig during the same period, taking the total North America rig count up to 836, comprising 592 rigs from the U.S. and 244 rigs from Canada, the count outlined. Of the total U.S. rig count of 592, 576 rigs are categorized as land rigs, 14 are categorized as offshore rigs, and two are categorized as inland water rigs. The total U.S. rig count is made up of 488 oil rigs, 99 gas rigs, and five miscellaneous rigs, according to the count, which revealed that the U.S. total comprises 530 horizontal rigs, 49 directional rigs, and 13 vertical rigs. Week on week, the U.S. land rig count increased by four, and the country’s offshore and inland water rig counts remained unchanged, the count highlighted. The U.S. oil rig count increased by seven, its gas rig count dropped by two, and its miscellaneous rig count dropped by one, week on week, the count showed. Baker Hughes’ count revealed that the U.S. horizontal rig count increased by six week on week, while the country’s directional rig count dropped by two and its vertical rig count remained unchanged during the period. A major state variances subcategory included in the rig count showed that, week on week, Oklahoma added five rigs and West Virginia added one rig, while New Mexico and Utah each dropped one rig. A major basin variances subcategory included in Baker Hughes’ rig count showed that the Cana Woodford basin added two rigs and the Ardmore Woodford, Granite Wash, and Marcellus basins each added one rig, week on week. Canada’s total rig count of 244 is
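The subtotals in the count are internally consistent; a quick sanity check using only the figures quoted above:

```python
# Figures copied directly from the Baker Hughes count quoted above.
us_total, canada_total = 592, 244

assert us_total + canada_total == 836   # North America total
assert 576 + 14 + 2 == us_total         # land + offshore + inland water
assert 488 + 99 + 5 == us_total         # oil + gas + miscellaneous
assert 530 + 49 + 13 == us_total        # horizontal + directional + vertical
print("all subtotals consistent")
```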

Read More »

APA Group Posts Lower Profit for First Half of FY25

Australian natural gas and power utility APA Group has reported AUD 34 million ($21.6 million) in statutory net profit after tax (NPAT) for the half-year ended December 31, 2024, compared to AUD 1.5 billion ($666.9 million) for the corresponding period in financial year 2024. The company said in its half-year report that the decrease was driven by the prior-period acquisition of the Pilbara Energy business and the associated 1H24 gain on the remeasurement of APA’s previously held 88.2 percent interest in the Goldfields Gas Pipeline. Underlying earnings before interest, taxes, depreciation, and amortization (EBITDA) was up 9.1 percent to AUD 1.01 billion ($644.7 million), underpinned by strong performance from new assets, higher customer demand for seasonal gas transmission capacity, inflation-linked tariff increases, and reduced cost growth, the company said. Higher 1H25 earnings were driven by strong contributions from the Pilbara Energy System business, higher variable revenue, inflation-linked tariff escalations, and cost growth below inflation, APA Group said. Total statutory revenue was up 7.1 percent to AUD 1.36 billion ($866.5 million), the company said. “APA has delivered a strong financial and operating result for the half with growth in revenue, earnings, free cash flow, and distribution”, Adam Watson, APA CEO and Managing Director, said. “Today’s result demonstrates the strength of our underlying business, our continued focus on operational excellence and asset reliability, and our work to drive further cost optimization, which is designed to increase free cash flow, support growth projects and ensure we can continue to deliver for investors”. “At the same time, our operating cash flow and strong balance sheet provide the capacity to fund our AUD 1.8 billion ($1.14 billion) organic growth pipeline internally over the next three years”, Watson said.
“The result highlights the successful integration of the Pilbara business, with these new assets delivering $66 million in

Read More »

UK too reliant on ‘expensive imported gas’ as energy price spikes

A spike in energy prices has prompted developers to warn that the UK is too reliant on imported gas as the government launches a new subsidy. Energy regulator Ofgem has revealed an increase in the energy price cap for April-June 2025. This will add £111 a year to many British households’ bills, taking the cap to £1,849 in April, the highest level in more than a year. In response, the government has unveiled plans to expand the warm home discount, giving eligible households £150 off their energy bills. This would bring around 2.7 million households into the scheme, pushing the total number of households that would receive the discount next winter up to an estimated 6.1 million. The move prompted the chair of a company planning to bring wind and solar power to the UK from Morocco via a 2,500-mile (4,000 km) subsea cable to say the spike showed why the UK needs to diversify its energy system. Sir Dave Lewis, chair of Xlinks and former Tesco boss, said: “The UK already has amongst the highest energy prices in Europe and this latest hike will pile further pressure on long-suffering consumers. “This highlights the pressing need to further diversify our energy system and reduce this country’s reliance on expensive imported gas, particularly during periods of high demand and low domestic renewables production.” The firm has said its Xlinks Morocco-UK power project could help balance the UK grid and improve resiliency when it launches in the early 2030s. It is currently in discussions with the UK government over a contract for difference (CfD) to help fund the scheme. Launching the subsidy, Energy Secretary Ed Miliband said: “This is worrying news for many families. “This

Read More »

Aberdeenshire-based EnerQuip snaps up Hutcheon’s fab division

Oilfield equipment company EnerQuip has acquired the fabrication division of Hutcheon Services, Aberdeen, for an undisclosed sum as part of a £1 million investment. Eight members of staff from Hutcheon transfer to EnerQuip’s headquarters at Findon. The fabrication division’s assets and work-in-progress will also move to EnerQuip. EnerQuip, which specialises in manufacturing torque machines, said the deal represents something of a “homecoming” for a radiation source transportation shipping container which began life as the brainchild of EnerQuip forerunner AMC Engineering. The company was established by former directors and senior staff in 2015 following the sale of AMC to US-based Forum Energy Technologies (FET) in 2011. It added that it plans to invest £1m to enhance its fabrication offering as it meets a “significant uptick” in global demand. Last year the company said it sold its highest-ever number of units to Guyana and Brazil, with these growing markets likely to put revenues for EnerQuip Group over £22 million for 2024 as it continues on course towards its £25m goal for 2025. The latest addition takes EnerQuip back on the acquisition trail, where previous deals have included securing the AMC product line from FET and the purchase of Fife sub-contract precision machining company Diamac Engineering. More recently, an expanded presence in Houston, Texas, was achieved through a move to a 6,000 sq ft purpose-built facility in the northeast of the city, taking the team strategically closer to several key regional customers. EnerQuip managing director Andrew Robins said: “The well-established reputation of Hutcheon Services’ fabrication division is important in augmenting our ability to present a growing customer base with a safe, capable and respected pair of hands to take care of their project needs, from start to finish. “The additional capacity which this deal creates, and the skills of the personnel

Read More »

Woodside Bets on Louisiana LNG to Become Global Powerhouse

Woodside Energy Group Ltd. is preparing to give the go-ahead for its Louisiana LNG project as Australia’s biggest natural gas producer looks beyond its core markets in Asia. The company is “in the process of bringing partners into that development” and wants to make a final decision from this quarter, Chief Executive Officer Meg O’Neill said in an interview with Bloomberg TV on Tuesday. That will help an expansion away from its plants in Australia, which mainly supply Asian customers, she said, after the company reported a doubling in profit earlier in the day. “One of the things that we are trying to do at Woodside is to become a more globally impactful LNG player — we’ve used the phrase LNG powerhouse in the past,” O’Neill said. “Louisiana LNG helps us to secure a material presence in the Atlantic Basin into European consumers.” Woodside is “seeing a lot of slowing” in competing LNG projects, but would still be interested in off-take agreements with them, she said. Geopolitical developments mean there’s “a lot to keep track of in our world in what we expect will continue to be a volatile year,” O’Neill said. The Perth-based company reported net income of $3.57 billion for last year, up from $1.66 billion in 2023, and ahead of a consensus estimate of $3.36 billion. Production rose 4 percent to 194 million barrels of oil equivalent. Production guidance for this year was kept at 186 to 196 million barrels, of which about 40 percent is LNG. Underlying profit fell 13 percent from a year earlier to $2.88 billion. Woodside shares rose as much as 3.5 percent, their biggest intraday jump since early October, before trading 2.8 percent higher as of 3:39 p.m. in Sydney.

Read More »

Hydrogen boiler trial yet to receive safety approvals amid ‘explosion’ fears

A hydrogen boiler trial in Fife, the only ongoing test of its kind known, has yet to receive safety approvals months after the associated network was installed. The H100 trial in Levenmouth is the only residential hydrogen boiler trial still progressing after two similar UK trials were cancelled over safety concerns, following a report that revealed hydrogen boilers are at far greater risk of fire than regular gas boilers. Two hydrogen boiler trials, in Whitby, Ellesmere Port and in Redcar, Teesside, were quashed in 2023 over residents’ safety fears, while three known accidents involving blasts in the unproven market for the low-carbon gas have taken place in recent years. Tom Baxter, a lecturer in the department of engineering at the University of Aberdeen, filed a freedom of information request to the Information Commissioner’s Office at the start of February, asking what safety approvals the Fife H100 trial had complied with. The request was submitted after gas distribution business SGN, the company behind the trials, was forced by the commissioner’s office in December 2023 to release findings from a controlled explosions experiment that compared hydrogen and gas boiler safety. The gas company finally released the explosion risk report and a video of controlled explosions from 2018, which were funded by Ofgem’s Network Innovation Allowance, after the UK’s information commissioner ordered it to do so. That report, conducted independently in advance of the four-year hydrogen heating pilot programme in Buckhaven and Methil, showed hydrogen boilers were more likely to explode than gas boilers, but it was withheld on the basis that the results could be “misinterpreted”. The response from the Health and Safety Executive (HSE) this month shows that the necessary safety case had not yet been lodged for the trials being conducted by SGN. This is despite the

Read More »

3 strategies for carbon-free data centers

Because of the strain that data centers (as well as other electrification sources, such as electric vehicles) are putting on the grid, “the data center industry needs to develop new power supply strategies to support growth plans,” Dietrich said. Here are the underlying factors that play into the three strategies outlined by Uptime. Scale creates new opportunities: It’s not just that more data centers are being built; the data centers under construction are fundamentally different in sheer magnitude. For example, a typical enterprise data center might require between 10 and 25 megawatts of power. Today, the hyperscalers are building data centers in the 250-megawatt range, and a large data center campus could require 1,000 megawatts of power. Data centers not only require a reliable source of power, they also require backup power in the form of generators. Dietrich pointed out that if a data center operator builds out enough backup capacity to support 250 megawatts of demand, they’re essentially building a new, on-site power plant. On the one hand, that new power plant requires permitting, it’s costly, and it requires highly trained staff to operate. On the other hand, it provides an opportunity. Instead of letting this asset sit unused except in an emergency, organizations can leverage these power plants to generate energy that can be sold back to the grid. Dietrich described this arrangement as a win-win: the data center generates revenue, and the utility gains a new source of power. Realistic expectations: Alternative energy sources like wind and solar, which are dependent on environmental factors, can’t technically or economically supply 100% of data center power, but they can provide a significant percentage of it. Organizations need to temper their expectations, Dietrich said.

Read More »

Questions arise about reasons why Microsoft has cancelled data center lease plans

This, the company said, “allows us to invest and allocate resources to growth areas for our future. Our plans to spend over $80 billion on infrastructure this fiscal year remains on track as we continue to grow at a record pace to meet customer demand.” When asked for his reaction to the findings, John Annand, infrastructure and operations research practice lead at Info-Tech Research Group, pointed to a blog released last month by Microsoft president Brad Smith, and said he thinks the company “is hedging its bets. It reaffirms the $80 billion AI investment guidance in 2025, $40 billion in the US. Why lease when you can build/buy your own?” Over the past four years, he said, Microsoft “has been leasing more data centers than owning. Perhaps they are using the fact that the lessors are behind schedule on providing facilities or the power upgrades required to bring that ratio back into balance. The limiting factor for data centers has always been the availability of power, and this has only become more true with power-hungry AI workloads.” The company, said Annand, “has made very public statements about owning nuclear power plants to help address this demand. If third-party data center operators are finding it tough to provide Microsoft with the power they need, it would make sense that Microsoft vertically integrate its supply chain; so, cancel leases or statements of qualification in favor of investing in the building of their own capacity.” However, Gartner analyst Tony Harvey said of the report, “so much of this is still speculation.” Microsoft, he added, “has not stated as yet that they are reducing their capex spend, and there are reports that Microsoft have strongly refuted that they are making changes to their data center strategy.” The company, he said, “like any other hyperscaler,

Read More »

Quantum Computing Advancements Leap Forward In Evolving Data Center and AI Landscape

Overcoming the Barriers to Quantum Adoption Despite the promise of quantum computing, widespread deployment faces multiple hurdles: High Capital Costs: Quantum computing infrastructure requires substantial investment, with uncertain return-on-investment models. The partnership will explore cost-sharing strategies to mitigate risk. Undefined Revenue Models: Business frameworks for quantum services, including pricing structures and access models, remain in development. Hardware Limitations: Current quantum processors still struggle with error rates and scalability, requiring advancements in error correction and hybrid computing approaches. Software Maturity: Effective algorithms for leveraging quantum computing’s advantages remain an active area of research, particularly in real-world AI and optimization problems. SoftBank’s strategy includes leveraging its extensive telecom infrastructure and AI expertise to create real-world testing environments for quantum applications. By integrating quantum into existing data center operations, SoftBank aims to position itself at the forefront of the quantum-AI revolution. A Broader Play in Advanced Computing SoftBank’s quantum initiative follows a series of high-profile moves into the next generation of computing infrastructure. The company has been investing heavily in AI data centers, aligning with its “Beyond Carrier” strategy that expands its focus beyond telecommunications. Recent efforts include the development of large-scale AI models tailored to Japan and the enhancement of radio access networks (AI-RAN) through AI-driven optimizations. Internationally, SoftBank has explored data center expansion opportunities beyond Japan, as part of its efforts to support AI, cloud computing, and now quantum applications. The company’s long-term vision suggests that quantum data centers could eventually play a role in supporting AI-driven workloads at scale, offering performance benefits that classical supercomputers cannot achieve. 
The Road Ahead SoftBank and Quantinuum’s collaboration signals growing momentum for quantum computing in enterprise settings. While quantum remains a long-term bet, integrating QPUs into data center infrastructure represents a forward-looking approach that could redefine high-performance computing in the years to come. With

Read More »

STACK Infrastructure Pushes Aggressive Data Center Expansion and Sustainability Strategy Into 2025

Global data center developer and operator STACK Infrastructure is providing a growing range of digital infrastructure solutions for hyperscalers, cloud service providers, and enterprise clients. Like almost all of the cutting-edge developers in the industry, STACK maintains a focus on scalability, reliability, and sustainability while delivering a full range of solutions, including build-to-suit, colocation, and powered shell facilities, with continued development in key global markets. Headquartered in the United States, the company has expanded its presence across North America, Europe, and Asia-Pacific, catering to the increasing demand for high-performance computing, artificial intelligence (AI), and cloud-based workloads. The company is known for its commitment to sustainable growth, leveraging green financing initiatives, energy-efficient designs, and renewable power sources to minimize its environmental impact. Through rapid expansion in technology hubs like Silicon Valley, Northern Virginia, Malaysia, and Loudoun County, the company continues to set industry benchmarks for innovation and infrastructure resilience. With a customer-centric approach and a robust development pipeline, STACK Infrastructure is shaping the future of digital connectivity and data management in an era of accelerating digital transformation. Significant Developments Across 23 Major Data Center Markets Early in 2024, STACK broke ground on the expansion of its existing 100 MW campus in San Jose, serving the power-constrained Silicon Valley. STACK worked with the city of San Jose to add a 60 MW expansion to its SVY01 data center.
While possibly the highest-profile of STACK’s developments due to its location, at that point the company had announced significant developments across 23 major data center markets, including: STACK’s 48 MW Santa Clara data center, featuring immediately available shell space powered by an onsite substation with rare, contracted capacity; and STACK’s 56 MW Toronto campus, spanning 19 acres, which includes an existing 8 MW data center and 48 MW expansion capacity,

Read More »

Meta Update: Opens Mesa, Arizona Data Center; Unveils Major Subsea Cable Initiative; Forges Oklahoma Wind Farm PPA; More

Meta’s Project Waterworth: Building the Global Backbone for AI-Powered Digital Infrastructure Also very recently, Meta unveiled its most ambitious subsea cable initiative yet: Project Waterworth. Aimed at revolutionizing global digital connectivity, the project will span over 50,000 kilometers—surpassing the Earth’s circumference—and connect five major continents. When completed, it will be the world’s longest subsea cable system, featuring the highest-capacity technology available today. A Strategic Expansion to Key Global Markets As announced on Feb. 14, Project Waterworth is designed to enhance connectivity across critical regions, including the United States, India, Brazil, and South Africa. These regions are increasingly pivotal to global digital growth, and the new subsea infrastructure will fuel economic cooperation, promote digital inclusion, and unlock opportunities for technological advancement. In India, for instance, where rapid digital infrastructure growth is already underway, the project will accelerate progress and support the country’s ambitions for an expanded digital economy. This enhanced connectivity will foster regional integration and bolster the foundation for next-generation applications, including AI-driven services. Strengthening Global Digital Highways Subsea cables are the unsung heroes of global digital infrastructure, facilitating over 95% of intercontinental data traffic. With a multi-billion-dollar investment, Meta aims to open three new oceanic corridors that will deliver the high-speed, high-capacity bandwidth needed to fuel innovations like artificial intelligence. Meta’s experience in subsea infrastructure is extensive. Over the past decade, the company has collaborated with various partners to develop more than 20 subsea cables, including systems boasting up to 24 fiber pairs—far exceeding the typical 8 to 16 fiber pairs found in most new deployments. 
This technological edge ensures scalability and reliability, essential for handling the world’s ever-increasing data demands. Engineering Innovations for Resilience and Capacity Project Waterworth isn’t just about scale—it’s about resilience and cutting-edge engineering. The system will be the longest 24-fiber-pair subsea cable ever built, enhancing

Read More »

Do data centers threaten the water supply?

In a new report, the Royal Academy of Engineering called upon the government to ensure tech companies accurately report how much energy and water their data centers are using, and to reduce the use of drinking water for cooling. Without such action, warns one of the report’s authors, Professor Tom Rodden, “we face a real risk that our development, deployment and use of AI could do irreparable damage to the environment.” The situation is a little different for the US, as the country has large bodies of water offering a water supply that the UK just does not have. It’s not an accident that there are many data centers around the Chicago area: they’ve also got the Great Lakes to draw upon. Likewise, the Columbia and Klamath Rivers have become magnets for data centers, for both water supply and hydroelectric power. Other than the Thames River, the UK doesn’t have these massive bodies of water. Still, the problem is not unique to the UK, says Alan Howard, senior analyst with Omdia. He notes that Microsoft took heat last year because it was draining the water supply of the small Arizona town of Goodyear with a new AI-oriented data center. The city of Chandler, Arizona, passed an ordinance in 2015 that restricted new water-intensive businesses from setting up shop, which slowed data center development. “I believe some data center operators just bowed out,” said Howard.

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote, between them, $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction, and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability, and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases, and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »