
Underground cables cost more than four times the price of pylons, report finds


Constructing new underground transmission lines is, on average, around 4.5 times more expensive than building overhead lines, according to a new report.

According to research from the Institution of Engineering and Technology (IET), it costs around £1,190 to transmit 1MW of power 1km through an overhead line, compared with around £5,330 per MW per km for underground cables.

The report cited the example of a 15km-long, 5GW overhead line, which it estimated would cost around £40m. An equivalent underground cable would cost around £330m, rising to around £820m if laid in a new tunnel.

In addition, it said that an offshore high voltage direct current (HVDC) point-to-point cable is around five times more expensive than an overhead line, while an offshore HVDC network connecting multiple sites to the onshore grid is around 11 times more expensive.
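As a quick arithmetic check on the multiples quoted above, the per-MW-per-km figures reproduce the roughly 4.5 times headline ratio, and the 15km example implies even larger gaps once tunnelling is involved. A minimal sketch, using only the costs quoted from the IET report:

```python
# Illustrative arithmetic only, using the figures quoted above from the IET report.
overhead = 1_190      # GBP to transmit 1 MW of power 1 km on an overhead line
underground = 5_330   # GBP per MW per km for an underground cable

print(f"Underground vs overhead: {underground / overhead:.1f}x")  # ~4.5x

# 15 km, 5 GW example costs from the report, in GBP millions
overhead_example, cable_example, tunnel_example = 40, 330, 820
print(f"Buried cable example: {cable_example / overhead_example:.2f}x the overhead cost")      # 8.25x
print(f"Tunnelled cable example: {tunnel_example / overhead_example:.2f}x the overhead cost")  # 20.50x
```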

Professor Keith Bell, chairman of the project board for the IET’s Transmission Technologies report, said: “As an essential part of the country’s aim to reach net zero, the UK is decarbonising its production of electricity and electrifying the use of energy for heating, transport and industry.

“Access to a cleaner, more affordable, secure supply of energy requires the biggest programme of electricity transmission development since the 1960s.”

Mixed approach

Overhead transmission lines have been controversial because of their visual impact on the surrounding landscape and concerns that they have a negative effect on nearby property prices.

The IET’s report compared a range of electricity transmission technologies by costs, environmental impacts, carbon intensity, technology readiness, and delivery time.

The group added that each technology should be judged on its merits in each specific context, taking into account environmental impact, engineering challenges and local impacts in addition to cost.

For example, underground cables have a lower visual impact than overhead lines, but they are viable only in certain terrains and cost far more; long high-voltage AC cables also present electrical engineering challenges.

Subsea solutions still require onshore infrastructure to transport power from coastal landing points to areas of demand and have impacts on marine environments.

Andrew Bowie, Scottish Conservative MP for West Aberdeenshire and Kincardine. © DC Thomson

Andrew Bowie, MP for West Aberdeenshire and Kincardine and acting shadow energy secretary, was critical of relying solely on overhead transmission lines, saying: “The overall strategy should be a mix of above ground, underground and marine transmission. There are places where it just doesn’t make sense and that was my approach during my time as a government minister.”

He added: “Brute forcing an overground-only approach is the wrong thing to do, and the Conservatives will fight that every step of the way.”

Re-wiring the UK grid

The report aims to inform the UK government’s plans to re-wire the country as part of the largest grid upgrade since the 1960s.

With the UK looking to expand its offshore wind capacity, the grid will need a similar boost to handle the additional power. Connections will need to be created to take power generated offshore in regions such as Scotland and move it down to population centres, such as those in the south of England.

UK energy minister Michael Shanks speaks at the EIC’s North Sea Decarbonisation conference 2025. Image: Erikka Askeland/DCT Media

Commenting on the report, energy minister Michael Shanks said: “This latest report shows that pylons are the best option for billpayers – as cables underground cost significantly more to install and maintain.

“At the same time, we want to ensure those hosting this infrastructure benefit, including by offering households near new pylons £2,500 off their energy bills over 10 years.”

The report also warned of supply chain bottlenecks, especially in cable manufacturing, which could affect delivery times and prices and create a need for long-term relationships with providers.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Backup-as-a-service explained: Your guide to cloud data protection

BaaS supports private, public and hybrid cloud environments. Hybrid cloud, which pairs on-premises infrastructure with cloud-based storage and management, can help enterprises achieve what’s known as the “3-2-1 rule of backup,” a strategy whereby an enterprise keeps three copies of its data — two in local storage, one offsite. However,
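The “3-2-1 rule” described above can be expressed mechanically. Below is a minimal, purely illustrative Python sketch; the copy list and field names are hypothetical, not taken from any particular BaaS product:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    location: str   # e.g. "on-prem-nas", "cloud-object-store" (hypothetical labels)
    offsite: bool

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    enough_copies = len(copies) >= 3                         # at least three copies of the data
    two_locations = len({c.location for c in copies}) >= 2   # spread across at least two locations/media
    one_offsite = any(c.offsite for c in copies)             # at least one copy held offsite
    return enough_copies and two_locations and one_offsite

copies = [
    BackupCopy("on-prem-nas", offsite=False),
    BackupCopy("on-prem-tape", offsite=False),
    BackupCopy("cloud-object-store", offsite=True),
]
print(satisfies_3_2_1(copies))  # True
```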

Read More »

EOG to Acquire Encino’s Assets in Utica for $5.6B

EOG Resources, Inc. said it has entered into a definitive agreement to acquire Encino Acquisition Partners (EAP) from the Canada Pension Plan Investment Board (CPP) and Encino Energy for $5.6 billion, inclusive of EAP’s net debt. EOG expects to fund the acquisition through $3.5 billion of debt and $2.1 billion of cash on hand, the company said in a news release. The transaction is expected to close in the second half, subject to clearance under the Hart-Scott-Rodino Act and other customary closing conditions, EOG said. The acquisition of Encino’s 675,000 net core acres increases EOG’s Utica position to a combined 1.1 million net acres, representing more than 2 billion barrels of oil equivalent of undeveloped net resources, with pro forma production totaling 275,000 barrels of oil equivalent per day (boepd), according to the release. EOG said that the acquisition significantly expands its contiguous liquids-rich acreage, adds premium-priced gas exposure, and increases working interest. The company averages 65 percent liquids production, with 235,000 net acres for a combined contiguous position of 485,000 net acres. On the natural gas front, the acquisition adds 330,000 net acres along with existing natural gas production with firm transportation exposed to premium end markets. In the northern acreage, where the company has delivered outstanding well results, EOG increases its existing average working interest by more than 20 percent, the company stated. “This acquisition combines large, premier acreage positions in the Utica, creating a third foundational play for EOG alongside our Delaware Basin and Eagle Ford assets,” EOG Chairman and CEO Ezra Yacob said. “Encino’s acreage improves the quality and depth of our Utica position, expanding EOG’s multi-basin portfolio to more than 12 billion barrels of oil equivalent net resources”. “We are excited to execute on this unique opportunity that is immediately accretive to our per-share metrics

Read More »

JP Morgan Asks If Oil Prices Are $10 Too Low or $20 Too High

In a research note sent to Rigzone late Thursday by Natasha Kaneva, Head of Global Commodities Strategy at J.P. Morgan, analysts at the company, including Kaneva, asked if oil prices are “$10 too low or $20 too high”. The analysts highlighted in the note that, during the company’s two weeks of marketing in Europe and Asia, J.P. Morgan “encountered both perspectives”. “Some clients believe that current oil prices are $10 too low, while others argue they are $20 too high,” the analysts said. The J.P. Morgan analysts outlined “the bullish argument” and “bear…reasons” in the note.  “Incoming hard data on the economy has held up so far, and oil demand has been relatively robust,” the analysts said under a subcategory for the former. “Perspectives are shifting away from concerns about a U.S. recession to optimism about growth-boosting deregulation and tax cuts,” they added. “Similarly, views on China are evolving. We were surprised by the positive sentiment in China. Chinese copper demand, a reliable indicator of the health of the Chinese economy, increased by 6-7 percent in the first quarter, with sufficient underlying momentum beyond frontloading to sustain growth into the second half of the year,” they continued. “China has a clear policy direction focused on growth, aiming at proactive expansion of domestic demand,” they went on to state. Under this subcategory, the analysts also noted that “healthy refinery margins are encouraging high utilization rates and product cracks are strong, yet global product inventories are stubbornly low”. “Strong demand for crude is reflected in the prompt spreads of both Brent and WTI, which are trading at 58-64c/bbl backwardation,” they added. Under the bear case subhead included in the note, the analysts said, “despite robust demand so far this year, averaging 1.0 million barrels per day year to date, visible inventories have

Read More »

OPEC+ Countries to ‘Implement Production Adjustment’ of 411K Bpd in July

A statement posted on OPEC’s website on Saturday announced that Saudi Arabia, Russia, Iraq, UAE, Kuwait, Kazakhstan, Algeria, and Oman “will implement a production adjustment of 411,000 barrels per day” next month. The statement noted that “the eight OPEC+ countries, which previously announced additional voluntary adjustments in April and November 2023 … met virtually on 31 May 2025, to review global market conditions and outlook”. “In view of a steady global economic outlook and current healthy market fundamentals, as reflected in the low oil inventories, and in accordance with the decision agreed upon on 5 December 2024 to start a gradual and flexible return of the 2.2 million barrels per day voluntary adjustments starting from 1 April 2025, the eight participating countries will implement a production adjustment of 411,000 barrels per day in July 2025 from June 2025 required production level,” it added. The statement noted that this “is equivalent to three monthly increments” and highlighted that “the gradual increases may be paused or reversed subject to evolving market conditions”, adding that “this flexibility will allow the group to continue to support oil market stability”.  “The eight OPEC+ countries also noted that this measure will provide an opportunity for the participating countries to accelerate their compensation,” the statement went on to note. “The eight countries reiterated their collective commitment to achieve full conformity with the Declaration of Cooperation, including the additional voluntary production adjustments that were agreed to be monitored by the JMMC during its 53rd meeting held on April 3rd 2024,” it continued. The statement also noted that the eight countries “confirmed their intention to fully compensate for any overproduced volume since January 2024”. “The eight OPEC+ countries will hold monthly meetings to review market conditions, conformity, and compensation. The eight countries will meet on 6 July 2025 to

Read More »

Naftogaz, Orlen to Expand Energy Partnership

Ukraine’s Naftogaz Group and Poland’s ORLEN SA have signed another energy collaboration agreement, after an earlier one that aims to help the war-hit country secure natural gas supply. Under the new memorandum of understanding, “the parties will seek to increase natural gas deliveries via Poland to Ukraine and to advance joint projects in oil and gas extraction”, ORLEN said in an online statement. “These initiatives are expected to strengthen Ukraine’s resource security and flexibility. “Naftogaz also stands to benefit from ORLEN’s technical expertise in the modernization of the Kremenchuk oil refinery, as well as in the refurbishment of gas infrastructure damaged during the war. “In addition, both companies intend to pursue joint investment projects across fuel distribution and development of the biofuels segment”. ORLEN added it currently supplies Ukrainian customers with refined oil products including gasoline, diesel and bitumen. The state-owned companies previously penned a liquefied natural gas (LNG) agreement. “The agreement is a framework arrangement aimed at strengthening cooperation to enhance Ukraine’s energy security through the diversification of gas supply sources and routes to the country”, ORLEN said in a press release March 7. The LNG agreement has contracted 300 million cubic meters (10.59 billion cubic feet) of gas for Ukraine. Meanwhile Naftogaz is intensifying negotiations with international financial institutions to help Ukraine buy enough gas for the heating season. Discussions were held with representatives of the European Investment Bank, the International Monetary Fund and the European Commission, Naftogaz said Friday. “[W]e have made significant progress in attracting additional financing”, new chief executive Sergii Koretskyi said. On April 28 the company said it had secured a EUR 270 million ($308.62 million) loan from the European Bank for Reconstruction and Development and a EUR 140 million grant from the Norwegian government to buy nearly one billion cubic meters of gas.

Read More »

The power of precision: GenAI is revolutionizing transmission network and asset models

Over the last two decades, electric utilities have reactively transformed their Transmission business processes and ecosystems—using digitization and simple analytics. Today, with increasing load growth, regulatory orders, aging infrastructure and the ever present need to operate the grid more efficiently, Transmission operators and developers are leveraging new technologies, including Generative AI. While other industries were early adopters, embracing AI for competitive edge, the traditionally cautious power and utility sector needed clear line-of-sight to the value proposition for GenAI—particularly in their transmission functions. Network models are the digital DNA of transmission functions, powering planning simulations, real-time operations and supporting asset management decision making. Yet many utilities still struggle to maintain accurate, up to date and relevant models, due in part to the sheer scale and complexity of modern utility infrastructure. GenAI changes the game—automating model upkeep, enriching simulation fidelity and enabling decision making with greater precision and speed.

Challenges Facing Transmission Models

Transmission Planning

Transmission planning hinges on accurate and timely data and asset models to simulate power flows, project load growth, identify optimal points of interconnection and grid operations under contingency scenarios. However, current modeling applications and platforms are often outdated or incomplete, plagued by fragmented data from engineering teams, third-party consultants and GIS systems.

Challenge: Long planning cycles with incomplete data inputs and manual model updates increase the risk of missed opportunities and compromised long-range plans.

Transmission Network Operations

Transmission System operators require real-time network visibility for grid stability, congestion management and response to disturbances. Additionally, proper Transmission switching can alleviate transmission violations and reduce operating costs. SCADA, EMS and outage management systems depend on models that reflect current network topology, operational states and equipment status to support operators’ decisions.

Challenge: Manual updates to operations models cannot keep pace with daily and compromised network changes, especially as renewable penetration
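As a toy illustration of the kind of model upkeep the excerpt says GenAI can automate, the sketch below (all data invented) simply checks that every branch in a network model references buses that actually exist:

```python
# Hedged, illustrative sketch of one small model-upkeep task: flag branches in a
# transmission network model that reference unknown buses. All data is made up.
buses = {"BUS_A", "BUS_B", "BUS_C"}
branches = [("BUS_A", "BUS_B"), ("BUS_B", "BUS_C"), ("BUS_C", "BUS_D")]  # BUS_D is undefined

dangling = [(frm, to) for frm, to in branches if frm not in buses or to not in buses]
if dangling:
    print("Inconsistent model, branches with unknown buses:", dangling)
else:
    print("Topology references are consistent")
```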

Read More »

Effective grid management runs on simulation

For decades, power systems engineers have relied on modeling and simulation to ensure efficient and reliable power generation and distribution. Electromagnetic field analysis helps them design grid infrastructure components, structural and acoustics simulation is used to mitigate issues like transformer vibration and noise and electric breakdown and corona discharge analyses are used to accurately predict the performance and safety of high-voltage transmission line systems. Now, thanks to built-in simulation software capabilities that enable engineers to build custom model-based apps for users without modeling expertise, the scope of who can access advanced simulation technology in power systems organizations has dramatically increased. By building and distributing their own simulation apps, power grid companies are able to extend the benefits of simulation-based decision making to more collaborators within the R&D workflow and even to colleagues in the field. What follows is effective collaboration and accelerated innovation.

Simulation Supports Efficient Transformer Design

Take a company that develops power transformer equipment, for instance. Transformers are inherently noisy, but they can be designed to be as close to silent as possible. As with most things in this world, transformers involve many interconnected physics — electromagnetics, acoustics and structural mechanics, in this case — which means that multiphysics simulation software is the tool for the job when optimizing their designs. The R&D engineers responsible for coming up with one manufacturer’s new transformer designs all knew how to use finite element analysis (FEA) software, but they worked closely with other teams and departments without such expertise. For example, the designers tasked with building the final transformers had no familiarity with FEA. Instead, they preferred to use spreadsheets and other tools based on statistics and empirical models, which worked well for transformers they built frequently, but not for new designs or scenarios where different dimensions were introduced. In these

Read More »

Cisco bolsters DNS security package

The software can block domains associated with phishing, malware, botnets, and other high-risk categories such as cryptomining or new domains that haven’t been reported previously. It can also create custom block and allow lists and offers the ability to pinpoint compromised systems using real-time security activity reports, Brunetto wrote. According to Cisco, many organizations leave DNS resolution to their ISP. “But the growth of direct enterprise internet connections and remote work make DNS optimization for threat defense, privacy, compliance, and performance ever more important,” Cisco stated. “Along with core security hygiene, like a patching program, strong DNS-layer security is the leading cost-effective way to improve security posture. It blocks threats before they even reach your firewall, dramatically reducing the alert pressure your security team manages.” “Unlike other Secure Service Edge (SSE) solutions that have added basic DNS security in a ‘checkbox’ attempt to meet market demand, Cisco Secure Access – DNS Defense embeds strong security into its global network of 50+ DNS data centers,” Brunetto wrote. “Among all SSE solutions, only Cisco’s features a recursive DNS architecture that ensures low-latency, fast DNS resolution, and seamless failover.”
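As a purely illustrative sketch of the DNS-layer filtering described above (the domains, lists and categories here are hypothetical examples, not Cisco data), each lookup reduces to a few ordered checks:

```python
# Hypothetical custom allow/block lists and threat categories, for illustration only.
BLOCKED_CATEGORIES = {"phishing", "malware", "botnet", "cryptomining", "newly-seen"}

allow_list = {"intranet.example.com"}
block_list = {"bad.example.net"}
domain_category = {"bad.example.net": "phishing", "coins.example.org": "cryptomining"}

def dns_policy(domain: str) -> str:
    if domain in allow_list:
        return "resolve"   # explicit allow wins
    if domain in block_list:
        return "block"     # explicit block
    if domain_category.get(domain) in BLOCKED_CATEGORIES:
        return "block"     # domain falls into a blocked threat category
    return "resolve"

print(dns_policy("coins.example.org"))     # block
print(dns_policy("intranet.example.com"))  # resolve
```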

Read More »

HPE Aruba unveils raft of new switches for data center, campus modernization

And in large-scale enterprise environments embracing collapsed-core designs, the switch acts as a high-performance aggregation layer. It consolidates services, simplifies network architecture, and enforces security policies natively, reducing complexity and operational cost, Gray said. In addition, the switch offers the agility and security required at colocation facilities and edge sites. Its integrated Layer 4 stateful security and automation-ready platform enable rapid deployment while maintaining robust control and visibility over distributed infrastructure, Gray said. The CX 10040 significantly expands the capacity it can provide and the roles it can serve for enterprise customers, according to one industry analyst. “From the enterprise side, this expands on the feature set and capabilities of the original 10000, giving customers the ability to run additional services directly in the network,” said Alan Weckel, co-founder and analyst with The 650 Group. “It helps drive a lower TCO and provide a more secure network.”

Aimed as a VMware alternative

Gray noted that HPE Aruba is combining its recently announced Morpheus VM Essentials plug-in package, which offers a hypervisor-based package aimed at hybrid cloud virtualization environments, with the CX 10040 to deliver a meaningful alternative to Broadcom’s VMware package. “If customers want to get out of the business of having to buy VM cloud or Cloud Foundation stuff and all of that, they can replace the distributed firewall, microsegmentation and lots of the capabilities found in the old VMware NSX [networking software] and the CX 10k, and Morpheus can easily replace that functionality [such as VM orchestration, automation and policy management],” Gray said. The 650 Group’s Weckel weighed in on the idea of the CX 10040 as a VMware alternative:

Read More »

Indian startup Refroid launches India’s first data center CDUs

They use heat exchangers and pumps to regulate the flow and temperature of fluid delivered to equipment for cooling, while isolating the technology cooling system loop from facility systems. The technology addresses limitations of traditional air cooling, which industry experts say cannot adequately handle the heat generated by modern AI processors and high-density computing applications.

Strategic significance for India

Industry analysts view the development as a critical milestone for India’s data center ecosystem. “India generates 20% of global data, yet contributes only 3% to global data center capacity. This imbalance is not merely spatial — it’s systemic,” said Sanchit Vir Gogia, chief analyst and CEO at Greyhound Research. “The emergence of indigenously developed CDUs signals a strategic pivot. Domestic CDU innovation is a defining moment in India’s transition from data centre host to technology co-creator.” Neil Shah, VP for research and partner at Counterpoint Research, noted that major international players like Schneider, Vertiv, Asetek, Liquidstack, and Zutacore have been driving most CDU deployments in Indian enterprises and data centers. “Having a local indigenous CDU tech and supplier designed with Indian weather, infrastructure and costs in mind expands options for domestic data center demand,” he said.

AI driving data center cooling revolution

India’s data center capacity reached approximately 1,255 MW between January and September 2024 and was projected to expand to around 1,600 MW by the end of 2024, according to CBRE India’s 2024 Data Center Market Update. Multiple market research firms have projected the India data center market to grow from about $5.7 billion in 2024 to $12 billion by 2030. Bhavaraju cited aggressive projections for the sector’s expansion, with AI workloads expected to account for 30% of total workloads by 2030. “All of them need liquid cooling,” he said, noting that “today’s latest GPU servers – GB200 from Nvidia
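To put the thermal role of a CDU in concrete terms, the heat a coolant loop can carry scales with flow rate and the allowed temperature rise, Q = m_dot * c_p * dT. A rough, purely illustrative sketch with assumed numbers, not Refroid specifications:

```python
# Back-of-envelope sketch of coolant heat transport, Q = m_dot * c_p * dT.
# All numbers are assumptions for illustration, not vendor specifications.
cp_water = 4186.0      # J/(kg*K), specific heat capacity of water
flow_kg_per_s = 2.0    # assumed coolant mass flow through the loop
delta_t_kelvin = 10.0  # assumed temperature rise across the IT equipment

heat_kw = flow_kg_per_s * cp_water * delta_t_kelvin / 1000.0
print(f"Loop carries roughly {heat_kw:.0f} kW of heat")  # ~84 kW
```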

Read More »

Platform approach gains steam among network teams

Revisiting the platform vs. point solutions debate

The dilemma of whether to deploy an assortment of best-of-breed products from multiple vendors or go with a unified platform of “good enough” tools from a single vendor has vexed IT execs forever. Today, the pendulum is swinging toward the platform approach for three key reasons. First, complexity, driven by the increasingly distributed nature of enterprise networks, has emerged as a top challenge facing IT execs. Second, the lines between networking and security are blurring, particularly as organizations deploy zero trust network access (ZTNA). And third, to reap the benefits of AIOps, generative AI and agentic AI, organizations need a unified data store. “The era of enterprise connectivity platforms is upon us,” says IDC analyst Brandon Butler. “Organizations are increasingly adopting platform-based approaches to their enterprise connectivity infrastructure to overcome complexity and unlock new business value. When enhanced by AI, enterprise platforms can increase productivity, enrich end-user experiences, enhance security, and ultimately drive new opportunities for innovation.” In IDC’s Worldwide AI in Networking Special Report, 78% of survey respondents agreed or strongly agreed with the statement: “I am moving to an AI-powered platform approach for networking.” Gartner predicts that 70% of enterprises will select a broad platform for new multi-cloud networking software deployments by 2027, an increase from 10% in early 2024. The breakdown of silos between network and security operations will be driven by organizations implementing zero-trust principles as well as the adoption of AI and AIOps. “In the future, enterprise networks will be increasingly automated, AI-assisted and more tightly integrated with security across LAN, data center and WAN domains,” according to Gartner’s 2025 Strategic Roadmap for Enterprise Networking. While all of the major networking vendors have announced cloud-based platforms, it’s still relatively early days. For example, Cisco announced a general framework for Cisco

Read More »

Oracle to spend $40B on Nvidia chips for OpenAI data center in Texas

OpenAI has also expanded Stargate internationally, with plans for a UAE data center announced during Trump’s recent Gulf tour. The Abu Dhabi facility is planned as a 10-square-mile campus with 5 gigawatts of power. Gogia said OpenAI’s selection of Oracle “is not just about raw compute, but about access to geographically distributed, enterprise-grade infrastructure that complements its ambition to serve diverse regulatory environments and availability zones.”

Power demands create infrastructure dilemma

The facility’s power requirements raise serious questions about AI’s sustainability. Gogia noted that the 1.2-gigawatt demand — “on par with a nuclear facility” — highlights “the energy unsustainability of today’s hyperscale AI ambitions.” Shah warned that the power envelope keeps expanding. “As AI scales up and so does the necessary compute infrastructure needs exponentially, the power envelope is also consistently rising,” he said. “The key question is how much is enough? Today it’s 1.2GW, tomorrow it would need even more.” This escalating demand could burden Texas’s infrastructure, potentially requiring billions in new power grid investments that “will eventually put burden on the tax-paying residents,” Shah noted. Alternatively, projects like Stargate may need to “build their own separate scalable power plant.”

What this means for enterprises

The scale of these facilities explains why many organizations are shifting toward leased AI computing rather than building their own capabilities. The capital requirements and operational complexity are beyond what most enterprises can handle independently.

Read More »

New Intel Xeon 6 CPUs unveiled; one powers rival Nvidia’s DGX B300

He added that his read is that “Intel recognizes that Nvidia is far and away the leader in the market for AI GPUs and is seeking to hitch itself to that wagon.” Roberts said, “basically, Intel, which has struggled tremendously and has turned over its CEO amidst a stock slide, needs to refocus to where it thinks it can win. That’s not competing directly with Nvidia but trying to use this partnership to re-secure its foothold in the data center and squeeze out rivals like AMD for the data center x86 market. In other words, I see this announcement as confirmation that Intel is looking to regroup, and pick fights it thinks it can win.” He also predicted, “we can expect competition to heat up in this space as Intel takes on AMD’s Epyc lineup in a push to simplify and get back to basics.” Matt Kimball, vice president and principal analyst, who focuses on datacenter compute and storage at Moor Insights & Strategy, had a much different view about the announcement. The selection of the Intel sixth generation Xeon CPU, the 6776P, to support Nvidia’s DGX B300 is, he said, “important, as it validates Intel as a strong choice for the AI market. In the big picture, this isn’t about volumes or revenue, rather it’s about validating a strategy Intel has had for the last couple of generations — delivering accelerated performance across critical workloads.” Kimball said that, in particular, there are a “couple things that I would think helped make Xeon the chosen CPU.”

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote $200 billion between them to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd).

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences their own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
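The “LLM as a judge” pattern mentioned above is simple to sketch. The snippet below is illustrative only; call_model() is a hypothetical stand-in for whichever model API is actually used:

```python
def call_model(model: str, prompt: str) -> str:
    # Hypothetical stand-in: wire this to your model provider of choice.
    raise NotImplementedError

def best_answer(task: str, candidate_models: list[str], judge_model: str) -> str:
    # Several (often cheaper) models each produce a candidate answer.
    candidates = [call_model(m, task) for m in candidate_models]
    scores = []
    for answer in candidates:
        # A separate judge model scores each candidate; assumes it replies with a bare number.
        verdict = call_model(
            judge_model,
            "Rate this answer to the task from 1 to 10, reply with the number only.\n"
            f"Task: {task}\nAnswer: {answer}",
        )
        scores.append(float(verdict.strip()))
    return candidates[scores.index(max(scores))]
```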

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »