Uganda-Tanzania Oil Pipeline Secures First Tranche of External Financing

A Uganda-Tanzania petroleum pipeline project majority-owned by TotalEnergies SE has secured the first tranche of external syndicated financing.

The group of backers includes regional banks African Export-Import Bank, Standard Bank of South Africa Ltd., Stanbic Bank Uganda Ltd., KCB Bank Uganda and the Islamic Corporation for the Development of the Private Sector (ICD), the project joint venture EACOP Ltd. said in an online statement. EACOP did not disclose the amount.

The East African Crude Oil Pipeline will transport up to 246,000 barrels a day from the Lake Albert oilfields in Uganda to the port of Tanga, Tanzania, for export to the global market, according to EACOP.

The funding “demonstrates the support of financial institutions on this transformative regional infrastructure”, it said.

Construction was more than 50 percent complete at the end of last year, EACOP said, noting that over 8,000 Ugandan and Tanzanian citizens are employed on the project. Construction started last year and is expected to take two years to complete, according to EACOP.

Besides a 1,443-kilometer (897-mile), 24-inch buried pipeline, the project will also install six pumping stations, two pressure-reduction stations and a marine export terminal with a 3-megawatt solar plant, according to EACOP.
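As a quick sanity check on those figures, the arithmetic works out as follows (a minimal sketch; all inputs are the numbers quoted above, and the annual figure simply multiplies out the stated daily rate):

```python
# Unit-conversion check of the EACOP figures quoted above.
# All inputs come from the article; nothing here is measured data.

KM_TO_MILES = 0.621371

length_km = 1_443        # buried pipeline length, kilometers
capacity_bpd = 246_000   # peak throughput, barrels per day

length_miles = length_km * KM_TO_MILES   # ~897 miles
annual_barrels = capacity_bpd * 365      # ~89.8 million barrels per year

print(f"Length: {length_miles:.0f} miles")
print(f"Annual capacity: {annual_barrels / 1e6:.1f} million barrels")
```

At its stated peak rate the line would move roughly 90 million barrels a year to the Tanga terminal.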

Early last year TotalEnergies, which owns 62 percent of EACOP, said it had commissioned an assessment of its land acquisition process for the pipeline project and an associated oil development project, following allegations by an interfaith organization that the French energy giant failed to protect hundreds of graves.

“As the land acquisition process draws to a close, this mission will evaluate the land acquisition procedures implemented, the conditions for consultation, compensation and relocation of the populations concerned, and the grievance handling mechanism”, TotalEnergies said in a press release January 4, 2024.

“It will also assess the actions taken by TotalEnergies EP Uganda and EACOP to contribute to the improvement of the living conditions for the people affected by these land acquisitions and suggest additional measures to be implemented if needed”.

International environmental watchdog GreenFaith had documented cases of a lack of compensation for affected burial places; incomplete or poorly constructed relocation sites; risks of limited access to graves due to households having to relocate; and insufficient documentation to account for graves that would be affected.

New York City-based GreenFaith estimated over 2,000 graves in Uganda and Tanzania have been or would be affected by the pipeline designed to run from the town of Kabaale in Uganda to the port of Tanga in Tanzania. It said the figure was based on data from operator and 62-percent owner TotalEnergies itself.

It accused the company of failing to respect local traditions and follow international best practices and engineering standards in treating graves along the EACOP route. 

The most common complaint was an inadequacy of compensation for affected graves, GreenFaith said.

TotalEnergies dismissed the GreenFaith report. It said that in accordance with World Bank project standards on cultural heritage, the project partners “developed a management plan for cultural and archaeological heritage” and conducted interviews “with key stakeholders, including communities”, as well as created an “inventory of sites of archaeological, historical, cultural, artistic and religious importance”.

“As much as possible, the project has adopted an avoidance protocol when choosing locations”, TotalEnergies said in a statement emailed to Rigzone at the time. “In the event that a cultural site cannot be avoided, precautions were taken to minimize the disruptions, inform and engage with stakeholders and ensure that cultural standards are strictly respected.

“Relocation of sacred sites involves strict adherence to the respective families/ clan’s traditional beliefs or customs, e.g. conducting relocation ceremonies to shift the spirits from sacred trees; sacred watercourses; springs and marshes; traditional religious cultural sites (clan sites and family shrines) to another place”.

On claims of a lack of compensation, TotalEnergies said affected residents “are compensated according to the values agreed with the Chief valuer”.

As of February 2025 over 99 percent of compensation agreements were paid, 100 percent of houses built and 97 percent of grievances resolved, according to information on TotalEnergies’ website.

The other owners are Uganda National Oil Co. Ltd. (15 percent), Tanzania Petroleum Development Corp. (15 percent) and China National Offshore Oil Corp. (8 percent).


Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Ubuntu namespace vulnerability should be addressed quickly: Expert

Thus, “there is little impact of not ‘patching’ the vulnerability,” he said. “Organizations using centralized configuration tools like Ansible may deploy these changes with regularly scheduled maintenance or reboot windows.”

Features supposed to improve security

Ironically, last October Ubuntu introduced AppArmor-based features to improve security by reducing the attack surface


Google Cloud partners with mLogica to offer mainframe modernization

Other than the partnership with mLogica, Google Cloud also offers a variety of other mainframe migration tools, including Radis and G4 that can be employed to modernize specific applications. Enterprises can also use a combination of migration tools to modernize their mainframe applications. Some of these tools include the Gemini-powered


EnergyPathways pushes to win stakeholder support for MESH

EnergyPathways has published an update as it pushes to engage with stakeholders on its proposed Marram Energy Storage Hub (MESH) project in the UK Irish Sea off the coast of Northwest England. The update outlines the socioeconomic and environmental benefits associated with MESH, with the company seeking to win over groups like the North Sea Transition Authority (NSTA) and local MPs. EnergyPathways published the briefing document, outlining the conclusions from its hydrogen and clean energy pre-front end engineering design (pre-FEED) activities. The document also seeks to highlight the role that hydrogen and compressed air storage technologies can play in reducing carbon dioxide (CO2) emissions, in helping to drive the UK energy transition and in enhancing energy security. The MESH project is designed to combine a number of technologies, initially storing natural gas and power generated from offshore wind in the Irish Sea, before being expanded to include hydrogen storage as the UK’s hydrogen economy develops. EnergyPathways moved to expedite the project towards the end of last year, citing changes to the regulatory and fiscal environment in the UK under the Labour government, which came into power in July. The company has touted MESH as a project that could help the UK meet both its energy security and decarbonisation goals. Since then, EnergyPathways has hit various milestones with the development of MESH, including entering into a £5.1m loan facility and selecting Wood as the lead engineering partner. The company also secured licence operatorship approval for Block 110/4a, which includes the MESH site, in January.

On schedule and on track

In its latest update, EnergyPathways said pre-FEED activities were on schedule and on track for a final investment decision (FID), subject to the necessary outstanding licences being granted. “We have received indications from the government that a decision on awarding the gas


The North Sea: A critical crossroads for UK energy and economic future

The North Sea has long been the backbone of Britain’s energy security and economic prosperity. Today, we stand at a pivotal moment that will determine whether this vital asset continues to drive our national success or becomes a missed opportunity. The North Sea Transition Taskforce’s report reveals an urgent challenge: a potentially damaging gap is emerging between the North Sea’s current oil and gas operations and its renewable energy potential. This isn’t just about an industrial transition—it’s a national imperative. The risks are clear. Without decisive action, we could see a premature decline in oil and gas production, loss of critical skills, and degradation of existing infrastructure. Meanwhile, renewable investments are stalling, threatening our net-zero ambitions and economic resilience. We’re not advocating for endless fossil fuel extraction. Instead, we’re calling for a strategic, measured approach that recognises the North Sea’s ongoing importance. Critically, North Sea gas remains lower in carbon intensity than imported alternatives. Every barrel we responsibly extract keeps jobs in the UK, generates exchequer revenues, and supports our energy security. Our recommendations are straightforward. First, the government must replace the current Energy Profits Levy with a predictable, stable fiscal regime that encourages long-term investment. Second, we need clarity on environmental requirements and drilling permissions, particularly for already consented fields. Third, we must commit to a consistent, long-term pipeline of support for renewable investments. Most importantly, this transition requires unprecedented collaboration. We’re proposing a national mission approach, with a ministerially led committee bringing together UK and Scottish governments, agencies, industry, and unions. This isn’t about political point-scoring—it’s about creating a comprehensive, decades-long plan that transcends short-term political cycles. The opportunity is immense. 
By repurposing existing skills and infrastructure, we can transform the North Sea into a global renewables powerhouse. Offshore wind, carbon capture, and hydrogen technologies aren’t just alternatives—they’re


Miliband broaches talks with Centrica to extend Rough beyond 2030

Centrica executives have met with energy secretary Ed Miliband to discuss the future of the Rough gas storage facility in the North Sea. The UK energy company’s group chief executive Christopher O’Shea and the managing director of Centrica Energy Storage Martin Scargill recently attended a meeting with Miliband and others at the Hull Chamber of Commerce. “We’re in discussions with [government] at a high level to find a way to keep Rough open as a key strategic asset for the UK,” O’Shea said. “I can’t go into any details of the discussions but I am hopeful that HMG (His Majesty’s Government) will be able to move quickly enough to enable this crucial asset to remain open.” He indicated in February that Centrica was yet to decide on a direction for the Rough gas storage site, after the company revealed an anticipated loss at Centrica Energy Storage, stating that “as it currently stands, we wouldn’t be refilling it”. The owner of the country’s largest gas storage facility is seeking to refit the asset as part of a £2 billion refurbishment so that it can function as a hydrogen storage facility. Centrica has said that it is seeking to secure a cap-and-floor mechanism to operate the asset. At the height of the energy crisis, the licence to operate the Rough gas storage facility was renewed, but it is now due to expire in 2030. A spokesperson for the Department for Energy Security and Net Zero (DESNZ) said that the discussions did not go “into detail”, but that the government’s position “remains what it was before”. “We’re not providing updates on what that will look like but we’re having discussions with Centrica about it. Everything’s on the table on what the future would be just as long as it provides value for money,” the


Mineral (in)security

With 2024 going down in the history books as the first year to surpass 1.5°C above pre-industrial levels, the spotlight is firmly on accelerating the global net zero agenda and delivering the solutions necessary to mitigate the worst impacts of climate change. The transition from fossil sources to lower-carbon technologies has significantly increased the demand for essential minerals and rare earth elements, such as copper, lithium, and cobalt, and has consequently emerged as a key part of resolving the conflict between Russia and Ukraine. Minerals and rare earths are used extensively in devices such as smartphones, computers, and TVs, but are also critical components in the clean energy and defence industries. Although rare earth elements are abundant in the earth’s crust, they are typically mixed with other minerals, making them difficult and costly to extract and process.

© Bloomberg. Neodymium at the Inner Mongolia Baotou Steel Rare-Earth Hi-Tech Co. factory in Baotou, Inner Mongolia. Photographer: Nelson Ching/Bloomberg

The global production of rare earths has surged over the last 30 years, increasing from around 75 kilotonnes in 1995 to over 350 kilotonnes in 2023. The increasing demand has in turn highlighted that achieving the global net zero agenda is intrinsically linked to accessing mineral and rare earth elements. Providing context, a typical electric car requires around 200 kg of minerals and rare earths, including batteries and motors, which is up to six times greater than that of a comparable combustion engine car, according to the International Energy Agency (IEA). The IEA also highlights that offshore wind requires approximately 15 tonnes of critical minerals per megawatt (t/MW) of capacity, with onshore wind and solar photovoltaics requiring up to 10 t/MW and 7 t/MW, respectively. The production of rare earths will need to increase sevenfold by 2040 to meet the needs of the clean energy sector
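The quoted figures can be tied together with some back-of-envelope arithmetic (a hedged sketch: the 2040 number simply applies the excerpt's "sevenfold" multiplier to 2023 output, and the 1 GW wind-farm example is hypothetical):

```python
# Back-of-envelope arithmetic on the rare-earth and mineral-intensity
# figures quoted in the excerpt above (IEA-derived numbers).

production_1995_kt = 75    # global rare earth output, kilotonnes
production_2023_kt = 350

growth_factor = production_2023_kt / production_1995_kt  # ~4.7x over ~30 years
implied_2040_kt = production_2023_kt * 7                 # "sevenfold by 2040"

# Critical-mineral intensity by technology, tonnes per MW of capacity
intensity_t_per_mw = {"offshore wind": 15, "onshore wind": 10, "solar PV": 7}

# Hypothetical example: minerals implied by a 1 GW offshore wind farm
offshore_1gw_t = intensity_t_per_mw["offshore wind"] * 1_000

print(f"1995 -> 2023 growth: {growth_factor:.1f}x")
print(f"Implied 2040 output: {implied_2040_kt:,} kt")
print(f"1 GW offshore wind: {offshore_1gw_t:,} t of critical minerals")
```

Even a single gigawatt-scale offshore wind farm implies thousands of tonnes of critical minerals, which is the scale driving the supply concerns described above.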


Demand-side flexibility: A key to cheap clean power

Demand-side flexibility, which is the ability to shift or shed electricity consumption during peak times, is an underestimated but critical part of expanding clean power systems. As future power systems will largely rely on variable generation from wind and solar, demand flexibility across growing electricity users can reduce strain on the grid by smoothing peaks, offsetting more expensive and often higher-emitting generation, and therefore reduce costs and speed up decarbonisation. Demand flexibility is already here today, as consumers can dynamically adjust some consumption and save money in response to price signals; however, the opportunity is even greater as the scale of automated responses grows. For example, in 2023, Octopus Energy customers on an “Agile” tariff saved hundreds of pounds (versus being on a standard tariff) through shifting energy use. Examples of demand flexibility include: charging electric vehicles (EVs) during off-peak hours, heating or cooling rooms a few hours ahead of need in insulated buildings (“pre-heating” or “pre-cooling”), and thermal energy solutions that store and discharge heat for industrial processes (“industrial heat batteries”), lessening peak loads and reducing electricity bills. Data centres also have some potential to perform non-critical data processes at low electricity demand periods – and companies with international operations could choose to shift loads across geographies, taking advantage of regional fluctuations in supply and demand. Demand flexibility offers an affordable, secure route to planning future energy systems, helping to lower upfront investment needs, system running costs and consumer bills. While the expansion of grid capacity and energy storage remain critical for future systems, thanks to the growth of electrification and automated flexibility solutions, demand is now poised to play a larger role in helping to actively balance power systems.
Demand flexibility could also help to offset immediate needs for grid upgrades, reducing uptake barriers as electrification expands. Recent analysis from
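The price-signal mechanism the excerpt describes can be illustrated with a toy calculation (a minimal sketch using invented two-rate tariff prices, not Octopus Energy's actual rates):

```python
# Illustrative demand-shifting saving under a hypothetical two-rate tariff.
# Both prices below are invented for the example, not real tariff data.

PEAK_PRICE_GBP = 0.35      # per kWh, hypothetical evening peak rate
OFFPEAK_PRICE_GBP = 0.10   # per kWh, hypothetical overnight rate

def annual_saving(kwh_per_day: float, shiftable_fraction: float) -> float:
    """Yearly saving from moving a share of daily consumption off-peak."""
    shifted_kwh = kwh_per_day * shiftable_fraction
    return shifted_kwh * (PEAK_PRICE_GBP - OFFPEAK_PRICE_GBP) * 365

# Example: an EV charging 10 kWh/day, fully shifted to off-peak hours
print(f"EV charging saving: GBP {annual_saving(10, 1.0):.0f}/year")
```

Even this toy case lands in the hundreds of pounds a year, the same order of magnitude as the Agile-tariff savings cited in the excerpt.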


Hunting brings in $38m subsea contract haul

Precision engineering group Hunting has added three fresh contracts worth around $38 million to its subsea technologies sales order book. This has increased the value of its book from $72.5m recorded on 31 December 2024, to approximately $85m as of 31 March 2025. The group’s Enpro business secured decommissioning contracts with two clients in the North Sea, totalling roughly $23m. The clients will utilise Hunting’s proprietary solutions for the attic oil recovery phase of these decommissioning projects. These awards are part of multi-year decommissioning programmes in the North Sea, with the possibility of further orders being awarded in 2026. The awards demonstrate Enpro’s ability to provide subsea solutions from first-production to decommissioning, utilising the company’s modular product solutions. In addition, Hunting secured a new order for its titanium stress joints for a project in the Gulf of Mexico, with a major oil and gas company. The order totals around $15m and represents a new client for the group for this product line. Hunting chief executive Jim Johnson said: “The progress delivered by our subsea businesses during Q1 2025 demonstrates that our widening product offering accesses the whole lifecycle of an oil and gas field, from first-production to abandonment. “Further, with the securing of new titanium stress joint orders with a new client in the Gulf of Mexico, Hunting has demonstrated that our leading riser solution is increasingly being adopted by the majors and large Independents operating in deepwater regions.” Hunting recently undertook a $17.5m buyout for organic oil recovery (OOR) technology. The company has also been looking to restore profitability to its Europe, Middle East and Africa (EMEA) business. This restructuring has led to concerns for 200 jobs across the UK, including at its base in Altens, known as Badentoy, as well as the Scottish village of Fordoun and its


Airtel connects India with 100Tbps submarine cable

“Businesses are becoming increasingly global and digital-first, with industries such as financial services, data centers, and social media platforms relying heavily on real-time, uninterrupted data flow,” Sinha added. The 2Africa Pearls submarine cable system spans 45,000 kilometers, involving a consortium of global telecommunications leaders including Bayobab, China Mobile International, Meta, Orange, Telecom Egypt, Vodafone Group, and WIOCC. Alcatel Submarine Networks is responsible for the cable’s manufacturing and installation, the statement added. This cable system is part of a broader global effort to enhance international digital connectivity. Unlike traditional telecommunications infrastructure, the 2Africa Pearls project represents a collaborative approach to solving complex global communication challenges. “The 100 Tbps capacity of the 2Africa Pearls cable significantly surpasses most existing submarine cable systems, positioning India as a key hub for high-speed connectivity between Africa, Europe, and Asia,” said Prabhu Ram, VP for Industry Research Group at CyberMedia Research. According to Sinha, Airtel’s infrastructure now spans “over 400,000 route kilometers across 34+ cables, connecting 50 countries across five continents. This expansive infrastructure ensures businesses and individuals stay seamlessly connected, wherever they are.” Gogia further emphasizes the broader implications, noting, “What also stands out is the partnership behind this — Airtel working with Meta and center3 signals a broader shift. India is no longer just a consumer of global connectivity. We’re finally shaping the routes, not just using them.”


Former Arista COO launches NextHop AI for customized networking infrastructure

Sadana argued that unlike traditional networking, where an IT person can just plug a cable into a port and it works, AI networking requires intricate, custom solutions. The core challenge is creating highly optimized, efficient networking infrastructure that can support massive AI compute clusters with minimal inefficiencies.

How NextHop is looking to change the game for hyperscale networking

NextHop AI is working directly alongside its hyperscaler customers to develop and build customized networking solutions. “We are here to build the most efficient AI networking solutions that are out there,” Sadana said. More specifically, Sadana said that NextHop is looking to help hyperscalers in several ways, including:

Compressing product development cycles: “Companies that are doing things on their own can compress their product development cycle by six to 12 months when they partner with us,” he said.

Exploring multiple technological alternatives: Sadana noted that hyperscalers might try and build on their own and will often only be able to explore one or two alternative approaches. With NextHop, Sadana said his company will enable them to explore four to six different alternatives.

Achieving incremental efficiency gains: At the massive cloud scale that hyperscalers operate at, even an incremental one percent improvement can have an oversized outcome. “You have to make AI clusters as efficient as possible for the world to use all the AI applications at the right cost structure, at the right economics, for this to be successful,” Sadana said. “So we are participating by making that infrastructure layer a lot more efficient for cloud customers, or the hyperscalers, which, in turn, of course, gives the benefits to all of these software companies trying to run AI applications in these cloud companies.”

Technical innovations: Beyond traditional networking

In terms of what the company is actually building now, NextHop is developing specialized network switches


Microsoft abandons data center projects as OpenAI considers its own, hinting at a market shift

A potential ‘oversupply position’

In a new research note, TD Cowen analysts reportedly said that Microsoft has walked away from new data center projects in the US and Europe, purportedly due to an oversupply of compute clusters that power AI. This follows reports from TD Cowen in February that Microsoft had “cancelled leases in the US totaling a couple of hundred megawatts” of data center capacity. The researchers noted that the company’s pullback was a sign of it “potentially being in an oversupply position,” with demand forecasts lowered. OpenAI, for its part, has reportedly discussed purchasing billions of dollars’ worth of data storage hardware and software to increase its computing power and decrease its reliance on hyperscalers. This fits with its planned Stargate Project, a $500 billion, US President Donald Trump-endorsed initiative to build out its AI infrastructure in the US over the next four years. Based on the easing of exclusivity between the two companies, analysts say these moves aren’t surprising. “When looking at storage in the cloud — especially as it relates to use in AI — it is incredibly expensive,” said Matt Kimball, VP and principal analyst for data center compute and storage at Moor Insights & Strategy. “Those expenses climb even higher as the volume of storage and movement of data grows,” he pointed out. “It is only smart for any business to perform a cost analysis of whether storage is better managed in the cloud or on-prem, and moving forward in a direction that delivers the best performance, best security, and best operational efficiency at the lowest cost.”


PEAK:AIO adds power, density to AI storage server

There is also the fact that many people working with AI are not IT professionals, such as professors, biochemists, scientists, doctors and clinicians, and they don’t have a traditional enterprise IT department or a data center. “It’s run by people that wouldn’t really know, nor want to know, what storage is,” he said. While the new AI Data Server is a Dell design, PEAK:AIO has worked with Lenovo, Supermicro, and HPE as well as Dell over the past four years, offering to convert their off-the-shelf storage servers into hyper-fast, inexpensive, AI-specific storage servers that work with all the protocols at Nvidia, like NVLink, along with NFS and NVMe over Fabric. It also greatly increased storage capacity by going with 61TB drives from Solidigm. SSDs from the major server vendors typically maxed out at 15TB, according to the vendor. PEAK:AIO competes with VAST, WekaIO, NetApp, Pure Storage and many others in the growing AI workload storage arena. PEAK:AIO’s AI Data Server is available now.


SoftBank to buy Ampere for $6.5B, fueling Arm-based server market competition

SoftBank’s announcement suggests Ampere will collaborate with other SBG companies, potentially creating a powerful ecosystem of Arm-based computing solutions. This collaboration could extend to SoftBank’s numerous portfolio companies, including Korean/Japanese web giant LY Corp, ByteDance (TikTok’s parent company), and various AI startups. If SoftBank successfully steers its portfolio companies toward Ampere processors, it could accelerate the shift away from x86 architecture in data centers worldwide.

Questions remain about Arm’s server strategy

The acquisition, however, raises questions about how SoftBank will balance its investments in both Arm and Ampere, given their potentially competing server CPU strategies. Arm’s recent move to design and sell its own server processors to Meta signaled a major strategic shift that already put it in direct competition with its own customers, including Qualcomm and Nvidia. “In technology licensing where an entity is both provider and competitor, boundaries are typically well-defined without special preferences beyond potential first-mover advantages,” Kawoosa explained. “Arm will likely continue making independent licensing decisions that serve its broader interests rather than favoring Ampere, as the company can’t risk alienating its established high-volume customers.” Industry analysts speculate that SoftBank might position Arm to focus on custom designs for hyperscale customers while allowing Ampere to dominate the market for more standardized server processors. Alternatively, the two companies could be merged or realigned to present a unified strategy against incumbents Intel and AMD. “While Arm currently dominates processor architecture, particularly for energy-efficient designs, the landscape isn’t static,” Kawoosa added. “The semiconductor industry is approaching a potential inflection point, and we may witness fundamental disruptions in the next 3-5 years — similar to how OpenAI transformed the AI landscape.
SoftBank appears to be maximizing its Arm investments while preparing for this coming paradigm shift in processor architecture.”


Nvidia, xAI and two energy giants join genAI infrastructure initiative

The new AIP members will “further strengthen the partnership’s technology leadership as the platform seeks to invest in new and expanded AI infrastructure. Nvidia will also continue in its role as a technical advisor to AIP, leveraging its expertise in accelerated computing and AI factories to inform the deployment of next-generation AI data center infrastructure,” the group’s statement said. “Additionally, GE Vernova and NextEra Energy have agreed to collaborate with AIP to accelerate the scaling of critical and diverse energy solutions for AI data centers. GE Vernova will also work with AIP and its partners on supply chain planning and in delivering innovative and high efficiency energy solutions.” The group claimed, without offering any specifics, that it “has attracted significant capital and partner interest since its inception in September 2024, highlighting the growing demand for AI-ready data centers and power solutions.” The statement said the group will try to raise “$30 billion in capital from investors, asset owners, and corporations, which in turn will mobilize up to $100 billion in total investment potential when including debt financing.” Forrester’s Nguyen also noted that the influence of two of the new members — xAI, owned by Elon Musk, along with Nvidia — could easily help with fundraising. Musk, “with his connections, he does not make small quiet moves,” Nguyen said. “As for Nvidia, they are the face of AI. Everything they do attracts attention.” Info-Tech’s Bickley said that the astronomical sums involved in genAI investments are mind-boggling. And yet even more investment is needed — a lot more.


Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote, between them, $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.


John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for enterprises and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
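The "LLM as a judge" idea mentioned in the excerpt can be sketched in a few lines: several inexpensive models each draft an answer, and a judge model scores the drafts so only the best one is returned. The sketch below is a hypothetical illustration, not any provider's API; the `draft_with_model` and `judge_score` functions are stand-ins for real LLM calls, with the judge faked by a trivial heuristic.

```python
# Minimal sketch of the LLM-as-judge pattern: cheap models draft,
# a judge model picks the winner. All model calls are stubbed out;
# in a real system each would be an API call to an LLM provider.

def draft_with_model(model_name: str, task: str) -> str:
    """Placeholder for a call to an inexpensive drafting model."""
    return f"[{model_name}] answer to: {task}"

def judge_score(draft: str) -> float:
    """Placeholder for a judge-model call returning a quality score in [0, 1].
    Faked here with a trivial length heuristic for illustration only."""
    return min(len(draft) / 100.0, 1.0)

def best_answer(task: str, models: list[str]) -> str:
    """Draft with every model, let the judge rank, keep the top draft."""
    drafts = [draft_with_model(m, task) for m in models]
    return max(drafts, key=judge_score)

print(best_answer("summarize Q4 results", ["model-a", "model-b", "model-c"]))
```

Because the judge is just another model call, the same loop scales to "three or more models" as the excerpt suggests: adding a drafting model is one more list entry, and swapping the judge for a stronger model changes only `judge_score`.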

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »