
SAP data sovereignty service lets customers run cloud workloads inside their data centers


A range of developments, primarily geopolitical in nature, has transformed this outlook. Now, sovereignty is increasingly tied up with the growing sense that operational, political, and even technological independence is essential, especially for EU-based enterprises.

SAP has embraced this concern. “The digital resilience of Europe depends on sovereignty that is secure, scalable and future-ready,” said Martin Merz, president, SAP Sovereign Cloud. “SAP’s full-stack sovereign cloud offering delivers exactly that, giving customers the freedom to choose their deployment model while helping ensure compliance up to the highest standards.”

This reflects the company’s commitment to supporting the EU’s “digital autonomy,” he said. The company has made digital sovereignty a strategic priority, and will invest €20 billion ($23.3 billion) to develop new digital sovereignty products for the EU as well as for other territories.

A decade ago, the idea of cloud services promoted the notion of a single global infrastructure market. Now it looks just as likely that there will be a balkanization of global cloud infrastructure into geographical domains.

“For decades, enterprises have handed over too much power to their cloud providers – power over infrastructure, power over availability, and most importantly, power over their own data,” commented Garima Kapoor, co-founder and co-CEO of US AI object storage company, MinIO.

“CIOs are realizing that outsourcing control to a public cloud provider is no longer an option. The concept of sovereignty is evolving. It’s no longer just a means of maintaining compliance with data regulations but is now viewed as a strategic and architectural imperative for enterprises that want to own their digital destiny,” she said.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Quinas readies UltraRam, flash memory with DRAM speed

For starters, the memory is built on what is called a “III-V technology,” a class of semiconductor materials that are composed of elements from groups III and V of the periodic table, the company stated. These materials have unique properties that make them ideal for use in electronic devices such

Read More »

7 Wi-Fi certifications to bolster wireless networking skills

Organization: Certified Wireless Network Professionals (CWNP)
Price: $149.99 for the CWTS-102 exam
How to prepare: CWNP offers several resources to prepare, including a live training class, a self-paced training kit, a study and reference guide, an electronic practice test, an eLearning module, an eLearning bundle, and a test and go

Read More »

Microsoft’s hollow core fiber delivers the lowest signal loss ever

The manufacturing process requires precise control of membrane thicknesses around 500 nanometers. The nested tube design adds complexity compared with conventional fiber production, though Microsoft is leveraging its resources to address scaling challenges, he added. “Financial trading was the earliest sector to benefit, with its well-funded operators contributing to

Read More »

Petronas Posts Lower Revenue for H1

Malaysia’s Petroliam Nasional Berhad (Petronas) has posted revenues of MYR 132.6 billion ($31.4 billion) for the first half of 2025, versus $41 billion for H1 2024. The company, in its report, attributed the decline to discontinued operations from the Engen Group divestment in May 2024, as well as unfavorable foreign exchange. Petronas also cited lower average realized prices from petroleum products, crude oil, and condensates following the downward trend in benchmark prices. Profit after tax was $6.2 billion, dropping 19 percent from H1 2024. EBITDA totaled $12.8 billion, lower by 15 percent in line with lower profits, Petronas said. “Petronas remains unwavering in our commitment to strengthen our business and portfolio resilience for long-term growth amid an increasingly challenging macro environment in the first half of 2025”, said Muhammad Taufik, Petronas President and Group CEO. “Through the focused execution of our Energy Transition Strategy, portfolio optimization and prudent capital management, we are expediting a critical transformation to continue delivering energy safely, reliably and sustainably to those we serve”. Petronas said it expects oil prices to remain low due to ongoing geopolitical tensions, uncertainties in the macroeconomic landscape, changing regulatory conditions, and the swift reversal of OPEC+ production cuts, which will continually alter global energy dynamics and trade patterns. Petronas said it is focusing on providing advantaged hydrocarbons with reduced costs and emissions. This initiative is supported by numerous upstream discoveries and developments, especially in Malaysia, Suriname, Indonesia, and Angola, it said. The inaugural liquefied natural gas cargo from the LNG Canada facility also marked a strategic expansion of Petronas’ supply footprint. Petronas said it is enhancing its integrated low-carbon value chain through focused investments in solar energy and energy solutions hubs.

Read More »

TotalEnergies Gets New Blocks in Congo Brazzaville, Nigeria

TotalEnergies SE has received two exploration permits offshore Nigeria and one offshore Republic of the Congo, growing its footprint in the two countries. The Nzombo exploration license sits 100 kilometers (62.14 miles) off the coast of Pointe-Noire, near the Moho production facilities operated by TotalEnergies EP Congo. The block spans 1,000 square kilometers (386.1 square miles), according to the French energy giant. “The work program includes the drilling of one exploration well, which is expected to spud before the end of 2025”, TotalEnergies said in a statement on its website. “This award of a promising exploration permit, with the material Nzombo prospect, reflects our continued strategy of expanding our exploration portfolio with high-impact prospects, which can be developed leveraging our existing facilities, and confirms our longstanding partnership with the Republic of the Congo”, said Kevin McLachlan, senior vice president for exploration at TotalEnergies. In the new block TotalEnergies is operator with a 50 percent stake. QatarEnergy and national oil company SNPC own 35 percent and 15 percent respectively. Meanwhile for the new Nigerian awards, TotalEnergies and local player South Atlantic Petroleum Ltd. (SAPETRO) signed a production sharing contract (PSC) containing PPLs 2000 and 2001, TotalEnergies said Tuesday. The West African country awarded the permits under its 2024 exploration bidding. TotalEnergies is operator with an 80 percent stake while SAPETRO owns 20 percent. Located in the “prolific” West Delta basin, the two permits cover about 2,000 square kilometers (772.2 square miles), TotalEnergies said. “The work program includes drilling one firm exploration well”, it said. “TotalEnergies is honored to be the first international company to be awarded exploration licenses in a bid round in Nigeria in more than a decade, marking a new milestone in our long-term partnership with the country”, said McLachlan. “These promising block captures are fully aligned with our strategy of strengthening

Read More »

Trump administration axes $679M in offshore wind infrastructure funding

The U.S. Department of Transportation is withdrawing or terminating $679 million in funding for 12 port and infrastructure upgrades that would support offshore wind projects, it announced Friday. “As part of the Department of Transportation’s review of all discretionary grant programs with obligated and unobligated projects, USDOT identified 12 offshore wind grants and project selections that were not aligned with the goals and priorities of the administration,” the department said in a release. The defunded projects include the Sparrows Point Steel Marshalling Port Project, which had been awarded $47.3 million in Port Infrastructure Development Program (PIDP) funding; the Arthur Kill Terminal, an offshore wind port in New York, which had been allocated $48 million in PIDP funding; and the Humboldt Bay Offshore Wind Heavy Lift Multipurpose Marine Terminal, which had been allocated $426.7 million in Nationally Significant Freight and Highway Projects funding. The department’s release said the Trump administration has “refocused” DOT and its Maritime Administration “on rebuilding America’s shipbuilding capacity, unleashing more reliable, traditional forms of energy, and utilizing the nation’s bountiful natural resources to unleash American energy.” “Where possible, funding from these projects will be recompeted to address critical port upgrades and other core infrastructure needs of the United States,” DOT said. This funding clawback is the latest in a series of moves from President Donald Trump to reverse course on federal support for offshore wind. Last month the U.S. Department of Commerce opened a probe into wind turbine imports, setting the stage for new tariffs, and it rescinded all “wind energy areas” the Biden administration had designated for future lease sales on the U.S. Outer Continental Shelf. Also last month, the Department of the Interior issued a stop work order to the 700-MW Revolution Wind project and said in court that it intends to revoke the approved construction and operations plan

Read More »

USA, India Keep Door Open to Trade Talks

President Donald Trump said India offered to cut tariffs, while New Delhi signaled it’s continuing to negotiate a trade agreement with the US, indicating both sides are keeping the door open to resolving tensions.  Trump said in a social media post on Monday that India offered to reduce its tariffs on US goods to zero, without saying when the concession was made. He added that “it’s getting late” and India should have made the offer “years ago.”  India’s government hasn’t officially responded to Trump’s remarks, but Commerce Minister Piyush Goyal said at an event on Tuesday that both sides continue to engage to reach a trade agreement. “We are in dialog with the US for a bilateral trade agreement,” Goyal said in New Delhi.  There are no formal talks taking place between the trade negotiators at the moment after a US team canceled its trip to India in August. Bloomberg News reported last week that the two nations are keeping informal communication channels open. Trump’s latest post may signal that the US remains open to trade talks, according to Abhijit Das, a former Indian trade official and author on trade matters. “This is his attempt to walk back on his earlier hardened stance on India,” he said. Trump slapped India with 50% tariffs on exports to the US to penalize it for its trade barriers and its purchases of Russian oil. India was among the first countries to open trade talks with the Trump administration, with the US president and Prime Minister Narendra Modi committing to a bilateral deal by the fall of this year. The two sides failed to clinch a deal after five rounds of talks, with the US getting increasingly frustrated with New Delhi’s unwillingness to budge on key issues, such as opening up its dairy and agriculture markets. Trump later

Read More »

WTI Posts Biggest Gain Since July

Oil edged up by the most since late July as technical buying supported a rally driven by signs of enduring physical market tightness. West Texas Intermediate rose 2.5% to settle near $66 a barrel. Ukraine struck two oil refineries in Russia in a continued assault on energy infrastructure that has begun to hurt flows, pushing Moscow’s crude-processing runs to the lowest since May 2022 last month. The conflict has contributed to unforeseen tightness in a market that was expected to be overwhelmed by OPEC crude at this time of year. Commodity trading advisers, meanwhile, were steadily buying throughout the session, helping push prices higher, according to Daniel Ghali, a commodity strategist at TD Securities. However, the algorithmically driven traders will sell both benchmarks in any scenario for prices over the coming week, indicating that crude’s run may reverse soon, he said. Russian flows have been in the spotlight over the past few weeks amid US efforts to pressure Moscow to make peace in Ukraine by targeting India, a top importer of its crude. Treasury Secretary Scott Bessent said Washington would look at sanctions on Russia this week. Elsewhere, US stockpiles have remained low at the key storage hub of Cushing, Oklahoma. The wealth of bullish near-term factors — from the war in Ukraine to the US deploying naval forces off the coast of Venezuela — contributed to timespreads widening in their backwardated structures toward the end of last week. “Sentiment in the oil market is shifting from very negative to more neutral,” said Arne Lohmann Rasmussen, chief analyst at A/S Global Risk Management. “The main support for oil prices is the geopolitical premium. No one believes anymore that a peace deal between Russia and Ukraine is imminent.” The jolt of strength comes amid a bearish chapter for crude. US benchmark

Read More »

Northern Lights achieves first injection, storage of CO2 volumes in North Sea

The Northern Lights Joint Venture, comprised of Equinor, Shell plc, and TotalEnergies, has injected and stored the first volumes of CO2 into the reservoir 2.6 km under the seabed and 100 km off the coast of Western Norway. The CO2 is transported from Heidelberg Materials’ cement factory in Brevik to Øygarden via two 130-m-long vessels (Northern Pathfinder and Northern Pioneer) designed by Shell engineers. The liquefied-carbon carriers are each capable of transporting 7,500 cu m of CO2 in a single trip, according to Shell. The CO2 is then offloaded and transported through a 100-km pipeline and injected into the Aurora reservoir. Heidelberg Materials is expected to capture around 400,000 tonnes/year (tpy) of CO2 from its cement factory in Brevik, two hours south of Oslo, accounting for about half the plant’s total emissions, according to Shell. Equinor, as the Technical Service Provider (TSP), has been responsible for the construction of the Øygarden receiving infrastructure and the offshore infrastructure on behalf of the joint venture. Equinor also will have operational responsibility for the CO2 plant. The start of CO2 injection completes the first phase of the development, which has a total capacity of 1.5 million tpy of CO2. The project’s industrial customers include Hafslund Celsio and Heidelberg Materials in Norway, Yara in the Netherlands, and Ørsted in Denmark. In March, the owners of Northern Lights made the final investment decision for Phase 2 of the development, which will increase transport and storage capacity to a minimum of 5 million tpy of CO2 from 2028. FID for Phase 2 followed the signing of an agreement to transport and store up to 900,000 tpy of CO2 from Stockholm Exergi. The expansion of Northern Lights builds on existing infrastructure and includes additional onshore storage tanks, a new jetty, and additional injection wells. Development of the second

Read More »


Alibaba Cloud tweaks software for networking efficiency gains

Alibaba Cloud said that it has been using ZooRoute in AliCloud for the last 18 months, where it has reduced outage time by 92.71%.

Nezha for network performance in high-demand VMs

Another software upgrade is helping Alibaba Cloud maintain network performance for high-demand virtual machines (VMs) without spending more on SmartNIC-accelerated virtual switches (vSwitches). Nezha, a distributed vSwitch load-sharing system, identifies idle SmartNICs and uses them to create a remote resource pool for high-demand virtual NICs (vNICs). Alibaba has tested the system in its data centers for a year and said in the paper that “Nezha effectively resolves vSwitch overloads and removes it as a bottleneck.” With the number of concurrent flows improved by up to 50x, and the number of vNICs by up to 40x, the bottleneck is now the VM kernel stack, the researchers wrote. Forrester’s Dai said that Nezha’s stateless offloading and cluster-wide pooling design is superior to solutions being pursued by rival cloud service providers. Separately, Alibaba’s cloud computing division has also been working on another software update that will enable it to provide better network performance for AI workloads.
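The excerpt only sketches how Nezha's load sharing works, but the core idea, lending a vSwitch's spare SmartNIC capacity to overloaded vNICs elsewhere in the cluster, can be illustrated with a toy scheduler. The Python below is a hypothetical sketch, not Alibaba's implementation; the class names, capacities, and the greedy spill-over policy are all invented for illustration.

```python
# Hypothetical sketch of the load-sharing idea attributed to Nezha:
# vNICs whose local SmartNIC vSwitch is overloaded are assigned spare
# capacity from a cluster-wide pool of idle SmartNICs.
# Illustration only; not Alibaba's implementation.

from dataclasses import dataclass, field

@dataclass
class SmartNIC:
    name: str
    capacity_pps: float          # packets/sec the vSwitch can handle
    load_pps: float = 0.0

    @property
    def headroom(self) -> float:
        return self.capacity_pps - self.load_pps

@dataclass
class VNic:
    name: str
    demand_pps: float
    placements: list = field(default_factory=list)  # (SmartNIC name, share) pairs

def share_load(vnic: VNic, local: SmartNIC, pool: list[SmartNIC]) -> None:
    """Keep what the local SmartNIC can absorb, spill the rest to idle NICs."""
    local_share = min(vnic.demand_pps, local.headroom)
    local.load_pps += local_share
    vnic.placements.append((local.name, local_share))

    remaining = vnic.demand_pps - local_share
    for nic in sorted(pool, key=lambda n: n.headroom, reverse=True):
        if remaining <= 0:
            break
        take = min(remaining, nic.headroom)
        if take > 0:
            nic.load_pps += take
            vnic.placements.append((nic.name, take))
            remaining -= take

local = SmartNIC("nic-local", capacity_pps=10e6, load_pps=9e6)   # nearly saturated
pool = [SmartNIC("nic-idle-1", 10e6, 1e6), SmartNIC("nic-idle-2", 10e6, 2e6)]
hot_vnic = VNic("vm-42-eth0", demand_pps=12e6)

share_load(hot_vnic, local, pool)
print(hot_vnic.placements)
# [('nic-local', 1000000.0), ('nic-idle-1', 9000000.0), ('nic-idle-2', 2000000.0)]
```

Running it shows the hot vNIC keeping what its local SmartNIC can absorb and spilling the remainder across the two idle NICs in the pool, which is the cluster-wide pooling behavior the researchers describe at a very high level.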

Read More »

AI networking success requires deep, real-time observability

Most research participants also told us they need to improve visibility into their data center network fabrics and WAN edge connectivity services. (See also: 10 network observability certifications to boost IT operations skills)

The need for real-time data

Observability of AI networks will require many enterprises to optimize how their tools collect network data. For instance, most observability tools rely on SNMP polling to pull metrics from network infrastructure, and these tools typically poll devices at five-minute intervals. Shorter polling intervals can adversely impact network performance and tool performance. Sixty-nine percent of survey participants told EMA that AI networks require real-time infrastructure monitoring that SNMP simply cannot support. Real-time telemetry closes visibility gaps. For instance, AI traffic bursts that create congestion and packet drops may last only seconds, an issue that a five-minute polling interval would miss entirely. To achieve this level of metric granularity, network teams will have to adopt streaming network telemetry. Unfortunately, support for such technology is still uneven among network infrastructure and network observability vendors due to a lack of industry standardization and a perception among vendors that customers simply don’t need it. Well, AI is about to create a lot of demand for it. In parallel to the need for granular infrastructure metrics, 51% of respondents told EMA that they need more real-time network flow monitoring. In general, network flow technologies such as NetFlow and IPFIX can deliver data nearly in real time, with delays of seconds or a couple of minutes depending on the implementation. However, other technologies are less timely. In particular, the VPC flow logs generated by cloud providers do not offer the same data granularity. Network teams may need to turn to real-time packet monitoring to close cloud visibility gaps.

Smarter analysis for smarter networks

Network teams also need their network
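To make the polling-gap point from the excerpt concrete, here is a small, self-contained Python simulation (not from the article; the traffic figures and threshold are invented) showing how a 12-second congestion burst is obvious in one-second streaming samples yet disappears entirely when the same link is only polled every five minutes.

```python
# Toy illustration: a seconds-long traffic burst is invisible to 5-minute polling
# but obvious in 1-second streaming telemetry. All numbers are invented.

BASELINE_BPS = 10e9                  # steady-state traffic on the link
BURST_BPS = 95e9                     # short AI traffic burst
BURST_START, BURST_END = 120, 132    # burst lasts 12 seconds
CONGESTION_THRESHOLD = 80e9

def traffic_at(t: int) -> float:
    """Simulated link utilization (bits/sec) at second t."""
    return BURST_BPS if BURST_START <= t < BURST_END else BASELINE_BPS

# 1-second streaming telemetry: every sample is kept.
streaming = [traffic_at(t) for t in range(600)]
streaming_alerts = [t for t, bps in enumerate(streaming) if bps > CONGESTION_THRESHOLD]

# 5-minute SNMP-style polling: counters are read every 300 s, so the poller
# only sees the average rate over each interval, never the peak inside it.
poll_interval = 300
polled = [
    sum(traffic_at(t) for t in range(start, start + poll_interval)) / poll_interval
    for start in range(0, 600, poll_interval)
]
polled_alerts = [i for i, bps in enumerate(polled) if bps > CONGESTION_THRESHOLD]

print(f"streaming telemetry flags {len(streaming_alerts)} congested seconds")  # 12
print(f"5-minute polling flags {len(polled_alerts)} congested intervals")      # 0
```

The averaging step is the whole story: 12 seconds at 95 Gbps averaged into a 300-second window looks like roughly 13 Gbps, comfortably below any sensible congestion threshold.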

Read More »

Equinix Bets on Nuclear and Fuel Cells to Meet Exploding Data Center Energy Demand

A New Chapter in Data Center Energy Strategy

Equinix’s strategic investments in advanced nuclear and fuel cell technologies mark a pivotal moment in the evolution of data center energy infrastructure. By proactively securing power sources like Oklo’s fast reactors and Radiant’s microreactors, Equinix is not merely adapting to the industry’s growing energy demands but is actively shaping the future of sustainable, resilient power solutions. This forward-thinking approach is mirrored across the tech sector. Google, for instance, has partnered with Kairos Power to develop small modular reactors (SMRs) in Tennessee, aiming to supply power to its data centers by 2030. Similarly, Amazon has committed to deploying 5 gigawatts of nuclear energy through partnerships with Dominion Energy and X-energy, underscoring the industry’s collective shift towards nuclear energy as a viable solution to meet escalating power needs. The urgency of these initiatives is underscored by projections from the U.S. Department of Energy, which anticipates data center electricity demand could rise to 6.7%–12% of total U.S. production by 2028, up from 4.4% in 2023. This surge, primarily driven by AI technologies, is straining existing grid infrastructure and prompting both public and private sectors to explore innovative solutions. Equinix’s approach, i.e. investing in both immediate and long-term energy solutions, sets a precedent for the industry. By integrating fuel cells for near-term needs and committing to advanced nuclear projects for future scalability, Equinix exemplifies a balanced strategy that addresses current challenges while preparing for future demands. As the industry moves forward, the collaboration between data center operators, energy providers, and policymakers will be crucial. The path to a sustainable, resilient energy future for data centers lies in continued innovation, strategic partnerships, and a shared commitment to meeting the digital economy’s power needs responsibly.

Read More »

Evolving to Meet AI-Era Data Center Power Demands: A Conversation with Rehlko CEO Brian Melka

On the latest episode of the Data Center Frontier Show Podcast, we sat down with Brian Melka, CEO of Rehlko, to explore how the century-old mission-critical power provider is reinventing itself to support the new realities of AI-driven data center growth. Rehlko, formerly known as Kohler Energy, rebranded a year ago but continues to draw on more than a century of experience in power generation and backup systems. Melka emphasized that while the name has changed, the mission has not: delivering reliable, scalable, and flexible energy solutions to support always-on digital infrastructure.

Meeting Surging AI Power Demands

Asked how Rehlko is evolving to support the next wave of data center development, Melka pointed to two major dynamics shaping the market: unprecedented capacity needs driven by AI training and inference, and new, “spiky” usage patterns that strain traditional backup systems. “Power generation is something we’ve been doing longer than anyone else, starting in 1920,” Melka noted. “As we look forward, it’s not just about the scale of backup power required — it’s about responsiveness. AI has very large short-duration power demands that put real strain on traditional systems.” To address this, Rehlko is scaling its production capacity fourfold over the next three to four years, while also leveraging its global in-house EPC (engineering, procurement, construction) capabilities to design and deliver hybrid systems. These combine diesel or gas generation with battery storage and short-duration modulation, creating a more responsive power backbone for AI data centers. “We’re the only ones out there that can deliver that breadth of capability on a full turnkey basis,” Melka said. “It positions us to support customers as they navigate these new patterns of energy demand.”

Speed to Power Becomes a Priority

In today’s market, “speed to power” has become the defining theme. Developers and operators are increasingly considering

Read More »

Data Center Chip Giants Negotiate Political Moves, Tariffs, and Corporate Strategies

And with the current restrictions being placed on US manufacturers selling AI parts to China, reporting says NVIDIA is developing a Blackwell-based China chip, more capable than the current H20 but still structured to comply with U.S. export rules. Reuters reported that it would be a single-die design (roughly half the compute of the dual-die B300), with HBM and NVLink, sampling as soon as next month. A second compliant workstation/inference product (RTX6000D) is also in development. Chinese agencies have reportedly discouraged use of NVIDIA H20 in government work, favoring Huawei Ascend. However, there have been reports describing AI training on the Ascend as “challenging”, forcing some AI firms to revert to NVIDIA for large-scale training while using Ascend for inference. This keeps China demand alive for compliant NVIDIA/AMD parts—hence the U.S. interest in revenue-sharing. Meanwhile, AMD made its announcements at June’s “Advancing AI 2025” to set MI350 (CDNA 4) expectations and a yearly rollout rhythm that’s designed to erase NVIDIA’s time lead as much as fight on absolute perf/Watt. If the MI350 systems ramp aligns with major cloud designs in 2026, AMD’s near-term objective is defending MI300X momentum while converting large customers to multi-vendor strategies (often pairing MI clusters with NVIDIA estates for redundancy and price leverage). The 15% China license fee will shape how AMD prices MI-series export SKUs and whether Chinese hyperscalers still prefer them to the domestic alternative (Huawei Ascend), which continues to face software/toolchain challenges. If Chinese buyers balk or Beijing discourages purchases, the revenue-share may be moot; if they don’t, AMD has a path to keep seats warm in China while building MI350 demand elsewhere. Beyond China export licenses, the U.S. and EU recently averted a larger trade war by settling near 15% on certain sectors, which included semiconductors, as opposed to the far more

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
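As a rough sketch of the "LLM as a judge" pattern mentioned above, the snippet below has several judge models score one candidate answer and averages the verdicts. It is only an illustration: call_model() is a made-up stub, and in practice it would wrap whichever model API and prompt format an enterprise actually uses.

```python
# Hypothetical sketch of the "LLM as a judge" pattern: several models score a
# candidate answer and the scores are aggregated. call_model() is a stub; in
# practice it would wrap a real model API.

from statistics import mean

JUDGE_PROMPT = (
    "Rate the following answer from 1 (unusable) to 5 (excellent) for factual "
    "accuracy and completeness. Reply with a single integer.\n\n"
    "Question: {question}\nAnswer: {answer}"
)

def call_model(model_name: str, prompt: str) -> str:
    """Stub for a real model call; returns a canned score for illustration."""
    canned = {"judge-a": "4", "judge-b": "5", "judge-c": "4"}
    return canned[model_name]

def judge_answer(question: str, answer: str, judges: list[str]) -> float:
    prompt = JUDGE_PROMPT.format(question=question, answer=answer)
    scores = [int(call_model(j, prompt)) for j in judges]
    return mean(scores)

score = judge_answer(
    question="What does IPFIX export?",
    answer="IPFIX exports flow records describing network traffic.",
    judges=["judge-a", "judge-b", "judge-c"],
)
print(f"aggregate judge score: {score:.1f}")  # 4.3
```

Using three or more cheaper models as judges, as the excerpt suggests, trades a little extra inference cost for a less noisy quality signal than any single judge would give.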

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »