Stay Ahead, Stay ONMINE

Amazon and SAP partner on European data sovereignty platforms to offer better oversight


Other vendors also see the appeal. Adobe, Snowflake, Cisco, and MongoDB have already signed up to support the AWS Sovereign Cloud on a global basis.

Two clouds better than one

But what does deploying an SAP sovereign cloud inside a separate AWS sovereign cloud mean in practice? The official answer is that customers using SAP who also have a heavy investment in AWS applications can get the best of both worlds.

“By deploying the SAP Sovereign Cloud portfolio on the AWS European Sovereign Cloud, customers gain access to our comprehensive suite of sovereign cloud solutions, further strengthened by our trusted, long-standing partnership with Amazon Web Services,” said Thomas Saueressig, Member of the Executive Board of SAP SE, Customer Services and Delivery.

Another way of putting it: neither platform offers complete controls across the underlying cloud and application stack; combining them fills in these gaps.

From the AWS European Sovereign Cloud side, customers will be guaranteed that the compute, storage, networking, identity and access management, security, and operations staff will comply with European data residency regulations.

It will be independent of AWS’s broader infrastructure, with no critical dependencies on non-EU infrastructure, SAP’s announcement said. AWS European Sovereign Cloud will have EU-based “strong technical controls, sovereign assurances, and legal protections.”


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Riverbed survey reveals AI readiness gap

“Looking ahead, however, there is a broad consensus around future readiness. By 2028, 86% of respondents expect their organizations will be prepared to support AI at scale, with alignment between both businesses and technical stakeholders,” the report reads. Another hurdle to AI success is data. The Riverbed study asked respondents

Read More »

Network resiliency climbs in importance for businesses

Managed services now the norm

Most organizations no longer manage networks on their own: 72% supplement in-house teams with third-party providers, according to C1’s report. It finds businesses are shifting to managed services, which is consistent with other industry research. A report from KPMG found that 73% of organizations have

Read More »

Australian provider outage leaves emergency callers unable to connect

Further recommendations included the requirement that network operators establish a Triple Zero custodian, with responsibility for the efficient functioning of the Triple Zero ecosystem, including monitoring its end-to-end performance. In addition, providers had to conduct six-monthly end-to-end testing of all aspects of the ecosystem within and across networks, which must

Read More »

Oil Surges Past Key Level

Oil rose by the most since July, topping $64 a barrel, after US President Donald Trump’s increasingly hawkish rhetoric on Russia sent futures above a key technical level. Trump on Wednesday said NATO nations should shoot down Russian aircraft that violated their airspace, raising the specter of supply disruption after remarking on Tuesday that Europe should halt energy purchases from Moscow. West Texas Intermediate crossed its 100-day moving average of about $64.68. Commodity trading advisers, which can exaggerate price swings, have been liquidating short positions, which sat at 36% on Monday, compared with 55% on Sept. 23, according to data from Bridgeton Research Group.

Meanwhile, Russia mulled restrictions on diesel exports for some companies following a spate of attacks by Ukrainian drones on its energy infrastructure, including pipeline facilities, that have put the nation’s supply in focus. Gasoil, a category that includes diesel, settled 2.4% higher in Europe on Tuesday, its biggest gain in three weeks, while traders piled into bets on a surging diesel market.

“Crude continues to march higher today as geopolitical headlines escalate across several regions, while Ukraine’s strikes on energy infrastructure are impacting product flows and could eventually spill over into crude exports if unresolved,” said Rebecca Babin, a senior energy trader at CIBC Private Wealth Group.

Oil is little changed this month as traders weigh a bearish fundamental outlook against long-running geopolitical tensions. On the supply front, Iraq is finalizing a deal to restart crude exports from its Kurdistan region following a two-year halt. That could bring about 230,000 barrels a day back to the international market, exacerbating a looming glut. Some market metrics are strengthening, with WTI’s prompt spread — the difference between its two closest contracts — at 51 cents a barrel in backwardation, almost doubling from a week ago. Meanwhile, the difference
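The prompt spread mentioned above is simple arithmetic: the front-month price minus the second-month price, with a positive value meaning backwardation. A minimal sketch, using hypothetical prices chosen to illustrate the roughly 51-cent spread (these are not actual WTI quotes):

```python
def prompt_spread(front_month: float, second_month: float) -> float:
    """Spread between the two closest futures contracts, in $/bbl.

    Positive => backwardation (front contract trades above the next one),
    a classic sign of near-term tightness; negative => contango.
    """
    return front_month - second_month


# Hypothetical illustration: a front month at $64.68 and a second
# month at $64.17 gives the ~51-cent backwardation described above.
spread = prompt_spread(64.68, 64.17)
print(f"{spread:.2f}")  # prints 0.51
```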

Read More »

India Says It Wants to Buy More USA Energy

India wants to increase its purchases of oil and gas from the US, Commerce Minister Piyush Goyal said, amid efforts to persuade Washington to lower tariffs on the country’s exports and secure a trade deal. “We expect to increase our trade with the US on energy products in the years to come,” Goyal said at an event in New York. “And being close friends, natural partners, our energy security goals will have a very high element of US involvement.”

The minister is visiting the US to meet his counterparts after President Donald Trump imposed 50% tariffs on Indian exports last month, partly to penalize New Delhi for continuing to buy Russian oil. The move upended decades of US diplomacy with India and came amid favorable signs after the resumption of trade talks. The Trump administration has maintained that India is helping to fund Vladimir Putin’s war in Ukraine. India has been the largest buyer of Russian seaborne crude, as the discounted barrels have helped the world’s third-largest oil consumer keep its import bill in check. Stepping up oil and gas imports from the US would help reduce the trade imbalance between the two countries and bolster the trade negotiations.

Relations between New Delhi and Washington seemed to be back on track after Trump called Prime Minister Narendra Modi on his birthday. But they ran into rough weather again after Washington slapped a $100,000 fee on new H-1B visas, used mainly by Indian tech workers. Goyal said the US had a crucial role to play in diversifying India’s energy security goals and ensuring stability for the import-dependent nation.

Read More »

PJM CEO Asthana responds to criticism, says states must address supply challenges

In the face of mounting criticism from governors, Manu Asthana, the president and CEO of the PJM Interconnection, said states need infrastructure siting and permit reform to make sure the PJM Interconnection has enough power to meet their needs.

PJM has adopted interconnection and capacity market reforms to help address resource adequacy concerns, but key issues remain, Asthana said Monday during the Summit on the State of PJM Interconnection in Philadelphia. Those issues include developing rules for adding data center loads to the grid, providing avenues for demand flexibility, improving load forecasting and further interconnection reform, Asthana said. State leaders used the summit to demand a greater role in PJM’s governance, accusing the grid operator of failing to ensure that power supplies are being added to its system quickly enough.

PJM is seeing a sudden change in its outlook for load growth, with new load expected mainly from data center development. Although the bulk of new demand has yet to materialize, the system is already struggling with tightening supply-demand conditions, which caused a surge in capacity prices a year ago. That led to double-digit electricity bill increases in some states in PJM’s footprint, which spans 13 Mid-Atlantic and Midwestern states and the District of Columbia. PJM set a new record for capacity auction prices again this year, prompting outcry from members, consumers and politicians.

Asthana, however, said states need to work on issues under their control, such as infrastructure siting and permitting reform, that can affect resource adequacy. “We can do whatever we want in the [wholesale power] markets, but if our [power] plants run into a hostile siting and permitting regime, they will not get built,” Asthana said. States also control cost allocation at the retail level and engage in risk management through default services auctions that protect ratepayers from potentially volatile spot energy prices, Asthana said.

Read More »

GHG Protocol prioritizes looking good over doing good

Lee Taylor is cofounder and CEO of REsurety.

Last month, the GHG Protocol dropped a bombshell on clean energy markets. Through a blog post and meeting minutes, we learned that the GHGP Independent Standards Board, or ISB, had voted to disregard the recommendation of its experts, walk back expectations previously set publicly, and prioritize optics over impact. This explosive development relates to the GHGP’s revision process for Scope 2 accounting that governs how organizations calculate the emissions associated with electricity use and clean energy purchases. And before you dismiss this concern as the realm of wonkish energy nerds, know this: In the U.S. alone, carbon accounting has supported the development of over $150 billion of clean energy infrastructure. Changes to the GHGP Scope 2 Protocol will absolutely shape how much clean energy infrastructure we build in the decade ahead — for better or for worse.

GHGP thumbs its nose at emissions experts and the public

There were two major decisions ahead of the ISB for a vote: whether to advance hourly matching as a mandate for inventory accounting, and whether to advance an impact accounting metric in parallel. Prior to the ISB vote, nearly 40% of the expert Technical Working Group, or TWG, that is tasked with developing the technical standards voted against hourly matching as a mandate, while 74% of TWG members voted to advance impact accounting in parallel as a critical, complementary methodology. Despite the widespread TWG support, the ISB voted not to advance impact accounting to the public comment period in parallel with hourly matching, while all but a single dissenting ISB member voted to advance the plan to mandate hourly matching.

Some have argued that the ISB didn’t kill impact accounting, but merely sent it on a different path. This is nonsense. The public notes show the ISB prefers not to allow

Read More »

Macquarie Expands LNG Business

Macquarie Group Ltd. aims to boost its liquefied natural gas business by ramping up physical trading, the latest effort by the bank to capture a larger share of the fast-growing market. The company is pursuing long-term LNG supply deals of as long as 10 to 15 years, according to people with direct knowledge of the matter. The strategy also involves taking positions in shipping and assets, they said, asking not to be identified as the plans have not been made public.

The push underscores a broader shift as financial institutions move to establish themselves in the LNG market, which is dominated by oil majors, commodity traders and utilities. The industry forecasts that global demand for the super-chilled fuel is poised to climb 60% through 2040. A spokesperson for Macquarie declined to comment.

Macquarie’s physical LNG plans are being developed via a non-banking unit, the people added. The company is looking to expand its team in the coming months, with more hires in locations including London, they said. The company recently hired former Equinor ASA executive Samuele Ravelli to lead its global LNG trading business. Macquarie last month signed a sales and purchase agreement for 600,000 tons a year of LNG over 15 years with AMIGO LNG, a Mexican joint venture that aims to start exports in 2028. The bank is also in the process of converting its preliminary offtake pact with the planned Texas LNG project into a definitive agreement.

Several banks are considering getting back into physical trading to take advantage of an expected flurry of activity. Regulatory pressure forced financial institutions to pull back from the sector more than a decade ago. JPMorgan Chase & Co. is in talks to start trading physical LNG again, Bloomberg reported last year. Macquarie has been one of the few banks that has remained active in the LNG market over the last decade,

Read More »

USEDC Opening Houston Office

U.S. Energy Development Corporation (USEDC) announced the opening of a Houston office, and the appointment of Howard House as Senior Vice President and Chief Strategic Officer, in a release sent to Rigzone late Tuesday. The company noted in the release that, “in his new role, House will focus on strategic initiatives and opportunities to ensure the company remains competitive while adapting to the evolving energy landscape”. “In addition to these responsibilities, he will coordinate activities associated with opening and managing the new Houston office. He also will join USEDC’s investment committee,” the release added.

USEDC highlighted in its release that House has over 35 years of experience as an energy investment banker and “brings over $125 billion in transactional experience in the energy space, including approximately $60 billion of completed strategic advisory mandates including mergers, acquisitions, divestitures, takeover defense and restructurings”. House has also raised over $65 billion in public and private capital markets transactions, the release added, noting that “his industry reputation and extensive oil and gas network will be key to facilitating the firm’s growth”. “House will focus on strengthening industry relationships across the energy value chain as well as sourcing and evaluating strategic opportunities,” the release stated. USEDC’s release pointed out that House’s career includes senior roles at Greenhill & Co, Raymond James & Associates, and Lehman Brothers. It also highlighted that he began his career in the energy industry as a petroleum engineer at Exxon Company USA.

In its release, USEDC said the decision to open a Houston office “reflects USEDC’s commitment to continued growth in Texas and beyond”. “Houston is a global hub of the energy industry, and establishing an office here will allow the company to leverage existing and new relationships in order to enhance deal flow,” it added. In the release, Jordan Jayson,

Read More »

NVIDIA and OpenAI Forge $100B Alliance to Power the Next AI Revolution

The new strategic partnership between OpenAI and NVIDIA, formalized via a letter of intent in September 2025, is designed to both power and finance the next generation of OpenAI’s compute infrastructure, with initial deployments expected in the second half of 2026. According to the joint press release, both parties position this as “the biggest AI infrastructure deployment in history,” explicitly aimed at training and running OpenAI’s next-generation models.

At a high level:

The target scale is 10 gigawatts (GW) or more of deployed compute capacity, realized via NVIDIA systems (comprising millions of GPUs).

The first phase (1 GW) is slated for the second half of 2026, built on the forthcoming Vera Rubin platform.

NVIDIA will progressively invest up to $100 billion in OpenAI, contingent on deployment of capacity in stages.

An initial $10 billion investment from NVIDIA is tied to the execution of a definitive purchase agreement for the first gigawatt of systems.

The equity stake NVIDIA will acquire is described as non-voting / non-controlling, meaning it gives NVIDIA financial skin in the game without governance control.

From a strategic standpoint, tying investment to capacity deployment helps OpenAI lock in capital and hardware over a long horizon, mitigating supply-chain and financing risk. With compute frequently cited as a binding constraint on advancing models, this kind of staged, anchored commitment gives OpenAI a more predictable growth path (at least in theory; the precise economic terms and risk-sharing remain to be fully disclosed). Press statements emphasize that millions of GPUs will ultimately be involved, and that co-optimization of NVIDIA’s hardware with OpenAI’s software stack will be a key feature of the collaboration.

Importantly, this deal also fits into OpenAI’s broader strategy of diversifying infrastructure partnerships beyond any single cloud provider.
Microsoft remains a central backer and collaborator, but this NVIDIA tie-up further
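The staged structure can be sketched as a simple schedule. Only two points are publicly confirmed — the initial $10 billion tranche at the first gigawatt and the up-to-$100 billion cap at 10 GW — so the linear pro-rata scaling between them below is purely an illustrative assumption, not a disclosed term:

```python
def cumulative_investment_bn(deployed_gw: float,
                             per_gw_bn: float = 10.0,
                             cap_bn: float = 100.0) -> float:
    """Cumulative NVIDIA investment (in $bn) as OpenAI capacity comes online.

    Assumes each gigawatt unlocks a pro-rata tranche, capped at $100bn.
    Linear scaling between the confirmed endpoints is an assumption.
    """
    if deployed_gw < 0:
        raise ValueError("deployed capacity cannot be negative")
    return min(deployed_gw * per_gw_bn, cap_bn)


# Schedule under the linear assumption; capacity beyond 10 GW
# unlocks nothing further because of the cap.
for gw in (1, 5, 10, 12):
    print(gw, cumulative_investment_bn(gw))
```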

Read More »

Balancing AI’s opportunities and challenges to serve enterprises

AI has taken the technology industry by storm, with enterprises deploying emerging applications to create business value. Amid this shift, operators are leveraging network automation, optical innovation and more to support enterprise AI use cases. Still, the technology ecosystem must balance AI’s opportunities with its challenges. While AI can improve operations, it can also leave companies more vulnerable to cyberattacks. As organizations deploy more AI tools and employees increasingly use them, the overall attack surface expands and opens more security gaps. This article explores how internet carriers are building their networks to support enterprises, while also discussing how operators are establishing trust with customers.

Table stakes: reliability, diversity and reach

AI’s requirements are similar to content distribution, cloud networking and previous industry shifts, but place even greater pressure on carrier-delivered enterprise network services. In these services, network diversity is integral, allowing carriers to eliminate single points of failure in the event of an outage, then quickly reroute traffic through the next best available path. This improved reliability is vital for enabling real-time enterprise AI operations amid increased instances of network disruption due to geopolitical sabotage or accidental damage.

As more hyperscalers build sprawling AI data center campuses, network reach will also prove even more crucial. By continuously expanding their network footprints, carriers can help enterprises access these sites no matter where they’re located, with operators’ high-capacity connectivity infrastructure facilitating the transfer of massive data volumes between these campuses. Similar to how content distribution networks rely on a robust network underlay, backbone connectivity provides the high-capacity, long-haul transport underpinning the delivery of AI inferencing responses. While the backbone itself does not cache or deliver these responses, its densely interconnected networks ensure that this AI traffic reaches regional and access networks, which then distribute responses to end users.

Lightspeed: optical innovation

With

Read More »

Microsoft’s new cooling tech targets AI’s thermal bottleneck as hyperscalers hit power ceilings

Rising thermal pressure on AI hardware

AI workloads and high-performance computing have placed unprecedented strain on data center infrastructure. Thermal dissipation has emerged as one of the toughest bottlenecks, with traditional methods such as airflow and cold plates increasingly unable to keep pace with new generations of silicon.

“Modern accelerators are throwing out thermal loads that air systems simply cannot contain, and even advanced water loops are straining. The immediate issues are not only the soaring TDP of GPUs, but also grid delays, water scarcity, and the inability of legacy air-cooled halls to absorb racks running at 80 or 100 kilowatts,” said Sanchit Vir Gogia, CEO and chief analyst at Greyhound Research. “Cold plates and immersion tanks have extended the runway, but only marginally. They still suffer from the resistance of thermal interfaces that smother heat at the die. The friction lies in the last metre of the thermal path, between junction and package, and that is where performance is being squandered.”

Cooling costs: the next data center budget crisis

Cooling isn’t just a technical challenge but also an economic one. Data centers spend heavily to manage the immense heat generated by servers, networking gear, and GPUs, making cooling a significant expense.

“As per 2025 AI infra buildouts TCO analysis, over 45%-47% of data center power budget typically goes into cooling, which could further expand to 65%-70% without advancement in cooling method efficiency,” said Danish Faruqui, CEO at Fab Economics. “In 2024, Nvidia Hopper H100 had 700 watts of power requirements per GPU, which scaled in 2025 to double with Blackwell B200 and Blackwell Ultra B300 to 1000 W and 1400 watts per GPU. Going forward in 2026, it will again more than double by Rubin and Rubin Ultra GPU to 1800W
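The per-GPU wattages quoted above translate directly into the rack-level loads that air cooling struggles with. A back-of-envelope sketch, assuming a hypothetical 72-GPU rack (an illustrative size, not a vendor specification) and counting GPU power only, ignoring CPUs, networking, and fans:

```python
def rack_power_kw(gpus_per_rack: int, gpu_watts: float) -> float:
    """GPU-only rack power in kilowatts (excludes CPUs, networking, fans)."""
    return gpus_per_rack * gpu_watts / 1000.0


# Per-GPU TDP figures quoted above, applied to a hypothetical 72-GPU rack.
# Note how the 1400 W case already exceeds the 100 kW figure cited as the
# limit of legacy air-cooled halls.
for name, watts in [("H100", 700), ("B200", 1000), ("B300", 1400), ("Rubin", 1800)]:
    print(f"{name}: {rack_power_kw(72, watts):.1f} kW")
```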

Read More »

Nvidia and OpenAI open $100B, 10 GW data center alliance

An Nvidia spokesperson said that this deal is separate from Project Stargate, the $500 billion data center project announced earlier this year featuring OpenAI, Oracle, and SoftBank. It launched with much hoopla but has since struggled to gain any traction. OpenAI is already an exclusive AI partner for Microsoft, offering ChatGPT through the Bing search engine and Microsoft Office 365. Microsoft promised in January to invest $80 billion in AI data centers. However, that deal seems to be unraveling. OpenAI has partnered with Oracle to offer its services through Oracle Cloud Infrastructure, while Microsoft has added Anthropic’s Claude generative AI service alongside ChatGPT. OpenAI’s next-generation data centers will use Nvidia’s Vera Rubin platform, which went into production in August and is expected to begin shipping late next year. They are expected to be capable of performing FP4 inference at 3.6 exaflops and FP8 training at 1.2 exaflops.

Read More »

Community Watch: Data Center Pushback – Q3 2025

As the pace of data center construction accelerates, so too does the wave of local resistance. While multi-billion-dollar investment announcements often draw national or even global attention, the disputes that arise around individual projects typically play out at the local or regional level — and receive far less visibility. With this recurring feature, Data Center Frontier will highlight community opposition efforts that are shaping, delaying, or in some cases halting, data center development.

Tarboro, North Carolina: Energy Storage Solutions Project

At first glance, the proposal seemed like a win for Tarboro: a $6.2 billion hyperscale data center on a 50-acre site already zoned for heavy industrial use. But after more than five hours of deliberation, the town council voted 6–1 against granting a special use permit for the project. North Carolina’s unusual quasi-judicial process limited how the council could reach its decision. Because the permit required a courtroom-style proceeding, members were allowed to weigh only factual evidence and expert testimony, not personal opinions or community objections. Developer Danieal Schaffer has since stated he will take the next step of appealing the decision to the Edgecombe Superior Court.

Menomonie, Wisconsin: Mystery Data Center Raises Alarm

When the town of Menomonie annexed more than 300 acres of farmland, residents quickly grew uneasy about the project’s true purpose. Official information was limited to a vague reference to a “potential data center,” accompanied by a FAQ article on the town’s website. According to Fox Business News, city officials were told only that the project involved a U.S. company and one of the five major tech firms. In a community of just over 16,000 people, opposition has gained significant traction. A Facebook group called “Save Our City. Stop the Menomonie Data Center” now counts more than 8,000 members. With no clear tenant identified and only

Read More »

Who wins/loses with the Intel-Nvidia union?

In announcing the deal, Jensen Huang emphasized the client aspect, saying future Intel chips would have Nvidia GPUs baked into them instead of Intel’s own GPU technology. But there will be impact for the server business as well. There are two things the analysts all agree on.

First, AMD is the big loser in this deal. It had the advantage of a CPU and GPU combination that Intel and Nvidia didn’t have individually. That was apparent in supercomputers like Frontier and El Capitan, which are all-AMD designs of CPUs and GPUs working in tandem. Now the two companies are joined at the hip and will have a competitive offering in due time.

The second area of agreement is that the future of Jaguar Shores, Intel’s AI accelerator based on its GPU technology, and the Gaudi AI accelerator is uncertain. “Nvidia already has solutions here and it doesn’t make sense for Intel to work on a redundant product that needs to be marketed over an established one,” said Nguyen.

A significant event coming from this deal is that Intel is adopting Nvidia’s proprietary NVLink high-speed interconnect protocols. “This means that Intel has essentially determined its ability to compete head-to-head with Nvidia in the current large scale AI marketplace, despite its best efforts, have mostly failed,” wrote Jack Gold of J. Gold Associates in a research note.

Gold notes that Nvidia already uses a few Xeon data center chips to power its largest systems, and the x86 chips provide most of the controls and pre-processing that its large-scale GPU racks require. By accelerating the performance of the Xeon, the GPU benefits as well. That leaves a question mark hanging over Nvidia’s Arm CPUs, which are likely to continue serving “niche areas,” Gold wrote. “But with this announcement, it now

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote, between them, $200 billion to capex in 2025, up from $110 billion in 2023.

Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads.

Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular at the big tech trade show in Las Vegas as a non-tech company showing off technology, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech.

The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved.

“Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail.

Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
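The LLM-as-judge pattern with three or more models boils down to collecting a verdict from each judge and taking the majority. A minimal sketch: the judge callables below are stand-in stubs for illustration (real deployments would wrap actual model API calls), but the voting logic is the same:

```python
from collections import Counter
from typing import Callable, Sequence


def judge_by_vote(output: str, judges: Sequence[Callable[[str], str]]) -> str:
    """Return the majority verdict from several judge models.

    Each judge is any callable mapping a model output to a verdict label
    ("pass"/"fail"). In practice each would be a cheap LLM call.
    """
    votes = Counter(judge(output) for judge in judges)
    verdict, _ = votes.most_common(1)[0]
    return verdict


# Stub judges standing in for three different (cheap) models:
judges = [
    lambda out: "pass" if len(out) > 0 else "fail",
    lambda out: "pass" if "error" not in out.lower() else "fail",
    lambda out: "fail",  # a deliberately strict judge
]
print(judge_by_vote("Looks good to me.", judges))  # prints "pass"
```

Using an odd number of judges avoids ties; adding more (cheaper) models trades a little latency for a more robust verdict.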

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more.

The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization.

OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »