Stay Ahead, Stay ONMINE

Does It Matter If Microsoft Is Cancelling AI Data Center Leases?


Strategic Reallocation: Microsoft is a major owner and operator of data centers and might be reallocating resources to in-house infrastructure rather than leased spaces.

Supply Chain Delays: TD Cowen noted that Microsoft used power and facility delays as justifications for voiding agreements, a tactic previously employed by Meta.

Oversupply Issues: Analysts at TD Cowen speculate that Microsoft may have overestimated AI demand, leading to excess capacity. As this is all speculation, it could simply be that the latest information has driven Microsoft to reevaluate demand and more closely align projected supply with projected demand. Microsoft has reiterated its commitment to spend $80 billion on AI in the coming year; reallocating that spending internally, or with a different set of partners, remains on the table.

And when you put the TD Cowen report that Microsoft has cancelled leases for “a couple hundred megawatts” into context with Microsoft’s overall leased power, which is estimated at around 20 GW, you see that more than 98% of its energy commitment remains unchanged.
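The scale is easy to sanity-check with back-of-envelope arithmetic; a minimal sketch, assuming the round figures quoted above (both are analyst estimates, not confirmed numbers):

```python
# Share of Microsoft's estimated leased power affected by the reported
# cancellations. Both inputs are analyst estimates: "a couple hundred
# megawatts" cancelled (TD Cowen) vs. ~20 GW leased overall.
cancelled_mw = 200
total_leased_gw = 20

cancelled_gw = cancelled_mw / 1000
remaining_pct = 100 * (1 - cancelled_gw / total_leased_gw)
print(f"Cancelled: {cancelled_gw:.1f} GW of {total_leased_gw} GW")
print(f"Unchanged: {remaining_pct:.1f}% of the estimated commitment")
```

Even doubling the cancelled figure to 400 MW would leave roughly 98% of the estimated commitment intact, which is why the report reads more like a trim than a retreat.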

Investment Markets Might See the Biggest Hits

Microsoft’s retreat has had ripple effects on the stock market, particularly among energy and infrastructure companies. European firms like Schneider Electric and Siemens Energy experienced a decline in stock value, indicating fears that major AI companies might scale back energy-intensive data center investments.

However, at press time we have not seen any other indicators that this is a broader problem. Despite concerns about potential AI overcapacity, major tech firms continue to invest heavily in AI infrastructure:

• Amazon: Pledged $100 billion towards AI data centers.
• Alphabet (Google): Committed $75 billion.
• Meta (Facebook): Planning to spend up to $65 billion.
• Alibaba: Announced a $53 billion investment over the next three years.

If we see a rush of announcements identifying retrenchment by other players in the AI/hyperscaler market, then it will be time to step back and reconsider the trajectory of AI-driven data center development.

AI Is Here To Stay

And, of course, critics argue that while tech giants are making unprecedented investments in AI, the real-world applications remain limited.

Couple this with the recent emergence of potentially more cost-efficient open-source AI models, such as those developed by DeepSeek, and the debate over whether Big Tech is overcommitting to AI infrastructure has intensified.

But few are arguing that AI will not continue to grow — and preparation to meet the demand will fluctuate much as any new technology does, especially as those real-world applications for AI become more prevalent.

Conclusion

Microsoft’s purported decision to cancel data center leases signals a potential recalibration of its AI strategy. Whether this is a sign of cautious planning or an oversupply issue remains to be seen. As AI adoption evolves, companies will need to strike a balance between meeting demand and avoiding unnecessary capital expenditures. Investors and industry stakeholders will closely watch how Microsoft and its competitors navigate this dynamic landscape.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


SolarWinds buys Squadcast to speed incident response

Squadcast customers shared their experiences with the technology. “Since implementing Squadcast, we’ve reduced incoming alerts from tens of thousands to hundreds, thanks to flexible deduplication. It has a direct impact on reducing alert fatigue and increasing awareness,” said Avner Yaacov, Senior Manager at Redis, in a statement. According to SolarWinds,

Read More »

Norway Opens Application for One CO2 Storage Exploration Area

Norway’s Energy Ministry has designated another area of the North Sea for application for licenses to explore the potential of carbon dioxide (CO2) storage. The acreage comprises defined blocks on the Norwegian side of the sea, upstream regulator the Norwegian Offshore Directorate said in an online statement. This is the eighth time acreage is being offered for CO2 storage exploration or exploitation on the Norwegian continental shelf, it noted. The application window for the latest acreage offer closes April 23. “In line with the regulations on transportation and storage of CO2 into subsea reservoirs on the continental shelf, the ministry normally expects to award an exploration license prior to awarding an exploitation license in a relevant area”, the Energy Ministry said separately. Norway has so far awarded 13 CO2 storage licenses: 12 for exploration and one for exploitation. Energy Minister Terje Aasland commented, “The purpose of allocating land is to be able to offer stakeholders in Europe large-scale CO2 storage on commercial terms”. Licensing for CO2 storage is part of Norwegian regulations passed December 2014 to support CO2 storage to mitigate climate change.  “Norway has great potential for storage on the continental shelf”, the ministry added. The Norwegian continental shelf holds a theoretical CO2 storage capacity of 80 billion metric tons, representing about 1,600 years of Norwegian CO2 emissions at current levels, according to a statement by the ministry April 30, 2024. In the latest awards two consortiums with Norway’s majority state-owned Equinor ASA won two exploration licenses in the North Sea. Equinor and London-based Harbour Energy PLC together won a permit straddling blocks 15/8, 15/9, 15/11 and 15/12. The permit, EXL012, lasts four years with three phases. Harbour Energy Norge AS holds a 60 percent stake as operator while Equinor Low Carbon Solution AS has 40 percent, according to a work

Read More »

MP for Truro and Falmouth calls for Cornwall offshore wind strategy

A Labour politician in Cornwall has called for the region to ramp up its domestic offshore wind supply chain. Jayne Kirkham, member of parliament for Truro and Falmouth, said: “At a recent Celtic Sea Power event, I saw just how many brilliant companies are doing amazing things here.” She made the comments months after The Crown Estate entered the second stage of leasing acreage in the Celtic Sea last autumn. “Cornwall has a long history of industrial innovation,” Kirkham said while meeting with marine construction firm MintMech in Penryn. “We’ve got the heritage and the expertise, now we need a strategy that ensures Cornwall maximises the benefits of offshore wind.” The Crown Estate entered the latest phase of its fifth offshore wind leasing round, which aims to establish floating offshore wind farms in the Celtic Sea off the south-west of England and South Wales coast, in August. In the second phase of the leasing round, bidders must lay out plans to deliver new wind farms and explain how they will benefit local communities. The round has the potential to source up to 4.5GW of new wind capacity and spur investment in the local supply chain. Kirkham expressed hope that Cornish companies will soon be busy on UK projects. She said there are ongoing conversations with the National Energy System Operator (NESO) about ensuring potential wind energy hubs are well connected to the grid. Kirkham also referenced The Crown Estate’s £50 million Supply Chain Development Fund, which was launched to ensure the UK is prepared to meet offshore wind demands. The first £5m from the fund was awarded in 2024. Kirkham met with directors of Penryn-based marine construction firm MintMech in Jubilee Wharf to discuss the role Cornwall can play in the expansion of the UK’s offshore wind industry.

Read More »

Payroll in USA Oil and Gas Totals $168 Billion in 2024

Payroll in the U.S. oil and gas industry totaled $168 billion in 2024. That’s what the Texas Independent Producers & Royalty Owners Association (TIPRO) said in its latest State of Energy report, which was released this week, highlighting that this figure was “an increase of nearly $5 billion compared to the previous year”. Texas had the highest oil and gas payroll in the country in 2024, according to the report, which pointed out that this figure stood at $62 billion. The report outlined that California was “a distant second” with an oil and gas payroll figure of $15 billion, and that Louisiana was third, with an oil and gas payroll figure of $10 billion. Gasoline Stations with Convenience Stores had the highest U.S. oil and gas payroll by industry figure last year, at $26.8 billion, the report showed. Support Activities for Oil and Gas Operations had the second highest U.S. oil and gas payroll by industry figure in 2024, at $23.9 billion, and Crude Petroleum Extraction had the third highest, at $19.1 billion, the report outlined. The number of U.S. oil and gas businesses totaled 165,110, subject to revisions, TIPRO’s latest report stated. It highlighted that direct oil and natural gas Gross Regional Product exceeded $1 trillion last year and said the U.S. oil and natural gas industry purchased goods and services from over 900 different U.S. industry sectors in the amount of $865 billion in 2024. According to the report, Texas had the highest number of oil and gas businesses in the nation last year, with 23,549. This was followed by California, with 9,486 oil and gas businesses, Florida, with 7,695 oil and gas businesses, Georgia, with 6,453 oil and gas businesses, and New York, with 5,768 oil and gas businesses, the report outlined. The report noted that, in

Read More »

European Commission Proposes to Extend Gas Storage Regulation to 2027

The European Commission has formally proposed to prolong to 2027 a regulation requiring that natural gas storage facilities in the European Union be at least 90 percent full by November each year. “In the current geopolitical context and volatile situation in the global gas markets, this 2-year extension will contribute to ensuring continued security of energy supply across the EU and stability of the European gas market”, the Commission said in an online statement. “It will notably ensure that the EU prepares for the upcoming winter seasons in a coordinated manner”. The Gas Storage Regulation was adopted June 2022 at the height of the energy crisis. It will expire at the end of 2025. “The EU’s significant gas storage capacities and EU storage facilities are the main supply source of gas in winter, ensuring 30 percent of EU winter supply”, the Commission added. “Enabling companies to purchase and store cheaper gas in summer, when demand is lower in the EU, helps to make energy more affordable for EU citizens”. The regulation has helped the 27-member bloc resolve gas shortages and provided a cushion against market uncertainties and price volatility, says the proposal published on the Commission’s website. While the current situation has improved compared to 2022-23, “the European gas market remains tight”, thus the need to continue storing gas, according to the proposal. “More intense competition for global LNG supplies can increase Member States’ exposure to price volatility”, the proposal says. “The gas price development during the 2024/2025 winter may confirm the trend”. The regulation contains intermediary filling targets for the months of February, May, July and September. “Predictable filling trajectories increase transparency and prevent market distortion”, the proposal says. According to the intermediary targets this year, as announced by the Commission November 29, 2024, most member states including top gas consumers Germany, Italy and

Read More »

EnQuest launches bid for Serica Energy

Serica Energy and EnQuest are in discussions about a possible transaction to combine the two companies. According to a company statement, the board of Serica believes that a combined company would have greater scale and diversification, as well as unlocking synergies and providing a stronger platform for further growth. Although discussions are ongoing, the transaction would likely be structured as an all-share offer by EnQuest for Serica by way of a reverse takeover under the UK Listing Rules. This would involve a return of capital to existing Serica shareholders conditional upon completion of the transaction, and Serica shareholders would hold a majority of the shares in the enlarged company, with shares listed on the equity shares (commercial companies) (ESCC) category of the London Stock Exchange. The Serica Energy statement added that there is currently no certainty that an offer will be made, nor as to the terms on which any offer would be made. EnQuest must announce a firm intention to make an offer for Serica, or confirm that it does not intend to make an offer, by 4 April 2025, though this deadline can be extended.

Read More »

Aramco CEO Says Company Is Deploying AI Tech at Scale

In Saudi Aramco’s latest results statement, which was posted on the company’s site this week, Aramco President and CEO Amin H. Nasser said Aramco is “adopting and deploying AI technologies and solutions at scale” across its operations. Nasser outlined in the statement that this is “unlocking greater efficiencies and value creation throughout” the company. In a statement posted on Aramco’s site in September, Aramco’s EVP of Technology & Innovation, Ahmad Al-Khowaiter, said “new digital technologies such as generative AI and the Industrial Internet of Things are expected to transform not only how we work, but also our commercial environment”. “Aramco is pioneering the use of these technologies at an industrial scale to add significant value across our operations. Our history of innovation inspires us to continue harnessing emerging technologies and help realize the Kingdom’s ambitions to become a global AI leader,” he added. The statement posted on Aramco’s site in September noted that, during the Global AI Summit (GAIN) – which took place in Riyadh, Saudi Arabia, in September 2024 – Aramco signed Memoranda of Understanding (MoU) with Cerebras Systems and FuriosaAI to explore collaboration in the supercomputing and AI domains. It said another MoU signed with Rebellions focuses on potential deployment of the latter’s Neural Processing Unit chips in Aramco’s data centers, with a view to enhancing digital infrastructure and driving advanced AI innovations, and noted that Aramco signed another MoU with SambaNova Systems to explore ways to accelerate AI capabilities, innovation, and Kingdom-wide adoption. “Aramco also announced the deployment of an AI supercomputer, one of the first systems of its kind in the region,” Aramco added in that statement. “Powered by some of the most powerful NVIDIA Graphical Processing Units (GPUs), it is designed to accelerate complex computing tasks like analyzing drilling plans and geological data to recommend

Read More »

Seven important trends in the server sphere

The pace of change around server technology is advancing considerably, driven by hyperscalers but spilling over into the on-premises world as well. There are numerous overall trends, experts say, including: AI Everything: AI mania is everywhere, and without high-powered hardware to run it, it’s just vapor. But it’s more than a buzzword; it is a very real and measurable trend. AI servers are notable because they are decked out with high-end CPUs, GPU accelerators, and oftentimes a SmartNIC network controller. All the major players — Nvidia, Supermicro, Google, Asus, Dell, Intel, HPE — as well as smaller vendors are offering purpose-built AI hardware, according to a recent Network World article. AI edge server growth: There is also a trend towards deploying AI edge servers. The global edge AI servers market is expected to be worth around $26.6 billion by 2034, up from $2.7 billion in 2024, according to a Market.US report. Considerable amounts of data are collected on the edge. Edge servers do the job of culling the useless data and sending only the necessary data back to data centers for processing. The market is rapidly expanding as industries such as manufacturing, automotive, healthcare, and retail increasingly deploy IoT devices and require immediate data processing for decision-making and operational efficiency, according to the report. Liquid cooling gains ground: Liquid cooling is inching its way in from the fringes into the mainstream of data center infrastructure. What was once a difficult add-on is now becoming a standard feature, says Jeffrey Hewitt, vice president and analyst with Gartner. “Server providers are working on developing the internal chassis plumbing for direct-to-chip cooling with the goal of supporting the next generation of AI CPUs and GPUs that will produce high amounts of heat within their servers,” he said. New data center structures: Not

Read More »
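The Market.US projection above ($2.7 billion in 2024 to $26.6 billion by 2034) implies a compound annual growth rate that is straightforward to derive; a quick sketch using only the figures quoted in the excerpt:

```python
# Implied compound annual growth rate (CAGR) of the edge AI server
# market, derived from the Market.US figures quoted above.
start_value_bn = 2.7   # estimated 2024 market size, $ billions
end_value_bn = 26.6    # projected 2034 market size, $ billions
years = 10

cagr = (end_value_bn / start_value_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 26% per year
```

A sustained ~26% annual growth rate for a decade is aggressive even by hardware-market standards, which is worth keeping in mind when weighing such projections.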

Data center vacancies hit historic lows despite record construction

The growth comes despite considerable headwinds facing data center operators, including higher construction costs, equipment pricing, and persistent shortages in critical materials like generators, chillers and transformers, CBRE stated. There is a considerable pricing disparity between newly built data centers and legacy facilities, reflecting the premium placed on modern, energy-efficient infrastructure. Specifically, liquid/immersion cooling is preferred over air cooling for modern server requirements, CBRE found. On the networking side, major telecom companies made substantial investments in fiber in the second half of 2024, reflecting the growing need for more network infrastructure and capacity to accommodate growing demand from AI and data providers. There have also been many notable deals recently: AT&T’s multi-year, $1 billion agreement with Corning to provide next-generation fiber, cable and connectivity solutions; Comcast’s proposed acquisition of Nitel; Verizon’s agreement to acquire Frontier, the largest pure-play fiber internet provider in the U.S.; and T-Mobile’s entry into the fiber internet market via partnerships with fiber-optic providers. In the quarter, Meta announced plans for a 25,000-mile undersea fiber cable that would connect the U.S. East and West coasts with global markets across the Atlantic, Indian and Pacific oceans. The project would mark the first privately owned and operated global fiber cable network.

Read More »

AI driving a 165% rise in data center power demand by 2030

Goldman Sachs Research estimates current power usage by the global data center market at around 55 gigawatts (GW), which breaks down as 54% for cloud computing workloads, 32% for traditional line-of-business workloads and 14% for AI. By 2027, that number jumps to 84 GW, with AI growing to 27% of the overall market, cloud dropping to 50%, and traditional workloads falling to 23%, Schneider stated. Goldman Sachs Research estimates that there will be around 122 GW of data center capacity online by the end of 2030, and the density of power use in data centers is likely to grow as well, from 162 kW per square foot to 176 kW per square foot in 2027, thanks to AI, Schneider stated. “Data center supply — specifically the rate at which incremental supply is built — has been constrained over the past 18 months,” Schneider wrote. These constraints have arisen from the inability of utilities to expand transmission capacity because of permitting delays, supply chain bottlenecks, and infrastructure that is both costly and time-intensive to upgrade. The result is that power demand from data centers will require additional utility investment, to the tune of about $720 billion of grid spending through 2030. Data center operators are also subject to the pace of public utilities, which move much slower than hyperscalers. “These transmission projects can take several years to permit, and then several more to build, creating another potential bottleneck for data center growth if the regions are not proactive about this given the lead time,” Schneider wrote.

Read More »
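The Goldman Sachs Research shares quoted above also let one derive the implied AI-specific capacity in gigawatts; the derivation below is illustrative, as these GW figures are not stated directly in the excerpt:

```python
# Implied AI-workload data center capacity, derived from the Goldman
# Sachs Research totals and percentage shares quoted above
# (illustrative arithmetic only, not figures stated in the report excerpt).
total_now_gw, ai_share_now = 55, 0.14      # ~current estimate
total_2027_gw, ai_share_2027 = 84, 0.27    # 2027 projection

ai_now_gw = total_now_gw * ai_share_now        # ~7.7 GW
ai_2027_gw = total_2027_gw * ai_share_2027     # ~22.7 GW
print(f"AI capacity: ~{ai_now_gw:.1f} GW now -> ~{ai_2027_gw:.1f} GW in 2027")
print(f"Implied growth: ~{ai_2027_gw / ai_now_gw:.1f}x")
```

In other words, even though the total market grows only ~53% by 2027, the AI slice roughly triples, which is what drives the headline 165% rise by 2030.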

Top data storage certifications to sharpen your skills

Organization: Hitachi Vantara Skills acquired: Knowledge of data center infrastructure management tasks automation using Hitachi Ops Center Automator. Price: $100 Exam duration: 60 minutes How to prepare: Knowledge of all storage-related operations from an end-user perspective, including planning, allocating, and managing storage and architecting storage layouts. Read more about Hitachi Vantara’s training and certification options here. Certifications that bundle cloud, networking and storage skills AWS Certified Solutions Architect – Professional The AWS Certified Solutions Architect – Professional certification from leading cloud provider Amazon Web Services (AWS) helps individuals showcase advanced knowledge and skills in optimizing security, cost, and performance, and automating manual processes. The certification is a means for organizations to identify and develop talent with these skills for implementing cloud initiatives, according to AWS. The ideal candidate has the ability to evaluate cloud application requirements, make architectural recommendations for deployment of applications on AWS, and provide expert guidance on architectural design across multiple applications and projects within a complex organization, AWS says. Certified individuals report increased credibility with technical colleagues and customers as a result of earning this certification, it says. Organization: Amazon Web Services Skills acquired: Helps individuals showcase skills in optimizing security, cost, and performance, and automating manual processes Price: $300 Exam duration: 180 minutes How to prepare: The recommended experience prior to taking the exam is two or more years of experience in using AWS services to design and implement cloud solutions Cisco Certified Internetwork Expert (CCIE) Data Center The Cisco CCIE Data Center certification enables individuals to demonstrate advanced skills to plan, design, deploy, operate, and optimize complex data center networks. 
They will gain comprehensive expertise in orchestrating data center infrastructure, focusing on seamless integration of networking, compute, and storage components. Other skills gained include building scalable, low-latency, high-performance networks that are optimized to support artificial intelligence (AI)

Read More »

Netskope expands SASE footprint, bolsters AI and automation

Netskope is expanding its global presence by adding multiple regions to its NewEdge carrier-grade infrastructure, which now includes more than 75 locations to ensure processing remains close to end users. The secure access service edge (SASE) provider also enhanced its digital experience monitoring (DEM) capabilities with AI-powered root-cause analysis and automated network diagnostics. “We are announcing continued expansion of our infrastructure and our continued focus on resilience. I’m a believer that nothing gets adopted if end users don’t have a great experience,” says Netskope CEO Sanjay Beri. “We monitor traffic, we have multiple carriers in every one of our more than 75 regions, and when traffic goes from us to that destination, the path is direct.” Netskope added regions including data centers in Calgary, Helsinki, Lisbon, and Prague as well as expanded existing NewEdge regions including data centers in Bogota, Jeddah, Osaka, and New York City. Each data center offers customers a range of SASE capabilities including cloud firewalls, secure web gateway (SWG), inline cloud access security broker (CASB), zero trust network access (ZTNA), SD-WAN, secure service edge (SSE), and threat protection. The additional locations enable Netskope to provide coverage for more than 220 countries and territories with 200 NewEdge Localization Zones, which deliver a local direct-to-net digital experience for users, the company says.

Read More »

Inside the Nuclear Race for Data Center Energy with Aalo Atomics CEO Matt Loszak

The latest episode of the DCF Show podcast delves into one of the most pressing challenges facing the data center industry today: the search for sustainable, high-density power solutions. And how, as hyperscale operators like Google and Meta contend with growing energy demands—and, in some cases, resistance from utilities unwilling or unable to support their expanding footprints—the conversation around nuclear energy has intensified.  Both legacy nuclear providers and innovative startups are racing to secure the future business of data center giants, each bringing unique approaches to the table. Our guest for this podcast episode is Matt Loszak, co-founder and CEO of Aalo Atomics, an Austin-based company that’s taking a fresh approach to nuclear energy. Aalo, which secured a $29.5 million Series A funding round in 2024, stands out in the nuclear sector with its 10-megawatt sodium-cooled reactor design—eliminating the need for water, a critical advantage for siting flexibility. Inspired by the Department of Energy’s MARVEL microreactor, Aalo’s technology benefits from direct expertise, as the company’s CTO was the chief architect behind MARVEL. Beyond reactor design, Aalo’s vision extends to full-scale modular plant production. Instead of just building reactors, the company aims to manufacture entire nuclear plants using prefabricated, LEGO-style components. The fully modular plants, shipped in standard containers, are designed to match the footprint of a data center while requiring no onsite water—features that could make them particularly attractive to hyperscale operators seeking localized, high-density power.  Aalo has already made significant strides, with the Department of Energy identifying land at Idaho National Laboratory (INL) as a potential site for its first nuclear facility. 
The company is on an accelerated timeline, expecting to complete a non-nuclear prototype within three months and break ground on its first nuclear reactor in about a year—remarkably fast progress for the nuclear industry. In our discussion,

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »