China Gas Sector Lobbies for More Power Plants to Boost Demand

China’s natural gas producers are lobbying Beijing to increase the number of power plants that run on the fuel, in a bid to help prop up faltering demand.

The power sector – which currently accounts for 18 percent of China’s gas consumption – is viewed by the industry as a key engine of growth, according to people involved in advising on energy policy. Under the sector’s latest proposal, China would build nearly 70 gigawatts of new gas-fired capacity by 2030, an almost 50 percent increase from 2025’s estimated level, they said, asking not to be named as the plan is not public.
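
As a back-of-envelope check on those figures (our arithmetic, not the proposal's): if roughly 70 GW of new build represents an almost 50 percent increase, the implied 2025 base is about 140 GW. A minimal sketch in Python:

```python
# Back-of-envelope check of the reported gas-power buildout figures.
# Assumption: the "almost 50 percent increase" refers to total installed
# gas-fired capacity, so the ~70 GW of new build implies the 2025 base.

new_build_gw = 70          # proposed new gas-fired capacity by 2030
increase_pct = 0.50        # "almost 50 percent" increase over 2025

implied_2025_base_gw = new_build_gw / increase_pct    # ~140 GW
implied_2030_total_gw = implied_2025_base_gw + new_build_gw

print(f"Implied 2025 base:  ~{implied_2025_base_gw:.0f} GW")   # ~140 GW
print(f"Implied 2030 total: ~{implied_2030_total_gw:.0f} GW")  # ~210 GW
```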

The government has started collecting proposals as it drafts China’s next five-year plan, which will be ratified by the National People’s Congress in March 2026. The strategic blueprint will outline economy-wide targets that balance growth, decarbonization and energy security goals. 

China’s gas demand, once fast-expanding, has slowed over the last few years due to weaker industrial activity, booming renewable-energy supply and a continued reliance on coal. An unseasonably warm winter and strong inventories have prompted analysts to cut forecasts for China’s imports of liquefied natural gas in 2025, with deliveries slated to fall compared to the previous year.

For domestic drillers, which have increasingly leaned on gas as oil consumption also stutters, expanding the amount that can be sold to the power sector offers a way to offset weaker growth in heating and elsewhere. Slowing urbanization and improved air quality have essentially ended a decade-long coal-to-gas transition among households.

China is advancing energy market reforms that will favor more cost-effective sources of electricity generation. Although gas power is more expensive than solar, which currently trades at less than half the price, it’s able to ramp up more quickly than baseload coal or nuclear. That agility could secure the fuel a larger role in the power mix, even if clean energy ultimately dominates and limits utilization rates.

Any potential buildout still faces major hurdles. Seaborne gas imports are prohibitively expensive compared to domestic coal or renewable sources. Domestic alternatives have to wrestle with the technical challenges and expense of tapping harder-to-reach shale and coalbed methane reserves. Storage constraints also limit gas’s ability to play a larger role in meeting peak demand and handling seasonal spikes.




Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Kyndryl, AWS unwrap AI-driven mainframe migration service

Kyndryl is tapping into agentic AI technology from AWS to offer enterprise customers another way to integrate with or move mainframe applications to the AWS Cloud platform. Specifically, Kyndryl will offer an implementation service based on the recently announced AWS Transform package, which analyzes mainframe code bases, decomposes them into


IBM claims to have ‘only realistic path’ to quantum computing

Error correction is the biggest obstacle to practical quantum computing. Quantum computing companies typically address error correction with redundant qubits, but with previous approaches the number of these redundant qubits would grow much faster than the number of total usable qubits, keeping the computers from reaching any useful size. Earlier
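
For a sense of the scaling problem the teaser describes, here is an illustrative sketch using the distance-d surface code, a common benchmark in which one logical qubit costs roughly 2d^2 - 1 physical qubits. The surface-code assumption is ours for illustration; it is not IBM's approach:

```python
# Illustrative sketch (not IBM's method): physical-qubit overhead of a
# distance-d surface code, where one logical qubit needs about
# 2 * d**2 - 1 physical qubits (d**2 data qubits plus ancillas).

def surface_code_physical_qubits(logical_qubits: int, distance: int) -> int:
    physical_per_logical = 2 * distance**2 - 1
    return logical_qubits * physical_per_logical

for d in (3, 11, 25):
    print(f"d={d}: {surface_code_physical_qubits(100, d):,} physical qubits")
# Even 100 logical qubits at distance 25 cost ~125,000 physical qubits,
# which is why overhead growth, not raw qubit count, is the bottleneck.
```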


El Capitan retains No. 1 supercomputer ranking

The El Capitan system at Lawrence Livermore National Laboratory in California maintained its title as the world’s fastest supercomputer, according to the latest TOP500 list released today. U.S.-based systems also won the second and third spots with long-time list leader Frontier at No. 2 and the Aurora system at No.


NJ Wind Project That Trump Dubbed a Disaster Is Canceled

The developer of a New Jersey offshore wind farm designed to power more than 700,000 homes has pulled the plug on the project, the latest setback for a sector whose prospects have worsened under the Trump administration. “This filing marks the closing of a chapter,” Atlantic Shores Offshore Wind Chief Executive Officer Joris Veldhoven said in a statement Monday. Last week the company filed a motion with regulators to terminate the renewable energy certificates the project had been awarded. The certificates would have underpinned the project’s finances.

Long seen as a key resource to power major Northeast and Mid-Atlantic cities, offshore wind has struggled with inflation, supply constraints and now an administration intent on slowing its advance. The new president has long been critical of the technology and took particular aim at Atlantic Shores. A few days after his election, Trump called Atlantic Shores a disaster and halted federal funding of wind projects.

Atlantic Shores, planned about 8.7 miles (14 kilometers) off the coast, was meant to encompass as many as 200 wind turbines and provide as much as 2.8 gigawatts of power.


Energy Secretary Wright Testifies Before House Energy Subcommittee on FY2026 Budget Request

WASHINGTON— U.S. Secretary of Energy Chris Wright testified today before the U.S. House Energy Subcommittee on the Department of Energy’s Fiscal Year 2026 budget request. Last month, Secretary Wright testified before the U.S. Senate Appropriations Subcommittee on Energy and Water Development and the U.S. House Appropriations Subcommittee on Energy and Water Development to outline the Department’s priorities and provide an overview of the FY2026 request. The FY2026 Budget aligns with President Trump’s directive to restore American energy dominance and rein in bloated federal spending. It brings non-defense discretionary spending to the most disciplined level since 2017 and redirects more than $15 billion away from Green New Scam programs that drive up costs and weaken the U.S. energy system. For more details, view the budget toplines here.

Secretary Wright’s opening remarks: Thank you Chairman Latta, Chairman Guthrie, Ranking Member Castor, and Ranking Member Pallone, and Members of the Committee. It is an honor to appear before you today as Secretary of Energy to discuss the President’s Fiscal Year 2026 Budget request for the Department of Energy.

Under President Trump’s leadership, our priorities for the Department are clear – to unleash a golden era of American energy dominance, strengthen our national security, and lead the world in innovation. A reliable and abundant energy supply is the foundation of a strong and prosperous nation. When America leads in energy, we lead in prosperity, security and human flourishing.

America has a historic opportunity to secure our energy systems, propel scientific and technological innovation, including AI; maintain and strengthen our weapons stockpiles; and meet Cold War legacy waste commitments. The Department of Energy will advance this critical mission while cutting red tape, increasing efficiency, and ensuring we are better stewards of taxpayer dollars.

The President’s Fiscal Year 26 budget will ensure taxpayer


Atlantic Shores Offshore Wind asks to terminate project’s renewable energy credits

Atlantic Shores Offshore Wind, the developer behind the 1.5-GW Atlantic Shores 1 wind energy project in progress offshore New Jersey, filed a petition Wednesday asking the state’s Board of Public Utilities to terminate the project’s offshore renewable energy credits and release it from all associated obligations. The petition cites President Donald Trump’s executive order pausing the development of offshore wind, along with other actions from his administration and the overall macroeconomic environment as reasons that “the project is no longer viable upon the terms and conditions set forth in the [Offshore Wind Renewable Energy Certificate] Order,” which was issued in 2021. “This is despite Petitioner’s diligent and good faith efforts to advance the Project toward completion as set forth in Section II of this Petition, including submission of a rebid of the Project in the Fourth Solicitation, which was concluded by the BPU without an award,” the petition said. However, Atlantic Shores Offshore Wind’s CEO Joris Veldhoven said in a Monday release that the project is seeking a “reset period” and that “this filing marks the closing of a chapter, but not the end for Atlantic Shores.” The project in March had its Clean Air Act permit remanded by the Environmental Protection Agency’s Environmental Appeals Board, an action which Atlantic Shores Offshore Wind argued lacked “good cause.” The board maintained that it has “broad discretion to grant a voluntary remand.” The uncertainty caused by Trump’s executive order, the project’s loss of its air permit, “and other actions taken by the current administration more generally” mean that the “Petitioner’s parent company has been forced to materially reduce its personnel, terminate contracts, and cancel planned project investments.” The project was launched as a joint venture between EDF Renewables and Shell. In January, Shell booked a $1 billion impairment associated with the project, which


US utility-scale energy storage to double, reach 65 GW by 2027: EIA

Utility-scale battery storage in the United States is poised to more than double over the next two years and will close out 2026 at nearly 65 GW — a rapid rise from 17 GW in the first quarter of 2024. The U.S. Energy Information Administration published its Short Term Energy Outlook on Tuesday, showing a rapid increase in anticipated energy storage resources. Counting projects larger than 1 MW in the electric power sector, EIA said domestic storage capacity will rise from about 28 GW at the end of Q1’25 to 64.9 GW at the end of 2026. Large-scale battery storage resources in the commercial and industrial sectors are expected to rise from about 100 MW to about 300 MW across the same time frame. EIA’s data shows annual U.S. electricity generation rising from 4,180 billion kWh in 2023 to 4,490 billion kWh in 2026. Monthly electricity demand is highest during the summer months in the United States, but despite a cooling degree day forecast in line with last year’s weather, EIA said it expects total U.S. electricity generation this summer will increase by 1%, or 14 billion kWh, compared with summer 2024. “We expect strong demand growth in the commercial sector because of the expansion of data centers,” EIA said. EIA anticipates natural gas-fired power plants between June and September 2025 will generate 3% less than they did last summer “because of higher natural gas prices and the continuing increase in new solar generating capacity.” Domestic solar generation this summer will grow by 33%, EIA said, and improving water supply in the western states will contribute to a 6% increase in U.S. hydroelectric generation.
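
A quick arithmetic pass over the figures quoted above (all inputs from the article; the summer baseline on the last line is inferred from the stated 1% = 14 billion kWh relationship):

```python
# Sanity-checking the EIA figures quoted in the teaser above.

storage_q1_2025_gw = 28.0      # utility-scale storage, end of Q1 2025
storage_end_2026_gw = 64.9     # projected, end of 2026
print(storage_end_2026_gw / storage_q1_2025_gw)   # ~2.3x: "more than double"

gen_2023_bkwh = 4_180          # annual generation, billion kWh
gen_2026_bkwh = 4_490
print((gen_2026_bkwh - gen_2023_bkwh) / gen_2023_bkwh)  # ~7.4% over 2023-2026

# A 1% summer increase equal to 14 billion kWh implies a summer
# generation baseline of roughly 1,400 billion kWh.
print(14 / 0.01)
```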


MISO submits revised fast-track interconnection process for FERC approval

The Midcontinent Independent System Operator on Friday asked the Federal Energy Regulatory Commission to approve a fast-track generation interconnection process after the agency rejected its initial plan last month. MISO runs the grid and wholesale power markets in 15 states and part of Canada, from Louisiana to Manitoba. In response to FERC’s May 16 decision, MISO set more limits around its proposed Expedited Resource Addition Study process, which would allow projects that meet certain criteria to avoid MISO’s generation interconnection queue and potentially receive an interconnection agreement within 90 days. Under the revised proposal, the ERAS process would be capped at 68 projects, with MISO studying only 10 projects each quarter. The proposed cap includes eight projects for the restructured states of Illinois and Michigan as well as 10 projects for independent power producers with power supply agreements with non-utility entities. The revised process requires “relevant electric retail regulatory authorities,” such as state utility commissions, to verify in writing that a proposed project will address either a load addition not already planned for or an identified resource adequacy deficiency within its jurisdictional footprint, MISO said in its filing. The new proposal also requires projects to have a commercial operation date within three years of an application being filed. MISO plans to end the ERAS process by Aug. 31, 2027, or after its 68-project cap is met, whichever happens first. The proposal addresses potential capacity shortfalls driven by load growth and power plant retirements, according to MISO, which pointed to various reports to support its claims. “These surveys and forecasts demonstrate that MISO must address resource adequacy and reliability needs in the next five years and that an accelerated interconnection study process must be established to address time-critical challenges,” the grid operator told FERC. The ERAS process will provide a bridge to 2028,
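
A hypothetical sketch of the screening logic as the article describes it; the field names and the simplified three-year check are illustrative assumptions, not MISO's actual tariff language:

```python
# Hypothetical sketch of the ERAS screening criteria described above.
from dataclasses import dataclass
from datetime import date

ERAS_TOTAL_CAP = 68             # program-wide project cap
ERAS_QUARTERLY_CAP = 10         # projects studied per quarter (pacing, not modeled here)
ERAS_SUNSET = date(2027, 8, 31) # program ends by this date or at the cap

@dataclass
class ErasApplication:
    filed: date
    commercial_operation: date
    rerra_verified: bool        # written verification from the retail regulator

def passes_screen(app: ErasApplication, projects_accepted: int) -> bool:
    if projects_accepted >= ERAS_TOTAL_CAP or app.filed > ERAS_SUNSET:
        return False
    # Commercial operation must come within three years of filing (approximate check).
    within_three_years = (app.commercial_operation - app.filed).days <= 3 * 365
    return app.rerra_verified and within_three_years
```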


Utility demand response messages do not need additional customer consent: FCC

Electric utilities can make calls and send demand response text messages to customers who have provided their phone number, the Federal Communications Commission said in a Monday declaratory ruling. “When a consumer gives a utility their phone number they give their prior express consent to receive non-telemarketing demand response calls and texts,” the commission wrote. “We confirm that demand response communications are closely related to utility service and thus within the scope of consent customers give when they provide their phone number to their utility,” FCC said. “We thus make clear that a utility can make critical, time-sensitive demand response communications to its customers rather than having to solicit additional consent from each customer, a burdensome and often costly undertaking.” The Edison Electric Institute, which represents investor-owned utilities, requested in March that the FCC clarify that utilities have prior express consent under the Telephone Consumer Protection Act, and utilities can make demand response communications without risking penalties or fines. EEI “appreciates the FCC allowing electric companies to deliver timely and clear communications that help customers conserve electricity — and save money — at times of extremely high demand,” Senior Vice President of Customer Solutions Phil Dion said in a statement. “As summer begins and Americans use more electricity to cool their homes and businesses, demand response programs allow grid operators to maintain reliability while helping to keep customer bills as low as possible.” Utilities see demand response as a key resource to meet growing electricity demand. EEI told the FCC in its March request that the clarification would help “ensure that utilities can continue to provide safe, reliable, and efficient service to customers.” U.S. electricity demand could grow 25% from 2023 to 2030, ICF said in a May report. And utility demand-side management programs, which include demand response, could be able to meet


Cisco reinvigorates data center, campus, branch networking with AI demands in mind

“We have a number of … enterprise data center customers that have been using bi-directional optics for many generations, and this is the next generation of that feature,” said Bill Gartner, senior vice president and general manager of Cisco’s optical systems and optics business. “The 400G lets customers use their existing fiber infrastructure and reduces fiber count for them so they can use one fiber instead of two, for example,” Gartner said. “What’s really changed in the last year or so is that with AI buildouts, there’s much, much more optics that are part of 400G and 800G, too. For AI infrastructure, the 400G and 800G optics are really the dominant optics going forward,” Gartner said.

New AI Pods

Taking aim at next-generation interconnected compute infrastructures, Cisco expanded its AI Pod offering with the Nvidia RTX 6000 Pro and Cisco UCS C845A M8 server package. Cisco AI Pods are preconfigured, validated, and optimized infrastructure packages that customers can plug into their data center or edge environments as needed. The Pods include Nvidia AI Enterprise, which features pretrained models and development tools for production-ready AI, and are managed through Cisco Intersight. The Pods are based on Cisco Validated Design principles, which offer customers pre-tested and validated network designs that provide a blueprint for building reliable, scalable, and secure network infrastructures, according to Cisco. Building out the kind of full-scale AI infrastructure compute systems that hyperscalers and enterprises will utilize is a huge opportunity for Cisco, said Daniel Newman, CEO of The Futurum Group. “These are full-scale, full-stack systems that could land in a variety of enterprise and enterprise service application scenarios, which will be a big story for Cisco,” Newman said.

Campus networking

For the campus, Cisco has added two new programmable Silicon One-based Smart Switches: the C9350 Fixed Access Smart Switches and C9610


Qualcomm’s $2.4B Alphawave deal signals bold data center ambitions

Qualcomm says its Oryon CPU and Hexagon NPU processors are “well positioned” to meet growing demand for high-performance, low-power compute as AI inferencing accelerates and more enterprises move to custom CPUs housed in data centers. “Qualcomm’s advanced custom processors are a natural fit for data center workloads,” Qualcomm president and CEO Cristiano Amon said in the press release. Alphawave’s connectivity and compute technologies can work well with the company’s CPU and NPU cores, he noted. The deal is expected to close in the first quarter of 2026.

Complementing the ‘great CPU architecture’ Qualcomm has been amassing

Client CPUs have been a “big play” for Qualcomm, Moor’s Kimball noted; the company acquired chip design company Nuvia in 2021 for $1.4 billion and has also announced that it will be designing data center CPUs with Saudi AI company Humain. “But there was a lot of data center IP that was equally valuable,” he said. This acquisition of Alphawave will help Qualcomm complement the “great CPU architecture” it acquired from Nuvia with the latest in connectivity tools that link a compute complex with other devices, as well as with chip-to-chip communications, and all of the “very low level architectural goodness” that allows compute cores to deliver “absolute best performance.” “When trying to move data from, say, high bandwidth memory to the CPU, Alphawave provides the IP that helps chip companies like Qualcomm,” Kimball explained. “So you can see why this is such a good complement.”


LiquidStack launches cooling system for high density, high-powered data centers

The CDU is serviceable from the front of the unit, with no rear or end access required, allowing the system to be placed against the wall. The skid-mounted system can come with rail and overhead piping pre-installed or shipped as separate cabinets for on-site assembly. The single-phase system has high-efficiency dual pumps designed to protect critical components from leaks, and a centralized design with separate pump and control modules reduces both the number of components and complexity. “AI will keep pushing thermal output to new extremes, and data centers need cooling systems that can be easily deployed, managed, and scaled to match heat rejection demands as they rise,” said Joe Capes, CEO of LiquidStack in a statement. “With up to 10MW of cooling capacity at N, N+1, or N+2, the GigaModular is a platform like no other—we designed it to be the only CDU our customers will ever need. It future-proofs design selections for direct-to-chip liquid cooling without traditional limits or boundaries.”
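
The N, N+1, and N+2 labels describe redundancy: how many spare modules sit beyond the count needed to carry the load. A generic sizing sketch, assuming a 2.5 MW module size for illustration (not a LiquidStack specification):

```python
# Generic N / N+1 / N+2 redundancy sizing; module size is an assumption.
import math

def modules_needed(load_mw: float, module_mw: float, spares: int) -> int:
    """Modules required to carry the load, plus `spares` redundant units."""
    return math.ceil(load_mw / module_mw) + spares

for spares in (0, 1, 2):                      # N, N+1, N+2
    print(f"N+{spares}:", modules_needed(10, 2.5, spares), "modules")
# With assumed 2.5 MW modules, 10 MW of load needs 4, 5, or 6 modules.
```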


Enterprises face data center power design challenges

“Now, with AI, GPUs need data to do a lot of compute and send that back to another GPU. That connection needs to be close together, and that is what’s pushing the density, the chips are more powerful and so on, but the necessity of everything being close together is what’s driving this big revolution,” he said. That revolution in new architecture is new data center designs. Cordovil said that instead of putting the power shelves within the rack, system administrators are putting a sidecar next to those racks and loading the sidecar with the power system, which serves two to four racks. This allows for more compute per rack and lower latency since the data doesn’t have to travel as far. The problem is that 1 MW racks are uncharted territory and no one knows how to manage the power, which is considerable now. “There’s no user manual that says, hey, just follow this and everything’s going to be all right. You really need to push the boundaries of understanding how to work. You need to start designing something somehow, so that is a challenge to data center designers,” he said. And this brings up another issue: many corporate data centers have power plugs that are like the ones you have at home, more or less, so technicians didn’t need an advanced electrician certification. “We’re not playing with that power anymore. You need to be very aware of how to connect something. Some of the technicians are going to need to be certified electricians, which is a skills gap in the market that we see in most markets out there,” said Cordovil. A CompTIA A+ certification will teach you the basics of power, but not the advanced skills needed for these increasingly dense racks. Cordovil
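
Some rough numbers behind that point, using a simple single-line P = V × I approximation (real three-phase sizing divides current by √3 and the power factor; the voltage levels are assumptions for illustration):

```python
# Rough single-line illustration of current draw for a 1 MW rack.
# Three-phase sizing would divide these currents by sqrt(3) * power factor.

rack_power_w = 1_000_000

for volts in (208, 415, 800):            # assumed distribution voltages
    amps = rack_power_w / volts
    print(f"{volts} V -> {amps:,.0f} A")

# Compare: a household circuit carries ~15-20 A, which is why Cordovil
# expects some data center technicians to need electrician certification.
```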


HPE NonStop servers target data center, high-throughput applications

HPE has bumped up the size and speed of its fault-tolerant NonStop Compute servers. There are two new servers – the 8TB, Intel Xeon-based NonStop Compute NS9 X5 and NonStop Compute NS5 X5 – aimed at enterprise customers looking to upgrade their transaction processing network infrastructure or support larger application workloads. Like other HPE NonStop systems, the two new boxes include compute, software, storage, networking and database resources as well as full-system clustering and HPE’s specialized NonStop operating system. The flagship NS9 X5 features support for dual-fabric HDR200 InfiniBand interconnect, which effectively doubles the interconnect bandwidth between it and other servers compared to the current NS8 X4, according to an HPE blog detailing the new servers. It supports up to 270 networking ports per NS9 X system, can be clustered with up to 16 other NS9 X5s, and can support 25 GbE network connectivity for modern data center integration and high-throughput applications, according to HPE.


AI boom exposes infrastructure gaps: APAC’s data center demand to outstrip supply by 42%

“Investor confidence in data centres is expected to strengthen over the remainder of the decade,” the report said. “Strong demand and solid underlying fundamentals fuelled by AI and cloud services growth will provide a robust foundation for investors to build scale.”

Enterprise strategies must evolve

With supply constrained and prices rising, CBRE recommended that enterprises rethink data center procurement models. Waiting for optimal sites or price points is no longer viable in many markets. Instead, enterprises should pursue early partnerships with operators that have robust development pipelines and focus on securing power-ready land. Build-to-suit models are becoming more relevant, especially for larger capacity requirements. Smaller enterprise facilities — those under 5 MW — may face sustainability challenges in the long term. The report suggested that these could become “less relevant” as companies increasingly turn to specialized colocation and hyperscale providers. Still, traditional workloads will continue to represent up to 50% of total demand through 2030, preserving value in existing facilities for non-AI use cases, the report added. The region’s projected 15 to 25 GW gap is more than a temporary shortage — it signals a structural shift, CBRE said. Enterprises that act early to secure infrastructure, invest in emerging markets, and align with power availability will be best positioned to meet digital transformation goals. “Those that wait may find themselves locked out of the digital infrastructure they need to compete,” the report added.


Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
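
The quoted capex figures, side by side (all values from the article):

```python
# Comparing the capex figures quoted in the teaser above.

bi_2023_total = 110    # $bn across the six companies, Bloomberg Intelligence
bi_2025_total = 200    # $bn, Bloomberg Intelligence estimate for 2025
print((bi_2025_total - bi_2023_total) / bi_2023_total)  # ~82% growth in two years

msft_fy2025 = 80.0     # $bn, fiscal year to June 30, 2025 (Smith's figure)
msft_cy2025_bi = 62.4  # $bn, calendar 2025 (Bloomberg Intelligence estimate)
msft_2020 = 17.6       # $bn, Microsoft's 2020 capex
print(msft_fy2025 / msft_2020)   # ~4.5x the 2020 level
```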


John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet as a non-tech company it has been a regular at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do


2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
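
As a concrete illustration of the LLM-as-judge pattern mentioned above, here is a minimal ensemble-voting sketch. The `call_model` stub and judge model names are hypothetical placeholders, not any specific vendor's API:

```python
# Minimal LLM-as-judge sketch: several cheap models vote PASS/FAIL on an
# answer, and the majority wins. All names here are hypothetical.
from collections import Counter

JUDGE_MODELS = ["judge-a", "judge-b", "judge-c"]   # three or more cheap models

def call_model(model: str, prompt: str) -> str:
    """Placeholder: swap in a real chat-completion client here."""
    return "PASS"   # stubbed response so the sketch runs end to end

def judge(question: str, answer: str) -> str:
    votes = []
    for model in JUDGE_MODELS:
        prompt = (
            f"Question: {question}\nAnswer: {answer}\n"
            "Reply with exactly PASS or FAIL."
        )
        votes.append(call_model(model, prompt).strip().upper())
    # Majority voting across judges smooths out single-model quirks.
    return Counter(votes).most_common(1)[0][0]

print(judge("What is 2 + 2?", "4"))   # PASS (with the stub above)
```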


OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
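
To make the automated framework concrete, here is a high-level conceptual sketch of a generate, score, reinforce red-teaming loop. The objects and method names are hypothetical; this shows only the general shape the paper's title describes, not OpenAI's implementation:

```python
# Conceptual sketch of an automated red-teaming loop: an attacker model
# proposes prompts, a reward model scores success and novelty, and the
# attacker is reinforced. Hypothetical interfaces, not OpenAI's code.

def generate_attacks(generator, history, n=8):
    """Ask the attacker model for n candidate prompts unlike past ones."""
    return [generator.sample(avoid=history) for _ in range(n)]

def red_team_step(generator, target, reward_model, history):
    for attack in generate_attacks(generator, history):
        response = target.respond(attack)
        # Reward combines attack success with diversity vs. past attacks,
        # mirroring the "auto-generated rewards" idea in the paper's title.
        reward = reward_model.score(attack, response, history)
        generator.reinforce(attack, reward)   # multi-step RL update
        history.append(attack)
    return history
```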
