IBM wrangles AI agents to work across complex enterprise environments

In addition, the new Agent Catalog in watsonx Orchestrate can simplify access to more than 150 agents and pre-built tools from IBM and its partners, which include Box, MasterCard, Oracle, Salesforce, ServiceNow, and Symplistic.ai.

IBM is also rolling out an agent builder tool in June that will let customers build their own agents in less than five minutes, IBM stated.

Multi-agent orchestration capabilities will take things a step further, allowing AI agents to work in concert and enabling advanced, agent-to-agent collaboration. “This allows specialized agents, whether built in-house, by partners, or using open source, to share information, and tackle complex, multi-step processes together. Companies can embed these capabilities into their agentic systems to analyze user requests and route instructions across the right agents, assistants, and skills in real time,” Gunnar wrote.
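To make the routing step concrete, here is a minimal sketch of the pattern Gunnar describes — a coordinator that analyzes a request and dispatches it to the best-matching specialized agent. The agent names and the keyword-overlap heuristic are illustrative assumptions, not watsonx Orchestrate's actual API.

```python
# Minimal sketch of multi-agent routing: a coordinator inspects a user
# request and dispatches it to the specialized agent best suited to it.
# Agent names and the keyword heuristic are illustrative only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    keywords: set[str]
    handle: Callable[[str], str]

def route(request: str, agents: list[Agent]) -> str:
    words = set(request.lower().split())
    # Pick the agent whose keyword set overlaps the request the most.
    best = max(agents, key=lambda a: len(a.keywords & words))
    return best.handle(request)

agents = [
    Agent("hr", {"vacation", "payroll", "benefits"},
          lambda r: f"HR agent handling: {r}"),
    Agent("procurement", {"purchase", "vendor", "invoice"},
          lambda r: f"Procurement agent handling: {r}"),
]

print(route("Submit a vendor invoice for approval", agents))
```

A production orchestrator would typically replace the keyword heuristic with an LLM-based classifier, but the dispatch shape is the same.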

watsonx Orchestrate enables collaboration across agents and with companies’ existing technology investments, including their existing automations, APIs, data sources, and core applications, Gunnar stated.

On the monitoring front, IBM is offering tools to monitor AI performance and reliability and to help scale AI resources, according to Gunnar: “For instance, we have tools that help companies evaluate and select which AI models to use based on specific goals like cost-efficiency or performance,” she wrote. “We’re also developing tools that help with the discovery and orchestration of agents and tools, enabling tasks to be designed and executed more effectively. Additionally, our industry-leading AI governance capabilities can give companies greater visibility and control over agentic systems, helping manage accuracy, performance and risk.”
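As a concrete illustration of goal-based model selection, here is a hedged sketch; the model names, per-token costs, and quality scores are invented, and this is not IBM's evaluation tooling.

```python
# Hedged sketch of model selection by weighted goals (cost-efficiency vs.
# performance). The candidate models and their figures are made up.
models = {
    "small-model":  {"cost_per_1k_tokens": 0.0005, "quality": 0.72},
    "medium-model": {"cost_per_1k_tokens": 0.003,  "quality": 0.85},
    "large-model":  {"cost_per_1k_tokens": 0.03,   "quality": 0.93},
}

def score(m, cost_weight=0.5):
    # Normalize cost into a 0-1 "cheapness" score, then blend with quality.
    max_cost = max(v["cost_per_1k_tokens"] for v in models.values())
    cheapness = 1 - m["cost_per_1k_tokens"] / max_cost
    return cost_weight * cheapness + (1 - cost_weight) * m["quality"]

best = max(models, key=lambda name: score(models[name]))
print(best)  # shift cost_weight toward 0 to favor raw quality instead
```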

Another new integration solution, webMethods Hybrid Integration, is a platform that automates the integration of applications, APIs and data regardless of their location. The idea is to let agents work with and automate hybrid workflows, IBM stated. 

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin, and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

IBM introduces new generation of LinuxONE AI mainframe

In addition to generative AI applications, new multi-model AI approaches are engineered to enhance prediction and accuracy in many industry use cases, such as advanced fraud detection, image processing and retail automation, according to IBM. LinuxONE Emperor 5 also comes with advanced security features specifically designed for the AI threat
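For readers unfamiliar with the multi-model idea, here is a toy sketch of how combining independent models can sharpen a fraud prediction; the scoring rules and weights are invented for illustration and have no connection to IBM's implementation.

```python
# Illustrative sketch of a multi-model approach: blending the scores of
# several independent fraud models to improve accuracy. The models here
# are stand-in scoring functions, not IBM's implementation.
def rules_score(txn):
    # Hand-written rule: large foreign transactions look riskier.
    return 0.9 if txn["amount"] > 5000 and txn["foreign"] else 0.1

def velocity_score(txn):
    # Many transactions in a short window raises suspicion.
    return min(1.0, txn["txns_last_hour"] / 10)

def ensemble_score(txn, weights=(0.6, 0.4)):
    scores = (rules_score(txn), velocity_score(txn))
    return sum(w * s for w, s in zip(weights, scores))

txn = {"amount": 7200, "foreign": True, "txns_last_hour": 6}
print(f"fraud risk: {ensemble_score(txn):.2f}")  # flag above a threshold
```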

Read More »

Juniper extends Mist AI observability, performance management capabilities

“Unlike traditional solutions for digital twinning and synthetic testing, Marvis Minis don’t require manual configuration or any additional hardware or software. They are digital experience twins, now client-to-cloud available on all Juniper full-stack devices,” according to a data sheet from Juniper. “Marvis Minis are always on and constantly ingesting user

Read More »

StorONE and Phison partner for AI-native on-premises storage platform

The idea is to let customers train large language models — up to several billion parameters — on-premises with less hardware required, while offering a built-in conversational chatbot that helps users manage, query, and optimize storage configurations and performance in plain English. “In today’s AI-first world, infrastructure must evolve beyond raw

Read More »

ISO New England prepares for 11% rise in annual electricity consumption

The electrification of heating systems and the adoption of electric vehicles are expected to drive New England annual electricity consumption to rise 11% over the next decade, according to the region’s grid operator. “The ISO is forecasting steady growth in net annual energy use as state policy goals for carbon emissions reductions drive the increased electrification of heating systems and transportation in the region,” the regional grid operator said in a Thursday blog post about its 2025–2034 Forecast Report of Capacity, Energy, Loads, and Transmission.

Net annual energy usage in the region has trended down for the last two decades, the ISO said, “mainly due to an increase in energy efficiency from advanced cooling and heating technologies, energy-efficient appliances and lighting, and the increased prevalence of [behind-the-meter solar PV] generation. However, ISO New England is predicting that trend will reverse in the next decade.” The regional grid serves Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island and Vermont.

The report indicates net annual energy use across the region is expected to grow by 1.2% annually through 2034. Behind-the-meter solar will reduce consumption of grid-supplied electricity by 5,555 GWh this year, rising to 11,487 GWh in 2034, the ISO said. Electric vehicles are expected to account for 8,735 GWh of energy use in 2034, while heating electrification is expected to account for 8,049 GWh that year, the grid operator said.

Summer net peak demand is also expected to rise by 0.9% annually across the next 10 years. Behind-the-meter solar will reduce the summer peak by an average of 1,870 MW, the grid operator said. “Although in past years the summer peak typically occurred in the afternoon, widespread adoption of BTM PV has effectively pushed the peak toward evening,” ISO New England said. “This means that even as more BTM PV comes on
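As a quick arithmetic check, the two headline figures are mutually consistent, assuming the 1.2% annual growth compounds from 2025 through 2034:

```python
# Sanity check: 1.2% annual growth compounded over the 2025-2034 window
# (nine growth steps) lands close to the headline 11% decade-long rise.
annual_growth = 0.012
years = 2034 - 2025
total_rise = (1 + annual_growth) ** years - 1
print(f"{total_rise:.1%}")  # ~11.3%
```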

Read More »

Oil Drops on Powell Rate Comments

Oil slumped after Federal Reserve Chair Jerome Powell said officials aren’t in a hurry to adjust rates and as tensions between the US and China linger ahead of trade negotiations. West Texas Intermediate futures dipped 1.7% to settle near $58 a barrel as the Fed outlook and trade conflict weighed on the prospects for economic growth and energy demand.

President Donald Trump said he’s unwilling to preemptively lower tariffs on China, the world’s largest crude importer, to kickstart substantive trade talks. The comments come a day before Treasury Secretary Scott Bessent and Trade Representative Jamieson Greer are set to meet with Chinese officials. In the US, the Federal Reserve held interest rates steady at its meeting Wednesday, and Powell said tariffs could lead to higher inflation and unemployment.

“There is a fear that the trade negotiations in Switzerland with China could backfire and turn into a demand destruction event,” said Robert Yawger, director of the energy futures division at Mizuho Securities USA. Market sentiment that a rate cut isn’t coming anytime soon is also weighing on prices, he added.

Oil has trended lower since late January due to escalating trade frictions and plans by OPEC+ to keep boosting idled supply. Also weighing on crude, Trump said the US would stop its bombing campaign against the Houthis in Yemen after a ceasefire was facilitated by Oman. That followed earlier comments from Vice President JD Vance that a nuclear deal with Iran could see the OPEC member reintegrated into the global economy.

The decline in crude prices will likely lead to falling American shale output, according to Diamondback Energy Inc., the largest US independent oil producer in the Permian Basin. In another sign of the hit to output, the Energy Information Administration cut its forecast for US crude production this year for a second

Read More »

China Halts Imports of US Oil in Negative Sign for Shale Drillers

China halted purchases of US crude in March as trade tensions between the two countries ramped up, casting a shadow over demand for oil produced from American shale formations. The lack of such purchases by the world’s largest oil buyer compares with imports of 149,000 barrels of American crude a day in February, according to data from the US Census. China has bought US crude in every month since March 2020, except for August last year, when the country cut total purchases because of shaky economic growth.

China’s vanishing appetite for US crude is bad news for shale producers, which already are warning that American production is set to decline amid a prolonged period of low oil prices driven by shaky demand and increased OPEC output. US drillers pumped a record 13.2 million barrels a day last year, making the country the world’s largest crude producer. US President Donald Trump has imposed tariffs of as much as 145% on most goods from China, and Beijing has retaliated, slowing trade between the world’s two largest economies.

China already has been buying less crude overall because of economic uncertainty and weaker fuel demand as motorists switch to electric vehicles and trucks powered by liquefied natural gas. The country, which imported a record 481,000 barrels a day of US crude in 2020, slashed purchases by half last year.

Overall demand for US crude waned in March as refineries carried out planned maintenance ahead of the summer, when fuel demand rises. Exports fell to 4.04 million barrels a day in March, down 5.9% from the previous month.

Read More »

Administrators RSM confirm 162 jobs lost at failed Beam

Shuttered offshore and subsea technology company Beam has appointed RSM UK Restructuring Advisory as its administrators. The group’s partner and co-head of restructuring Damian Webb and restructuring partner David Shambrook will lead work to handle the failed Bristol-headquartered company. Employees of Beam, formerly Rovco, recently took to social media to confirm their shock at having suddenly been made redundant, with 162 of its 195 employees having lost their jobs.

The administrators are currently exploring ways to maximise returns for Beam’s creditors and to preserve jobs for staff where possible. Maritime technology and naval architecture consultancy Tymor Marine is one of the company’s creditors. Beam brought the firm onboard to mobilise some of its vessels, with Tymor designing the fastenings to secure a remotely operated vehicle (ROV) to the back of the ship’s deck. Speaking to Energy Voice, Tymor Marine managing director Kevin Moran said that the cost of the work as of the end of March was around £23,000. “But they also owe us money for approvals for some of their equipment, where we had to go through a third-party authority,” he said, accounting for around £4,000 of the figure.

Founded in 2015, Beam was formed when Rovco and Vaarst merged in September last year. It provided automated offshore wind operations, including development and maintenance, through the integration of artificial intelligence and machine learning technologies. Since launching, the two firms have raised close to £50 million from investors across three funding rounds in 2019, 2022 and 2024. However, reports claim the company’s most recent equity raise was unsuccessful, and an accelerated sale of the business and assets was unfruitful, leading to it entering administration. The company reported an £8.1m loss before tax in 2023, following an £8.7m loss in 2022. In addition to its Bristol office, the company has

Read More »

EPCOR Sells Texas Gas Utility to SiEnergy

EPCOR USA Inc., part of EPCOR Utilities Inc. (EPCOR), has signed a deal to divest its natural gas utility in Texas to SiEnergy Operating, a subsidiary of Northwest Natural Holding Co., for $60 million. Hughes Gas Resources Inc., or EPCOR Texas Gas, has about 6,900 metered connections and 353 miles of pipeline serving 12 communities northeast of Houston. “EPCOR Texas Gas, which has owned and operated the assets since 2017, is expected to have a rate base of approximately $46 million at the end of 2025”, a joint statement said. The parties expect to close the transaction this quarter.

“With a similar business model as SiEnergy, Hughes has grown organically by providing infrastructure to residential and commercial developments in high-growth areas surrounding Houston”, commented Northwest Natural chief executive Justin Palfreyman. “EPCOR Texas Gas is a great fit for us as we continue to expand and diversify our overall utility portfolio in Texas”. EPCOR Utilities president John Elford said the sale is part of the company’s drive to focus on water opportunities in the state. “Our customers in Texas will benefit from SiEnergy’s substantial footprint in natural gas utilities while EPCOR continues to focus on growing its water business in the United States”, said Joe Gysel, EPCOR senior vice president for North American commercial services.

Besides SiEnergy, Northwest Natural also owns Northwest Natural Gas Co., which provides natural gas services to around 2 million people in over 140 communities through 800,000 meters in Oregon and Southwest Washington. Another Northwest Natural subsidiary, NW Natural Renewables Holdings, is investing in the production of low-carbon renewable natural gas from food, agricultural and forestry waste, landfills and wastewater. Meanwhile, NW Natural Water Co. delivers water and wastewater utility services in Arizona, Idaho, Oregon, Texas and Washington. EPCOR USA provides water, wastewater, wholesale water and natural

Read More »

India’s First Offshore Decommissioning Project Completed

The Panna-Mukta and Tapti (PMT) joint venture partners, Shell PLC, Reliance Industries Limited (RIL), and Oil and Natural Gas Corporation Limited (ONGC), have completed India’s first offshore facilities decommissioning project. According to ONGC, the project entailed the removal of mid and south Tapti field facilities, which ceased production in March 2016. The PMT joint venture operates the Tapti fields under a production sharing contract with the Indian government. ONGC holds a 40 percent share, while RIL and BG Exploration & Production India Ltd (BGEPIL-Shell) each hold 30 percent.

ONGC said the project also involved the removal of five wellhead platforms, associated infield pipelines, load-in at the onshore dismantling yard and the safe plugging and abandonment of 38 wells – all executed in line with the approved decommissioning plan. “The project’s complexity, especially its proximity to ONGC’s live assets, demanded strategic planning, precise execution, and utmost focus on safety. It marks a defining moment in India’s energy landscape and sets a strong foundation for the next chapter in offshore infrastructure transformation”, Pankaj Kumar, Production Director at ONGC, said.

To support the government’s “Make in India” vision, the PMT JV awarded contracts to Indian firms Larsen and Toubro (L&T) for offshore work and Chowgule Shipyard (CLSPL) for onshore dismantling. Offshore operations have been completed, while dismantling is ongoing at CLSPL’s Ratnagiri facilities, ONGC said. The Tapti decommissioning project has also set a benchmark for India’s regulatory and operational framework for offshore decommissioning, developed in collaboration with key stakeholders like MoPNG, DGH, and OISD, following global best practices adapted for local conditions, ONGC said. “The safe and successful completion of the Tapti offshore project is a landmark moment for India’s offshore energy sector. This project sets a new benchmark for responsible decommissioning, made possible by global expertise, strong collaboration, and an unwavering commitment to

Read More »

Liquid cooling technologies: reducing data center environmental impact

“Highly optimized cold-plate or one-phase immersion cooling technologies can perform on par with two-phase immersion, making all three liquid-cooling technologies desirable options,” the researchers wrote.

Factors to consider

There are numerous factors to consider when adopting liquid cooling technologies, according to Microsoft’s researchers. First, they advise performing a full environmental, health, and safety analysis, and an end-to-end life cycle impact analysis. “Analyzing the full data center ecosystem to include systems interactions across software, chip, server, rack, tank, and cooling fluids allows decision makers to understand where savings in environmental impacts can be made,” they wrote. It is also important to engage with fluid vendors and regulators early, to understand chemical composition, disposal methods, and compliance risks. And associated socioeconomic, community, and business impacts are equally critical to assess.

More specific environmental considerations include ozone depletion and global warming potential; the researchers emphasized that operators should only use fluids with low to zero ozone depletion potential (ODP) values, and not hydrofluorocarbons or carbon dioxide. It is also critical to analyze a fluid’s viscosity (thickness or stickiness), flammability, and overall volatility. And operators should only use fluids with minimal bioaccumulation (the buildup of chemicals in lifeforms, typically in fish) and terrestrial and aquatic toxicity.

Finally, once up and running, data center operators should monitor server lifespan and failure rates, tracking performance uptime and adjusting IT refresh rates accordingly.

Read More »

Cisco unveils prototype quantum networking chip

Clock synchronization allows for coordinated, time-dependent communications between endpoints, such as cloud databases or large global databases, that might sit across the country or across the world, he said. “We saw recently when we were visiting Lawrence Berkeley Labs where they have all of these data sources such as radio telescopes, optical telescopes, satellites, the James Webb platform. All of these end points are taking snapshots of a piece of space, and they need to synchronize those snapshots to the picosecond level, because you want to detect things like meteorites, something that is moving faster than the rotational speed of planet Earth. So the only way you can detect that quickly is if you synchronize these snapshots at the picosecond level,” Pandey said.

For security use cases, the chip can ensure that if an eavesdropper tries to intercept the quantum signals carrying the key, they will likely disturb the state of the qubits; this disturbance can be detected by the legitimate communicating parties, and the link will be dropped, protecting the sender’s data. This feature is typically implemented in a quantum key distribution (QKD) system. Location information can serve as a critical credential for systems to authenticate control access, Pandey said.

The prototype quantum entanglement chip is just part of the research Cisco is doing to accelerate practical quantum computing and the development of future quantum data centers. The quantum data center that Cisco envisions would have the capability to execute numerous quantum circuits, feature dynamic network interconnection, and utilize various entanglement generation protocols. The idea is to build a network connecting a large number of smaller processors in a controlled environment, the data center warehouse, and provide them as a service to a larger user base, according to Cisco. The challenges for quantum data center network fabric
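To make the eavesdropper-detection idea concrete, here is a toy simulation of the QKD principle described above; the error rates and the 11% abort threshold are illustrative assumptions drawn from BB84-style protocols, not Cisco's chip.

```python
import random

# Toy illustration of QKD eavesdropper detection: interception disturbs
# the quantum states, which shows up as an elevated error rate when the
# two parties compare a sample of their key bits. Numbers are illustrative.
def exchange_key(n_bits, eavesdropper=False):
    alice = [random.randint(0, 1) for _ in range(n_bits)]
    # Interception corrupts ~25% of sampled bits in BB84-style protocols;
    # assume a 1% baseline channel noise otherwise.
    flip = 0.25 if eavesdropper else 0.01
    bob = [b ^ (random.random() < flip) for b in alice]
    return alice, bob

def error_rate(a, b, sample=200):
    idx = random.sample(range(len(a)), sample)
    return sum(a[i] != b[i] for i in idx) / sample

alice, bob = exchange_key(10_000, eavesdropper=True)
qber = error_rate(alice, bob)
print(f"QBER: {qber:.1%} -> {'drop link' if qber > 0.11 else 'key accepted'}")
```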

Read More »

Zyxel launches 100GbE switch for enterprise networks

Port specifications include:

- 48 SFP28 ports supporting dual-rate 10GbE/25GbE connectivity
- 8 QSFP28 ports supporting 100GbE connections
- Console port for direct management access

Layer 3 routing capabilities include static routing with support for access control lists (ACLs) and VLAN segmentation. The switch implements IEEE 802.1Q VLAN tagging, port isolation, and port mirroring for traffic analysis. For link aggregation, the switch supports IEEE 802.3ad for increased throughput and redundancy between switches or servers.

Target applications and use cases

The CX4800-56F targets multiple deployment scenarios where high-capacity backbone connectivity and flexible port configurations are required. “This will be for service providers initially or large deployments where they need a high capacity backbone to deliver a primarily 10G access layer to the end point,” explains Nguyen. “Now with Wi-Fi 7, more 10G/25G capable POE switches are being powered up and need interconnectivity without the bottleneck. We see this for data centers, campus, MDU (Multi-Dwelling Unit) buildings or community deployments.”

Management is handled through Zyxel’s NebulaFlex Pro technology, which supports both standalone configuration and cloud management via the Nebula Control Center (NCC). The switch includes a one-year professional pack license providing IGMP technology and network analytics features. The SFP28 ports maintain backward compatibility between 10G and 25G standards, enabling phased migration paths for organizations transitioning between these speeds.
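For a sense of scale, the port counts above imply the following aggregate bandwidth, assuming all SFP28 ports run at their 25GbE rate (my arithmetic, not a vendor spec):

```python
# Back-of-envelope aggregate port bandwidth for the CX4800-56F as described:
# 48 SFP28 ports at 25GbE plus 8 QSFP28 ports at 100GbE.
sfp28 = 48 * 25   # Gbps
qsfp28 = 8 * 100  # Gbps
total = sfp28 + qsfp28
print(f"{total} Gbps aggregate ({total * 2} Gbps full duplex)")  # 2000 / 4000
```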

Read More »

Engineers rush to master new skills for AI-driven data centers

According to the Uptime Institute survey, 57% of data centers are increasing salary spending. Data center job roles that saw the highest increases were in operations management – 49% of data center operators said they saw the highest increases in this category – followed by junior and mid-level operations staff at 45%, and senior management and strategy at 35%. Other job categories that saw salary growth were electrical, at 32%, and mechanical, at 23%.

Organizations are also paying premiums on top of salaries for particular skills and certifications. Foote Partners tracks pay premiums for more than 1,300 certified and non-certified skills for IT jobs in general. The company doesn’t segment the data based on whether the jobs themselves are data center jobs, but it does track 60 skills and certifications related to data center management, including skills such as storage area networking, LAN, and AIOps, and 24 data center-related certificates from Cisco, Juniper, VMware and other organizations.

“Five of the eight data center-related skills recording market value gains in cash pay premiums in the last twelve months are all AI-related skills,” says David Foote, chief analyst at Foote Partners. “In fact, they are all among the highest-paying skills for all 723 non-certified skills we report.” These skills bring in 16% to 22% of base salary, he says. AIOps, for example, saw an 11% increase in market value over the past year, now bringing in a premium of 20% over base salary, according to Foote data. MLOps now brings in a 22% premium. “Again, these AI skills have many uses of which the data center is only one,” Foote adds. The percentage increase in the specific subset of these skills in data center jobs may vary. The Uptime Institute survey suggests that the higher pay is motivating workers to stay in the

Read More »

ExtraHop looks to eliminate ‘extra hops’ in NDR stack

This deep visibility allows ExtraHop to provide insights across the entire network stack, from basic connectivity to application-level transactions. “The benefit of going all the way through Layer 7 is I can actually see a database transaction going through on the wire,” Vasani said. “If you have application teams complaining about database query latency, we can map it to what session was that tied to and what flows was it tied to from a network perspective and is this really an app server issue, or is it a network issue, or is it an endpoint issue?”

The new sensor integrates with ExtraHop’s RevealX platform, feeding telemetry into the company’s cloud-scale ML/AI engine that powers its detection and analysis capabilities. “The sensor collects the telemetry, feeds it into an ML/AI engine that sits in the cloud, and then we layer in workflow engines on top to enable the various use cases,” Vasani said.

In modern distributed enterprise environments, network visibility must extend beyond traditional data centers. ExtraHop’s all-in-one sensor is designed to address this reality with deployment options that span physical appliances, virtual machines and cloud environments. ExtraHop has both virtual and physical hardware appliances for sensor deployment. ExtraHop sensors can plug into a network through multiple methods, including a network TAP, a SPAN (Switched Port Analyzer) port, a packet broker, or a cloud provider’s vTAP capabilities.
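Here is a schematic sketch of the Layer 7-to-flow correlation Vasani describes; the records, fields, and thresholds are invented stand-ins, not ExtraHop's data model.

```python
# Sketch of correlating application-level (Layer 7) database transactions
# with their underlying network flows, so slow queries can be attributed
# to the app, the network, or the endpoint. All records are invented.
transactions = [
    {"session": "s1", "query_ms": 480, "flow": ("10.0.0.5", "10.0.1.9")},
    {"session": "s2", "query_ms": 35,  "flow": ("10.0.0.6", "10.0.1.9")},
]
flows = {
    ("10.0.0.5", "10.0.1.9"): {"retransmits": 42, "rtt_ms": 180},
    ("10.0.0.6", "10.0.1.9"): {"retransmits": 0,  "rtt_ms": 2},
}

for t in (t for t in transactions if t["query_ms"] > 100):  # slow queries
    net = flows[t["flow"]]
    # High retransmits/RTT implicate the network; otherwise suspect the app.
    culprit = "network" if net["retransmits"] > 10 else "app server"
    print(f"{t['session']}: {t['query_ms']} ms -> likely {culprit} issue")
```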

Read More »

AI’s energy appetite drives interest in nuclear power

In its new report, Deloitte said that its analysis of figures from the World Nuclear Association, the American Nuclear Society, the U.S. Department of Energy, and others showed that new nuclear power could potentially meet about 10% of the projected increase in data center demand over the next decade, assuming capacity is also significantly expanded by between 35GW and 62GW, and 30% of the expansion is earmarked for data centers.

“Nuclear energy presents a potential solution for meeting some of the growing electricity demands of data centers, with its reliable and clean energy profile,” Deloitte’s report said, noting five key advantages of the technology:

- Reliable baseload power: Nuclear reactors operate 24/7, regardless of the weather, providing the reliable power so important to data centers. In addition, Deloitte said, “Their capacity factor, exceeding 92.5%, outperforms other sources like natural gas (56%) and renewables like wind (35%) and solar (25%).”
- High energy density: A small amount of fuel generates a lot of power, which minimizes the need for fuel storage and transportation. “This efficiency can translate to a smaller physical footprint and enhanced sustainability,” Deloitte said.
- Scalable power output: A full-sized reactor typically generates 800 megawatts (MW) or more of electricity, which accommodates the needs of large data centers.
- Low carbon emissions: Nuclear power plants produce virtually no greenhouse gas emissions during operation.
- Enhanced land use efficiency: Compared to other energy sources, nuclear power plants require relatively little land.

Gartner’s Johnson echoed these advantages, and also predicted that nuclear energy, and small modular reactors (SMRs) in particular, will “provide a viable answer” to the question of what to do when electricity demand exceeds supply. They can, he said, “ensure independence from grid power fluctuations by providing dedicated on-site power for large data centers.” However, both Gartner and Deloitte also highlighted challenges in
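Those capacity factors translate directly into annual energy output. A quick worked example using the article's 800 MW reactor figure as the nameplate capacity for every source (the arithmetic is mine, not Deloitte's):

```python
# Annual energy from a capacity factor:
#   MWh per year = nameplate MW x 8760 hours x capacity factor
nameplate_mw = 800  # the "full-sized reactor" figure from the article
for source, cf in [("nuclear", 0.925), ("natural gas", 0.56),
                   ("wind", 0.35), ("solar", 0.25)]:
    mwh = nameplate_mw * 8760 * cf
    print(f"{source:>11}: {mwh / 1e6:.2f} TWh/yr")  # nuclear ~6.48 TWh/yr
```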

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs).

In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion.

The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas, and is back at CES 2025 with more autonomous tractors and other vehicles.

This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd).

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved.

“Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
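The LLM-as-judge pattern mentioned above is straightforward to sketch. Below, `call_model` is a deliberate stub — wire it to whichever completion API you actually use; the rubric and the best-of-n helper are illustrative assumptions, not any vendor's API.

```python
# Minimal sketch of the LLM-as-judge pattern: one model grades another
# model's answer against a rubric. `call_model` is a stub -- replace it
# with a real completion API call.
def call_model(prompt: str) -> str:
    raise NotImplementedError("replace with a real completion API call")

JUDGE_PROMPT = """Rate the ANSWER to the QUESTION from 1-5 for factual
accuracy and completeness. Reply with only the number.
QUESTION: {q}
ANSWER: {a}"""

def judge(question: str, answer: str) -> int:
    reply = call_model(JUDGE_PROMPT.format(q=question, a=answer))
    return int(reply.strip())

def best_of_n(question: str, candidates: list[str]) -> str:
    # Cheaper models make it affordable to sample several candidate
    # answers and keep the one the judge scores highest.
    return max(candidates, key=lambda a: judge(question, a))
```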

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more.

The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.

What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
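Based only on the description in the second paper's title — auto-generated rewards plus multi-step reinforcement learning — here is a loose, hypothetical sketch of that loop's shape; every function is a stub, and none of this is OpenAI's actual framework.

```python
# Loose sketch of an automated red-teaming loop: generate a candidate
# attack, score it with an automated reward (did the target misbehave,
# and is the attack novel?), and use the score as the RL training signal
# for the attacker model. All model calls are stubs.
def attacker_generate(history: list[str]) -> str:
    raise NotImplementedError("attacker LLM proposes a new attack prompt")

def target_responds_unsafely(attack: str) -> bool:
    raise NotImplementedError("rule- or classifier-based safety check")

def similarity(a: str, b: str) -> float:
    # Crude word-overlap similarity; real systems use learned embeddings.
    shared = set(a.split()) & set(b.split())
    return len(shared) / max(len(set(a.split())), 1)

def novelty(attack: str, found: list[str]) -> float:
    # Diversity term: reward attacks unlike ones already discovered.
    return 1 - max((similarity(attack, f) for f in found), default=0.0)

def red_team_step(found: list[str]) -> float:
    attack = attacker_generate(found)
    reward = float(target_responds_unsafely(attack)) * novelty(attack, found)
    if reward > 0.5:
        found.append(attack)  # keep effective, novel attacks
    return reward  # fed back to the attacker as its RL reward
```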

Read More »