Stay Ahead, Stay ONMINE

IBM targets mainframe customers with prebuilt AI training modules


If a customer has an existing model or LLM, synthetic data provides additional data that is rich, labeled, and diverse to fine-tune the AI model. If a client does not have a model, the Synthetic Data Sets are designed to offer quick and privacy-compliant training data to create models from scratch, the vendor stated.
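The "rich, labeled, and diverse" training data described above can be illustrated with a minimal sketch. This is not IBM's Synthetic Data Sets product; the record schema and the labeling rule below are invented purely for illustration. The point is that synthetic records carry labels and touch no real customer data:

```python
import random

def synthetic_transactions(n, seed=42):
    """Generate labeled synthetic records. No real customer data is
    involved, so the output is privacy-safe by construction."""
    rng = random.Random(seed)  # seeded for reproducibility
    rows = []
    for _ in range(n):
        amount = round(rng.uniform(1.0, 5000.0), 2)
        hour = rng.randint(0, 23)
        cross_border = rng.random() < 0.15
        # Invented labeling rule: large cross-border transfers at odd
        # hours are flagged, mimicking a fraud-detection signal.
        fraud = int(amount > 3000 and cross_border and (hour < 6 or hour > 22))
        rows.append({"amount": amount, "hour": hour,
                     "cross_border": cross_border, "fraud": fraud})
    return rows

data = synthetic_transactions(1000)
```

Because generation is seeded, the same call reproduces the same dataset, which is useful when comparing fine-tuning runs.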

Customers can deploy models on IBM Z and IBM LinuxONE with AI Toolkit for IBM Z and IBM LinuxONE, Cloud Pak for Data on Z, or Machine Learning for z/OS, wrote Elpida Tzortzatos, an IBM Fellow and Z architect, and Tina Tarquinio, IBM vice president, in a blog about the news. They can “perform inference on IBM z16 and IBM LinuxONE 4, leveraging hardware acceleration investments and data gravity to dramatically enhance AI inferencing speed and scale,” the authors wrote.

“In addition, customers can enhance predictive AI models and fine-tune LLMs with additional rich and broad data, leading to significant cost savings in areas such as fraud detection and money laundering prevention,” the authors wrote.

Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin, and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Chinese AI firm trains state-of-the-art model entirely on Huawei chips

The pricing positions GLM-Image as a cost-effective option for enterprises generating marketing materials, presentations, and other text-heavy visual content at scale.

Technical approach and benchmark performance

GLM-Image employs a hybrid architecture combining a 9-billion-parameter autoregressive model with a 7-billion-parameter diffusion decoder, according to Zhipu’s technical report. The autoregressive component handles

Read More »

BPCL lets contracts for expansions at Bina, Mumbai refineries

Bharat Petroleum Corp. Ltd. (BPCL) has awarded separate contracts to Technip Energies NV for delivery of major works on key projects designed to support expanded production of petrochemicals at two of the operator’s Indian refineries. Under a first contract revealed on Jan. 7, Technip Energies said it will provide engineering,

Read More »

Saudi Aramco to resume Perro Negro 7 offshore operations


Read More »

NRF 2026: HPE expands network, server products for retailers

The package also integrates information from HPE Aruba Networking User Experience Insight sensors and agents, which now include support for WiFi 7 networks. The combination can measure end-user activity and allow IT teams to baseline network performance, continuously test network health, track trends, and plan for device growth and AI-native

Read More »

Record Offshore Wind Auction Boosts UK Hopes for 2030 Goal

Britain stepped up support for offshore wind in the latest subsidy auction, showing the government is still determined to meet its ambitious 2030 clean-power goal even as costs rise. The 8.2 gigawatts of offshore wind beat analysts’ expectations and will boost the likelihood of the government delivering on its promise to almost totally exit fossil fuels in power generation. The UK now needs around 7 gigawatts of new capacity in the next auction, which is the last realistic chance to get projects built in time. The government will pay developers more for projects won in this auction compared with last year, a cost that’s ultimately paid for by consumers. It creates a difficult balancing act for Prime Minister Keir Starmer, who has pledged to cut household bills during the current parliament. “With these results, Britain is taking back control of our energy sovereignty,” said Energy Secretary Ed Miliband in a statement. The results deliver the biggest single procurement of offshore wind energy in British and European history, according to the statement. The auction secured capacity at a price of £65.45 ($88) per megawatt-hour in 2012 prices, a commonly used benchmark, or £91.20 in 2024 terms, accounting for some inflation. This price, higher than in last year’s auction, still represents a “net benefit to bills over the next decade,” according to analysis from Aurora Energy Research. RWE AG was the major winner, involved in all but one of the projects that won. Separately, RWE said it has agreed a deal with KKR & Co to develop, construct and operate the Norfolk Vanguard East and Norfolk Vanguard West projects, which were awarded contracts in the auction. Another winner, RWE’s Dogger Bank South, doesn’t yet have planning permission, which means it may not be built in time to meet the 2030 goal. RWE’s

Read More »

GasBuddy Reveals 2026 USA Gasoline Price Forecast

In a report published recently, GasBuddy revealed its average U.S. gasoline price prediction for 2026. According to this report, the company expects the U.S. gasoline price to come in at $2.97 per gallon this year and sees December as the month with the lowest average U.S. gasoline price in 2026, at $2.83 per gallon. GasBuddy expects May to see the highest average U.S. gasoline price in 2026, at $3.12 per gallon, the report outlined. “GasBuddy’s forecast projects the national average price of gas to fall to $2.97 per gallon in 2026, marking the fourth consecutive annual decline and the lowest average since 2020,” Patrick De Haan, Head of Petroleum Analysis at GasBuddy, said in the report. “This continued decrease reflects the unwinding of post-pandemic market distortions, expanding global refining capacity, and more stable supply chains,” he added. “While the drop from 2025 is modest compared to previous years, it underscores a meaningful shift toward greater overall market stability,” he continued. A statement accompanying the release of the report posted on GasBuddy’s website highlighted that the U.S. gasoline price averaged $3.102 per gallon in 2025. GasBuddy also pointed out in that statement that it is forecasting the yearly U.S. average price of gasoline to fall back below $3 per gallon for the first time since the Covid-19 pandemic. In the report, GasBuddy projected that average household spending on gasoline will come in at $2,083 in 2026. That’s the lowest figure since 2021, which saw an average household gasoline spend of $1,979, the report showed.  De Haan went on to project in the report that gasoline prices “are expected to follow a traditional seasonal pattern in 2026, with imbalances left behind by Covid and geopolitical tensions balanced for the time being”. “The national average is projected to briefly rise into the low

Read More »
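The GasBuddy figures above can be cross-checked with plain arithmetic on the article's own numbers; note the implied-consumption figure is our inference, not something GasBuddy states:

```python
# Figures from the GasBuddy forecast (all $/gallon unless noted)
avg_2026 = 2.97          # projected 2026 national average
household_spend = 2083   # projected 2026 average household spend, $

# Implied annual household consumption at the average price
implied_gallons = household_spend / avg_2026   # about 701 gallons

# Projected seasonal swing between the May peak and the December low
swing = round(3.12 - 2.83, 2)                  # $0.29 per gallon
```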

DOE, NASA Advance Partnership to Enable Nuclear Power on Moon

The United States Department of Energy (DOE) and the National Aeronautics and Space Administration (NASA) on Tuesday announced a memorandum of understanding (MOU) renewing their commitment to developing a lunar power system using fission by 2030. The collaboration aims to enable sustained NASA missions on the Moon – though radioisotope systems have already powered long-term U.S. space missions for decades, according to DOE. “Thanks to President Trump’s leadership and his America First Space Policy, the department is proud to work with NASA and the commercial space industry on what will be one of the greatest technical achievements in the history of nuclear energy and space exploration”, Energy Secretary Chris Wright declared. The agencies eye deploying a surface power system able to operate for years without refueling. “The deployment of a lunar surface reactor will enable future sustained lunar missions by providing continuous and abundant power, regardless of sunlight or temperature”, DOE and NASA said. “Under President Trump’s national space policy, America is committed to returning to the Moon, building the infrastructure to stay and making the investments required for the next giant leap to Mars and beyond”, said NASA Administrator Jared Isaacman. “Achieving this future requires harnessing nuclear power. “This agreement enables closer collaboration between NASA and the Department of Energy to deliver the capabilities necessary to usher in the golden age of space exploration and discovery”.

Westinghouse Contract

Before the MOU, DOE and NASA had already contracted Westinghouse Electric Co LLC to develop a space microreactor design under the agencies’ Fission Surface Power (FSP) project.
On January 7, 2025, Pennsylvania-based Westinghouse announced a new contract that builds on “the successful design work Westinghouse completed during phase 1 to optimize its contributions to the design of FSP systems and their configuration, and begin testing of critical technology elements”. “The continued progress

Read More »

Texas Upstream Employment, Job Postings Drop

Texas upstream employment and job postings declined in the fourth quarter of 2025. That’s what the Texas Independent Producers and Royalty Owners Association (TIPRO) said in a statement sent to Rigzone on Friday, which TIPRO outlined corresponded with the latest Current Employment Statistics (CES) report from the U.S. Bureau of Labor Statistics (BLS) and provided “additional insight on markets trends”. TIPRO noted in the statement that, due to the federal government shutdown and suspension of related services last year, the CES report from the BLS was delayed until the government resumed operations. TIPRO highlighted that, on Friday, CES data was released simultaneously for the months of October and November 2025. “According to … TIPRO, employment in the Texas upstream sector declined between October and November 2025,” TIPRO said in the statement. The organization noted in the statement that oil and natural gas extraction jobs increased “modestly” by 100 to 69,600, which it pointed out was a 0.1 percent month on month increase, “buoyed by Permian Basin efficiencies”. Support activities employment fell by 3,600 to 131,600, a drop of 2.7 percent month on month, TIPRO outlined in the statement, “amid rig count erosion (down 7.6 percent year on year) and service sector streamlining”. “Combined upstream employment decreased by 3,500 jobs to 201,200 (-1.7 percent month on month),” TIPRO highlighted. In its statement, TIPRO noted that, from January to November 2025, employment in the Texas upstream sector “displayed early resilience followed by late-year softening”. “Oil and gas extraction added a net 1,400 jobs (+2.1 percent), peaking at 70,200 in June and July before a -400 dip from August to November, driven by robust Permian production but offset by layoffs and lower oil prices,” TIPRO stated. “Support activities employment saw a net loss of 3,700 jobs (-2.7 percent), with a February-May surge (+2,800) undone by mid-year declines (-3,400

Read More »

Norway Offers 57 New Production Licenses to 19 Companies

The Norwegian Ministry of Energy announced, in a statement posted on its website on Tuesday, that it has offered 57 new production licenses to 19 companies on the Norwegian Continental Shelf in the APA (Awards in Predefined Areas) 2025 licensing round. Of the 57 production licenses offered in APA 2025, 31 are located in the North Sea, 21 in the Norwegian Sea, and five in the Barents Sea, the statement highlighted. Equinor Energy AS was offered the highest number of combined parts in licenses and operatorships, with 52, followed by Aker BP ASA, with 34, and DNO Norge AS, with 21, the statement revealed. A complete list of offers, showing parts/operatorships, as shown on the ministry’s site, can be seen below:

Aker BP ASA (22/12)
Concedo AS (2/1)
ConocoPhillips Skandinavia AS (1/1)
DNO Norge AS (17/4)
Equinor Energy AS (35/17)
Harbour Energy Norge AS (9/4)
INPEX Idemitsu Norge AS (5/1)
Japex Norge AS (2/0)
Lime Petroleum AS (1/0)
OKEA ASA (3/1)
OMV (Norge) AS (4/2)
Orlen Upstream Norway AS (6/0)
Pandion Energy Norge AS (1/0)
Petrolia NOCO AS (1/1)
Repsol (2/2)
Source Energy AS (2/0)
TotalEnergies EP Norge AS (1/0)
Vår Energi ASA (14/6)
Wellesley Petroleum AS (5/5)

All petroleum licensing rounds are carried out within the framework established by the Norwegian Parliament for where new production licenses may be awarded, the ministry’s statement noted, adding that APA is an annual exploration round for the Norwegian continental shelf. “The APA rounds are carried out within a fixed area, the APA area, which is expanded on the basis of petroleum professional assessments and in accordance with a fixed annual cycle,” the statement highlighted. “The APA area comprises the majority of the opened, available acreage on the continental shelf, including areas in the North Sea, the Norwegian Sea, and the Barents Sea,” it

Read More »

Banks in Talks to Lend $1B for Argentina Gas Pipeline

A group of banks including JPMorgan Chase & Co. and Citigroup Inc. are in talks to lend natural gas producers in Argentina roughly $1 billion to build a cross-country pipeline, according to two people familiar with the matter.  The banks, which also include Banco Santander SA, are negotiating the syndicated loan with a consortium led by Pan American Energy Group after a similar deal was struck last year for a pipeline and port dedicated to shale oil exports. That project, known as VMOS, is currently under construction. More banks may join the financing for the gas pipeline, the people added.  Pan American, which is half-owned by British oil major BP Plc, holds a 30 percent stake in the consortium, called Southern Energy SA. Argentina’s state-run energy giant YPF SA owns 25 percent. Three other companies, Pampa Energia SA, UK-based Harbour Energy Plc and Golar LNG Ltd. also have smaller stakes in the project.  Negotiations are ongoing and terms could still change before an agreement finalizes. JPMorgan and Citi declined to comment. Santander and Pan American didn’t respond to requests for comment.  Argentina’s Vaca Muerta shale patch is growing fast as President Javier Milei’s free-market reforms have opened up the energy industry to global credit, unleashing investments. The $2 billion loan for the oil pipeline was the biggest project financing in Argentina’s history, according to JPMorgan. Southern Energy is now aiming to unlock the Vaca Muerta’s gas potential with Argentina’s first floating liquefaction terminal for natural gas. The pipeline would transport natural gas from Vaca Muerta to the terminal on the Atlantic coast. Argentina holds the world’s second-biggest resources of shale gas, and its daily production averaged the equivalent of about 550,000 barrels last year. The consortium’s first leased liquefaction vessel, Hilli Episeyo, is set to start production at the end of 2027.

Read More »

Microsoft tells communities it will ‘pay its way’ as AI data center resource usage sparks backlash

It will work with utilities and public commissions to set the rates it pays high enough to cover data center electricity costs (including build-outs, additions, and active use). “Our goal is straightforward: To ensure that the electricity cost of serving our data centers is not passed on to residential customers,” Smith emphasized. For example, the company is supporting a new rate structure in Wisconsin that would charge a class of “very large customers,” including data centers, the true cost of the electricity required to serve them.

It will collaborate “early, closely, and transparently” with local utilities to add electricity and supporting infrastructure to existing grids when needed. For instance, Microsoft has contracted with the Midcontinent Independent System Operator (MISO) to add 7.9GW of new electricity generation to the grid, “more than double our current consumption,” Smith noted.

It will pursue ways to make data centers more efficient. For example, it is already experimenting with AI to improve planning, extract more electricity from existing infrastructure, improve system resilience, and speed development of new infrastructure and technologies (like nuclear energy).

It will advocate for state and national public policies that ensure electricity access that is affordable, reliable, and sustainable in neighboring communities. Microsoft previously established priorities for electricity policy advocacy, Smith noted, but “progress has been uneven. This needs to change.”

Microsoft is similarly committed when it comes to data center water use, promising four actions:

Reducing the overall amount of water its data centers use, initially improving it by 40% by 2030. The company is exploring innovations in cooling, including closed-loop systems that recirculate cooling liquids.

It will collaborate with local utilities to map out water, wastewater, and pressure needs, and will “fully fund” infrastructure required for growth.
For instance, in Quincy, Washington, Microsoft helped construct a water reuse utility that recirculates

Read More »

Can retired naval power plants solve the data center power crunch?

HGP’s plan includes a revenue share with the government, and the company would create a decommissioning fund, according to Bloomberg. The alternative? After a lengthy decommissioning process, the reactors are shipped to a remote storage facility in Washington state to gather dust along with dozens of other retired nuclear reactors. So the carrier itself isn’t going to be turned into a data center, but its power plants are being proposed for a data center on land. And even with the lengthy decommissioning process, that’s still faster than building a nuclear power plant from scratch. Don’t hold your breath, says Kristen Vosmaer, managing director, JLL Work Dynamics Data Center team. The idea of converting USS Nimitz’s nuclear reactors to power AI data centers sounds compelling but faces insurmountable obstacles, he argues. “Naval reactors use weapons-grade uranium that civilian entities cannot legally possess, and the Nuclear Regulatory Commission has no pathway to license such facilities. Even setting aside the fuel issue, these military-designed systems would require complete reconstruction to meet civilian safety standards, eliminating any cost advantages over purpose-built nuclear plants,” Vosmaer said. The maritime concept itself, however, does have some merit, said Vosmaer. “Ocean cooling can reduce energy consumption compared to land-based data centers, and floating platforms offer positioning flexibility that fixed facilities cannot match,” Vosmaer said.

Read More »

What exactly is an AI factory?

Others, however, seem to use the word to mean something smaller than a data center, referring more to the servers, software, and other systems used to run AI. For example, the AWS AI Factory is a combination of hardware and software that runs on-premises but is managed by AWS and comes with AWS services such as Bedrock, networking, storage and databases, and security.  At Lenovo, AI factories appear to be packaged servers designed to be used for AI. “We’re looking at the architecture being a fixed number of racks, all working together as one design,” said Scott Tease, vice president and general manager of AI and high-performance computing at Lenovo’s infrastructure solutions group. That number of racks? Anything from a single rack to hundreds, he told Computerworld. Each rack is a little bigger than a refrigerator, comes fully assembled, and is often fully preconfigured for the customer’s use case. “Once it arrives at the customer site, we’ll have service personnel connect power and networking,” Tease said. For others, the AI factory concept is more about the software.

Read More »

Meta establishes Meta Compute to lead AI infrastructure buildout

At that scale, infrastructure constraints are becoming a binding limit on AI expansion, influencing decisions like where new data centers can be built and how they are interconnected. The announcement follows Meta’s recent landmark agreements with Vistra, TerraPower, and Oklo aimed at supporting access to up to 6.6 gigawatts of nuclear energy to fuel its Ohio and Pennsylvania data center clusters.

Implications for hyperscale networking

Analysts say Meta’s approach indicates how hyperscalers are increasingly treating networking and interconnect strategy as first-order concerns in the AI race. Tulika Sheel, senior vice president at Kadence International, said that Meta’s initiative signals that hyperscale networking will need to evolve rapidly to handle massive internal data flows with high bandwidth and ultra-low latency. “As data centers grow in size and GPU density, pressure on networking and optical supply chains will intensify, driving demand for more advanced interconnects and faster fiber,” Sheel added. Others pointed to the potential architectural shifts from this. “Meta is using Disaggregated Scheduled Fabric and Non-Scheduled Fabric, along with new 51 Tbps switches and Ethernet for Scale-Up Networking, which is intensifying pressure on switch silicon, optical modules, and open rack standards,” said Biswajeet Mahapatra, principal analyst at Forrester. “This shift is forcing the ecosystem to deliver faster optical interconnects and greater fiber capacity, as Meta targets significant backbone growth and more specialized short-reach and coherent optical technologies to support cluster expansion.” The network is no longer a secondary pipe but a primary constraint. Next-generation connectivity, Sheel said, is becoming as critical as access to compute itself, as hyperscalers look to avoid network bottlenecks in large-scale AI deployments.

Read More »

AI, edge, and security: Shaping the need for modern infrastructure management

The rapidly evolving IT landscape, driven by artificial intelligence (AI), edge computing, and rising security threats, presents unprecedented challenges in managing compute infrastructure. Traditional management tools struggle to provide the necessary scalability, visibility, and automation to keep up with business demand, leading to inefficiencies and increased business risk. Yet organizations need their IT departments to be strategic business partners that enable innovation and drive growth. To realize that goal, IT leaders should rethink the status quo and free up their teams’ time by adopting a unified approach to managing infrastructure that supports both traditional and AI workloads. It’s a strategy that enables companies to simplify IT operations and improve IT job satisfaction.

5 IT management challenges of the AI era

Cisco recently commissioned Forrester Consulting to conduct a Total Economic Impact™ analysis of Cisco Intersight. This IT operations platform provides visibility, control, and automation capabilities for the Cisco Unified Computing System (Cisco UCS), including Cisco converged, hyperconverged, and AI-ready infrastructure solutions across data centers, colocation facilities, and edge environments. Intersight uses a unified policy-driven approach to infrastructure management and integrates with leading operating systems, storage providers, hypervisors, and third-party IT service management and security tools. The Forrester study first uncovered the issues IT groups are facing:

Difficulty scaling: Manual, repetitive processes cause lengthy IT compute infrastructure build and deployment times. This challenge is particularly acute for organizations that need to evolve infrastructure to support traditional and AI workloads across data centers and distributed edge environments.

Architectural specialization and AI workloads: AI is altering infrastructure requirements, Forrester found.
Companies design systems to support specific AI workloads — such as data preparation, model training, and inferencing — and each demands specialized compute, storage, and networking capabilities. Some require custom chip sets and purpose-built infrastructure, such as for edge computing and low-latency applications.

Read More »

DCF Poll: Analyzing AI Data Center Growth

Coming out of 2025, AI data center development remains defined by momentum. But momentum is not the same as certainty. Behind the headlines, operators, investors, utilities, and policymakers are all testing the assumptions that carried projects forward over the past two years, from power availability and capital conditions to architecture choices and community response. Some will hold. Others may not. To open our 2026 industry polling, we’re taking a closer look at which pillars of AI data center growth are under the most pressure. What assumption about AI data center growth feels most fragile right now?

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
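For scale, the figures quoted above work out as follows (simple arithmetic on the article's numbers, nothing more):

```python
# Bloomberg Intelligence: combined big-tech AI capex forecast, $B
combined_2025, combined_2023 = 200, 110
growth_pct = (combined_2025 / combined_2023 - 1) * 100   # roughly an 82% rise

# Microsoft: fiscal-2025 plan vs. 2020 capital expenditure, $B
fy2025_plan, capex_2020 = 80.0, 17.6
multiple = fy2025_plan / capex_2020                      # roughly 4.5x
```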

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based company has been in business for 187 years, yet it has become a regular at the big tech trade show in Las Vegas as a non-tech company showing off technology, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd).

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »
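The "LLM as a judge" idea mentioned in the article above reduces, at its simplest, to majority voting across several cheap judge models. The sketch below uses stand-in judge functions rather than real model calls; the aggregation logic is the point:

```python
from collections import Counter

def majority_verdict(answer, judges):
    """Collect one verdict per judge model and return the most
    common one (Counter breaks ties by first encountered)."""
    votes = [judge(answer) for judge in judges]
    return Counter(votes).most_common(1)[0][0]

# Stand-in judges: a real system would prompt three different (cheap)
# LLMs with a grading rubric and parse each response into a verdict.
judge_url = lambda ans: "fail" if "http://" in ans else "pass"   # flags insecure links
judge_len = lambda ans: "pass" if len(ans) > 20 else "fail"      # flags terse answers
judge_lenient = lambda ans: "pass"                               # always satisfied

verdict = majority_verdict("See https://example.com for details",
                           [judge_url, judge_len, judge_lenient])
```

Using three or more heterogeneous judges makes the verdict robust to any single judge's quirks, which is exactly what falling model prices make affordable.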

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »