The long-term impacts of AI on networking

Only 21 of the enterprises that offered AI network comments were doing any AI self-hosting, but all of those, and almost all of the enterprises seriously evaluating self-hosting, said that AI hosting meant a specialized cluster of computers with GPUs, and that this cluster would have to be connected both within itself and to the main points of storage for their core business data. They all saw this as a whole new networking challenge.

Every enterprise that self-hosted AI told me the mission demanded more bandwidth for "horizontal" traffic than their normal applications did, more than their current data center network was built to support. Ten of the group said this meant the cluster of AI servers would need faster Ethernet connections and higher-capacity switches. Everyone agreed that a real production deployment of on-premises AI would need new network devices, and fifteen said they bought new switches even for their large-scale trials.
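A rough back-of-envelope calculation shows why ordinary data center links fall short of this horizontal demand. The sketch below is purely illustrative: the model size, synchronization interval, server count, and ring all-reduce pattern are assumptions for a fine-tuning-style workload, not figures reported by the enterprises surveyed.

```python
# Hedged back-of-envelope sketch: estimate per-server east-west bandwidth needed
# to synchronize model state across an AI cluster, to compare against Ethernet
# link speeds. All parameter values below are illustrative assumptions.
def required_link_gbps(params_billion, bytes_per_param=2, step_seconds=1.0, servers=8):
    # A ring all-reduce moves roughly 2 * (N-1)/N of the gradient bytes
    # per server per synchronization step.
    gradient_bytes = params_billion * 1e9 * bytes_per_param
    traffic_bytes = 2 * (servers - 1) / servers * gradient_bytes
    return traffic_bytes * 8 / step_seconds / 1e9

if __name__ == "__main__":
    # A 7B-parameter model synchronized once per second across 8 servers:
    print(f"{required_link_gbps(7):.0f} Gbps per server")  # roughly 196 Gbps
```

Even this modest assumed workload exceeds a single 100GbE link per server, which is consistent with the users' view that production AI clusters need faster Ethernet and higher-capacity switches.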

The biggest data center network problem I heard from those with experience is that they believed they had built up more of an AI cluster than they needed. Running a popular LLM, they said, requires hundreds of GPUs and servers, but small language models can run on a single system, and a third of the current self-hosting enterprises believed it is best to start small, with small models, and build up only when you have experience and can demonstrate a need. The same group also pointed out that control is needed to ensure that only truly useful AI applications are run. "Applications otherwise build up, exceed, and then increase, the size of the AI cluster," said users.

Every current AI self-hosting user said that it was important to keep AI horizontal traffic off their primary data center network because of its potential congestion impact on other applications. Horizontal traffic from hosted generative AI can be enormous and unpredictable; one enterprise said that their cluster could generate as much horizontal traffic as their whole data center, but in bursts rarely lasting more than a minute. They also said that latency in this horizontal burst could hamper application value significantly, stretching out both the result delivery and the length of the burst. They said that analyzing AI cluster flows was critical in picking the right cluster network hardware, and that they found they “knew nothing” about AI network needs until they ran trials and tests.
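That flow analysis can start simply. The sketch below assumes flow records (for example, sFlow or IPFIX exports) have already been dumped to a CSV with timestamp and byte-count columns; it estimates the peak horizontal bandwidth and the approximate burst length the cluster network would have to absorb. The column names, file name, and one-second bucketing are illustrative assumptions, not a prescribed tool.

```python
# Hedged sketch: summarize east-west ("horizontal") flow records exported from
# the AI cluster's switches to estimate peak bandwidth and burst duration,
# as an input to sizing cluster network hardware.
import csv
from collections import defaultdict

def summarize_bursts(path, bucket_seconds=1.0):
    bytes_per_bucket = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumed columns: "timestamp" (epoch seconds) and "bytes".
            bucket = int(float(row["timestamp"]) // bucket_seconds)
            bytes_per_bucket[bucket] += int(row["bytes"])

    if not bytes_per_bucket:
        return None

    peak_bytes = max(bytes_per_bucket.values())
    peak_gbps = peak_bytes * 8 / bucket_seconds / 1e9
    # Treat any bucket above half the peak rate as part of a burst.
    burst_buckets = sum(1 for b in bytes_per_bucket.values() if b > peak_bytes / 2)
    return {
        "peak_gbps": round(peak_gbps, 2),
        "approx_burst_seconds": burst_buckets * bucket_seconds,
        "buckets_observed": len(bytes_per_bucket),
    }

if __name__ == "__main__":
    print(summarize_bursts("cluster_flows.csv"))
```

Running this kind of summary during trials is one way to turn "knowing nothing" about AI network needs into concrete peak-rate and burst-length figures before committing to switch purchases.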

The data relationship between the AI cluster and the enterprise's core data repositories is complicated, and it's this relationship that determines how much the AI cluster impacts the rest of the data center. The challenge here is that both the application(s) being supported and the manner of implementation have a major impact on how data moves from data center repositories to the AI cluster.

AI/ML applications of very limited scope, such as AI/ML used for operations analysis in IT, networking, or security, are real-time and require access to real-time data, but this data is usually low-volume telemetry, and users report it has little network impact. Generative AI applications targeting business analytics need broad access to core business data, but they often need primarily historical summaries rather than full transactional detail, which means it is often possible to keep this condensed source data as a copy within the AI cluster.
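Where a condensed copy is workable, the data movement can be a periodic aggregation job that pushes summaries into cluster-local storage rather than streaming raw transactions across the data center network. The sketch below uses SQLite and an invented sales table purely for illustration; the schema and file names are assumptions, and the point is the pattern, not the specific tooling.

```python
# Hedged sketch: build a condensed, historical copy of core business data inside
# the AI cluster instead of moving full transactional detail across the network.
# The "sales" table and the monthly rollup are illustrative assumptions.
import sqlite3

SUMMARY_SQL = """
SELECT strftime('%Y-%m', order_date) AS month,
       product_line,
       SUM(amount) AS revenue,
       COUNT(*)    AS orders
FROM sales
GROUP BY month, product_line
"""

def export_summary(core_db_path, cluster_db_path):
    core = sqlite3.connect(core_db_path)
    cluster = sqlite3.connect(cluster_db_path)
    cluster.execute(
        "CREATE TABLE IF NOT EXISTS sales_summary "
        "(month TEXT, product_line TEXT, revenue REAL, orders INTEGER)"
    )
    rows = core.execute(SUMMARY_SQL).fetchall()
    cluster.executemany("INSERT INTO sales_summary VALUES (?, ?, ?, ?)", rows)
    cluster.commit()
    core.close()
    cluster.close()
    return len(rows)

if __name__ == "__main__":
    # A periodic refresh keeps the cluster-local copy current without sustained
    # horizontal traffic against the primary repositories.
    print(export_summary("core_business.db", "ai_cluster_copy.db"))
```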
