From AI agent hype to practicality: Why enterprises must consider fit over flash

As we step fully into the era of autonomous transformation, AI agents are transforming how businesses operate and create value. But with hundreds of vendors claiming to offer “AI agents,” how do we cut through the hype and understand what these systems can truly accomplish and, more importantly, how we should use them?

The answer is more complicated than creating a list of tasks that could be automated and testing whether an AI agent can achieve those tasks against benchmarks. A jet can move faster than a car, but it’s the wrong choice for a trip to the grocery store.

Why we shouldn’t be trying to replace our work with AI agents

Every organization creates a certain amount of value for its customers, partners and employees.

This amount is a fraction of the total addressable value creation (that is, the total amount of value the organization is capable of creating that would be welcomed by its customers, partners and employees).

If every employee leaves the workday with a long list of to-dos for the next day and another list of to-dos to deprioritize altogether — items that would have created value if they could have been prioritized — there is an imbalance of value, time and effort, leaving value on the table.

The easiest place to start with AI agents is looking at the work already being done and the value being created. This makes the initial mental math easy, as you can map the value that already exists and analyze opportunities to create the same value faster or more reliably.

There’s nothing wrong with this exercise as a phase in a transformation process, but where most organizations and AI initiatives fail is in considering only how AI can apply to value already being created. This narrows their focus and investments to the thin sliver where the work being done today overlaps with what AI can automate, leaving the majority of the addressable value on the table.

Humans and machines inherently have different strengths and weaknesses. Organizations that collaboratively reinvent work with their business, technology and industry partners will outplay those who merely focus on one body of value and endlessly pursue greater degrees of automation without increasing total value output.

Understanding AI agent capabilities through the SPAR framework

To help explain how AI agents work, we’ve created what we call the SPAR framework: sense, plan, act and reflect. This framework mirrors how humans achieve our own goals and provides a natural way to understand how AI agents operate.

Sensing: Just as we use our senses to gather information about the world around us, AI agents collect signals from their environment. They track triggers, gather relevant information and monitor their operating context.

Planning: Once an agent has collected signals about its environment, it doesn’t just jump into execution. Like humans considering their options before acting, AI agents are developed to process available information in the context of their objectives and rules to make informed decisions about achieving their goals.

Acting: The ability to take concrete action sets AI agents apart from simple analytical systems. They can coordinate multiple tools and systems to execute tasks, monitor their actions in real-time, and make adjustments to stay on course.

Reflecting: Perhaps the most sophisticated capability is learning from experience. Advanced AI agents can evaluate their performance, analyze outcomes and refine their approaches based on what works best — creating a continuous improvement cycle.

What makes AI agents powerful is how these four capabilities work together in an integrated cycle, creating a system that can pursue complex goals with increasing sophistication.
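To make that cycle concrete, here is a minimal Python sketch of a SPAR-style loop. It is illustrative only: every class, method and data shape below is our own simplification (the framework does not prescribe an implementation), and a production agent would delegate planning to an LLM, wire sensing to real triggers and call actual tools.

```python
from dataclasses import dataclass, field

@dataclass
class SPARAgent:
    """Toy sense-plan-act-reflect loop (illustrative names, not a real product)."""
    goal: str
    memory: list = field(default_factory=list)  # lessons carried across cycles

    def sense(self, environment: dict) -> dict:
        # Sense: collect signals, triggers and operating context.
        return {"signals": environment, "lessons": list(self.memory)}

    def plan(self, observation: dict) -> list:
        # Plan: weigh options against the goal and rules before executing.
        # (A real agent would typically hand this step to an LLM.)
        return [f"handle {name}" for name in observation["signals"]]

    def act(self, steps: list) -> list:
        # Act: execute each step via tools/APIs, monitoring as it goes.
        return [f"done: {step}" for step in steps]

    def reflect(self, outcomes: list) -> None:
        # Reflect: record what happened so the next cycle can improve.
        self.memory.append(outcomes)

    def run_cycle(self, environment: dict) -> list:
        observation = self.sense(environment)
        steps = self.plan(observation)
        outcomes = self.act(steps)
        self.reflect(outcomes)
        return outcomes

agent = SPARAgent(goal="resolve support tickets")
print(agent.run_cycle({"ticket-1042": "billing question"}))  # ['done: handle ticket-1042']
```

The value of the pattern is the closed loop: each pass through reflect() leaves the agent with more context than the last, which is what separates an agent from a one-shot automation script.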

This exploratory capability can be contrasted against existing processes that have already been optimized several times through digital transformation. Their reinvention might yield small short-term gains, but exploring new methods of creating value and making new markets could yield exponential growth.

5 Steps to build your AI agent strategy

Most technologists, consultants and business leaders follow a traditional approach when introducing AI, an approach that carries an 87% failure rate:

  1. Create a list of problems, or examine your data;
  2. Pick a set of potential use cases;
  3. Analyze use cases for return on investment (ROI), feasibility, cost and timeline;
  4. Choose a subset of use cases and invest in execution.

This approach may seem defensible because it’s commonly understood to be best practice, but the data shows that it isn’t working. It’s time for a new approach.

  1. Map the total addressable value creation your organization could provide to your customers and partners given your core competencies and the regulatory and geopolitical conditions of the market.
  2. Assess the current value creation of your organization.
  3. Choose the top five most valuable and market-making opportunities for your organization to create new value.
  4. Analyze for ROI, feasibility, cost and timeline to engineer AI agent solutions (repeat steps 3 and 4 as necessary).
  5. Choose a subset of value cases and invest in execution.
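
As a back-of-the-envelope illustration of steps 3 through 5, the prioritization can be sketched as a simple scoring exercise. The numbers and the scoring heuristic below are entirely hypothetical; a real analysis would use your organization's own ROI, feasibility, cost and timeline estimates.

```python
from dataclasses import dataclass

@dataclass
class ValueCase:
    name: str
    est_value: float    # estimated new value created, $/year (hypothetical)
    feasibility: float  # rough probability of success, 0.0-1.0
    cost: float         # engineering investment, $
    months: float       # time to first value

def score(case: ValueCase) -> float:
    # Illustrative heuristic: risk-adjusted net value, discounted by timeline.
    return (case.est_value * case.feasibility - case.cost) / case.months

# Hypothetical opportunities surfaced by steps 1-3 (map value, assess, choose).
cases = [
    ValueCase("automate existing reporting", 1_200_000, 0.9, 400_000, 6),
    ValueCase("agent-led customer onboarding", 8_000_000, 0.5, 1_500_000, 12),
    ValueCase("continuous supply-chain replanning", 5_000_000, 0.6, 1_000_000, 9),
]

# Step 4: analyze; step 5: pick the subset worth executing.
for case in sorted(cases, key=score, reverse=True):
    print(f"{case.name}: score {score(case):,.0f}")
```

Note how a market-making opportunity can outrank a quicker automation win once total addressable value, not just current work, enters the calculation.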

Creating new value with AI

The journey into the era of autonomous transformation (with more autonomous systems creating value continuously) isn’t a sprint — it’s a strategic progression, building organizational capability alongside technological advancement. By initially identifying value and growing ambitions methodically, you’ll position your organization to thrive in the era of AI agents.

Brian Evergreen is the author of Autonomous Transformation: Creating a More Human Future in the Era of Artificial Intelligence 

Pascal Bornet is the author of Agentic Artificial Intelligence: Harnessing AI Agents to Reinvent Business, Work and Life

Evergreen and Bornet are teaching a new online course on AI agents with Cassie Kozyrkov: Agentic Artificial Intelligence for Leaders


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


NetBox Labs embraces intersection of network management and AI

“NetBox is intent,” Beevers explained. “This is where network teams are documenting ‘Here is what the network and the infrastructure should look like.’ Think of intent as what is in NetBox.” With the general availability of NetBox Assurance announced this week, the platform now extends beyond documentation to address the

Read More »

9 steps to take to prepare for a quantum future

“If you’re in pharma or chemical industry, they’re using it already,” says Constellation’s Mueller. “You have to look into it,” Mueller warns. And quantum computers are already playing an important role in protein folding, he says. “Quantum qubits are taking over traditional architectures for protein folding and mapping,” he says.

Read More »

Trump tariffs bite down on UK energy shares

Global share prices have fallen after US president Donald Trump introduced sweeping tariffs on a vast range of the country’s trading partners. Fears about the impact that these tariffs could have on the global economy meant that shares on stock markets around the world started to dip when they were first announced on Thursday 3 April. Friday 4 April saw these losses grow, and when markets opened after the weekend, shares fell even further. Energy stocks have shown that they aren’t immune to these losses, with companies across the size and supply chain spectrum all being affected. The supermajors have been hard hit on both sides of the Atlantic, with BP and Shell, ExxonMobil and Chevron, and TotalEnergies and Equinor all taking hits. On 2 April, before the full announcement of the Trump tariffs, BP closed the day on 432.75p. It saw an immediate drop when the market opened on 3 April, falling to 422.20p. By the time the first two days were over, its share price was 371.28p. And when markets opened Monday, shares fell again, hitting a low of 347.85p before regaining some ground to go above 350p. Shell suffered a similar fate, closing 2 April at 2,775.50p and immediately dropping to 2,707p the next day. By market close on 4 April, it was down to 2,491.50p, and fell again Monday to 2,294.50p. Like BP, it regained a little ground to hover around 2,340p. US stocks were similarly hit. ExxonMobil lost $10 over the first two days of the tariff-inspired selloff, going from over $115 to close the week at $105. Monday’s market open cut another $5 off its share price, and the day has seen it go just below the $100 level. Meanwhile, Chevron went from its pre-tariff level of $163.47 down to $143.11 at market close on Friday.

Read More »

Trump Team Proposes Ending Clean Energy Office, Cutting Billions

The US Energy Department is proposing to shut down its Office of Clean Energy Demonstrations and cut some $9 billion in awards for programs regarding carbon capture, direct air capture, solar and hydrogen, according to documents seen by Bloomberg. Under the plan, which isn’t final, the $27 billion agency’s staff would be reduced to 35 employees, and about $10 billion in projects, including $3 billion for so-called hydrogen hubs, would be kept “as is” and transferred to other parts of the Energy Department. The office employed about 250 people before President Donald Trump, a climate change skeptic, returned to office. A decision on the fate of the agency, which was created in 2021 in bipartisan infrastructure legislation, could come as soon as next week, according to Axios, which earlier reported on the proposal. The Energy Department didn’t immediately respond to a request for comment on Friday evening. Money for hydrogen hubs, in which the gas can both be produced and used, is intended to highlight its potential as a carbon-free fuel to run factories and power plants. Direct air capture involves technology that sucks planet-warming carbon dioxide out of the air. Among the funding identified for potential elimination are billions of dollars for hubs in California, the Pacific Northwest, the Mid-Atlantic and the Midwest, while hubs in Texas, Appalachia, and Minnesota, North Dakota and South Dakota would be spared, according to the documents. The plan also recommends keeping $3.5 billion for advanced nuclear reactor projects and $1.9 billion in industrial demonstration program projects. Also slated for termination are carbon capture projects, battery storage projects, and a direct air capture project by a subsidiary of Occidental Petroleum Corp. that was selected for an award of as much as $1.2 billion by the Biden administration in 2023. The plan comes as the Energy Department, which has a

Read More »

US electricity demand will grow 50% by 2050, electrical manufacturer study finds

Driven by data centers and transportation electrification, U.S. electricity demand will increase 2% annually and 50% by 2050, the National Electrical Manufacturers Association said in a study published Monday. Improvements to energy efficiency will mute some potential demand gains, leaving the electrical manufacturing group’s projections “somewhere in the middle” compared with other recent studies, NEMA President and CEO Debra Phillips said in a Friday discussion of the report. The analysis, completed for NEMA by PA Consulting, anticipates 300% growth in data center energy consumption over the next 10 years and 9,000% projected growth in e-mobility power consumption through 2050. The electric vehicle deployment curve has “flattened” in recent years but “we still think the promise is there, for the consumer,” Phillips said, pointing to advances in battery technology establishing greater range and improving prices. The electricity demand expected in the next quarter century “is fairly remarkable,” Phillips said, adding, “our grid wasn’t designed really to meet demand growth at this rate … and so we’re going to have to get creative around the technology and policy solutions that are going to help us meet the demand.” On the policy front, NEMA’s report calls for: permitting and siting reform, including for generation, transmission, distribution and critical minerals development, improved interregional electric transmission and the adoption of grid-enhancing technologies; tax certainty around incentives for grid technologies and domestic manufacturing of critical grid infrastructure, and incentives that enable utilities to make significant smart grid, distributed energy and resiliency investments; and an all-of-the-above approach to energy resources, including natural gas, small modular reactors and geothermal. In the face of “unprecedented” growth in energy consumption and challenges posed by an aging grid, “utility operators will require cutting-edge solutions that enhance the resilience and reliability of our grid,” Michael Plaster, executive vice president

Read More »

FERC urged to reject proposed ROE, incentives for $3B Valley Link transmission project

The Federal Energy Regulatory Commission should reduce the proposed return on equity and deny various incentives for a roughly $3 billion transmission project planned by Transource Energy, Dominion Energy and FirstEnergy, according to Friday filings at the agency. The Valley Link transmission project, which includes two 765-kV backbone transmission lines, is part of the PJM Interconnection’s latest Regional Transmission Expansion Plan, which was approved by the grid operator’s board in late February. The project is slated to come online in December 2029. The project is set to be built by Valley Link Transmission, a joint venture between Transource (36%), which is owned by American Electric Power and Evergy, FirstEnergy (34%) and Dominion (30%). Valley Link in mid-March asked FERC to approve formula rates and transmission incentives for the project, which the company said will “ensure reliability and brings innovative collaboration to large-scale infrastructure development in PJM at a time when efficient and cost-effective regional transmission development is essential.” Valley Link is seeking a 10.9% base ROE, plus an additional 0.5% for belonging to PJM. It also asked for a suite of incentives for the project, including “construction work in progress,” the ability to recoup any prudent costs it may have incurred even if the project is abandoned and a hypothetical capital structure of 60% equity and 40% debt. The proposed ROE and incentives will ease the risks that Valley Link faces as a startup transmission developer, the company told FERC. “Valley Link will face significant risks and challenges in financing, developing, and constructing the project portfolio,” the company said. The incentive package results in “an impermissible transfer of risk onto ratepayers,” the Maryland Office of People’s Counsel, which represents residential ratepayers, told FERC on Friday. Also, the proposed base ROE appears to be too high, according to the OPC, which

Read More »

Industry Bodies Examine Texas Upstream Employment for February

According to the Texas Independent Producers and Royalty Owners Association’s (TIPRO) analysis, direct Texas upstream employment for February totaled 205,400. That’s what TIPRO said in a statement sent to Rigzone by the TIPRO team recently, which cited the latest Current Employment Statistics (CES) report from the U.S. Bureau of Labor Statistics (BLS). In the statement, TIPRO highlighted that the February figure was “an increase of 1,900 industry positions from January employment numbers, subject to revisions”. The industry body noted that this represented an increase of 2,500 jobs in the services sector and decline of 600 jobs in oil and gas extraction. TIPRO said in its statement that its new workforce data indicated strong job postings for the Texas oil and natural gas industry, adding that there were 10,172 active unique jobs postings for the Texas oil and natural gas industry in February, including 3,337 new postings. Among the 19 specific industry sectors TIPRO uses to define the Texas oil and natural gas industry, Gasoline Stations with Convenience Stores led in the ranking for unique job listings in February with 2,541 postings, the organization highlighted in its statement. This sector was followed by Support Activities for Oil and Gas Operations, with 2,389 postings, and Petroleum Refineries, with 870 postings, TIPRO outlined.   The leading three cities by total unique oil and natural gas job postings were Houston, with 2,368, Midland, with 669, and Odessa, with 449, TIPRO pointed out in the statement. The top three companies ranked by unique job postings in February were Cefco, with 927, Love’s, with 680, and John Wood Group, with 308, according to TIPRO. In its statement, TIPRO noted that, of the top ten companies listed by unique job postings in February, four companies were in the services sector, three were in the gasoline stations with convenience stores category, two were midstream companies, and one was an oil and gas operator. Top posted industry occupations for February included

Read More »

US DOE Awards Operating Contract for Emergency Oil Reserve

The Department of Energy (DOE) has awarded Strategic Storage Partners LLC a $1.4-billion contract to manage and operate the United States’ Strategic Petroleum Reserve for an extendable five years. “After a transition period, Strategic Storage Partners, LLC, will assume responsibility for management and operation of the SPR on June 15, 2025”, the DOE said in an online statement. The federally owned oil stocks are stored in underground salt caverns at four sites in Louisiana and Texas. The DOE said the SPR has run under a series of contracts awarded by the agency since 1985. “Strategic Storage Partners submitted their proposal in partnership with two primary joint venture partners, Aptim Federal Services, LLC, headquartered in Baton Rouge, Louisiana and BWXT Technical Services Group, Inc., headquartered in Lynchburg, Virginia”, the statement said. The partnership was selected after a “rigorous competitive selection process”, the DOE said. The previous contractor was Fluor Federal Petroleum Operations LLC. In 2013 this consortium won a $1.46 billion award to operate the SPR, according to a DOE press release September 18 that year. The contract was awarded for an initial 5 years, extendable for another 5, the DOE said then. Based on information on government website USAspending.gov, the contract started April 2014 and is expected to end June 2025. The SPR, mandated by the Energy Policy and Conservation Act to be only used during severe supply disruptions, stood at 396.434 million barrels as of the fourth week of March, according to an online dashboard by the Energy Information Administration. In November 2024 the DOE made the final awards for a reinjection campaign to compensate withdrawals induced by Russia’s invasion of Ukraine. The replenishment program exceeded the 2022 sales from the SPR by 20 million barrels, according to the DOE. Purchases under the refill program were made “at a good

Read More »

DARPA backs multiple quantum paths in benchmarking initiative

Nord Quantique plans to use the money to expand its team, says Julien Camirand Lemyre, the company’s president, CTO and co-founder. That’s an opportunity to accelerate the development of the technology, he says. “By extension, what this will mean for enterprise users is that quantum solutions to real-world business problems will be available sooner, due to that acceleration,” he says. “And so enterprise customers need to also accelerate how they are thinking about adoption because the advantages quantum will provide will be tangible.” Lemyre predicts that useful quantum computers will be available for enterprises before the end of the decade. “In fact, there has been tremendous progress across the entire quantum sector in recent years,” he says. “This means industry needs to begin thinking seriously about how they will integrate quantum computing into their operations over the medium term.” “We’re seeing, with the deployment of programs like the QBI in the US and investments of billions of dollars from public and private investors globally, an increasing maturity of quantum technologies,” said Paul Terry, CEO at Photonic, which is betting on optically-linked silicon spin qubits. “Our architecture has been designed from day one to build modular, scalable, fault-tolerant quantum systems able to be deployed in data centers,” he said. He’s not the only one to mention fault tolerance. DARPA stressed fault tolerance in its announcement, and its selections point to the importance of error correction for the future of quantum computing. The biggest problem with today’s quantum computers is that the number of errors increases faster than the number of qubits, making them impossible to scale up. Quantum companies are working on a variety of approaches to reduce error rates low enough that quantum computers can get big enough to actually do real work.

Read More »

Zayo’s Fiber Bet: Scaling Long-Haul and Metro Networks for AI Data Centers

Zayo Group Holdings Inc. has emerged as one of the most aggressive fiber infrastructure players in North America, particularly in the context of AI-driven growth. With a $4 billion investment in AI-related long-haul fiber expansion, Zayo is positioning itself as a critical enabler of the AI and cloud computing boom. The company is aggressively expanding its long-haul fiber network, adding over 5,000 route miles to accommodate the anticipated 2-6X increase in AI-driven data center capacity by 2030. This initiative comes as AI workloads continue to push the limits of existing network infrastructure, particularly in long-haul connectivity. New Fiber Routes The new routes include critical connections between key AI data center hubs, such as Chicago-Columbus, Las Vegas-Reno, Atlanta-Ashburn and Columbus-Indianapolis, among others. Additionally, Zayo is overbuilding seven existing routes to further enhance network performance, resiliency and low-latency connectivity. This new development follows 15 new long-haul routes, representing over 5,300 route miles of new and expanded capacity, deployed over the last five years. These route locations were selected based on expected data center growth, power availability, existing capacity constraints and specific regional considerations. The AI Data Center Sector: A Significant Driver of Fiber Infrastructure The exponential growth of AI-driven data center demand means that the U.S. faces a potential bandwidth shortage. Zayo’s investments look to ensure that long-haul fiber capacity keeps pace with this growth, allowing AI data centers to efficiently transmit data between key markets. This is especially important as data center development locations are being driven more by land and power availability than by proximity to market. Emerging AI data center markets get the high-speed fiber they need, especially as they move away from expensive power regions (e.g., California, Virginia) to lower-cost locations (e.g., Ohio, Nevada, the Midwest). Without the high-speed networking capabilities offered by

Read More »

Crusoe Adds 4.5 GW Natural Gas to Fuel AI, Expands Abilene Data Center to 1.2 GW

Crusoe and the Lancium Clean Campus: A New Model for Power-Optimized Compute Crusoe Energy’s 300-megawatt deployment at the Lancium Clean Campus in Abilene is a significant marker of how data center strategies are evolving to integrate more deeply with energy markets. By leveraging demand flexibility, stranded power, and renewable energy, Crusoe is following a path similar to some of the most forward-thinking projects in the data center industry. But it’s also pushing the model further—fusing AI and high-performance computing (HPC) with the next generation of power-responsive infrastructure. Here’s how Crusoe’s strategy compares to some of the industry’s most notable power-driven data center deployments: Google’s Oklahoma Data Center: Proximity to Renewable Growth A close parallel to Crusoe’s energy-centric site selection strategy is Google’s Mayes County data center in Oklahoma. Google sited its facility there to take advantage of abundant wind energy, aligning with the local power grid’s renewable capacity. Similarly, Crusoe is tapping into Texas’s deregulated energy market, optimizing for low-cost renewable power and the ability to flexibly scale compute operations in response to grid conditions. Google has also been an industry leader in time-matching workloads to renewable energy availability, something that Crusoe is enabling in real time through grid-responsive compute orchestration. Sabey Data Centers in Quincy: Low-Cost Power as a Foundation Another instructive comparison is Sabey Data Centers’ Quincy, Washington, campus, which was built around one of the most cost-effective power sources in the U.S.—abundant hydroelectric energy. Sabey’s long-term strategy has been to co-locate power-intensive compute infrastructure near predictable, low-cost energy sources. Crusoe’s project applies a similar logic but adapts it for a variable grid environment. Instead of relying on a fixed low-cost power source like hydro, Crusoe dynamically adjusts to real-time energy availability, a strategy that could become a model for future power-aware, AI-driven workloads. Compass and Aligned: Modular, Energy-Adaptive

Read More »

Executive Roundtable: Data Center Site Selection and Market Evolution in a Constrained Environment

For the third installment of our Executive Roundtable for the First Quarter of 2025, we asked our panel of seasoned industry experts about how the dynamics of data center site selection have never been more complex—or more critical to long-term success. In an industry where speed to market is paramount, operators must now navigate an increasingly constrained landscape in the age of AI, ultra cloud and hyperscale expansion, marked by fierce competition for land, tightening power availability, and evolving local regulations.  Traditional core markets such as Northern Virginia, Dallas, and Phoenix remain essential, but supply constraints and permitting challenges are prompting developers to rethink their approach. As hyperscalers and colocation providers push the boundaries of site selection strategy, secondary and edge markets are emerging as viable alternatives, driven by favorable energy economics, infrastructure investment, and shifting customer demand.  At the same time, power procurement is now reshaping the equation. With grid limitations and interconnection delays creating uncertainty in major hubs, operators are exploring new solutions, from direct utility partnerships to on-site generation with renewables, natural gas, and burgeoning modular nuclear concepts. The question now is not just where to build but how to ensure long-term operational resilience. As data center demand accelerates, operators face mounting challenges in securing suitable land, reliable power, and regulatory approvals in both established and emerging markets.  And so we asked our distinguished executive panel for the First Quarter of 2025, with grid capacity constraints, zoning complexities, and heightened competition shaping development decisions, how are companies refining their site selection strategies in Q1 2025 to balance speed to market, scalability, and sustainability? And, which North American regions are showing the greatest potential as the next wave of data center expansion takes shape?

Read More »

Podcast: iMasons CEO Santiago Suinaga on the Future of Sustainable AI Data Centers

For this episode of the DCF Show podcast, host Matt Vincent, Editor in Chief of Data Center Frontier, is joined by Santiago Suinaga, CEO of Infrastructure Masons (iMasons), to explore the urgent challenges of scaling data center construction while maintaining sustainability commitments, among other pertinent industry topics. The AI Race and Responsible Construction “Balancing scale and sustainability is key because the AI race is real,” Suinaga emphasizes. “Forecasted capacities have skyrocketed to meet AI demand. Hyperscale end users and data center developers are deploying high volumes to secure capacity in an increasingly constrained global market.” This surge in demand pressures the industry to build faster than ever before. Yet, as Suinaga notes, speed and sustainability must go hand in hand. “The industry must embrace a build fast, build smart mentality. Leveraging digital twin technology, AI-driven design optimization, and circular economy principles is critical.” Sustainability, he argues, should be embedded at every stage of new builds, from integrating low-carbon materials to optimizing energy efficiency from the outset. “We can’t afford to compromise sustainability for speed. Instead, we must integrate renewable energy sources and partner with local governments, utilities, and energy providers to accelerate responsible construction.” A key example of this thinking is peak shaving—using redundant infrastructure and idle capacities to power the grid when data center demand is low. “99.99% of the time, this excess capacity can support local communities, while ensuring the data center retains prioritized energy supply when needed.” Addressing Embodied Carbon and Supply Chain Accountability Decarbonization is a cornerstone of iMasons’ efforts, particularly through the iMasons Climate Accord. Suinaga highlights the importance of tackling embodied carbon—the emissions embedded in data center construction materials and IT hardware. “We need standardized reporting metrics and supplier accountability to drive meaningful change,” he says. “Greater transparency across the supply chain can be

Read More »

Executive Roundtable: The Changing Economics of Data Center Development

For the final installment of our Executive Roundtable for the First Quarter of 2025, we asked our panel of seasoned industry experts about how the economics of data center development are shifting as rising construction costs, supply chain volatility, and evolving enterprise workloads reshape the industry’s growth trajectory.  Operators are under pressure to deliver capacity at scale while managing higher capital expenditures, extended lead times for critical infrastructure, and increasing complexity in customer requirements. At the same time, demand remains strong, driven by AI workloads, cloud expansion, and heightened enterprise reliance on digital infrastructure. To remain competitive, data center providers must rethink everything from procurement strategies to financing models, balancing long-term investment with the need for speed and flexibility in deployment.  Energy procurement and real estate dynamics are also in flux. As power availability becomes a gating factor in core markets, operators and energy specialists are forging new partnerships to secure sustainable, cost-effective solutions. Meanwhile, the real estate landscape is evolving, with developers looking at creative approaches to land acquisition, modular construction, and even repurposing existing assets.  With construction costs escalating, supply chain disruptions extending project timelines, and enterprise IT requirements shifting toward AI and high-density workloads, data center operators, energy providers, and real estate firms are reevaluating their business models. And so we asked our expert panel: From innovative procurement strategies and new financing approaches to evolving power solutions and creative site selection methods, what adjustments are being made to sustain growth and profitability in 2025? And how are industry leaders balancing cost efficiency, scalability, and customer needs in an increasingly complex and competitive market?

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »