Why power utilities must recalibrate strategy now

Power utilities today are confronting a perfect storm of explosive demand, aging assets and unpredictable policy shifts, making strategic alignment more critical and more challenging to achieve than ever before. A surge in demand from large, energy-intensive loads, such as data operations, electric vehicle manufacturing and hydrogen production facilities, complicates forecasting and challenges existing planning processes. Simultaneously, aging grid assets, some nearing or exceeding their design lives, are increasingly vulnerable. These technical challenges are compounded by heightened regulatory scrutiny, decarbonization goals and shifting policy around energy affordability, reliability and sustainability. In these circumstances, it is essential for executive teams to proactively address and shape company strategy.

In today’s dynamic environment, strategic planning must be an essential capability of the executive team. It requires a deliberate, structured approach that is grounded in alignment at the top, provides clarity in execution, resourcing and goals, and sustains leadership attention for frequent assessment and adaptation. Despite the best intentions, strategic initiatives can miss the mark if they lack this foundation.

Why strategy often fails at utilities

For today’s power utilities, especially those governed by elected boards or serving highly engaged communities, leadership and board perspectives can vary widely. One board member may be exclusively focused on affordability, while another may advocate for decarbonization and renewables integration. These different viewpoints can be a strength in developing company strategy, but without an intentional process that drives alignment, they often create confusion and slow progress.

Common symptoms of executive misalignment include:

  • Decision-making paralysis, where little progress is made on strategic priorities
  • Conflicting interpretations leading to diluted execution, as different leaders act on their own interpretations of “the strategy”
  • Wasted effort, as incomplete or poorly supported initiatives don’t move the needle
  • Frustration at all levels, as priorities seem to shift and unifying objectives remain unclear

Misalignment is not always the result of disagreement or differences of opinion. It often stems from a lack of consistent and transparent engagement with company leadership. Without a forum to discuss trade-offs, assumptions and risks, organizations will produce strategic plans that look good on paper and likely include “something for everyone” but fail to guide strategic action. This creates organizational inefficiencies, where operational teams invest time and money in projects that lack support or do not align with true strategic priorities.

The hidden cost of misalignment: A case of data center growth

Across the United States, power utilities are seeing an unprecedented surge in requests from data operations (hyperscalers, edge data centers, data miners, etc.). While each opportunity promises economic development, these loads also bring seismic shifts in demand profiles, reliability risks and strain on infrastructure. Misaligned planning in this scenario can lead to serious adverse outcomes.

Consider an example where a utility board in the northeastern United States enthusiastically supported new data operations to stimulate economic development. Some executives advocated for “accelerated service” and proposed quick solutions such as backup gas generation and temporary interconnections to fast-track the load and ensure the opportunity was captured. Meanwhile, operations and engineering executives were advocating to prioritize system reliability through capital investment in upgrades and refurbishment of aging infrastructure. Ultimately, the utility caved to pressure from the potential new customer and moved forward with the “accelerated service” plan to capture the data operation. Leadership received initial acclaim from the press and policymakers for enabling “efficient economic growth,” but as the data operations came online, issues arose:

  • Peak demand surges: Record summer heat coincided with ramping data operation activity. System balancing became fragile, and rolling outages were narrowly avoided using emergency market purchases
  • Backfeed and grid instability: Temporary interconnections caused unanticipated backfeed in parts of the grid, and the system operator initiated emergency curtailment protocols
  • Capital cost overruns: The deferred grid upgrades eventually had to proceed once reliability issues materialized, but at significantly higher cost due to rushed design work and inflated material and equipment prices
  • Public trust and regulatory friction: Reliability issues and overspending spurred the state commission to launch an investigation into whether the utility had misrepresented system readiness and cost exposure

This scenario reveals an important insight: the utility didn’t fail because it lacked foresight, but because its strategic vision was fragmented.

What should utilities do differently?

To successfully navigate an evolving landscape, strategic planning must be treated as a deliberate, structured process and not a one-time document. That process begins with a clear understanding of what strategy is and what it must do:

  1. Diagnose the current state – Why will the status quo no longer work? What disruptions, risks, or opportunities demand a new path?
  2. Establish guiding principles and assumptions – What parameters, trade-offs and guardrails will shape the strategy? What values or priorities are non-negotiable?
  3. Prioritize and sequence strategic inputs – What actions or initiatives will move the strategy forward? In what order and with what resources?
  4. Create alignment through dialogue – How will leaders debate, communicate and sustain a shared understanding of the strategy? What forums or rhythms ensure that alignment holds over time?

Executives and boards must shift from treating strategy as a static plan to managing it as a living framework that evolves with conditions and is continuously tested for alignment, clarity and impact.

Tools and tactics for strategic alignment

The following practices are worth considering to create alignment and increase the effectiveness of strategy development and execution:

  • Structured executive interviews: Conduct confidential discussions with leadership to understand strategic perspectives, identify divergent views and surface areas of friction
  • Strategic alignment assessment: Measure how aligned executive leadership is by comparing strategic perspectives across functions, business units, etc. This comparison should be used as a foundation to facilitate discussion and debate that creates alignment around strategic priorities and performance gaps
  • Facilitated retreats or strategy workshops: Create forums away from the daily office grind where leaders can work through differences in real time and build alignment
  • Performance scorecards: Develop metrics that track progress against strategic priorities
  • Regular strategic check-ins: Frequently revisit the strategy to ensure it remains aligned with evolving regulatory, financial and system conditions

Why strategic alignment is critical now

The transformation underway in the utility industry has raised the stakes for every energy provider. With recent changes in the legal, regulatory and economic landscape, leaders must have a deliberate approach to aligning around company strategy to successfully navigate the road ahead. Long-term success hinges on the ability to unify leadership behind a shared direction through an integrated process that builds alignment as conditions and risks evolve. For utilities navigating this turbulent landscape, aligning leadership around a shared strategic vision is more than best practice. It is a competitive advantage.
