Google’s AI-Powered Grid Revolution: How Data Centers Are Reshaping the U.S. Power Landscape

Google Unveils Groundbreaking AI Partnership with PJM and Tapestry to Reinvent the U.S. Power Grid

In a move that underscores the growing intersection between digital infrastructure and energy resilience, Google has announced a major new initiative to modernize the U.S. electric grid using artificial intelligence. The company is partnering with PJM Interconnection—the largest grid operator in North America—and Tapestry, an Alphabet moonshot backed by Google Cloud and DeepMind, to develop AI tools aimed at transforming how new power sources are brought online.

The initiative, detailed in a blog post by Alphabet and Google President Ruth Porat, represents one of Google’s most ambitious energy collaborations to date. It seeks to address mounting challenges facing grid operators, particularly the ballooning backlog of energy generation projects awaiting interconnection to a power system unprepared for 21st-century demands.

“This is our biggest step yet to use AI for building a stronger, more resilient electricity system,” Porat wrote.

Tapping AI to Tackle an Interconnection Crisis

The timing is critical. The U.S. energy grid is facing a historic inflection point. According to the Lawrence Berkeley National Laboratory, more than 2,600 gigawatts (GW) of generation and storage projects were waiting in interconnection queues at the end of 2023—more than double the total installed capacity of the entire U.S. grid.

Meanwhile, the Federal Energy Regulatory Commission (FERC) has revised its five-year demand forecast, now projecting U.S. peak load to rise by 128 GW before 2030—more than triple the previous estimate.

Grid operators like PJM are straining to process a surge in interconnection requests, which have skyrocketed from a few dozen to thousands annually. This wave of applications has exposed the limits of legacy systems and planning tools. Enter AI.

Tapestry’s role is to develop and deploy AI models that can intelligently manage and streamline the complex process of interconnecting power sources—renewables, storage, and conventional generation—across PJM’s vast network, which spans 13 states and the District of Columbia, serving 67 million people.

A Unified, AI-Powered Grid Management Platform

The partnership’s multi-year roadmap aims to cut the interconnection approval process from years to months. Key pillars of the effort include:

  • Accelerating capacity additions: By automating time-intensive verification and modeling processes, AI tools from Tapestry will help PJM quickly assess and approve new energy projects. This could significantly reduce the development cycle for grid-connected power, addressing bottlenecks that have plagued renewable developers in particular (a simplified screening sketch follows this list).
  • Driving cost-effective grid expansion: Tapestry will integrate disparate databases and modeling tools into a single secure platform. The goal is to create a unified model of PJM’s network where grid planners and developers can collaborate seamlessly, boosting transparency and planning agility.
  • Integrating diverse energy resources: With variable renewables such as solar and wind comprising a large share of PJM’s queue, Tapestry’s AI solutions aim to enable more precise modeling and faster incorporation of these intermittent resources into the grid mix.
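
Neither Google nor PJM has published details of how this automated screening would work, so the snippet below is only an illustrative sketch: it ranks queued interconnection requests by a crude risk score so that low-impact projects could be fast-tracked while riskier ones go to full network-upgrade studies. Every name, number, and rule in it (including the hypothetical screening_score heuristic) is an assumption rather than part of the Google, Tapestry, or PJM tooling; a real system would rely on power-flow studies and models trained on historical study outcomes.

```python
# Hypothetical sketch only: the actual Tapestry/PJM tooling is not public.
# Illustrates the general idea of automated screening -- ranking queued
# interconnection requests so engineers study the riskiest ones in detail.
from dataclasses import dataclass

@dataclass
class InterconnectionRequest:
    project_id: str
    capacity_mw: float       # requested injection capacity
    poi_headroom_mw: float   # assumed available headroom at the point of interconnection
    is_storage: bool         # storage can often charge off-peak, easing constraints

def screening_score(req: InterconnectionRequest) -> float:
    """Crude priority score: higher means more likely to need a detailed study.
    In practice this would be an ML model trained on historical study outcomes."""
    utilization = req.capacity_mw / max(req.poi_headroom_mw, 1.0)
    return utilization * (0.8 if req.is_storage else 1.0)

queue = [
    InterconnectionRequest("solar-001", 150, 400, False),
    InterconnectionRequest("battery-002", 80, 90, True),
    InterconnectionRequest("wind-003", 300, 250, False),
]

# Fast-track low-risk requests; send the rest to full network-upgrade studies.
for req in sorted(queue, key=screening_score):
    track = "fast-track" if screening_score(req) < 0.5 else "full study"
    print(f"{req.project_id}: score={screening_score(req):.2f} -> {track}")
```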

Strategic Implications for Data Centers and the AI Economy

For the data center industry—especially as AI workloads dramatically reshape infrastructure demand—Google’s announcement is more than a technical achievement. It’s a signal that the rules of engagement for grid interaction are changing. As hyperscalers seek not only to power their operations sustainably but also to help shape the energy systems around them, partnerships like this one may become a template.

Google is also backing complementary technologies such as advanced nuclear and enhanced geothermal, with the long-term goal of unlocking new, firm capacity. These efforts align with the industry’s growing push for direct grid participation and innovative procurement strategies to manage skyrocketing power needs.

As Porat noted, “Creative solutions from across the private and public sectors are crucial to ensure the U.S. has the energy capacity, affordability, and reliability needed to capitalize on the opportunity for growth.”

Backing PJM’s Long-Term Planning Reforms

This collaboration arrives at a critical juncture for PJM, which is already deep into a multi-year effort to reform its planning and interconnection processes. The AI-powered partnership with Google and Tapestry is designed to complement and accelerate this work—especially as PJM processes the final 67 GW of projects remaining in its interconnection transition phase, part of a broader 200 GW backlog.

“Innovation will be critical to meeting the demands on the future grid, and we’re leveraging some of the world’s best capabilities with these cutting-edge tools,” said Aftab Khan, Executive Vice President of Operations, Planning & Security at PJM. “PJM is committed to bringing new generation onto the system as quickly and reliably as possible.”

PJM plans to launch a new cycle-based process for interconnection applications in early 2026, and the AI partnership is expected to play a foundational role in that effort. As part of its broader grid modernization push, PJM has also rolled out the Reliability Resource Initiative, aimed at expediting selected projects within its current queue.

Tapestry General Manager Page Crahan described the effort as one that “will enable PJM to make faster decisions with greater confidence, making more energy capacity available to interconnect in shorter time frames.” For Google, this isn’t just about grid optimization—it’s a strategic necessity for a digital economy whose energy appetite is growing exponentially.

“This initiative brings together our most advanced technologies to help solve one of the greatest challenges of the AI era—evolving our electricity systems to meet this moment,” said Amanda Peterson Corio, Head of Data Center Energy for Google.

Looking Ahead: A Blueprint for Grid Innovation

Google’s collaboration with PJM and Tapestry represents more than a software upgrade. It’s an architectural rethinking of how intelligence and infrastructure must co-evolve. At the heart of this shift is the belief that AI isn’t just a driver of data center demand—it may also be the key to making that demand sustainable.

By aligning cutting-edge AI innovation with PJM’s operational depth and Tapestry’s moonshot ambition, this partnership lays the groundwork for something much larger: a national model for grid modernization. It represents a fusion of deep tech, institutional coordination, and real-world urgency—the very factors that will define the power landscape of the AI era.

For the data center industry and beyond, it’s a clear signal that the future grid won’t just be bigger—it will need to be smarter, faster, and more adaptive to the surging complexity of energy demand. For data center operators, energy developers, and policymakers, this initiative offers a compelling glimpse of what that grid could look like, with AI at the helm.

Hyperscalers’ Growing Role in Grid Modernization: Expanding AI-Driven Initiatives

As Google embarks on its bold collaboration to modernize the U.S. power grid through artificial intelligence, other hyperscalers are following suit with initiatives aimed at addressing the challenges posed by an increasingly strained energy infrastructure. The intersection of AI, data centers, and energy resilience is rapidly emerging as a central focus for major players like Microsoft, Amazon, and Meta, which are aligning their strategies to accelerate grid modernization and optimization.

Microsoft’s AI-Powered Grid Optimization

Microsoft is another hyperscaler at the forefront of AI applications in grid management. The company has been exploring the potential of AI for grid optimization as part of its broader commitment to sustainability and energy efficiency.

In partnership with the Bonneville Power Administration (BPA) and other utility providers in the Pacific Northwest, Microsoft is leveraging AI to forecast and balance electricity demand across the region. The initiative, known as the “Grid Optimization Project,” aims to reduce energy waste and enhance grid reliability by predicting shifts in energy consumption with unprecedented accuracy.

By applying machine learning algorithms to real-time grid data, Microsoft’s AI tools can anticipate fluctuations in renewable energy generation, such as solar and wind, and adjust load distribution accordingly. The goal is to integrate renewable energy more seamlessly into the grid while maintaining stability and avoiding blackouts.
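
Microsoft has not published the models behind this work, so the snippet below is only a minimal sketch of the pattern described above: estimate near-term renewable output from recent telemetry so that dispatch and flexible load can be adjusted ahead of time. The function name, weighting scheme, and sample readings are assumptions for illustration; a production forecaster would incorporate weather data, seasonality, and plant availability.

```python
# Illustrative sketch only -- not Microsoft's or BPA's actual system.
# A naive short-term forecast of renewable output from recent telemetry.

def forecast_next_interval(recent_output_mw: list[float]) -> float:
    """Weighted moving average in which newer readings count more.
    A real forecaster would add weather features, seasonality, and plant state."""
    weights = range(1, len(recent_output_mw) + 1)
    return sum(w * x for w, x in zip(weights, recent_output_mw)) / sum(weights)

# Assumed 5-minute wind telemetry in MW, newest reading last.
wind_telemetry_mw = [410.0, 385.0, 360.0, 330.0]
print(f"Expected next-interval wind output: {forecast_next_interval(wind_telemetry_mw):.0f} MW")
```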

In addition, Microsoft has committed to providing its AI solutions to help utilities across the U.S. improve grid flexibility and resilience, positioning itself as a key player in transforming the power sector through digital infrastructure.

Amazon’s Renewable Integration and Demand Response

Amazon has also recognized the critical role AI will play in the future of grid modernization. Through the Amazon Web Services (AWS) platform, the company is actively developing AI models to enhance renewable energy integration and optimize energy consumption for its massive network of data centers.

As part of its commitment to reaching net-zero carbon emissions by 2040, Amazon is using AI to balance energy use and improve grid demand response, particularly in areas where renewable energy penetration is high and intermittency poses challenges.

One of Amazon’s standout efforts is its partnership with the California Independent System Operator (CAISO) to develop an AI-based energy management platform that predicts and mitigates the risks associated with renewable energy volatility. The system not only helps Amazon data centers adjust their energy usage during periods of low supply but also assists CAISO in managing grid congestion by offering real-time insights into demand patterns.
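
The AWS/CAISO platform itself is not publicly documented, so the following is a purely hypothetical sketch of the response side of such a system: when the grid operator reports a supply margin below a comfort threshold, a data center pauses deferrable load in proportion to the shortfall. The signal names, thresholds, and megawatt figures are invented for illustration only.

```python
# Hypothetical illustration -- not the actual AWS or CAISO platform.
# Curtail deferrable data center load when the reported supply margin is tight.

def plan_curtailment(margin_mw: float, tight_threshold_mw: float,
                     deferrable_load_mw: float) -> float:
    """Return MW of deferrable load (batch jobs, pre-cooling, etc.) to pause."""
    if margin_mw >= tight_threshold_mw:
        return 0.0  # grid is comfortable; run everything
    shortfall_fraction = 1.0 - margin_mw / tight_threshold_mw
    return round(deferrable_load_mw * shortfall_fraction, 1)

# Example: the operator reports 1,200 MW of spare supply against a 3,000 MW comfort band.
curtail = plan_curtailment(margin_mw=1200.0, tight_threshold_mw=3000.0,
                           deferrable_load_mw=50.0)
print(f"Defer {curtail} MW of flexible data center load")
```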

Amazon’s continued innovation in AI-driven energy solutions reflects the company’s broader strategy to decarbonize its operations while ensuring the reliability and efficiency of the power systems it relies on.

Meta’s Strategic Energy Investments and AI Integration

Meta is similarly exploring AI applications in grid management, but with a focus on accelerating the transition to renewable energy sources for its data centers. Building on its commitment to match 100% of its global operations with renewable energy, and its goal of reaching net-zero emissions across its value chain by 2030, Meta has invested in AI technologies designed to optimize energy procurement and minimize carbon emissions.

Through its partnerships with several utility providers, Meta is using AI to predict energy demand and automate the process of sourcing clean energy at the lowest cost. Meta’s AI system integrates data from smart grids and renewable energy sources to create an efficient energy portfolio, enabling Meta to adjust its data center energy usage in real time.
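
Meta has not described this system in technical detail, so as a rough illustration of sourcing clean energy at the lowest cost, the sketch below simply fills a forecast hourly demand from a set of contracted clean-energy offers in merit (price) order. The offer names, volumes, and prices are assumptions; a real portfolio optimizer would also handle delivery constraints, contract terms, and carbon accounting.

```python
# Illustrative sketch only -- not Meta's procurement system.
# Greedy merit-order fill of an hourly demand from contracted clean-energy offers.

def least_cost_mix(demand_mwh: float,
                   offers: list[tuple[str, float, float]]) -> list[tuple[str, float]]:
    """offers: (source name, available MWh, price in $/MWh). Cheapest offers first."""
    plan, remaining = [], demand_mwh
    for name, available, _price in sorted(offers, key=lambda o: o[2]):
        take = min(available, remaining)
        if take > 0:
            plan.append((name, take))
            remaining -= take
    return plan  # any remaining shortfall would fall back to grid purchases

offers = [("wind_ppa", 60.0, 28.0), ("solar_ppa", 40.0, 22.0), ("storage_discharge", 25.0, 45.0)]
print(least_cost_mix(90.0, offers))  # solar is filled first, then wind; storage is not needed
```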

The company is also investigating AI’s potential in demand-side management, which allows energy consumers to influence grid stability and optimize usage based on fluctuating supply. With its AI-powered solutions, Meta aims to demonstrate how large-scale energy consumption can be made more adaptable to the changing dynamics of the grid.

Strategic Implications for the Data Center Industry

For the data center industry, these AI-driven initiatives represent a new paradigm in grid interaction and energy management. As hyperscalers increasingly integrate AI into their operations, they are positioning themselves not only as innovators in energy optimization but also as active contributors to broader grid modernization efforts.

By creating smarter, more adaptive energy ecosystems, hyperscalers are paving the way for a more resilient grid capable of meeting the surging demand for energy from digital infrastructure. The growing role of hyperscalers in grid modernization also highlights the broader trend of digital infrastructure and energy systems co-evolving.

As AI continues to drive advancements in both data center operations and energy grid management, these companies are well-positioned to influence the future of power distribution and generation. The efforts made by Google, Microsoft, Amazon, and Meta underscore a pivotal shift: that AI is not only a tool for powering the digital economy but also a critical enabler of sustainable and resilient energy systems for the future.

In this context, collaborations like Google’s with PJM and Tapestry are more than just technical partnerships—they signal a new approach to energy management in the AI era. For the data center industry and the grid operators that serve it, this intersection of digital and energy infrastructure is likely to define the future of how power is distributed, optimized, and consumed at scale.
