
Grid planners and experts on why markets keep choosing renewables


As electricity demand grows, alongside wind and solar’s share in the U.S. energy mix, concerns about renewables’ reliability are being raised more frequently — including at the highest levels of the federal government.

“With the electricity grid, you have to match supply and demand in every moment in time,” Energy Secretary Chris Wright said recently on Fox News. “With wind and solar, you don’t know when they’re going to be there, and you don’t know when they’re going to go away.”

Wright went on to say the development of renewables has led to an “extra distribution grid” that has raised energy prices. 

But utility planners, grid operators and analysts say wind, solar and batteries are an important part of an evolving power system in which intermittent resources can be reliably scheduled and called upon using sophisticated software and other tools.

They also point to renewables’ low levelized cost of electricity and to their competitiveness in automated energy markets, which select the least-cost units to run in each hour.

“System operators don’t decide whether resources bidding into the market are good or bad,” said Rob Gramlich, president of energy sector consultant Grid Strategies, in an interview.

“There is no central decision maker,” he said. “Markets don’t play favorites.”

[Image: ERCOT’s multi-resource price-selected portfolio for August 20, 2025, at 5:50 PM CDT. Source: ERCOT dashboard.]

With generators retiring, demand rising, and construction, financing, permitting and supply chain challenges growing, operators and analysts acknowledged concerns over the future of the U.S. power system. Many called for diversification of resources, including renewables and storage, to protect electricity reliability and affordability.

“The objective of planning is a portfolio of diverse resources at the least cost possible to avoid outsized impacts from any single one,” said Michael Eugenis, Arizona Public Service’s director of resource planning.

APS is pursuing renewables alongside more natural gas to maintain reliability in its territory, he said. 

“That is not the same as a second set or a shadow set of resources,” he added. 

Automated markets are selecting renewables

One of the major concerns cited by Wright and others is the intermittent nature of renewables and the impact of that on grid reliability.

One of the ways grid operators measure reliability is using a metric called effective load carrying capability, or ELCC. 

ELCC is a complex calculation that compares the portion of a resource’s nameplate capacity it can be counted on to deliver, based on decades-long simulations that account for the factors that cause output to vary, like performance, supply and demand. For fossil resources, variability comes from things like maintenance outages and fuel supply issues; for wind and solar, it comes mainly from weather.

ELCCs vary by region and by each system’s portfolio makeup, but nuclear power generally has the highest ELCC, meaning that on average it delivers the highest percentage of its nameplate capacity when the system needs it. 

In the PJM Interconnection, the country’s largest grid, nuclear was assigned an ELCC of 95% in its most recent capacity auction.

Fixed-tilt solar had the lowest at 8%, and tracking solar was 11%. Onshore wind was 41%, offshore was 69%; storage was between 50% and 72%; gas was between 60% and 78%, and coal was 83%. 
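As a toy sketch of the idea (greatly simplified from the probabilistic models grid operators actually run, with all numbers invented), ELCC can be framed as the extra constant load a system can carry after adding a resource while keeping loss-of-load hours unchanged:

```python
import random

# Toy ELCC illustration: NOT any grid operator's methodology.
# ELCC here = the extra MW of constant load the system can serve after
# adding a resource, without increasing loss-of-load hours.

def lol_hours(load, supply):
    """Count hours where demand exceeds available supply."""
    return sum(1 for l, s in zip(load, supply) if l > s)

def elcc(load, base_supply, resource_output, step=0.5):
    base_risk = lol_hours(load, base_supply)
    added = [b + r for b, r in zip(base_supply, resource_output)]
    extra = 0.0
    # Raise load until reliability degrades back to the base level.
    while lol_hours([l + extra + step for l in load], added) <= base_risk:
        extra += step
    return extra

random.seed(0)
hours = 1000
load = [random.uniform(60, 100) for _ in range(hours)]   # MW, invented
firm = [95.0] * hours                                    # always-on fleet
solar = [random.choice([0, 0, 40]) for _ in range(hours)]  # intermittent

credit = elcc(load, firm, solar)
print(credit)  # extra MW of load the 40 MW (nameplate) solar can support
```

Because the hypothetical solar resource is unavailable two-thirds of the time, its capacity credit comes out far below its 40 MW nameplate, mirroring why real solar ELCCs land in the single digits.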


“System operators don’t decide whether resources bidding into the market are good or bad … Markets don’t play favorites.”

Rob Gramlich

president of Grid Strategies


But despite their relatively low ELCCs, automated markets are consistently choosing renewables over other resources because they are cheaper at the time the grid needs them.

“The automated market mechanisms use all the relevant variables that affect load to pick the cheapest resources,” said Richard Doying, Grid Strategies vice president and a former executive vice president for market and grid strategy at the Midcontinent Independent System Operator.

Those resources are bid out, and “then the next cheapest resources, and so on,” Doying said. 

“The markets are selecting wind and solar despite their low ELCCs and low capacity values because the return justifies the investment,” he added. 
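The selection logic Doying describes is merit-order dispatch. As a rough illustration (not any operator’s actual software; the bids and prices below are invented), the market clears the cheapest offers first until demand is met, and the last unit dispatched sets the clearing price:

```python
# Toy merit-order dispatch with hypothetical bids.
# Each bid: (resource name, offer price in $/MWh, available MW).

def clear_market(bids, demand_mw):
    dispatched = []
    remaining = demand_mw
    clearing_price = 0.0
    # Take the cheapest offers first until demand is satisfied.
    for name, price, mw in sorted(bids, key=lambda b: b[1]):
        if remaining <= 0:
            break
        take = min(mw, remaining)
        dispatched.append((name, take))
        clearing_price = price  # marginal unit sets the price
        remaining -= take
    return dispatched, clearing_price

bids = [
    ("wind", 0.0, 300),       # near-zero marginal cost
    ("solar", 0.0, 200),
    ("gas_ccgt", 35.0, 400),
    ("gas_peaker", 120.0, 150),
]
schedule, price = clear_market(bids, demand_mw=600)
print(schedule)  # wind and solar fully dispatched; gas fills the gap
print(price)     # 35.0: the marginal gas unit sets the price
```

Because wind and solar bid at near-zero marginal cost, they clear first whenever they are available; the expensive peaker runs only when everything cheaper is exhausted.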

When wind and solar are generating, they are often cheaper than fossil fuel power.

One reason for this is that the fuel and maintenance costs for fossil-based resources are significant, continuous and volatile. Natural gas prices, for example, have nearly doubled since 2024, but are still lower than they were in 2022 following Russia’s invasion of Ukraine.

The fuels for wind and solar are cost-free, maintenance outage costs are significantly less, and the capital expenditures to build projects are typically amortized over 20 years.

“If natural gas was the cheapest option to meet the peak, the markets would select it,” said Sean Kelly, co-founder and CEO of forecast provider Amperon and a former energy analyst. 

When the sun’s not shining

Kelly said today’s advanced load and weather forecasting enhances the reliability of a resource portfolio with a high renewables penetration. 

While good forecasting can alleviate much of the need for backup generation, he said, rising demand for power will likely require more firm generation to fill the gaps when there is no sun or wind.

The ability to choose the cheapest available power source in real time is why grid operators say renewables are an important part of the same, increasingly flexible power system — not a separate system.

“We don’t have different systems; we have a portfolio of resources and capabilities that meet the needs of the system,” said Mark Rothleder, senior vice president and COO for the California Independent System Operator, in an interview. 

California has one of the highest renewables penetrations in the United States. California also has some of the highest electricity rates. Some observers link those two things to argue that renewables drive up energy costs.

[Image: CAISO’s multi-resource price-selected portfolio for August 20, 2025, at 4:05 PM PDT. Source: CAISO dashboard.]

But other states with high levels of renewable power have comparatively low rates compared to the national average. 

In Iowa, for example, wind turbines generated 59% of the state’s electricity in 2023, according to the U.S. Energy Information Administration. Iowa was also among the 10 states with the lowest average electricity prices that year. 

While solar and wind are not always available locally when demand is high, storage penetrations are rising and some states are moving toward more regional energy markets where cheaply produced renewable energy can be more easily traded across state boundaries.

The New York Independent System Operator said in a recent report that although “duration-limited” generators like wind and solar require standby resources when the availability of the renewable fleet diminishes, “no generating type operates at full power, full time.”

Thinking of resources as either intermittent or backup oversimplifies the complexity of the grid, experts said. 

“Wind and solar don’t have to produce 100% of the time because people aren’t consuming all available electricity 100% of the time,” said Beth Garza, a senior fellow at the R Street Institute think tank and former ERCOT market monitor CEO.

“It is easy to vilify solar and wind,” Garza said. But regardless of the resource mix, “the average electricity load is only about half the peak demand.”

When demand peaks

Power systems have long been required to have reserve margins over the highest expected peak demand, and the growth of wind and solar has not changed that, said Julia Matevosyan, an associate director and chief engineer with the Energy Systems Integration Group and a former ERCOT planning engineer. 

But advancements in software and market shifts have democratized incentives for energy production, conservation and storage in ways that can smooth demand peaks and deploy resources as needed. 

This flexibility can optimize wind, solar and batteries to deliver low cost electricity more reliably than “a more limited set of traditional resources,” Matevosyan said.

Over the summer, California deployed what some advocates have called the largest virtual power plant in the world when several aggregators discharged 539 MW of average output from more than 100,000 customer-sited batteries during peak demand between 7 and 9 p.m.

A Brattle Group study of the deployment commissioned by Sunrun and Tesla Energy, both of which participated in the VPP program, concluded that it could reduce the need for gas peaker plants and potentially save ratepayers $206 million between 2025 and 2028.

Ryan Hledik, one of the report’s authors, said the VPP makes better use of assets that have already been deployed.

“There are a growing number of examples of VPPs scaling faster and at a lower cost than conventional resources,” Hledik said. 

[Image: MISO’s multi-resource price-selected portfolio for August 20, 2025, at 5:50 PM EST. Source: MISO dashboard.]

Renewables continue to dominate new generation as demand rises

Demand is rising for the first time in two decades. Last year, the U.S. used more electricity than ever before, and the EIA projects demand to grow more than 2% per year until at least 2026. Farther out, the predictions vary widely and are highly dependent on the growth of data centers, electrification and manufacturing.

The vast majority of new generation resources coming online now are renewable, with solar in the lead, followed by wind. In the first seven months of 2025, solar accounted for over 16 GW of the 21.5 GW added to the U.S. power system, while wind accounted for almost 3.3 GW and natural gas for 2.2 GW, the September Federal Energy Regulatory Commission Infrastructure Update reported.

Federal tax credits and other incentives, now being phased out, boosted renewable development. But experts say that even without that support, the economics favor renewables. 


“Wind and solar don’t have to produce 100% of the time because people aren’t consuming all available electricity 100% of the time.”

Beth Garza

senior fellow at R Street Institute and former ERCOT market monitor CEO.


Utility-scale solar and onshore wind remain “the most cost-effective forms of new-build energy generation on an unsubsidized basis,” according to Lazard’s levelized cost of electricity analysis. Calculating a technology’s LCOE involves its capital cost, fuel cost, capacity factor and other values that vary by location and time.
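At its core, the LCOE calculation divides discounted lifetime costs by discounted lifetime generation. A rough sketch with invented inputs (real LCOE models, like Lazard’s, also account for financing structure, taxes and panel degradation):

```python
# Minimal LCOE sketch: discounted lifetime costs over discounted
# lifetime generation. All inputs below are illustrative assumptions.

def lcoe(capex, annual_opex, annual_mwh, years, discount_rate):
    """Levelized cost of electricity in $/MWh."""
    disc_costs = capex  # upfront capital, year 0
    disc_energy = 0.0
    for t in range(1, years + 1):
        factor = (1 + discount_rate) ** t
        disc_costs += annual_opex / factor
        disc_energy += annual_mwh / factor
    return disc_costs / disc_energy

# Hypothetical 100 MW solar plant at a 25% capacity factor,
# amortized over 20 years at a 7% discount rate:
annual_mwh = 100 * 8760 * 0.25
solar_lcoe = lcoe(capex=90_000_000, annual_opex=1_500_000,
                  annual_mwh=annual_mwh, years=20, discount_rate=0.07)
print(round(solar_lcoe, 2))  # $/MWh
```

Because wind and solar have no fuel term in the numerator, their LCOE is dominated by upfront capital, which is why low capacity factors still yield competitive per-MWh costs.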

Recently, renewable trade groups have warned of a slowdown in the solar industry. But alternative resources, like natural gas, face their own challenges, from lengthy planning and approval processes to rising fuel prices and years-long equipment backlogs.

“The newest, most efficient natural gas peaker plants are expensive to build and are likely to become more expensive to run because of competition for natural gas,” Garza said. “The market’s answer to load growth is still to build more wind, solar, and batteries because they are cheaper and faster to build.”

Michelle Solomon, a manager of electricity policy at Energy Innovation, offered a similar observation. 

Utility planning models “have largely been choosing wind and solar and batteries for several years,” she said. “A diverse set of resources is most likely to be the cheapest portfolio.”

resource portfolio

PJM’s multi-resource price-selected portfolio for August 20, 2025, at 6:00 PM EDT

PJM. (2025). “dashboard” [jpg]. Retrieved from dashboard.
Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Zscaler, café-inspired branch networks, and mobile security

In Japan, I met with Nathan Howe, senior vice president of innovation and product management for Zscaler, and talked to him about the Zscaler Cellular service. Without getting into the technical nuances, the service works by integrating zero trust into the mobile network. This makes it ideally suited to secure

Read More »

Civitas Weighs Merger With Permian Basin Rival

Civitas Resources Inc., an oil and gas explorer that has been weighing a sale, is considering a merger with Permian Basin rival SM Energy Co., according to people familiar with the matter.  Civitas has been discussing a deal with SM that wouldn’t include a premium and would be structured as a merger of equals as it explores strategic options, said the people, who asked to not be identified because the details aren’t public. No deal has been finalized and other parties are circling Civitas, the people added.  Representatives for Civitas and SM Energy declined to comment.  If a transaction is consummated, the combined company would be worth at least $14 billion, including debt, making it one of the year’s biggest oil and gas deals, according to data compiled by Bloomberg.  The Permian Basin of West Texas and New Mexico — the largest and most productive US oil field — has seen a blitz of merger activity in recent years as small players pair up to gain scale and major operators look for a toehold. In August, Crescent Energy Co. agreed to buy Permian Basin rival Vital Energy Inc. for $3.1 billion. This deal would bring together two of the region’s midsize, public players. Civitas, with a market value of about $3.2 billion, produces oil across about 140,000 net acres throughout the basin, according to an investor presentation in August. SM has a market value of about $2.9 billion and about 109,000 acres in a well-developed swathe of the Permian known as the Midland Basin. SM has an enterprise value of about $5.5 billion while Civitas is worth about $8.5 billion, including debt. EOG Resources Inc.’s $5.6 billion takeover of Encino Acquisition Partners is the largest deal announced this year in the US oil and gas exploration sector, according to data compiled by Bloomberg.  The companies’ operations

Read More »

Oil Slips as Middle East Tensions Ease

Oil edged lower as traders focused on cooling tensions in the Middle East and broader markets struck a more cautious tone. West Texas Intermediate fell 1.7% to settle below $62 a barrel while Brent closed near $65. Israel has begun implementing a ceasefire deal in Gaza after it reached an agreement with Hamas for the the release of all the hostages it holds, a major step toward ending a two-year war that’s loomed over flows from the Middle East, the source of a third of the world’s crude. After a dip lower at the start of the month, crude has edged back toward the $62 to $67 band in which it traded for weeks at the end of the summer. The Organization of the Petroleum Exporting Countries and its allies are ramping up supplies, but so far the impact on prices has been limited by China hoarding barrels. “Sentiment remains subdued, weighed down by concerns over a sizable fourth-quarter surplus and fears that Chinese crude buying is slowing,” said Rebecca Babin, a senior energy trader at CIBC Private Wealth Group. “Trading likely stays rangebound with a mild downside bias, particularly if broader risk assets come under pressure.” The commodity also moved lower in tandem with wider markets without any strong new indicators on supply and demand. The dollar strengthened, making commodities priced in the currency less attractive. Offering a floor to prices, the US Treasury Department sanctioned more than 50 individuals, entities and vessels that “facilitate” Iranian oil and liquefied petroleum gas sales and shipments from the country. Traders will be following whether an end to fighting in Gaza will impact the status of restrictions against Iran, which backs Hamas. Many Wall Street banks and other observers including the International Energy Agency have predicted the market will move into a

Read More »

Newsom signs 1 bill to speed geothermal approvals, vetoes another

Gov. Gavin Newsom, D-Calif., signed into law Monday a bill that will expand the California Energy Commission’s streamlined certification program to allow it to speed approvals for geothermal energy, but he vetoed a bill to speed approvals of well-drilling for geothermal exploratory projects. “In addition to delaying much-needed regulations that are already in process, the [Geologic Energy Management] Division would need to substantially increase fees on geothermal operators to implement the new requirements imposed by the bill,” Newsom wrote in his veto message for the second bill, AB527. Newsom said in the message that while he supports the expansion of geothermal energy in California “as a much-needed source of baseload clean power,” he thinks that the increased fees would disincentivize geothermal development “beyond any incentive provided by a [California Environmental Quality Act] exemption for one part of a project’s permitting process.” The governor also noted that he signed AB1359 last year, which reformed the approval process for geothermal exploration, and he thinks it is “prudent that we understand the effects of these changes before granting wholesale CEQA exemptions with costly and complicated conditions.” AB527 would have allowed geothermal exploratory projects that GEMD deems to meet certain criteria to also be presumed to “have satisfied the requirements of CEQA for the geothermal exploratory project, including to support the issuance of any permit, funding, or other approval by a state or local agency, as provided.” AB531, which Newsom signed, makes geothermal projects one of the types of projects that — once the state Energy Commission certifies them as an “environmental leadership development project” — can benefit from a streamlined approval process under CEQA. The other projects that can receive that certification are solar or wind farms generating 50 MW or more, energy storage systems capable of storing 200 MWh or more, power plants 50 MW or

Read More »

Citigroup Flags Oil Market’s Bearish Consensus

The broad mood in the oil market remains bearish, although there are discrepancies as to how gloomy crude’s prospects are, according to Citigroup Inc., summarizing views from clients in North America and Europe. “Conviction differs on the depth of downside,” analysts including Francesco Martoccia said in a note. “Some clients doubt that a price floor at $60 a barrel for Brent crude oil would be enough to induce a supply-and-demand reaction to balance a global liquids market generally seen heading for a surplus.” Oil prices have shed more than 10% this year, with global benchmark Brent posting back-to-back monthly losses in August and September. The weakness has been driven largely by expectations that supplies will run ahead of demand as OPEC+ loosens output curbs and rival drillers also step up production. Still, stockpiling by China has acted to support the market, with inventory builds so far seen concentrated away from the market’s main pricing centers. “Other clients expect a more moderate, orderly price correction, arguing that projected stock builds could continue to accumulate outside of key pricing hubs, certainly ex-Cushing,” the analysts said, referring to the storage hub in Oklahoma that’s the physical delivery point for West Texas Intermediate. The Organization of the Petroleum Exporting Countries and its allies endorsed another quota hike last weekend, although the increment — 137,000 barrels a day for November’s production — was smaller than some of the sums that had been reported in the run-up to the gathering. “Today’s slower non-OPEC+ growth and greater OPEC+ optionality, along with heightened geopolitical risks looming on large producers” such as Russia and Iran, could temper the pace of price adjustment, the analysts said. Brent futures — which tumbled 8% last week ahead of the OPEC+ supply decision — traded slightly lower at $65.80 a barrel on Thursday. “Within the energy complex, consensus

Read More »

USA Crude Oil Stocks Rise by Almost 4 Million Barrels WoW

U.S. commercial crude oil inventories, excluding those in the Strategic Petroleum Reserve (SPR), increased by 3.7 million barrels from the week ending September 26 to the week ending October 3, the U.S. Energy Information Administration (EIA) highlighted in its latest weekly petroleum status report. That report, which was released on October 8 and included data for the week ending October 3, showed that crude oil stocks, not including the SPR, stood at 420.3 million barrels on October 3, 416.5 million barrels on September 26, and 422.7 million barrels on October 4, 2024. The report highlighted that data may not add up to totals due to independent rounding. Crude oil in the SPR stood at 407.0 million barrels on October 3, 406.7 million barrels on September 26, and 382.9 million barrels on October 4, 2024, the report revealed. Total petroleum stocks – including crude oil, total motor gasoline, fuel ethanol, kerosene type jet fuel, distillate fuel oil, residual fuel oil, propane/propylene, and other oils – stood at 1.694 billion barrels on October 3, the report highlighted. Total petroleum stocks were down 0.9 million barrels week on week and up 52.2 million barrels year on year, the report showed. “At 420.3 million barrels, U.S. crude oil inventories are about four percent below the five year average for this time of year,” the EIA said in its latest weekly petroleum status report. “Total motor gasoline inventories decreased by 1.6 million barrels from last week and are about one percent below the five year average for this time of year. Finished gasoline inventories increased, while blending components inventories decreased last week,” it added. “Distillate fuel inventories decreased by 2.0 million barrels last week and are about six percent below the five year average for this time of year. Propane/propylene inventories decreased by 2.9 million

Read More »

To keep energy affordable, Virginia must embrace power line innovation

Jeff Dennis is executive director of the Electricity Customers Alliance, a coalition that advocates for customer-centric solutions to grid modernization and energy affordability challenges. A modern economy requires modern energy solutions to meet the growing energy demand created by reshored manufacturing, increased electrification of homes, businesses, and vehicles, and — of course — data centers. Virginia and the Mid-Atlantic house the data center capital of the Western world, and the Commonwealth is positioned to lead America’s push to win the global artificial intelligence race. Data center growth can deliver many economic benefits to communities. However, this increased electricity demand requires smartly planned power infrastructure and a focus on short- and long-term solutions to address consumers’ concerns about their power bills. Planning and building electricity infrastructure to meet rapidly growing demand while keeping electricity affordable for all consumers will take time. However, data centers and manufacturers need power now to win the AI race and capture new economic opportunities in manufacturing and the digital economy. Virginia must capitalize on short-term opportunities to expand and maximize the use of our existing grid assets to deliver more power while setting the foundation for smart, long-term buildout of our shared electricity grid. To address this challenge, the General Assembly passed, and Gov. Glenn Youngkin signed, legislation to encourage deployment of advanced transmission technologies that maximize the amount of power that can be delivered from our existing lines. Beginning in 2026, the legislation requires the State Corporation Commission to consider the use of new power line technologies called “advanced conductors” when deciding how to bring more power supply onto the grid. 
Many other Mid-Atlantic states have followed Virginia’s lead in encouraging adoption of these technologies to help meet our needs today while establishing a foundation for an efficient future electricity system. That’s great news for consumers —

Read More »

Cisco seriously amps-up Silicon One chip, router for AI data center connectivity

Some say deep buffers shouldn’t be used to handle this type of traffic; the contention is that these buffers fill and drain, creating jitter in the workloads, and that slows things down, Chopra told Network World. “But the real source of that challenge is not the buffers. It’s a poor congestion management scheme and poor load balancing with AI workloads, which are completely deterministic and predictable. You can actually proactively figure out how to place flows across the network and avoid the congestion,” he said. The 8223’s deep-buffer design provides ample memory to temporarily store packets during congestion or traffic bursts, an essential feature for AI networks where inter-GPU communication can create unpredictable, high-volume data flows, according to Gurudatt Shenoy, vice president of Cisco Provider Connectivity. “Combined with its high-radix architecture, the 8223 allows more devices to connect directly, reducing latency, saving rack space, and further lowering power consumption. The result is a flatter, more efficient network topology supporting high-bandwidth, low-latency communication that is critical for AI workloads,” Shenoy wrote in a blog post. NOS options Notably, the first operating systems that the 8223 supports are the Linux Foundation’s Software for Open Networking in the Cloud (SONiC) and Facebook open switching system (FBOSS) – not Cisco’s own IOS XR.  IXR will be supported, too, but at a later date, according to Cisco.  SONiC decouples network software from the underlying hardware and lets it run on hundreds of switches and ASICs from multiple vendors while supporting a full suite of network features such as Border Gateway Protocol (BGP), remote direct memory access (RDMA), QoS, and Ethernet/IP. One of the keys to SONiC is its switch-abstraction interface, which defines an API to provide a vendor-independent way of controlling forwarding elements such as a switching ASIC, an NPU, or a software switch in a uniform

Read More »

Utilities Race to Meet Surging Data Center Demand With New Power Models

Over the last 18 months or so, the energy generation industry and its public utilities have been significantly impacted by the AI data center boom. It has been demonstrated across North America that the increase in demand for power, as driven by the demand for hyperscale and AI data centers, greatly exceeds the ability of the industry to actually generate and deliver power to meet the demand. We have covered many of the efforts being made to control the availability of power. In response, utilities and regulators have begun rethinking how to manage power availability through means such as: temporary moratoriums on new data center interconnections; the creation of new rate classes; cogeneration and load-sharing agreements; renewable integration; and power-driven site selection strategies.  But the bottom line is that in many locations utilities will need to change the way they work and how and where they spend their CAPEX budgets. The industry has already realized that their demand forecast models are hugely out of date, and that has had a ripple effect on much of the planning done by public utilities to meet the next generation of power demand. Most utilities now acknowledge that their demand forecasting models have fallen behind reality, triggering revisions to Integrated Resource Plans (IRPs) and transmission buildouts nationwide. This mismatch between forecast and actual demand is forcing a fundamental rethink of capital expenditure priorities and long-term grid planning. Spend More, Build Faster Utilities are sharply increasing CAPEX and rebalancing their resource portfolios—not just for decarbonization, but to keep pace with multi-hundred-megawatt data center interconnects. This trend is spreading across the industry, not confined to a few isolated utilities. Notable examples include: Duke Energy raised its five-year CAPEX plan to $83 billion (a 13.7% increase) and plans to add roughly 5 GW of natural gas capacity

Read More »

Duos Pairs Mobile Power and Modular Edge Data Centers for Rapid Texas Rollout

Duos Technology Group has launched the fifth of its AI edge data centers, part of a plan to deploy 15 units by the end of 2025. The projects are executed through Duos Edge AI, a subsidiary focused on modular, rapidly installed edge data centers (EDCs) in underserved markets, beginning with school districts and regional carrier hubs across Texas. The newest site is being deployed on-premises with the Dumas Independent School District in Dumas, Texas. High-Density Edge Design Duos’ EDCs emphasize very high rack densities (100 kW+ per rack), SOC 2 Type II compliance, N+1 power with dual generators, and a 90-day build/turn-up cycle. Each site is positioned approximately 12 miles from end users, cutting latency for real-time workloads. To meet the power demands of these edge deployments, Duos formed Duos Energy and partnered with Fortress/APR Energy to deliver behind-the-meter mobile gas turbines. This approach allows compute to go live in 90 days without waiting years for utility interconnection upgrades. The goal is straightforward: move power and compute close to demand, with rapid deployment. Duos’ modular pods are designed for exurban and rural locations as localized compute hubs for carriers, schools, healthcare systems, and municipal users. The rugged design pairs high-density racks with the short deployment cycle and proximity targeting, enabling a wide range of applications. With Dumas ISD now live, Duos has five sites in Texas, including Amarillo/Region 16, Victoria/Region 3, Dumas ISD, and multiple Corpus Christi locations. Mobile Power vs. Modular Compute While Duos doesn’t consistently describe its data center units as “mobile,” they are modular and containerized, engineered for rapid, site-agnostic deployment. The “mobile” label more explicitly applies to Duos’ power strategy—a turbine fleet that can be fielded or re-fielded to match demand. 
From an operator’s perspective, the combined proposition functions like a mobile platform: pre-integrated compute pods

Read More »

Report: AMD could be Intel’s next foundry customer

[ Related: More Intel news and insights ] AMD has lagged behind Nvidia in the AI business but has done well in the federal supercomputing business, holding numerous top spots with supercomputers like El Capitan and Frontier. Manufacturing its chips in the United States would be a good way to get the Trump administration off its back given its push for domestic manufacturing of semiconductors. The Trump administration is pushing for 50% of chips sold in America to be manufactured domestically, and tariffs on chips that are not. It also faces outbound restrictions. Earlier this year, AMD faced export restrictions GPUs meant for China as part of U.S. export controls against China’s AI business. “I believe this is a smart move by AMD to secure capacity in the local market without fighting against Nvidia and Apple and their deeper pockets for the limited capacity at TSMC,” said Alvi Nguyen, senior analyst with Forrester Research.” With the US investment in Intel, followed by Nvidia, this is can be seen as diversifying their supply chain and providing cheaper, locally sourced parts.” For Intel, this will continue a streak of good news it has enjoyed recently. “Having customers take up capacity at their foundries will go a long way in legitimizing their semiconductor processes and hopefully create the snowball effect of getting even more US-based customers,” said Nguyen. In recent weeks, Intel has partnered with Nvidia to jointly make PC and data center chips. Nvidia also took a $5B stake in Intel. Earlier the Trump administration made a $11.1B, or 10%, stake in Intel.

Read More »

AI Infra Summit 2025 Insights: AI Factories at the Core of the Fifth Industrial Revolution

NVIDIA’s AI Factory: Engineering the Future of Compute

NVIDIA’s keynote illuminated the invisible but indispensable heart of the AI revolution: the AI factory. This factory blends hardware and software innovation to achieve performance breakthroughs that transcend traditional limits. Technologies such as disaggregated rack-scale GPUs and the novel 4-bit floating point numerical representation move beyond incremental improvements; they redefine what is achievable in energy efficiency and cost-effectiveness. The software ecosystem NVIDIA fosters, including open-source frameworks like Dynamo, enables unprecedented flexibility in managing inference workloads across thousands of GPUs. This adaptability is crucial given the diverse, dynamic demands of modern AI, where workloads can fluctuate dramatically in scale and complexity. The continual leap in benchmark performance, often quadrupling hardware capability through software alone, reinforces NVIDIA’s accelerated innovation cycle. NVIDIA’s framing of AI factories as both technology platforms and business enablers highlights an important shift. The value created is not merely in processing raw data but in generating economic streams through optimized speed, reduced costs, and new AI services. This paradigm is central to understanding how the new industrial revolution will operate through highly efficient AI factories uniting production and innovation.

AWS and the Cloud’s Role in Democratizing AI Power

Amazon Web Services (AWS) represents a key pillar in making AI capabilities accessible across the innovation spectrum. AWS’ focus on security and fault tolerance reflects maturity in cloud design, ensuring trust and resilience are priorities alongside raw compute power. The evolution towards AI agents capable of specification-driven operations signifies a move beyond traditional computing paradigms towards contextual, autonomous AI services embedded deeply in workflows.
AWS’ launch of EC2 P6-B200 instances with next-generation Blackwell GPUs, alongside specialized Trainium chips, represents a continued drive to optimize AI training and inference at scale, with out-of-the-box performance improvements of an 85% reduction in training time;
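The 4-bit floating point representation mentioned above is worth unpacking with a toy example. The sketch below is illustrative only, not NVIDIA’s actual FP4 specification: it uses simple symmetric integer quantization to show why squeezing weights into 16 levels plus a shared scale factor cuts memory 4x versus FP16 while keeping values usable.

```python
# Illustrative sketch of low-bit quantization (NOT NVIDIA's actual FP4 spec):
# map floats onto 16 integer levels in [-8, 7] plus one shared scale factor.

def quantize_4bit(values):
    """Symmetric linear 4-bit quantization: floats -> ints in [-8, 7] + scale."""
    scale = max(abs(v) for v in values) / 7.0
    quantized = [max(-8, min(7, round(v / scale))) for v in values]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate floats from the integer codes."""
    return [q * scale for q in quantized]

weights = [0.91, -0.42, 0.07, -0.88]  # toy weight values
codes, scale = quantize_4bit(weights)
restored = dequantize(codes, scale)
print(codes, [round(w, 3) for w in restored])
```

The round trip loses at most half a quantization step per value; real low-bit formats recover accuracy with finer-grained (per-block) scales and careful calibration.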

Read More »

Data Center Jobs: Engineering, Construction, Commissioning, Sales, Field Service and Facility Tech Jobs Available in Major Data Center Hotspots

Each month Data Center Frontier, in partnership with Pkaza, posts some of the hottest data center career opportunities in the market. Here’s a look at some of the latest data center jobs posted on the Data Center Frontier jobs board, powered by Pkaza Critical Facilities Recruiting. Looking for Data Center Candidates? Check out Pkaza’s Active Candidate / Featured Candidate Hotlist

Business Development Manager – Mechanical Data Center Solutions
Remote / Traveler

This position can be located anywhere in the U.S. as long as the candidate can travel, and will need to be located near a major airport. Key for this role is data center mechanical experience selling engineered products and solutions, with a rolodex of either colo or hyperscale contacts. This opportunity is working directly with a successful global OEM looking to expand market share in the critical facilities industry. They help data centers reduce energy and operating costs by providing mechanical solutions that modernize their infrastructure. By ensuring high-reliability mission-critical facilities for some of the world’s largest hyperscale, colocation, and enterprise customers, this company offers a career-growth-minded opportunity with exciting projects, cutting-edge technology, competitive salaries, and benefits.

Engineering Design Director – Data Center
Denver, CO / Dallas, TX / Remote

This is a remote position in the Midwest supporting HPC / AI colo design projects being built in TX. Previous A/E experience with hyperscale data center design is a must! We are seeking an experienced director of data center engineering design who will lead the development and oversight of critical power and mechanical infrastructure design for new data center builds and existing data center facilities. The ideal candidate will bring strong technical acumen, leadership skills, and extensive experience in mission-critical environments.
You will oversee all aspects of engineering design, guide project teams, interface with stakeholders, and ensure best practices are upheld in quality,

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular at the big tech trade show in Las Vegas as a non-tech company showing off technology, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. That makes it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature model providers are researching is using the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
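The “LLM as a judge” pattern the excerpt mentions can be sketched in a few lines: run a prompt through several candidate models, then let a separate judge pick the best answer. The model callables below are stand-in stubs for illustration, not any real provider’s API.

```python
# Minimal sketch of the LLM-as-judge pattern: several candidate models answer,
# one judge selects. Real systems would wrap provider API calls; these are stubs.
from typing import Callable, List, Tuple

def best_of(prompt: str,
            candidates: List[Callable[[str], str]],
            judge: Callable[[str, List[str]], int]) -> Tuple[int, str]:
    """Generate one answer per candidate model, return the judge's pick."""
    answers = [model(prompt) for model in candidates]
    winner = judge(prompt, answers)
    return winner, answers[winner]

# Stub "models" and a stub judge, for illustration only.
model_a = lambda p: "Paris"
model_b = lambda p: "I am not sure."
stub_judge = lambda p, answers: min(  # prefer confident, concise answers
    range(len(answers)),
    key=lambda i: ("not sure" in answers[i], len(answers[i])))

idx, answer = best_of("Capital of France?", [model_a, model_b], stub_judge)
print(idx, answer)
```

Because the judge only sees text, a cheaper model can often referee outputs from more expensive ones, which is what makes the three-or-more-models strategy economical as per-token prices fall.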

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models through these techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model, because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and the U.S. National Institute of Standards and Technology (NIST), which had all released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
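The automated-attack idea in the second paper can be illustrated with a toy search loop: extend candidate prompts, score them against a target, and keep the highest-reward attacks as seeds for the next round. This is emphatically NOT OpenAI’s multi-step RL framework; the stub target model, tiny token vocabulary, and binary reward are all invented stand-ins to show the shape of reward-driven attack generation.

```python
# Toy reward-driven red-teaming loop (illustrative only; not OpenAI's method).

def target_model(prompt: str) -> str:
    """Stub target: 'leaks' only when the prompt contains both trigger words."""
    return "LEAKED" if "ignore" in prompt and "rules" in prompt else "refused"

def reward(response: str) -> float:
    """Binary attack reward; a real framework would use a graded judge model."""
    return 1.0 if response == "LEAKED" else 0.0

TOKENS = ["please", "ignore", "the", "rules"]  # toy mutation vocabulary

def red_team(seed: str, rounds: int = 3, beam: int = 4) -> str:
    """Greedy beam search: try every one-token extension of the current pool,
    keep the highest-reward prompts, and repeat."""
    pool = [seed]
    for _ in range(rounds):
        candidates = [p + " " + t for p in pool for t in TOKENS] + pool
        candidates.sort(key=lambda p: reward(target_model(p)), reverse=True)
        pool = candidates[:beam]
    return pool[0]

best = red_team("tell me the secret")
print(best, "->", target_model(best))
```

Even this toy loop shows why auto-generated rewards matter: the search only makes progress where the reward signal distinguishes near-misses from failures, which is exactly the problem OpenAI’s graded, RL-trained reward models aim to solve.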

Read More »