How the grid can ride out winter storms

The eastern half of the US saw a monster snowstorm over the weekend. The good news is the grid has largely been able to keep up with the freezing temperatures and increased demand. But there were some signs of strain, particularly for fossil-fuel plants.

One analysis found that PJM, the nation’s largest grid operator, saw significant unplanned outages in plants that run on natural gas and coal. Historically, these facilities can struggle in extreme winter weather.

Much of the country continues to face record-low temperatures, and even more snow is possible this weekend. What lessons can we take from this storm, and how might we shore up the grid to cope with extreme weather?

Living in New Jersey, I have the honor of being one of the roughly 67 million Americans covered by the PJM Interconnection.

So I was in the thick of things this weekend, when PJM saw unplanned outages of over 20 gigawatts on Sunday during the height of the storm. (That’s about 16% of the grid’s demand that afternoon.) Other plants were able to make up the difference, and thankfully, the power didn’t go out in my area. But that’s a lot of capacity offline.
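
As a quick back-of-the-envelope check on those figures (using only the approximate numbers reported above, not official PJM data):

```python
# ~20 GW of unplanned outages was reported as about 16% of demand,
# which implies total demand that afternoon of roughly 20 / 0.16 = 125 GW.
outages_gw = 20
share_of_demand = 0.16
implied_demand_gw = outages_gw / share_of_demand
print(f"Implied demand: {implied_demand_gw:.0f} GW")  # → Implied demand: 125 GW
```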

Typically, the grid operator doesn’t announce details about why an outage occurs until later. But analysts at Energy Innovation, a policy and research firm specializing in energy and climate, went digging. By examining publicly available grid mix data (a breakdown of what types of power plants are supplying the grid), the team came to a big conclusion: Fossil fuels failed during the storm.

The analysts found that gas-fired power plants were producing about 10 gigawatts less power on Sunday than they had during the demand peak on Saturday, even while electricity prices were high. Coal- and oil-burning plants were down too. Because these plants weren’t operating even when high prices made running them quite lucrative, they were likely a significant part of the problem, says Michelle Solomon, a manager in the electricity program at Energy Innovation.
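
The kind of grid-mix comparison the analysts ran can be illustrated with a minimal sketch. All figures below are invented for illustration; the real analysis draws on PJM's published fuel-mix data:

```python
# Hypothetical generation by fuel type (GW), comparing Saturday's demand
# peak with Sunday during the storm. A drop at a high-price moment is a
# hint that plants of that fuel type were offline rather than idle by choice.
saturday_peak_gw = {"gas": 60, "coal": 20, "nuclear": 30, "wind": 8}
sunday_gw = {"gas": 50, "coal": 17, "nuclear": 30, "wind": 9}

for fuel, sat_output in saturday_peak_gw.items():
    drop = sat_output - sunday_gw[fuel]
    if drop > 0:
        print(f"{fuel}: down {drop} GW from Saturday's peak")
# → gas: down 10 GW from Saturday's peak
# → coal: down 3 GW from Saturday's peak
```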

PJM plans to share more details about the outages at an upcoming committee meeting once the cold snap passes, Dan Lockwood, a PJM spokesperson, told me via email.

Fossil-fuel plants can see reliability challenges during winter: When temperatures drop, pressures in natural-gas lines fall too, which can lead to issues for fuel supply. Freezing temperatures can throw compression stations and other mechanical equipment offline and even freeze piles of coal.

One of the starkest examples came in 2021, when Texas faced freezing temperatures that took many power plants offline and threw the grid into chaos. Many homes lost power for days, and at least 246 people died during that storm.

Texas fared much better this time around. After 2021, the state shored up its grid, adding winter weatherization for power plants and transmission systems. Texas has also seen a huge flood of batteries come online, which has greatly helped the grid during winter demand peaks, especially in the early mornings. Texas was also simply lucky that this storm was less severe there, as one expert told Inside Climate News this week.

Here on the East Coast, we’re not out of the woods yet. The snow has stopped falling, but grids are still facing high electricity demand because of freezing temperatures. (I’ve certainly been living under my heated blanket these last few days.)

PJM could see a peak power demand of 130 gigawatts for seven straight days, a winter streak the grid has never experienced, according to an update posted to the grid operator’s site on Tuesday morning.

The US Department of Energy issued emergency orders to several grid operators, including PJM, that allow power plants to run while basically ignoring emissions regulations. The department also issued orders allowing several grids to tell data centers and other facilities to begin using backup generators. (This is good news for reliability but bad news for clean air and the climate, since these power sources are often incredibly emissions-intensive.)

We here on the East Coast could learn a thing or two from Texas so we don’t need to resort to these polluting emergency measures to keep the lights on. More energy storage could be a major help in future winter storms, lending flexibility to the grid to help ride out the worst times, Solomon says. Getting offshore wind online could also help, since those facilities typically produce reliable power in the winter. 

No one energy source will solve the massive challenge of building and maintaining a resilient grid. But as we face the continued threat of extreme storms, renewables might actually help us weather them. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.


Crooks are hijacking and reselling AI infrastructure: Report

Threat actors may not only be stealing AI access from fully developed applications, the researchers added. A developer trying to prototype an app, who, through carelessness, doesn’t secure a server, could be victimized through credential theft as well. Joseph Steinberg, a US-based AI and cybersecurity expert, said the report is

Read More »

Amazon confirms 16,000 job cuts, including to AWS

Amazon is cutting about 16,000 jobs across the company, SVP of People Experience and Technology Beth Galetti wrote in an email to employees Wednesday. The cuts were widely expected — and although Galetti’s email did not mention Amazon Web Services, the cuts came as no surprise to AWS staff, some

Read More »

Denmark, Germany Ink Agreement on Bornholm Energy Island

The governments of Germany and Denmark said they have paved a new regulatory ground with the signing of an agreement for a cross-border project that will connect several wind farms via a single hub on the Danish island of Bornholm in the Baltic Sea. The project, called the Bornholm Energy Island, is designed to deliver three gigawatts of electricity onshore to be distributed to around three million homes and businesses in Denmark and Germany. The project is owned by Danish state-owned power and gas transmission systems operator Energinet and German electricity transmission system operator 50Hertz. “Denmark and Germany have agreed on sharing the costs to finance the support need for the offshore wind”, said a joint statement by the countries’ energy ministries. “The project entails the first example of two countries sharing the support needed for an offshore wind farm and proves a landmark in the development of European cross-border offshore cooperation”, the statement added. Energinet said separately “the agreement provides greater clarity on the political and economic framework for planning and coordination across the border”. 50Hertz said in its own statement the government agreement is the project’s “decisive building block”. 50 Hertz chief executive Stefan Kapferer highlighted, “The agreement between Germany and Denmark has an impact on the offshore industry across Europe, extending far beyond the Baltic Sea project Bornholm Energy Island. It is furthermore a strong European signal in geopolitically difficult times”. A statement from the European Commission, which last year signed a grant agreement worth EUR 645.2 million ($770.68 million) for the Bornholm Energy Island, said the agreement between Berlin and Copenhagen “marks a first key delivery of the Energy Highways initiative set out as part of the European Grids Package published in December 2025”. The project “will transform offshore wind energy from a national resource into a shared European resource

Read More »

Oil Prices ‘Remain Elevated and Volatile’

In a market comment sent to Rigzone on Thursday, Mazen Salhab, Head of Market Research at MH Markets, highlighted that oil prices “remain elevated and volatile … as the market prices supply risk and demand uncertainty at the same time”. “Geopolitical tension around Iran continues to underpin prices by lifting the risk premium, even without actual disruptions, while producer discipline and recent weather related outages are limiting downside,” Salhab said in the comment. “At the same time, economic data points to slowing but resilient demand, keeping oil supported as both a growth-sensitive asset and a hedge,” he added. Looking at the near term in his comment, Salhab said labor market data “is the key trigger”. “Weaker jobs data would support oil via a softer dollar but raise demand concerns, while stronger data may pressure prices initially yet reinforce demand expectations,” he warned. “Until either geopolitics or macro data clearly dominates, oil is likely to remain range-bound but firm, with sharp reactions to headlines and data surprises,” Salhab projected. In a report sent to Rigzone by the Standard Chartered team on Wednesday, Standard Chartered Bank Energy Research Head Emily Ashford noted that a U.S. winter storm “disrupted crude oil output and refining operations, prompting localized supply tightness”. “The NYMEX heating oil-WTI spread has stabilized between $36-38 per barrel over the week, while the ICE gasoil-Brent spread pushed above $25 per barrel on 21 January and is currently trading around $23 per barrel,” Ashford added. “We expect these cracks to remain sensitive to refining throughput-recovery timelines,” Ashford continued. The Standard Chartered report projected that the ICE Brent nearby future crude oil price will average $63.50 per barrel in 2026 and $67.00 per barrel in 2027. This commodity averaged $68.50 per barrel in 2025, the report showed. A quarterly breakdown included in the Standard Chartered

Read More »

Petrobras Expands Contracts to Supply Key Indian Refiners

Petróleo Brasileiro SA said Wednesday it has renewed contracts to supply crude oil to three of India’s main refiners. “The crude oil sales contracts with Indian Oil Corp Ltd (IOC), Bharat Petroleum Corp Ltd (BPCL) and Hindustan Petroleum Corp Ltd (HPCL) will remain in force until March 2027”, the Brazilian state-owned company said in an online statement. “The commercial instruments represent potential sales of up to 60 million barrels, with a total value that could exceed $3.1 billion. “With IOC, India’s largest state-owned refiner, the new contract provides for the sale of up to 24 million barrels of Brazilian crude oil, with a 12-month term, renewable for an equal period. “With BPCL and HPCL, Petrobras expanded the maximum volume of each contract from six million barrels to 18 million barrels, also running through March 2027”. Petrobras noted, “India, one of the world’s largest crude oil importers, purchases around five million barrels per day and is a strategic market for Petrobras”. Petrobras director for logistics, marketing and markets Cláudio Schlosser said, “The contracts strengthen Petrobras’ presence in the Indian market and contribute to diversifying our crude oil export customer base. We are committed to strengthening strategic partnerships, expanding our global footprint and generating value for Brazil”. The new contracts with the state-owned refiners were signed at the India Energy Week in the Indian city of Goa. Increased Reserves Separately on Wednesday Petrobras said it had increased its proven reserves of oil, natural gas and condensate to 12.1 billion barrels of oil equivalent (Bboe) as of the end of 2025. The company said it had added 1.7 Bboe of reserves last year. “The reserve replacement rate was 175 percent, even after a record annual production in 2025. The proved reserves-to-production ratio is 12.5 years”, Petrobras said in a press release. 
“The reserves addition occurred mainly due to the outstanding performance of the assets, with emphasis on Búzios,

Read More »

Lukoil to Sell Most of Foreign Assets to Carlyle

Russia’s Lukoil PJSC agreed to sell most of its international assets to private equity giant Carlyle Group, three months after being hit by US sanctions. The deal doesn’t include assets in Kazakhstan, the oil producer said Thursday in a statement, without disclosing financial terms. The transaction is subject to regulatory approvals and talks with other potential buyers continue, it said.  The US last year blacklisted the company and peer Rosneft PJSC as the Trump administration ramped up pressure on the Kremlin to end the war in Ukraine. The move sent shockwaves across countries where Lukoil operates, prompting some governments to press Washington for special licenses. The proposed transaction has been structured to be “fully compliant” with the requirements of the US Office of Foreign Assets Control, Carlyle said in a separate statement. The firm said it’ll ensure “operational continuity, preserving jobs, stabilizing the asset base and supporting safe, reliable performance.” Lukoil announced in October it had agreed to sell its international assets to energy trader Gunvor Group, but the deal collapsed after the US Treasury described Gunvor as the Kremlin’s “puppet.” The assets subsequently drew attention from companies including Exxon Mobil Corp., Chevron Corp. and Abu Dhabi National Oil Co. As Russia’s most internationally diversified oil company, Lukoil has stakes in refineries in Europe, significant holdings in oil fields from Iraq to Kazakhstan and a network of 5,300 fuel stations in 20 countries. Kazakh Interest Kazakhstan has expressed an interest in buying out Lukoil’s share of joint projects in the country and has submitted a request to OFAC, Interfax reported this week, citing Energy Minister Yerlan Akkenzhenov. Following the imposition of sanctions, Lukoil’s international oil-trading business has had to wind down operations in some locations. In the US, gasoline stations linked to the company experienced issues with card payments, while its fuel-retail

Read More »

Department of Energy Seeks Hosts for Nuclear Lifecycle Innovation Campuses

New Effort to Modernize America’s Nuclear Fuel Cycle and Support Advanced Reactor Deployment WASHINGTON—The U.S. Department of Energy (DOE) today issued a Request for Information (RFI) inviting states to express interest in hosting Nuclear Lifecycle Innovation Campuses, a new effort to modernize the nation’s full nuclear fuel cycle and strengthen America’s leadership in advanced nuclear energy. This action marks the first step towards potentially establishing voluntary Federal-State partnerships designed to advance regional economic growth, enhance national energy security, and build a coherent, end-to-end nuclear energy strategy for the country. “Unleashing the next American nuclear renaissance will drive innovation, fuel economic growth, and create good-paying American jobs while delivering the affordable, reliable and secure energy America needs to power its future,” said U.S. Energy Secretary Chris Wright. “Nuclear Lifecycle Innovation Campuses give us the opportunity to work directly with states on regional priorities that support President Trump’s vision to revitalize America’s nuclear base.” The proposed campuses could support activities across the full nuclear fuel lifecycle, including fuel fabrication, enrichment, reprocessing used nuclear fuel, and disposition of waste. Depending on state priorities and regional capabilities, the sites could also host advanced reactor deployment, power generation, advanced manufacturing, and co located data centers. DOE is inviting states to provide clear statements of interest and constructive feedback on the structure of the Innovation Campuses. Submissions should outline state priorities—such as workforce development, infrastructure investment, economic diversification, or technology leadership— and describe the scope of activities the state envisions hosting. 
States are also encouraged to identify the funding structures, risk sharing approaches, incentives and federal partnerships required to successfully establish and sustain a full-cycle Innovation Campus.  Responses to the RFI are requested no later than April 1, 2026. Interested parties can learn more and respond at SAM.gov. ###

Read More »

Oil Options on Longest Bullish Run Since 2024

Oil traders are paying a premium for bullish call options for the longest stretch in about 14 months as they huddle in the options market to protect against the risk of a new confrontation between the US and Iran.  The global Brent benchmark has registered a call skew for 14 consecutive sessions, while the equivalent US marker has seen such a pattern for the 13 most recent trading days. Those are the longest stretches since late 2024, when Israel launched attacks on Iranian military installations.  Thousands are estimated to have been killed in the recent wave of unrest to challenge Supreme Leader Ayatollah Ali Khamenei and his regime, sparking an international outcry, including warnings from US President Donald Trump of “strong action” if the killings did not stop. Trump said this week that the US has a “big armada” headed to the Middle East because of Iran, but added that he hoped the US won’t have to use it.  Options markets have been the main way traders have wagered on heightened geopolitical risk in the Middle East in recent years, in a period that started with Hamas’s attack on Israel in October 2023. When the US struck Iran last year, premiums for calls spiked and then collapsed after it became apparent that oil facilities had been spared.  “The focus on Iran continues,” said Arne Lohmann Rasmussen, chief analyst at A/S Global Risk Management. “The market will likely remain nervous over the coming days.” The uncertainty is leading to chunky additions of bullish options contracts. Open interest in Brent call options has accrued at the fastest pace this month in at least six years, according to Bloomberg calculations of ICE Futures Europe data. It follows the busiest ever day of Brent crude call options trading earlier this month. Hedge funds have also boosted net-bullish wagers on crude

Read More »

China clears Nvidia H200 sales to tech giants, reshaping AI data center plans

China is also accelerating efforts to strengthen domestic training chip design and manufacturing capabilities, with the strategic objective of reducing long-term dependence on foreign suppliers, Zeng added. Things could get more complex if authorities mandated imported chips to be deployed alongside domestically produced accelerators. Reuters has reported that this may be a possibility. “A mandated bundling requirement would create a heterogeneous computing environment that significantly increases system complexity,” Zeng said. “Performance inconsistencies and communication protocol disparities across different chip architectures would elevate O&M [operations and maintenance] overhead and introduce additional network latency.” However, the approvals are unlikely to close the gap with US hyperscalers, Zeng said, noting that the H200 remains one generation behind Nvidia’s Blackwell architecture and that approved volumes fall well short of China’s overall demand. Implications for global enterprises For global enterprise IT and network leaders, the move adds another variable to long-term AI infrastructure planning. Expanded sales of Nvidia’s H200 chips could help the company increase production scale, potentially creating room to ease pricing for Western enterprises deploying H200-based AI infrastructure, said Neil Shah, VP for research at Counterpoint Research.

Read More »

Nuclear safety rules quietly rewritten to favor AI

‘Referee now plays for the home team’ Kimball pointed out that while an SMR works on the same principle as a large-scale nuclear plant, using controlled fission to generate heat which is then converted to electricity, its design reduces environmental impacts such as groundwater contamination, water use, and the impact in the event of failure. For example, he said, the integral reactor design in an SMR, with all components in a single vessel, eliminates external piping. This means that accidents would be self-contained, reducing the environmental impact. In addition, he said, SMRs can be air-cooled, which greatly reduces the amount of water required. “These are just a couple of examples of how an SMR differs from the large industrial nuclear power plants we think of when we think of nuclear power.”  Because of differences like this, said Kimball, “I can see where rules generated/strengthened in the post-Three Mile Island era might need to be revisited for this new nuclear era. But it is really difficult to speak to how ‘loose’ these rules have become, and whether distinctions between SMRs and large-scale nuclear plants comprise the majority of the changes reported.” Finally, he said, “I don’t think I need to spend too many words on articulating the value of nuclear to the hyperscale or AI data center. The era of the gigawatt datacenter is upon us, and the traditional means of generating power can’t support this insatiable demand. But we have to ensure we deploy power infrastructure, such as SMRs, in a responsible, ethical, and safe manner.”  Further to that, Gogia pointed out that for CIOs and infrastructure architects, the risks extend well beyond potential radiation leaks. “What matters more immediately is that system anomalies — mechanical, thermal, software-related — may not be documented, investigated, or escalated with the diligence one would expect from

Read More »

Mplify launches AI-focused Carrier Ethernet certifications

“We didn’t want to just put a different sticker on it,” Vachon said. “We wanted to give the opportunity for operators to recertify their infrastructure so at least you’ve now got this very competitive infrastructure.” Testing occurs on live production networks. The automated testing platform can be completed in days once technical preparation is finished. Organizations pay once per certification with predictable annual maintenance fees required to keep certifications active. Optional retesting can refresh certification test records. Carrier Ethernet for AI The Carrier Ethernet for AI certification takes the business certification baseline and adds a performance layer specifically designed for AI workloads. Rather than creating a separate track, the AI certification requires providers to first complete the Carrier Ethernet for Business validation, then demonstrate they can meet additional stringent requirements. “What we identified was that there was another tier that we could produce a standard around for AI,” Vachon explained. “With extensive technical discussions with our membership, our CTO, and our director of certification, they identified the critical performance and functionality parameters.” The additional validation focuses on three key performance parameters: frame delay, inter-frame delay variation, and frame loss ratio aligned with AI workload requirements. Testing uses MEF 91 test requirements with AI-specific traffic profiles and performance objectives that go beyond standard business service thresholds. The program targets three primary use cases: connecting subscriber premises running AI applications to AI edge sites, interconnecting AI edge sites to AI data centers, and AI data center to data center interconnections.

Read More »

Gauging the real impact of AI agents

That creates the primary network issue for AI agents, which is dealing with implicit and creeping data. There’s a singular important difference between an AI agent component and an ordinary software component. Software is explicit in its use of data. The programming includes data identification. AI is implicit in its data use; the model was trained on data, and there may well be some API linkage to databases that aren’t obvious to the user of the model. It’s also often true that when an agentic component is used, it’s determined that additional data resources are needed. Are all these resources in the same place? Probably not. The enterprises with the most experience with AI agents say it would be smart to expect some data center network upgrades to link agents to databases, and if the agents are distributed away from the data center, it may be necessary to improve the agent sites’ connection to the corporate VPN. As agents evolve into real-time applications, this requires they also be proximate to the real-time system they support (a factory or warehouse), so the data center, the users, and any real-time process pieces all pull at the source of hosting to optimize latency. Obviously, they can’t all be moved into one place, so the network has to make a broad and efficient set of connections. That efficiency demands QoS guarantees on latency as well as on availability. It’s in the area of availability, with a secondary focus on QoS attributes like latency, that the most agent-experienced enterprises see potential new service opportunities. Right now, these tend to exist within a fairly small circle—a plant, a campus, perhaps a city or town—but over time, key enterprises say that their new-service interest could span a metro area. They point out that the real-time edge applications

Read More »

Photonic chip vendor snags Gates investment

“Moore’s Law is slowing, but AI can’t afford to wait. Our breakthrough in photonics unlocks an entirely new dimension of scaling, by packing massive optical parallelism on a single chip,” said Patrick Bowen, CEO of Neurophos. “This physics-level shift means both efficiency and raw speed improve as we scale up, breaking free from the power walls that constrain traditional GPUs.” The new funding includes investments from Microsoft’s investment fund M12 that will help speed up delivery of Neurophos’ first integrated photonic compute system, including datacenter-ready OPU modules. Neurophos is not the only company exploring this field. Last April, Lightmatter announced the launch of photonic chips to tackle data center bottlenecks, And in 2024, IBM said its researchers were exploring optical chips and developing a prototype in this area.

Read More »

Intel wrestling with CPU supply shortage

“We have important customers in the data center side. We have important OEM customers on both data center and client and that needs to be our priority to get the limited supply we have to those customers,” he added. CEO Lip-Bu Tan added that the continuing proliferation and diversification of AI workloads is placing significant capacity constraints on traditional and new hardware infrastructure, reinforcing the growing and essential role CPUs play in the AI era. Because of this, Intel decided to simplify its server road map, focusing resources on the 16-channel Diamond Rapids product and accelerate the introduction of Coral Rapids. Intel had removed multithreading from diamond Rapids, presumably to get rid of the performance bottlenecks. With each core running two threads, they often competed for resources. That’s why, for example, Ampere does not use threading but instead applies many more cores per CPU. With Coral Rapids, Intel is not only reintroducing multi-threading back into our data center road map but working closely with Nvidia to build a custom Xeon fully integrated with their NVLink technology to Build the tighter connection between Intel Xeon processors and Nvidia GPUs. Another aspect impacting supply has been yields or the new 18A process node. Tan said he was disappointed that the company could not fully meet the demand of the markets, and that while yields are in line with internal plans, “they’re still below where I want them to be,” Tan said.  Tan said yields for 18A are improving month-over-month and Intel is targeting a 7% to 8% improvement each month.

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of the AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
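To put those figures in proportion, here is a back-of-the-envelope sketch using only the dollar amounts quoted above (Bloomberg Intelligence's combined-capex estimates and Microsoft's own numbers):

```python
# Growth implied by the capex figures quoted above (in $ billions).
def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Combined big-tech AI capex: $110B (2023) -> $200B (2025, estimated)
print(f"Combined capex growth: {pct_change(110, 200):.0f}%")

# Microsoft alone: $17.6B (2020) -> $80B (fiscal 2025 claim)
print(f"Microsoft capex growth since 2020: {pct_change(17.6, 80):.0f}%")
```

Even on Bloomberg's lower $62.4 billion estimate, Microsoft's AI-era capital spending is more than triple its 2020 total.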

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction, and commercial landscaping. The Moline, Illinois-based company has been in business for 187 years, yet it has become a regular at the big tech trade show in Las Vegas, a non-tech company showing off technology, and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work its customers need. That has been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX tractor: farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all of them is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year saw rapid innovation, and this year will see the same, which makes it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year. 1. Agents: the next generation of automation. AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as the frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that builds agents for enterprises and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability, and safety of AI models using these techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model, because in-house testing techniques may have missed them. In the second, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends. It’s encouraging to see competitive intensity in red teaming growing among AI companies: when Anthropic released its AI red team guidelines in June of last year, it joined providers including Google, Microsoft, Nvidia, OpenAI, and the U.S. National Institute of Standards and Technology (NIST), all of which had already released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to assemble specialized external teams that include cybersecurity and subject matter experts. The goal is to see whether knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases, and controls that prompt-based testing couldn’t uncover.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »