Why climate researchers are taking the temperature of mountain snow

On a crisp morning in early April, Dan McEvoy and Bjoern Bingham cut clean lines down a wide run at the Heavenly Ski Resort in South Lake Tahoe, then ducked under a rope line cordoning off a patch of untouched snow. 

They side-stepped up a small incline, poled past a row of Jeffrey pines, then dropped their packs. 

The pair of climate researchers from the Desert Research Institute (DRI) in Reno, Nevada, skied down to this research plot in the middle of the resort to test out a new way to take the temperature of the Sierra Nevada snowpack. They were equipped with an experimental infrared device that can take readings as it’s lowered down a hole in the snow to the ground.

The Sierra’s frozen reservoir provides about a third of California’s water and most of what comes out of the faucets, shower heads, and sprinklers in the towns and cities of northwestern Nevada. As it melts through the spring and summer, dam operators, water agencies, and communities have to manage the flow of billions of gallons of runoff, storing up enough to get through the inevitable dry summer months without allowing reservoirs and canals to flood.

The need for better snowpack temperature data has become increasingly critical for predicting when the water will flow down the mountains, as climate change fuels hotter weather, melts snow faster, and drives rapid swings between very wet and very dry periods. 

In the past, it has been arduous work to gather such snowpack observations. Now, a new generation of tools, techniques, and models promises to ease that process, improve water forecasts, and help California and other states safely manage one of their largest sources of water in the face of increasingly severe droughts and flooding.

Observers, however, fear that any such advances could be undercut by the Trump administration’s cutbacks across federal agencies, including the one that oversees federal snowpack monitoring and survey work. That could jeopardize ongoing efforts to produce the water data and forecasts on which Western communities rely.

“If we don’t have those measurements, it’s like driving your car around without a fuel gauge,” says Larry O’Neill, Oregon’s state climatologist. “We won’t know how much water is up in the mountains, and whether there’s enough to last through the summer.”

The birth of snow surveys

The snow survey program in the US was born near Lake Tahoe, the largest alpine lake in North America, around the turn of the 20th century. 

Without any reliable way of knowing how much water would flow down the mountain each spring, lakefront home and business owners, fearing floods, implored dam operators to release water early in the spring. Downstream communities and farmers pushed back, however, demanding that the dam be used to hold onto as much water as possible to avoid shortages later in the year. 

In 1908, James Church, a classics professor at the University of Nevada, Reno, whose passion for hiking around the mountains sparked an interest in the science of snow, invented a device that helped resolve the so-called Lake Tahoe Water Wars: the Mt. Rose snow sampler, named after the peak of a Sierra spur that juts into Nevada.

Professor James E. Church wearing goggles and snowshoes, standing on a snowy hillside
James Church, a professor of classics at the University of Nevada, Reno, became a pioneer in the field of snow surveys.
COURTESY OF UNIVERSITY OF NEVADA, RENO

It’s a simple enough device, with sections of tube that screw together, a sharpened end, and measurement ticks along the side. Snow surveyors measure the depth of the snow by plunging the sampler down to the ground. They then weigh the filled tube on a specialized scale to calculate the water content of the snow. 
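
The conversion behind that weighing step is straightforward: the snow water equivalent is the mass of the snow core divided by the density of water and the tube's cross-sectional area. Here is a minimal sketch of that arithmetic in Python, using made-up tube dimensions and core weight rather than actual Mt. Rose sampler specifications.

```python
# Illustrative sketch: converting a snow-core measurement to snow water equivalent (SWE).
# The tube diameter and core mass below are example values, not real sampler specs.
import math

WATER_DENSITY = 1000.0  # kg per cubic meter

def snow_water_equivalent_cm(core_mass_kg: float, tube_diameter_cm: float) -> float:
    """Return SWE in centimeters of water for a snow core of the given mass."""
    area_m2 = math.pi * (tube_diameter_cm / 100 / 2) ** 2      # tube cross-section in m^2
    water_depth_m = core_mass_kg / (WATER_DENSITY * area_m2)   # mass -> depth of water
    return water_depth_m * 100

# Example: a 1.2 kg core pulled through a 4 cm diameter tube
print(round(snow_water_equivalent_cm(1.2, 4.0), 1), "cm of water")
```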

Church used the device to take measurements at various points across the range, and calibrated his water forecasts by comparing his readings against the rising and falling levels of Lake Tahoe. 

It worked so well that the US began a federal snow survey program in the mid-1930s, which evolved into the one carried on today by the Department of Agriculture’s Natural Resources Conservation Service (NRCS). Throughout the winter, hundreds of snow surveyors across the American West head up to established locations on snowshoes, backcountry skis, or snowmobiles to deploy their Mt. Rose samplers, which have barely changed over more than a century. 

In the 1960s, the US government also began setting up a network of permanent monitoring sites across the mountains, now known as the SNOTEL network. There are more than 900 stations continuously transmitting readings from across Western states and Alaska. They’re equipped with sensors that measure air temperature, snow depth, and soil moisture, and include pressure-sensitive “snow pillows” that weigh the snow to determine the water content. 

The data from the snow surveys and SNOTEL sites all flows into snow depth and snow water content reports that the NRCS publishes, along with forecasts of the amount of water that will fill the streams and reservoirs through the spring and summer.

Taking the temperature

None of these survey and monitoring programs, however, provides the temperature throughout the snowpack. 

The Sierra Nevada snowpack can reach depths of more than 6 meters (20 feet), and the temperature within it may vary widely, especially toward the top. Readings taken at increments throughout can determine what’s known as the cold content, or the amount of energy required to shift the snowpack to a uniform temperature of 32˚F. 
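
In rough terms, the cold content is the mass of each layer multiplied by the specific heat of ice and by how far that layer's temperature sits below freezing, summed over the whole profile. The sketch below shows that calculation for a hypothetical set of layered readings; the layer densities and temperatures are assumed example values, not field data.

```python
# Illustrative cold-content calculation from layered snowpack temperature readings.
# Layer thicknesses, densities, and temperatures are assumed example values.
C_ICE = 2102.0  # specific heat of ice, J/(kg*K)
T_MELT = 0.0    # melting point, deg C (32 deg F)

def cold_content_j_per_m2(layers):
    """Energy (J per square meter of ground) needed to warm every layer to 0 deg C.

    Each layer is (thickness_m, density_kg_m3, temperature_c).
    """
    return sum(
        thickness * density * C_ICE * (T_MELT - temp)
        for thickness, density, temp in layers
        if temp < T_MELT
    )

# Example: a 3-meter snowpack measured as three 1-meter layers
profile = [(1.0, 350.0, -0.5), (1.0, 300.0, -3.0), (1.0, 250.0, -6.0)]
print(f"{cold_content_j_per_m2(profile) / 1e6:.1f} MJ per square meter")
```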

Knowing the cold content of the snowpack helps researchers understand the conditions under which it will begin to rapidly melt, particularly as it warms up in the spring or after rain falls on top of the snow.

If the temperature of the snow, for example, is close to 32˚F even at several feet deep, a few warm days could easily set it melting. If, on the other hand, the temperature measurements show a colder profile throughout the middle, the snowpack is more stable and will hold up longer as the weather warms.

a person raising a snow shovel up at head height
Bjoern Bingham, a research scientist at the Desert Research Institute, digs a snow pit at a research plot within the Heavenly Ski Resort, near South Lake Tahoe, California.
JAMES TEMPLE

The problem is that taking the temperature of the entire snowpack has been, until now, tough and time-consuming work. When researchers do it at all, they mainly do so by digging snow pits down to the ground and then taking readings with probe thermometers along an inside wall.

There have been a variety of efforts to take continuous remote readings from sensors attached to fences, wires, or towers, which the snowpack eventually buries. But the movement and weight of the dense shifting snow tends to break the devices or snap the structures they’re assembled upon.

“They rarely last a season,” McEvoy says.

Anne Heggli, a professor of mountain hydrometeorology at DRI, happened upon the idea of using an infrared device to solve this problem during a tour of the institute’s campus in 2019, when she learned that researchers there were using an infrared meat thermometer to take contactless readings of the snow surface.

In 2021, Heggli began collaborating with RPM Systems, a gadget manufacturing company, to design an infrared device optimized for snowpack field conditions. The resulting snow temperature profiler is skinny enough to fit down a hole dug by snow surveyors and dangles on a cord marked off at 10-centimeter (4-inch) increments.

a researcher stands in a snowy trench taking notes, while a second researcher drops a yellow measure down from the surface level
Bingham and Daniel McEvoy, an associate research professor at the Desert Research Institute, work together to take temperature readings from inside the snow pit as well as from within the hole left behind by a snow sampler.
JAMES TEMPLE

At Heavenly on that April morning, Bingham, a staff scientist at DRI, slowly fed the device down a snow sampler hole, calling out temperature readings at each marking. McEvoy scribbled them down on a worksheet fastened to his clipboard as he used a probe thermometer to take readings of his own from within a snow pit the pair had dug down to the ground.

They were comparing the measurements to assess the reliability of the infrared device in the field, but the eventual aim is to eliminate the need to dig snow pits. The hope is that state and federal surveyors could simply carry along a snow temperature profiler and drop it into the snowpack survey holes they’re creating anyway, to gather regular snowpack temperature readings from across the mountains.
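
Validating the profiler in this way boils down to pairing the two sets of readings by depth and checking how closely they agree. A minimal sketch of that kind of comparison, using hypothetical readings rather than the team's actual data:

```python
# Illustrative comparison of paired temperature profiles: infrared profiler vs. probe
# thermometer readings at matching depths. All numbers are hypothetical.
import math

depths_cm  = [10, 20, 30, 40, 50]               # depth below the snow surface
infrared_c = [-0.4, -1.1, -2.3, -3.0, -3.2]     # profiler readings, deg C
probe_c    = [-0.5, -1.0, -2.5, -3.1, -3.4]     # snow pit probe readings, deg C

for d, ir, pr in zip(depths_cm, infrared_c, probe_c):
    print(f"{d:3d} cm: infrared {ir:+.1f} C, probe {pr:+.1f} C, diff {ir - pr:+.1f} C")

diffs = [ir - pr for ir, pr in zip(infrared_c, probe_c)]
bias = sum(diffs) / len(diffs)
rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
print(f"mean bias: {bias:+.2f} C, RMSE: {rmse:.2f} C")
```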

In 2023, the US Bureau of Reclamation, the federal agency that operates many of the nation’s dams, funded a three-year research project to explore the use of the infrared gadgets in determining snowpack temperatures. Through it, the DRI research team has now handed devices out to 20 snow survey teams across California, Colorado, Idaho, Montana, Nevada, and Utah to test their use in the field and supplement the snowpack data they’re collecting.

The Snow Lab

The DRI research project is one piece of a wider effort to obtain snowpack temperature data across the mountains of the West.

By early May, the snow depth had dropped from an April peak of 114 inches to 24 inches (2.9 meters to 0.6 meters) at the UC Berkeley Central Sierra Snow Lab, an aging wooden structure perched in the high mountains northwest of Lake Tahoe.

Megan Mason, a research scientist at the lab, used a backcountry ski shovel to dig out a trio of instruments from what was left of the pitted snowpack behind the building. Each one featured different types of temperature sensors, arrayed along a strong polymer beam meant to hold up under the weight and movement of the Sierra snowpack.  

She was pulling up the devices after running the last set of observations for the season, as part of an effort to develop a resilient system that can survive the winter and transmit hourly temperature readings.

The lab is working on the project, dubbed the California Cold Content Initiative, in collaboration with the state’s Department of Water Resources. California is the only Western state that opted to maintain its own snow survey program and run its own permanent monitoring stations, all of which are managed by the water department. 

The plan is to determine which instruments held up and functioned best this winter. Then, they can begin testing the most promising approaches at several additional sites next season. Eventually, the goal is to attach the devices at more than 100 of California’s snow monitoring stations, says Andrew Schwartz, the director of the lab.

The NRCS is conducting a similar research effort at select SNOTEL sites equipped with a beaded temperature cable. One such cable is visible at the Heavenly SNOTEL station, next to where McEvoy and Bingham dug their snow pit, strung vertically between an arm extended from the main tower and the snow-covered ground. 

a gloved hand inserts a probe wire into a hole in the snow
DRI’s Bjoern Bingham feeds the snow temperature profiler, an infrared device, down a hole in the Sierra snowpack.
JAMES TEMPLE

Schwartz says that the different research groups are communicating and collaborating openly on the projects, all of which promise to provide complementary information, expanding the database of snowpack temperature readings across the West.

For decades, agencies and researchers generally produced water forecasts using relatively simple regression models that translated the amount of water in the snowpack into the amount of water that will flow down the mountain, based largely on the historic relationships between those variables. 
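
In their simplest form, such forecasts fit a line between the peak snow water equivalent measured at index sites in past years and the runoff volume that followed, then apply that line to the current year's measurement. The sketch below illustrates the idea with made-up historical values; it is not the NRCS's actual forecasting procedure.

```python
# Illustrative sketch of a simple regression water-supply forecast: predict seasonal
# runoff from peak snow water equivalent (SWE). Historical values are made up.
swe_inches = [18.0, 25.0, 12.0, 30.0, 22.0, 15.0, 28.0]        # peak SWE at an index site
runoff_kaf = [310.0, 420.0, 210.0, 500.0, 380.0, 260.0, 460.0] # April-July runoff, thousand acre-feet

n = len(swe_inches)
mean_x = sum(swe_inches) / n
mean_y = sum(runoff_kaf) / n

# Ordinary least-squares fit of runoff against SWE
num = sum((x - mean_x) * (y - mean_y) for x, y in zip(swe_inches, runoff_kaf))
den = sum((x - mean_x) ** 2 for x in swe_inches)
slope = num / den
intercept = mean_y - slope * mean_x

this_year_swe = 20.0  # hypothetical current-year measurement
forecast = intercept + slope * this_year_swe
print(f"forecast: {forecast:.0f} thousand acre-feet")
```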

But these models are becoming less reliable as climate change alters temperatures, snow levels, melt rates, and evaporation, and otherwise pushes alpine weather outside of historic patterns.

“As we have years that scatter further and more frequently from the norm, our models aren’t prepared,” Heggli says.

Plugging direct temperature observations into more sophisticated models that have emerged in recent years, Schwartz says, promises to significantly improve the accuracy of water forecasts. That, in turn, should help communities manage through droughts and prevent dams from overtopping even as climate change fuels alternately wetter, drier, warmer, and weirder weather.

About a quarter of the world’s population relies on water stored in mountain snow and glaciers, and climate change is disrupting the hydrological cycles that sustain these natural frozen reservoirs in many parts of the world. So any advances in observations and modeling could deliver broader global benefits.

Ominous weather

There’s an obvious threat to this progress, though.

Even if these projects work as well as hoped, it’s not clear how widely these tools and techniques will be deployed at a time when the White House is gutting staff across federal agencies, terminating thousands of scientific grants, and striving to eliminate tens of billions of dollars in funding at research departments. 

The Trump administration has fired or put on administrative leave nearly 6,000 employees across the USDA, or 6% of the department’s workforce. Those cutbacks have reached regional NRCS offices, according to reporting by local and trade outlets.

That includes more than half of the roles at the Portland office, according to O’Neill, the state climatologist. Those reductions prompted a bipartisan group of legislators to call on the Secretary of Agriculture to restore the positions, warning the losses could impair water data and analyses that are crucial for the state’s “agriculture, wildland fire, hydropower, timber, and tourism sectors,” as the Statesman Journal reported.

There are more than 80 active SNOTEL stations in Oregon.

The fear is there won’t be enough people left to reach all the sites this summer to replace batteries, solar panels, and drifting or broken sensors, which could quickly undermine the reliability of the data or cut off the flow of information. 

“Staff and budget reductions at NRCS will make it impossible to maintain SNOTEL instruments and conduct routine manual observations, leading to inoperability of the network within a year,” the lawmakers warned.

The USDA and NRCS didn’t respond to inquiries from MIT Technology Review.

looking down at a researcher standing in a snowy trench with a clipboard of notes
DRI’s Daniel McEvoy scribbles down temperature readings at the Heavenly site.
JAMES TEMPLE

If the federal cutbacks deplete the data coming back from SNOTEL stations or federal snow survey work, the DRI infrared method could at least “still offer a simplistic way of measuring the snowpack temperatures” in places where state and regional agencies continue to carry out surveys, McEvoy says.

But most researchers stress the field needs more surveys, stations, sensors, and readings to understand how the climate and water cycles are changing from month to month and season to season. Heggli stresses that there should be broad bipartisan support for programs that collect snowpack data and provide the water forecasts that farmers and communities rely on. 

“This is how we account for one of, if not the, most valuable resource we have,” she says. “In the West, we go into a seasonal drought every summer; our snowpack is what trickles down and gets us through that drought. We need to know how much we have.”
