
Inside a new quest to save the “doomsday glacier”

The Thwaites glacier is a fortress larger than Florida, a wall of ice that reaches nearly 4,000 feet above the bedrock of West Antarctica, guarding the low-lying ice sheet behind it.

But a strong, warm ocean current is weakening its foundations and accelerating its slide into the Amundsen Sea. Scientists fear the waters could topple the walls in the coming decades, kick-starting a runaway process that would crack up the West Antarctic Ice Sheet.

That would mark the start of a global climate disaster. The glacier itself holds enough ice to raise ocean levels by more than two feet, which could flood coastlines and force tens of millions of people living in low-lying areas to abandon their homes.

The loss of the entire ice sheet—which could still take centuries to unfold—would push up sea levels by 11 feet and redraw the contours of the continents.

This is why Thwaites is known as the doomsday glacier—and why scientists are eager to understand just how likely such a collapse is, when it could happen, and if we have the power to stop it. 

Scientists at MIT and Dartmouth College founded Arête Glacier Initiative last year in the hope of providing clearer answers to these questions. The nonprofit research organization will officially unveil itself, launch its website, and post requests for research proposals today, March 21, timed to coincide with the UN’s inaugural World Day for Glaciers, MIT Technology Review can report exclusively. 

Arête will also announce it is issuing its first grants, each for around $200,000 over two years, to a pair of glacier researchers at the University of Wisconsin-Madison. 

One of the organization’s main goals is to study the possibility of preventing the loss of giant glaciers, Thwaites in particular, by refreezing them to the bedrock. It would represent a radical intervention into the natural world, requiring a massive, expensive engineering project in a remote, treacherous environment. 

But the hope is that such a mega-adaptation project could minimize the mass relocation of climate refugees, prevent much of the suffering and violence that would almost certainly accompany it, and help nations preserve trillions of dollars invested in high-rises, roads, homes, ports, and airports around the globe.

“About a million people are displaced per centimeter of sea-level rise,” says Brent Minchew, an associate professor of geophysics at MIT, who cofounded Arête Glacier Initiative and will serve as its chief scientist. “If we’re able to bring that down, even by a few centimeters, then we would safeguard the homes of millions.”
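
Taken at face value, that rule of thumb can be combined with the figures quoted earlier in this article (a minimal back-of-envelope sketch, not a demographic model; the one-million-per-centimeter figure is Minchew's, and the two-foot figure for Thwaites appears above):

```python
# Rough illustration of the displacement rule of thumb quoted above.
# Assumption: ~1 million people displaced per centimeter of sea-level rise
# (Minchew's figure); "more than two feet" of potential rise from Thwaites
# comes from earlier in this article.

CM_PER_FOOT = 30.48
PEOPLE_PER_CM = 1_000_000          # quoted rule of thumb, not a modeled estimate

thwaites_rise_cm = 2 * CM_PER_FOOT            # about two feet of sea-level rise
displaced = thwaites_rise_cm * PEOPLE_PER_CM  # implied displacement

print(f"{thwaites_rise_cm:.0f} cm of rise -> roughly {displaced / 1e6:.0f} million people")
# -> about 61 million, consistent with the "tens of millions" cited above
```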

But some scientists believe the idea is an implausible, wildly expensive distraction, drawing money, expertise, time, and resources away from more essential polar research efforts. 

“Sometimes we can get a little over-optimistic about what engineering can do,” says Twila Moon, deputy lead scientist at the National Snow and Ice Data Center at the University of Colorado Boulder.

“Two possible futures”

Minchew, who earned his PhD in geophysics at Caltech, says he was drawn to studying glaciers because they are rapidly transforming as the world warms, increasing the dangers of sea-level rise. 

“But over the years, I became less content with simply telling a more dramatic story about how things were going and more open to asking the question of what can we do about it,” says Minchew, who will return to Caltech as a professor this summer.

Last March, he cofounded Arête Glacier Initiative with Colin Meyer, an assistant professor of engineering at Dartmouth, in the hope of funding and directing research to improve scientific understanding of two big questions: How big a risk does sea-level rise pose in the coming decades, and can we minimize that risk?

Brent Minchew, an MIT professor of geophysics, co-founded Arête Glacier Initiative and will serve as its chief scientist.
COURTESY: BRENT MINCHEW

“Philanthropic funding is needed to address both of these challenges, because there’s no private-sector funding for this kind of research and government funding is minuscule,” says Mike Schroepfer, the former Meta chief technology officer turned climate philanthropist, who provided funding to Arête through his new organization, Outlier Projects.

The nonprofit has now raised about $5 million from Outlier and other donors, including the Navigation Fund, the Kissick Family Foundation, the Sky Foundation, the Wedner Family Foundation, and the Grantham Foundation. 

Minchew says they named the organization Arête mainly because an arête is the sharp mountain ridge left between two valleys when a glacier carves out the cirques on either side. It directs the movement of the glacier and is shaped by it. 

It’s meant to symbolize “two possible futures,” he says. “One where we do something; one where we do nothing.”

Improving forecasts

The somewhat reassuring news is that, even with rising global temperatures, it may still take thousands of years for the West Antarctic Ice Sheet to completely melt. 

In addition, sea-level rise forecasts for this century generally range from as little as 0.28 meters (11 inches) to 1.10 meters (about three and a half feet), according to the latest UN climate panel report. The latter only occurs under a scenario with very high greenhouse gas emissions (SSP5-8.5), which significantly exceeds the pathway the world is now on.

But there’s still a “low-likelihood” possibility, one that “cannot be excluded” given “deep uncertainty linked to ice-sheet processes,” that ocean levels could surge nearly two meters (about six and a half feet) by 2100, the report adds. 

Two meters of sea-level rise could force nearly 190 million people to migrate away from the coasts, unless regions build dikes or other shoreline protections, according to some models. Many more people, mainly in the tropics, would face heightened flooding dangers.

Much of the uncertainty over what will happen this century comes down to scientists’ limited understanding of how Antarctic ice sheets will respond to growing climate pressures.

The initial goal of Arête Glacier Initiative is to help narrow the forecast ranges by improving our grasp of how Thwaites and other glaciers move, melt, and break apart.

Gravity is the driving force nudging glaciers along the bedrock and reshaping them as they flow. But many of the variables that determine how fast they slide lie at the base. That includes the type of sediment the river of ice slides along; the size of the boulders and outcroppings it contorts around; and the warmth and strength of the ocean waters that lap at its face.

In addition, heat rising from deep in the earth warms the ice closest to the ground, creating a lubricating layer of water that hastens the glacier’s slide. That acceleration, in turn, generates more frictional heat that melts still more of the ice, creating a self-reinforcing feedback effect.
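
To make that feedback loop concrete, here is a minimal back-of-envelope sketch of basal frictional heating; the shear stress and sliding speed are illustrative, textbook-scale values rather than measurements of Thwaites:

```python
# Sketch of the friction-melt feedback described above. Frictional heat flux
# at the bed is roughly (basal shear stress) x (sliding speed); dividing by
# the latent heat of fusion gives an implied basal melt rate. The stress and
# speed used below are illustrative values, not Thwaites observations.

SECONDS_PER_YEAR = 3.15e7
RHO_ICE = 917.0        # kg/m^3, density of glacier ice
LATENT_HEAT = 3.34e5   # J/kg, latent heat of fusion of ice

def basal_melt_rate(shear_stress_pa: float, speed_m_per_yr: float) -> float:
    """Melt rate (meters of ice per year) from frictional heating alone."""
    speed_m_per_s = speed_m_per_yr / SECONDS_PER_YEAR
    heat_flux = shear_stress_pa * speed_m_per_s               # W/m^2
    return heat_flux / (RHO_ICE * LATENT_HEAT) * SECONDS_PER_YEAR

# A glacier sliding 500 m/yr over a bed offering ~100 kPa of drag:
print(f"{basal_melt_rate(1e5, 500.0) * 100:.0f} cm of basal melt per year")
# Faster sliding -> more frictional heat -> more meltwater -> faster sliding.
```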

Minchew and Meyer are confident that the glaciology field is at a point where it could speed up progress in sea-level rise forecasting, thanks largely to improving observational tools that are producing more and better data.

That includes a new generation of satellites orbiting the planet that can track the shifting shape of ice at the poles at far higher resolutions than in the recent past. Computer simulations of ice sheets, glaciers, and sea ice are improving as well, thanks to growing computational resources and advancing machine learning techniques.

On March 21, Arête will issue a request for proposals from research teams to contribute to an effort to collect, organize, and openly publish existing observational glacier data. Much of that expensively gathered information is currently inaccessible to researchers around the world, Minchew says.

Colin Meyer, an assistant professor of engineering at Dartmouth, co-founded Arête Glacier Initiative.

By funding teams working across these areas, Arête’s founders hope to help produce more refined ice-sheet models and narrower projections of sea-level rise.

This improved understanding would help cities plan where to build new bridges, buildings, and homes, and to determine whether they’ll need to erect higher seawalls or raise their roads, Meyer says. It could also provide communities with more advance notice of the coming dangers, allowing them to relocate people and infrastructure to safer places through an organized process known as managed retreat.

A radical intervention

But the improved forecasts might also tell us that Thwaites is closer to tumbling into the ocean than we think, underscoring the importance of considering more drastic measures.

One idea is to build berms or artificial islands to prop up fragile parts of glaciers, and to block the warm waters that rise from the deep ocean and melt them from below. Some researchers have also considered erecting giant, flexible curtains anchored to the seabed to achieve the latter effect.

Others have looked at scattering highly reflective beads or other materials across ice sheets, or pumping ocean water onto them in the hopes it would freeze during the winter and reinforce the headwalls of the glaciers.

But the concept of refreezing glaciers in place, known as a basal intervention, is gaining traction in scientific circles, in part because there’s a natural analogue for it.

The glacier that stalled

About 200 years ago, the Kamb Ice Stream, another glacier in West Antarctica that had been sliding about 350 meters (1,150 feet) per year, suddenly stalled.

Glaciologists believe an adjacent ice stream intersected with the catchment area under the glacier, providing a path for the water running below it to flow out along the edge instead. That loss of fluid likely slowed down the Kamb Ice Stream, reduced the heat produced through friction, and allowed the water at its bed to refreeze.

The deceleration of the glacier sparked the idea that humans might be able to bring about that same phenomenon deliberately, perhaps by drilling a series of boreholes down to the bedrock and pumping up water from the bottom.

Minchew himself has focused on a variation he believes could avoid much of the power demand and heavy machinery that approach would require: slipping long tubular devices, known as thermosyphons, down nearly to the bottom of the boreholes. 

These passive heat exchangers, which are powered only by the temperature differential between two areas, are commonly used to keep permafrost cold around homes, buildings, and pipelines in Arctic regions. The hope is that extremely long ones, stretching up to two kilometers and encased in steel pipe, could draw heat away from the bottom of the glacier, allowing the water below to freeze.

Minchew says he’s in the process of producing refined calculations, but estimates that halting Thwaites could require drilling as many as 10,000 boreholes over a 100-square-kilometer area.

He readily acknowledges that would be a huge undertaking, but provides two points of comparison to put such a project into context: Melting the necessary ice to create those holes would require roughly the amount of energy all US domestic flights consume from jet fuel in about two and a half hours. Or, it would produce about the same level of greenhouse gas emissions as constructing 10 kilometers of seawalls, a small fraction of the length the world would need to build if it can’t slow down the collapse of the ice sheets, he says.
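
For a rough sense of scale, his figures imply roughly one borehole every hundred meters. The sketch below works through the spacing and the melt energy; the hole diameter and ice temperature are assumptions for illustration only, and the jet-fuel and seawall comparisons are Minchew's, not reproduced here:

```python
# Back-of-envelope scale check for the borehole scenario described above.
# The 10,000-hole and 100 km^2 figures come from Minchew's estimate; the hole
# diameter and average ice temperature are assumptions for illustration.

import math

N_HOLES = 10_000
AREA_KM2 = 100.0
DEPTH_M = 2_000.0        # boreholes reaching down toward the bed
DIAMETER_M = 0.30        # assumed hot-water-drilled hole diameter
ICE_TEMP_C = -20.0       # assumed average temperature of the ice column

RHO_ICE = 917.0          # kg/m^3
C_ICE = 2_100.0          # J/(kg K), specific heat of ice
LATENT_HEAT = 3.34e5     # J/kg, latent heat of fusion

spacing_m = math.sqrt(AREA_KM2 * 1e6 / N_HOLES)
volume_per_hole = math.pi * (DIAMETER_M / 2) ** 2 * DEPTH_M
energy_per_hole = RHO_ICE * volume_per_hole * (C_ICE * abs(ICE_TEMP_C) + LATENT_HEAT)
total_energy_pj = N_HOLES * energy_per_hole / 1e15

print(f"One borehole roughly every {spacing_m:.0f} m")
print(f"Melt energy on these assumptions: ~{total_energy_pj:.1f} PJ")   # ~0.5 PJ
```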

“Kick the system”

One of Arête’s initial grantees is Marianne Haseloff, an assistant professor of geoscience at the University of Wisconsin-Madison. She studies the physical processes that govern the behavior of glaciers and is striving to more faithfully represent them in ice sheet models. 

Haseloff says she will use those funds to develop mathematical methods that could more accurately determine what’s known as basal shear stress, or the resistance of the bed to sliding glaciers, based on satellite observations. That could help refine forecasts of how rapidly glaciers will slide into the ocean, in varying settings and climate conditions.
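
For readers unfamiliar with the term, basal shear stress is usually weighed against the gravitational "driving stress" pulling the ice downslope. The toy calculation below shows that textbook quantity with illustrative numbers; it is not the satellite-based inversion method Haseloff is developing:

```python
# Minimal illustration of the gravitational driving stress that basal drag
# must resist: tau_d = rho * g * H * sin(alpha). A textbook relation with
# illustrative values; not the inversion method described above.

import math

RHO_ICE = 917.0   # kg/m^3
G = 9.81          # m/s^2

def driving_stress_kpa(thickness_m: float, surface_slope_rad: float) -> float:
    """Gravitational driving stress, in kilopascals, for a slab of ice."""
    return RHO_ICE * G * thickness_m * math.sin(surface_slope_rad) / 1e3

# A 2,000-meter-thick glacier with a gentle 0.2 percent surface slope:
print(f"{driving_stress_kpa(2000.0, 0.002):.0f} kPa of driving stress")
# Where the bed resists less than this, the ice slides faster.
```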

Arête’s other initial grant will go to Lucas Zoet, an associate professor in the same department as Haseloff and the principal investigator with the Surface Processes group.

He intends to use the funds to build the lab’s second “ring shear” device, an instrument that acts as a simulated glacier.

The existing device, which is the only one operating in the world, stands about eight feet tall and fills the better part of a walk-in freezer on campus. The core of the machine is a transparent drum filled with a ring of ice, sitting under pressure and atop a layer of sediment. It slowly spins for weeks at a time as sensors and cameras capture how the ice and earth move and deform.

Lucas Zoet, an associate professor at the University of Wisconsin–Madison, stands in front of his lab’s “ring shear” device, a simulated glacier.
ETHAN PARRISH

The research team can select the sediment, topography, water pressure, temperature, and other conditions to match the environment of a real-world glacier of interest, be it Thwaites today—or Thwaites in 2100, under a high greenhouse gas emissions scenario. 

Zoet says these experiments promise to improve our understanding of how glaciers move over different types of beds, and to refine an equation known as the slip law, which represents these glacier dynamics mathematically in computer models.
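
Slip laws are typically written as a relationship between basal drag and sliding speed. The sketch below shows two forms that appear widely in the glaciology literature, purely to illustrate the kind of relation such experiments help constrain; the parameter values are made up, and this is not the specific law Zoet's experiments will produce:

```python
# Two commonly cited slip-law forms, shown for illustration only.
# tau_b: basal drag (Pa), u_b: sliding speed (m/yr), N: effective pressure (Pa).
# All parameter values below are illustrative, not fitted to any glacier.

def weertman_drag(u_b: float, C: float = 1.0e4, m: float = 3.0) -> float:
    """Power-law (Weertman-style) sliding: drag keeps rising with speed."""
    return C * u_b ** (1.0 / m)

def regularized_coulomb_drag(u_b: float, N: float = 1.0e5, mu: float = 0.5,
                             u_t: float = 300.0, m: float = 3.0) -> float:
    """Coulomb-limited sliding: drag levels off near mu * N at high speed."""
    return mu * N * (u_b / (u_b + u_t)) ** (1.0 / m)

for speed in (10.0, 100.0, 1000.0):  # sliding speed in m/yr
    print(f"{speed:6.0f} m/yr: Weertman {weertman_drag(speed) / 1e3:5.1f} kPa, "
          f"regularized Coulomb {regularized_coulomb_drag(speed) / 1e3:5.1f} kPa")
```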

The second machine will enable them to run more experiments and to conduct a specific kind that the current device can’t: a scaled-down, controlled version of the basal intervention.

Zoet says the team will be able to drill tiny holes through the ice, then pump out water or transfer heat away from the bed. They can then observe whether the simulated glacier freezes to the base at those points and experiment with how many interventions, across how much space, are required to slow down its movement.

It offers a far easier and cheaper way to test different varieties of the basal intervention than using water drills to bore to the bottom of an actual glacier in Antarctica, Zoet says. The funding will allow the lab to explore a wide range of experiments, enabling them to “kick the system in a way we wouldn’t have before,” he adds.

“Virtually impossible”

The concept of glacier interventions is in its infancy. There are still considerable unknowns and uncertainties, including how much such an effort would cost, how arduous the undertaking would be, which approach would be most likely to work, and whether any of them are feasible at all.

“This is mostly a theoretical idea at this point,” says Katharine Ricke, an associate professor at the University of California, San Diego, who researches the international relations implications of geoengineering, among other topics.

Conducting extensive field trials or moving forward with full-scale interventions may also require surmounting complex legal questions, she says. Antarctica isn’t owned by any nation, but it’s the subject of competing territorial claims among a number of countries and governed under a decades-old treaty to which dozens are a party.

The basal intervention—refreezing the glacier to its bed—faces numerous technical hurdles that would make it “virtually impossible to execute,” Moon and dozens of other researchers argued in a recent preprint paper, “Safeguarding the polar regions from dangerous geoengineering.”

Among other critiques, they stress that subglacial water systems are complex, dynamic, and interconnected, making it extremely difficult to precisely identify and drill down to all the points that would be necessary to remove enough water or add enough heat to substantially slow down a massive glacier.

Further, they argue that the interventions could harm polar ecosystems by adding contaminants, producing greenhouse gases, or altering the structure of the ice in ways that may even increase sea-level rise.

“Overwhelmingly, glacial and polar geoengineering ideas do not make sense to pursue, in terms of the finances, the governance challenges, the impacts,” and the possibility of making matters worse, Moon says.

“No easy path forward”

But Douglas MacAyeal, professor emeritus of glaciology at the University of Chicago, says the basal intervention would have the lightest environmental impact among the competing ideas. He adds that nature has already provided an example of it working, and that much of the needed drilling and pumping technology is already in use in the oil industry.

“I would say it’s the strongest approach at the starting gate,” he says, “but we don’t really know anything about it yet. The research still has to be done. It’s very cutting-edge.”

Minchew readily acknowledges that there are big challenges and significant unknowns—and that some of these ideas may not work.

But he says it’s well worth the effort to study the possibilities, in part because much of the research will also improve our understanding of glacier dynamics and the risks of sea-level rise—and in part because it’s only a question of when, not if, Thwaites will collapse.

Even if the world somehow halted all greenhouse gas emissions tomorrow, the forces melting that fortress of ice would continue to do so. 

So one way or another, the world will eventually need to make big, expensive, difficult interventions to protect people and infrastructure. The cost and effort of doing one project in Antarctica, he says, would be small compared with the global effort required to erect thousands of miles of seawalls, elevate homes, buildings, and roads, and relocate hundreds of millions of people.

“One thing is challenging—and the other is even more challenging,” Minchew says. “There’s no easy path forward.”

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Will Google throw gasoline on the AI chip arms race?

The Nvidia processors, he explains, are for processing massive, large language models (LLMs), while the Google TPU is used for inferencing, the next step after processing the LLM. So the two chips don’t compete with each other, they complement each other, according to Gold. Selling and supporting processors may not

Read More »

Nvidia moves deeper into AI infrastructure with SchedMD acquisition

“Slurm excels at orchestrating multi-node distributed training, where jobs span hundreds or thousands of GPUs,” said Lian Jye Su, chief analyst at Omdia. “The software can optimize data movement within servers by deciding where jobs should be placed based on resource availability. With strong visibility into the network topology, Slurm

Read More »

ExxonMobil bumps up 2030 target for Permian production

ExxonMobil Corp., Houston, is looking to grow production in the Permian basin to about 2.5 MMboe/d by 2030, an increase of 200,000 boe/d from executives’ previous forecasts and a jump of more than 45% from this year’s output. Helping drive that higher target is an expected 2030 cost profile that

Read More »

Strategists Forecast Week on Week USA Crude Build

In an oil and gas report sent to Rigzone by the Macquarie team this week, Macquarie strategists, including Walt Chancellor, revealed that they are forecasting that U.S. crude inventories will be up by 2.5 million barrels for the week ending December 12. “This follows a 1.8 million barrel draw in the prior week, with the crude balance realizing quite loose relative to our expectations amidst an apparent surge in Canadian imports,” the strategists said in the report. “While our balances point to a much looser fundamental picture this week, we note some potential for a ‘catch-up’ to the tighter side in this week’s data,” they added. “For this week’s balance, from refineries, we look for a minimal reduction in crude runs. Among net imports, we model a small increase, with exports lower (-0.1 million barrels per day) and imports higher (+0.1 million barrels per day) on a nominal basis,” they continued. The strategists warned in the report that the timing of cargoes remains a source of potential volatility in this week’s crude balance. “From implied domestic supply (prod.+adj.+transfers), we look for an increase (+0.4 million barrels per day) on a nominal basis this week,” the strategists went on to note. “Rounding out the picture, we anticipate another small increase (+0.3 million barrels) in SPR [Strategic Petroleum Reserve] stocks this week,” they added. The analysts also stated in the report that, “among products”, they “again look for across the board builds (gasoline/ distillate/jet +5.2/+2.0/+1.5 million barrels)”. “We model implied demand for these three products at ~14.3 million barrels per day for the week ending December 12,” they said. In its latest weekly petroleum status report at the time of writing, which was released on December 10 and included data for the week ending December 5, the U.S. Energy Information Administration (EIA)

Read More »

SK On pivots to stationary energy storage after Ford joint venture ends

Dive Brief: Korean battery maker SK On says it remains committed to building out a Tennessee plant originally intended to supply electric vehicle batteries to Ford after a joint venture with the car maker was called off, the company said in a statement. The manufacturer will maintain its strategic partnership with Ford and continue to supply EV batteries for its future vehicles, SK Americas spokesperson Joe Guy Collier said in an email. However, going forward, SK On plans to focus more on “profitable and sustainable growth” in the U.S. by supplying batteries produced in the Tennessee plant to other customers, including for stationary energy storage systems, the company said. “This agreement allows SK On to strategically realign assets and production capacity to improve its operational efficiency,” the battery maker said in a statement. “It also enables the company to enhance productivity, operational flexibility, and respond more effectively to evolving market dynamics and diverse customer needs.” Dive Insight: Ford and SK On reached a mutual agreement to dissolve their electric vehicle battery joint venture, BlueOval SK, Collier confirmed in an email last week.  The joint venture was established in September 2021 as part of a planned $11.4 billion investment by the two companies to build three large-scale manufacturing plants — one in Tennessee and two in Kentucky —  to produce advanced batteries for Ford’s future EVs.  Under the terms of the dissolution agreement, each company will independently own and operate the joint venture’s former production facilities, Collier said. A Ford subsidiary will take full ownership of the two battery plants in Kentucky, and SK On will assume full ownership and operate the battery plant in Tennessee. “SK On is committed to the Tennessee plant long-term,” the company said. “We plan to make it a key part of our manufacturing base for advanced batteries

Read More »

Shell Adds New Gas Customer in Nigeria

Shell PLC, through Shell Nigeria Gas Ltd (SNG), has signed an agreement to supply natural gas to SG Industrial FZE. The new customer is “a leading steel company in the Guandong industrial zone in the state”, the British company said on its Nigerian website. “The agreement adds to a growing list of clients for SNG which has developed as a dependable supplier of gas through distribution pipelines of some 150 kilometers [93.21 miles], serving over 150 clients in Abia, Bayelsa, Ogun and Rivers states”, Shell said. Shell did not disclose the contract volume or value. SNG managing director Ralph Gbobo said, “Our commitment is clear – to build, operate and maintain a gas distribution system that is not only reliable but resilient, transparent and designed to fuel growth”. SG Industrial vice general manager Moya Shua said, “This collaboration marks a major step forward in securing reliable energy that will power our growth and long-term ambitions”. Shell said it had previously signed agreements to supply pipeline gas to Nigeria Distilleries Ltd III, Reliance Chemical Products Limited II, Rumbu Industries Nigeria Ltd and Ultimum Ltd. Expanding its gas operations in the West African country, Shell recently announced a final investment decision to develop the HI field to supply up to 350 million standard cubic feet of gas a day, equivalent to about 60,000 oil barrels per day, to Nigeria LNG. The project is part of a joint venture in which Shell owns 40 percent through Shell Nigeria Exploration and Production Co Ltd. Sunlink Energies and Resources Ltd holds 60 percent. At Nigeria LNG, which has a declared capacity of 22 million metric tons of liquefied natural gas a year, Shell owns 25.6 percent. “The increase in feedstock to NLNG, via the train VII project that aims to expand the Bonny Island terminal’s production capacity,

Read More »

Energy Secretary Ensures Washington Coal Plant Remains Open to Ensure Affordable, Reliable and Secure Power Heading into Winter

Emergency order addresses critical grid reliability issues, lowering risk of blackouts and ensuring affordable electricity access WASHINGTON—U.S. Secretary of Energy Chris Wright today issued an emergency order to ensure Americans in the Northwestern region of the United States have access to affordable, reliable and secure electricity heading into the cold winter months. The order directs TransAlta to keep Unit 2 of the Centralia Generating Station in Centralia, Washington available to operate. Unit 2 of the coal plant was scheduled to shut down at the end of 2025. The reliable supply of power from the Centralia coal plant is essential for grid stability in the Northwest. The order prioritizes minimizing the risk and costs of blackouts. “The last administration’s energy subtraction policies had the United States on track to experience significantly more blackouts in the coming years — thankfully, President Trump won’t let that happen,” said Energy Secretary Wright. “The Trump administration will continue taking action to keep America’s coal plants running so we can stop the price spikes and ensure we don’t lose critical generation sources. Americans deserve access to affordable, reliable, and secure energy to heat their homes all the time, regardless of whether the wind is blowing or the sun is shining.” According to DOE’s Resource Adequacy Report, blackouts were on track to potentially increase 100 times by 2030 if the U.S. continued to take reliable power offline as it did during the Biden administration. The North American Electric Reliability Corporation (NERC) determined in its 2025-2026 Winter Reliability Assessment that the WECC Northwest region is at elevated risk during periods of extreme weather, such as prolonged, far-reaching cold snaps.  This order is in effect beginning on December 16, 2025, and continuing until March 16, 2026.  Background:  The NERC Winter Reliability Assessment warns that “extreme winter conditions extending over

Read More »

Wood Says Mideast Contract Wins Exceeded $1B in 2025

John Wood Group PLC said Tuesday it has won more than $1 billion in contracts across the Middle East this year, exceeding last year’s company record. “Wood has seen a near 20 percent increase in awards compared to 2024, with wins across United Arab Emirates, Iraq, Kingdom of Saudi Arabia, Bahrain, Kuwait, Oman and Qatar”, the Aberdeen, Scotland-based engineering and consulting company said in an online statement. Ellis Renforth, president of operations for Europe, Middle East and Africa at Wood, said, “This year we’ve delivered critical solutions across the Middle East to improve asset reliability and cut emissions”. “In 2026, we’ll build on this success by expanding our operations and maintenance services in the region. Our focus is on proven approaches to asset management and modifications that improve efficiency and reduce downtime – practical steps that strengthen energy security and decarbonization”, Renforth added. Stuart Turl, Wood vice president for Middle East consulting, said, “Decarbonization and digitalization remain central to how we support clients in the Middle East. This year, we launched our specialist Middle East Energy Transition and Digital & AI Hubs to further support clients in accelerating emissions reduction while unlocking efficiencies through AI-driven solutions”. “This in-region advisory enables practical pathways to carbon reduction while supporting national visions for a sustainable energy future. Delivery has already spanned initiatives such as minerals procurement, hydrogen production facilities and carbon capture and storage infrastructure”, Turl said. On May 27 Wood said it had secured a contract from TA’ZIZ, a joint venture of Abu Dhabi National Oil Co (ADNOC) PJSC, TA’ZIZ to provide project management consultancy for the development of the UAE’s first methanol production facility, to rise in Al Ruwais Industrial City. “Construction will be completed by 2028 and the plant will be one of the largest methanol plants in the world, producing 1.8 million tonnes per year. It will be powered using the latest clean energy technology”, Wood noted. On June 10 Wood said it

Read More »

EU to Scrap Combustion Engine Ban

The European Union is set to propose softening emissions rules for new cars, scrapping an effective ban on combustion engines following months of pressure from the automotive industry. The proposal will allow carmakers to slow the rollout of electric vehicles in Europe and aligns the region more closely with the US, where President Donald Trump is tearing up efficiency standards for cars put in place by the previous administration. Globally, automakers are struggling to make the shift profitable, with Ford Motor Co. announcing it will take $19.5 billion in charges tied to a sweeping overhaul of its EV business. The European stepback – to be unveiled Tuesday – follows a global pullback from green policies as economic realities of major transformations set in. Mounting trade tensions with the US and China are pushing Europe to further prioritize shoring up its own industry. Although the bloc is legally bound to reach climate neutrality by 2050, governments and companies are intensifying calls for more flexibility, warning that rigid targets could jeopardize economic stability. Under the new proposal, the European Commission will lower the requirements that would have halted sales of new gasoline and diesel-fueled cars starting in 2035, instead allowing a number of plug-in hybrids and electric vehicles with fuel-powered range extenders, according to people with knowledge of the matter.  Tailpipe emissions will have to be reduced by 90 percent by the middle of the next decade compared with the current goal of a 100 percent reduction, said the people, who asked not to be identified because talks on the proposal are private. The commission will set a condition that carmakers need to compensate for the additional pollution by using low-carbon or renewable fuels or locally produced green steel. The European Commission declined to comment. The proposal is set to be adopted by EU commissioners on

Read More »

Uptime Institute’s Max Smolaks: Power, Racks, and the Economics of the AI Data Center Boom

The latest episode of the Data Center Frontier Show opens not with a sweeping thesis, but with a reminder of just how quickly the industry’s center of gravity has shifted. Editor in Chief Matt Vincent is joined by Max Smolaks, research analyst at Uptime Institute, whom DCF met in person earlier this year at the Open Compute Project (OCP) Global Summit 2025 in San Jose. Since then, Smolaks has been closely tracking several of the most consequential—and least obvious—threads shaping the AI infrastructure boom. What emerges over the course of the conversation is not a single narrative, but a set of tensions: between power and place, openness and vertical integration, hyperscale ambition and economic reality. From Crypto to Compute: An Unlikely On-Ramp One of the clearest structural patterns Smolaks sees in today’s AI buildout is the growing number of large-scale AI data center projects that trace their origins back to cryptocurrency mining. It is a transition few would have predicted even a handful of years ago. Generative AI was not an anticipated workload in traditional capacity planning cycles. Three years ago, ChatGPT did not exist, and the industry had not yet begun to grapple with the scale, power density, and energy intensity now associated with AI training and inference. When demand surged, developers were left with only a limited set of viable options. Many leaned heavily on on-site generation—most often natural gas—to bypass grid delays. Others ended up in geographies that had already been “discovered” by crypto miners. For years, cryptocurrency operators had been quietly mapping underutilized power capacity. Latency did not matter. Proximity to population centers did not matter. Cheap, abundant electricity did—often in remote or unconventional locations that would never have appeared on a traditional data center site-selection short list. As crypto markets softened, those same sites became

Read More »

Google’s TPU Roadmap: Challenging Nvidia’s Dominance in AI Infrastructure

Google’s roadmap for its Tensor Processing Units has quietly evolved into a meaningful counterweight to Nvidia’s GPU dominance in cloud AI infrastructure—particularly at hyperscale. While Nvidia sells physical GPUs and associated systems, Google sells accelerator services through Google Cloud Platform. That distinction matters: Google isn’t competing in the GPU hardware market, but it is increasingly competing in the AI compute services market, where accelerator mix and economics directly influence hyperscaler strategy. Over the past 18–24 months, Google has focused on identifying workloads that map efficiently onto TPUs and has introduced successive generations of the architecture, each delivering notable gains in performance, memory bandwidth, and energy efficiency. Currently, three major TPU generations are broadly available in GCP: v5e and v5p, the “5-series” workhorses tuned for cost-efficient training and scale-out learning. Trillium (v6), offering a 4–5× performance uplift over v5e with significant efficiency gains. Ironwood (v7 / TPU7x), a pod-scale architecture of 9,216 chips delivering more than 40 exaFLOPS FP8 compute, designed explicitly for the emerging “age of inference.” Google is also aggressively marketing TPU capabilities to external customers. The expanded Anthropic agreement (up to one million TPUs, representing ≥1 GW of capacity and tens of billions of dollars) marks the most visible sign of TPU traction. Reporting also suggests that Google and Meta are in advanced discussions for a multibillion-dollar arrangement in which Meta would lease TPUs beginning in 2026 and potentially purchase systems outright starting in 2027. At the same time, Google is broadening its silicon ambitions. The newly introduced Axion CPUs and the fully integrated AI Hypercomputer architecture frame TPUs not as a standalone option, but as part of a multi-accelerator environment that includes Nvidia H100/Blackwell GPUs, custom CPUs, optimized storage, and high-performance fabrics. What follows is a deeper look at how the TPU stack has evolved, and what

Read More »

DCF Trends Summit 2025: Beyond the Grid – Natural Gas, Speed, and the New Data Center Reality

By 2025, the data center industry’s power problem has become a site-selection problem, a finance problem, a permitting problem and, increasingly, a communications problem. That was the throughline of “Beyond the Grid: Natural Gas, Speed, and the New Data Center Reality,” a DCF Trends Summit panel moderated by Stu Dyer, First Vice President at CBRE, with Aad den Elzen, VP of Power Generation at Solar Turbines (a Caterpillar company); Creede Williams, CEO & President of Exigent Energy Partners; and Adam Michaelis, Vice President of Hyperscale Engineering at PointOne Data Centers. In an industry that once treated proximity to gas infrastructure as a red flag, Dyer opened with a blunt marker of the market shift: what used to be a “no-go” is now, for many projects, the shortest path to “yes.” Vacancy is tight, preleasing is high, and the center of gravity is moving both in scale and geography as developers chase power beyond the traditional core. From 48MW Campuses to Gigawatt Expectations Dyer framed the panel’s premise with a Northern Virginia memory: a “big” 48MW campus in Sterling that was expected to last five to seven years—until a hyperscale takedown effectively erased the runway. That was the early warning sign of what’s now a different era entirely. Today, Dyer said, the industry isn’t debating 72MW or even 150MW blocks. Increasingly, the conversation starts at 500MW critical and, for some customers, pushes past a gigawatt. Grid delivery timelines have not kept pace with that shift, and the mismatch is forcing alternative strategies into the mainstream. “If you’re interested in speed and scale… gas.” If there was a sharp edge to the panel, it came from Williams’ assertion that for near-term speed-to-power at meaningful scale, natural gas is the only broadly viable option. Williams spoke as an independent power producer (IPP) operator who

Read More »

Roundtable: The Economics of Acceleration

Ben Rapp, Rehlko: The pace of AI deployment is outpacing grid capacity in many regions, which means power strategy is now directly tied to deployment timelines. To move fast without sacrificing lifecycle cost or reliability, operators are adopting modular power systems that can be installed and commissioned quickly, then expanded or adapted as loads grow. From an energy perspective, this requires architectures that support multiple pathways: traditional generation, cleaner fuels like HVO, battery energy storage, and eventually hydrogen or renewable integrations where feasible. Backup power is no longer a static insurance policy, it’s a dynamic part of the operating model, supporting uptime, compliance, and long-term cost management. Rehlko’s global footprint and broad energy portfolio enable us to support operators through these transitions with scalable solutions that meet existing technical needs while providing a roadmap for future adaptation.

Read More »

DCF Trends Summit 2025: Bridging the Data Center Power Gap – Utilities, On-Site Power, and the AI Buildout

The second installment in our recap series from the 2025 Data Center Frontier Trends Summit highlights a panel that brought unusual candor—and welcome urgency—to one of the defining constraints of the AI era: power availability. Moderated by Buddy Rizer, Executive Director of Economic Development for Loudoun County, Bridging the Data Center Power Gap: Ways to Streamline the Energy Supply Chain convened a powerhouse group of energy and data center executives representing on-site generation, independent power markets, regulated utilities, and hyperscale operators: Jeff Barber, VP of Global Data Centers, Bloom Energy Bob Kinscherf, VP of National Accounts, Constellation Stan Blackwell, Director, Data Center Practice, Dominion Energy Joel Jansen, SVP Regulated Commercial Operations, American Electric Power David McCall, VP of Innovation, QTS Data Centers As presented on September 26, 2025 in Reston, Virginia, the discussion quickly revealed that while no single answer exists to the industry’s power crunch, a more collaborative, multi-path playbook is now emerging—and evolving faster than many realize. A Grid Designed for Yesterday Meets AI-Era Demand Curves Rizer opened with context familiar to anyone operating in Northern Virginia: this region sits at the epicenter of globally scaled digital infrastructure, but its once-ample headroom has evaporated under the weight of AI scaling cycles. Across the panel, the message was consistent: demand curves have shifted permanently, and the step-changes in load growth require new thinking across the entire energy supply chain. Joel Jansen (AEP) underscored the pace of change. A decade ago, utilities faced flat or declining load growth. Now, “our load curve is going straight up,” driven by hyperscale and AI training clusters that are large, high-density, and intolerant of slow development cycles. AEP’s 40,000 miles of transmission and 225,000 miles of distribution infrastructure give it perspective: generation is challenging, but transmission and interconnection timelines are becoming decisive gating factors.

Read More »

DCF Trends Summit 2025 – Scaling AI: Adaptive Reuse, Power-Rich Sites, and the New GPU Frontier

When Jones Lang LaSalle (JLL)’s Sean Farney walked back on stage after lunch at the Data Center Frontier Trends Summit 2025, he didn’t bother easing into the topic. “This is the best one of the day,” he joked, “and it’s got the most buzzwords in the title.” The session, “Scaling AI: The Role of Adaptive Reuse and Power-Rich Sites in GPU Deployment,” lived up to that billing. Over the course of the hour, Farney and his panel of experts dug into the hard constraints now shaping AI infrastructure—and the unconventional sites and power strategies needed to overcome them. Joining Farney on stage were:

Lovisa Tedestedt, Strategic Account Executive – Cloud & Service Providers, Schneider Electric
Phill Lawson-Shanks, Chief Innovation Officer, Aligned Data Centers
Scott Johns, Chief Commercial Officer, Sapphire Gas Solutions

Together, they painted a picture of an industry running flat-out, where adaptive reuse, modular buildouts, and behind-the-meter power are becoming the fastest path to AI revenue.

The Perfect Storm: 2.3% Vacancy, Power-Constrained Revenue

Farney opened with fresh JLL research that set the stakes in stark terms. U.S. colo vacancy is down to 2.3% – roughly 98% utilization. Just five years ago, vacancy was about 10%. The industry is tracking to over 5.4 GW of colocation absorption this year, with 63% of first-half absorption concentrated in just two markets: Northern Virginia and Dallas. There’s roughly 8 GW of build pipeline, but about 73% of that is already pre-leased, largely by hyperscalers and “Mag 7” cloud and AI giants. “We are the envy of every industry on the planet,” Farney said. “That’s fantastic if you’re in the data center business. It’s a really bad thing if you’re a customer.” The message to CIOs and CTOs was blunt: if you don’t have a capacity strategy dialed in, your growth may be constrained

Read More »
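As a quick sanity check on the market figures Farney cited, here is a back-of-the-envelope calculation in Python. The numbers come straight from the recap above; the arithmetic and variable names are only illustrative.

```python
# Back-of-the-envelope arithmetic on the JLL figures cited above (illustrative only).

vacancy = 0.023                      # U.S. colo vacancy
utilization = 1 - vacancy
print(f"Implied utilization: {utilization:.1%}")              # ~97.7%, i.e. roughly 98%

pipeline_gw = 8.0                    # approximate build pipeline
preleased_share = 0.73               # share already pre-leased
unleased_gw = pipeline_gw * (1 - preleased_share)
print(f"Pipeline not yet pre-leased: {unleased_gw:.1f} GW")   # ~2.2 GW

absorption_gw = 5.4                  # full-year colocation absorption tracking
# 63% of *first-half* absorption sat in Northern Virginia and Dallas; that share is
# noted as context and deliberately not applied to the full-year 5.4 GW figure.
top_two_markets_share_h1 = 0.63
```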

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »
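For scale, here is a short Python sketch that works out the growth multiples implied by the capex figures quoted above. The dollar amounts are taken from the excerpt; the calculations themselves are only illustrative.

```python
# Rough growth arithmetic on the capex figures cited above (illustrative only).

msft_fy2025_plan = 80.0      # $B, stated plan for the fiscal year ending June 30, 2025
msft_cy2025_est  = 62.4      # $B, Bloomberg Intelligence calendar-2025 estimate
msft_2020_capex  = 17.6      # $B, Microsoft's 2020 capital expenditure

print(f"FY2025 plan vs 2020 capex: {msft_fy2025_plan / msft_2020_capex:.1f}x")    # ~4.5x

industry_2025 = 200.0        # $B, combined estimate for the six large cloud/AI spenders
industry_2023 = 110.0        # $B, combined 2023 figure
growth = (industry_2025 - industry_2023) / industry_2023
print(f"Estimated industry capex growth, 2023 to 2025: {growth:.0%}")             # ~82%
```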

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it has become a regular among the non-tech companies showing off technology at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that builds agents for other companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »
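The “LLM as a judge” pattern mentioned above is straightforward to sketch: several (often cheaper) models produce candidate answers, and another model scores or selects among them. The sketch below is a minimal illustration under that assumption; call_model is a stand-in for whatever chat-completion client you use, and the model names, prompts, and parsing are hypothetical rather than taken from the article.

```python
# Minimal sketch of the "LLM as a judge" pattern (illustrative only).
# call_model() is a placeholder for a real LLM API call; no specific vendor API is assumed.

def call_model(model: str, prompt: str) -> str:
    """Stand-in for a chat-completion request to the named model."""
    return f"[{model}] response to: {prompt[:40]}..."

def generate_candidates(task: str, worker_models: list[str]) -> list[str]:
    # Fan the same task out to several (possibly cheaper) models.
    return [call_model(m, task) for m in worker_models]

def judge(candidates: list[str], judge_model: str, task: str) -> str:
    # Ask a judge model to pick the best candidate.
    rubric = "Pick the answer that best satisfies the task. Reply with its number.\n"
    numbered = "\n".join(f"{i}. {c}" for i, c in enumerate(candidates))
    verdict = call_model(judge_model, rubric + f"Task: {task}\n{numbered}")
    # A real implementation would parse `verdict` for the chosen index;
    # with the stubbed call above we simply fall back to the first candidate.
    return candidates[0]

task = "Summarize the customer's refund policy question in one sentence."
candidates = generate_candidates(task, ["small-model-a", "small-model-b", "small-model-c"])
best = judge(candidates, judge_model="larger-judge-model", task=task)
print(best)
```

The economics follow directly from the excerpt: as per-call costs fall, running three or more worker models plus a judge becomes cheaper than relying on a single large model for every request.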

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that in-house testing techniques may have missed and that might otherwise have made it into a released model. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »
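To give a sense of the general shape of automated red teaming with auto-generated rewards, here is a conceptual sketch of the loop: an attacker proposes prompts, the target model responds, and a reward combining a safety check with a diversity bonus decides which attacks to keep. This is emphatically not OpenAI’s implementation; attacker_propose, target_respond, and rule_based_reward are illustrative stubs standing in for learned components.

```python
# Conceptual sketch of an automated red-teaming loop with an auto-generated reward,
# loosely following the shape described above (not OpenAI's method; all stubs are hypothetical).

import random

def attacker_propose(history: list[str]) -> str:
    # A real attacker would be an LLM trained with reinforcement learning; here we vary a template.
    return f"attack-prompt-{random.randint(0, 9999)} (avoiding {len(history)} prior attacks)"

def target_respond(prompt: str) -> str:
    # Stand-in for the model under test.
    return f"target response to {prompt}"

def rule_based_reward(response: str) -> float:
    # Stand-in for an auto-generated reward: 1.0 if the response looks unsafe, else 0.0.
    return 1.0 if "unsafe" in response.lower() else 0.0

def diversity_bonus(prompt: str, accepted: list[str]) -> float:
    # Encourage novel attacks; a real system would compare embeddings, not exact strings.
    return 0.0 if prompt in accepted else 0.5

accepted_attacks: list[str] = []
for step in range(20):                      # multi-step loop standing in for RL episodes
    prompt = attacker_propose(accepted_attacks)
    response = target_respond(prompt)
    reward = rule_based_reward(response) + diversity_bonus(prompt, accepted_attacks)
    if reward >= 0.5:                       # keep attacks that score well on reward + novelty
        accepted_attacks.append(prompt)

print(f"Collected {len(accepted_attacks)} candidate attacks")
```

The point of the diversity term is the one the paper title emphasizes: without it, an automated attacker tends to converge on a narrow family of prompts instead of a broad spectrum of novel attacks.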