AI could keep us dependent on natural gas for decades to come

The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers (no takers) and after that to other industries that might want to occupy more than a thousand acres just off the interstate.

So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive $10 billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing the 5,000 construction jobs and 500 data-center jobs the project is expected to create and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution!

The AI data center also promises to transform the state’s energy future. Stretching more than a mile in length, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone (the electricity for cooling and other building needs will add to that). When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity.
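For a rough sense of how cooling and building overhead stack on top of that two-gigawatt compute figure, data-center operators use power usage effectiveness (PUE): the ratio of total facility power to IT power. The sketch below is a back-of-envelope illustration only; the PUE value is an assumed, industry-typical figure, not one reported by Meta or Entergy.

```python
# Back-of-envelope estimate of total facility demand from IT load.
# The 2 GW compute figure comes from the article; the PUE below is
# an assumed, industry-typical value, not a reported one.

IT_LOAD_GW = 2.0      # electricity for computation alone
ASSUMED_PUE = 1.2     # hypothetical: total facility power / IT power

total_demand_gw = IT_LOAD_GW * ASSUMED_PUE
overhead_gw = total_demand_gw - IT_LOAD_GW

print(f"Total facility demand: {total_demand_gw:.1f} GW")       # 2.4 GW
print(f"Cooling and building overhead: {overhead_gw:.1f} GW")   # 0.4 GW
```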

To power the data center, Entergy aims to spend $3.2 billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24/7 electricity demand from the huge data center.

Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel.

The development has sparked interest from the US Congress; last week, Senator Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works, issued a letter to Meta that called out the company’s plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions “by funding carbon capture and a solar project are vague and offer little reassurance.”

The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon.

The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost (more or less), and you know how to scale it and get it approved,” says Victor. “Even for [AI] companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.”

The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina, and Georgia—is coming from data centers; in those three states, data centers account for around 65% to 85% of projected load growth.

“It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.”

But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like.

For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments.

The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands.

The reign of natural gas

Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, natural gas was selling for $13 per million Btu (a measure of thermal energy); last year, it averaged just $2.21, the lowest annual price (adjusting for inflation) ever reported, according to the US Energy Information Administration (EIA).

Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit.

Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.  

But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify.

Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035 (roughly equivalent to the emissions from a large US state such as Florida), relative to a future in which the use of fossil fuel gradually winds down.

Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions.

Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology.  

But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up?

Times of stress

AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of energy-efficient AI, but it certainly raises the possibility of such advances. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China, and clues that the same could be starting to happen in the US.

Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power.

There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available, on average, around the country and on most regional grids. Utilities typically use only about 53% of their available capacity over the course of the year, according to a Duke study. The problem is that they must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave.

The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke’s Nicholas School of the Environment, and his colleagues call “headroom” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity.

Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, utilities would be able to handle some 76 gigawatts of additional demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029.
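A minimal sketch of that headroom logic, using invented numbers rather than the Duke study’s data or methodology: a flat new load that would breach generating capacity at the annual peak can still connect if it curtails to half power during the handful of hours that would otherwise exceed capacity.

```python
import numpy as np

# Illustrative sketch of the "headroom" argument. All numbers here are
# invented for demonstration; the Duke study's data and model are more involved.
rng = np.random.default_rng(0)
hours = 8760  # one year
# Hypothetical regional load with a daily cycle plus noise, in GW.
base_load = 70 + 15 * np.sin(np.arange(hours) * 2 * np.pi / 24) + rng.normal(0, 3, hours)
capacity = 100.0      # assumed firm generating capacity, GW
new_load = 8.0        # proposed flat data-center load, GW
curtail_frac = 0.5    # data center cuts its use by half when asked

# Inflexible case: the new load must fit under capacity even at the annual peak.
print(f"Room for inflexible load: {capacity - base_load.max():.1f} GW")

# Flexible case: curtail during the hours that would otherwise breach capacity.
over = base_load + new_load > capacity
effective = base_load + np.where(over, new_load * curtail_frac, new_load)
print(f"Hours of curtailment needed: {int(over.sum())} of {hours}")
print(f"Hours still over capacity:   {int((effective > capacity).sum())}")
```

In this toy setup, a load too big to connect without new plants fits with only a small number of half-power hours a year, the same order of magnitude as the 80-to-90 stressed hours Porter estimates below.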

“The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models (such as Meta’s facility in Richland Parish), can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress.

The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.”

AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI (the Electric Power Research Institute), a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies, including Meta and Google, to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.”

Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says.

Footing the bill

Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants.

“The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation.

In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and need for full 24/7 power leave no alternative to natural gas. 

In the application it filed with the LPSC last fall, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much.

Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline.

“Entergy is saying [Meta] needs around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to take [Entergy’s] word for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.”      

In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but “as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate.

The February motion also raised concern over who will end up paying for the new gas plants. Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants but didn’t respond to requests by MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early.

Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over $200 million to support the Richland Parish data centers with new infrastructure, including roads and water systems. 

Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of the country’s most unreliable grids. Of special concern is what happens after the 15 years are up.

“Our biggest long-term concern is that in 15 years, residential ratepayers [and] small businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director.

Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.”

The Harvard authors write, “Utilities tell [public utility commissions] what they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.”

The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies.

Huge new AI data centers like the one in Richland Parish could in fact be a huge economic boon by providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana’s residents will have to live with—and possibly pay for—the changes in the decades to come.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Survey: AMD continues to take server share from Intel

Dean McCarron, president of Mercury, said it’s not AMD stealing Intel business but mostly a case of AMD growing faster than Intel. “AMD’s growth rate in the quarter was multiples of Intel’s, resulting in significant server share gains,” he said in a research note. “Server processor shipments were definitively the

Read More »

Charging Forward: Centrica flags Rough shutdown in blow to hydrogen storage hopes

In this week’s Charging Forward, Centrica has warned it could shut down its Rough offshore gas storage facility without UK government support. Meanwhile, Harmony Energy is in the midst of a bidding war for its battery storage portfolio with competing offers from Drax Group and Foresight Group. In addition, Field Energy, SAE, Balance Power and EOS are all progressing various battery energy storage system (BESS) projects across the UK. This week’s UK energy storage headlines: Centrica warns it may close Rough gas storage facility Harmony Energy Income Trust to auction battery storage portfolio SAE forms joint venture with Econergy for 250 MW Uskmouth BESS EOS gets approval for £62m Teesside BESS Field secures £42m loan to accelerate three UK BESS projects Balance Power secures approval for 29.9 MW Ayrshire BESS Devon council rejects Clearstone Enrgy 150 MW BESS plans International energy storage news: Noon Energy targets ultra long duration storage and China’s Sungrow installs BESS near Arctic Circle Centrica warns it may close Rough offshore gas storage facility British Gas owner Centrica has warned it may shut down and decommission its Rough offshore gas storage facility without additional UK government support. In an interview with the BBC, Centrica chief executive Chris O’Shea said operating the Rough facility will lead to a £100 million loss for the company this year. “If we were to simply spend £2 billion redeveloping this field and the summer-winter gas price stays the same, then we will lose that £2bn and we’ll lose the cost of operating the facility,” O’Shea said. “It’s just not sustainable.” Centrica initially shut down Rough in 2017, but moved to reopen the site following Russia’s invasion of Ukraine in 2022. © Supplied by CentricaCentrica has plans to revamp its Rough site into a storage base for hydrogen. The company is investigating converting

Read More »

Transmission charging adds £1bn to Scottish offshore wind farm costs

Transmission charging is making Scottish offshore wind farms £1 billion more expensive than their English counterpart. A report from Aurora Energy Research found that the potential changes, currently being considered by the UK government, would add the £1bn figure to a 1GW wind farm over the course of its operating life. The report added that reforming the transmission charging model could save billpayers £16bn and protect investments in offshore wind projects. Transmission charging is a levy placed on generators, having originally been designed to incentivise building asset near major cities in England. However, Scotland is looking to build over 45GW of offshore wind in the country’s waters, driven through the ScotWind and INTOG leasing rounds. Transmission charging is now effectively penalising generation in these areas. Scottish Renewables chief executive Claire Mack said: “Scotland’s abundant natural resources should make it the home of the UK’s biggest and most productive renewable energy projects but our outdated transmission charging rules, designed over 30 years ago, are unbalancing how the modern-day electricity network should be paid for which is negatively impacting the development of major sites. “These charges are both volatile and unpredictable, unfairly penalising Scottish projects by tens of millions of pounds every year.” © Supplied by Scottish RenewablesScottish Renewables chief executive Claire Mack. She added that the UK government cannot meet its 2030 clean power targets without Scottish offshore wind. Instead, she urged the UK government and Ofgem to implement a cap and floor model for transmission charging that alleviates these costs and keeps projects on track. “Delivering this meaningful reform will provide a stable, investment-friendly environment – one that protects the clean power projects vital to creating green jobs at scale and delivering a secure, sustainable energy system for the future,” Mack said. Transmission charging reforms According to the report, transmission

Read More »

Petrofac restructuring approved by courts as it looks to ‘healthy pipeline of future opportunities’

Aberdeen-based oil services firm Petrofac has had its restructuring plan approved by the High Court of England and Wales. The business has been pursuing a cost-cutting restructuring plan with lenders in order to bring its books back into the black and has now received the green light. In February, Petrofac announced plans to raise $355 million (£280m) in funding as part of the deal that has further reduced shareholder allocation. Existing shareholders were to be allocated 2.2% firm’s total share capital as part of the deal, a downgrade compared to the 2.5% outlined in plans announced just before Christmas. Petrofac appeared in court on 30 April and witnessed “overwhelming support of shareholders and the majority of creditor classes,” it claimed in a stock market update. The firm’s chief executive, Tareq Kawash said: “The sanctioning of the restructuring plan marks a significant milestone for our group. “Implementation of the plan will allow our talented team, clients, suppliers, shareholders and investors, to move forward with renewed confidence, bolstered by a sustainable financial platform, a robust backlog of existing contracts, and a healthy pipeline of future opportunities.” Petrofac CEO Tareq Kawash. The firm has faced financial hardship in recent years as it became entangled in a corruption scandal, as the UK’s Serious Fraud Office (SFO) charged two former Petrofac executives with paying bribes in the United Arab Emirates in 2024.  Additionally, Petrofac’s books have not been as healthy as they once were, and share prices slumped following the COVD pandemic. A decision to defer publishing its audited annual results for 2024 has also resulted the halt of shares trading for the company as well, adding to its list of concerns. This forms an ongoing saga of financial uncertainty for the services giant that has been ongoing since 2023.   However, now that a restructuring

Read More »

FERC rejects MISO plan to speed generation interconnection

Federal regulators on Friday rejected the Midcontinent Independent System Operator’s plan to fast-track new generation interconnections, arguing the proposal lacked limits on the number of projects that could be considered. MISO filed its Expedited Resource Addition Study process, or ERAS, in March, to provide a framework for the accelerated study of generation it said could address urgent resource adequacy and reliability needs. Under the proposal, projects entering the ERAS process would have been studied serially each quarter and granted an expedited interconnection agreement within 90 days. But renewable energy stakeholders argued the proposal could add chaos “to an already complex process.” In a 2-1 vote, FERC commissioners denied the proposal. The ERAS proposal “places no limit on the number of projects that could be entered in the ERAS process, which could result in an ERAS queue with processing times for interconnection requests that are too lengthy to meet MISO’s stated resource adequacy and reliability needs,” Commissioners David Rosner and Lindsay See wrote in their decision. MISO’s proposed tariff language, for example, did not limit the number of interconnection requests or total megawatts of interconnection requests that are eligible to enter the ERAS queue, they noted. “MISO proposes up to 14 opportunities to enter the ERAS process through 2028, which could further impede MISO’s ability to process ERAS requests on an expedited basis,” according to the decision. FERC Chairman Mark Christie dissented, though he said he did not disagree with the majority’s criticism. “I am willing, however, to extend to both the states and MISO a trust that they would implement the ERAS proposal in a manner that would promote the construction of badly needed generation capacity that serves resource adequacy and reliability,” Christie wrote. “One thing we know with no need for further proof: This country, including MISO, is heading for a

Read More »

Stop work order on Empire Wind 1 lifted, Equinor resumes construction

The Trump administration has lifted its stop work order on the 810-MW Empire Wind 1 project offshore New York, allowing construction to resume, developer Equinor said Monday. New York Governor Kathy Hochul, D, said in a Monday release that “countless conversations with Equinor and White House officials” had led to the lifting of the stop work order. “Now, Equinor will resume the construction of this fully-permitted project that had already received the necessary federal approvals,” she said. “I also reaffirmed that New York will work with the Administration and private entities on new energy projects that meet the legal requirements under New York law.” Interior Secretary Doug Burgum said in a Monday X post that he was “encouraged by Governor Hochul’s comments about her willingness to move forward on critical pipeline capacity” for natural gas.  The stop work order was issued April 16, and last week Equinor said the situation would force the company to terminate the project entirely if the situation wasn’t resolved within days, as the stoppage cost around $50 million a week. “I am grateful to Governor Hochul for her constructive collaboration with the Trump Administration, without which we would not have been able to advance this project and secure energy for 500,000 homes in New York,” Anders Opedal, president and CEO of Equinor said in a release. Opedal also thanked the president, New York City Mayor Eric Adams, and several New York lawmakers “as well as labour groups and other advocates that have maintained their steadfast support for the project.” When issuing the order, Burgum said in a letter to the Bureau of Ocean Energy Management that the project was “rushed through by the prior administration without sufficient analysis or consultation among the relevant agencies as relates to the potential effects from the project” and that

Read More »

GOP supporters of IRA clean energy credits outline reconciliation bill recommendations

Dive Brief: A group of Congressional Republicans are asking their party to further tweak proposed changes to clean energy tax credits that the House Ways and Means Committee included in its reconciliation bill text. The proposed budget reconciliation bill would decrease the credits for a number of clean energy technologies after 2028 and phase them out entirely after 2031, as well as additional requirements to qualify for the credits and an early sunset for their transferability. Rep. Jen Kiggans, a Virginia Republican, led 13 other House members in a joint statement Wednesday seeking to extend the transferability mechanism through the phase-out period and make other changes. The GOP tax bill, as currently constructed, passed the House Budget Committee late on Sunday, after the committee failed to clear the bill on Friday. However, as Kiggans and the other lawmakers seek changes to the bill to further protect the clean energy tax provisions, others in the party are looking to speed up the phaseout. Dive Insight: While the original Inflation Reduction Act had technology-neutral clean energy production and investment credits available until 2032, the House Ways and Means Committee’s bill text would move the sunset dates up and phase them out over a four-year period. Companies would be able to receive 80% of the credit values in 2029, 60% in 2030, 40% in 2031, with the credits phasing out entirely in 2032. Another nuclear power production credit would be phased out on the same schedule. Kiggans and her colleagues said they commended the committee “for including reasonable phase-out schedules” for the credits, but said in the May 14 joint statement that a few more changes are needed “to fully realize the intent of these phase-out schedules.” “While many of these provisions reflect a commitment to American energy dominance through an all-of-the-above energy

Read More »

Tariff uncertainty weighs on networking vendors

“Our guide assumes current tariffs and exemptions remain in place through the quarter. These include the following: China at 30%, partially offset by an exemption for semiconductors and certain electronic components; Mexico and Canada at 25% for the components and products that are not eligible for the current exemptions,” Cisco CFO Scott Herron told Wall Street analysts in the company’s quarterly earnings report on May 14. At this time, Cisco expects little impact from tariffs on steel and aluminum and retaliatory tariffs, Herron said. “We’ll continue to leverage our world-class supply chain team to help mitigate the impact,” he said, adding that “the flexibility and agility we have built into our operations over the last few years, the size and scale of our supply chain, provides us some unique advantages as we support our customers globally.” “Once the tariff scenario stabilizes, there [are] steps that we can take to mitigate it, as you’ve seen us do with China from the first Trump administration. And only after that would we consider price [increases],” Herron said. Similarly, Extreme Networks noted the changing tariff conditions during its earnings call on April 30. “The tariff situation is very dynamic, I think, as everybody knows and can appreciate, and it’s kind of hard to call. Yes, there was concern initially given the magnitude of tariffs,” said Extreme Networks CEO Ed Meyercord on the earnings call. “The larger question is, will all of the changes globally in trade and tariff policy have an impact on demand? And that’s hard to call at this point. And we’re going to hold as far as providing guidance or judgment on that until we have finality come July.” Financial news Meanwhile, AI is fueling high expectations and influencing investments in enterprise campus and data center environments.

Read More »

Liquid cooling becoming essential as AI servers proliferate

“Facility water loops sometimes have good water quality, sometimes bad,” says My Troung, CTO at ZutaCore, a liquid cooling company. “Sometimes you have organics you don’t want to have inside the technical loop.” So there’s one set of pipes that goes around the data center, collecting the heat from the server racks, and another set of smaller pipes that lives inside individual racks or servers. “That inner loop is some sort of technical fluid, and the two loops exchange heat across a heat exchanger,” says Troung. The most common approach today, he says, is to use a single-phase liquid — one that stays in liquid form and never evaporates into a gas — such as water or propylene glycol. But it’s not the most efficient option. Evaporation is a great way to dissipate heat. That’s what our bodies do when we sweat. When water goes from a liquid to a gas it’s called a phase change, and it uses up energy and makes everything around it slightly cooler. Of course, few servers run hot enough to boil water — but they can boil other liquids. “Two phase is the most efficient cooling technology,” says Xianming (Simon) Dai, a professor at University of Texas at Dallas. And it might be here sooner than you think. In a keynote address in March at Nvidia GTC, Nvidia CEO Jensen Huang unveiled the Rubin Ultra NVL576, due in the second half of 2027 — with 600 kilowatts per rack. “With the 600 kilowatt racks that Nvidia is announcing, the industry will have to shift very soon from single-phase approaches to two-phase,” says ZutaCore’s Troung. Another highly-efficient cooling approach is immersion cooling. According to a Castrol survey released in March, 90% of 600 data center industry leaders say that they are considering switching to immersion

Read More »

Cisco taps OpenAI’s Codex for AI-driven network coding

“If you want to ask Codex a question about your codebase, click ‘Ask.’ Each task is processed independently in a separate, isolated environment preloaded with your codebase. Codex can read and edit files, as well as run commands including test harnesses, linters, and type checkers. Task completion typically takes between 1 and 30 minutes, depending on complexity, and you can monitor Codex’s progress in real time,” according to OpenAI. “Once Codex completes a task, it commits its changes in its environment. Codex provides verifiable evidence of its actions through citations of terminal logs and test outputs, allowing you to trace each step taken during task completion,” OpenAI wrote. “You can then review the results, request further revisions, open a GitHub pull request, or directly integrate the changes into your local environment. In the product, you can configure the Codex environment to match your real development environment as closely as possible.” OpenAI is releasing Codex as a research preview: “We prioritized security and transparency when designing Codex so users can verify its outputs – a safeguard that grows increasingly more important as AI models handle more complex coding tasks independently and safety considerations evolve. Users can check Codex’s work through citations, terminal logs and test results,” OpenAI wrote. Internally, technical teams at OpenAI have started using Codex. “It is most often used by OpenAI engineers to offload repetitive, well-scoped tasks, like refactoring, renaming, and writing tests, that would otherwise break focus. It’s equally useful for scaffolding new features, wiring components, fixing bugs, and drafting documentation,” OpenAI stated. As for Cisco’s view of agentic AI, Patel stated that Codex is part of the developing AI agent world, where Cisco envisions billions of AI agents working together to transform and redefine the architectural assumptions the industry has relied on. Agents will communicate within and
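OpenAI’s description ends with the reviewer either opening a pull request or pulling the changes locally. As a minimal sketch of what that local verification step might look like, assuming the agent’s work landed on a branch (the branch name and the test and type-check commands below are hypothetical placeholders, not part of OpenAI’s documented workflow):

```python
# Hypothetical local review of changes an agent committed to a branch.
# Branch name and commands are illustrative placeholders only.
import subprocess

BRANCH = "codex/refactor-retry-logic"  # hypothetical agent branch

def run(cmd: list[str]) -> None:
    """Echo and run a command, failing loudly if it fails."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["git", "fetch", "origin", BRANCH])
run(["git", "switch", BRANCH])
run(["python", "-m", "pytest", "-q"])   # re-run the test harness yourself
run(["python", "-m", "mypy", "."])      # re-run the type checker yourself
```

The point, echoing OpenAI’s transparency framing, is that the agent’s citations and logs are evidence to be checked, not a substitute for re-running the checks yourself before merging.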

Read More »

US companies are helping Saudi Arabia to build an AI powerhouse

AMD announced a five-year, $10 billion collaboration with Humain to deploy up to 500 megawatts of AI compute in Saudi Arabia and the US, aiming for “multi-exaflop capacity by early 2026.” AWS, too, is expanding its data centers in Saudi Arabia to bolster Humain’s cloud infrastructure. Saudi Arabia has abundant oil and gas to power those data centers, and it is growing its renewable energy resources with the goal of supplying 50% of the country’s power by 2030. “Commercial electricity rates, nearly 50% lower than in the US, offer potential cost savings for AI model training, though high local hosting costs due to land, talent, and infrastructure limit total savings,” said Eric Samuel, associate director of research at IDC. Located near Middle Eastern population centers and fiber-optic cables to Asia, these data centers will offer enterprises low-latency cloud computing for real-time AI applications. Late is great, Samuel added: there is an advantage to being a relative latecomer to the technology industry. “Saudi Arabia’s greenfield tech landscape offers a unique opportunity for rapid, ground-up AI integration, unburdened by legacy systems,” he said.
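A quick sensitivity check, with illustrative numbers, shows why that electricity-rate gap matters at this scale. The 500 MW figure comes from the AMD announcement above; the utilization and the US rate are assumptions made purely for the arithmetic, not reported figures.

```python
# Rough annual power bill for a large AI deployment at a US-style
# industrial rate vs. one "nearly 50% lower". Rates and utilization
# are illustrative assumptions, not reported numbers.

capacity_mw = 500            # from the AMD-Humain announcement
utilization = 0.7            # assumed average load factor
us_rate = 0.08               # $/kWh, illustrative US industrial rate
saudi_rate = us_rate * 0.5   # "nearly 50% lower", per IDC's Samuel

mwh = capacity_mw * utilization * 8760  # MWh consumed per year

for label, rate in [("US-style rate   ", us_rate), ("Saudi-style rate", saudi_rate)]:
    print(f"{label}: ${mwh * 1000 * rate / 1e6:,.0f}M per year")
```

Even at these rough numbers, a roughly 50% rate discount is worth on the order of a hundred million dollars a year at 500 MW, which is why Samuel weighs it against the offsetting land, talent, and infrastructure costs.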

Read More »

AMD, Nvidia partner with Saudi startup to build multi-billion dollar AI service centers

Humain will deploy the Nvidia Omniverse platform as a multi-tenant system to accelerate the new era of physical AI and robotics through simulation, optimization, and operation of physical environments by new human-AI-led solutions. The AMD announcement did not disclose the number of chips involved, but the deal is valued at $10 billion. AMD and Humain plan to develop a comprehensive AI infrastructure through a network of AMD-based AI data centers that will extend from Saudi Arabia to the US and support a wide range of AI workloads across corporate, startup, and government markets. Think of it as AWS, but offering only AI as a service. AMD will provide its AI compute portfolio — EPYC, Instinct, and FPGA networking — and the AMD ROCm open software ecosystem, while Humain will manage the delivery of the hyperscale data centers, sustainable power systems, and global fiber interconnects. The partners expect to activate a multi-exaflop network by early 2026, supported by next-generation AI silicon, modular data center zones, and a software platform stack focused on developer enablement, open standards, and interoperability. Amazon Web Services also got a piece of the action, announcing an investment of more than $5 billion to build an “AI zone” in the Kingdom. The zone is the first of its kind and will bring together multiple capabilities, including dedicated AWS AI infrastructure and servers, UltraCluster networks for faster AI training and inference, AWS services like SageMaker and Bedrock, and AI application services such as Amazon Q. Like the AMD project, the zone will be available in 2026. Humain only emerged this month, so little is known about it. But given that it is backed by Crown Prince Mohammed bin Salman and has the full weight of the Kingdom’s Public Investment Fund (PIF), which ranks among the world’s largest and

Read More »

Check Point CISO: Network segregation can prevent blackouts, disruptions

Fischbein agrees 100% with his colleague’s analysis and adds that education and training can help prevent such incidents from occurring. “Simulating such a blackout is impossible; it has never been done,” he acknowledges, but he is committed to strengthening personal and team training and risk awareness. On defense and cybersecurity budgets, industry watchers expect public spending on defense to increase in 2025; in Spain, one-third of that budget will be allocated to increasing cybersecurity. But for Fischbein, training teams is much more important than the budget. “The challenge is to distribute the budget in a way that can be managed,” he notes, and to leverage intuitive and easy-to-use platforms, so that organizations don’t have to invest all the money in training. “When you have information, management, users, devices, mobiles, data centers, clouds, cameras, printers… the security challenge is very complex. You have to look for a security platform that makes things easier, faster, and simpler,” he says. “Today there are excellent tools that can stop all kinds of attacks.” “Since 2010, there have been cybersecurity systems, also from Check Point, that help prevent this type of incident from happening, but I’m not sure that [Spain’s electricity blackout] was a cyberattack.” On email security, Check Point is the leader in email security platforms, according to Gartner’s Magic Quadrant. Today email is still responsible for 88% of all malicious file distributions; these attacks, as Fischbein explains, enter through phishing, spam, SMS, or QR codes. “There are two challenges: to stop the threats and not to disturb, because if the security tool is a nuisance it causes more harm than good. It is very important that the solution does not annoy [users],” he stresses. “As almost all attacks enter via e-mail, it is

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Microsoft president Brad Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction, and commercial landscaping. Moline, Illinois-based John Deere has been in business for 187 years, yet the non-tech company has become a regular at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will arrive this fall and beyond. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) [Image: John Deere’s autonomous 9RX tractor, which farmers can oversee using an app.] While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation. AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that builds agents for enterprise customers and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge (sketched below), and as models get cheaper (something we’ll cover below), companies can use three or more models to
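For readers unfamiliar with the LLM-as-judge pattern mentioned above, here is a minimal sketch of the multi-model version: several cheaper candidate models answer independently, and a judge model scores the answers. The `call_model` function and all model names are hypothetical placeholders for whatever provider client you actually use; nothing here is a specific vendor API.

```python
# Minimal sketch of LLM-as-judge over multiple cheap candidate models.
# `call_model` and the model names are hypothetical placeholders.

def call_model(model: str, prompt: str) -> str:
    raise NotImplementedError("wire this to your provider's client")

CANDIDATES = ["cheap-model-a", "cheap-model-b", "cheap-model-c"]
JUDGE = "judge-model"

def best_answer(question: str) -> str:
    # Each candidate model answers independently.
    answers = [call_model(m, question) for m in CANDIDATES]
    # The judge scores each answer; the highest-scoring answer wins.
    scores = [
        float(call_model(
            JUDGE,
            "Rate this answer to the question below from 1 to 10. "
            f"Reply with the number only.\n\nQuestion: {question}\nAnswer: {ans}",
        ).strip())
        for ans in answers
    ]
    return answers[scores.index(max(scores))]
```

As models get cheaper, the extra inference calls cost little relative to the reliability gained from cross-checking several candidates.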

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability, and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks (a conceptual sketch follows below). Going all-in on red teaming pays practical, competitive dividends, and it’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and the US National Institute of Standards and Technology (NIST), all of which had released red-teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases, and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
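As a conceptual sketch only, and not OpenAI’s implementation, the core idea of the second paper can be summarized as rewarding an attacker model both for succeeding against the target and for producing attacks unlike those already found, so generation stays diverse instead of collapsing onto one exploit. All three helper functions below are hypothetical stand-ins for the attacker model, the model under test, and an embedding-based similarity measure.

```python
# Conceptual sketch of diversity-rewarded automated red teaming.
# All helpers are hypothetical stand-ins, not OpenAI's code.

def generate_attack(history: list[str]) -> str:
    """Attacker model proposes a new attack, conditioned on past finds."""
    raise NotImplementedError

def target_misbehaves(attack: str) -> bool:
    """Run the attack against the model under test; True if it succeeds."""
    raise NotImplementedError

def similarity(a: str, b: str) -> float:
    """Embedding-based similarity: 0.0 = unrelated, 1.0 = identical."""
    raise NotImplementedError

def reward(attack: str, found: list[str]) -> float:
    # Success term: the attack must actually work against the target.
    success = 1.0 if target_misbehaves(attack) else 0.0
    # Novelty term: penalize attacks close to anything already discovered.
    novelty = 1.0 - max((similarity(attack, f) for f in found), default=0.0)
    # Only attacks that are both successful and novel earn full reward.
    return success * novelty
```

Training the attacker against a reward shaped this way is what pushes it toward a broad spectrum of attacks rather than repeated variations on a single one.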

Read More »