Exclusive: A record-breaking baby has been born from an embryo that’s over 30 years old

A baby boy born over the weekend holds the new record for the “oldest baby.” Thaddeus Daniel Pierce, who arrived on July 26, developed from an embryo that had been in storage for 30 and a half years.

“We had a rough birth but we are both doing well now,” says Lindsey Pierce, his mother. “He is so chill. We are in awe that we have this precious baby!”

Lindsey and her husband, Tim Pierce, who live in London, Ohio, “adopted” the embryo from a woman who had it created in 1994. Lindsey says her family and church family think “it’s like something from a sci-fi movie.”

“The baby has a 30-year-old sister,” she adds. Tim was a toddler when the embryos were first created.

“It’s been pretty surreal,” says Linda Archerd, 62, who donated the embryo. “It’s hard to even believe.”

Three little hopes

The story starts back in the early 1990s. Archerd had been trying—and failing—to get pregnant for six years. She and her husband decided to try IVF, a fairly new technology at the time. “People were [unfamiliar] with it,” says Archerd. “A lot of people were like, what are you doing?”

They did it anyway, and in May 1994, they managed to create four embryos. One of them was transferred to Linda’s uterus. It resulted in a healthy baby girl. “I was so blessed to have a baby,” Archerd says. The remaining three embryos were cryopreserved and kept in a storage tank.

That was 31 years ago. The healthy baby girl is now a 30-year-old woman who has her own 10-year-old daughter. But the other three embryos remained frozen in time.

Archerd originally planned to use the embryos herself. “I always wanted another baby desperately,” she says. “I called them my three little hopes.” Her then husband felt differently, she says. Archerd went on to divorce him, but she won custody of the embryos and kept them in storage, still hopeful she might use them one day, perhaps with another partner.

That meant paying annual storage fees, which increased over time and ended up costing Archerd around a thousand dollars a year, she says. To her, it was worth it. “I always thought it was the right thing to do,” she says. 

Things changed when she started going through menopause, she says. She considered her options. She didn’t want to discard the embryos or donate them for research. And she didn’t want to donate them to another family anonymously—she wanted to meet the parents and any resulting babies. “It’s my DNA; it came from me … and [it’s] my daughter’s sibling,” she says.

Then she found out about embryo “adoption.” This is a type of embryo donation in which both donors and recipients have a say in whom they “place” their embryos with or “adopt” them from. It is overseen by agencies—usually explicitly religious ones—that believe an embryo is morally equivalent to a born human. Archerd is Christian.

There are several agencies that offer these adoption services in the US, but not all of them accept embryos that have been stored for a very long time. That’s partly because those embryos will have been frozen and stored in unfamiliar, old-fashioned ways, and partly because old embryos are thought to be less likely to survive thawing and transfer and to develop successfully into a baby.

“So many places wouldn’t even take my information,” says Archerd. Then she came across the Snowflakes program run by the Nightlight Christian Adoptions agency. The agency was willing to accept her embryos, but it needed Archerd’s medical records from the time the embryos had been created, as well as the embryos’ lab records.

So Archerd called the fertility doctor who had treated her decades before. “I still remembered his phone number by heart,” she says. That doctor, now in his 70s, is still practicing at a clinic in Oregon. He dug Archerd’s records out from his basement, she says. “Some of [them] were handwritten,” she adds. Her embryos entered Nightlight’s “matching pool” in 2022.

Making a match

“Our matching process is really driven by the preferences of the placing family,” says Beth Button, executive director of the Snowflakes program. Archerd’s preference was for a married Caucasian, Christian couple living in the US. “I didn’t want them to go out of the country,” says Archerd. “And being Christian is very important to me, because I am.”

It took a while to find a match. Most of the “adopting parents” signed up for the Snowflakes program were already registered at fertility clinics that wouldn’t have accepted the embryos, says Button. “I would say that over 90% of clinics in the US would not have accepted these embryos,” she says.

Lindsey and Tim Pierce at Rejoice Fertility. COURTESY LINDSEY PIERCE

Archerd’s embryos were assigned to the agency’s Open Hearts program for embryos that are “hard to place,” along with others that have been in storage for a long time or are otherwise thought to be less likely to result in a healthy birth.

Lindsey and Tim Pierce had also signed up for the Open Hearts program. The couple, aged 35 and 34, respectively, had been trying for a baby for seven years and had seen multiple doctors.

Lindsey was researching child adoption when she came across the Snowflakes program. 

When the couple were considering their criteria for embryos they might receive, they decided that they’d be open to any. “We checkmarked anything and everything,” says Tim. That’s how they ended up being matched with Archerd’s embryos. “We thought it was wild,” says Lindsey. “We didn’t know they froze embryos that long ago.”

Lindsey and Tim had registered with Rejoice Fertility, an IVF clinic in Knoxville, Tennessee, run by John Gordon, a reproductive endocrinologist who prides himself on his efforts to reduce the number of embryos in storage. The huge number of embryos left in storage tanks was weighing on his conscience, he says, so around six years ago, he set up Rejoice Fertility with the aim of doing things differently.

“Now we’re here in the belt buckle of the Bible Belt,” says Gordon, who is Reformed Presbyterian. “I’ve changed my mode of practice.” IVF treatments performed at the clinic are designed to create as few excess embryos as possible. The clinic works with multiple embryo adoption agencies and will accept any embryo, no matter how long it has been in storage.

A portrait of Linda Archerd. COURTESY LINDA ARCHERD

It was his clinic that treated the parents who previously held the record for the longest-stored embryo—in 2022, Rachel and Philip Ridgeway had twins from embryos created more than 30 years earlier. “They’re such a lovely couple,” says Gordon. When we spoke, he was making plans to meet the family for breakfast. The twins are “growing like weeds,” he says with a laugh.

“We have certain guiding principles, and they’re coming from our faith,” says Gordon, although he adds that he sees patients who hold alternative views. One of those principles is that “every embryo deserves a chance at life and that the only embryo that cannot result in a healthy baby is the embryo not given the opportunity to be transferred into a patient.”

That’s why his team will endeavor to transfer any embryo they receive, no matter its age or condition. That can be challenging, especially when the embryos have been frozen or stored in unusual or outdated ways. “It’s scary for people who don’t know how to do it,” says Sarah Atkinson, lab supervisor and head embryologist at Rejoice Fertility. “You don’t want to kill someone’s embryos if you don’t know what you’re doing.”

Cumbersome and explosive

In the early days of IVF, embryos earmarked for storage were slow-frozen. This technique involves gradually lowering the temperature of the embryos. But because slow freezing can cause harmful ice crystals to form, clinics switched in the 2000s to a technique called vitrification, in which the embryos are placed in thin plastic tubes called straws and lowered into tanks of liquid nitrogen. This rapidly freezes the embryos and converts them into a glass-like state. 

The embryos can later be thawed by removing them from the tanks and rapidly—within two seconds—plunging them into warm “thaw media,” says Atkinson. Thawing slow-frozen embryos is more complicated. And the exact thawing method required varies, depending on how the embryos were preserved and what they were stored in. Some of the devices need to be opened while they are inside the storage tank, which can involve using forceps, diamond-bladed knives, and other tools in the liquid nitrogen, says Atkinson.

Sarah Atkinson, lab supervisor and head embryologist at Rejoice Fertility, directly injects sperm into two eggs to fertilize them. COURTESY OF SARAH ATKINSON AT REJOICE FERTILITY

Recently, she was tasked with retrieving embryos that had been stored inside a glass vial. The vial was made from blown glass and had been heat-sealed with the embryo inside. Atkinson had to use her diamond-bladed knife to snap open the seal inside the nitrogen tank. It was fiddly work, and when the device snapped, a small shard of glass flew out and hit Atkinson’s face. “Hit me on the cheek, cut my cheek, blood running down my face, and I’m like, Oh shit,” she says. Luckily, she had her safety goggles on. And the embryos survived, she adds.

The two embryos that were transferred to Lindsey Pierce.

Atkinson has a folder in her office with notes she’s collected on various devices over the years. She flicks through it over a video call and points to the notes she made about the glass vial. “Might explode; wear face shield and eye protection,” she reads. A few pages later, she points to another embryo-storage device. “You have to thaw this one in your fingers,” she tells me. “I don’t like it.”

The record-breaking embryos had been slow-frozen and stored in a plastic vial, says Atkinson. Thawing them was a cumbersome process. But all three embryos survived it.

The Pierces had to travel from their home in Ohio to the clinic in Tennessee five times over a two-week period. “It was like a five-hour drive,” says Lindsey. One of the three embryos stopped growing. The other two were transferred to Lindsey’s uterus on November 14, she says. And one developed into a fetus.

Now that the baby has arrived, Archerd is keen to meet him. “The first thing that I noticed when Lindsey sent me his pictures is how much he looks like my daughter when she was a baby,” she says. “I pulled out my baby book and compared them side by side, and there is no doubt that they are siblings.”

She doesn’t yet have plans to meet the baby, but doing so would be “a dream come true,” she says. “I wish that they didn’t live so far away from me … He is perfect!”

“We didn’t go into it thinking we would break any records,” says Lindsey. “We just wanted to have a baby.”
