Exclusive: A record-breaking baby has been born from an embryo that’s over 30 years old

A baby boy born over the weekend holds the new record for the “oldest baby.” Thaddeus Daniel Pierce, who arrived on July 26, developed from an embryo that had been in storage for 30 and a half years.

“We had a rough birth but we are both doing well now,” says Lindsey Pierce, his mother. “He is so chill. We are in awe that we have this precious baby!”

Lindsey and her husband, Tim Pierce, who live in London, Ohio, “adopted” the embryo from a woman who had it created in 1994. Lindsey says her family and church family think “it’s like something from a sci-fi movie.”

“The baby has a 30-year-old sister,” she adds. Tim was a toddler when the embryos were first created.

“It’s been pretty surreal,” says Linda Archerd, 62, who donated the embryo. “It’s hard to even believe.”

Three little hopes

The story starts back in the early 1990s. Archerd had been trying—and failing—to get pregnant for six years. She and her husband decided to try IVF, a fairly new technology at the time. “People were [unfamiliar] with it,” says Archerd. “A lot of people were like, what are you doing?”

They did it anyway, and in May 1994, they managed to create four embryos. One of them was transferred to Archerd’s uterus. It resulted in a healthy baby girl. “I was so blessed to have a baby,” Archerd says. The remaining three embryos were cryopreserved and kept in a storage tank.

That was 31 years ago. The healthy baby girl is now a 30-year-old woman who has her own 10-year-old daughter. But the other three embryos remained frozen in time.

Archerd originally planned to use the embryos herself. “I always wanted another baby desperately,” she says. “I called them my three little hopes.” Her then husband felt differently, she says. Archerd went on to divorce him, but she won custody of the embryos and kept them in storage, still hopeful she might use them one day, perhaps with another partner.

That meant paying annual storage fees, which increased over time and ended up costing Archerd around a thousand dollars a year, she says. To her, it was worth it. “I always thought it was the right thing to do,” she says. 

Things changed when she started going through menopause, she says. She considered her options. She didn’t want to discard the embryos or donate them for research. And she didn’t want to donate them to another family anonymously—she wanted to meet the parents and any resulting babies. “It’s my DNA; it came from me … and [it’s] my daughter’s sibling,” she says.

Then she found out about embryo “adoption.” This is a type of embryo donation in which both donors and recipients have a say in whom they “place” their embryos with or “adopt” them from. It is overseen by agencies—usually explicitly religious ones—that believe an embryo is morally equivalent to a born human. Archerd is Christian.

There are several agencies that offer these adoption services in the US, but not all of them accept embryos that have been stored for a very long time. That’s partly because those embryos will have been frozen and stored in unfamiliar, old-fashioned ways, and partly because old embryos are thought to be less likely to survive thawing and transfer and to develop into a healthy baby.

“So many places wouldn’t even take my information,” says Archerd. Then she came across the Snowflakes program run by the Nightlight Christian Adoptions agency. The agency was willing to accept her embryos, but it needed Archerd’s medical records from the time the embryos had been created, as well as the embryos’ lab records.

So Archerd called the fertility doctor who had treated her decades before. “I still remembered his phone number by heart,” she says. That doctor, now in his 70s, is still practicing at a clinic in Oregon. He dug Archerd’s records out from his basement, she says. “Some of [them] were handwritten,” she adds. Her embryos entered Nightlight’s “matching pool” in 2022.

Making a match

“Our matching process is really driven by the preferences of the placing family,” says Beth Button, executive director of the Snowflakes program. Archerd’s preference was for a married Caucasian, Christian couple living in the US. “I didn’t want them to go out of the country,” says Archerd. “And being Christian is very important to me, because I am.”

It took a while to find a match. Most of the “adopting parents” signed up for the Snowflakes program were already registered at fertility clinics that wouldn’t have accepted the embryos, says Button. “I would say that over 90% of clinics in the US would not have accepted these embryos,” she says.

Lindsey and Tim Pierce at Rejoice Fertility.
COURTESY LINDSEY PIERCE

Archerd’s embryos were assigned to the agency’s Open Hearts program for embryos that are “hard to place,” along with others that have been in storage for a long time or are otherwise thought to be less likely to result in a healthy birth.

Lindsey and Tim Pierce had also signed up for the Open Hearts program. The couple, aged 35 and 34, respectively, had been trying for a baby for seven years and had seen multiple doctors.

Lindsey was researching child adoption when she came across the Snowflakes program. 

When the couple were considering their criteria for embryos they might receive, they decided that they’d be open to any. “We checkmarked anything and everything,” says Tim. That’s how they ended up being matched with Archerd’s embryos. “We thought it was wild,” says Lindsey. “We didn’t know they froze embryos that long ago.”

Lindsey and Tim had registered with Rejoice Fertility, an IVF clinic in Knoxville, Tennessee, run by John Gordon, a reproductive endocrinologist who prides himself on his efforts to reduce the number of embryos in storage. The huge number of embryos left in storage tanks was weighing on his conscience, he says, so around six years ago, he set up Rejoice Fertility with the aim of doing things differently.

“Now we’re here in the belt buckle of the Bible Belt,” says Gordon, who is Reformed Presbyterian. “I’ve changed my mode of practice.” IVF treatments performed at the clinic are designed to create as few excess embryos as possible. The clinic works with multiple embryo adoption agencies and will accept any embryo, no matter how long it has been in storage.

A portrait of Linda Archerd.
COURTESY LINDA ARCHERD

It was his clinic that treated the parents who previously held the record for the longest-stored embryo—in 2022, Rachel and Philip Ridgeway had twins from embryos created more than 30 years earlier. “They’re such a lovely couple,” says Gordon. When we spoke, he was making plans to meet the family for breakfast. The twins are “growing like weeds,” he says with a laugh.

“We have certain guiding principles, and they’re coming from our faith,” says Gordon, although he adds that he sees patients who hold alternative views. One of those principles is that “every embryo deserves a chance at life and that the only embryo that cannot result in a healthy baby is the embryo not given the opportunity to be transferred into a patient.”

That’s why his team will endeavor to transfer any embryo they receive, no matter the age or conditions. That can be challenging, especially when the embryos have been frozen or stored in unusual or outdated ways. “It’s scary for people who don’t know how to do it,” says Sarah Atkinson, lab supervisor and head embryologist at Rejoice Fertility. “You don’t want to kill someone’s embryos if you don’t know what you’re doing.”

Cumbersome and explosive

In the early days of IVF, embryos earmarked for storage were slow-frozen. This technique involves gradually lowering the temperature of the embryos. But because slow freezing can cause harmful ice crystals to form, clinics switched in the 2000s to a technique called vitrification, in which the embryos are placed in thin plastic tubes called straws and lowered into tanks of liquid nitrogen. This rapidly freezes the embryos and converts them into a glass-like state. 

The embryos can later be thawed by removing them from the tanks and rapidly—within two seconds—plunging them into warm “thaw media,” says Atkinson. Thawing slow-frozen embryos is more complicated. And the exact thawing method required varies, depending on how the embryos were preserved and what they were stored in. Some of the devices need to be opened while they are inside the storage tank, which can involve using forceps, diamond-bladed knives, and other tools in the liquid nitrogen, says Atkinson.

Sarah Atkinson, lab supervisor and head embryologist at Rejoice Fertility, directly injects sperm into two eggs to fertilize them.
COURTESY OF SARAH ATKINSON AT REJOICE FERTILITY.

Recently, she was tasked with retrieving embryos that had been stored inside a glass vial. The vial was made from blown glass and had been heat-sealed with the embryo inside. Atkinson had to use her diamond-bladed knife to snap open the seal inside the nitrogen tank. It was fiddly work, and when the device snapped, a small shard of glass flew out and hit Atkinson’s face. “Hit me on the cheek, cut my cheek, blood running down my face, and I’m like, Oh shit,” she says. Luckily, she had her safety goggles on. And the embryos survived, she adds.

The two embryos that were transferred to Lindsey Pierce.

Atkinson has a folder in her office with notes she’s collected on various devices over the years. She flicks through it over a video call and points to the notes she made about the glass vial. “Might explode; wear face shield and eye protection,” she reads. A few pages later, she points to another embryo-storage device. “You have to thaw this one in your fingers,” she tells me. “I don’t like it.”

The record-breaking embryos had been slow-frozen and stored in a plastic vial, says Atkinson. Thawing them was a cumbersome process. But all three embryos survived it.

The Pierces had to travel from their home in Ohio to the clinic in Tennessee five times over a two-week period. “It was like a five-hour drive,” says Lindsey. One of the three embryos stopped growing. The other two were transferred to Lindsey’s uterus on November 14, she says. And one developed into a fetus.

Now that the baby has arrived, Archerd is keen to meet him. “The first thing that I noticed when Lindsey sent me his pictures is how much he looks like my daughter when she was a baby,” she says. “I pulled out my baby book and compared them side by side, and there is no doubt that they are siblings.”

She doesn’t yet have plans to meet the baby, but doing so would be “a dream come true,” she says. “I wish that they didn’t live so far away from me … He is perfect!”

“We didn’t go into it thinking we would break any records,” says Lindsey. “We just wanted to have a baby.”
