Supersonic planes are inching toward takeoff. That could be a problem.

Boom Supersonic broke the sound barrier in a test flight of its XB-1 jet last week, marking an early step in a potential return for supersonic commercial flight. The small aircraft reached a top speed of Mach 1.122 (roughly 750 miles per hour) in a flight over southern California and exceeded the speed of sound for a few minutes. 

“XB-1’s supersonic flight demonstrates that the technology for passenger supersonic flight has arrived,” said Boom founder and CEO Blake Scholl in a statement after the test flight.

Boom plans to start commercial operation with a scaled-up version of the XB-1, a 65-passenger jet called Overture, before the end of the decade, and it has already sold dozens of planes to customers including United Airlines and American Airlines. But as the company inches toward that goal, experts warn that such efforts will come with a hefty climate price tag. 

Supersonic planes will burn significantly more fuel than current aircraft, resulting in higher emissions of carbon dioxide, which fuels climate change. Supersonic jets also fly higher than current commercial planes do, introducing atmospheric effects that may warm the planet further.

In response to questions from MIT Technology Review, Boom pointed to alternative fuels as a solution, but those remain in limited supply—and they could have limited use in cutting emissions in supersonic aircraft. Aviation is a significant and growing contributor to human-caused climate change, and supersonic technologies could grow the sector’s pollution, rather than make progress toward shrinking it.

XB-1 follows a long history of global supersonic flight. Humans first broke the sound barrier in 1947, when Chuck Yeager hit 700 miles per hour in a research aircraft (the speed of sound at that flight’s altitude is 660 miles per hour). Just over two decades later, in 1969, the first supersonic commercial airliner, the Concorde, took its first flight. That aircraft regularly traveled at supersonic speeds until the last one was decommissioned in 2003.

Among other issues (like the nuisance of sonic booms), one of the major downfalls of the Concorde was its high operating cost, due in part to the huge amounts of fuel it required to reach top speeds. Experts say today’s supersonic jets will face similar challenges. 

Flying close to the speed of sound changes the aerodynamics required of an aircraft, says Raymond Speth, associate director of the MIT Laboratory for Aviation and the Environment. “All the things you have to do to fly at supersonic speed,” he says, “they reduce your efficiency … There’s a reason we have this sweet spot where airplanes fly today, around Mach 0.8 or so.”

Boom estimates that one of its full-sized Overture jets will burn two to three times as much fuel per passenger as a subsonic plane’s first-class cabin. The company chose this comparison because its aircraft is “designed to deliver an enhanced, productive cabin experience,” similar to what’s available in first- and business-class cabins on today’s aircraft. 

That baseline, however, isn’t representative of the average traveler today. Compared to standard economy-class travel, first-class cabins tend to have larger seats with more space between them. Because there are fewer seats, more fuel is required per passenger, and therefore more emissions are produced for each person. 

When passengers crammed into coach are considered in addition to those in first class, each passenger on a Boom Supersonic flight will burn somewhere between five and seven times as much fuel as the average subsonic plane passenger today, according to research from the International Council on Clean Transportation. 

It’s not just carbon dioxide from burning fuel that could add to supersonic planes’ climate impact. All jet engines release other pollutants as well, including nitrogen oxides, black carbon, and sulfur.

The difference is that while commercial planes today top out in the troposphere, supersonic aircraft tend to fly higher in the atmosphere, in the stratosphere. The air is less dense at higher altitudes, creating less drag on the plane and making it easier to reach supersonic speeds.

Flying in the stratosphere, and releasing pollutants there, could increase the climate impacts of supersonic flight, Speth says. For one, nitrogen oxides released in the stratosphere damage the ozone layer through chemical reactions at that altitude.

It’s not all bad news, to be fair. The drier air in the stratosphere means supersonic jets likely won’t produce significant contrails. That could be a benefit for climate, since contrails contribute to aviation’s warming.

Boom has also touted plans to make up for its expected climate impacts by making its aircraft compatible with 100% sustainable aviation fuel (SAF), a category of alternative fuels made from biological sources, waste products, or even captured carbon from the air. “Going faster requires more energy, but it doesn’t need to emit more carbon. Overture is designed to fly on net-zero carbon sustainable aviation fuel (SAF), eliminating up to 100% of carbon emissions,” a Boom spokesperson said via email in response to written questions from MIT Technology Review.

However, alternative fuels may not be a saving grace for supersonic flight. Most commercially available SAF today is made with a process that cuts emissions between 50% and 70% compared to fossil fuels. So a supersonic jet running on SAFs may emit less carbon dioxide than one running on fossil fuels, but alternative fuels will likely still come with some level of carbon pollution attached, says Dan Rutherford, senior director of research at the International Council on Clean Transportation. 
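Putting the two figures above together suggests why SAF alone may not close the gap. This is a rough, back-of-envelope sketch (my arithmetic, not the article's or the ICCT's): it assumes a supersonic seat burns five to seven times the fuel of an average subsonic seat, and that today's SAF cuts lifecycle emissions by 50% to 70%.

```python
# Hedged sketch: combine the ICCT's 5-7x fuel-burn multiplier with
# the 50-70% lifecycle emission cut of today's commercial SAF.

def relative_co2(fuel_multiplier: float, saf_cut: float) -> float:
    """CO2 per passenger relative to an average subsonic flight on fossil fuel."""
    return fuel_multiplier * (1 - saf_cut)

# Most and least optimistic combinations of the article's ranges:
best = relative_co2(5, 0.70)
worst = relative_co2(7, 0.50)
print(f"best case:  {best:.1f}x average subsonic emissions")
print(f"worst case: {worst:.1f}x average subsonic emissions")
```

Even under the most generous pairing of those numbers, a SAF-fueled supersonic passenger would still account for roughly one and a half times the CO2 of an average subsonic passenger on fossil fuel today.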

“People are pinning a lot of hope on SAFs,” says Rutherford. “But the reality is, today they remain scarce [and] expensive, and they have sustainability concerns of their own.”

Of the 100 billion gallons of jet fuel used last year, only about 0.5% was SAF. Companies are building new factories to produce larger volumes of the fuels and expand the available options, but SAF is likely to continue making up a small fraction of the overall fuel supply, Rutherford says. That means supersonic jets will be competing with existing planes for the same limited supply, while aiming to use more of it. 

Boom Supersonic has secured 10 million gallons of SAF annually from Dimensional Energy and Air Company for the duration of the Overture test flight program, according to the company spokesperson’s email. Ultimately, though, if and when Overture reaches commercial operation, it will be the airlines that buy its planes that end up hunting for a fuel supply—and paying for it. 
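For a sense of scale, the article's own figures can be combined in a quick sketch (my arithmetic, not Boom's or Rutherford's): about 100 billion gallons of jet fuel burned last year, roughly 0.5% of it SAF, against the 10 million gallons per year Boom has secured for testing.

```python
# Rough scale check on global SAF supply, using the article's figures.
total_jet_fuel_gal = 100e9   # ~100 billion gallons of jet fuel last year
saf_share = 0.005            # ~0.5% of that was SAF
boom_secured_gal = 10e6      # Boom's secured 10 million gallons/year

saf_supply_gal = total_jet_fuel_gal * saf_share      # global SAF supply
boom_fraction = boom_secured_gal / saf_supply_gal    # Boom's slice of it

print(f"Global SAF supply: ~{saf_supply_gal / 1e6:.0f} million gallons/year")
print(f"Boom's secured test-program supply: ~{boom_fraction:.0%} of it")
```

On those numbers, a single test program would already claim about 2% of the world's current SAF output, which underlines how tight the supply would be for a commercial fleet.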

There’s also a chance that using SAFs in supersonic jets could come with unintended consequences, as the fuels have a slightly different chemical makeup than fossil fuels. For example, fossil fuels generally contain sulfur, which has a cooling effect, as sulfur aerosols formed from jet engine exhaust help reflect sunlight. (Intentional release of sulfur is one strategy being touted by groups aiming to start geoengineering the atmosphere.) That effect is stronger in the stratosphere, where supersonic jets are likely to fly. SAFs, however, typically have very low sulfur levels, so using the alternative fuels in supersonic jets could potentially result in even more warming overall.

There are other barriers that Boom and others will need to surmount to get a new supersonic jet industry off the ground. Supersonic travel over land is largely banned, because of the noise and potential damage that comes from the shock wave caused by breaking the sound barrier. While some projects, including one at NASA, are working on changes to aircraft that would result in a less disruptive shock wave, these so-called low-boom technologies are far from proven. NASA’s prototype was revealed last year, and the agency is currently conducting tests of the aircraft, with first flight anticipated sometime this year.  

Boom is planning a second supersonic test flight for XB-1, as early as February 10, according to the spokesperson. Once testing in that small aircraft is done, the data will be used to help build Overture, the full-scale plane. The company says it plans to begin production on Overture in its factory in roughly 18 months. 

In the meantime, the world continues to heat up. As MIT’s Speth says, “I feel like it’s not the time for aviation to be coming up with new ways of using even more energy, with where we are in the climate crisis.”

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Kyndryl expands Palo Alto deal to offer managed SASE service

Kyndryl has expanded its alliance with Palo Alto Networks to add secure access service edge (SASE) services to its managed services offerings. In 2023, when Kyndryl first said it would integrate Palo Alto’s security products and services into its own managed security services, the vendors said they would ultimately support

Read More »

AI-Powered Policing: The Future of Traffic Safety in Kazakhstan

Traffic management is a growing challenge for cities worldwide, requiring a balance between enforcement, efficiency, and public trust. In Kazakhstan, the Qorgau system is redefining road safety through an innovative fusion of artificial intelligence (AI), computer vision, and mobile technology. Designed to assist traffic police in real-time violation detection and

Read More »

Quantum networking advances on Earth and in space

“Currently, the U.S. government is not investing in such testbeds or demonstrations, ensuring it will be a follower and not a leader in the development of technical advances in the field,” said a report released last year by the Quantum Economic Development Consortium, a global association of more than 250

Read More »

Equinor slashes renewables, doubles down on hydrocarbons in focus on returns

Equinor’s dramatic renewable cuts and increased focus on oil and gas mirrors moves across the energy sector. The Norwegian major is prioritising shareholder returns over energy transition spending. CEO Anders Opedal cited growing energy demand, market uncertainty and an “uneven pace” in the energy transition as drivers for the strategic shift. “We are well positioned to create value in the context” of growing energy demand, Opedal said at the company’s capital markets day. Opedal said the energy transition was “moving fast in some markets but slow in most. Even the massive renewable growth is currently energy addition, not energy transition.” He attributed the challenges of the energy transition to inflation, interest rates, supply chain issues and regulatory uncertainty. “Segments like offshore wind and hydrogen are impacted. We adapt to these realities … by maximising returns.” Equinor had planned to reach 12-16 GW of renewable energy capacity by 2030. It has now cut this to 10-12 GW. “We are lowering our renewables ambition for 2030, we introduce a range for net carbon intensity and we retire our gross capex ambition. But the strategic direction is the same,” Opedal said. While the company will continue investing in projects such as the US’ Empire Wind, hydrocarbons are clearly more profitable. CFO Torgrim Reitan said the “lifecycle return” on the project was “close to 10% nominal equity return”. He went on to describe it as a “challenging project” that was “not great but it is ok”. For upstream projects, Equinor is targeting an internal rate of return of around 30% over the next 10 years. Finding focus Equinor aims to return $9 billion to shareholders this year. Of this, $4bn will come in dividends and $5bn in buybacks. “The largest driver of this is an $8bn capex reduction” over the next three years. “We

Read More »

Energy Voices: why carbon capture matters at Morecambe

Spirit Energy has concrete plans for enabling a new zero energy system. Following the sale of oil producing fields in the Norwegian North Sea in 2022, the firm, which is majority owned by Centrica (LON: CNA) has become almost entirely focused on gas production. And now it is looking to how to use its experience in this area to the next stage of the energy transition to store carbon. Susan Mackenzie is leading Spirit’s ambitious plans to repurpose depleted gas fields to create the UK’s largest permanent carbon store, capable of storing one billion tons of CO2. Mackenzie moved to Aberdeen in 1999 to complete her MsC in environmental sciences. Her career started in the oil and gas industry which she admits she felt “conflicted” following her training. “But you can work that conflict out because you need power, then you can try and get new developments to mitigate impact on the environment.” Then the opportunity came with Spirit Energy to lead the Morecambe Net Zero (MNZ) project, which she did not hesitate to take. “It was really easy,” she recalls when asked if she wanted to join the team. Spirit Energy’s Morecambe hub has been producing gas for 40 years. The firm estimates it could meet up to a quarter of the UK’s carbon storage capacity needs from 2030. MNZ is not currently part of the UK’s carbon capture and storage (CCS) track system. Last year the UK government allocated £22 billion in support for two CCS projects, one of which is nearby to MNZ on Merseyside. Mackenzie envisages going beyond current government allocations. “We need to do it. And we need to find a route to it for not just two, not just four, but many projects,” she said. “MNZ is probably the best project because of the

Read More »

Banks improving clean energy finance ratios, still well short of net-zero pace: report

Dive Brief: Global banks spent 89 cents financing low-carbon energy supply technologies to every dollar spent on fossil fuels in 2023, according to a new report by BloombergNEF. That is an improvement from the 0.74-to-1 financing ratio from the prior year, but still far from the pace needed to reach net-zero. In terms of capital investments, low-carbon energy investments surpassed fossil fuels for the first time in 2023, with $1.2 trillion invested in low-carbon projects, compared to $1.1 trillion invested in fossil fuels, a 1.11-to-1 investment ratio, according to the report released Wednesday. While the rate of fossil fuel investment and bank-facilitated fossil fuel financing declined in 2023, the financing ratios fall well below what is needed to meet global net-zero emissions targets. “The ratio isn’t evolving at the pace needed to hit the 4:1 level required this decade under commonly referenced scenarios to limit climate change to 1.5 [degrees Celsius],” the report said. Dive Insight: The third annual report on energy supply investment and banking ratios from BloombergNEF — a Bloomberg subsidiary which provides research on the low-carbon transition — found that overall, bank-facilitated financing for global energy supply fell for the second year in a row to $1.6 trillion in 2023 from $1.8 trillion in 2022. Capital investments in energy supply increased, reaching $2.3 trillion, up from $2.1 trillion the year prior and $1.8 trillion in 2021.  The decline in energy supply financing was more dramatic for fossil fuels, however, than for low-carbon energy sources. Bank-facilitated financing for fossil fuels fell 18% to $870 billion in 2023, while low-carbon energy sources saw just a 1.4% decrease in financing to $776 billion globally. BloombergNEF said part of the increase in the energy supply banking ratio is due to how it measures finance, as well as data gaps in China,

Read More »

OPEC+ Damocles Sword Hanging Over Shale Oil Producers

In a report sent to Rigzone by the Skandinaviska Enskilda Banken AB (SEB) team on Tuesday morning, Bjarne Schieldrop, chief commodities analyst at the company, said “the Damocles Sword of OPEC+” is “hanging over U.S. shale oil producers”. “OPEC+ decided yesterday [Monday] to stick with its plan – to lift production by 120,000 barrels per day every month for 18 months starting April,” Schieldrop noted in the report. “Again and again, it has pushed the start of the production increase further into the future. It could do it yet again. That will depend on circumstances of one – global oil demand growth and two – non-OPEC+ supply growth,” he added. In the report, Schieldrop said “all oil producers in the world know that OPEC+ has … five to six million barrels per day of reserve capacity at hand” and noted that the group “wants to return two to three million barrels per day of this reserve to the market to get back to a more normal reserve level”. “The now increasingly standing threat of OPEC+ to increase production in ‘just a couple of months’ is hanging over the world’s oil producers like a Damocles Sword. OPEC+ is essentially saying: ‘produce much more and we will do too, and you will get a much lower price’,” Schieldrop noted.    The chief commodities analyst at SEB went on to state in the report that, “if U.S. shale oil producers embarked on a strong supply growth path, heeding calls from Donald Trump for more production and a lower oil price, then OPEC+ would have no other choice than to lift production and let the oil price fall”. “Trump would get a lower oil price as he wishes for, but he would not get higher U.S. oil production. U.S. shale oil producers would get a

Read More »

Community solar agreement aims to cut electricity costs, emissions across 31 Illinois fast food outlets

Dive Brief: A portfolio of 31 Wendy’s and McDonald’s locations across northern Illinois has locked in discounts on its energy bills potentially worth over $1 million across 20 years in an agreement with community solar provider Perch Energy, the company said Jan. 23.  Participating All Star Management restaurants are receiving a 10% discount on their electricity bills over the projects’ 20-year lifespans, Perch Energy President and CEO Bruce Stewart said in an email. The restaurants expect to save more than $20,000 in electricity costs per year based on the franchise’s 2024 energy use.  “It was an easy decision. A lot of our restaurants are open early and close late. And some are open 24 hours a day, so we use a lot of electricity,” All Star Management Owner Mike Allegro said in a statement. Dive Insight: Community solar purchases allow businesses to reduce the greenhouse gas emissions of their electricity use without installing on-site solar. In June, the Wendy’s Co. — which operates corporate-owned Wendy’s restaurants and licenses franchise locations to owner-operators like All Star Management — announced that nearly 100 of its corporate-owned restaurants and 40 franchise locations in New York, Massachusetts and Illinois had enrolled in community solar provider Ampion Renewable Energy’s solar-powered clean energy offering. All Star Management’s first round of subscriptions, announced Jan. 23, will purchase energy from two community solar projects with nearly 8 megawatts of combined capacity to supply its locations in Chicago’s western and southern suburbs. The deal could produce about “half a million dollars of savings” for All Star Management, Perch Energy said. Expected future capacity additions could allow All Star to double its savings to an estimated $1 million over 20 years, Perch and All Star Management said. Perch Energy has more than 36,000 active subscriptions in the United States drawing

Read More »

Where Trump officials stand on RNG; plus news from WM, Chevron and more

This is the latest installment in Waste Dive’s Biogas Monthly series. A new administration brings new officials in charge of federal policies affecting the biofuels space, and early signs point to a mostly positive environment for the industry. U.S. EPA Administrator Lee Zeldin was asked during his confirmation hearing if he would set timely renewable volume obligations under the Renewable Fuel Standard, the policy controlling federal incentives for renewable natural gas. Zeldin said he would, and also said he would be open to conversations about favorable policies for developing sustainable aviation fuels as well, Biomass Magazine first reported. On Monday, the Senate also voted to confirm Chris Wright, former chairman and CEO of Liberty Energy, to lead the U.S. Department of Energy. Liberty Energy is an oil and gas company based in Colorado, though it did acquire a company in 2023 that delivers compressed natural gas, RNG or hydrogen to remote customers. Wright will be tasked with implementing President Donald Trump’s energy policy, which is focused on boosting oil and gas production. In his “Unleashing American Energy” memo, issued on Jan. 20, Trump included biofuels among the kinds of energy production he’d like to see increase. There could also be cuts to programs that have previously supported the biofuels industry with the new administration. Experts said Wright could pause funding from the Loan Programs Office, which ramped up its investments considerably during the Biden administration. And the Rural Energy for America Program office, which has financed projects like dairy manure digesters, is on a list of programs whose funding was affected by Trump’s attempt to freeze large swathes of federal funding. Judges in Rhode Island and Washington, D.C., have issued temporary restraining orders against the administration to allow funds to continue flowing for now. Despite the changes in Washington, projects continue

Read More »

Linux containers in 2025 and beyond

The upcoming years will also bring about an increase in the use of standard container practices, such as the Open Container Initiative (OCI) standard, container registries, signing, testing, and GitOps workflows used for application development to build Linux systems. We’re also likely see a significant rise in the use of bootable containers, which are self-contained images that can boot directly into an operating system or application environment. Cloud platforms are often the primary platform for AI experimentation and container development because of their scalability and flexibility along the integration of both AI and ML services. They’re giving birth to many significant changes in the way we process data. With data centers worldwide, cloud platforms also ensure low-latency access and regional compliance for AI applications. As we move ahead, development teams will be able to collaborate more easily through shared development environments and efficient data storage.

Read More »

Let’s Go Build Some Data Centers: PowerHouse Drives Hyperscale and AI Infrastructure Across North America

PowerHouse Data Centers, a leading developer and builder of next-generation hyperscale data centers and a division of American Real Estate Partners (AREP), is making significant strides in expanding its footprint across North America, initiating several key projects and partnerships as 2025 begins.  The new developments underscore the company’s commitment to advancing digital infrastructure to meet the growing demands of hyperscale and AI-driven applications. Let’s take a closer look at some of PowerHouse Data Centers’ most recent announcements. Quantum Connect: Bridging the AI Infrastructure Gap in Ashburn On January 17, PowerHouse Data Centers announced a collaboration with Quantum Connect to develop Ashburn’s first fiber hub specifically designed for AI and high-density workloads. This facility is set to provide 20 MW of critical power, with initial availability slated for late 2026.  Strategically located in Northern Virginia’s Data Center Alley, Quantum Connect aims to offer scalable, high-density colocation solutions, featuring rack densities of up to 30kW to support modern workloads such as AI inference, edge caching, and regional compute integration. Quantum Connect said it currently has 1-3 MW private suites available for businesses seeking high-performance infrastructure that bridges the gap between retail colocation and hyperscale facilities. “Quantum Connect redefines what Ashburn’s data center market can deliver for businesses caught in the middle—those too large for retail colocation yet underserved by hyperscale environments,” said Matt Monaco, Senior Vice President at PowerHouse Data Centers. “We’re providing high-performance solutions for tenants with demanding needs but without hyperscale budgets.” Anchored by 130 miles of private conduit and 2,500 fiber pathways, Quantum Connect’s infrastructure offers tenants direct, short-hop connections to adjacent facilities and carrier networks.  
With 14 campus entrances and secure, concrete-encased duct banks, the partners said the new facility minimizes downtime risks and reduces operational costs by eliminating the need for new optics or extended fiber runs.
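For a rough sense of what those headline figures imply, here is a back-of-the-envelope sketch. It rests on assumptions not in the announcement: that all 20 MW of critical power is available to IT load and that every rack draws the stated 30 kW maximum.

```python
# Back-of-the-envelope upper bound on rack count for the announced facility.
# Assumptions (not from the announcement): all 20 MW goes to IT load,
# and every rack runs at the 30 kW maximum density.
critical_power_kw = 20_000    # 20 MW of critical power
max_rack_density_kw = 30      # stated maximum rack density

max_racks = critical_power_kw // max_rack_density_kw
print(f"Upper bound: roughly {max_racks} fully loaded racks")
```

In practice the realizable count would be lower, since critical power must also cover conversion losses, redundancy headroom, and mixed-density deployments.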

Read More »

Blue Owl Swoops In As Major Backer of New, High-Profile, Sustainable U.S. Data Center Construction

With global demand for data centers continuing to surge, fueled by the proliferation of artificial intelligence (AI), cloud computing, and digital services, it is unsurprising that we are seeing aggressive investment strategies beyond those of the existing hyperscalers. One of the dynamic players in this market is Blue Owl Capital, a leading asset management firm that has made significant strides in the data center sector. Back in October 2024 we reported on its acquisition of IPI Partners, a digital infrastructure fund manager, for approximately $1 billion. The deal added over $11 billion to the assets Blue Owl manages, focused specifically on digital infrastructure initiatives. The acquisition was completed on January 5, 2025, and IPI's Managing Partner, Matt A'Hearn, has been appointed head of Blue Owl's digital infrastructure strategy.

A Key Player In Digital Infrastructure and Data Centers

With multi-billion-dollar joint ventures and financing initiatives, Blue Owl is positioning itself as a key player in the digital infrastructure space. The company's investments in data centers, the implications of its strategic moves, and their broader impact on the AI and digital economy all highlight how important data center investment has become to the economy overall. Analysts continue to emphasize that the global data center market is expected to grow at a compound annual growth rate (CAGR) of 10.2% from 2023 to 2030, reaching $517.17 billion by the end of the decade. In this rapidly evolving landscape, Blue Owl Capital has emerged as a significant contributor: the firm's investments in data centers are not just about capitalizing on current trends but also about shaping the future of digital infrastructure.

Spreading the Wealth

In August 2024, Blue Owl
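As a quick sanity check on that projection (a sketch that simply takes the cited 10.2% CAGR and $517.17 billion 2030 figure at face value, with seven compounding years from 2023), discounting the 2030 value back at the stated growth rate yields the implied 2023 market size:

```python
# Sanity-check the cited forecast: a 10.2% CAGR from 2023 to 2030
# (seven compounding years) ending at $517.17B implies a 2023 base.
def implied_base(final_value: float, cagr: float, years: int) -> float:
    """Discount a future value back to its implied starting value."""
    return final_value / (1 + cagr) ** years

base_2023 = implied_base(517.17, 0.102, 7)
print(f"Implied 2023 market size: ${base_2023:.1f}B")  # roughly $262B
```

That implied ~$262 billion base is consistent with the scale of the market the article describes, so the cited endpoints and growth rate hang together.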

Read More »

Global Data Center Operator Telehouse Launches Liquid Cooling Lab in the UK to Meet Ongoing AI and HPC Demand

Starting in early 2025, Telehouse International Corporation of Europe will offer an advanced liquid cooling lab at its newest data center, Telehouse South at the London Docklands campus in Blackwall Yard. Telehouse has partnered with four leading liquid-cooling technology vendors — Accelsius, JetCool, Legrand, and EkkoSense — to allow customers to explore different cooling technologies and management tools while evaluating their suitability for use in customer applications.

Dr. Stu Redshaw, Chief Technology and Innovation Officer at EkkoSense, said about the project: Given that it's not possible to run completely liquid-cooled data centers, the reality for most data center operators is that liquid cooling and air cooling will have an important role to play in the cooling mix – most likely as part of an evolving hybrid cooling approach. However, key engineering questions need answering before simply deploying liquid cooling – including establishing the exact blend of air and liquid cooling technologies you'll need. And also recognizing the complexity of managing the operation of a hybrid air cooling and liquid cooling approach within the same room. This increases the

Read More »

Flexential Partners with Lonestar to Support First Lunar Data Center

Flexential, a leading provider of secure and flexible data center solutions, this month announced that it has joined forces with Lonestar Data Holdings Inc. to support the upcoming launch of Freedom, Lonestar's second lunar data center. Scheduled to launch aboard a SpaceX Falcon 9 rocket via Intuitive Machines, the mission is a critical step toward establishing a permanent data center on the Moon.

Ground-Based Support for Lunar Data Storage

Flexential's Tampa data center will serve as the mission control platform for Lonestar's lunar operations, providing colocation, interconnection, and professional services. The facility was chosen for its proximity to Florida's Space Coast launch operations and its ability to deliver low-latency connectivity for critical functions. Flexential operates two data centers in Tampa and four in Florida as part of its FlexAnywhere® Platform, which comprises more than 40 facilities across the U.S. "Flexential's partnership with Lonestar represents our commitment to advancing data center capabilities beyond conventional boundaries," said Jason Carolan, Chief Innovation Officer at Flexential. "By supporting Lonestar's space-based data center initiative, we are helping to create new possibilities for data storage and disaster recovery. This project demonstrates how innovative data center expertise can help organizations prepare for a resilient future with off-world storage solutions."

A New Era of Space-Based Resiliency

The growing demand for data center capacity, with U.S. data center power consumption expected to roughly double from 17 GW in 2022 to 35 GW by 2030 (according to McKinsey & Company), is driving interest in space-based solutions. Storing data off-planet reduces reliance on terrestrial resources while enhancing security against natural disasters, warfare, and cyber threats.
The Freedom data center will provide resiliency, disaster recovery, and edge processing services for government and enterprise customers requiring the highest levels of data protection. The solar-powered data center leverages Solid-State Drives (SSDs) and a Field Programmable Gate Array (FPGA) edge

Read More »

Why DeepSeek Is Great for AI and HPC and Maybe No Big Deal for Data Centers

In the rapidly evolving landscape of artificial intelligence (AI) and high-performance computing (HPC), the emergence of DeepSeek's R1 model has sent ripples across industries. DeepSeek has been the data center industry's topic of the week, for sure. The Chinese AI app surged to the top of US app store leaderboards last weekend, sparking a global selloff in technology shares Monday morning. But while some analysts predict a transformative impact within the industry, a closer examination suggests that, for data centers at large, the furor over DeepSeek might ultimately be much ado about nothing.

DeepSeek's Breakthrough in AI and HPC

DeepSeek, a Chinese AI startup, this month unveiled its R1 model, claiming performance on par with, or even surpassing, leading models like OpenAI's GPT-4 and Anthropic's Claude 3.5 Sonnet. Remarkably, DeepSeek developed the model at a fraction of the cost typically associated with such advancements, using a cluster of 256 server nodes equipped with 2,048 GPUs; this efficiency has been attributed to innovative techniques and optimized resource utilization. AI researchers have been abuzz about the performance of the DeepSeek chatbot, which produces results similar to ChatGPT's but is based on open-source models and was reportedly trained on older GPU chips. Some researchers are skeptical of claims about DeepSeek's development costs and methods, but its performance appears to challenge common assumptions about the computing cost of developing AI applications.

Market Reactions and Data Center Implications

The announcement of DeepSeek's R1 model led to significant market reactions, with notable declines in tech stocks, including a substantial drop in Nvidia's valuation. The downturn was driven by concerns that more efficient AI models could reduce demand for high-end hardware and, by extension, for the expansive data centers that house it. For now, investors are reassessing the

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn't the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft's capital spending on AI, at $62.4 billion for calendar 2025, is lower than Microsoft president Brad Smith's claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft's 2020 capital expenditure of "just" $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex of $75 billion in 2024, and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction, and commercial landscaping. The Moline, Illinois-based company has been in business for 187 years, yet it has become a regular among the non-tech companies showing off technology at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren't enough skilled farm laborers to do the work its customers need. It's been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, even as the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.)

John Deere's autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can't find labor to fill open positions, he said. "They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same, which makes it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year.

1. Agents: the next generation of automation

AI agents are no longer theoretical. In 2025, they're indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. "Let me put it this way," said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for businesses and recently reviewed the 48 agents it built last year. "Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better." Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they're also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we'll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams' advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability, and safety of AI models through these techniques and more. The first paper, "OpenAI's Approach to External Red Teaming for AI Models and Systems," reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, "Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning," OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks.

Going all-in on red teaming pays practical, competitive dividends

It's encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, and OpenAI, as well as the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI's paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see whether knowledgeable external teams can defeat models' security perimeters and find gaps in their security, biases, and controls that prompt-based testing couldn't find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »