
What a return to supersonic flight could mean for climate change

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

As I’ve admitted in this newsletter before, I love few things more than getting on an airplane. I know it’s a bold statement coming from a climate reporter, given all the associated emissions, but it’s true. So I’m as intrigued as the next person by efforts to revive supersonic flight.

Last week, Boom Supersonic completed the first supersonic test flight of its XB-1 test aircraft. I watched the broadcast live, and the vibe was infectious: the hosts’ anticipation during takeoff and acceleration, and then their celebration once it was clear the aircraft had broken the sound barrier.

And yet, knowing what I know about the climate, the promise of a return to supersonic flight is a little tarnished. We’re in a spot with climate change where we need to drastically cut emissions, and supersonic flight would likely take us in the wrong direction. The whole thing has me wondering: how fast is fast enough?

The aviation industry is responsible for about 4% of global warming to date. And right now only about 10% of the global population flies on an airplane in any given year. As incomes rise and flight becomes more accessible to more people, we can expect air travel to pick up, and the associated greenhouse gas emissions to rise with it. 

If business continues as usual, emissions from aviation could double by 2050, according to a 2019 report from the International Civil Aviation Organization. 
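For a rough sense of what that projection implies, here is a minimal back-of-the-envelope sketch in Python. The 2019 baseline year and the simple compound-growth framing are my assumptions; the ICAO projection itself rests on detailed traffic and fleet modeling.

```python
# Back-of-the-envelope: what steady annual growth rate would double
# aviation emissions by 2050? (Baseline year of 2019 is an assumption;
# the ICAO report uses far more detailed traffic and fleet scenarios.)

baseline_year = 2019
target_year = 2050
years = target_year - baseline_year  # 31 years

# Solve (1 + r)^years = 2 for r
annual_growth = 2 ** (1 / years) - 1
print(f"Implied annual emissions growth: {annual_growth:.1%}")  # roughly 2.3% per year
```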

Supersonic flight could very well contribute to this trend, because flying faster requires a whole lot more energy—and consequently, fuel. Depending on the estimate, on a per-passenger basis, a supersonic plane will use somewhere between two and nine times as much fuel as a commercial jet today. (The most optimistic of those numbers comes from Boom, and it compares the company’s own planes to first-class cabins.)

On top of the greenhouse gas emissions from increased fuel use, pollutants like nitrogen oxides, sulfur, and black carbon released at the higher altitudes common in supersonic flight could cause additional climate effects. For more details, check out my latest story.

Boom points to sustainable aviation fuels (SAFs) as the solution to this problem. After all, these alternative fuels could potentially cut out all the greenhouse gases associated with burning jet fuel.

The problem is, the market for SAFs is practically embryonic. They made up less than 1% of the jet fuel supply in 2024, and they’re still several times more expensive than fossil fuels. And currently available SAFs tend to cut emissions between 50% and 70%—still a long way from net-zero.
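To see why both figures matter, here is a minimal sketch of the arithmetic, using the rough numbers above as illustrative inputs rather than a formal lifecycle analysis:

```python
# Illustrative only: how much do today's SAF volumes cut fleet-wide emissions?
# Inputs are the rough figures cited above, not a formal lifecycle analysis.

saf_share = 0.01   # SAF was under ~1% of jet fuel supply in 2024
saf_cut = 0.70     # optimistic end of the ~50-70% lifecycle reduction range

# Emissions relative to a fleet burning only fossil jet fuel
fleet_emissions = (1 - saf_share) + saf_share * (1 - saf_cut)
print(f"Fleet-wide emissions vs. all-fossil baseline: {fleet_emissions:.1%}")  # ~99.3%

# Even a fleet running entirely on SAF at a 70% cut would still emit ~30% of the baseline.
all_saf = 1 - saf_cut
print(f"All-SAF fleet (70% cut): {all_saf:.0%} of baseline emissions")
```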

The SAF market will (hopefully) mature in the time it takes Boom to revive supersonic flight—the company plans to begin building its full-scale plane, Overture, sometime next year. But experts are skeptical that SAF will be as available, or as cheap, as it’ll need to be to decarbonize our current aviation industry, let alone to supply an entirely new class of airplanes that burn even more fuel to go the same distance.

The Concorde supersonic jet, which flew from 1969 to 2003, could get from New York to London in a little over three hours. I’d love to experience that flight—moving faster than the speed of sound is a wild novelty, and a quicker flight across the pond could open new options for travel. 

One expert I spoke to for my story, after we talked about supersonic flight and how it’ll affect the climate, mentioned that he’s actually trying to convince the industry that planes should slow down a little. By flying just 10% slower, planes could see outsized reductions in emissions.
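Why would a 10% speed cut deliver outsized savings? Here is a toy model, not the expert’s actual analysis: it assumes cruise fuel burn per kilometer scales roughly with the square of airspeed, which is only a crude approximation for speeds where aerodynamic drag dominates; real aircraft performance also depends on altitude, weight, engines, and routing.

```python
# Toy model only: assume fuel burned per kilometer scales with airspeed squared.
# This is a crude approximation, not the analysis the expert used; real cruise
# performance depends on altitude, weight, engine efficiency, and more.

def relative_fuel_per_km(speed_fraction: float) -> float:
    """Fuel per km relative to baseline, at a given fraction of baseline speed."""
    return speed_fraction ** 2

slower = 0.90                                   # fly 10% slower
fuel_saving = 1 - relative_fuel_per_km(slower)  # fractional fuel saved per km
extra_time = 1 / slower - 1                     # the trip takes longer at lower speed

print(f"Fuel per km saved: {fuel_saving:.0%}")  # ~19%
print(f"Extra flight time: {extra_time:.0%}")   # ~11%
```

Under those assumptions, a roughly 19% fuel saving costs only about 11% more flight time, which is the rough shape of the tradeoff the expert was pointing to.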

Technology can make our lives better. But sometimes, there’s a clear tradeoff between how technology can improve comfort and convenience for a select group of people and how it will contribute to the global crisis that is climate change. 

I’m not a Luddite, and I certainly fly more than the average person. But I do feel like maybe we should all figure out how to slow down, or at least not tear toward the worst impacts of climate change any faster.


Related reading

We named sustainable aviation fuel as one of our 10 Breakthrough Technologies this year. 

The world of alternative fuels can be complicated. Here’s everything you need to know about the wide range of SAFs.

Rerouting planes could help reduce contrails—and aviation’s climate impacts. Read more in this story from James Temple.  


Another thing

DeepSeek has crashed onto the scene, upending established ideas about the AI industry. One common claim is that the company’s model could drastically reduce the energy needed for AI. But the story is more complicated than that, as my colleague James O’Donnell covered in this sharp analysis.

Keeping up with climate

Donald Trump announced a 10% tariff on goods from China. Plans for tariffs on Mexico and Canada were announced, then quickly paused, this week as well. Here’s more on what it could mean for folks in the US. (NPR)
→ China quickly hit back with mineral export curbs on materials including tellurium, a key ingredient in some alternative solar panels. (Mining.com)
→ If the tariffs on Mexico and Canada go into effect, they’d hit supply chains for the auto industry, hard. (Heatmap News)

Researchers are scrambling to archive publicly available data from agencies like the National Oceanic and Atmospheric Administration. The Trump administration has directed federal agencies to remove references to climate change. (Inside Climate News)
→ As of Wednesday morning, it appears that live data that tracks carbon dioxide in the atmosphere is no longer accessible on NOAA’s website. (Try for yourself here)

Staffers with Elon Musk’s “department of government efficiency” entered the NOAA offices on Wednesday morning, inciting concerns about plans for the agency. (The Guardian)

The National Science Foundation, one of the US’s leading funders of science and engineering research, is reportedly planning to lay off between 25% and 50% of its staff. (Politico)

Our roads aren’t built for the conditions being driven by climate change. Warming temperatures and changing weather patterns are hammering roads, driving up maintenance costs. (Bloomberg)

Researchers created a new strain of rice that produces much less methane when grown in flooded fields. The variant was made with traditional crossbreeding. (New Scientist)

Oat milk maker Oatly is trying to ditch fossil fuels in its production process with industrial heat pumps and other electrified technology. But getting away from gas in food and beverage production isn’t easy. (Canary Media)

A new 3D study of the Greenland Ice Sheet reveals that crevasses are expanding faster than previously thought. (Inside Climate News)

In other ice news, an Arctic geoengineering project shut down over concerns for wildlife. The nonprofit project was experimenting with using glass beads to slow melting, but results showed it was a threat to food chains. (New Scientist)
