AI, Data Centers, and the Next Big Correction: Will Growth Outpace Market Reality?

AI is being readily embraced by organizations, governments, and individual enthusiasts for data aggregation, pattern recognition, data visualization, and co-creation of content. Given the headlines lately, AI is set to take over the world. And as an emerging, revolutionary technology with large potential impact and newfound user-friendliness, large tech companies and small startups alike have raced to capitalize on its potential growth. Hands down, this transformative technology has set off a wave of adoption, investment, and innovation around the world and across industries.

Naturally, when a technology or application accelerates quickly, the more risk-averse grow cautious; when it accelerates this quickly, a bubble might be forming. Even the more bullish investors have ridden through too much tumult in the past few decades for their bank accounts to withstand another cataclysmic loss. More investment is pouring in (including at the federal level), stock valuations are all over the charts and not necessarily tied to a ticker's earnings, and the recent market fluctuations leave the entire ecosystem a little hesitant to buy too far into the hype.

The Nature of Bubbles and Some Potential Signals to Watch For

Economic bubbles occur when asset prices significantly exceed their intrinsic value, often fueled by speculative demand and irrational investment, leading to unsustainable market conditions. The concern reaches well beyond digital infrastructure: bubbles can have far-reaching impacts on the entire market, as the initial distorted financial metrics encourage excessive lending and create systemic risk. The collapse of a bubble can trigger a chain reaction of financial distress, causing widespread economic instability and potentially leading to recessions, as seen in historical examples like the dot-com and housing bubbles.

Reasonable bubble indicators that have the market concerned include:

  • Overvaluation and Lack of Profit Generation: Tech giants are heavily invested in AI despite limited returns from the associated products. Likewise, many AI startups have achieved valuations far exceeding their earnings. This discrepancy between valuation and profitability is a classic sign of a bubble (illustrated in the sketch after this list).
  • Hype vs. Reality: The AI hype cycle playing out in the news has driven significant investment, with society torn between the technology's potential and the ethical claims surrounding its capabilities. Media overstatements are eventually tempered by corrected expectations, but when hundreds of billions of dollars are at stake, it's no small adjustment.
  • Diminishing Returns: Some experts suggest that large language models (LLMs) may not be as scalable as previously thought, leading to diminishing returns on investment in these technologies.
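
As a minimal illustration of the first indicator, here is a toy valuation-to-earnings screen. All company names and figures are hypothetical placeholders, and the cutoff is arbitrary; this is a sketch of the idea, not a real screening tool.

```python
# Toy screen for the valuation-vs-profitability signal described above.
# All names, figures, and the threshold are hypothetical.
companies = {
    # name: (market_cap_billions, trailing_annual_earnings_billions)
    "EstablishedChipCo": (3000.0, 60.0),
    "AIStartupA": (300.0, -5.0),   # richly valued while still unprofitable
    "AIStartupB": (20.0, 0.1),
}

BUBBLE_PE_THRESHOLD = 75  # arbitrary cutoff for this sketch

for name, (cap, earnings) in companies.items():
    if earnings <= 0:
        print(f"{name}: ${cap:.0f}B valuation with no profits -> classic bubble signal")
    elif cap / earnings > BUBBLE_PE_THRESHOLD:
        print(f"{name}: P/E of {cap / earnings:.0f} -> priced far ahead of earnings")
    else:
        print(f"{name}: P/E of {cap / earnings:.0f} -> valuation grounded in earnings")
```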

The Dot-Com Burst Saw Precisely This Happen

The dot-com bubble emerged in the late 1990s, fueled by the rapid growth of the internet and the establishment of numerous tech startups. This period saw a surge in demand for internet-based stocks, leading to high valuations that often exceeded the companies’ intrinsic value. The NASDAQ Composite index rose dramatically, increasing by 582% from January 1995 to March 2000, only to fall by 75% from March 2000 to October 2002.
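
A quick back-of-the-envelope calculation, using only the percentages above and a normalized starting level assumed for illustration, shows what that round trip meant for a buy-and-hold position:

```python
# Round-trip arithmetic for the NASDAQ figures cited above,
# starting from a normalized level of 100 in January 1995.
start = 100.0
peak = start * (1 + 5.82)    # +582% by March 2000 -> 682.0
trough = peak * (1 - 0.75)   # -75% by October 2002 -> 170.5
print(f"Peak: {peak:.1f}, trough: {trough:.1f}")
print(f"Change over the full cycle: {trough / start - 1:+.1%}")  # about +70%
```

Even an investor who held through the entire cycle ended up ahead of January 1995, yet the 75% drawdown erased the vast majority of the peak-era paper wealth.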

The frenzy of buying internet-based stocks was overwhelming, with many companies lacking viable business models and focusing instead on metrics like website traffic. Venture capitalists and other investors poured money into these startups, often ignoring traditional financial metrics in favor of speculative growth potential. The media played a significant role in fueling this hype, encouraging investors to overlook caution and invest in risky tech stocks.

The bubble burst when capital began to dry up, leading to a market crash. By 2002, investor losses were estimated at around $5 trillion. Many tech companies that conducted IPOs during this era declared bankruptcy or were acquired by other companies. The collapse of the dot-com bubble resulted in massive layoffs in the technology sector and served as a cautionary tale about the dangers of speculative investing and overvaluation.

The aftermath of the dot-com bubble led to a more cautious approach to investing, with a renewed focus on fundamental analysis rather than speculative hype. Despite the devastating impact, the era laid the groundwork for the modern tech industry, with companies like Amazon and Google surviving and thriving to become leaders in their fields.

Growth and Profitability

While AI as a technology has been around for decades, the advent of generative AI built on neural networks resulted in the release of ChatGPT: a user-friendly chatbot that could interpret a prompt and generate a response in milliseconds that was more than just coherent, but informative, insightful, and intuitive. The potential of AI was on display for all the world to see, and OpenAI's user base grew to 1 million in five days and 100 million in two months, the fastest adoption of a platform the world has ever seen. It has since reached 400 million weekly active users.

The societal adoption makes sense, but what about the business application, where there is real money to be made? The proverbial college kids writing term papers aside, AI's value to an organization is its ability to aggregate vast amounts of disorganized data, analyze it, and make complex decisions from it. Key industries like healthcare, computer science, cybersecurity, logistics, manufacturing, and content creation are leading the shift and embracing the benefits of AI technology, and there is no end in sight to the innovation available.

The efficiency gains and reduced operational costs available to an organization are limited only by a user's imagination for which queries to put to the test. But speaking openly, as someone who grew up in the power distribution world peddling equipment whose core benefits were making utilities and industries more efficient and reducing their OpEx, I can tell you this is not an easy value proposition to market on, even when it is as tangibly evident as it is with AI. Enterprise and B2B adoption is rolling out more slowly than the headlines might have us believe.

Simply stated, this technology is only profitable if paying customers and revenue growth follow. Serious startup capital is being spent on applications of this technology that the market may not be ready to support. This has the markings of a crash, but whether that crash amounts to a true bubble will depend on the speed, reach, and broader impact of the decline.

Economic Considerations

Herd mentality plays a significant role in the adoption of AI technologies. This phenomenon involves individuals following the crowd and making decisions based on the actions of others, rather than their own beliefs or analysis. In the context of AI, herd behavior is amplified by the widespread adoption of AI tools and the fear of missing out (FOMO) on potential benefits.

AI algorithms, trained on extensive datasets, can perpetuate this mentality by replicating existing trends and strategies, making them more appealing to a broader audience. As a result, the rapid adoption of AI technologies can lead to inflated expectations and valuations, similar to what was observed during the dot-com bubble, where speculative demand drove prices far beyond their intrinsic value.

The prices of hardware necessary for AI development and deployment are being driven up by several factors, including scarcity and increased demand. The rapid growth of AI applications has led to a surge in demand for GPUs and TPUs necessary for training models. This increased demand, coupled with supply chain constraints and geopolitical tensions affecting semiconductor production, has resulted in higher prices for these critical components.

Additionally, the concentration of manufacturing in a few regions exacerbates these supply chain issues, further contributing to price increases. As AI continues to expand across industries, the strain on hardware resources is likely to persist, maintaining upward pressure on prices.

Right now, investors and data center operators alike are attempting to chart the viability of the many parties and the likely winners of the AI arms race, and charting those sorts of outcomes brings economic tools such as game theory to mind: we have many players all vying for the same opportunities. The benefit of approaching this like a game is that we can complement our decisions by modeling interdependencies, identifying strategies that achieve the most desirable outcomes.

This mathematical framework is frequently used for understanding interactions within an ecosystem, and the well-known starting point is the Nash equilibrium: each participant strives to maximize their own outcome, which depends on the behaviors and actions of the other players, and equilibrium is reached when no player can improve their position by unilaterally changing strategy. The Prisoner’s Dilemma is the well-known classic, but as applied here, other studied “games” are more applicable, especially those that result in a “winner takes all” outcome.

One of the challenges, however, is that new neocloud players keep joining mid-game, making the situation extremely difficult to chart mathematically. Nevertheless, it can be a useful framework for isolated scenario modeling of strategies, predictive analytics, and decision mapping to anticipate outcomes.

For example, many AI startup companies may be bidding for the same hyperscale AI projects. As with a Prisoner’s Dilemma, there may be a first-mover advantage, but this is actually more like a game of Chicken. The first to pull out of the competition loses the crown but keeps their life; the one who stays in the match (if the other pulls out) earns both; and if both resort to psychological tactics and neither gives in, the result is mutually assured destruction, where neither succeeds.
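
A minimal sketch of that game of Chicken, with hypothetical payoffs chosen only to match the story above (staying in alone wins the contract, both staying is mutual destruction), plus a brute-force check for pure-strategy Nash equilibria:

```python
# Brute-force search for pure-strategy Nash equilibria in the game of
# Chicken sketched above. Payoff numbers are hypothetical, chosen only
# to reflect the narrative.
from itertools import product

ACTIONS = ["stay", "pull_out"]

# PAYOFFS[(row_action, col_action)] = (row_payoff, col_payoff)
PAYOFFS = {
    ("stay", "stay"): (-10, -10),      # neither gives in: both are destroyed
    ("stay", "pull_out"): (5, -1),     # row wins the crown; col keeps its life
    ("pull_out", "stay"): (-1, 5),
    ("pull_out", "pull_out"): (0, 0),  # neither succeeds, both survive
}

def is_nash(row, col):
    """True if neither player can improve by unilaterally deviating."""
    row_pay, col_pay = PAYOFFS[(row, col)]
    row_best = all(PAYOFFS[(a, col)][0] <= row_pay for a in ACTIONS)
    col_best = all(PAYOFFS[(row, a)][1] <= col_pay for a in ACTIONS)
    return row_best and col_best

for row, col in product(ACTIONS, ACTIONS):
    if is_nash(row, col):
        print(f"Nash equilibrium: ({row}, {col}) with payoffs {PAYOFFS[(row, col)]}")
```

Running it prints only the two asymmetric outcomes, one firm staying and the other pulling out, which is exactly why a bidding war like this tends to crown a single winner rather than settle into cooperation.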

The resulting sentiment is that in this arms race, one year from now only a handful of companies will have survived.

Therefore, investment is slowing as investors dig deeper into the cost of the technology, the feasibility of finding customers, and the timeline to revenue. “Show me the money” is being heard across digital infrastructure, or rather: show me the path to monetization, the business case for your unique application of the technology and its prospective customer. With limited winners and an excess of losers, it is hard to see investors placing financial bets across the board; bets will be much more strategically selected than in the dot-com days.

Ripples in the Ecosystem

Countering the bubble fear-mongers, it must be argued that AI and the underlying technology fostering this innovation will have a lasting impact. From the 40,000-foot view, I can't imagine a fundamentally revolutionary technology causing a complete market burst when businesses and individuals have already come to rely on various AI applications as essential business tools.

Rather than a crash, a natural economic adjustment may be more likely, though it must be said that market fluctuations have had greater swings of late; wider volatility may become a norm that day traders have to account for in their strategies, while longer-term investors are willing to ride the waves out, if they ever lock in on a winner they choose to financially back. Readjustments are just part of the game.

As an asset category, we need to consider the full ecosystem and the market corrections we've already begun to see play out:

  • Competitive Market Growth: An easy example is the recent DeepSeek launch, a Chinese competitor to ChatGPT that supposedly boasted lower costs and energy usage. The U.S. tech index lost $1 trillion in value that day, though much of that was quickly recovered. Individual stocks may contribute to some fluctuations, but there was real concern about a looming burst, because a single announcement should never have produced the swing that this one did. In general, we need to stop letting short-term sentiment and fear move us to this extent and trust what we know to be true about the technology's adoption. Nonetheless, the wake-up call was heard across the market, and we should expect much more reticence toward large investments that present a high risk profile.
  • Lease Terms: The data center market has been a bit of a seller's market for a few years now; those with land and power need simply say the word and they could lock in 15-year lease terms. That's changing of late: as we've seen, some hyperscalers are pulling lease terms back to under 10 years, some around 7-8 years. AI leases are even less secure, with many neocloud startups aiming for 5-7 year terms. This doesn't offer an investor or a data center provider the same confidence as a longer-term commitment, and let's not forget, these cash-constrained startups cannot afford to give that perception. As we learned from the real estate bubble, an inability to pay the rent could quite literally become the trigger for another burst.
  • Equipment Obsolescence: Another factor to consider is the high cost of investment in hardware. Ultimately, with growth, price per unit will come down. Then, as manufacturers release new models, the previous generations will become obsolete, and suddenly entire generations of hardware may lose value. As long as a neocloud provider has established a decent customer base to generate revenue, or a hyperscaler has deep enough pockets to fund an equipment refresh, this is no concern. But it's a bitter pill to swallow when it happens, and not always a blow that can be recovered from, since recovery hinges on the business model having already demonstrated success. Some question has arisen as to whether there will be a second-hand market for GPUs. Given the up-front investment that goes into the purchase, it is hard to imagine there won't be, but a viable use case has yet to emerge; it's simply too new to discern. Resale would likely fetch pennies on the dollar, but that's better than nothing. Perhaps repurposing for smaller outfits that lease to single-use enterprises will provide a niche market where equipment finds new utility, even if not as lucrative as the initial use.
  • Equipment Failure: As is beginning to be discussed openly, GPUs have a high failure rate due to component failures, memory issues, and driver problems. This unreliability can lead to costly downtime and data loss, impacting the efficiency and reliability of AI operations. As AI applications become more complex and widespread, the need for robust and reliable GPU infrastructure grows. The consequences of these failures ripple through the market, not only affecting deployment timelines and operational costs but also making companies more hesitant to adopt and scale their use of the technology. Moreover, the scarcity of GPUs, exacerbated by supply chain disruptions and export restrictions, further complicates the situation, pushing companies to explore alternative solutions like GPU-as-a-Service (GPUaaS) to mitigate these risks.
  • Stock Valuations: Nvidia, the leading supplier of GPUs essential for training AI models, has become one of the most valuable publicly listed companies, with a valuation exceeding $3 trillion. As the gold standard for GPUs, Nvidia's stock performance significantly influences the broader market, particularly tech-heavy indices like the S&P 500. Given its substantial market capitalization, Nvidia makes up a considerable portion of major indexes, meaning any large adjustment could have far-reaching effects on the entire tech sector. This concentration of market influence in a few key stocks leaves investors vulnerable unless they are well diversified. The valuation of AI-related companies, such as OpenAI potentially reaching a $300 billion valuation despite never having been profitable, raises questions about sustainability. The recent stock market surge has been largely driven by the “Magnificent Seven” companies (Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla), which are heavily invested in AI and have collectively seen significant growth. These companies accounted for over half of the S&P 500's total return in 2024, with annualized appreciation rates exceeding 20% over the past five years, and Nvidia leading with over 90% growth. The sustainability of such high valuations and growth rates is uncertain, and any correction could have profound implications for the entire market; a sketch of how this concentration math works follows this list.
  • Colocation Markets: The Magnificent Seven include the hyperscalers, which naturally lead the majority of AI investment, but we must also consider impacts on other operators. Over the past two years, many hyperscalers paused to reevaluate their facility designs, then turned to colocation providers for extended support. We have now seen this arrangement begin to crumble, with Microsoft cancelling leases over concerns of oversupply and reduced capacity needs for AI. Those contracted deployments will have caused a financial loss for the colocation providers who planned to construct them. This may have been our biggest market test yet, as it eerily echoes the dot-com triggers that began that burst. The market did react, and it's unclear whether we're out of the woods just yet. Aside from hyperscale AI deployments inside colocation data centers, neocloud companies present another viable AI tenant opportunity, but even they are all bidding for the same hyperscale contracts. When the hyperscalers get nervous, the entire industry worries about its long-term viability.
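
To make the concentration dynamic from the Stock Valuations item concrete, here is a minimal sketch of how a cap-weighted index return decomposes into per-stock contributions. The weights and returns are illustrative placeholders, not actual 2024 market data; the point is the mechanism by which a handful of names can drive most of the total.

```python
# Decomposing a cap-weighted index return into per-stock contributions.
# Weights and returns are illustrative placeholders, not market data.
HOLDINGS = {
    # bucket: (index_weight, annual_return)
    "LeadingGPUMaker": (0.07, 0.90),
    "OtherSixMegaCaps": (0.25, 0.25),
    "Remaining493Stocks": (0.68, 0.05),
}

index_return = sum(w * r for w, r in HOLDINGS.values())
print(f"Index return: {index_return:.1%}")
for name, (weight, ret) in HOLDINGS.items():
    share = weight * ret / index_return
    print(f"{name}: {share:.0%} of the index's total return")
```

With numbers in this ballpark, seven names supply roughly four-fifths of the index's return, which is exactly why a correction in just those names would drag the whole benchmark down with it.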