Chan Zuckerberg Initiative’s rBio uses virtual cells to train AI, bypassing lab work

The Chan Zuckerberg Initiative announced Thursday the launch of rBio, the first artificial intelligence model trained to reason about cellular biology using virtual simulations rather than requiring expensive laboratory experiments — a breakthrough that could dramatically accelerate biomedical research and drug discovery.

The reasoning model, detailed in a research paper published on bioRxiv, demonstrates a novel approach called “soft verification” that uses predictions from virtual cell models as training signals instead of relying solely on experimental data. This paradigm shift could help researchers test biological hypotheses computationally before committing time and resources to costly laboratory work.

“The idea is that you have these super powerful models of cells, and you can use them to simulate outcomes rather than testing them experimentally in the lab,” said Ana-Maria Istrate, senior research scientist at CZI and lead author of the research, in an interview. “The paradigm so far has been that 90% of the work in biology is tested experimentally in a lab, while 10% is computational. With virtual cell models, we want to flip that paradigm.”

The announcement represents a significant milestone for CZI’s ambitious goal to “cure, prevent, and manage all disease by the end of this century.” Under the leadership of pediatrician Priscilla Chan and Meta CEO Mark Zuckerberg, the $6 billion philanthropic initiative has increasingly focused its resources on the intersection of artificial intelligence and biology.

rBio addresses a fundamental challenge in applying AI to biological research. While large language models like ChatGPT excel at processing text, biological foundation models typically work with complex molecular data that cannot be easily queried in natural language. Scientists have struggled to bridge this gap between powerful biological models and user-friendly interfaces.

“Foundation models of biology — models like GREmLN and TranscriptFormer — are built on biological data modalities, which means you cannot interact with them in natural language,” Istrate explained. “You have to find complicated ways to prompt them.”

The new model solves this problem by distilling knowledge from CZI’s TranscriptFormer — a virtual cell model trained on 112 million cells from 12 species spanning 1.5 billion years of evolution — into a conversational AI system that researchers can query in plain English.

The ‘soft verification’ revolution: Teaching AI to think in probabilities, not absolutes

The core innovation lies in rBio’s training methodology. Traditional reasoning models learn from questions with unambiguous answers, like mathematical equations. But biological questions involve uncertainty and probabilistic outcomes that don’t fit neatly into binary categories.

CZI’s research team, led by Senior Director of AI Theofanis Karaletsos and Istrate, overcame this challenge by using reinforcement learning with proportional rewards. Instead of simple yes-or-no verification, the model receives rewards proportional to the likelihood that its biological predictions align with reality, as determined by virtual cell simulations.

“We applied new methods to how LLMs are trained,” the research paper explains. “Using an off-the-shelf language model as a scaffold, the team trained rBio with reinforcement learning, a common technique in which the model is rewarded for correct answers. But instead of asking a series of yes/no questions, the researchers tuned the rewards in proportion to the likelihood that the model’s answers were correct.”
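In rough terms, the difference between conventional "hard" verification and this kind of soft verification can be sketched in a few lines of Python. The function names and probability values below are purely illustrative and are not taken from the rBio codebase; they simply show how a reward scaled by a virtual cell model's confidence differs from a binary right-or-wrong signal.

```python
# Minimal sketch of soft verification: instead of a binary correct/incorrect
# signal, the reward is scaled by the likelihood a virtual cell model assigns
# to the answer. Names and numbers here are illustrative assumptions only.

def hard_reward(answer_is_correct: bool) -> float:
    """Classic verification: 1.0 for a correct answer, 0.0 otherwise."""
    return 1.0 if answer_is_correct else 0.0

def soft_reward(predicted_label: str, likelihoods: dict[str, float]) -> float:
    """Soft verification: reward proportional to the probability the
    virtual cell model assigns to the label the language model produced."""
    return likelihoods.get(predicted_label, 0.0)

# Example: the virtual cell model thinks knocking down gene A probably
# up-regulates gene B, but it is not certain.
likelihoods = {"upregulated": 0.72, "downregulated": 0.18, "no_change": 0.10}

print(hard_reward(True))                         # 1.0
print(soft_reward("upregulated", likelihoods))   # 0.72 -- partial credit
print(soft_reward("downregulated", likelihoods)) # 0.18 -- small, but not zero
```

Because the reward varies smoothly with the virtual cell model's confidence, the language model can still learn from answers that are plausible but uncertain, which is the situation most biological questions present.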

This approach allows scientists to ask complex questions like “Would suppressing the actions of gene A result in an increase in activity of gene B?” and receive scientifically grounded responses about cellular changes, including shifts from healthy to diseased states.

Beating the benchmarks: How rBio outperformed models trained on real lab data

In testing against the PerturbQA benchmark — a standard dataset for evaluating gene perturbation prediction — rBio demonstrated competitive performance with models trained on experimental data. The system outperformed baseline large language models and matched the performance of specialized biological models on key metrics.

Notably, rBio showed strong “transfer learning” capabilities, successfully applying knowledge about gene co-expression patterns learned from TranscriptFormer to make accurate predictions about gene perturbation effects — a completely different biological task.

“We show that on the PerturbQA dataset, models trained using soft verifiers learn to generalize on out-of-distribution cell lines, potentially bypassing the need to train on cell-line specific experimental data,” the researchers wrote.

When enhanced with chain-of-thought prompting techniques that encourage step-by-step reasoning, rBio achieved state-of-the-art performance, surpassing the previous leading model SUMMER.
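The paper's exact prompting setup is not reproduced here, but the general chain-of-thought technique looks something like the sketch below: a perturbation question is wrapped in instructions that ask the model to reason through intermediate steps before committing to an answer. The prompt wording is hypothetical, written only to illustrate the idea.

```python
# Illustrative chain-of-thought prompt for a gene perturbation question.
# This is not rBio's actual prompt format; it only shows the technique.

question = (
    "Would suppressing the actions of gene A result in an increase "
    "in activity of gene B?"
)

cot_prompt = (
    f"{question}\n"
    "Think step by step before answering:\n"
    "1. What pathways or regulatory relationships connect gene A and gene B?\n"
    "2. How would knocking down gene A change those pathways?\n"
    "3. Given that, is gene B more likely to be up-regulated, down-regulated, "
    "or unchanged?\n"
    "Answer with one of: upregulated, downregulated, no_change."
)

print(cot_prompt)
```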

From social justice to science: Inside CZI’s controversial pivot to pure research

The rBio announcement comes as CZI has undergone significant organizational changes, refocusing its efforts from a broad philanthropic mission that included social justice and education reform to a more targeted emphasis on scientific research. The shift has drawn criticism from some former employees and grantees who saw the organization abandon progressive causes.

However, for Istrate, who has worked at CZI for six years, the focus on biological AI represents a natural evolution of long-standing priorities. “My experience and work has not changed much. I have been part of the science initiative for as long as I have been at CZI,” she said.

The concentration on virtual cell models builds on nearly a decade of foundational work. CZI has invested heavily in building cell atlases — comprehensive databases showing which genes are active in different cell types across species — and developing the computational infrastructure needed to train large biological models.

“I’m really excited about the work that’s been happening at CZI for years now, because we’ve been building up to this moment,” Istrate noted, referring to the organization’s earlier investments in data platforms and single-cell transcriptomics.

Building bias-free biology: How CZI curated diverse data to train fairer AI models

One critical advantage of CZI’s approach stems from its years of careful data curation. The organization operates CZ CELLxGENE, one of the largest repositories of single-cell biological data, where information undergoes rigorous quality control processes.

“We’ve generated some of the flagship initial data atlases for transcriptomics, and those were generated with diversity in mind to minimize bias in terms of cell types, ancestry, tissues, and donors,” Istrate explained.

This attention to data quality becomes crucial when training AI models that could influence medical decisions. Unlike some commercial AI efforts that rely on publicly available but potentially biased datasets, CZI’s models benefit from carefully curated biological data designed to represent diverse populations and cell types.

Open source vs. big tech: Why CZI is giving away billion-dollar AI technology for free

CZI’s commitment to open-source development distinguishes it from commercial competitors like Google DeepMind and pharmaceutical companies developing proprietary AI tools. All CZI models, including rBio, are freely available through the organization’s Virtual Cell Platform, complete with tutorials that can run on free Google Colab notebooks.

“I do think the open source piece is very important, because that’s a core value that we’ve had since we’ve started CZI,” Istrate said. “One of the main goals for our work is to accelerate science. So everything we do is we want to make it open source for that purpose only.”

This strategy aims to democratize access to sophisticated biological AI tools, potentially benefiting smaller research institutions and startups that lack the resources to develop such models independently. The approach reflects CZI’s philanthropic mission while creating network effects that could accelerate scientific progress.

The end of trial and error: How AI could slash drug discovery from decades to years

The potential applications extend far beyond academic research. By enabling scientists to quickly test hypotheses about gene interactions and cellular responses, rBio could significantly accelerate the early stages of drug discovery — a process that typically takes decades and costs billions of dollars.

The model’s ability to predict how gene perturbations affect cellular behavior could prove particularly valuable for understanding neurodegenerative diseases like Alzheimer’s, where researchers need to identify how specific genetic changes contribute to disease progression.

“Answers to these questions can shape our understanding of the gene interactions contributing to neurodegenerative diseases like Alzheimer’s,” the research paper notes. “Such knowledge could lead to earlier intervention, perhaps halting these diseases altogether someday.”

The universal cell model dream: Integrating every type of biological data into one AI brain

rBio represents the first step in CZI’s broader vision to create “universal virtual cell models” that integrate knowledge from multiple biological domains. Currently, researchers must work with separate models for different types of biological data—transcriptomics, proteomics, imaging—without easy ways to combine insights.

“One of the grand challenges in building these virtual cell models and understanding cells over the next couple of years is how to integrate knowledge from all of these super powerful models of biology,” Istrate said. “The main challenge is, how do you integrate all of this knowledge into one space?”

The researchers demonstrated this integration capability by training rBio models that combine multiple verification sources — TranscriptFormer for gene expression data, specialized neural networks for perturbation prediction, and knowledge databases like Gene Ontology. These combined models significantly outperformed single-source approaches.
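One simple way to picture multi-source verification is as a weighted blend of scores from several verifiers. The sketch below assumes a plain weighted average and uses placeholder scoring functions named after the sources the researchers describe; the actual combination rule and interfaces used in the paper may differ.

```python
# Sketch of combining several "soft verifiers" into one training reward.
# The weighted-average rule and the placeholder scores are assumptions made
# for illustration, not the paper's actual implementation.

from typing import Callable

# Each verifier maps a (question, answer) pair to a score in [0, 1].
Verifier = Callable[[str, str], float]

def transcriptformer_score(question: str, answer: str) -> float:
    return 0.7  # placeholder: likelihood from a gene-expression model

def perturbation_net_score(question: str, answer: str) -> float:
    return 0.6  # placeholder: score from a perturbation-prediction network

def gene_ontology_score(question: str, answer: str) -> float:
    return 1.0  # placeholder: 1.0 if consistent with the knowledge base, else 0.0

def combined_reward(question: str, answer: str,
                    verifiers: list[tuple[Verifier, float]]) -> float:
    """Weighted average of verifier scores; weights are assumed to sum to 1."""
    return sum(weight * verify(question, answer) for verify, weight in verifiers)

verifiers = [
    (transcriptformer_score, 0.5),
    (perturbation_net_score, 0.3),
    (gene_ontology_score, 0.2),
]

print(combined_reward("Does knocking down gene A up-regulate gene B?",
                      "upregulated", verifiers))  # 0.73
```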

The roadblocks ahead: What could stop AI from revolutionizing biology

Despite its promising performance, rBio faces several technical challenges. The model’s current expertise focuses primarily on gene perturbation prediction, though the researchers indicate that any biological domain covered by TranscriptFormer could theoretically be incorporated.

The team continues working on improving the user experience and implementing appropriate guardrails to prevent the model from providing answers outside its area of expertise—a common challenge in deploying large language models for specialized domains.

“While rBio is ready for research, the model’s engineering team is continuing to improve the user experience, because the flexible problem-solving that makes reasoning models conversational also poses a number of challenges,” the research paper explains.

The trillion-dollar question: How open source biology AI could reshape the pharmaceutical industry

The development of rBio occurs against the backdrop of intensifying competition in AI-driven drug discovery. Major pharmaceutical companies and technology firms are investing billions in biological AI capabilities, recognizing the potential to transform how medicines are discovered and developed.

CZI’s open-source approach could accelerate this transformation by making sophisticated tools available to the broader research community. Academic researchers, biotech startups, and even established pharmaceutical companies can now access capabilities that would otherwise require substantial internal AI development efforts.

The timing proves significant as the Trump administration has proposed substantial cuts to the National Institutes of Health budget, potentially threatening public funding for biomedical research. CZI’s continued investment in biological AI infrastructure could help maintain research momentum during periods of reduced government support.

A new chapter in the race against disease

rBio’s launch marks more than just another AI breakthrough—it represents a fundamental shift in how biological research could be conducted. By demonstrating that virtual simulations can train models as effectively as expensive laboratory experiments, CZI has opened a path for researchers worldwide to accelerate their work without the traditional constraints of time, money, and physical resources.

As CZI prepares to make rBio freely available through its Virtual Cell Platform, the organization continues expanding its biological AI capabilities with models like GREmLN for cancer detection and ongoing work on imaging technologies. The success of the soft verification approach could influence how other organizations train AI for scientific applications, potentially reducing dependence on experimental data while maintaining scientific rigor.

For an organization that began with the audacious goal of curing all diseases by the century’s end, rBio offers something that has long eluded medical researchers: a way to ask biology’s hardest questions and get scientifically grounded answers in the time it takes to type a sentence. In a field where progress has traditionally been measured in decades, that kind of speed could make all the difference between diseases that define generations—and diseases that become distant memories.
