Colossal raises $200M to “de-extinct” the woolly mammoth, thylacine and dodo

Colossal BioSciences has raised $200 million in a new round of funding to bring back extinct species like the woolly mammoth.

Dallas- and Boston-based Colossal is making scientific strides toward “de-extinction,” or bringing back extinct species like the woolly mammoth, the thylacine and the dodo.

I would be remiss if I did not mention this is the plot of Michael Crichton’s novel Jurassic Park, where scientists used DNA found in mosquitoes preserved in amber to bring back Tyrannosaurus rex and other dinosaurs. I mean, what could go wrong when science fiction becomes reality? Kidding aside, this is pretty amazing work, and I’m not surprised to see game dev Richard Garriott among the investors.

The round was led by TWG Global, a diversified holding company with operating businesses and investments in technology/AI, financial services, private lending, and sports and media, jointly led by Mark Walter and Thomas Tull.

Since launching in September 2021, Colossal has raised $435 million in total funding. This latest round of capital places the company at a $10.2 billion valuation. Colossal will leverage this latest infusion of capital to continue advancing its genetic engineering technologies while pioneering revolutionary new software, wetware and hardware solutions, which have applications beyond de-extinction, including species preservation and human healthcare.

“Our recent successes in creating the technologies necessary for our end-to-end de-extinction toolkit have been met with enthusiasm by the investor community. TWG Global and our other partners have been bullish in their desire to help us scale as quickly and efficiently as possible,” said Colossal CEO Ben Lamm, in a statement. “This funding will grow our team, support new technology development, expand our de-extinction species list, while continuing to allow us to carry forth our mission to make extinction a thing of the past.”

Colossal employs over 170 scientists and partners with labs in Boston, Dallas, and Melbourne, Australia. In addition, Colossal sponsors over 40 full-time postdoctoral scholars and research programs in 16 partner labs at some of the most prestigious universities around the globe.

Colossal’s scientific advisory board has grown to include over 95 of the top scientists working in genomics, ancient DNA, ecology, conservation, developmental biology, and paleontology. Together, these teams are tackling some of the hardest problems in biology, including mapping genotypes to traits and behaviors, understanding developmental pathways to phenotypes like craniofacial shape, tusk formation, and coat color patterning, and developing new tools for multiplex and large-insert genome engineering.

“Colossal is the leading company working at the intersection of AI, computational biology and genetic engineering for both de-extinction and species preservation,” said Mark Walter, CEO of TWG Global, in a statement. “Colossal has assembled a world-class team that has already driven, in a short period of time, significant technology innovations and impact in advancing conservation, which is a core value of TWG Global. We are thrilled to support Colossal as it accelerates and scales its mission to combat the animal extinction crisis.”

“Colossal is a revolutionary genetics company making science fiction into science fact. We are creating the technology to build de-extinction science and scale conservation biology particularly for endangered and at-risk species. I could not be more appreciative of the investor support for this important mission,” said George Church, Colossal cofounder and a professor of genetics at Harvard Medical School and professor of Health Sciences and Technology at Harvard and the Massachusetts Institute of Technology (MIT).

In October 2024, Colossal launched the Colossal Foundation, a sister 501(c)(3) focused on overseeing the deployment and application of Colossal-developed science and technology innovations. The foundation currently supports 48 conservation partners and their initiatives around the world.

These include partners like Re:wild, Save The Elephants, Biorescue, Birdlife International, Conservation Nation, Sezarc, Mauritian Wildlife Foundation, Aussie Ark, International Elephant Foundation and Saving Animals From Extinction. Currently, the Colossal Foundation focuses on supporting conservation partners that are working on innovative new technologies applicable to conservation, as well as those that benefit from the development and deployment of new genetic rescue and de-extinction technologies, to help combat the biodiversity extinction crisis.

Tracking Progress on Colossal’s De-Extinction Projects

Ben Lamm is CEO of Colossal Biosciences

The first step in every de-extinction project is to recover and analyze preserved genetic material and use that data to identify each species’ core genomic components. In addition to recruiting Beth Shapiro, a global leader in ancient DNA research, as Colossal’s chief science officer, Colossal has built a team of Ph.D. experts in ancient DNA among its scientific advisors, including Love Dalen, Andrew Pask, Tom Gilbert, Michael Hofreiter, Hendrik Poinar, Erez Lieberman Aiden, and Matthew Wooler.

With this team, Colossal continues to push advances in ancient DNA through support to academic labs and internal scientific research. All three core species – mammoth, thylacine, and dodo – have already benefited from this coalescence of expertise. As an example, Colossal now has the most contiguous and complete ancient genomes to date for each of these three species; these genomes are the blueprints from which these species’ core traits will be engineered.
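“Contiguous,” in assembly terms, is usually quantified with metrics such as N50: the length of the shortest contig such that contigs of that length or longer cover at least half of the total assembly. As a minimal illustration of the metric (not Colossal’s actual tooling), N50 can be computed like this:

```python
# N50: sort contig lengths in descending order and find the length at
# which the running total first reaches half of the total assembly size.
def n50(contig_lengths: list[int]) -> int:
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length
    return 0

print(n50([100, 50, 40, 10]))  # total is 200; the 100 bp contig alone covers half -> N50 = 100
```

A higher N50 means the genome has been reconstructed in fewer, longer pieces, which is especially hard to achieve with the short, damaged fragments typical of ancient DNA.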

The path from ancient genome to living species requires a systems model approach to innovation across computational biology, cellular engineering, genetic engineering, embryology, and animal husbandry, with refinement and tuning in each step along the de-extinction pipeline occurring simultaneously. To date, Colossal’s scientists have achieved monumental breakthroughs at each step for each of the three flagship species.

In the last three years, Colossal’s first major project to be announced, the woolly mammoth project, generated new genomic resources, made breakthroughs in cell biology and genome engineering, and explored the ecological impact of de-extinction, with implications for mammoths, elephants, and species across the vertebrate tree of life.

Woolly Mammoth De-extinction Project Progress

The mammoth team has generated chromosome-scale reference genomes for the African elephant, Asian elephant, and rock hyrax, all of which have been released on the National Center for Biotechnology Information database. It has also generated the first de novo assembled mammoth genome – that is, a genome assembled using only the ancient DNA reads rather than by mapping them to a reference genome. This assembly identified several genetic loci that are missing from reference-guided assemblies.
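To make that distinction concrete, here is a toy sketch of the idea behind de novo assembly: stitching reads into contigs purely from their overlaps with one another, with no reference genome involved. This illustrates only the principle; production assemblers for ancient DNA are vastly more sophisticated and must cope with sequencing errors and post-mortem DNA damage.

```python
# Toy greedy overlap assembler: repeatedly merge the pair of reads with
# the longest suffix-prefix overlap until no overlaps remain.

def overlap(a: str, b: str, min_len: int = 3) -> int:
    """Length of the longest suffix of `a` that matches a prefix of `b`."""
    start = 0
    while True:
        start = a.find(b[:min_len], start)
        if start == -1:
            return 0
        if b.startswith(a[start:]):
            return len(a) - start
        start += 1

def greedy_assemble(reads: list[str]) -> str:
    reads = list(reads)
    while len(reads) > 1:
        best_len, best_i, best_j = 0, None, None
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    olen = overlap(a, b)
                    if olen > best_len:
                        best_len, best_i, best_j = olen, i, j
        if best_len == 0:  # no overlaps left; return remaining contigs joined
            return "".join(reads)
        merged = reads[best_i] + reads[best_j][best_len:]
        reads = [r for k, r in enumerate(reads) if k not in (best_i, best_j)]
        reads.append(merged)
    return reads[0]

print(greedy_assemble(["ATTAGACCTG", "CCTGCCGGAA", "AGACCTGCCG"]))
# -> ATTAGACCTGCCGGAA
```

A reference-guided assembly, by contrast, places each read where it best matches a related genome such as the elephant’s, which is exactly why loci absent from elephants can be missed.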

And it has acquired and aligned 60+ ancient genomes for the woolly mammoth and Columbian mammoth in collaboration with key scientific advisors Love Dalen and Tom van der Valk. These data, combined with 30+ genomes from extant elephant species including Asian, African and Bornean elephants, have dramatically increased the accuracy of mammoth-specific variant calling.
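The core idea behind species-specific variant calling can be sketched simply: find sites that are invariant within the mammoth samples, invariant within the elephant samples, and different between the two. Here is a hypothetical toy version that assumes pre-aligned, equal-length sequences and ignores the damage modeling and statistical machinery real pipelines require:

```python
# Toy fixed-difference finder: report alignment positions where every
# mammoth carries one allele and every elephant carries a different one.
# Real variant calling works on read alignments (e.g. BAM files) and
# models ancient-DNA damage, coverage, and uncertainty.

def fixed_differences(mammoths: list[str], elephants: list[str]) -> list[tuple[int, str, str]]:
    sites = []
    for pos in range(len(mammoths[0])):
        m_alleles = {seq[pos] for seq in mammoths}
        e_alleles = {seq[pos] for seq in elephants}
        if len(m_alleles) == 1 and len(e_alleles) == 1 and m_alleles != e_alleles:
            sites.append((pos, m_alleles.pop(), e_alleles.pop()))
    return sites

mammoths  = ["ACGTTA", "ACGTTA", "ACGATA"]  # third sample varies at position 3
elephants = ["ACCTTA", "ACCTTA", "ACCTTA"]
print(fixed_differences(mammoths, elephants))
# -> [(2, 'G', 'C')]: position 2 is fixed 'G' in mammoths vs 'C' in elephants
```

Note how the position that varies among the mammoth samples is excluded: separating such population-level variation from truly fixed, species-defining differences is precisely why sampling many individuals improves the accuracy of the calls.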

The team has derived, characterized, and biobanked 10+ primary cell lines from tissue acquired from Asian elephants, rock hyrax, and aardvark for use in the company’s conservation and de-extinction pipelines; and it became the first to derive pluripotent stem cells for Asian elephants. These cells are essential for in vitro embryogenesis and gametogenesis. There are numerous other steps forward.

“These mammoth milestones mark a pivotal step forward for de-extinction technologies,” said Love Dalen, professor at the Centre for Paleogenetics, University of Stockholm, and a key advisor to the mammoth project, in a statement. “The dedication of the team at Colossal to precision and scientific rigor is truly inspiring, and I have no doubt they will be successful in resurrecting core mammoth traits.”

Thylacine De-extinction Project Progress

A thylacine image generated by Microsoft Copilot.

The Colossal thylacine team recently made announcements demonstrating progress on the various work streams critical for the de-extinction of the thylacine.

Since that team’s inception two years ago, the Australia- and Texas-based teams have generated the highest-quality ancient genome to date for a thylacine, at 99.9% complete, using ancient long reads and ancient RNA – a world first, once thought to be an impossible goal – creating the genomic blueprint for thylacine de-extinction.

They have generated ancient genomes for 11 individual thylacines, thereby gaining an understanding of fixed variants versus population-level variation in thylacines before extinction and enabling more accurate prediction of de-extinction targets.

And they have assembled telomere-to-telomere genome sequences for all dasyurid species – the evolutionary cousins of thylacines – providing resources both to improve Colossal’s understanding of thylacine evolution and underpin thylacine engineering efforts, and to aid in the conservation of threatened marsupial species. They made numerous other advances as well.

“These milestones put us ahead of schedule on many of the critical technologies needed to underpin de-extinction efforts. At the same time, it creates major advances in genomics, stem cell generation and engineering, and marsupial reproductive technologies that are paving the way for the de-extinction of the thylacine and is revolutionizing conservation science for marsupials. Colossal’s work demonstrates that with innovation and perseverance, we can offer groundbreaking solutions to safeguard biodiversity— and the team is already doing this in many visionary ways,” said Andrew Pask, Ph.D., in a statement.

Dodo De-extinction Project Progress

Dodo image generated by Microsoft Copilot.

The Colossal Avian Genomics Group is currently focused on the dodo project as well as on building a distinct suite of tools for avian genome engineering, which differs from the approach of some of the company’s mammalian projects. The dodo-specific team’s progress includes generating complete, high-coverage genomes for the dodo; its extinct sister species, the solitaire; and the critically endangered manumea (also known as the “tooth-billed pigeon” and “little dodo”).

They also generated and published a chromosome-scale assembly of the Nicobar pigeon (the dodo’s closest living relative) and developed a population-scale dataset of Nicobar pigeon genomes for computational identification of dodo-specific traits.

The team also developed a machine learning approach to identify genes associated with craniofacial shape in birds as gene-editing targets for resurrecting the dodo’s unique bill morphology, and it processed more than 10,000 eggs and optimized culture conditions for growing primordial germ cells (PGCs) from four bird species. The team made a number of other strides as well.
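Colossal has not published the details of that machine learning model, but the general shape of such an approach can be sketched: represent each bird species by per-gene features, fit a model to a measured craniofacial trait, and rank genes by how much they contribute to the prediction. Everything below is simulated for illustration, though BMP4 and ALX1 are real genes implicated in beak shape in Darwin’s finches:

```python
# Hypothetical sketch: rank candidate genes for a craniofacial trait by
# random forest feature importance. The data are simulated; this is not
# Colossal's published method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
genes = ["BMP4", "ALX1", "CALM1", "GENE_X", "GENE_Y"]  # candidate genes (placeholders)
X = rng.normal(size=(40, len(genes)))  # 40 species x one feature per gene
# Simulate a trait driven mostly by the first two genes.
beak_depth = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.3, size=40)

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, beak_depth)
for gene, importance in sorted(zip(genes, model.feature_importances_), key=lambda g: -g[1]):
    print(f"{gene:7s} {importance:.3f}")  # BMP4 and ALX1 should rank highest
```

In practice, the features would come from comparative genomics across many bird species, and top-ranked genes would become candidates for editing experiments rather than conclusions in themselves.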

“As we advance our understanding of avian genomics and developmental biology, we’re seeing remarkable progress in the tools and techniques needed to restore lost bird species,” said Colossal’s chief science officer Beth Shapiro, in a statement. “The unique challenges of avian reproduction require bespoke approaches to genetic engineering, for example, and our dodo team has had impressive success translating tools developed for chickens to tools that have even greater success in pigeons. While work remains, the pace of discovery within our dodo team has exceeded expectations.”

Colossal’s Support of Global Conservation and De-Extinction Efforts

By 2050, it is projected that over 50% of the world’s animal species may be extinct. Around 27,000 species now go extinct each year, compared with a natural background rate of 10 to 100 species per year – roughly 270 to 2,700 times that rate. Over the past 50 years (1970–2020), the average size of monitored wildlife populations has shrunk by 73%.

That extinction crisis will have cascading, negative impacts on human health and wellbeing, including reductions in drinkable water, increases in land desertification and increases in food insecurity. While current conservation efforts are imperative for protecting species, new technologies and techniques are needed that can scale in response to the speed at which humanity is changing the planet and destroying ecosystems.

Colossal was created to respond to this crisis. And Colossal’s growing de-extinction and species preservation toolkit of software, wetware and hardware solutions provides new, scalable approaches to this systems-level existential threat and biodiversity crisis.

“The technological advances we’re seeing in genetic engineering and synthetic biology are rapidly transforming our understanding of what’s possible in species restoration,” said Shapiro. “While the path to de-extinction is complex, each step forward brings us closer to understanding how we might responsibly reintroduce traits from lost species. The real promise lies not just in the technology, but also in how we might apply these tools to protect and restore endangered species and ecosystems.”

The breakthroughs in Colossal’s core projects create a ripple effect across species conservation. Each Colossal core species is tied to conservation efforts that support other endangered and at-risk species in the respective animal’s family group.

The company’s work toward mammoth restoration has simultaneously advanced reproductive and genetic technologies that can help preserve endangered elephant species, while the dodo program is pioneering avian genetic tools that will benefit threatened bird species worldwide. Through the Colossal Foundation and its partnerships with leading conservation organizations, Colossal is transforming these scientific advances into practical solutions that can help protect and restore vulnerable species across multiple taxonomic families.

Key initiatives include Colossal’s $7.5 million in new donations to fund ancient DNA research across a diverse selection of species; the development of a gene-engineering solution to create cane toad toxin resistance for Australia’s endangered Northern Quoll; and a partnership with the international conservation organization Re:wild on a suite of initiatives to preserve the world’s most threatened species.

The Re:wild partnership includes a joint 10-year conservation strategy to save some of the world’s most threatened species by leveraging the power of Colossal’s genetic technologies and Re:wild’s experience and partnerships in species conservation across the world. A number of other efforts are under way too.

“Colossal is advancing the development of genetic technologies for conservation at a rapid pace. Their cutting-edge technologies are changing what is possible in species conservation and are permitting us to envision a world where many more Critically Endangered species not only survive but thrive,” said Barney Long, PhD, senior director of conservation strategies for Re:wild, in a statement.

Colossal’s additional strategic investors include funds such as USIT, Animal Capital, Breyer Capital, At One Ventures, In-Q-Tel, BOLD Capital, Peak 6, and Draper Associates, among others, as well as private investors including Robert Nelsen, Peter Jackson, Fran Walsh, Ric Edelman, Brandon Fugal, Paul Tudor Jones, Richard Garriott, Giammaria Giuliani, Sven-Olof Lindblad, Victor Vescovo, and Jeff Wilke.
