
When DartPoints CEO Scott Willis took the stage on “the Distributed Edge” panel at the 2025 Data Center Frontier Trends Summit, his message resonated across a room full of developers, operators, and hyperscale strategists: the future of AI infrastructure will be built far beyond the nation’s tier-one metros.
On the latest episode of the Data Center Frontier Show, Willis expands on that thesis, mapping out how DartPoints has positioned itself for a moment when digital infrastructure inevitably becomes more distributed, and why that moment has now arrived.
DartPoints’ strategy centers on what Willis calls the “regional edge”—markets in the Midwest, Southeast, and South Central regions that sit outside traditional cloud hubs but are increasingly essential to the evolving AI economy.
These are not tower-edge micro-nodes, nor hyperscale mega-campuses. Instead, they are regional data centers designed to serve enterprises with colocation, cloud, hybrid cloud, multi-tenant cloud, DRaaS, and backup workloads, while increasingly accommodating the AI-driven use cases shaping the next phase of digital infrastructure.
As inference expands and latency-sensitive applications proliferate, Willis sees the industry’s momentum bending toward the very markets DartPoints has spent years cultivating.
Interconnection as Foundation for Regional AI Growth
A key part of the company’s differentiation is its interconnection strategy. Every DartPoints facility is built to operate as a deeply interconnected environment, drawing in all available carriers within a market and stitching sites together through a regional fiber fabric.
Willis describes fiber as the “nervous system” of the modern data center, and for DartPoints that means creating an interconnection model robust enough to support a mix of enterprise cloud, multi-site disaster recovery, and emerging AI inference workloads. The company is already hosting latency-sensitive deployments in select facilities—particularly inference AI and specialized healthcare applications—and Willis expects such deployments to expand significantly as regional AI architectures become more widely adopted.
With capital partners NOVA Infrastructure and Orion Infrastructure Capital (OIC) now backing DartPoints following a recent recapitalization, the company is accelerating its expansion plans. Four new facilities have been launched within the current footprint, each designed from the ground up to support higher densities and AI-oriented workloads.
These next-generation data centers are engineered for racks ranging from 30 kW at the low end to 120 kW at the high end, with a targeted operating band between 40 and 80 kW. Combined with strategic positioning in the 10- to 50-megawatt range, DartPoints aims to occupy a crucial tier of the market—large enough to support meaningful enterprise and HPC deployments, yet small and nimble enough to sidestep the multiyear delays that plague gigawatt-scale hyperscale builds.
Speed to Market Through Power, Cooling, and Brownfield Advantage
Speed to market is a central pillar of DartPoints’ power strategy. Willis notes that power availability within the company’s target range is generally not a gating factor, especially when leveraging brownfield sites where utility infrastructure is already in place.
While DartPoints continues to explore behind-the-meter options for future greenfield developments, the company’s primary focus remains on accelerating delivery rather than pursuing custom generation solutions. Brownfield opportunities, Willis says, allow the company to reduce typical timelines dramatically compared to greenfields, which can stretch to two-and-a-half years or more.
Transformer lead times remain a reality, but the moderate scale of DartPoints’ deployments makes these challenges manageable relative to larger hyperscale projects.
Cooling strategy is just as central to the company’s roadmap. Willis stresses the need for a “wide catcher’s mitt” design capable of supporting a diverse mix of enterprise IT, HPC clusters, AI inference workloads, and neo-cloud deployments.
To do this, DartPoints is adopting a hybrid approach: advanced air cooling for densities in the 30–40 kW range, and liquid cooling—direct-to-chip and rear-door heat exchange—for workloads surpassing 40–45 kW. This flexible design helps contain capital deployment, Willis says, balancing the cost pressures of advanced cooling (which can elevate build costs to $12–13 million per megawatt) while ensuring readiness for workloads that continue to grow in power intensity and thermal demands.
Competing, Partnering, and Building for Community Impact
The competitive landscape in regional markets is becoming increasingly active, with several operators now pursuing similar strategies. Willis welcomes the competition, arguing that it validates the direction of the market and supports the broader ecosystem.
What matters most, he says, is deploying durable, interconnected digital infrastructure in locations where power, utilities, and fiber align to support long-term demand. Site selection increasingly favors markets with brownfield potential, where existing utility service entrances and fiber routes bring deployment timelines in line with customer urgency.
Community partnership is another theme Willis returns to frequently. DartPoints’ facilities do not have the physical sprawl or gigawatt power draws of hyperscale campuses, and Willis sees this as a positive differentiator when entering new markets. The company focuses on enhancing building aesthetics, mitigating generator noise, and ensuring that facilities serve as catalysts for local economic growth.
DartPoints views itself as a steward of the communities it serves, delivering state-of-the-art digital infrastructure that allows local enterprises to scale without relocating workloads to distant metros.
The Road Ahead: AI as the Regional Growth Engine
Looking toward 2026, Willis expects the demand curve for digital infrastructure—particularly AI infrastructure—to steepen. The pace of innovation is accelerating, he says, and organizations must learn to adapt to change as a core competency rather than a periodic challenge. For DartPoints, that means staying disciplined on its regional strategy while remaining flexible in how it designs, builds, and operates sites to match evolving workloads.
In the near term, DartPoints will continue expanding within its non-tier-one footprint through both organic growth and targeted M&A. The company will emphasize high-density, AI-ready facilities with hybrid air and liquid cooling and a strong regional interconnection fabric. Above all, Willis reiterates the company’s priority of meeting market needs quickly while remaining an engaged and responsible community partner.
For an operator once ahead of the curve, the market is catching up. As AI inference expands and compute becomes more distributed, DartPoints’ regional edge strategy now sits squarely in the path of where digital infrastructure is moving next.