
Enterprise AI adoption is no longer emerging. It is already embedded and beginning to scale in ways that will reshape data center demand.
The latest TD Cowen GenAI Adoption Survey makes that clear. Across 689 U.S. enterprises, 92% are now using at least one major AI platform, with Microsoft Copilot, Google Gemini, and ChatGPT forming the core triad of daily enterprise tooling.
That’s the baseline. The more important story is what comes next. AI is moving quickly from assistive software to autonomous systems, and that shift carries direct implications for compute demand, power consumption, and infrastructure design.
From Copilots to Autonomous Systems
Today’s enterprise AI footprint is already broad, but it is still largely human-in-the-loop. That is beginning to change.
Roughly a third of respondents say they already have semi-autonomous AI agents running in production, while another large cohort is piloting or planning deployments over the next 12 to 18 months. By 2027, more than three-quarters expect to be running AI agents capable of executing multi-step workflows without human intervention.
This is not incremental adoption. It is a step-function shift. Autonomous agents don’t just respond to prompts; they execute tasks, interact with enterprise systems, and continuously access data. For data centers, that translates into more persistent, baseline load: exactly the kind of demand profile that stresses power delivery, increases utilization, and accelerates capacity planning timelines.
To wit: AI is moving from a bursty workload to a continuous one.
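The difference between a bursty and a continuous workload can be made concrete with simple arithmetic. The sketch below uses hypothetical numbers (the peak draw and duty cycles are assumptions for illustration, not survey figures) to show how the same cluster's average load rises when prompt-driven spikes give way to always-on agent execution.

```python
# Back-of-envelope sketch (hypothetical numbers, not from the survey):
# compare the average load of a bursty, prompt-driven workload with a
# continuous agent workload on the same inference cluster.

PEAK_KW = 100.0  # assumed peak draw of the cluster

# Bursty copilot usage: near-peak only during business-hours spikes.
bursty_duty_cycle = 0.20   # assumed: active ~20% of the day
bursty_avg_kw = PEAK_KW * bursty_duty_cycle

# Autonomous agents: long-running workflows keep utilization high around the clock.
agent_duty_cycle = 0.75    # assumed: active ~75% of the day
agent_avg_kw = PEAK_KW * agent_duty_cycle

print(f"bursty average load:     {bursty_avg_kw:.0f} kW")
print(f"continuous average load: {agent_avg_kw:.0f} kW")
print(f"baseline load increase:  {agent_avg_kw / bursty_avg_kw:.2f}x")
```

Even with unchanged peak capacity, the shift in duty cycle alone multiplies the sustained baseline load — which is what drives power delivery and capacity planning, not the peak.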
ROI Is No Longer the Question
At the same time, the debate around AI return on investment is effectively over. Three-quarters of respondents report positive ROI, while only a small minority report negative outcomes. A meaningful share is already seeing multiples of return on their investments.
The implication seems straightforward: AI budgets are becoming durable. This is no longer experimental spend that can be dialed up or down; it is being baked into operating models.
Importantly, those budgets are not primarily being funded by workforce reductions. Productivity gains and dedicated innovation budgets rank ahead of headcount cuts as sources of funding. That signals enterprises are treating AI as a growth and efficiency lever, not simply a cost-cutting exercise.
For infrastructure providers, that signal reinforces the expectation that demand will persist and expand.
The Real Opportunity: Enterprise SaaS as the AI Control Layer
One of the more revealing findings in the survey is where the highest value is emerging. Horizontal AI – embedded within enterprise software platforms like CRM systems and workflow engines – shows the lowest current adoption but the highest perceived return. It is also where upgrade intent is strongest.
These platforms sit closest to systems of record and operational workflows, making them the natural control layer for autonomous AI execution. That is where AI moves from generating content to running the business.
This dynamic signals a shift in workload intensity. Early adoption has centered on front-end productivity tools. The next phase moves deeper into the enterprise stack, where AI orchestrates processes and interacts directly with core systems. That layer is inherently more compute-intensive, more persistent, and more tightly coupled to enterprise data.
Data Gravity Is Consolidating—Fast
Data architecture is emerging as the gating factor. The survey shows that nearly all companies running autonomous agents rely on centrally governed data platforms such as data lakes and warehouses. Integration with systems of record is viewed as essential, particularly among larger enterprises.
This points to a consolidation of data gravity. AI workloads are clustering around centralized, high-performance data environments, increasing the importance of throughput, latency, and proximity between compute and data.
For data center design, that reinforces the need for tightly integrated infrastructure: dense compute, high-speed interconnects, and architectures that can support sustained data movement at scale.
Multi-Agent Architectures Drive Compute Intensity
The rise of multi-agent environments adds another layer of demand. Most organizations are not standardizing on a single AI system. Instead, they are deploying multiple agents, with an average of more than two AI tools in use per enterprise.
That creates a more complex execution environment. Agents interact, orchestrate, and run tasks in parallel, increasing the number of inference cycles per workflow. What looks like incremental adoption at the application layer becomes multiplicative demand at the infrastructure layer.
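That multiplicative effect can be sketched with back-of-envelope arithmetic. All of the counts below are assumptions chosen for illustration, not survey data; the point is the structure of the multiplication, not the specific values.

```python
# Hypothetical illustration (all counts are assumptions, not survey data):
# per-workflow inference demand in a multi-agent environment grows as the
# product of agents, steps, and calls per step, not linearly with tool count.

agents_per_workflow = 3        # assumed: an orchestrator plus two specialist agents
steps_per_agent = 5            # assumed: multi-step task decomposition
inference_calls_per_step = 2   # assumed: one planning call + one action call

calls_single_assistant = 5     # assumed: a single prompt/response exchange
calls_multi_agent = agents_per_workflow * steps_per_agent * inference_calls_per_step

print(f"single-assistant workflow: {calls_single_assistant} inference calls")
print(f"multi-agent workflow:      {calls_multi_agent} inference calls")
print(f"demand multiplier:         {calls_multi_agent // calls_single_assistant}x")
```

Under these assumed counts, one workflow that looks like a single feature at the application layer generates several times the inference traffic at the infrastructure layer.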
Enterprise vs. SMB: A Two-Speed Market
Adoption is unfolding unevenly across the market. Large enterprises are moving fastest, with a substantial majority already deploying or piloting autonomous AI capabilities. Smaller organizations are earlier in the curve, often relying more heavily on standalone AI tools.
For data center developers and operators, this reinforces where the near-term load will concentrate. Enterprise AI deployments, with their integrated, persistent, and data-intensive qualities, will anchor demand, while SMB usage remains more fragmented and less predictable.
Barriers Are Organizational—Not Technical
The primary barriers to further adoption are no longer rooted in model capability. Security, compliance, and governance concerns top the list of challenges cited by respondents.
Organizations with formal AI governance frameworks are significantly more likely to move from pilot to production and to deploy AI across multiple departments. As those frameworks mature, adoption is likely to accelerate. The constraint is shifting from technology readiness to organizational readiness.
The Bottom Line for Data Centers
Taken together, the survey paints a clear picture. AI is already embedded across enterprise workflows. Autonomous systems are on the near horizon. Data is consolidating around centralized platforms. And the economic case for deployment is established.
For the data center industry, the implication is direct. The infrastructure buildout is not leading demand. It is catching up to it.
As enterprises move from copilots to autonomous systems, the nature of the workload changes. It becomes more continuous, more integrated, and more power-intensive.
The question is no longer whether AI will drive the next wave of data center growth. It is whether the industry can build capacity and deliver power fast enough to keep up.