
From NIMBY to YIMBY: A Playbook for Data Center Community Acceptance


Across many conversations at the start of this year, at PTC and other conferences alike, the word on everyone’s lips seems to be “community.” For the data center industry, that single word now captures a turning point from just a few short years ago: we are no longer a niche, back‑of‑house utility, but a front‑page presence in local politics, school board budgets, and town hall debates. That visibility is forcing a choice in how we tell our story—either accept a permanent NIMBY-reactive framework, or actively build a YIMBY narrative that portrays the real value digital infrastructure brings to the markets and surrounding communities that host it.

I speak regularly with Ilissa Miller, CEO of iMiller Public Relations, about this topic, and one conclusion keeps surfacing: there is work to be done across the ecosystem to build better communications. Miller recently reflected: “What we’re seeing in communities isn’t a rejection of digital infrastructure, it’s a rejection of uncertainty driven by anxiety and fear. Most local leaders have never been given a framework to evaluate digital infrastructure developments the way they evaluate roads, water systems, or industrial parks. When there’s no shared planning language, ‘no’ becomes the safest answer.”

A Brief History of “No”

Community pushback against data centers is no longer episodic; it has become organized, media‑savvy, and politically influential in key markets. In Northern Virginia, resident groups and environmental organizations have mobilized against large‑scale campuses, pressing counties like Loudoun and Prince William to tighten zoning, question incentives, and delay or reshape projects.[1]

Loudoun County’s move in 2025 to end by‑right approvals for new facilities, requiring public hearings and board votes, marked a watershed moment as the world’s densest data center market signaled that communities now expect more say over where and how these campuses are built. Prince William County’s decision to sharply increase its tax rate on data center equipment and Fairfax County’s ban on data centers within a mile of rail stations reinforced the message that growth would be more conditional and contested.

Opposition is also spreading beyond Virginia, with towns like Peculiar, Missouri, blocking billion‑dollar proposals over noise and siting concerns, and local coalitions using tools from lawsuits to yard‑sign campaigns to challenge projects. Data Center Watch reports that between May 2024 and March 2025, $64 billion of data center projects were blocked or delayed amid local opposition.[2] Heatmap News cites $98 billion of data center projects blocked or delayed across 25 project cancellations, 21 of which occurred in the second half of last year.[3] The resistance is gaining ground, and the costs are rising.

In this environment, simply reacting to “No” is no longer a strategy; it is a slow surrender of narrative ground to groups that define data centers solely by what they might take away, rather than what we actually create.

What Data Centers Really Bring

For host communities, the promise of data centers is not as abstract as some treat it. The value shows up readily in tax receipts, infrastructure upgrades, and long‑term employment, even if those benefits are often only loosely explained at the outset, since they depend on construction timelines and operational decisions made long after the initial site acquisition. In Loudoun County, data centers now provide nearly half of all property tax revenue, supporting public services at a scale few other industries can match and helping the county anticipate between $800 million and $1 billion in data center tax revenue this year alone.[4]

That kind of fiscal capacity changes local politics: officials have cited the industry’s impact on keeping residential tax rates lower and funding school construction and renovation without imposing new burdens on homeowners. Prince William County reports data center‑related revenue in the hundreds of millions of dollars, much of it flowing into the general fund that pays for education and core services. Across the United States, the broader data center industry contributed an estimated $162.7 billion in tax revenues to local, state, and federal governments in 2023, reflecting a 146 percent increase since 2017.[4]

Jobs, while sometimes debated locally, are both direct and indirect: from high‑skill roles paying into six figures to thousands of construction jobs and steady demand for electricians, HVAC technicians, security staff, and local vendors. In emerging regions like Southside Virginia, proposed campuses are pitched as catalysts for economic diversification, offering “high‑skill high‑paying” positions and investments in onsite power that expand the tax base and modernize energy infrastructure simultaneously.[5] The real story, if told clearly, is not just about supporting an ever‑expanding digital world, but about durable, capital‑intensive assets that anchor local revenue and enable broader technology adoption for businesses, schools, and local mission‑critical needs.

Taxes, Incentives, and Shared Upside

Incentive packages often become the flashpoint where NIMBY arguments crystallize, with critics charging that data centers receive “giveaways” at public expense. Yet in practice, many of these tools (property tax abatements, sales tax exemptions on equipment, and expedited permitting) are structured to trade near‑term relief for long‑term, outsized revenue streams and infrastructure upgrades. As states have discovered over the past several years, incentives have evolved from one‑off bargaining chips into core elements of a broader economic strategy, with dozens of jurisdictions racing to refine programs that can secure high‑value digital infrastructure while still delivering tangible returns for local residents.[6]

Reports from organizations like the Data Center Coalition document how, in markets such as Texas, data centers generate substantial recurring tax revenue at the state and local level, even after accounting for incentive programs.[7] Industry analyses highlight that for every dollar spent on local services for data centers, counties like Loudoun can receive well over twenty dollars back in revenue, a ratio difficult to replicate with other commercial uses.[5] These returns fund not only schools and emergency services, but also upgrades to roads, substations, and broadband that benefit the wider community, not just the project site.

As more states and localities compete for AI‑driven campuses, incentives are also aligning with policy goals: tying tax benefits to energy‑efficiency standards, limiting pass‑through of infrastructure costs to consumers, and encouraging co‑investment in resilient power and digital equity initiatives. The political debate is shifting from whether incentives are justified at all to how to ensure they produce visible, measurable gains for residents over the lifecycle of the facility. That is fertile ground for a YIMBY frame that positions communities as partners, not passive recipients, in structuring these deals.

From NIMBY to YIMBY: Reframing the Narrative

“Not in my back yard” has become shorthand for fears about noise, visual impact, energy use, and environmental risk—concerns that are real but often amplified by a lack of clear, accessible information. A YIMBY approach for data centers does not dismiss those objections; it gets ahead of them by acknowledging that these concerns persist, and then lays out what saying “yes” unlocks when projects are designed and negotiated well. 

One emerging theme from industry and civic conversations is that many community planners do not yet have a mental model for digital infrastructure; they think in terms of roads and water lines, not fiber routes or compute density. This “translation gap” can push risk‑averse local officials to default to “no” simply because they are unsure what questions to ask or what leverage they actually have.

Miller provided her insights into what these conversations might entail. “Many county planners aren’t opposed to data centers, they’re simply being asked to make long-term land-use decisions without a clear understanding of what questions to ask or what leverage they actually have. YIMBY doesn’t mean saying yes blindly; it means empowering communities with information to effectively assess what they have, what they need and ultimately what they want. The outcome, we hope, is clarity. They can say yes or no based on the specific conditions that reflect their community’s priorities.” 

YIMBY‑oriented engagement flips the script. Instead of approaching a locality with a fully baked plan and a narrow permit request, developers and operators can frame data centers as tools to achieve shared priorities: improved tax stability, digital inclusion, better connectivity for local businesses, and modernized energy systems. The language shifts from “let us build here” to “here is how this facility can help your community do what it already wants to do—if we discuss those needs together.”

Communities That Say “Yes”

Even as headline‑grabbing battles play out, there are quieter stories of communities that have not only accepted data centers, but actively pursued them as part of long‑term development strategies. Localities such as Danville and Pittsylvania County in Virginia[5] have unanimously approved data center proposals, seeing them as a path to new investment, infrastructure, and jobs in regions transitioning from traditional industries.

Local officials in these areas talk about data centers as economic engines, citing projected millions in annual tax revenues and hundreds of high‑paying jobs as justification for support. In some cases, early projects have already lowered tax pressures on residents or financed school improvements, reinforcing the perception that our facilities can coexist with rural character and quality of life when sited and managed responsibly.

On a broader scale, states that have built predictable, transparent policy frameworks, combining clear zoning, targeted incentives, and expectations around sustainability, are positioning themselves as YIMBY jurisdictions for digital infrastructure. Their message to both industry and citizens is that data centers are welcome, but on terms that reflect local values and long‑term planning priorities. These examples are the raw material for a new narrative: not just “no more data center sprawl,” but “yes, to the right projects, in the right places, with the right commitments.”

The Architecture of “Yes”

“Communities don’t need to be convinced that digital infrastructure matters. Technology is everywhere; we know it matters,” noted Miller. “Communities need to be empowered with information and tools to evaluate their community needs on their own terms. When local leaders understand what they can negotiate for, from siting and design to infrastructure investment and community benefits, the conversation shifts from opposition to partnership.”

One of the most promising developments in this shift from NIMBY to YIMBY is the work underway to give communities practical frameworks, language, and tools to engage with digital infrastructure on their own terms. Within the OIX Association, the Digital Infrastructure Framework Committee, led by board member Ilissa Miller, is building exactly this kind of toolkit.

The Digital Infrastructure Framework is designed to help local governments assess what digital assets they already have, where the gaps are, and how new infrastructure, from fiber routes to data centers, can close those gaps in ways that support everything from healthcare and education to small‑business growth. Miller emphasizes that the goal is not to convince every community to host a data center, but to ensure they understand their options and the negotiating leverage they bring to the table. As she has argued, planners are often saying “no” before they understand that they can ask for better design, stronger community benefits, or different siting approaches instead.

OIX is pairing this framework with active outreach to organizations like the National Association of Counties and the U.S. Conference of Mayors, embedding digital infrastructure considerations into mainstream municipal conversations. By meeting civic leaders where they already gather and speak, the initiative helps normalize data centers as part of the planning lexicon, not as exotic or opaque projects. That normalization is a quiet but powerful form of YIMBYism: it makes “yes, with conditions” a more comfortable, informed choice than “no, because we don’t understand this yet.”

Industry associations such as the Data Center Coalition are also contributing to the “Architecture of Yes” by publishing accessible data on economic and tax impacts at the state and local level. Their reports give mayors, supervisors, and community advocates concrete numbers to weigh against fears, showing how tax revenue, job creation, and infrastructure investment play out over time. When used alongside frameworks like OIX’s, these resources equip communities to negotiate from a position of knowledge rather than anxiety.

Shifting the Dialogue

Data centers have gone mainstream; they are now part of how cities and counties think about economic development, climate goals, and digital opportunity. That new visibility means our industry can no longer afford to define itself reactively against NIMBY critiques, repeating opponents’ vocabulary and responding on their terms.

A YIMBY mindset asks different questions: What does a “good” data center look like for this community? How do we measure success not only in megawatts and megabits, but in tax stability, educational outcomes, and resilience? Who are the local voices that need to be at the table early—planners, school boards, small businesses, neighborhood associations—and what information, tools, and negotiable levers do we put in their hands from the start?

Data center operators with “community” at the core of their approach, groups like OIX’s Digital Infrastructure Framework Committee, advocates such as Ilissa Miller, and industry bodies including the Data Center Coalition are sketching the outlines of an answer: standard language, shared data, and repeatable playbooks that help communities say “yes, and here is how we want this to work for us.” The next phase is for developers, operators, and investors to internalize that approach, integrating it into site selection, design, and community engagement rather than treating it as crisis management after opposition emerges.

Miller concludes, “The next phase for this industry isn’t about winning individual projects, it’s about earning long-term trust that the digital infrastructure projects are indeed for the greater good and not just corporate greed. That happens when digital infrastructure is integrated into master planning early, discussed transparently, and measured not just in megawatts, but in community outcomes over decades.”

If “community” is the word on everyone’s lips, then “Getting to Yes” needs to be the story our sector learns to tell—clearly, confidently, and collaboratively. Not because every project deserves approval, but because every community deserves the chance to understand what saying yes could truly mean in their own back yard.

References:

  1. https://www.hawthorngroup.com/energy-news/another-data-center-stopped-by-local-opposition/

  2. https://www.datacenterwatch.org/report

  3. https://www.yahoo.com/news/articles/report-finds-data-center-cancellations-005000945.html?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAAEBTvJ3wTKCbBf_qLZLG2jqeljgjD6Aqtl1xfcsVx6LHOi6vSH2_V_pKh98L4T5X9cPI_oZzomPwNQ86pTxK4UsMcoDc9pV-nJ4NQC1rA5mRNeoJWzMnP0mBCLKby4fUlnS3NWo9vUK-BkC3yxq6AF2ZshRZc3N04ELuGFgsiXWu

  4. https://datacentercoalition.org/cpages/faq

  5. https://cardinalnews.org/2025/04/10/data-centers-can-bring-high-paying-jobs-and-millions-in-tax-revenue-is-that-what-southside-will-get/

  6. https://www.datacenterfrontier.com/site-selection/article/55307797/incentivizing-the-digital-future-inside-americas-race-to-attract-data-centers

  7. https://datacentercoalition.org/reports-and-publications
