Utah’s 4 GW AI Campus Tests the Limits of Speed-to-Power

Back in September 2025, we examined an ambitious proposal from infrastructure developer Joule Capital Partners – often branding the effort as “Joule Power” – in partnership with Caterpillar. The concept is straightforward but consequential: acquire a vast rural tract in Millard County, Utah, and pair an AI-focused data center campus with large-scale, on-site “behind-the-meter” generation to bypass the interconnection queues, transmission constraints, and substation bottlenecks slowing projects nationwide.

The appeal is clear: speed-to-power and greater control over delivery timelines. But that speed shifts the project’s risk profile. Instead of navigating traditional utility procurement, the development begins to resemble a distributed power plant subject to industrial permitting, fuel supply logistics, air emissions scrutiny, noise controls, and groundwater governance. These are issues communities typically associate with generation facilities, not hyperscale data centers.

Our earlier coverage focused on the technical and strategic logic of pairing compute with on-site generation. Now the story has evolved. Community opposition is emerging as a material variable that could influence schedule and scope. Although a groundbreaking ceremony was held in November 2025, final site plans and key conditional use permits remain pending at the time of publication.

What Is Actually Being Proposed?

Public records from Millard County show Joule pursuing a zone change for approximately 4,000 acres (about 6.25 square miles), converting agricultural land near 11000 N McCornick Road to Heavy Industrial use. At a July 2025 public meeting, residents raised familiar concerns that surface when a rural landscape is targeted for hyperscale development: labor influx and housing strain, water use, traffic, dust and wildfire risk, wildlife disruption, and the broader loss of farmland and local character.

What has proven less clear is the precise scale and sequencing of the buildout.

Local reporting describes an initial phase of six data center buildings, each supported by a substantial fleet of Caterpillar natural-gas generators, with construction beginning “this spring.” Other accounts reference a significantly larger first phase, with entitlement discussions including as many as 32 buildings of roughly one million square feet each, even if only a portion would be constructed in the near term.

These descriptions are not necessarily contradictory. Developers often seek entitlements for a maximum buildout while actual construction proceeds in phased increments based on financing and customer commitments. But the distinction matters. Community impact (particularly around noise, emissions, traffic, and water) will be evaluated based on what is permitted and installed in the near term, not on long-range conceptual buildout plans.

What is clear is that the project’s critical path is not simply data center construction. It is prime power generation.

The Salt Lake Tribune reported that early plans pair each of six buildings with 69 Caterpillar natural-gas generators. At that scale, the site would require hundreds of engines, with community members describing projected sound levels as comparable to “more than 400 semi-trucks idling.” Joule and Caterpillar’s own materials position the campus as a 4-gigawatt development featuring combined cooling, heat and power (CCHP), liquid cooling by design, and a fleet of Caterpillar G3520K generator sets.
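
Simple arithmetic shows why generation dominates the critical path. Below is a minimal sketch, assuming the G3520K's commonly cited rating of roughly 2.5 MW per unit; that rating is an assumption here, since the binding site figure has not been published.

```python
# Back-of-the-envelope sizing for the reported first phase.
# Assumption: ~2.5 MW electrical per G3520K genset. Actual site output
# depends on altitude, fuel quality, and derating, none of which are public.

BUILDINGS = 6                 # initial phase, per Salt Lake Tribune reporting
GENSETS_PER_BUILDING = 69     # reported pairing per building
MW_PER_GENSET = 2.5           # assumed nameplate, not a confirmed figure

engines = BUILDINGS * GENSETS_PER_BUILDING
capacity_mw = engines * MW_PER_GENSET

print(f"Phase-one engines:  {engines}")              # 414
print(f"Approx. capacity:   {capacity_mw:,.0f} MW")  # ~1,035 MW
```

Under those assumptions, the first phase alone approaches a gigawatt of engine-based generation, a scale at which the campus behaves less like a data center with backup power and more like a power plant with attached data halls.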

This is not a conventional hyperscale construction program. It is the development of a distributed generation campus: foundations for large engine arrays, exhaust and aftertreatment systems, high-voltage switchyards, synchronization controls, fire and life-safety systems, fuel interconnections, and a commissioning process that resembles a utility-scale plant more than a colocation facility.

That shift introduces additional long-lead and integration risks, including:

  • Generator manufacturing capacity and delivery sequencing.

  • Emissions-control configuration and regulatory compliance.

  • Synchronization and islanding controls.

  • Black-start capability and ride-through design.

  • Commissioning under load, particularly before full tenant occupancy.

In short, the engineering challenge extends well beyond compute density. It centers on whether a modular, engine-based generation strategy can be permitted, delivered, and synchronized at multi-gigawatt scale without triggering schedule friction from the very industrial systems that make the project possible.

Construction Logistics: When Rural Scale Meets Industrial Volume

County proceedings reflect immediate concern about the practical realities of building at this scale. Residents raised questions about labor influx, temporary housing, traffic, and the strain placed on a rural road network not designed for sustained heavy industrial movement.

On a 4,000-acre site, construction logistics become a program of their own. Hundreds of large generator units, transformers, switchgear lineups, cooling systems, and potentially battery containers must be delivered, staged, installed, and commissioned. That translates into prolonged heavy trucking, haul-route coordination, road upgrades, laydown yards, pre-assembly zones, and ongoing dust management across miles of internal access roads.

Rural land availability is often cited as an advantage in hyperscale siting. But the same locations frequently lack depth in supporting infrastructure, from workforce housing and emergency response capacity to medical services and road maintenance budgets. That imbalance surfaced directly in public meetings, where residents asked whether the community is equipped to absorb the scale and duration of construction activity being proposed.

Dust and fire risk were raised explicitly in the record. In arid regions, dust affects more than local quality of life; it can degrade construction productivity and equipment reliability, increasing filtration and maintenance requirements for cooling and electrical systems. Fire risk, meanwhile, introduces questions about defensible space, fire-water supply, response times, and whether local emergency services would require expansion to support an industrial campus of this magnitude.

For a traditional data center, these concerns are manageable extensions of site work. For a multi-gigawatt campus anchored by engine-based generation, they become material schedule and community-relations variables.

Emissions Strategy Shapes the Site Plan

County meeting minutes note that the buildings are intentionally spread across the property “due to the emissions and not dispersing them all in one area.” That single comment reveals a great deal about the permitting strategy behind the layout.

At multi-gigawatt scale, particularly with engine-based generation, site geometry becomes an emissions-management tool. Distributing buildings and associated generator blocks across thousands of acres may help manage:

  • Localized pollutant concentration modeling results.

  • Noise contours and setback compliance.

  • Stack and exhaust dispersion dynamics.

  • Regulatory thresholds that can shift when large numbers of emission sources are co-located.

In other words, the physical layout is not driven solely by operational efficiency or campus aesthetics. It may also be structured to navigate air-quality modeling and permitting categories.
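
That logic can be made concrete with the textbook Gaussian plume model that underlies screening-level dispersion analysis. The sketch below is illustrative only (regulatory permitting relies on EPA-approved models such as AERMOD, and every source parameter here is hypothetical), but it shows why splitting one large emission block into dispersed smaller blocks lowers the peak ground-level concentration at any single receptor.

```python
import math

def briggs_sigmas(x_m):
    """Briggs rural dispersion coefficients, neutral stability (class D)."""
    sigma_y = 0.08 * x_m / math.sqrt(1 + 0.0001 * x_m)
    sigma_z = 0.06 * x_m / math.sqrt(1 + 0.0015 * x_m)
    return sigma_y, sigma_z

def ground_conc(q_gps, x_m, u_ms=3.0, stack_h_m=15.0):
    """Ground-level centerline concentration (g/m^3) from an elevated
    point source: standard Gaussian plume with ground reflection."""
    sy, sz = briggs_sigmas(x_m)
    return (q_gps / (math.pi * u_ms * sy * sz)) * math.exp(
        -stack_h_m**2 / (2 * sz**2))

# One clustered block emitting 10 g/s versus five dispersed blocks of
# 2 g/s each, the nearest 1 km from the receptor and the rest spaced a
# further 1 km apart along the wind direction (hypothetical geometry).
x = 1_000.0
clustered = ground_conc(10.0, x)
dispersed = sum(ground_conc(2.0, x + i * 1_000.0) for i in range(5))

print(f"Clustered peak:  {clustered:.2e} g/m^3")
print(f"Dispersed peak:  {dispersed:.2e} g/m^3")   # roughly a third as high
```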

There is a trade-off. Spreading structures across a large footprint increases civil and electrical complexity: longer internal roads, extended medium-voltage distribution runs, more trenching, additional switchgear segmentation, and greater redundancy requirements. That raises both capital cost and coordination demands.

But concentration carries its own risk. Clustering large generator arrays can intensify modeled emission “hot spots,” tighten setback constraints, and elevate the regulatory classification of the project.

In this case, the master plan appears to reflect a calculated balance between construction efficiency and emissions dispersion, a reminder that, at this scale, environmental modeling is influencing not just equipment selection but the geography of the campus itself.

Water Rights vs. Water Reality

Joule’s public case for the project emphasizes water independence. The company has reportedly secured rights to more than 10,000 acre-feet of groundwater annually (over 3 billion gallons) and has stressed that the campus will not rely on a municipal system.

Project materials also point to a closed-loop, direct-to-chip cooling architecture designed to minimize evaporative losses. According to Trellis, engineers estimate the data center would use significantly less water than the alfalfa farming currently supported on the land — potentially as much as 75% less on an annual basis.
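
For readers unused to water-rights units: an acre-foot is 325,851 US gallons, so the reported right pencils out to about 3.26 billion gallons a year. Here is a minimal conversion, with the reported 75% reduction applied as a hypothetical against the farm's historical use.

```python
GALLONS_PER_ACRE_FOOT = 325_851   # standard US conversion

right_af = 10_000                 # reported annual groundwater right
print(f"Annual right: {right_af * GALLONS_PER_ACRE_FOOT / 1e9:.2f}B gallons")

# Hypothetical: if the campus uses 75% less than the alfalfa operation
# (the Trellis-reported engineering estimate) and the farm historically
# pumped its full right, implied withdrawals would be about a quarter.
implied_af = right_af * (1 - 0.75)
print(f"Implied campus use: {implied_af:,.0f} acre-feet/yr")  # 2,500
```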

In arid regions, however, possessing legal rights does not automatically resolve public concern.

Opposition tends to focus less on annual totals and more on long-term basin health: aquifer drawdown over time, impacts on neighboring wells, drought-cycle variability, and the transparency of monitoring and reporting regimes. There is also a precedent question. If one multi-gigawatt industrial campus can rely on privately controlled groundwater at this scale, others may attempt to follow.

At the state level, scrutiny is increasing. Utah lawmakers have signaled interest in expanding water-use reporting requirements for data centers, reflecting broader concern about transparency and sustainability in water-constrained regions.

The core issue is not simply consumption. It is governance, verification, and public trust in how withdrawals will be measured and managed over decades of operation.

Air Permitting: The Project’s Central Flashpoint

The defining feature of the Utah campus — large-scale, on-site gas-fired generation — is also its most direct environmental vulnerability.

According to Trellis, air permit applications filed with Utah regulators indicate that the initial six-building phase could emit approximately 4,380 tons per year of regulated pollutants (excluding CO₂), including roughly 1,380 tons annually of nitrogen oxides (NOx). Trellis further reports that the projected NOx rate is materially higher than that of Utah’s gas-fueled Lake Side Power Plant, based on EPA data comparisons.
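
Rough per-unit arithmetic puts the tonnage in context, though it leans on two assumptions not in the filings: the roughly 1 GW phase-one capacity sketched earlier, and near-continuous prime-power operation. Treat the output as an order-of-magnitude figure, not a permit value.

```python
# NOx intensity estimate. Capacity and capacity factor are assumptions,
# not figures from the Utah air permit applications.
NOX_TONS_PER_YEAR = 1_380        # reported in the permit applications
ASSUMED_CAPACITY_MW = 1_035      # 414 engines x ~2.5 MW (earlier sketch)
ASSUMED_CAPACITY_FACTOR = 0.90   # near-continuous prime power

mwh = ASSUMED_CAPACITY_MW * 8_760 * ASSUMED_CAPACITY_FACTOR
nox_lb_per_mwh = NOX_TONS_PER_YEAR * 2_000 / mwh
print(f"Approx. NOx intensity: {nox_lb_per_mwh:.2f} lb/MWh")  # ~0.34
```

Modern combined-cycle plants with selective catalytic reduction typically run well under 0.1 lb/MWh, which is directionally consistent with the Lake Side comparison Trellis draws.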

Those figures shift the conversation. This is no longer simply a data center debate; it is an air-quality and industrial-generation discussion.

The Salt Lake Tribune has highlighted a broader concern emerging statewide: when utility interconnection timelines stretch too long, some data center developers are choosing to build generation on-site, effectively relocating power plant emissions closer to new industrial campuses. For environmental advocates, that raises questions about cumulative air-quality impacts and regulatory precedent.

From a construction and delivery standpoint, the risk becomes schedule-driven. Air permitting — including dispersion modeling and determinations around Best Available Control Technology (BACT) or, if triggered, Lowest Achievable Emission Rate (LAER) requirements — can materially influence equipment configuration. If regulators tighten emissions controls or modeling assumptions late in the process, engine specifications, aftertreatment systems, or operating limits may need to be revised after procurement decisions have been made.

At multi-gigawatt scale, those revisions are not minor adjustments. They can mean redesign, additional capital expenditure, extended lead times, and commissioning delays.

In effect, the project’s speed-to-power advantage hinges on successfully navigating a regulatory pathway more commonly associated with utility-scale generation than with hyperscale data halls.

Noise: Operational Reality, Not Rhetoric

Noise concerns surfaced directly in county proceedings, including discussion of projected decibel levels. The Salt Lake Tribune characterized the anticipated generator output as comparable to “hundreds of idling semi-trucks,” a description vivid enough to resonate well beyond technical modeling.

At the scale proposed, acoustic mitigation becomes a core design requirement, not a secondary engineering detail. Engine enclosures, exhaust mufflers, sound walls, berming, building orientation, and setback distances all influence both compliance and community acceptance.
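
The "400 idling semi-trucks" comparison, and the mitigation toolkit, both follow from basic decibel arithmetic: equal incoherent sources combine as 10·log10(N), and a point source loses about 6 dB per doubling of distance. Below is a minimal sketch with hypothetical source levels and distances.

```python
import math

def combine(level_db, n_sources):
    """Combined level of n equal, incoherent sources: +10*log10(n)."""
    return level_db + 10 * math.log10(n_sources)

def attenuate(level_db, ref_m, dist_m):
    """Spherical spreading from a point source: -20*log10(d/d_ref),
    i.e. -6 dB per doubling (ignores enclosures, berms, ground, air)."""
    return level_db - 20 * math.log10(dist_m / ref_m)

# Hypothetical: one genset at 85 dB(A) measured at 15 m; 414 gensets
# treated as co-located, which is a worst case for a spread-out campus.
fleet = combine(85.0, 414)           # ~111 dB(A) referenced to 15 m
print(f"Fleet level @ 15 m:   {fleet:.0f} dB(A)")
print(f"Unmitigated @ 1 km:   {attenuate(fleet, 15, 1_000):.0f} dB(A)")  # ~75
```

Distance alone is an expensive mitigation, since each doubling of setback buys only 6 dB; enclosures, mufflers, and barriers that remove tens of decibels at the source are what actually move the contours.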

Noise also carries enforcement implications. Even where permitted limits are met on paper, persistent complaints can lead to additional monitoring, operational restrictions, or pressure to retrofit mitigation measures. For a campus built around continuous engine-based generation, acoustic performance becomes an operational variable with both regulatory and reputational consequences.

From Farmland to Industrial Power Campus

Beyond engineering metrics lies a broader land-use shift. Residents have voiced concern about the conversion of agricultural land, described by Trellis as a family alfalfa operation, into a multi-gigawatt industrial complex.

Even if the data center ultimately consumes less water than the existing agricultural activity, the transformation is not simply volumetric. It represents a permanent change in landscape function: new road networks, fencing, lighting, substations, and generation yards replacing open farmland.

That transition introduces habitat fragmentation, construction disturbance, and long-term industrialization of a rural corridor. For some in the community, the question is not only environmental impact but identity: whether the region is prepared to redefine itself from agricultural base to energy-intensive digital infrastructure hub.

Power at Utility Scale

Caterpillar’s announcement frames the project as a 4-gigawatt campus incorporating 1.1 GWh of grid-forming battery energy storage, combined cooling, heat and power (CCHP), and what it describes as “diverse fuel sources.” Our earlier coverage reflected those figures. Subsequent reporting from Trellis suggests the broader site could ultimately scale toward 12 gigawatts, depending on entitlement and demand.

Even at the lower bound, 4 GW is not simply a large substation. It is utility-scale generation.

If a meaningful portion of that capacity is installed in early phases, fuel logistics, emissions permitting, and operational oversight become regional planning issues rather than purely site-level considerations.

The design direction outlined publicly includes:

  • Large fleets of Caterpillar natural-gas generator sets.

  • A planned pipeline interconnection (with county minutes referencing the Kern River system and Trellis citing nearby gas infrastructure).

  • Battery storage positioned for load smoothing, firming, and grid-forming capability, with the system described as “permitted to accept cleaner electricity” in the future, including potential fuel cells, geothermal, or small modular reactors.
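
The storage figure in that last bullet also clarifies the battery's role: 1.1 GWh against a 4 GW load is roughly 16 minutes of full-load ride-through, sized for smoothing engine transients and grid-forming duty rather than backup power in the data center sense. The arithmetic:

```python
battery_gwh = 1.1   # grid-forming BESS, per Caterpillar's announcement
load_gw = 4.0       # announced full campus scale

minutes = battery_gwh / load_gw * 60
print(f"Full-load duration: {minutes:.1f} minutes")  # 16.5
```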

This approach reflects a broader pattern emerging in AI-oriented campus development: deploy modular gas generation and storage to secure immediate power availability, then pursue incremental decarbonization as alternative supply chains mature.

Trellis cites Utah Clean Energy describing the engine-based approach as less efficient than combined-cycle turbine plants, and therefore potentially more emissions-intensive, while acknowledging that reciprocating engines are deployable today.

That is the central trade-off.

Modular engines and battery systems offer speed, sequencing flexibility, and independence from grid interconnection timelines.

But they also anchor the project in an industrial permitting regime defined by air quality, fuel supply, noise, and long-term emissions intensity, considerations that diverge from the renewables-backed narrative often associated with hyperscale data center expansion.

What Must Be Resolved

For community support to solidify, several questions need clear, enforceable answers.

First, what precisely constitutes Phase I? Whether the near-term build is limited to six buildings or represents the leading edge of a much larger sequence materially changes projected impacts on traffic, emissions, water, and noise. Entitlement scale and construction scale must align transparently.

Second, what are the binding air-permit limits and monitoring protocols? The reported tonnage and NOx comparisons suggest the air program will sit at the center of regulatory scrutiny. Modeling assumptions, control technologies, and compliance verification will define both schedule and public confidence.

Third, what noise mitigation commitments are contractually embedded in the design, and how will compliance be validated over time? Acoustic performance is not theoretical at this scale; it is measurable and enforceable.

Fourth, how will groundwater withdrawals be metered, reported, and made publicly auditable? Legal rights alone do not guarantee community trust in a drought-sensitive basin.

Finally, who absorbs the cost of ancillary impacts: road upgrades, emergency-response expansion, and workforce-housing strain? County proceedings suggest these concerns are already embedded in local discourse.

A Test Case for the AI Power Model

In many respects, Joule’s 4,000-acre Utah campus represents more than a single development proposal. It is a case study in the next phase of AI-era infrastructure strategy.

When grid interconnection timelines stretch beyond acceptable delivery windows, developers are increasingly bringing the power plant to the servers.

That shift changes the development equation. The core question is no longer simply whether a data center can be constructed on time and on budget. It becomes whether a utility-scale distributed generation system can be entitled, financed, built, and operated without sustained opposition over air emissions, noise, water use, and land conversion.

Scale does not eliminate local scrutiny. Even in rural settings, multi-gigawatt projects introduce industrial impacts that communities recognize and evaluate accordingly. In that sense, the Utah proposal may foreshadow a broader industry reality: as AI campuses grow to industrial dimensions, they inherit industrial politics.
