Stay Ahead, Stay ONMINE

AWS report: Generative AI overtakes security in global tech budgets for 2025

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Generative AI tools have surpassed cybersecurity as the top budget priority for global IT leaders heading into 2025, according to a comprehensive new study released today by Amazon Web Services. The AWS Generative AI Adoption Index, […]

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More


Generative AI tools have surpassed cybersecurity as the top budget priority for global IT leaders heading into 2025, according to a comprehensive new study released today by Amazon Web Services.

The AWS Generative AI Adoption Index, which surveyed 3,739 senior IT decision makers across nine countries, reveals that 45% of organizations plan to prioritize generative AI spending over traditional IT investments like security tools (30%) — a significant shift in corporate technology strategies as businesses race to capitalize on AI’s transformative potential.

“I don’t think it’s cause for concern,” said Rahul Pathak, Vice President of Generative AI and AI/ML Go-to-Market at AWS, in an exclusive interview with VentureBeat. “The way I interpret that is that customers’ security remains a massive priority. What we’re seeing with AI being such a major item from a budget prioritization perspective is that customers are seeing so many use cases for AI. It’s really that there’s a broad need to accelerate adoption of AI that’s driving that particular outcome.”

The extensive survey, conducted across the United States, Brazil, Canada, France, Germany, India, Japan, South Korea, and the United Kingdom, shows that generative AI adoption has reached a critical inflection point, with 90% of organizations now deploying these technologies in some capacity. More tellingly, 44% have already moved beyond the experimental phase into production deployment.

IT leaders rank generative AI as their top budget priority for 2025, significantly outpacing traditional security investments. (Credit: Amazon Web Services)

60% of companies have already appointed Chief AI Officers as C-suite transforms for the AI era

As AI initiatives scale across organizations, new leadership structures are emerging to manage the complexity. The report found that 60% of organizations have already appointed a dedicated AI executive, such as a Chief AI Officer (CAIO), with another 26% planning to do so by 2026.

This executive-level commitment reflects growing recognition of AI’s strategic importance, though the study notes that nearly one-quarter of organizations will still lack formal AI transformation strategies by 2026, suggesting potential challenges in change management.

“A thoughtful change management strategy will be critical,” the report emphasizes. “The ideal strategy should address operating model changes, data management practices, talent pipelines, and scaling strategies.”

Companies average 45 AI experiments but only 20 will reach users in 2025: the production gap challenge

Organizations conducted an average of 45 AI experiments in 2024, but only about 20 are expected to reach end users by 2025, highlighting persistent implementation challenges.

“For me to see over 40% going into production for something that’s relatively new, I actually think is pretty rapid and high success rate from an adoption perspective,” Pathak noted. “That said, I think customers are absolutely using AI in production at scale, and I think we want to obviously see that continue to accelerate.”

The report identified talent shortages as the primary barrier to transitioning experiments into production, with 55% of respondents citing the lack of a skilled generative AI workforce as their biggest challenge.

“I’d say another big piece that’s an unlock to getting into production successfully is customers really working backwards from what business objectives they’re trying to drive, and then also understanding how will AI interact with their data,” Pathak told VentureBeat. “It’s really when you combine the unique insights you have about your business and your customers with AI that you can drive a differentiated business outcome.”

Organizations conducted 45 AI experiments on average in 2024, but talent shortages prevent more than half from reaching production. (Credit: Amazon Web Services)

92% of organizations will hire AI talent in 2025 while 75% implement training to bridge skills gap

To address the skills gap, organizations are pursuing dual strategies of internal training and external recruitment. The survey found that 56% of organizations have already developed generative AI training plans, with another 19% planning to do so by the end of 2025.

“For me, it’s clear that it’s top of mind for customers,” Pathak said regarding the talent shortage. “It’s, how do we make sure that we bring our teams along and employees along and get them to a place where they’re able to maximize the opportunity.”

Rather than specific technical skills, Pathak emphasized adaptability: “I think it’s more about, can you commit to sort of learning how to use AI tools so you can build them into your day-to-day workflow and keep that agility? I think that mental agility will be important for all of us.”

The talent push extends beyond training to aggressive hiring, with 92% of organizations planning to recruit for roles requiring generative AI expertise in 2025. In a quarter of organizations, at least 50% of new positions will require these skills.

One in four organizations will require generative AI skills for at least half of all new positions in 2025. (Credit: Amazon Web Services)

Financial services joins hybrid AI revolution: only 25% of companies building solutions from scratch

The long-running debate over whether to build proprietary AI solutions or leverage existing models appears to be resolving in favor of a hybrid approach. Only 25% of organizations plan to deploy solutions developed in-house from scratch, while 58% intend to build custom applications on pre-existing models and 55% will develop applications on fine-tuned models.

This represents a notable shift for industries traditionally known for custom development. The report found that 44% of financial services firms plan to use out-of-the-box solutions — a departure from their historical preference for proprietary systems.

“Many select customers are still building their own models,” Pathak explained. “That being said, I think there’s so much capability and investment that’s gone into core foundation models that there are excellent starting points, and we’ve worked really hard to make sure customers can be confident that their data is protected. Nothing leaks into the models. Anything they do for fine-tuning or customization is private and remains their IP.”

He added that companies can still leverage their proprietary knowledge while using existing foundation models: “Customers realize that they can get the benefits of their proprietary understanding of the world with things like RAG [Retrieval-Augmented Generation] and customization and fine-tuning and model distillation.”

Most organizations favor customizing existing AI models rather than building solutions from scratch. (Credit: Amazon Web Services)

India leads global AI adoption at 64% with South Korea following at 54%, outpacing Western markets

While generative AI investment is a global trend, the study revealed regional variations in adoption rates. The U.S. showed 44% of organizations prioritizing generative AI investments, aligning with the global average of 45%, but India (64%) and South Korea (54%) demonstrated significantly higher rates.

“We are seeing massive adoption around the world,” Pathak observed. “I thought it was interesting that there was a relatively high amount of consistency on the global side. I think we did see in our respondents that, if you squint at it, I think we’ve seen India maybe slightly ahead, other parts slightly behind the average, and then kind of the U.S. right on line.”

65% of organizations will rely on third-party vendors to accelerate AI implementation in 2025

As organizations navigate the complex AI landscape, they increasingly rely on external expertise. The report found that 65% of organizations will depend on third-party vendors to some extent in 2025, with 15% planning to rely solely on vendors and 50% adopting a mixed approach combining in-house teams and external partners.

“For us, it’s very much an ‘and’ type of relationship,” Pathak said of AWS’s approach to supporting both custom and pre-built solutions. “We want to meet customers where they are. We’ve got a huge partner ecosystem we’ve invested in from a model provider perspective, so Anthropic and Meta, Stability, Cohere, etc. We’ve got a big partner ecosystem of ISVs. We’ve got a big partner ecosystem of service providers and system integrators.”

Two-thirds of organizations will rely on external expertise to deploy generative AI solutions in 2025. (Credit: Amazon Web Services)

The imperative to act now or risk being left behind

For organizations still hesitant to embrace generative AI, Pathak offered a stark warning: “I really think customers should be leaning in, or they’re going to risk getting left behind by their peers who are. The gains that AI can provide are real and significant.”

He emphasized the accelerating pace of innovation in the field: “The rate of change and the rate of improvement of AI technology and the rate of the reduction of things like the cost of inference are significant and will continue to be rapid. Things that seem impossible today will seem like old news in probably just three to six months.”

This sentiment is echoed in the widespread adoption across sectors. “We see such a rapid, such a mass breadth of adoption,” Pathak noted. “Regulated industries, financial services, healthcare, we see governments, large enterprise, startups. The current crop of startups is almost exclusively AI-driven.”

The business-first approach to AI success

The AWS report paints a portrait of generative AI’s rapid evolution from cutting-edge experiment to fundamental business infrastructure. As organizations shift budget priorities, restructure leadership teams, and race to secure AI talent, the data suggests we’ve reached a decisive tipping point in enterprise AI adoption.

Yet amid the technological gold rush, the most successful implementations will likely come from organizations that maintain a relentless focus on business outcomes rather than technological novelty. As Pathak emphasized, “AI is a powerful tool, but you got to start with your business objective. What are you trying to accomplish as an organization?”

In the end, the companies that thrive won’t necessarily be those with the biggest AI budgets or the most advanced models, but those that most effectively harness AI to solve real business problems with their unique data assets. In this new competitive landscape, the question is no longer whether to adopt AI, but how quickly organizations can transform AI experiments into tangible business advantage before their competitors do.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Backblaze adds cloud storage security protection features

Application Keys have received a significant boost, starting with Multi-Bucket Application Keys, which make it possible to create a single key that can be used for more than one specific cloud storage bucket. This enhancement provides more granular control over bucket access, reducing the attack surface.  Secondly, Backblaze is now

Read More »

Crude Futures Soar

Oil ended the session at the highest levels in over a month as President Donald Trump reiterated that the US may impose additional levies on Russia unless it reached a truce with Ukraine, stoking worries about tighter supplies. West Texas Intermediate oil settled at $69.21 a barrel while Brent settled above $72 a barrel, with both benchmarks at the highest since June. Speaking to reporters aboard Air Force One Tuesday, Trump warned of the potential for secondary sanctions if Moscow fails to reach a ceasefire within ten days. Asked if he was worried about the impact additional sanctions on Russia would have on the oil market, Trump said he was not concerned, suggesting that the US could ramp up its own energy production. “I don’t worry about it. We have so much oil in our country. We’ll just step it up, even further,” he said. This week, bullish options on the Brent crude benchmark flipped to a premium to bearish options for the first time in two weeks, signaling the optimistic sentiment extended beyond headline prices. “The new deadline caught many analysts by surprise and, if enforced, could tighten Russian crude and fuel supplies to the global market,” said Dennis Kissler, senior vice president for trading at BOK Financial Securities. Earlier Tuesday morning, Kremlin made it clear that President Vladimir Putin is unlikely to change course, after taking note of the US President’s threat. Trump’s warning follows the latest round of sanctions by the European Union targeting Russia, including penalties on India’s Nayara Energy, which has trimmed processing rates at a refinery as a result of the measures. Global markets are also focused on the US deadline to nail down trade deals by Aug. 1, and the upcoming OPEC+ meeting that will decide supply policy for September. Oil was already

Read More »

OPEC+ Oil Gambit Looking a Little Less Costly, For Now

The financial sting of OPEC+’s shock move to open the oil taps appears to be fading — for the time being.  When Saudi Arabia and its partners agreed four months ago to rapidly revive crude production, the fallout seemed catastrophic: prices crashed to a four-year low, leaving producers with widening budget deficits as state revenues dwindled.  But the ensuing months, which saw the Organization of the Petroleum Exporting Countries and its partners announce further supply increases, have brought some solace.  As benchmark Brent recovers to $70 a barrel and the countries’ production targets rise, the nominal value of output from four of OPEC’s key Middle East members has climbed to the highest since February. This month it jumped to almost $1.4 billion per day, according to calculations using data from Rystad Energy A/S.  The main uncertainty, however, is whether this tentative rebound will endure.  Oil forecasters at Goldman Sachs Group Inc. and JPMorgan Chase & Co. widely expect a price slump later this year as extra barrels from OPEC+ swell the surplus created by faltering Chinese demand and brimming US supply. That could slash OPEC+ revenues again, and even pressure the coalition to roll back the latest output hikes. Eight key OPEC+ nations will decide this weekend on another bumper production hike for September, which would complete the restart of a 2.2 million-barrel supply tranche a year ahead of schedule. The alliance is still raking in less cash than before it loosened the spigot, and could face a deeper slump in the months ahead. But for now Riyadh can take consolation that the hit from pushing through such a bold strategy wasn’t much worse.   WHAT DO YOU THINK? Generated by readers, the comments included herein do not reflect the views and opinions of Rigzone. All comments are subject to editorial review. Off-topic, inappropriate

Read More »

Department of Energy Issues Report Evaluating Impact of Greenhouse Gasses on U.S. Climate, Invites Public Comment

WASHINGTON—The U.S. Department of Energy (DOE) today released a new report, A Critical Review of Impacts of Greenhouse Gas Emissions on the U.S. Climate, evaluating existing peer-reviewed literature and government data on climate impacts of Greenhouse Gas (GHG) Emissions and providing a critical assessment of the conventional narrative on climate change. The report was developed by the 2025 Climate Working Group, a group of five independent scientists assembled by Energy Secretary Chris Wright with diverse expertise in physical science, economics, climate science and academic research. Among the key findings, the report concludes that CO2-induced warming appears to be less damaging economically than commonly believed, and that aggressive mitigation strategies may be misdirected. Additionally, the report finds that U.S. policy actions are expected to have undetectably small direct impacts on the global climate and any effects will emerge only with long delays. “The rise of human flourishing over the past two centuries is a story worth celebrating. Yet we are told—relentlessly—that the very energy systems that enabled this progress now pose an existential threat,” said U.S. Secretary of Energy Chris Wright.“Climate change is real, and it deserves attention. But it is not the greatest threat facing humanity. As someone who values data, I know that improving the human condition depends on expanding access to reliable, affordable energy.” The report was published today as part of the U.S. Environmental Protection Agency’s (EPA) proposed rule repealing the 2009 Endangerment Finding. EPA’s proposed rule, if finalized, will mark a critical step forward in achieving President Trump’s commitment to unleash American energy dominance and return to commonsense policymaking. “We stand at the threshold of a new era of energy leadership,” Secretary Wright concluded. “If we empower innovation rather than restrain it, America can lead the world in providing more abundant energy—lifting billions out of poverty,

Read More »

New York launches 1 GW bulk energy storage solicitation using new credit

Dive Brief: The state of New York announced its first bulk energy storage solicitation on Monday, issuing a request for proposals to procure a total of 1 GW from storage developers as part of a state goal to deploy 6 GW of energy storage by 2030. The New York State Energy Research and Development Authority is employing a system of index storage credits adopted last year and partially modeled on the state’s renewable energy certificates. One ISC is equivalent to one MWh of discharge capability per day, according to the RFP. “This is one of the largest energy storage procurements in the nation, and it’s coming at a time when a lot of states may be walking away from a lot of their clean energy goals,” said Kristina Persaud, who leads Advanced Energy United’s policy initiatives in New York. “I think it sends a really clear market signal to developers that New York is serious about affordable clean energy … [and] staying the course in the long term.” Dive Insight: A statement from Governor Kathy Hochul described the ISC as a “market-based mechanism that gives project owners greater revenue certainty while incentivizing them to participate in wholesale energy and capacity markets.” NYSERDA said in a release that its goal in adding bulk energy storage to New York’s grid is to “lower costs, optimize the generation and transmission of power, enhance energy grid infrastructure, and ensure the reliability and resilience of the state’s electricity system.” The storage projects that bid into the solicitation will have to meet new storage safety codes adopted last week by the State Fire Prevention and Building Code Council. Although the changes won’t formally take effect until Jan. 1, NYSERDA has “already integrated these recommendations into its residential, retail, and bulk energy storage programs.” Six of the city’s

Read More »

Pemex Posts First Profit in Over a Year

For the first time in more than a year, Petroleos Mexicanos has swung to a profit, a positive signal for the embattled state oil driller as President Claudia Sheinbaum’s administration seeks to raise as much as $12 billion to help pay down the company’s massive debts. Pemex’s results were boosted by currency moves in the second quarter, thanks to a strengthening in the peso. Lower cost of sales and stronger performance among some financial assets contributed. The positive report comes as Sheinbaum’s administration seeks to sell as much as $12 billion in securities to international investors in a bid to raise financing to help pay Pemex’s roughly $100 billion in debt. The profits could help to make the financing round go more smoothly.  Pemex posted a net income of 59.52 billion pesos ($3.2 billion) for the second quarter, compared with a 273.3 billion peso loss a year prior. Pemex reported about $30 billion in losses in 2024. Crude and condensate production slumped to 1.63 million barrels per day, down 8.6% from a year earlier, the company said. Natural gas output was almost 3.6 billion cubic feet per day, a 3.7% drop from a year prior. Crude processing climbed. The debt offering, disclosed in a filing July 22, will consist of dollar-denominated debt maturing August 2030, in the form of amortizing pre-capitalized securities, or P-Caps, a type of instrument used in asset-backed finance. Mexico’s finance ministry has said the operation would allow Pemex to address short-term financial and operational needs, while keeping the liabilities off Pemex and Mexico’s official balance sheets. Pemex said on Monday the proceeds would be used in part to refinance the company’s short-term bank debt. Pemex will also publish a comprehensive business plan in the coming weeks, which will include further guidance on future debt operations. Pemex’s

Read More »

Vitol Hands Record $10.6B Payout to Its Traders

Vitol Group handed a record $10.6 billion to its executives and senior staff through share buybacks last year, as the fallout of the energy crisis continued to deliver extraordinary riches to the world’s commodity traders. The share repurchase – almost certainly the highest such payout in the industry’s history – means privately held Vitol has distributed over $31 billion to its partners in the past decade, according to the company’s audited annual accounts seen by Bloomberg News. The numbers show how the disruptions that followed Russia’s invasion of Ukraine have handed a spectacular bonanza to a small group of commodity traders that their predecessors could only have dreamed of. Vitol has paid out more through buybacks in the past three years than in the previous 17 years combined. The world’s largest commodity trading house, the company is owned by 450-500 of its employees, a senior executive told a New York court last year. Based on that number, the 2024 payout would represent an average of over $20 million per partner, with some top executives and traders likely having received multiples of that. The results also cement Vitol’s position as the most profitable commodity trading house by far: its net profit for the year of $8.7 billion was more than the combined profits of its four closest rivals, Glencore Plc, Trafigura Group, Mercuria Energy Group Ltd. and Gunvor Group. Still, the huge payout comes as earnings are moderating across the industry. Vitol’s buyback in 2024 outstripped its profit for the year, meaning that the group’s equity dropped from $32.5 billion at the end of 2023 to $30.7 billion at the end of 2024. There’s a similar trend taking place across the largest commodity trading companies, many of which operate as employee-owned partnerships, particularly as senior executives who have accumulated valuable shareholdings retire. At

Read More »

AI Deployments are Reshaping Intra-Data Center Fiber and Communications

Artificial Intelligence is fundamentally changing the way data centers are architected, with a particular focus on the demands placed on internal fiber and communications infrastructure. While much attention is paid to the fiber connections between data centers or to end-users, the real transformation is happening inside the data center itself, where AI workloads are driving unprecedented requirements for bandwidth, low latency, and scalable networking. Network Segmentation and Specialization Inside the modern AI data center, the once-uniform network is giving way to a carefully divided architecture that reflects the growing divergence between conventional cloud services and the voracious needs of AI. Where a single, all-purpose network once sufficed, operators now deploy two distinct fabrics, each engineered for its own unique mission. The front-end network remains the familiar backbone for external user interactions and traditional cloud applications. Here, Ethernet still reigns, with server-to-leaf links running at 25 to 50 gigabits per second and spine connections scaling to 100 Gbps. Traffic is primarily north-south, moving data between users and the servers that power web services, storage, and enterprise applications. This is the network most people still imagine when they think of a data center: robust, versatile, and built for the demands of the internet age. But behind this familiar façade, a new, far more specialized network has emerged, dedicated entirely to the demands of GPU-driven AI workloads. In this backend, the rules are rewritten. Port speeds soar to 400 or even 800 gigabits per second per GPU, and latency is measured in sub-microseconds. The traffic pattern shifts decisively east-west, as servers and GPUs communicate in parallel, exchanging vast datasets at blistering speeds to train and run sophisticated AI models. The design of this network is anything but conventional: fat-tree or hypercube topologies ensure that no single link becomes a bottleneck, allowing thousands of

Read More »

ABB and Applied Digital Build a Template for AI-Ready Data Centers

Toward the Future of AI Factories The ABB–Applied Digital partnership signals a shift in the fundamentals of data center development, where electrification strategy, hyperscale design and readiness, and long-term financial structuring are no longer separate tracks but part of a unified build philosophy. As Applied Digital pushes toward REIT status, the Ellendale campus becomes not just a development milestone but a cornerstone asset: a long-term, revenue-generating, AI-optimized property underpinned by industrial-grade power architecture. The 250 MW CoreWeave lease, with the option to expand to 400 MW, establishes a robust revenue base and validates the site’s design as AI-first, not cloud-retrofitted. At the same time, ABB is positioning itself as a leader in AI data center power architecture, setting a new benchmark for scalable, high-density infrastructure. Its HiPerGuard Medium Voltage UPS, backed by deep global manufacturing and engineering capabilities, reimagines power delivery for the AI era, bypassing the limitations of legacy low-voltage systems. More than a component provider, ABB is now architecting full-stack electrification strategies at the campus level, aiming to make this medium-voltage model the global standard for AI factories. What’s unfolding in North Dakota is a preview of what’s coming elsewhere: AI-ready campuses that marry investment-grade real estate with next-generation power infrastructure, built for a future measured in megawatts per rack, not just racks per row. As AI continues to reshape what data centers are and how they’re built, Ellendale may prove to be one of the key locations where the new standard was set.

Read More »

Amazon’s Project Rainier Sets New Standard for AI Supercomputing at Scale

Supersized Infrastructure for the AI Era As AWS deploys Project Rainier, it is scaling AI compute to unprecedented heights, while also laying down a decisive marker in the escalating arms race for hyperscale dominance. With custom Trainium2 silicon, proprietary interconnects, and vertically integrated data center architecture, Amazon joins a trio of tech giants, alongside Microsoft’s Project Stargate and Google’s TPUv5 clusters, who are rapidly redefining the future of AI infrastructure. But Rainier represents more than just another high-performance cluster. It arrives in a moment where the size, speed, and ambition of AI infrastructure projects have entered uncharted territory. Consider the past several weeks alone: On June 24, AWS detailed Project Rainier, calling it “a massive, one-of-its-kind machine” and noting that “the sheer size of the project is unlike anything AWS has ever attempted.” The New York Times reports that the primary Rainier campus in Indiana could include up to 30 data center buildings. Just two days later, Fermi America unveiled plans for the HyperGrid AI campus in Amarillo, Texas on a sprawling 5,769-acre site with potential for 11 gigawatts of power and 18 million square feet of AI data center capacity. And on July 1, Oracle projected $30 billion in annual revenue from a single OpenAI cloud deal, tied to the Project Stargate campus in Abilene, Texas. As Data Center Frontier founder Rich Miller has observed, the dial on data center development has officially been turned to 11. Once an aspirational concept, the gigawatt-scale campus is now materializing—15 months after Miller forecasted its arrival. “It’s hard to imagine data center projects getting any bigger,” he notes. “But there’s probably someone out there wondering if they can adjust the dial so it goes to 12.” Against this backdrop, Project Rainier represents not just financial investment but architectural intent. Like Microsoft’s Stargate buildout in

Read More »

Google and CTC Global Partner to Fast-Track U.S. Power Grid Upgrades

On June 17, 2025, Google and CTC Global announced a joint initiative to accelerate the deployment of high-capacity power transmission lines using CTC’s U.S.-manufactured ACCC® advanced conductors. The collaboration seeks to relieve grid congestion by rapidly upgrading existing infrastructure, enabling greater integration of clean energy, improving system resilience, and unlocking capacity for hyperscale data centers. The effort represents a rare convergence of corporate climate commitments, utility innovation, and infrastructure modernization aligned with the public interest. As part of the initiative, Google and CTC issued a Request for Information (RFI) with responses due by July 14. The RFI invites utilities, state energy authorities, and developers to nominate transmission line segments for potential fast-tracked upgrades. Selected projects will receive support in the form of technical assessments, financial assistance, and workforce development resources. While advanced conductor technologies like ACCC® can significantly improve the efficiency and capacity of existing transmission corridors, technological innovation alone cannot resolve the grid’s structural challenges. Building new or upgraded transmission lines in the U.S. often requires complex permitting from multiple federal, state, and local agencies, and frequently faces legal opposition, especially from communities invoking Not-In-My-Backyard (NIMBY) objections. Today, the average timeline to construct new interstate transmission infrastructure stretches between 10 and 12 years, an untenable lag in an era when grid reliability is under increasing stress. In 2024, the Federal Energy Regulatory Commission (FERC) reported that more than 2,600 gigawatts (GW) of clean energy and storage projects were stalled in the interconnection queue, waiting for sufficient transmission capacity. The consequences affect not only industrial sectors like data centers but also residential areas vulnerable to brownouts and peak load disruptions. What is the New Technology? At the center of the initiative is CTC Global’s ACCC® (Aluminum Conductor Composite Core) advanced conductor, a next-generation overhead transmission technology engineered to boost grid

Read More »

CoreSite’s Denver Power Play: Acquisition of Historic Carrier Hotel Supercharges Interconnection Capabilities

In this episode of the Data Center Frontier Show podcast, we unpack one of the most strategic data center real estate moves of 2025: CoreSite’s acquisition of the historic Denver Gas and Electric Building. With this transaction, CoreSite, an American Tower company, cements its leadership in the Rocky Mountain region’s interconnection landscape, expands its DE1 facility, and streamlines access to Google Cloud and the Any2Denver peering exchange. Podcast guests Yvonne Ng, CoreSite’s General Manager and Vice President for the Central Region, and Adam Post, SVP of Finance and Corporate Development, offer in-depth insights into the motivations behind the deal, the implications for regional cloud and network ecosystems, and what it means for Denver’s future as a cloud interconnection hub. Carrier Hotel to Cloud Hub Located at 910 15th Street in downtown Denver, the Denver Gas and Electric Building is widely known as the most network-dense facility in the region. Long the primary interconnection hub for the Rocky Mountains, the building has now been fully acquired by CoreSite, bringing ownership and operations of the DE1 data center under a single umbrella. “This is a strategic move to consolidate control and expand our capabilities,” said Ng. “By owning the building, we can modernize infrastructure more efficiently, double the space and power footprint of DE1, and deliver an unparalleled interconnection ecosystem.” The acquisition includes the facility’s operating businesses and over 100 customers. CoreSite will add approximately 3 critical megawatts (CMW) of data center capacity, nearly doubling DE1’s footprint. Interconnection in the AI Era As AI, multicloud strategies, and real-time workloads reshape enterprise architecture, interconnection has never been more vital. CoreSite’s move elevates Denver’s role in this transformation. With the deal, CoreSite becomes the only data center provider in the region offering direct connections to major cloud platforms, including the dedicated Google Cloud Platform

Read More »

Texas Senate Bill 6: A Bellwether On How States May Approach Data Center Energy Use

Texas isn’t the first state to begin attempting to regulate energy use statewide. The impact of this legislation could shape how other states, of which there are at least a dozen in process, could shape their own programs. What are Other States Doing? There’s a clear shift toward targeted utility regulation for mega-load data centers. States are increasingly requiring cost alignment, with large consumers bearing infrastructure costs rather than residential cross-subsidization and implementing specialized contract/tariff terms, taking advantage of these huge contracts to uniquely tailor each contract. These agreements are also being used to enforce environmental responsibility through reporting mandates and permitting. And for those estates still focusing on incentivization to draw data center business, coupling incentives with guardrails, balancing investment attraction with equitable distribution. What follows is a brief  overview of U.S. states that have enacted or proposed special utility regulations and requirements for data centers. The focus is  on tariffs, cost-allocation mechanisms, green mandates, billing structures, and transparency rules. California SB 57 (2025): Introduces a special electricity tariff for large users—including data centers—with embedded zero-carbon procurement targets, aiming to integrate grid reliability with emissions goals. AB 222 (2025): Targets consumption transparency, requiring data centers to report energy usage with a specific focus on AI-driven load. Broader California Public Utilities  actions: Proposals for efficiency mandates like airflow containment via Title 24; opening utility rate cases to analyze infrastructure cost recovery from large consumers. Georgia Public Service Commission  rule changes (January 2025): Georgia Power can impose minimum billing, longer contract durations, and special terms for customers with loads >100 MW—chiefly data centers. SB 34: Mandates that data centers either assume full infrastructure costs or pay equitably—not distributing these costs to residential users. Ohio AEP Ohio proposed in 2024: For loads >25 MW (data centers, crypto), demand minimum charges, 10-year contracts, and exit penalties before new infrastructure

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs).  In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple would between them devote $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences their own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% percent of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »