
From code to current: How to keep AI data centers in check for a sustainable grid

Manav Mittal is a senior project manager at Consumers Energy.

As artificial intelligence continues to transform industries, from healthcare and finance to autonomous vehicles and smart cities, the demand for data processing is skyrocketing. AI-driven data centers, which power the algorithms behind these innovations, are the backbone of this revolution. However, with the expansion of AI capabilities comes a growing concern: how will these energy-hungry facilities affect our already strained power grids?

Take Meta’s $10 billion AI-optimized data center in Louisiana, for example. This enormous facility, designed to handle the massive computational load required by AI, will demand a staggering amount of electricity. As AI becomes more integrated into our everyday lives, the strain on the power grid is only set to increase. But here’s the thing — AI doesn’t have to be a burden on the grid. With thoughtful strategies and a proactive approach, we can minimize the environmental and infrastructural costs of these data centers. The question isn’t whether AI will disrupt the grid, but how we can make it work for us without sacrificing sustainability.

Energy efficiency: The first line of defense

It’s easy to think of data centers as mere consumers of energy, but the truth is, they’re not all created equal. There’s plenty of room for improvement when it comes to energy efficiency. The first step in minimizing AI data center impacts on the grid is simply making these centers run more efficiently.

Cooling systems alone account for a huge chunk of energy consumption in data centers. Traditionally, large HVAC systems keep servers at optimal temperatures, but these systems are often inefficient. Thankfully, innovative cooling methods — like liquid cooling and even immersion cooling — are beginning to replace outdated systems. These newer technologies can significantly reduce energy usage, which is crucial when every watt counts.
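To put the stakes in perspective, a rough back-of-the-envelope comparison helps. Power usage effectiveness (PUE), the ratio of total facility energy to IT equipment energy, captures how much of a data center's power goes to overhead like cooling. The sketch below uses hypothetical figures, not numbers from any specific facility:

```python
# Illustrative only: the IT load and PUE values below are hypothetical assumptions,
# not figures from the article. PUE = total facility energy / IT equipment energy,
# so everything above 1.0 is overhead such as cooling and power conversion.

def facility_energy_mwh(it_load_mw: float, pue: float, hours: float = 8760) -> float:
    """Annual facility energy for a constant IT load at a given PUE."""
    return it_load_mw * pue * hours

it_load_mw = 50.0                                    # hypothetical 50 MW AI cluster
legacy = facility_energy_mwh(it_load_mw, pue=1.6)    # conventional air cooling
liquid = facility_energy_mwh(it_load_mw, pue=1.2)    # liquid or immersion cooling

print(f"Legacy cooling: {legacy:,.0f} MWh/yr")
print(f"Liquid cooling: {liquid:,.0f} MWh/yr")
print(f"Savings:        {legacy - liquid:,.0f} MWh/yr")
```

Under these assumed numbers, the more efficient cooling design avoids roughly 175,000 MWh a year, a difference any local grid would notice.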

And it’s not just cooling that needs to be rethought. Advances in hardware, such as more energy-efficient processors and GPUs, are improving the performance-to-energy ratio of data centers. These small innovations might not make the headlines, but their cumulative impact on energy consumption could be profound. Data centers should be incentivized to adopt these energy-saving technologies, not only to reduce their operating costs but to lessen their impact on the grid.
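As a rough illustration of what a better performance-to-energy ratio buys, the sketch below compares the energy needed to finish the same fixed amount of compute at two hypothetical efficiency levels. The workload size and TFLOPS-per-watt figures are assumptions, and overheads such as cooling, networking and idle time are ignored:

```python
# Illustrative only: hypothetical workload and efficiency figures. The point is that
# doubling performance per watt roughly halves the energy for the same compute.

TOTAL_FLOP = 3e23                        # hypothetical large training run, total FLOPs

def training_energy_mwh(tflops_per_watt: float) -> float:
    """Energy to complete TOTAL_FLOP at a given efficiency (TFLOPS per watt)."""
    joules = TOTAL_FLOP / (tflops_per_watt * 1e12)   # 1 TFLOPS/W = 1e12 FLOP per joule
    return joules / 3.6e9                            # joules -> MWh

print(f"1.0 TFLOPS/W: {training_energy_mwh(1.0):,.0f} MWh")
print(f"2.0 TFLOPS/W: {training_energy_mwh(2.0):,.0f} MWh")
```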

Renewable energy: A cleaner, greener future

Let’s be clear — data centers don’t have to rely on fossil fuels to power their operations. In fact, many major tech companies, including Meta, have made ambitious commitments to run their data centers on 100% renewable energy. This shift to clean energy is one of the most impactful ways to reduce the strain on the grid. If AI data centers can be powered by wind, solar and other renewable sources, we’re looking at a win-win situation: energy demand is met without contributing to greenhouse gas emissions.

However, making this transition requires more than just goodwill — it requires collaboration with renewable energy developers and utilities. Power purchase agreements are a vital tool here. These long-term contracts allow data centers to secure renewable energy directly from producers, ensuring that their electricity needs are met without disrupting the grid. The beauty of this approach is that it supports the broader goal of transitioning to a clean energy economy, all while minimizing the impact on local power infrastructure.

But let’s not stop there. Data centers should also consider on-site renewable energy generation. Installing solar panels or wind turbines at their facilities can reduce their reliance on the grid during peak demand periods. In fact, on-site energy production, combined with energy storage, could allow data centers to be largely self-sufficient, alleviating much of the pressure on local grids.

Modernizing the grid: Building for the future

While improving the energy efficiency of data centers and shifting to renewable energy are essential steps, we can’t ignore the infrastructure itself. The grid, as it exists today, was not built to handle the enormous, and sometimes unpredictable, energy demands of AI data centers. As data centers become larger and more prevalent, the grid needs to evolve to accommodate them.

Here’s where smart grids come into play. These modernized grids use sensors and real-time data to better manage energy distribution. With a smart grid, utilities can dynamically adjust power flow based on demand, ensuring that energy is directed where it’s needed most. By integrating AI into grid management, utilities can anticipate and respond to shifts in energy demand caused by data centers, ensuring a more stable grid overall.
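In miniature, the logic looks something like the sketch below: forecast near-term demand on a feeder serving a data center and flag the hours that approach capacity, so operators can reroute power or call on flexibility before the limit is hit. The feeder capacity, alert threshold and demand readings are hypothetical, and real utility systems use far richer forecasting models:

```python
# A minimal sketch, not a real utility system: all capacities and readings here are
# made-up values for illustration.

from statistics import mean

FEEDER_CAPACITY_MW = 120.0              # hypothetical capacity of the local feeder
ALERT_THRESHOLD = 0.9                   # flag anything above 90% of capacity

hourly_demand_mw = [82, 85, 90, 97, 104, 110, 113, 108]   # recent readings (made up)

def forecast_next_hour(history: list[float], window: int = 3) -> float:
    """Naive moving-average forecast over the last few readings."""
    return mean(history[-window:])

forecast = forecast_next_hour(hourly_demand_mw)
if forecast > ALERT_THRESHOLD * FEEDER_CAPACITY_MW:
    print(f"Forecast {forecast:.1f} MW nears capacity: trigger demand-side actions.")
else:
    print(f"Forecast {forecast:.1f} MW is within the normal operating range.")
```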

In addition to smart grids, we need to consider energy storage. Renewable energy is intermittent by nature — solar panels don’t generate electricity at night, and wind turbines are silent on calm days. By incorporating energy storage systems, such as large-scale batteries, data centers can store excess energy generated during off-peak hours and use it when demand is high. This will help to smooth out the fluctuations in energy supply and ensure that data centers are less reliant on the grid during peak times.
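A simple peak-shaving schedule illustrates the idea. In the toy model below, a battery charges whenever the facility is outside the peak window and discharges during it, so the data center draws less from the grid exactly when the grid is most stressed. The battery capacity, power rating, peak window and flat load are assumed values, not figures from any real facility:

```python
# A toy peak-shaving schedule built on assumed numbers; real dispatch would respond
# to prices, forecasts and state-of-charge limits rather than fixed hours.

BATTERY_CAPACITY_MWH = 40.0
MAX_RATE_MW = 10.0
PEAK_HOURS = set(range(17, 21))          # assume a 5-9 pm local peak window

def grid_draw(hour: int, load_mw: float, soc_mwh: float) -> tuple[float, float]:
    """Return (grid draw in MW, updated battery state of charge in MWh) for one hour."""
    if hour in PEAK_HOURS and soc_mwh > 0:
        discharge = min(MAX_RATE_MW, load_mw, soc_mwh)
        return load_mw - discharge, soc_mwh - discharge
    if hour not in PEAK_HOURS and soc_mwh < BATTERY_CAPACITY_MWH:
        charge = min(MAX_RATE_MW, BATTERY_CAPACITY_MWH - soc_mwh)
        return load_mw + charge, soc_mwh + charge
    return load_mw, soc_mwh

soc = 0.0
for hour in range(24):
    load = 30.0                          # flat 30 MW data center load for simplicity
    draw, soc = grid_draw(hour, load, soc)
    print(f"{hour:02d}:00  grid draw {draw:5.1f} MW  battery {soc:5.1f} MWh")
```

In this toy run, the grid sees a 20 MW draw during the peak window instead of 30 MW, with the difference shifted to hours when spare capacity is cheap and plentiful.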

Demand response: A shared responsibility

But why stop there? AI-driven facilities have a responsibility to participate in demand response programs. These programs incentivize businesses and consumers to reduce their energy usage during periods of peak demand, which helps prevent grid overloads. Data centers are prime candidates for demand response because they can adjust their operations — such as shifting workloads to off-peak hours — without negatively impacting performance. By participating in these programs, AI data centers can significantly ease pressure on the grid, especially during high-demand periods, like hot summer afternoons when air conditioning use is at its peak.
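In practice, the shift can be as simple as tagging workloads as deferrable or not and moving the deferrable ones out of the event window. The sketch below uses hypothetical jobs and a hypothetical 2-7 p.m. event window; a production scheduler would also weigh deadlines, service-level agreements and energy prices:

```python
# A minimal sketch with hypothetical jobs: defer flexible work (batch training,
# archival) out of a demand-response event while leaving latency-sensitive
# services untouched.

from dataclasses import dataclass

PEAK_WINDOW = range(14, 19)              # assume a 2-7 pm demand-response event

@dataclass
class Job:
    name: str
    requested_hour: int
    deferrable: bool

jobs = [
    Job("inference-api", 15, deferrable=False),
    Job("model-training-batch", 16, deferrable=True),
    Job("log-archival", 17, deferrable=True),
]

def schedule(job: Job) -> int:
    """Shift deferrable jobs to the first hour after the event window."""
    if job.deferrable and job.requested_hour in PEAK_WINDOW:
        return max(PEAK_WINDOW) + 1
    return job.requested_hour

for job in jobs:
    print(f"{job.name:22s} requested {job.requested_hour}:00 -> runs at {schedule(job)}:00")
```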

The key here is that grid stability is a shared responsibility. While AI data centers are heavy consumers of electricity, they also have the tools to manage their consumption intelligently. Rather than adding to the grid’s burden, these facilities can be part of the solution. Through demand response, they can reduce their energy use when it’s most needed, helping to balance supply and demand and prevent power outages.

Collaboration: A holistic approach to grid sustainability

It’s clear that minimizing the impact of AI data centers on the power grid isn’t a task for data center operators alone. This challenge requires collaboration among technology companies, utilities, policymakers and local communities. Governments must provide the right incentives to encourage the adoption of clean energy and energy-efficient technologies. At the same time, utility companies must modernize the grid to accommodate the growing demands of AI data centers and other large energy consumers.

We also need to prioritize transparency and dialogue with communities. Local governments and residents should be included in conversations about how AI data centers impact energy infrastructure. Through collaboration, we can ensure that these facilities contribute positively to both the local economy and the environment.

Conclusion: A vision for a sustainable future

The rise of AI presents enormous opportunities for innovation, but it also poses significant challenges, particularly when it comes to energy consumption. AI data centers are indispensable to the future of technology, but they must be built in a way that minimizes their impact on the power grid and the environment.

By focusing on energy efficiency, incorporating renewable energy, modernizing grid infrastructure and participating in demand response programs, we can reduce the strain AI data centers place on the grid. Ultimately, it’s about balancing progress with sustainability. As we move toward a cleaner, smarter and more connected future, we must ensure that the rise of AI doesn’t come at the expense of our planet — or our power systems.
