Stay Ahead, Stay ONMINE

New York could reach 8.5 GW of grid flexibility in 2040 zero-emissions scenario: Brattle Group

Dive Brief:

  • Electric vehicles, HVAC systems and other grid-connected distributed energy resources could provide New York with 8.5 GW of cost-effective grid flexibility by 2040, Brattle Group said last month in a report prepared for the New York Department of Public Service and the New York State Energy Research and Development Authority.
  • The report estimates New York’s grid flexibility potential in a world where it achieves its statutory mandate for 100% zero-emissions power generation by 2040 and has electrified 60% of heating systems, 6.4 million EVs on the road and more than 2 GW of behind-the-meter energy storage capacity.
  • By 2040, grid flexibility could address approximately 25% of New York’s expected peak load “that is not served by renewable generation,” approaching the scale of other utility-scale flexible resources, Brattle said. Technology adoption could affect total cost savings, however.

Dive Insight:

Brattle’s 8.5-GW estimate represents a nearly six-fold increase from New York’s present 1.4-GW grid flexibility capability and is higher than other recent 2040 flexibility potential estimates cited in the report. NYSERDA’s most recent integration analysis estimates 2040 flexibility potential between 2.3 GW and 3.5 GW, while PEAK Coalition’s October 2024 estimates range from 3.9 GW to 7.8 GW.

The discrepancy is largely due to the Brattle study including a broader range of technologies and grid flexibility options, the report said. The PEAK Coalition scenarios only include flexibility from EV charging and electric heating, while NYSERDA’s high-flexibility scenario omits behind-the-meter storage, vehicle-to-grid charging, time-varying rates and demand response from large commercial and industrial loads.

The findings offer “estimates of potential, not forecasts of what is most likely to happen in the future unless addressable barriers that currently limit grid flexibility expansion are overcome,” the report said.

Brattle’s estimate is sensitive to the pace of flexible resource adoption and to outstanding questions about the true flexibility potential of technologies like heat pumps, Brattle Group partner and report co-author Ryan Hledik said in an email.

“If end-uses like heat pumps and EVs are adopted at levels that are lower than the policy targets, all else equal, our estimates of grid flexibility potential would be lower,” Hledik said.

With regard to heat pumps’ flexibility potential, early studies cited in the report suggest the technology can play an important role on the grid, but “more experimentation and deployment is needed to ensure that it will work at scale with a positive customer experience,” Hledik said.

And while New York will need “a lot of flexibility in order to reliably support the state’s clean energy goals,” the actual generation mix in 2040 could affect potential cost savings from grid flexibility, Hledik said.

The annual cost of avoided generation is expected to be greater than $200/kW on a fully decarbonized New York grid in 2040, per National Renewable Energy Laboratory data, with a hydrogen turbine as the marginal generator, Brattle said. Reducing the 2040 capacity cost by 50% lowers potential grid flexibility by just 15% in summer and 9% in winter, indicating “that the broad findings of this study are not dependent on capacity costs reaching $200/kW [per year],” it said.
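As a back-of-envelope illustration of the scale these figures imply (the 8.5-GW potential and the $200/kW-year capacity cost come from the report; treating annual value as a simple product of the two is a simplifying assumption of ours, not Brattle's methodology):

```python
def avoided_capacity_value(flexibility_gw: float, cost_per_kw_year: float) -> float:
    """Rough annual value, in dollars, of flexibility that displaces firm capacity."""
    kw = flexibility_gw * 1_000_000  # 1 GW = 1,000,000 kW
    return kw * cost_per_kw_year

base = avoided_capacity_value(8.5, 200)    # report's central assumptions
halved = avoided_capacity_value(8.5, 100)  # sensitivity case: capacity cost cut 50%
print(f"${base / 1e9:.2f}B/yr vs ${halved / 1e9:.2f}B/yr")
```

At the report's assumptions this simple product works out to roughly $1.7 billion per year, which is why the study's findings are fairly robust even if capacity costs come in well below $200/kW-year.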

“Even if New York’s power grid doesn’t evolve that dramatically, I still would expect our findings to hold at a fundamental level, though probably with a lower estimate of total overall cost savings,” Hledik said.  

Because flexible resources are located at the grid edge, they have “a lot of different use cases … if it turns out that the system doesn’t need as much resource adequacy as expected, grid flexibility could be used to provide distribution-level benefits, and vice versa,” he added.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, Bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Nvidia claims near 50% boost in AI storage speed

Storage is an overlooked element of AI that has been overshadowed by all the emphasis on processors, namely GPUs. Large language models (LLMs) measure in the terabytes, and all that data needs to be moved around to be processed. So the faster you can move data, the better.


Kyndryl expands Palo Alto deal to offer managed SASE service

Kyndryl has expanded its alliance with Palo Alto Networks to add secure access service edge (SASE) services to its managed services offerings. In 2023, when Kyndryl first said it would integrate Palo Alto’s security products and services into its own managed security services, the vendors said they would ultimately support


AI-Powered Policing: The Future of Traffic Safety in Kazakhstan

Traffic management is a growing challenge for cities worldwide, requiring a balance between enforcement, efficiency, and public trust. In Kazakhstan, the Qorgau system is redefining road safety through an innovative fusion of artificial intelligence (AI), computer vision, and mobile technology. Designed to assist traffic police in real-time violation detection and


Quantum networking advances on Earth and in space

“Currently, the U.S. government is not investing in such testbeds or demonstrations, ensuring it will be a follower and not a leader in the development of technical advances in the field,” said a report released last year by the Quantum Economic Development Consortium, a global association of more than 250


Chevron Is in Contact With Trump Administration Over Venezuela

Chevron Corp. is in communication with the Trump administration as some Republicans call for the US to revoke the company’s license to operate in Venezuela. “We are in contact with the current administration,” Chevron Chief Executive Officer Mike Wirth said in an interview Wednesday on Bloomberg Television. “We work closely with the government to understand their objectives, to understand their limitations.” Chevron is the only major oil producer with a waiver from the US government to operate in Venezuela despite sanctions against President Nicolás Maduro’s regime. The company produces about 20% of Venezuela’s oil and helped boost exports to a five-year high in 2024, nearing Maduro’s goal of 1 million barrels per day. Secretary of State Marco Rubio and other foreign-policy hawks, however, say Chevron provides a financial lifeline for Maduro’s regime to enrich itself and suppress civil rights. Wirth said Chevron tries to help officials in Washington understand the potential ramifications of Chevron pulling out of Venezuela, including the impact on US refiners that depend on the heavy crude produced there.



Equinor slashes renewables, doubles down on hydrocarbons in focus on returns

Equinor’s dramatic renewable cuts and increased focus on oil and gas mirror moves across the energy sector. The Norwegian major is prioritising shareholder returns over energy transition spending. CEO Anders Opedal cited growing energy demand, market uncertainty and an “uneven pace” in the energy transition as drivers for the strategic shift. “We are well positioned to create value in the context” of growing energy demand, Opedal said at the company’s capital markets day. Opedal said the energy transition was “moving fast in some markets but slow in most. Even the massive renewable growth is currently energy addition, not energy transition.” He attributed the challenges of the energy transition to inflation, interest rates, supply chain issues and regulatory uncertainty. “Segments like offshore wind and hydrogen are impacted. We adapt to these realities … by maximising returns.” Equinor had planned to reach 12-16 GW of renewable energy capacity by 2030. It has now cut this to 10-12 GW. “We are lowering our renewables ambition for 2030, we introduce a range for net carbon intensity and we retire our gross capex ambition. But the strategic direction is the same,” Opedal said. While the company will continue investing in projects such as the US’ Empire Wind, hydrocarbons are clearly more profitable. CFO Torgrim Reitan said the “lifecycle return” on the project was “close to 10% nominal equity return”. He went on to describe it as a “challenging project” that was “not great but it is ok”. For upstream projects, Equinor is targeting an internal rate of return of around 30% over the next 10 years. Finding focus Equinor aims to return $9 billion to shareholders this year. Of this, $4bn will come in dividends and $5bn in buybacks. “The largest driver of this is an $8bn capex reduction” over the next three years. “We


Energy Voices: why carbon capture matters at Morecambe

Spirit Energy has concrete plans for enabling a net zero energy system. Following the sale of oil-producing fields in the Norwegian North Sea in 2022, the firm, which is majority owned by Centrica (LON: CNA), has become almost entirely focused on gas production. And now it is looking at how to apply its experience in this area to the next stage of the energy transition: storing carbon. Susan Mackenzie is leading Spirit’s ambitious plans to repurpose depleted gas fields to create the UK’s largest permanent carbon store, capable of storing one billion tons of CO2. Mackenzie moved to Aberdeen in 1999 to complete her MSc in environmental sciences. Her career started in the oil and gas industry, about which she admits she felt “conflicted” following her training. “But you can work that conflict out because you need power, then you can try and get new developments to mitigate impact on the environment.” Then the opportunity came with Spirit Energy to lead the Morecambe Net Zero (MNZ) project, which she did not hesitate to take. “It was really easy,” she recalls when asked if she wanted to join the team. Spirit Energy’s Morecambe hub has been producing gas for 40 years. The firm estimates it could meet up to a quarter of the UK’s carbon storage capacity needs from 2030. MNZ is not currently part of the UK’s carbon capture and storage (CCS) track system. Last year the UK government allocated £22 billion in support for two CCS projects, one of which is near MNZ on Merseyside. Mackenzie envisages going beyond current government allocations. “We need to do it. And we need to find a route to it for not just two, not just four, but many projects,” she said. “MNZ is probably the best project because of the size



Banks improving clean energy finance ratios, still well short of net-zero pace: report

Dive Brief: Global banks spent 89 cents on financing low-carbon energy supply technologies for every dollar spent on fossil fuels in 2023, according to a new report by BloombergNEF. That is an improvement from the 0.74-to-1 financing ratio from the prior year, but still far from the pace needed to reach net-zero. In terms of capital investments, low-carbon energy investments surpassed fossil fuels for the first time in 2023, with $1.2 trillion invested in low-carbon projects, compared to $1.1 trillion invested in fossil fuels, a 1.11-to-1 investment ratio, according to the report released Wednesday. While the rate of fossil fuel investment and bank-facilitated fossil fuel financing declined in 2023, the financing ratios fall well below what is needed to meet global net-zero emissions targets. “The ratio isn’t evolving at the pace needed to hit the 4:1 level required this decade under commonly referenced scenarios to limit climate change to 1.5 [degrees Celsius],” the report said. Dive Insight: The third annual report on energy supply investment and banking ratios from BloombergNEF — a Bloomberg subsidiary that provides research on the low-carbon transition — found that overall, bank-facilitated financing for global energy supply fell for the second year in a row to $1.6 trillion in 2023 from $1.8 trillion in 2022. Capital investments in energy supply increased, reaching $2.3 trillion, up from $2.1 trillion the year prior and $1.8 trillion in 2021. The decline in energy supply financing was more dramatic for fossil fuels, however, than for low-carbon energy sources. Bank-facilitated financing for fossil fuels fell 18% to $870 billion in 2023, while low-carbon energy sources saw just a 1.4% decrease in financing to $776 billion globally. BloombergNEF said part of the increase in the energy supply banking ratio is due to how it measures finance, as well as data gaps in China,
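The headline ratios can be reproduced from the dollar figures quoted in the excerpt (our arithmetic, not BloombergNEF's methodology; small discrepancies reflect rounding in the published figures):

```python
# Financing ratio: bank-facilitated low-carbon financing per dollar of
# fossil-fuel financing, using the 2023 figures above ($ billions).
low_carbon_financing = 776
fossil_financing = 870
print(f"{low_carbon_financing / fossil_financing:.2f}-to-1")  # 0.89-to-1, i.e. "89 cents"

# Investment ratio: the rounded $1.2T and $1.1T give roughly 1.09-to-1;
# BNEF's published 1.11-to-1 presumably reflects unrounded underlying data.
print(f"{1.2e12 / 1.1e12:.2f}-to-1")
```

The 776/870 quotient lands on 0.89, matching the "89 cents per dollar" framing in the brief.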


Linux containers in 2025 and beyond

The upcoming years will also bring about an increase in the use of standard container practices, such as the Open Container Initiative (OCI) standard, container registries, signing, testing, and GitOps workflows used for application development to build Linux systems. We’re also likely to see a significant rise in the use of bootable containers, which are self-contained images that can boot directly into an operating system or application environment. Cloud platforms are often the primary platform for AI experimentation and container development because of their scalability and flexibility, along with the integration of both AI and ML services. They’re giving birth to many significant changes in the way we process data. With data centers worldwide, cloud platforms also ensure low-latency access and regional compliance for AI applications. As we move ahead, development teams will be able to collaborate more easily through shared development environments and efficient data storage.


Let’s Go Build Some Data Centers: PowerHouse Drives Hyperscale and AI Infrastructure Across North America

PowerHouse Data Centers, a leading developer and builder of next-generation hyperscale data centers and a division of American Real Estate Partners (AREP), is making significant strides in expanding its footprint across North America, initiating several key projects and partnerships as 2025 begins.  The new developments underscore the company’s commitment to advancing digital infrastructure to meet the growing demands of hyperscale and AI-driven applications. Let’s take a closer look at some of PowerHouse Data Centers’ most recent announcements. Quantum Connect: Bridging the AI Infrastructure Gap in Ashburn On January 17, PowerHouse Data Centers announced a collaboration with Quantum Connect to develop Ashburn’s first fiber hub specifically designed for AI and high-density workloads. This facility is set to provide 20 MW of critical power, with initial availability slated for late 2026.  Strategically located in Northern Virginia’s Data Center Alley, Quantum Connect aims to offer scalable, high-density colocation solutions, featuring rack densities of up to 30kW to support modern workloads such as AI inference, edge caching, and regional compute integration. Quantum Connect said it currently has 1-3 MW private suites available for businesses seeking high-performance infrastructure that bridges the gap between retail colocation and hyperscale facilities. “Quantum Connect redefines what Ashburn’s data center market can deliver for businesses caught in the middle—those too large for retail colocation yet underserved by hyperscale environments,” said Matt Monaco, Senior Vice President at PowerHouse Data Centers. “We’re providing high-performance solutions for tenants with demanding needs but without hyperscale budgets.” Anchored by 130 miles of private conduit and 2,500 fiber pathways, Quantum Connect’s infrastructure offers tenants direct, short-hop connections to adjacent facilities and carrier networks.  
With 14 campus entrances and secure, concrete-encased duct banks, the partners said the new facility minimizes downtime risks and reduces operational costs by eliminating the need for new optics or extended fiber runs.


Blue Owl Swoops In As Major Backer of New, High-Profile, Sustainable U.S. Data Center Construction

With the global demand for data centers continuing to surge, fueled by the proliferation of artificial intelligence (AI), cloud computing, and digital services, it is unsurprising that we are seeing aggressive investment strategies beyond those of the existing hyperscalers. One of the dynamic players in this market is Blue Owl Capital, a leading asset management firm that has made significant strides in the data center sector. Back in October 2024 we reported on its acquisition of IPI Partners, a digital infrastructure fund manager, for approximately $1 billion. This acquisition added over $11 billion to the assets Blue Owl manages and focused specifically on digital infrastructure initiatives. The acquisition was completed as of January 5, 2025, and IPI’s Managing Partner, Matt A’Hearn, has been appointed head of Blue Owl’s digital infrastructure strategy. A Key Player In Digital Infrastructure and Data Centers With multi-billion-dollar joint ventures and financing initiatives, Blue Owl is positioning itself as a key player in the digital infrastructure space. The company’s investments in data centers, the implications of its strategic moves, and the broader impact on the AI and digital economy highlight the importance of data center investment to the economy overall. With the rapid growth of the data center industry, it is unsurprising that aggressive investment fund managers see it as an opportunity. Analysts continue to emphasize that the global data center market is expected to grow at a compound annual growth rate (CAGR) of 10.2% from 2023 to 2030, reaching $517.17 billion by the end of the decade. In this rapidly evolving landscape, Blue Owl Capital has emerged as a significant contributor. The firm’s investments in data centers are not just about capitalizing on current trends but also about shaping the future of digital infrastructure. Spreading the Wealth In August 2024, Blue Owl
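As a sanity check on the quoted forecast (our arithmetic only: the 10.2% CAGR and $517.17B end value are from the analysts cited above, and compounding over the seven years from 2023 to 2030 implies the 2023 base):

```python
def implied_base(end_value: float, cagr: float, years: int) -> float:
    """Starting value implied by an end value compounded at `cagr` for `years` years."""
    return end_value / (1 + cagr) ** years

# $517.17B in 2030 at 10.2% CAGR over 7 years implies a ~$262B market in 2023.
base_2023 = implied_base(517.17, 0.102, 7)
print(f"implied 2023 base: ${base_2023:.0f}B")
```

The implied starting point of roughly $260 billion is consistent with commonly cited 2023 data center market sizings, so the forecast's internal arithmetic holds together.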


Global Data Center Operator Telehouse Launches Liquid Cooling Lab in the UK to Meet Ongoing AI and HPC Demand

Starting in early 2025, Telehouse International Corporation of Europe will offer an advanced liquid cooling lab at its newest data center, Telehouse South at the London Docklands campus in Blackwall Yard. Telehouse has partnered with four leading liquid-cooling technology vendors — Accelsius, JetCool, Legrand, and EkkoSense — to allow customers to explore different cooling technologies and management tools while evaluating their suitability for use in customer applications. Dr.
Stu Redshaw, Chief Technology and Innovation Officer at EkkoSense, said about the project: “Given that it’s not possible to run completely liquid-cooled data centers, the reality for most data center operators is that liquid cooling and air cooling will have an important role to play in the cooling mix – most likely as part of an evolving hybrid cooling approach. However, key engineering questions need answering before simply deploying liquid cooling – including establishing the exact blend of air and liquid cooling technologies you’ll need. And also recognizing the complexity of managing the operation of a hybrid air cooling and liquid cooling approach within the same room. This increases the


Flexential Partners with Lonestar to Support First Lunar Data Center

Flexential, a leading provider of secure and flexible data center solutions, this month announced that it has joined forces with Lonestar Data Holdings Inc. to support the upcoming launch of Freedom, Lonestar’s second lunar data center. Scheduled to launch aboard a SpaceX Falcon 9 rocket via Intuitive Machines, this mission is a critical step toward establishing a permanent data center on the Moon. Ground-Based Support for Lunar Data Storage Flexential’s Tampa data center will serve as the mission control platform for Lonestar’s lunar operations, providing colocation, interconnection, and professional services. The facility was chosen for its proximity to Florida’s Space Coast launch operations and its ability to deliver low-latency connectivity for critical functions. Flexential operates two data centers in Tampa and four in Florida as part of its FlexAnywhere® Platform, comprising more than 40 facilities across the U.S. “Flexential’s partnership with Lonestar represents our commitment to advancing data center capabilities beyond conventional boundaries,” said Jason Carolan, Chief Innovation Officer at Flexential. “By supporting Lonestar’s space-based data center initiative, we are helping to create new possibilities for data storage and disaster recovery. This project demonstrates how innovative data center expertise can help organizations prepare for a resilient future with off-world storage solutions.” A New Era of Space-Based Resiliency The growing demand for data center capacity, with U.S. data center power consumption expected to double from 17 GW in 2022 to 35 GW by 2030 (according to McKinsey & Company), is driving interest in space-based solutions. Storing data off-planet reduces reliance on terrestrial resources while enhancing security against natural disasters, warfare, and cyber threats.
The Freedom data center will provide resiliency, disaster recovery, and edge processing services for government and enterprise customers requiring the highest levels of data protection. The solar-powered data center leverages Solid-State Drives (SSDs) and a Field Programmable Gate Array (FPGA) edge


Why DeepSeek Is Great for AI and HPC and Maybe No Big Deal for Data Centers

In the rapidly evolving landscape of artificial intelligence (AI) and high-performance computing (HPC), the emergence of DeepSeek’s R1 model has sent ripples across industries. DeepSeek has been the data center industry’s topic of the week, for sure. The Chinese AI app surged to the top of US app store leaderboards last weekend, sparking a global selloff in technology shares Monday morning. But while some analysts predict a transformative impact within the industry, a closer examination suggests that, for data centers at large, the furor over DeepSeek might ultimately be much ado about nothing. DeepSeek’s Breakthrough in AI and HPC DeepSeek, a Chinese AI startup, this month unveiled its R1 model, claiming performance on par with, or even surpassing, leading models like OpenAI’s ChatGPT-4 and Anthropic’s Claude-3.5-Sonnet. Remarkably, DeepSeek developed this model at a fraction of the cost typically associated with such advancements, utilizing a cluster of 256 server nodes equipped with 2,048 GPUs. This efficiency has been attributed to innovative techniques and optimized resource utilization. AI researchers have been abuzz about the performance of the DeepSeek chatbot that produces results similar to ChatGPT, but is based on open-source models and reportedly trained on older GPU chips. Some researchers are skeptical of claims about DeepSeek’s development costs and means, but its performance appears to challenge common assumptions about the computing cost of developing AI applications. Market Reactions and Data Center Implications The announcement of DeepSeek’s R1 model led to significant market reactions, with notable declines in tech stocks, including a substantial drop in Nvidia’s valuation.
This downturn was driven by concerns that more efficient AI models could reduce the demand for high-end hardware and, by extension, the expansive data centers that house them. For now, investors are re-assessing the


Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
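For scale, the Microsoft capex figures in the excerpt can be put side by side (all dollar amounts come from the text above; the growth multiple is our arithmetic):

```python
# Microsoft capital expenditure figures quoted above, in $ billions.
capex_billion = {
    "FY2025 planned (per Smith)": 80.0,
    "CY2025 AI estimate (Bloomberg Intelligence)": 62.4,
    "2020 actual": 17.6,
}

# Growth multiple of planned FY2025 spend over 2020 spend.
multiple = capex_billion["FY2025 planned (per Smith)"] / capex_billion["2020 actual"]
print(f"planned FY2025 capex is ~{multiple:.1f}x 2020 levels")
```

Even the lower Bloomberg Intelligence estimate is more than triple the 2020 figure, while the $80 billion plan works out to roughly four and a half times it.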


John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. Moline, Illinois-based John Deere has been in business for 187 years, yet as a non-tech company it has become a regular at the big tech trade show in Las Vegas, and it is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. That has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for other companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature the model providers are researching is a way to use the LLM as a judge, and, as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »
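The “LLM as a judge” pattern mentioned in the piece above can be sketched roughly as follows: several cheap models each grade an agent’s answer, and the majority verdict decides whether it is accepted. This is only an illustrative sketch; `call_model`, the judge model names, and the PASS/FAIL protocol are assumptions for demonstration, not anything from the cited podcast, and a real implementation would replace the stub with an actual chat-completion API call.

```python
from collections import Counter

def call_model(model: str, prompt: str) -> str:
    # Stub standing in for a real chat-completion call to a provider API.
    # Canned verdicts keep the sketch runnable without network access.
    canned = {"judge-a": "PASS", "judge-b": "PASS", "judge-c": "FAIL"}
    return canned[model]

def judge_answer(question: str, answer: str,
                 judges=("judge-a", "judge-b", "judge-c")) -> str:
    # Ask each judge model to grade the candidate answer.
    prompt = (
        f"Question: {question}\n"
        f"Answer: {answer}\n"
        "Reply with exactly PASS or FAIL."
    )
    verdicts = Counter(call_model(j, prompt) for j in judges)
    # Majority vote across three (or more) inexpensive judge models.
    return verdicts.most_common(1)[0][0]

print(judge_answer("What is 2 + 2?", "4"))  # → PASS under the stubbed verdicts
```

As models get cheaper, the panel can grow past three judges at little cost, which is the economic point the excerpt is making.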

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models using these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends. It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI and the U.S. National Institute of Standards and Technology (NIST), all of which had already released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see whether knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »