Stay Ahead, Stay ONMINE

Outdated billing systems are holding back the energy transition


Across the US, utilities are making significant advances in the energy transition. One of the most important is electrification: switching end-use devices like heating systems and vehicles away from fossil fuels and onto electricity. Because electrification – and the associated demand growth – could increase system costs if demand is left unmanaged, a second trend has emerged: demand management at the grid edge. Utilities are investing in new solutions to serve customer demand (particularly from new grid-edge devices like EVs and heat pumps) more efficiently. An important tool in a utility's demand management toolbox is pricing innovation – that is, developing new rate designs that more accurately communicate to customers the costs and benefits of their use of the power system. Between time-of-use rate designs, dynamic pricing models, and location-based incentives, utilities have a wealth of solutions at their disposal. But to implement these pricing innovations at scale, utilities will need modern billing technology that can keep up.

Implementing rate design innovation requires more than just smart meters

Over the last decade, US utilities have invested billions of dollars in advanced metering infrastructure (AMI). The transition from manually read meters to communications-enabled smart meters has allowed utilities to collect more granular consumption data from end customers. The value of these AMI investments depends on the outcomes utilities can achieve with this new, granular meter data. Utilities across the US are seeking to deliver that value by implementing time-varying rates for customers – something they could not do without granular meter data. However, many utilities are finding that even with modernized metering equipment, implementing new rate designs proves difficult and expensive if their customer IT systems (specifically, billing) have yet to be modernized. Coding a new rate design or product offering into a legacy billing system can take months to years and cost millions of dollars. This implementation bottleneck posed by legacy billing technology comes at the expense of market innovation and system efficiency – and its cost is ultimately borne by ratepayers.
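At its core, the capability AMI unlocks is time-varying billing: pricing each metered interval instead of a single monthly total. A minimal sketch of that calculation is below; the schedule, hours, and prices are purely illustrative and not any utility's actual tariff.

```python
from datetime import datetime

# Hypothetical time-of-use schedule (illustrative prices only):
# peak hours (4pm-9pm) cost more than all other hours.
TOU_PRICES = {
    "peak": 0.32,      # $/kWh
    "off_peak": 0.12,  # $/kWh
}

def price_for_hour(hour: int) -> float:
    """Return the $/kWh rate for a given hour of day."""
    return TOU_PRICES["peak"] if 16 <= hour < 21 else TOU_PRICES["off_peak"]

def bill_from_interval_data(reads: list[tuple[datetime, float]]) -> float:
    """Sum cost over hourly (timestamp, kWh) smart-meter reads."""
    return round(sum(kwh * price_for_hour(ts.hour) for ts, kwh in reads), 2)

reads = [
    (datetime(2025, 1, 15, 8), 1.0),   # off-peak hour
    (datetime(2025, 1, 15, 18), 2.0),  # peak hour
]
print(bill_from_interval_data(reads))  # 1.0*0.12 + 2.0*0.32 = 0.76
```

A legacy system built around one monthly register read has nowhere to put this per-interval logic, which is why the rate design, not the meter, becomes the bottleneck.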

What features should modern utility billing technology include?

Ability to utilize new types of data for billing purposes: utilities are collecting new types and formats of data that could be used for price-setting and billing, from increasingly granular consumption data (collected by smart meters) to individual preference information (collected through customer surveys). Billing technology must be able to efficiently make use of these new data types as they become available.

Configurability for new rate designs and pricing: as utilities explore new rate design offerings, they are moving from traditional flat volumetric rates to variable pricing structures that more accurately reflect marginal costs and benefits. Historically, implementing these structures has taken months or years and cost millions of dollars. Billing systems — and their necessary upgrades — should not serve as a bottleneck to rate design innovation.

Easy integration with the full customer IT stack: designing new products and programs will only pay off if customers actually make use of those new offerings. To ensure robust uptake and enrollment, utilities will need to actively engage their customers around new products and programs through tailored marketing and outreach, creative enrollment solutions, rate comparisons, and more. Done well, this engagement requires a utility's billing system to work with its other backend customer IT systems and seamlessly pass information between them.
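The configurability point is the crux: in a modern system, a rate design is data, not code, so launching a new offering means adding a catalog record rather than months of custom development. The sketch below illustrates that idea under assumed, hypothetical field names; it is not modeled on any vendor's actual schema.

```python
# Configuration-driven rate catalog (all names and prices are
# illustrative assumptions): each rate is described by data that a
# generic pricing engine interprets, so no new code per rate design.
RATE_CATALOG = {
    "flat_residential": {"type": "flat", "price": 0.15},
    "tou_residential": {
        "type": "tou",
        "periods": [
            {"hours": range(16, 21), "price": 0.32},  # peak window
            {"hours": range(0, 24), "price": 0.12},   # fallback: all hours
        ],
    },
}

def unit_price(rate_id: str, hour: int) -> float:
    """Look up the $/kWh price for a rate at a given hour of day."""
    rate = RATE_CATALOG[rate_id]
    if rate["type"] == "flat":
        return rate["price"]
    # First matching period wins, so list peak windows before fallbacks.
    for period in rate["periods"]:
        if hour in period["hours"]:
            return period["price"]
    raise ValueError(f"no period defined for hour {hour}")

print(unit_price("flat_residential", 10))  # 0.15
print(unit_price("tou_residential", 18))   # 0.32 (peak window)
```

Adding a third rate design here is one more dictionary entry; in a legacy system it is a bespoke development project.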

Modern billing technology in action: Octopus Energy’s Fan Club and Agile Octopus Tariff

Octopus Energy, a retail energy supplier serving 7.95 million customers, is known for their innovative rate designs. In 2021, Octopus Energy’s British retail business launched a new product for electric customers called Fan Club. Fan Club is designed to increase popular support for onshore wind development by offering customers who live near certain onshore wind projects a discount off their electric supply rate in hours when those projects generate electricity.

Using Kraken’s billing platform, Octopus Energy cross-references a customer’s account-level postcode against the postcodes of participating wind turbines (to check eligibility) and then calculates and assigns bill credits to individual participants based on precise time-varying determinants (e.g. energy market prices, wind speed). Without Kraken, these billing processes would likely have been split across multiple software programs – a complicated and expensive arrangement. With Kraken, Octopus Energy implemented Fan Club from a single billing platform in a matter of weeks.
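The two steps described above – an eligibility check against participating locations, then a time-varying discount – can be sketched as follows. This is an illustrative reconstruction under assumed names and numbers, not Kraken's actual implementation or Octopus Energy's real tariff values.

```python
# Hypothetical Fan Club-style logic: customers whose postcode area
# matches a participating wind project get a discount in hours when
# that project is generating. All constants are assumptions.
PARTICIPATING_AREAS = {"CB6", "PE15"}  # hypothetical turbine postcode areas
BASE_RATE = 0.28                       # $/kWh standard supply rate
WIND_DISCOUNT = 0.5                    # 50% off during generating hours

def is_eligible(postcode_area: str) -> bool:
    """Eligibility check: does the customer live near a participating turbine?"""
    return postcode_area in PARTICIPATING_AREAS

def hourly_charge(postcode_area: str, kwh: float, turbine_generating: bool) -> float:
    """Charge for one hour of consumption, applying the wind discount
    when the customer is eligible and the local project is generating."""
    rate = BASE_RATE
    if is_eligible(postcode_area) and turbine_generating:
        rate *= 1 - WIND_DISCOUNT
    return round(kwh * rate, 4)

print(hourly_charge("CB6", 2.0, turbine_generating=True))   # 0.28 (discounted)
print(hourly_charge("CB6", 2.0, turbine_generating=False))  # 0.56 (standard)
```

The point of the example is not the arithmetic but where it lives: one platform holds the eligibility data, the real-time generation signal, and the bill calculation, instead of three systems passing files between them.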

Agile Octopus is another innovative smart rate that exposes customers directly to wholesale electricity market prices, using half-hourly market data. Customers enrolled on Agile Octopus are exposed to “plunge pricing,” which occurs in hours when more electricity is generated than consumed, resulting in negative price events. Customers receive alerts when wholesale prices drop below zero and can use an API from Kraken to program their smart devices (e.g. EVs) to charge when prices are negative. By consuming electricity when prices are negative, customers get paid for the electricity they use. This rate design is incredibly complex to bill because it relies on dynamic wholesale prices as key billing determinants. To implement this rate for customers, Octopus Energy required a modern billing system — which they found in Kraken.
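Settlement under plunge pricing reduces to pricing each half-hourly period at the prevailing wholesale-linked rate, with negative periods becoming credits. A minimal sketch, with made-up prices (not real Agile Octopus rates):

```python
# Illustrative plunge-pricing settlement: each 30-minute period is
# billed at that period's price, and negative prices pay the customer.
def settle(half_hourly: list[tuple[float, float]]) -> float:
    """half_hourly: (price $/kWh, kWh consumed) per 30-minute period.
    Returns the net amount due; a negative result is a customer credit."""
    return round(sum(price * kwh for price, kwh in half_hourly), 4)

periods = [
    (0.20, 0.5),   # normal period: customer pays
    (-0.05, 3.0),  # plunge period: customer is paid to consume
]
print(settle(periods))  # 0.10 - 0.15 = -0.05, a net credit
```

Simple as the arithmetic is, a legacy system keyed to one price per billing cycle cannot express it: the billing determinant here is a stream of 48 prices per day, which is exactly the data-model flexibility the article argues for.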

Modern billing technology is foundational to the energy transition

To implement advanced rate designs that accelerate the energy transition and make full use of new data that is being collected at the grid edge, utilities will need to modernize their customer billing technology. Information technology vendors like Kraken are enabling this modernization by developing billing systems that allow for agility, scalability and innovation.

Shape
Shape
Stay Ahead

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy,  bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Shape

Fortinet speeds threat detection with improved FortiAnalyzer

The package also now integrates with FortiAI, the vendor’s genAI assistant, to better support analytics and telemetry to help security teams speed threat investigation and response, the vendor stated. “FortiAI identifies the threats that need analysis from the data collected by FortiAnalyzer, primarily collected from FortiGates. By automating the collection,

Read More »

Aryaka adds AI-powered observability to SASE platform

Nadkarni explained that Aryaka runs unsupervised machine learning models on the data to identify anomalies and outliers in the data. For example, the models may detect a sudden spike in traffic to a domain that has not been seen before. This unsupervised analysis helps surface potential issues or areas of

Read More »

Trump EPA Keeps Biden Era Gasoline Change Meant to Boost Ethanol

The Trump administration is holding firm to an April fuel policy change meant to bolster sales of corn-based ethanol, despite some oil industry warnings it could raise gasoline costs and cause fuel supply disruptions.  The Environmental Protection Agency will keep to April 28 as the implementation date for ending special treatment that waives conventional E10 gasoline from fuel volatility limits in as many as eight Midwestern states, Administrator Lee Zeldin said in a statement. The change would effectively put E10, which contains 10% ethanol, on the same regulatory footing as higher-ethanol E15 gasoline and allow both varieties to use the same raw gasoline blendstock. That raw unblended fuel would have to be less evaporative in the affected states.  The shift is meant to enable both fuel blends to be sold widely during the summer, where the existing policy often keeps E15 out of the market. The change was sought by Midwestern governors and first charted under former President Joe Biden. At the same time, the EPA will consider granting one-year delays for states that seek additional compliance time, something already sought by Ohio. The decision is an early indicator of the new Trump administration’s approach to biofuel policy. During Donald Trump’s first term, some debates pitting oil refiners against biofuel producers reached the president himself, prompting the EPA to weigh policy shifts on the treatment of refiners and federal quotas mandating use of the alternative fuels.  The producers of biofuel — and the corn and soybeans used to make it — are a politically important constituency in Washington. But Trump has also repeatedly vowed to unleash American energy and take advantage of the country’s abundant oil and gas resources for economic and geopolitical gain. The president has also made combating inflation and high consumer costs a signature of his second term.  “Today’s

Read More »

Norway Raises Estimates for Oil and Gas Resources

Estimated resource volumes on the Norwegian continental shelf rose 36 million standard cubic meters of oil equivalent (scmoe) to 15.61 billion scmoe – before accounting for production – as of year-end 2024, Norway’s upstream regulator has reported. The total figure consisted of 8.73 billion scmoe produced, 2.26 billion scmoe of reserves, 651 million scmoe of contingent resources in fields, 472 million scmoe of contingent resources in discoveries and 3.5 billion scmoe of undiscovered resources. The produced volume rose 239 million scmoe from 2023, while reserves dropped 205 million scmoe. Total contingent resources fell 17 million scmoe against 2023. Undiscovered resources climbed 20 million scmoe against 2023. The increase in undiscovered resources came from opened areas, with no change in undiscovered resources in unopened areas. “This change results from a reduction in undiscovered resources in the North Sea, coupled with increases in the Barents Sea and in the Norwegian Sea”, the Directorate said. “Large areas in the Barents Sea have yet to be opened for petroleum activity, and this is where the greatest expected value for undiscovered resources can be found”, it noted. In terms of the kind of resources, undiscovered volumes in 2024 “show a decrease in the volume of liquids and an increase in natural gas volume”, the Directorate said. “Overall, this yields a minor increase in the overall volume of undiscovered resources. This marks a shift from previous years when the volume of undiscovered resources began decreasing from 2018”. For discovered resources, “liquids have remained unchanged since 2022, while there is still growth in discovered resources for natural gas”, it reported. The Norwegian side of the North Sea had 370 million scmoe of undiscovered liquids as of 2024, all from opened areas. Undiscovered gas in the area totaled 205 million scmoe, all from opened areas. In the Norwegian Sea, undiscovered

Read More »

OPEC Says it Welcomes Decision Approved by CNPE

In a statement posted on its website recently, OPEC said it “welcomes the decision approved … by the National Energy Council of the Federative Republic of Brazil (CNPE) that formally paves the way for the participation of Brazil as a Member of the Charter of Cooperation (CoC) between oil producing countries”. OPEC noted in its statement that the decision comes after an initial announcement made by Alexandre Silveira, Minister of Mines and Energy of Brazil, at the 36th OPEC and non-OPEC Ministerial Meeting on November 30, 2023. OPEC highlighted in its statement that “this followed the historic official visit of HE Haitham Al Ghais, OPEC Secretary General, to Brazil in October 2023 – the first ever by an OPEC Secretary General to the South American nation”. “Under the able leadership and efforts made by HRH Prince Abdul Aziz bin Salman Al Saud, Saudi Arabia’s Minister of Energy, and Chairman of the OPEC and non-OPEC Ministerial Conference, several visits and high-level bilateral meetings took place throughout 2024, which culminated into … [this] significant announcement,” the OPEC statement said. A statement posted on OPEC’s website on November 30, 2023, noted that the 36th OPEC and non-OPEC Ministerial Meeting “welcomed HE Alexandre Silveira de Oliveira, Minister of Mines and Energy of the Federative Republic of Brazil, which will join the OPEC+ Charter of Cooperation starting January 2024”. In its latest statement, OPEC highlighted that the Charter of Cooperation was established in July 2019 and said it “provides a platform to facilitate dialogue and exchange views regarding conditions and developments in the global oil and energy markets”. “The goal is to contribute to a secure energy supply and lasting stability for the benefit of producers, consumers, investors and the global economy,” OPEC noted in the statement. Rigzone contacted Brazil’s ministry of mines and energy

Read More »

Starmer pledges £200m for Grangemouth site from National Wealth Fund

UK Prime Minister Sir Keir Starmer has announced a further £200 million investment in the Grangemouth industrial site, where Scotland’s only oil refinery is expected to close later this year. Refinery owner Petroineos announced the closure of the facility last year amid plans to transition the site into a fuel import terminal, leading to the loss of more than 400 jobs. Sir Keir announced the funding, which will come from the newly created National Wealth Fund, at the Scottish Labour conference in Glasgow on Sunday. Addressing the conference, Sir Keir said the Grangemouth site presented a “huge opportunity for renewal”, with potential for future uses in biofuels or hydrogen energy. “We will grasp the opportunities at Grangemouth, work alongside partners to develop viable proposals, team up with business to get new industries off the ground,” he said. © Peter Byrne/PA WirePrime Minister Sir Keir Starmer delivers his keynote speech during the Labour Party Conference, at the ACC Liverpool in September 2024. Image: Peter Byrne/PA Wire “And to attract private investors into the partnership we need, we will allocate £200m from the National Wealth Fund, for investment in Grangemouth, investment in Scotland’s industrial future.” The Prime Minister said every worker made redundant at the site would get 18 months full pay, alongside a £10m skills and training programme. Businesses that take on Grangemouth workers will also be in line for national insurance relief, Sir Keir said. It comes after Scotland’s First Minister John Swinney separately announced £25m in extra funding for Grangemouth last week. The UK and Scottish governments had previously committed £100m to the Falkirk and Grangemouth Growth Deal to support jobs and skills in the area along with a report – called Project Willow – to look at the future of the site. Grangemouth future Speaking to BBC Radio Scotland’s Good Morning

Read More »

Obsidian Energy to Sell Pembina Assets to InPlay for $224.7MM

Obsidian Energy Ltd. is selling its operating assets in the Pembina oil field in Alberta, Canada, to InPlay Oil Corp. for approximately $224.77 million (CAD 320 million). Obsidian will retain its non-operated holdings in Pembina Cardium Unit #11, the company said in a news release. The Pembina assets include 498 net sections of land in the Pembina area of Central Alberta, including associated facilities and gathering systems. Consideration for the acquisition will consist of $154.53 million (CAD 220 million) in cash, $59.71 million (CAD 85 million) of InPlay’s common shares, and InPlay’s 34.6 percent working interest in the Willesden Green Cardium Unit #2 oil field, which would bring Obsidian Energy’s ownership in the field to 99.8 percent. This additional interest is estimated to be valued at $15 million, according to the company. As part of the transaction, Obsidian Energy said it has agreed to drill four wells on two pads in the Pembina area during the first quarter of 2025 at InPlay’s expense. All rights to the wells and associated infrastructure will be transferred to InPlay upon the close of the transaction. The effective date of the transaction is December 1, 2024, and is expected to close early in the second quarter of 2025, subject to approval by InPlay shareholders, receipt of all necessary regulatory approvals and the satisfaction of other customary closing conditions. Obsidian Energy said the transaction allows the company to focus on its light oil development in Willesden Green, part of the Willesden Green Oil and Gas Field and the Duvernay Formation. As part of the transaction, InPlay will assume all assets and liabilities associated with the Pembina assets. InPlay said in a separate statement that the petroleum and natural gas assets are producing approximately 10,000 barrels of oil equivalent per day (boepd), consisting of 68 percent

Read More »

Centrica Bags Contract to Provide LNG to Petrobras

Centrica said it has secured a sale and purchase agreement to provide liquefied natural gas (LNG) to Petróleo Brasileiro S.A (Petrobras). The contract between the two companies is for the purchase of 0.8 million tons per annum (mtpa) of LNG for 15 years, beginning in 2027, Centrica said in a news release. The financial details were not disclosed. The agreement comprises approximately 30 percent of Centrica’s U.S. portfolio and will be sourced from Centrica’s Sabine Pass and Delfin supply agreements, the United Kingdom-based company noted. The agreement “marks a significant step in expanding Centrica’s global LNG business, diversifying the locations it can deliver LNG to and supporting energy security in Brazil with an important new long-term partner,” the company said in a statement. Centrica Group Chief Executive Chris O’Shea said, “Centrica is investing to deliver the energy security, efficiency and decarbonization solutions our customers need today and in the future, and LNG is, and will continue to be, a crucial foundation of the energy transition. This agreement demonstrates our approach to building long-term partnerships while derisking our portfolio exposure in the medium-term, in turn positioning us to continue growing our portfolio as new LNG supply comes into the market over the coming years”. Petrobras Director of Energy Transition and Sustainability Maurício Tolmasquim said, “The agreement with Centrica is aligned with Petrobras’ priorities to reduce its exposure to the spot market volatility, increase its competitiveness and be the best option for its customers. We also consider the contribution of this important product to promoting the energy transition”. Flexible Power Projects in Ireland Meanwhile, Centrica’s subsidiary Bord Gáis Energy has secured a 10-year capacity market contract to deliver a 334-megawatt (MW) open cycle gas turbine power station in Galway, Ireland. The new power station will be capable of running on biomethane or

Read More »

Do data centers threaten the water supply?

In a new report, the Royal Academy of Engineering called upon the government to ensure tech companies accurately report how much energy and water their data centers are using and reducing the use of drinking water for cooling. Without such action, warns one of the report’s authors, Professor Tom Rodden, “we face a real risk that our development, deployment and use of AI could do irreparable damage to the environment.” The situation is a little different for the US as the country has large bodies of water offering a  water supply that the UK just does not have. It’s not an accident that there are many data centers around the Chicago area: they’ve also got the Great Lakes to draw upon. Likewise, the Columbia and Klamath Rivers have become magnets for data centers for both water supply and hydroelectric power. Other than the Thames River, the UK doesn’t have these massive bodies of water. Still, the problem is not unique to the UK, says Alan Howard, senior analyst with Omdia. He notes that Microsoft took heat last year because it was draining the water supply of a small Arizona town of Goodyear with a new AI-oriented data center.  The city of Chandler, Arizona passed an ordinance in 2015 that restricted new water-intensive businesses from setting up shop which slowed data center development.   “I believe some data center operators just bowed out,” said Howard.

Read More »

Ireland says there will be no computation without generation

Stanish said that, in 2023, she wrote a paper that predicted “by 2028, more than 70% of multinational enterprises will alter their data center strategies due to limited energy supplies and data center moratoriums, up from only about 5% in 2023. It has been interesting watching this trend evolve as expected, with Ireland being a major force in this conversation since the boycotts against data center growth started a few years ago.” Fair, equitable, and stable electricity allocation, she said, “means that the availability of electricity for digital services is not guaranteed in the future, and I expect these policies, data center moratoriums, and regional rejections will only continue and expand moving forward.” Stanish pointed out that this trend is not just occurring in Ireland. “Many studies show that, globally, enterprises’ digital technologies are consuming energy at a faster rate than overall growth in energy supply (though, to be clear, these studies mostly assume a static position on energy efficiency of current technologies, and don’t take into account potential for nuclear or hydrogen to assuage some of these supply issues).” If taken at face value, she said, this means that a lack of resources could cause widespread electricity shortages in data centers over the next several years. To mitigate this, Stanish said, “so far, data center moratoriums and related constraints (including reduced tax incentives) have been enacted in the US (specifically Virginia and Georgia), Denmark, Singapore, and other countries, in response to concerns about the excessive energy consumption of IT, particularly regarding compute-intense AI workloads and concerns regarding an IT energy monopoly in certain regions. As a result, governments (federal, state, county, etc.) are working to ensure that consumption does not outpace capacity.” Changes needed In its report, the CRU stated, “a safe and secure supply of energy is essential

Read More »

Perspective: Can We Solve the AI Data Center Power Crisis with Microgrids?

President Trump announced a$500 billion private sector investment in the nation’s Artificial Intelligence (AI) infrastructure last month. The investment will come from The Stargate Project, a joint venture between OpenAI, SoftBank, Oracle and MGX, which intends to build 20 new AI data centers in the U.S in the next four to five years. The Stargate Project committed$100 billion for immediate deployment and construction has already begun on its first data center in Texas. At approximately a half a million square feet each, the partners say these new facilities will cement America’s leadership in AI, create jobs and stimulate economic growth. Stargate is not the only game in town, either. Microsoft is expected to invest$80 billion in AI data center development in 2025, with Google, AWS and Meta also spending big. While all this investment in AI infrastructure is certainly exciting, experts say there’s one lingering question that’s yet to be answered and it’s a big one: How are we going to power all these AI data centers? This will be one of the many questions tackled duringMicrogrid Knowledge’s annual conference, which will be held in Texas April 15-17 at the Sheraton Dallas. “Powering Data Centers: Collaborative Microgrid Solutions for a Growing Market” will be one of the key sessions on April 16. Industry experts will gather to discuss how private entities, developers and utilities can work together to deploy microgrids and distributed energy technologies that address the data center industry’s power needs. The panel will share solutions, technologies and strategies that will favorably position data centers in the energy queue. In advance of this session, we sat down with two microgrid experts to learn more about the challenges facing the data center industry and how microgrids can address the sector’s growing energy needs. We spoke with Michael Stadler, co-founder and

Read More »

Data Center Tours: Iron Mountain VA-1, Manassas, Virginia

Iron Mountain Northern Virginia Overview Iron Mountain’s Northern Virginia data centers VA-1 through VA-7 are situated on a 142-acre highly secure campus in Prince William County, Virginia. Located at 11680 Hayden Road in Manassas, Iron Mountain VA-1 spans 167,958 sq. ft. and harbors 12.4 MW of total capacity to meet colocation needs. The 36 MW VA-2 facility stands nearby. The total campus features a mixture of single and multi-tenant facilities which together provide more than 2,000,000 SF of highly efficient green colocation space for enterprises, federal agencies, service providers and hyperscale clouds.  The company notes that its Manassas campus offers tax savings compared to Ashburn and exceptional levels of energy-efficiency as well as a diverse and accessible ecosystem of cloud, network and other service providers.  Iron Mountain’s Virginia campus has 9 total planned data centers, with 5 operational facilities to date and two more data centers coming soon. VA-2 recently became the first data center in the United States to achieve DCOS Maturity Level 3.    As we continued the tour, Kinra led the way toward the break room, an area where customers can grab coffee or catch up on work. Unlike the high-end aesthetic of some other colocation providers, Iron Mountain’s approach is more practical and focused on functionality. At the secure shipping and receiving area, Kinra explained the process for handling customer equipment. “This is where our customers ship their equipment into,” he said. “They submit a ticket, send their shipments in, and we’ll take it, put it aside for them, and let them know when it’s here. Sometimes they ask us to take it to their environment, which we’ll do for them via a smart hands ticket.” Power Infrastructure and Security Measures The VA-1 campus is supported by a single substation, providing the necessary power for its growing

Read More »

Land and Expand: DPO, Microsoft, JLL and BlackChamber, Prologis, Core Scientific, Overwatch Capital

Land and Expand is a periodic feature at Data Center Frontier highlighting the latest data center development news, including new sites, land acquisitions and campus expansions. Here are some of the new and notable developments from hyperscale and colocation data center developers and operators about which we’ve been reading lately. DPO to Develop $200 Million AI Data Center in Wisconsin Rapids; Strategic Partnership with Billerud’s CWPCo Unlocks Hydroelectric Power for High-Density AI Compute Digital Power Optimization (DPO) is moving forward with plans to build a $200 million high-performance computing (HPC) data center in Wisconsin Rapids, Wisconsin. The project, designed to support up to 20 megawatts (MW) of artificial intelligence (AI) computing, leverages an innovative partnership with Consolidated Water Power Company (CWPCo), a subsidiary of global packaging leader Billerud. DPO specializes in developing and operating data centers optimized for power-dense computing. By partnering with utilities and independent power producers, DPO colocates its facilities at energy generation sites, ensuring direct access to sustainable power for AI, HPC, and blockchain computing. The company is privately held. Leveraging Power Infrastructure for Speed-to-Energization CWPCo, a regulated utility subsidiary, has operated hydroelectric generation assets since 1894, reliably serving industrial and commercial customers in Wisconsin Rapids, Biron, and Stevens Point. Parent company Billerud is a global leader in high-performance packaging materials, committed to sustainability and innovation. The company operates nine production facilities across Sweden, the USA, and Finland, employing 5,800 people in over 19 countries.  The data center will be powered by CWPCo’s renewable hydroelectric assets, tapping into the utility’s existing 32 megawatts of generation capacity. 
The partnership grants DPO a long-term land lease—extending up to 50 years—alongside interconnection rights to an already-energized substation and a firm, reliable power supply. “AI infrastructure is evolving at an unprecedented pace, and access to power-dense sites is critical,” said Andrew

Read More »

Data center spending to top $1 trillion by 2029 as AI transforms infrastructure

His projections account for recent advances in AI and data center efficiency, he says. For example, the open-source AI model from Chinese company DeepSeek seems to have shown that an LLM can produce very high-quality results at a very low cost with some clever architectural changes to how the models work. These improvements are likely to be quickly replicated by other AI companies. “A lot of these companies are trying to push out more efficient models,” says Fung. “There’s a lot of effort to reduce costs and to make it more efficient.” In addition, hyperscalers are designing and building their own chips, optimized for their AI workloads. Just the accelerator market alone is projected to reach $392 billion by 2029, Dell’Oro predicts. By that time, custom accelerators will outpace commercially available accelerators such as GPUs. The deployment of dedicated AI servers also has an impact on networking, power and cooling. As a result, spending on data center physical infrastructure (DCPI) will also increase, though at a more moderate pace, growing by 14% annually to $61 billion in 2029.  “DCPI deployments are a prerequisite to support AI workloads,” says Tam Dell’Oro, founder of Dell’Oro Group, in the report. The research firm raised its outlook in this area due to the fact that actual 2024 results exceeded its expectations, and demand is spreading from tier one to tier two cloud service providers. In addition, governments and tier one telecom operators are getting involved in data center expansion, making it a long-term trend.

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs).  In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple would between them devote $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction, and commercial landscaping. The Moline, Illinois-based company has been in business for 187 years, yet the non-tech firm has become a regular at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work its customers need, a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, while the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere’s autonomous 9RX tractor. Farmers can oversee it using an app. While each of these industries faces its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. That makes it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for businesses and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature the model providers are researching is using the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
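The “LLM as a judge” pattern mentioned above can be sketched in a few lines. This is an illustration only, not Witteveen’s implementation: the three `judge_*` functions are hypothetical stand-ins for calls to separate, inexpensive judge models, each applying its own rubric, with their verdicts averaged into an accept/reject decision.

```python
from statistics import mean

# Stand-ins for three cheap LLM judge calls; a real system would send the
# candidate answer plus a scoring rubric to each model's API.
def judge_relevance(answer: str) -> float:
    return 1.0 if "refund" in answer.lower() else 0.0

def judge_completeness(answer: str) -> float:
    return 1.0 if len(answer.split()) >= 8 else 0.0

def judge_tone(answer: str) -> float:
    return 1.0 if not answer.isupper() else 0.0

def ensemble_score(answer: str) -> float:
    """Average the verdicts of several judge models, as cheaper models
    make feasible."""
    judges = (judge_relevance, judge_completeness, judge_tone)
    return mean(j(answer) for j in judges)

def accept(answer: str, threshold: float = 0.66) -> bool:
    # An agent pipeline would regenerate the answer when this is False.
    return ensemble_score(answer) >= threshold

draft = "You can request a refund within 30 days via your account page."
print(accept(draft))  # all three judges pass the draft, so this prints True
```

Averaging several independent judges smooths out the noise of any single model’s verdict, which is why falling model prices make the pattern attractive.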

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability, and safety of AI models using these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends. It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see whether knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases, and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
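The shape of the automated loop described in the second paper can be illustrated at a very high level. This toy sketch is emphatically not OpenAI’s framework: the attacker model, the target model, and the safety classifier are all replaced by deterministic stubs, and the reward simply combines attack success with a novelty bonus, standing in for the auto-generated rewards that push the real system toward diverse rather than repetitive attacks.

```python
# Toy red-teaming loop (illustrative only; every function here is a stub).
ATTACK_STYLES = ["roleplay", "obfuscation", "multi-step", "translation"]

def generate_attack(style: str, round_no: int) -> str:
    # A real attacker LLM would write a novel prompt in this style.
    return f"[{style}] adversarial prompt, round {round_no}"

def target_responds_unsafely(prompt: str) -> bool:
    # Stand-in for running the target model and classifying its output;
    # two styles are arbitrarily assumed to succeed for this demo.
    return any(s in prompt for s in ("roleplay", "obfuscation"))

def reward(prompt: str, kept: list[str]) -> float:
    # Reward shape: success plus a novelty bonus, steering the attacker
    # toward wide-ranging attacks instead of repeating one that works.
    success = 1.0 if target_responds_unsafely(prompt) else 0.0
    style_tag = prompt.split("]")[0]
    novelty = 0.0 if any(style_tag in k for k in kept) else 0.5
    return success + novelty

def red_team(rounds: int = 2) -> list[str]:
    kept: list[str] = []
    for r in range(rounds):
        candidates = [generate_attack(s, r) for s in ATTACK_STYLES]
        best = max(candidates, key=lambda p: reward(p, kept))
        if reward(best, kept) >= 1.0:  # keep only successful attacks
            kept.append(best)  # an RL step would update the attacker here
    return kept

print(red_team())
```

Because of the novelty bonus, the second round keeps a different attack style than the first even though the first style still succeeds, which is the behavior the paper’s diversity-oriented rewards aim to produce at scale.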

Read More »