Stay Ahead, Stay ONMINE

National Grid, Con Edison urge FERC to adopt gas pipeline reliability requirements

The Federal Energy Regulatory Commission should adopt reliability-related requirements for gas pipeline operators to ensure fuel supplies during cold weather, according to National Grid USA and affiliated utilities Consolidated Edison Co. of New York and Orange and Rockland Utilities.

In the wake of power outages in the Southeast and the near collapse of New York City’s gas system during Winter Storm Elliott in December 2022, voluntary efforts to bolster gas pipeline reliability are inadequate, the utilities said in two separate filings on Friday at FERC.

The filings were in response to a gas-electric coordination meeting held in November by the Federal-State Current Issues Collaborative between FERC and the National Association of Regulatory Utility Commissioners.

National Grid called for FERC to use its authority under the Natural Gas Act to require pipeline reliability reporting, coupled with enforcement mechanisms, and pipeline tariff reforms.

“Such data reporting would enable the commission to gain a clearer picture into pipeline reliability and identify any problematic trends in the quality of pipeline service,” National Grid said. “At that point, the commission could consider using its ratemaking, audit, and civil penalty authority preemptively to address such identified concerns before they result in service curtailments.”

On pipeline tariff reforms, FERC should develop tougher provisions for force majeure events — an unforeseen occurrence that prevents a contract from being fulfilled — reservation charge crediting, operational flow orders, scheduling and confirmation enhancements, improved real-time coordination, and limits on changes to nomination rankings, National Grid said.

FERC should support efforts in New England and New York to create financial incentives for gas-fired generators to enter into winter contracts for imported liquefied natural gas supplies, or other long-term firm contracts with suppliers and pipelines, National Grid said.

Con Edison and O&R said they were encouraged by recent efforts such as the North American Energy Standards Board’s Gas-Electric Harmonization report, a FERC and North American Electric Reliability Corp. report on Winter Storm Elliott, and NARUC’s Gas-Electric Alignment for Reliability efforts.

“Further, since Winter Storm Elliott, we have observed noticeable improvements in winter weather operations with our upstream pipeline and suppliers, including earlier information-sharing with pipeline customers and enhancements by pipelines to address real-time operational risk,” the utilities said.

However, “efforts to address the root of the problem — gas system performance during cold weather — have stalled,” they said.

A proposal to tighten the definition of “force majeure” in gas contracts was rejected by NAESB stakeholders, according to Con Edison and O&R.

“The proposal sought to inject additional clarity surrounding repeated claims of force majeure for avoidable situations, establish requirements for preventative measures, and add specificity to the force majeure definition and scope — including clarifying that while cold weather may be a potential driver of force majeure events, an appropriate level of preparation and communication should be reasonably expected and required as a condition of relief,” the utilities said.

State and federal regulators could mandate activity-based winterization standards for gas production and transportation facilities such as equipment heating, back-up fuel sources and supplies, and equipment temperature alarms, Con Edison and O&R said.

National Grid, Con Edison and O&R urged FERC to launch a notice of inquiry to begin the process of establishing reliability requirements for the interstate gas system.

The American Gas Association, a trade group for gas utilities, told FERC it supports talks on upstream gas infrastructure weatherization and transparency across the natural gas value chain.

In particular, the AGA “supports discussions surrounding the importance of storage as a tool to aid in reliability and resiliency, solutions related to addressing permitting delays for energy projects, upstream supply chain weatherization, updating the force majeure provisions for the [NAESB] Base Contract for Sale and Purchase of Natural Gas, and the benefits of reliability reporting in achieving transparency,” the trade group said.


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Quinas readies UltraRam, flash memory with DRAM speed

For starters, the memory is built on what is called “III-V technology,” a class of semiconductor materials that are composed of elements from groups III and V of the periodic table, the company stated. These materials have unique properties that make them ideal for use in electronic devices such

Read More »

7 Wi-Fi certifications to bolster wireless networking skills

Organization: Certified Wireless Network Professionals (CWNP) Price: $149.99 for the CWTS-102 exam How to prepare: CWNP offers several resources to prepare including: a live training class, a self-paced training kit, a study and reference guide, an electronic practice test, an eLearning module, an eLearning bundle, and a test and go

Read More »

BW Offshore Signs HoA with Equinor for Bay Du Nord FPSO

BW Offshore said it has signed a head of agreement (HoA) with Equinor Canada Ltd as the preferred bidder for the floating production, storage and offloading (FPSO) unit for the Bay du Nord project offshore Newfoundland and Labrador, Canada. Bay du Nord, Canada’s first deepwater oil project, is operated by Equinor in partnership with BP plc and holds an estimated 400 million barrels of recoverable light crude in its initial phase, BW Offshore said in a news release. Under the HoA, BW Offshore and Equinor will further discuss how to proceed with the technical and commercial aspects of the FPSO project, including a smart and cost-effective design through front end engineering design (FEED) work, as well as agreeing on a commercial solution, according to the release. BW Offshore said the planned FPSO will be tailored for the harsh environment of the sub-Arctic. The unit is expected to support production of up to 160,000 barrels of oil per day. The topside is planned to include emission reduction initiatives such as high-efficiency power generation and heat recovery, variable speed drives and a closed flare system. The FPSO will be designed for future tiebacks to enhance the project’s long-term economic viability and value creation, the company said. Following pre-FEED completion in mid-September, BW Offshore and Equinor plan to enter into a bridging phase to prepare for FEED in early 2026, subject to approvals by Equinor and BP. BW Offshore said it plans to establish a local office in Newfoundland during FEED. “We are honored to have been selected by Equinor, which shows confidence placed in BW Offshore’s capabilities to support such a pioneering Canadian project,” BW Offshore CEO Marco Beenen said. “This HoA follows a constructive and close dialogue with Equinor since late 2023 and highlights our commitment to bringing substantial value to

Read More »

Strategists Forecast USA Crude Stock Drop

In an oil and gas report sent to Rigzone late Tuesday by the Macquarie team, Macquarie strategists, including Walt Chancellor, revealed that they are forecasting that U.S. crude inventories will be down by 1.1 million barrels for the week ending August 29. “This follows a 2.4 million barrel draw in the prior week, with the crude balance realizing slightly tighter than our expectations,” the strategists said in the report. “As we have previously noted, we believe persistently strong implied supply has been a key feature of the U.S. crude oil balance across Q3 to date. As such, with U.S. oil production exiting Q2 at a record 13.6 million barrels per day in June, we see potential for this figure to move higher as subsequent monthly data is reported,” the strategists added. “For this week’s balance, from refineries, we model a minimal reduction in crude runs. Among net imports, we model a slight decrease, with exports (+0.3 million barrels per day) and imports (+0.2 million barrels per day) higher on a nominal basis,” they went on to state. The strategists warned in the report that timing of cargoes remains a source of potential volatility in this week’s crude balance. “From implied domestic supply (prod.+adj.+transfers), we again look for an increase (+0.3 million barrels per day) on a nominal basis this week,” the strategists said in the report. “Rounding out the picture, we anticipate a smaller increase (+0.5 million barrels) in SPR [Strategic Petroleum Reserve] stocks this week,” they added. The Macquarie strategists also stated in the report that, “among products”, they “look for a healthy gasoline draw (-3.0 million barrels) largely offset by builds in distillate (+1.4 million barrels) and jet (+1.0 million barrels)”. “We model implied demand for these three products at ~14.5 million barrels per day for the week

Read More »

Hafnia Secures Preliminary Deal to Acquire Over 14 Pct Stake in TORM

Oaktree Capital Management LP has entered a preliminary agreement to sell around 14.1 million shares it holds in TORM PLC to Hafnia Ltd for $311.43 million, or $22 per share. The volume corresponds to about 14.45 percent of TORM’s issued share capital, Hafnia, part of Singapore-based BW Group, said in a statement on its website. Hafnia and TORM own tanker fleets that ship oil, oil products and chemicals. Hafnia says it owns about 200 vessels. TORM says it owns over 80 vessels. Oaktree meanwhile is a Los Angeles-based investor. Hafnia trades on the Oslo Stock Exchange and the New York Stock Exchange while TORM is listed on the Copenhagen Stock Exchange and Nasdaq in New York. “Hafnia looks forward to making this sizeable investment in TORM with the belief that TORM is a well-managed company with a high-quality fleet”, Hafnia said. “With respect to Hafnia’s long-term position as a shareholder in TORM, Hafnia believes generally that consolidation is positive for the tanker industry but has made no decisions in this regard”. TORM said separately, “TORM has not been involved in the transaction and has no further information”. Market Outlook Earlier Hafnia reported $346.56 million in operating revenue for the second quarter, down from $563.1 million for 2Q 2024. Operating profit was $83.09 million, down from $262.14 million for 2Q 2024. Profit before income tax landed at $78 million, down from $260.77 million. Net profit was $75.34 million, or $0.15 per share – down from $259.2 million. “Strong product demand, low global inventories, improving refining margins and high export volumes have gradually supported the second quarter product tanker market and have continued into the third quarter”, Hafnia said. “Refined product volumes on water have steadily increased, and daily loadings of refined products have grown even more in the third quarter, signaling further

Read More »

Shell and BP Stations Running Low in Indonesia on Import Cuts

Shell Plc and BP Plc-branded gas stations in Indonesia have run low on fuel as supplies are curbed by import restrictions. The three brands of gasoline sold by Shell in the country are unavailable at some locations until further notice, according to a statement on its website. The company – which accounts for a small part of Indonesia’s market – is coordinating with the country’s Energy Ministry to ensure fuel availability, it said. Indonesia’s imports of gasoline, which are regulated by government quotas, fell 22 percent year-on-year at the end of August, according to Kpler data. While the country does produce some crude from local fields, it’s a net oil importer, and a lack of refining capacity means that some petroleum products are shipped in from overseas. Energy Minister Bahlil Lahadalia said private companies had requested larger quotas, Kompas reported, citing an interview. He encouraged them to purchase supplies from state-owned PT Pertamina, which dominates the country’s gasoline retailing, according to the report. Meanwhile, the volume and frequency at which Pertamina has sought to procure gasoline from overseas for the remainder of the year has surged over the last two weeks, according to tender documents tallied by Bloomberg. BP-branded stations are also facing shortages of some products, the Jakarta Post reported, citing an executive from their operator, PT Aneka Petroindo Raya. About 50 forecourts in Indonesia carry the British firm’s brand, according to a company document. Earlier this year Shell agreed to sell its Indonesian gas stations and associated fuel distribution operations to a joint venture between Philippines-based Citadel Pacific Ltd. and Sefas Group. What do you think? We’d love to hear from you, join the conversation on the Rigzone Energy Network. The Rigzone Energy Network is a new social experience created for you and all energy professionals to Speak Up about our

Read More »

Tamboran Secures Government Approval to Sell Beetaloo Appraisal Gas

Tamboran Resources Corporation has obtained the Northern Territory Government’s (NTG) approval to sell appraisal gas from its exploration permits in the Beetaloo Basin. The company said in a media release that this is the first approval granted by the NTG under the Beneficial Use of Gas (BUG) legislation. The approval follows the recent consent from the Native Title Holders for the sale of up to 60 TJ per day from the Shenandoah South (SS) Pilot Project over three years. The Beetaloo Joint Venture (BJV) has obtained all required permissions to sell gas from the SS Pilot Project. This initiative plans to start gas sales of up to 40 TJ per day to the NTG under a binding Gas Sales Agreement (GSA), which is set to begin in mid-2026, depending on weather conditions, Tamboran said. “The BJV has secured an important milestone, being the first joint venture to secure NTG approval to sell gas under the BUG legislation. This approval will allow the BJV to sell gas to the NTG under a long-term GSA, delivering energy security to Territorians”, Richard Stoneburner, Tamboran Chairman and Interim CEO, said. “The BUG legislation has been a significant initiative undertaken by the government, which allows for gas volumes to be sold into the market during the exploration and appraisal phase of the project life. Gas that would have historically been flared on site will now be delivered into the local Northern Territory gas market, delivering much-needed energy security to the market that is predominantly powered by gas. It will also deliver accelerated royalties to the NTG and Native Title Holders”, he said. Tamboran reported that drilling of the three remaining wells needed for plateau production and GSA delivery is underway, with the intermediate sections finished on all three wells. Currently, Tamboran is drilling the horizontal

Read More »

CNOOC Ltd Announces 10th Startup in Chinese Waters in 2025

CNOOC Ltd said Thursday the Wenchang 16-2 Oilfield Development Project in the Pearl River Mouth Basin had started production. This is the company’s 10th announced startup offshore China this year, and the sixth in the South China Sea. Including two projects in Brazilian waters, the publicly listed arm of China National Offshore Oil Corp. has now announced 12 upstream startups in 2025. The newest startup is expected to reach its capacity of about 11,200 barrels of oil equivalent a day in 2027. The oil is light crude, according to CNOOC Ltd. “The development of the project mainly leverages the adjacent existing facilities of the Wenchang Oilfields, with addition of a new jacket platform integrating functions such as oil and gas production, offshore drilling and completion operations as well as personnel accommodation”, CNOOC Ltd, the sole developer, said in a statement on its website. It plans to develop 15 wells. The project has an average water depth of around 150 meters (492.13 feet), according to the company. CNOOC Ltd earlier put online five projects in the South China Sea this year: the Dongfang 1-1 Gas Field 13-3 Block Development Project, the Dongfang 29-1 field, the Panyu 11-12/10-1/10-2 Oilfield Adjustment Joint Development Project, the Weizhou 5-3 field and phase II of the Wenchang 19-1 field. The other Chinese projects it put into production this year are in the Bohai Sea: phase I of the Bozhong 26-6 field, the Caofeidian 6-4 field adjustment, phase I of the Kenli 10-2 Oilfields Development Project and phase II of the Luda 5-2 North field. Overseas, CNOOC Ltd and its partners fired up Buzios-7 in the Buzios field and Mero-4 in the Mero field. Both fields are in Brazil’s Santos Basin. CNOOC Ltd owns 9.65 percent in Mero, operated by state-controlled Petroleo Brasileiro SA (Petrobras) with a

Read More »

Inside the AI-optimized data center: Why next-gen infrastructure is non-negotiable

How are AI data centers different from traditional data centers? AI data centers and traditional data centers can be physically similar, as they contain hardware, servers, networking equipment, and storage systems. The difference lies in their capabilities: Traditional data centers were built to support general computing tasks, while AI data centers are specifically designed for more sophisticated, time- and resource-intensive workloads. Conventional data centers are simply not optimized for AI’s advanced tasks and necessary high-speed data transfer. Here’s a closer look at their differences:

AI-optimized vs. traditional data centers

Traditional data centers: Handle everyday computing needs such as web browsing, cloud services, email and enterprise app hosting, data storage and retrieval, and a variety of other relatively low-resource tasks. They can also support simpler AI applications, such as chatbots, that do not require intensive processing power or speed.

AI data centers: Built to compute significant volumes of data and run complex algorithms, ML and AI tasks, including agentic AI workflows. They feature high-speed networking and low-latency interconnects for rapid scaling and data transfer to support AI apps and edge and internet of things (IoT) use cases.

Physical infrastructure

Traditional data centers: Typically composed of standard networking architectures such as CPUs suitable for handling networking, apps, and storage.

AI data centers: Feature more advanced graphics processing units (GPUs), popularized by chip manufacturer Nvidia; tensor processing units (TPUs), developed by Google; and other specialized accelerators and equipment.

Storage and data management

Traditional data centers: Generally store data in more static cloud storage systems, databases, data lakes, and data lakehouses.

AI data centers: Handle huge amounts of unstructured data including text, images, video, audio, and other files. They also incorporate high-performance tools including parallel file systems, multiple network servers, and NVMe solid-state drives (SSDs).

Power consumption

Traditional data centers: Require robust cooling

Read More »

From Cloud to Concrete: How Explosive Data Center Demand is Redefining Commercial Real Estate

The world will generate 181 ZB of data in 2025, an increase of 23.13% year over year, with 2.5 quintillion bytes (a quintillion bytes is one exabyte, or EB) created daily, according to a report from Demandsage. To put that in perspective: one exabyte is 1 quintillion bytes, which is 1,000,000,000,000,000,000 bytes. That’s 29 TB every second, or 2.5 million TB per day. It’s no wonder data centers have become so crucial for creating, consuming, and storing data — and no wonder investor interest has skyrocketed. The surging demand for secure, scalable, high-performance retail and wholesale colocation and hyperscale data centers is spurred by the relentless, global expansion of cloud computing and demand for AI as data generation from businesses, governments, and consumers continues to surge. Power access, sustainable infrastructure, and land acquisition have become critical factors shaping where and how data center facilities are built. As a result, investors increasingly view these facilities not just as technology assets, but as a unique convergence of real estate, utility infrastructure, and mission-critical systems. Capitalizing on this momentum, private equity and real estate investment firms are rapidly expanding into the sector through acquisitions, joint ventures, and new funds — targeting opportunities to build and operate facilities with a focus on energy efficiency and scalability.
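The per-day and per-second figures cited above can be sanity-checked with a few lines of arithmetic. This is a minimal sketch using decimal units (1 TB = 10^12 bytes); the variable names are illustrative, not from the report.

```python
# Sanity-check the cited conversions: 2.5 quintillion bytes/day
# should equal 2.5 million TB/day, or roughly 29 TB every second.
BYTES_PER_DAY = 2.5e18      # 2.5 quintillion bytes (2.5 EB) created daily
SECONDS_PER_DAY = 86_400    # 24 h * 60 min * 60 s
BYTES_PER_TB = 1e12         # decimal terabyte

tb_per_day = BYTES_PER_DAY / BYTES_PER_TB
tb_per_second = tb_per_day / SECONDS_PER_DAY

print(f"{tb_per_day:,.0f} TB/day")   # 2,500,000 TB/day
print(f"{tb_per_second:.0f} TB/s")   # 29 TB/s (28.9 before rounding)
```

Both cited figures are internally consistent: 2.5 EB/day is exactly 2.5 million TB/day, and dividing by 86,400 seconds yields just under 29 TB/s.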

Read More »

Ai4 2025 Navigates Rapid Change in AI Policy, Education

The pace of innovation in artificial intelligence is fundamentally reshaping the landscape of education, and the changes are happening rapidly. At the forefront of this movement stand developers, policy makers, educational practitioners, and associated experts at the recent Ai4 2025 conference (Aug. 11-13) in Las Vegas, where leading voices such as Geoffrey Hinton, “the Godfather of AI,” top executives from Google and U.S. Bank, and representatives from multiple government agencies gathered to chart the future of AI development. Importantly, educators and academic institutions played a central role, ensuring that the approach to AI in schools is informed by those closest to the classroom. Key discussions at Ai4 and recent educator symposia underscored both the promise and peril of swift technological change. Generative AI, with its lightning-fast adoption since the advent of tools like ChatGPT, is opening new possibilities for personalized learning, skills development, and operational efficiency. But participants were quick to note that acceleration brings good and bad consequences. On one hand, there’s excitement about practical classroom implementations and the potential for students to engage with cutting-edge technology. On the other, concerns about governance, ethics, safety, and the depth of genuine learning remain at the forefront. This urgency to “do this right” is echoed by teachers, unions, and developers who are united by the challenges and opportunities on the ground. Their voices highlight the need for agreement on education policy and associated regulations to keep pace with technological progress, create frameworks for ethical and responsible use, and ensure that human agency remains central in shaping the future of childhood and learning. In this rapidly evolving environment, bringing all stakeholders to the table is no longer optional; it is essential for steering AI in education toward outcomes that benefit both students and society.
Global Context: America, China, and the AI Race

Read More »

Two Lenses on One Market: JLL and CBRE Show Data Centers in a Pinch

The two dominant real estate research houses, JLL and CBRE, have released midyear snapshots of the North American data center market, and both paint the same picture in broad strokes: demand remains insatiable, vacancy has plunged to record lows, and the growth of AI and hyperscale deployments is reshaping every aspect of the business. But their lenses capture different angles of the same story: one emphasizing preleasing and capital flows, the other highlighting hyperscale requirements and regional shifts. Vacancy Falls Through the Floor JLL sets the stage with a stark headline: colocation vacancy is nearing 0%. The JLL Midyear 2025 North America Data Center report warns that this scarcity “is constraining economic growth and undermining national security,” underscoring the role of data centers as critical infrastructure. CBRE’s North American Data Center Trends H1 2025 numbers back this up, recording an all-time low North America vacancy rate of 1.6%, the tightest in more than a decade. Both agree that market loosening is years away — JLL projecting vacancy hovering around 2% through 2027, CBRE noting 74.3% of new capacity already spoken for. The takeaway seems clear: without preleasing, operators and tenants alike are effectively shut out of core markets. Absorption and Preleasing Drive Growth JLL drills down into the mechanics. With virtually all absorption tied to preleasing, the firm points to Northern Virginia (647 MW) and Dallas (575 MW) as the twin engines of growth in H1, joined by Chicago, Austin/San Antonio, and Atlanta. CBRE’s absorption math is slightly different, but the conclusion aligns: Northern Virginia again leads the nation, with 538.6 MW net absorption and a remarkable 80% surge in under-construction capacity. CBRE sharpens the view by noting that the fiercest competition is at the top end: single-tenant requirements of 10 MW or more are setting pricing records as hyperscalers

Read More »

Data Center Frontier Trends Summit 2025: AI, Power Constraints, and Moonshots Take the Stage in Reston

Aug. 28, RESTON, Va. — It’s the last day of the second-annual Data Center Frontier Trends Summit, marking the conclusion of a gathering of significant players in the data center world and their customers, all of whom are looking to get a better handle on the data center industry as it grapples with AI-fueled power demands, grid constraints, and an urgent need for infrastructure innovation. Taking place in the heart of Northern Virginia’s Data Center Alley, acknowledged as the world’s premier data center hotspot, the conference in Reston, VA saw a significant increase in attendance in its second year, going from just over 300 attendees in its inaugural year to close to five hundred this year. Not unexpectedly, many of the attendees were newcomers to the event, attracted by the strong list of speakers focused on topics critical to the industry, with an emphasis on power and artificial intelligence. Many conversations with attendees identified specific topics as primary motivators to attend, while one attendee simply told us, “After reading the presentation descriptions and the speakers list, how could we not attend?” From the opening keynote, “Playbook Interrupted,” presented by Chris Downie, CEO of Flexential, the tone and message of the conference were made clear. Touching on topics such as AI’s insatiable resource appetite, tightening energy policies, and power scarcity, Downie made it clear that today’s data centers are breaking old frameworks and demanding new strategies for growth and resilience. The message was clear: times have changed, and industry executives need to be ready to change with them. Staying ahead of the curve was going to be more difficult, but just as important. It was time to develop a new playbook for your business operations. Getting to the Core of It With the demand for AI centric

Read More »

NTT Data, Google Cloud Forge Alliance to Expand AI, Cloud Modernization

The partnership is designed to accelerate development of repeatable, scalable solutions, according to company representatives. NTT Data’s GenAI framework called “Takumi” is at the heart of this development, designed to help clients move from idea to deployment by integrating with Google Cloud’s AI stack supporting rapid prototyping and GenAI use-case creation. This initiative expands NTT Data’s Smart AI Agent Ecosystem, which unites strategic technology partnerships, specialized assets, and an AI-ready talent engine to help users deploy and manage AI at scale. New Business Group NTT Data has established a dedicated global Google Cloud Business Group comprising thousands of engineers, architects, and advisory consultants. This team will collaborate with Google Cloud teams to help clients adopt and scale AI-powered cloud technologies. The company also is investing in training and certification programs so teams across sales, presales, and delivery can sell, migrate, and implement AI-powered cloud solutions. NTT Data says it will certify 5,000 engineers in Google Cloud technology, a step that underscores the scale of resources both firms are committing to meet surging enterprise demand. Both companies are co-investing in global sales and go-to-market campaigns designed to fast-track adoption across priority industries. A Landmark Moment for NTT—and the Industry Marv Mouchawar, Head of Global Innovation at NTT Data, said the partnership is a significant milestone in the company’s mission to drive innovation and digital transformation across industries. “By combining NTT Data’s deep expertise in AI, cloud-native modernization and enterprise solutions with Google Cloud’s advanced technologies, we are helping businesses accelerate their AI-powered cloud adoption globally and unlock new opportunities for growth,” she noted. For the data center industry, this partnership is notable not just as a technology alignment but as a signal of where digital infrastructure is headed.
Hyperscale cloud providers continue to expand their reach through partnerships with major service providers,

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences their own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize in their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as the frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
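The “LLM as a judge” idea touched on above — using several cheaper models to vote on whether an answer is acceptable — can be pictured roughly like this. This is a minimal sketch with stand-in judge functions, not any provider’s actual API; in practice each judge would wrap a call to a different model.

```python
# Sketch of the "LLM as a judge" pattern: several judge models each score a
# candidate answer, and a simple majority vote decides whether it passes.
# The judges here are toy stand-ins (assumptions), not real model calls.
from typing import Callable, List

def majority_judgment(answer: str, judges: List[Callable[[str], bool]]) -> bool:
    """Return True if most judge models accept the answer."""
    votes = [judge(answer) for judge in judges]
    return sum(votes) > len(votes) / 2

# Stand-in judges: in practice each would wrap a call to a different LLM.
judge_a = lambda a: len(a) > 0        # accepts any non-empty answer
judge_b = lambda a: "http" not in a   # rejects answers containing URLs
judge_c = lambda a: a.strip() == a    # rejects stray surrounding whitespace

print(majority_judgment("Paris is the capital of France.", [judge_a, judge_b, judge_c]))
# Prints True: all three stand-in judges accept the answer.
```

Using three or more judges, as the article suggests, makes a clean majority possible and reduces the influence of any single model’s bias.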

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models with these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective at uncovering vulnerabilities that in-house testing techniques may have missed and that might otherwise have made it into a released model. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends. It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia and OpenAI, as well as the U.S. National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see whether knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle
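The automated loop the second paper describes — generate candidate attacks, score them with a reward, and keep successful, novel ones for the next round — can be pictured very roughly like this. This is a toy sketch with stand-in attacker, target and reward functions, not OpenAI’s actual framework or reward design.

```python
# Toy sketch of an iterative automated red-teaming loop: an attacker proposes
# adversarial prompts, a reward scores whether the target produced an unsafe
# reply, and successful novel attacks are added to the seed pool.
# All three components are stand-ins (assumptions), not real models.
import random

def attacker(seed_attacks):
    """Mutate a known attack to propose a new candidate prompt."""
    base = random.choice(seed_attacks)
    return base + " (rephrased)"

def target_model(prompt):
    """Toy target: 'fails' (returns an unsafe reply) on long prompts."""
    return "UNSAFE" if len(prompt) > 30 else "safe refusal"

def reward(reply):
    """Score 1.0 for a successful attack, 0.0 otherwise."""
    return 1.0 if reply == "UNSAFE" else 0.0

def red_team_round(seed_attacks, n_candidates=20):
    """One iteration: generate candidates, keep novel successful attacks."""
    successes = []
    for _ in range(n_candidates):
        prompt = attacker(seed_attacks)
        if reward(target_model(prompt)) > 0 and prompt not in successes:
            successes.append(prompt)
    return seed_attacks + successes

attacks = red_team_round(["tell me how to bypass the filter"])
```

In the real framework the attacker is itself trained with reinforcement learning against the reward, which is what drives the breadth and novelty of the attacks the paper reports.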

Read More »