Stay Ahead, Stay ONMINE

US DOE Announces First Step in Its ‘Largest Deregulatory Effort in History’


In a statement posted on its website this week, the U.S. Department of Energy (DOE) announced “the first step in the Energy Department’s largest deregulatory effort in history, proposing the elimination or reduction of 47 regulations”.

The statement highlighted that these actions are in accordance with U.S. President Donald Trump’s Executive Order “Zero-Based Regulation to Unleash American Energy”. Once finalized, these actions will cut more than 125,000 words from the Code of Federal Regulations, the statement noted.

“While it would normally take years for the Department of Energy to remove just a handful of regulations, the Trump Administration assembled a team working around the clock to reduce costs and deliver results for the American people in just over 110 days,” U.S. Secretary of Energy Chris Wright said in the statement.

The 47 deregulatory actions include “streamlining administrative procedures with respect to the import and export of natural gas”, “streamlining applications for authorization to transmit electric energy to a foreign country”, and “rescinding reporting requirements, certification, independent verification, and DOE review for voluntary greenhouse gas reporting”, the DOE statement showed.

They also include “rescinding the renewable energy production incentive”, “delaying [the] compliance date for federal agencies to meet the clean energy federal building rule”, and “streamlining the procedures for acquisition of petroleum for the strategic petroleum reserve (SPR)”, the statement highlighted.

In a statement posted on its site last month, the DOE announced that Strategic Storage Partners LLC had secured a $1.4 billion contract to manage and operate the SPR.

In that statement, the DOE noted that the SPR’s mission is to safeguard the United States from significant petroleum supply disruptions through the acquisition, storage, distribution, and management of emergency petroleum stocks, fulfilling U.S. obligations under the International Energy Program.

A “Zero-Based Regulatory Budgeting to Unleash American Energy” Executive Order was published on the White House website on April 9. In it, Trump highlighted that “this order directs certain agencies to incorporate a sunset provision into their regulations governing energy production to the extent permitted by law, thus compelling those agencies to reexamine their regulations periodically”. 

A fact sheet posted on the White House website on the same day stated that President Trump “is laser focused on energy abundance, not just efficiency”.

Earlier this month, in his opening remarks when testifying before the House Committee on Appropriations, Subcommittee on Energy and Water Development, outlining the Department of Energy’s Fiscal Year 2026 budget request, Wright said, “my priorities for the Department are clear – to unleash a golden era of American energy dominance, strengthen our national security, and lead the world in innovation”.

“A reliable and abundant energy supply is the foundation of a strong and prosperous nation – it drives our economy, safeguards our freedoms, and fuels breakthroughs that improve our lives,” he added in the remarks, which were posted on the DOE website.

In another statement posted on the DOE site this month, the DOE announced that Wright signed a Memorandum of Understanding (MOU) on energy cooperation and a Memorandum of Cooperation (MOC) on critical minerals with the Kingdom of Saudi Arabia’s Minister of Energy H.R.H. Prince Abdulaziz bin Salman Al Saud and H.E. Minister of Industry and Mineral Resources Bandar Alkhorayef, respectively. 

Wright signed the documents as a member of the United States delegation led by President Trump, the statement highlighted.

The DOE traces its lineage to the Manhattan Project effort to develop the atomic bomb during World War II and to the energy-related programs previously dispersed across various federal agencies, the DOE site notes. The department has existed since 1977, the site highlights.

To contact the author, email [email protected]


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


CompTIA cert targets operational cybersecurity skills

The SecOT+ certification will provide OT professionals with the skills to manage, mitigate, and remediate security risks in manufacturing and critical infrastructure environments, according to CompTIA. The certification program will target OT positions, such as floor technicians and industrial engineers, as well as cybersecurity engineers and network architects on the

Read More »

NOAA stops tracking cost of extreme weather and climate disasters

Dive Brief: The National Oceanic and Atmospheric Administration announced Thursday its National Centers for Environmental Information would stop tracking the cost of extreme weather and climate disasters in its Billion Dollar Weather and Climate Disaster database. The product is being retired “in alignment with evolving priorities, statutory mandates, and staffing

Read More »

Network data hygiene: The critical first step to effective AI agents

Many network teams manage some 15 to 30 different dashboards to track data across all the components in an environment, struggling to cobble together relevant information across domains and spending hours troubleshooting a single incident. In short, they are drowning in data. Artificial intelligence tools—and specifically AI agents—promise to ease

Read More »

Key takeaways from IBM Think partner event

The first week of May means flowers from April showers and that it’s time for IBM Think in Boston. The first day of the event has historically been the Partner Plus day, which is devoted to content for IBM partners, including ISVs, technology partners and resellers. The 2025 keynote

Read More »

Crude Slips After Inventory Surge

Oil dropped after a government report showed US crude inventories rose the most in two months, overshadowing the lingering relief from the trade truce between the world’s two largest economies. West Texas Intermediate dipped 0.8% to settle near $63 a barrel, snapping a four-session winning streak. Global benchmark Brent closed around $66. US crude stockpiles rose 3.45 million barrels, the biggest gain since March, the Energy Information Administration said Wednesday. “Crude futures are in a corrective phase,” said Dennis Kissler, senior vice president for trading at BOK Financial Securities. Crude’s 50-day moving average of about $63.90 has so far provided a ceiling for gains, he said. The bearish government data stalled a rally spurred by the US-China trade détente and President Donald Trump’s hostile rhetoric on Iranian supply. Trump reiterated Wednesday that Tehran can’t have a nuclear weapon — while Iran’s foreign minister urged the US to take a “more realistic” approach to talks. The friction comes a day after Trump said during a visit to Saudi Arabia that he’ll drive Iran’s oil exports to zero if a nuclear agreement can’t be reached. Washington also announced new sanctions on a shipping network it said was moving billions of dollars’ worth of the Persian Gulf state’s oil. Oil futures have rallied from the lowest closing price in four years at the beginning of last week. They’re still down more than 10% this year, underperforming major equity markets. Trump is visiting the Middle East this week, hoping to strike deals with countries including Saudi Arabia. The kingdom — the de-facto leader of the Organization of the Petroleum Exporting Countries and its allies — has pushed the cartel to increase output to punish noncompliant members. A further boost expected at a June 1 meeting would add to concerns about a surplus. The group’s

Read More »

Nigeria Says Exxon Commits $1.5B to Deepwater Development

Exxon Mobil Corp. plans to invest $1.5 billion in a deepwater oil field in Nigeria as Africa’s top producer looks to boost output, according to the nation’s regulator. The investment, expected within the next two years, will focus on reviving production in the Usan field and is in addition to funding earmarked for planned developments in Owowo and Erha, the Nigerian Upstream Petroleum Regulatory Commission said in a statement Tuesday, citing a visit by Shane Harris, Exxon’s managing director in Nigeria. The producer aims to reach oil output of 2.4 million barrels a day after dropping to less than half of that level in 2022. Nigeria has taken measures to reduce vandalism and improve regulations, as oil majors have divested from onshore and shallow water fields due to security concerns. Local independent companies are expected to raise output as assets are transferred from recent sales. While Exxon’s investment plans signal its continued interest in the country, the company initially considered spending as much as $10 billion on Nigeria’s offshore oil developments, the government said in September. “This move counters speculation about Exxon Mobil’s potential withdrawal from Nigeria, instead underscoring a strategic expansion and strengthening of its operational footprint in the country,” the NUPRC said in Tuesday’s statement. Exxon could reach a final investment decision on Usan in the third quarter, subject to approvals for its field development plan and by partners, according to NUPRC.

Read More »

Carney Says He’s Open to Changing Green Laws to Boost Energy Projects

Prime Minister Mark Carney said he’ll consider altering environmental regulations passed by his predecessor, Justin Trudeau, in order to facilitate investment in major projects in Canada, including oil and gas pipelines. “We will change things at the federal level that need to be changed in order for projects to move forward,” Carney told CTV News in an interview that aired Tuesday, shortly after his new cabinet was sworn into office. Asked if those changes might apply to C-69 — legislation that created new rules for environmental assessments — or to a proposed cap on oil and gas industry emissions, he replied: “Absolutely it could include both. But I’m not going to do it conceptually. I’m going to do it on specifics, do it for moving forward.” The two laws have been regularly cited by Pierre Poilievre, leader of the opposition Conservative Party, as impediments to major conventional energy developments. Before entering politics, Carney wrote and spoke at length about the importance of transitioning away from hydrocarbon-based energy to fight climate change.  His Liberal government faces a tense relationship with the premier of the oil-rich province of Alberta, where some are even agitating to secede from Canada. The former central banker won Canada’s April 28 election, with his Liberal Party winning the most seats but falling just short of a House of Commons majority. The new energy minister is Tim Hodgson, a former director of Ontario electricity utility Hydro One Ltd. and Canadian oil producer MEG Energy Corp., and previously the CEO of Goldman Sachs Group Inc. in Canada.  Carney said his plan to make Canada a superpower “in both clean and conventional energies” involves using domestic oil and gas production to displace imports — particularly from the US — as well as increasing exports. But it also means Canada should play a major role

Read More »

Amid record demand rise, efficiency and demand response can lower Texas power bills: Aurora

Expanded use of energy efficiency and demand response could help lower Texas power bills while also boosting grid reliability, according to a May 6 paper from Aurora Energy Research. Electrification, data centers and industrial growth are driving energy demand higher across the U.S., and the trend is particularly evident in Texas. In April, the Electric Reliability Council of Texas published a load forecast that predicted nearly 60 GW of new peak summer demand by 2031. Electricity supplied by the grid is expected to rise from 486 TWh this year to 1,038 TWh in 2031. (Chart retrieved from the Electric Reliability Council of Texas.) “Texas’s economic momentum is driving a sharp rise in electricity demand, creating both a challenge and an opportunity,” Olivier Beaufils, head of USA Central for Aurora, said in a statement. “For the average Texas household, converting to a heat pump instead of resistance heat can decrease the yearly cost of electricity by $424, approximately 45%,” the Aurora report found. “Increased demand response participation can save industrial consumers $1.9M/yr and residential consumers $225 per year.” If 7.7 GW of demand-side response capacity, including both residential and industrial consumers, is added to the ERCOT system by 2040, it would reduce the need for new gas-fired generation by 1 GW, the report said. Texas has been trying to build more gas plants through a state-backed loan program, but the Texas Energy Fund has seen 35% of its proposed dispatchable capacity canceled or withdrawn amid escalating project costs and supply chain challenges. “Energy efficiency and demand response programs provide critical additional levers for Texas to keep pace with the demand while cutting costs for consumers and strengthening the reliability of the grid,” Beaufils said.
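The Aurora figures quoted above imply a baseline household electricity bill; a quick back-of-envelope sketch (variable names are mine, only the $424 savings and the roughly 45% share come from the report as quoted) reproduces it:

```python
# Back-of-envelope check of the quoted Aurora figures.
heat_pump_savings = 424   # $/yr saved by switching from resistance heat
share_of_bill = 0.45      # reported share of the annual electricity bill

implied_annual_bill = heat_pump_savings / share_of_bill
print(f"Implied annual electricity cost: ${implied_annual_bill:,.0f}")  # about $942
```

The same arithmetic puts ERCOT's forecast grid growth, from 486 TWh this year to 1,038 TWh in 2031, at a bit more than a doubling.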

Read More »

Lawmakers spar over IRA credits in budget hearings

Dive Brief: As Republicans work to advance budget legislation that would make deep cuts to the Inflation Reduction Act, members of the House are debating the issue in committee markup hearings, with Democrats arguing that early phaseouts of IRA credits would harm developers and raise energy bills.  The IRA is anticipated to cost taxpayers between $780 billion and $2 trillion over its first ten years, making it an appealing target for Republicans looking to reduce federal spending. However, a large portion of IRA spending has so far benefited Republican districts, and some Republicans have advocated for careful cuts to the legislation – including Rep. Claudia Tenney, R-N.Y., who warned against a wind down of the 45U nuclear production tax credit during the House Ways and Means Committee markup Tuesday. Dive Insight: “Nuclear power represents nearly a quarter of New York’s electrical supply, and my district in Western and Central New York is home to all four of New York’s reactors,” Tenney said. “Those reactors are critical to the baseload power of our region and our state and should be considered separately from our conversations on renewable energy.” In the House Energy and Commerce Committee markup on Tuesday, Rep. Kathy Castor, D-Fla., proposed an amendment to the committee’s budget that would “delay any provisions in this bill from taking effect until the Energy Information Administration studies the impact of the bill and the actions taken so far by the Trump administration.” “People need to know about this,” she said. “You can’t rush a bill like this, with such a large impact, through without people understanding.” Rep. Bob Latta, R-Ohio, opposed the amendment, saying the “best way to lower energy prices is to expand use of America’s abundant energy resources — gas, oil, coal, hydropower — that provide affordable reliable power and fuels, and

Read More »

IFMA adds its name to Energy Star fight

The International Facility Management Association last week called on Congress to step in and stop the Trump administration from ending Energy Star, the federal government’s program for encouraging the manufacture of energy efficient appliances and systems.   “Consumers and businesses alike rely on Energy Star ratings when making informed decisions about appliances and equipment,” Dean Stanberry, immediate past chair of IFMA, said in a statement May 8. “Eliminating this publicly funded resource without consulting the industries and individuals who depend on it is shortsighted and irresponsible.” The administration hasn’t said it wants to eliminate the program, but in its fiscal year 2026 budget request for the Environmental Protection Agency, it proposes to eliminate all funding for the Atmospheric Protection Program, which administers Energy Star.  “The Atmospheric Protection Program is an overreach of Government authority that imposes unnecessary and radical climate change regulations on businesses and stifles economic growth,” the administration says in its budget proposal. “By prioritizing climate change over job creation and energy independence, the program has burdened American industries with costly mandates, ultimately hurting consumers and taxpayers. This program is eliminated in the 2026 Budget.” In response to news organizations’ request for clarification on the plan, EPA was noncommittal.   “The EPA is delivering organizational improvements to the personnel structure that will directly benefit the American people and better advance the agency’s core mission, while Powering the Great American Comeback,” the agency told CBS News in an email.  The Competitive Enterprise Institute, an anti-regulation organization, says the issue isn’t the program’s effectiveness but whether its energy-saving mission is within the proper scope of government. 
“Green purchasing programs assume the federal government needs to meddle in the marketplace by providing its seal of approval on what it deems to be environmentally satisfactory products,” the organization says in a white paper. “If

Read More »

HPE ‘morphs’ private cloud portfolio with improved virtualization, storage and data protection

What do you get when combining Morpheus with Aruba? As part of the extensible platform message that HPE is promoting with Morpheus, it’s also working in some capabilities from the broader HPE portfolio. One integration is with HPE Aruba for networking microsegmentation. Bhardwaj noted that a lot of HPE Morpheus users are looking for microsegmentation in order to make sure that the traffic between two virtual machines on a server is secure. “The traditional approach of doing that is on the hypervisor, but that costs cycles on the hypervisor,” Bhardwaj said. “Frankly, the way that’s being delivered today, customers have to pay extra cost on the server.” With the HPE Aruba plugin that now works with HPE Morpheus, the microsegmentation capability can be enabled at the switch level. Bhardwaj said that by doing the microsegmentation in the switch and not the hypervisor, costs can be lowered and performance can be increased. The integration brings additional capabilities, including the ability to support VPN and network address translation (NAT) in an integrated way between the switch and the hypervisor. VMware isn’t the only hypervisor supported by HPE. The HPE Morpheus VM Essentials Hypervisor is another new element in the HPE cloud portfolio. The hypervisor is now being integrated into HPE’s private cloud offerings for both data center as well as edge deployments.

Read More »

AMD targets hosting providers with affordable EPYC 4005 processors

According to Pinkesh Kotecha, chairman and MD of Ishan Technologies, AMD’s 4th Gen EPYC processors stood out because they offer the right combination of high performance, energy efficiency, and security. “Their high core density and ability to optimize performance per watt made them ideal for managing data-intensive operations like real-time analytics and high-frequency transactions. Additionally, AMD’s strong AI roadmap and growing portfolio of AI-optimised solutions position them as a forward-looking partner, ready to support our customers’ evolving AI and data needs. This alignment made AMD a clear choice over alternatives,” Kotecha said. By integrating AMD EPYC processors, Ishan Technologies’ Ishan Cloud plans to empower enterprises across BFSI, ITeS, and manufacturing industries, as well as global capability centers and government organizations, to meet India’s data localization requirements and drive AI-led digital transformation. “The AMD EPYC 4005 series’ price-to-performance ratio makes it an attractive option for cloud hosting and web services, where cost-efficient, always-on performance is essential,” said Manish Rawat, analyst, TechInsights. Prabhu Ram, VP for the industry research group at CMR, said EPYC 4005 processors deliver a compelling mix of performance-per-watt, higher core counts, and modern I/O support, positioning them as a strong alternative to Intel’s Xeon E-2400 and 6300P, particularly for edge deployments. Shah of Counterpoint added, “While ARM-based Ampere Altra promises higher power efficiencies and is ideally adopted in more cloud and hyperscale data centers, though performance is something where x86-based Zen 5 architecture excels and nicely balances the efficiencies with lower TDPs, better software compatibilities supported by a more mature ecosystem.”

Read More »

Shell’s immersive cooling liquids are the first to receive official certification from Intel

Along with the certification, Intel is offering a Xeon processor single-phase immersion warranty rider. This indicates Intel’s confidence in the durability and effectiveness of Shell’s fluids. Yates explained that the rider augments Intel’s standard warranty terms and is available to data center operators deploying 4th and 5th generation Xeon processors in Shell immersion fluids. The rider is intended to provide data center operators confidence that their investment is guaranteed when deployed correctly. Shell’s fluids are available globally and can be employed in retrofitted existing infrastructure or used in new builds, cutting resource use while increasing performance. Data centers consume anywhere from 10 to 50 times more energy per square foot than traditional office buildings, and they are projected to drive more than 20% of the growth in electricity demand between now and 2030. Largely due to the explosion of AI, data center energy consumption is expected to more than double from 415 terawatt-hours in 2024 to around 945 TWh by 2030. There are several other technologies used for data center cooling, including air cooling, cold plate (direct-to-chip), and precision cooling (targeted to specific areas), but the use of immersion cooling has been growing, and is expected to account for 36% of data center thermal management revenue by 2028. With this method, servers and networking equipment are placed in cooling fluids that absorb and dissipate heat generated by the electronic equipment. These specialized fluids are thermally conductive but not electrically conductive (dielectric), thus making them safe for submerging electrical equipment.
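The consumption projection quoted above is easy to sanity-check; a minimal sketch (variable names are mine, the TWh figures are as quoted):

```python
# Projected data center energy consumption, per the figures quoted above.
twh_2024 = 415   # terawatt-hours consumed in 2024
twh_2030 = 945   # projected terawatt-hours by 2030

growth = twh_2030 / twh_2024
print(f"Projected growth, 2024 to 2030: {growth:.2f}x")  # 2.28x
```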

Read More »

Cisco joins AI infrastructure alliance

“The addition of Cisco reinforces AIP’s commitment to an open-architecture platform and fostering a broad ecosystem that supports a diverse range of partners on a non-exclusive basis, all working together to build a new kind of AI infrastructure,” the group said in a statement. Separately, Cisco announced AI initiatives centered in the Middle East region. Last week, Cisco CEO Chuck Robbins visited Saudi Arabia, UAE, Qatar, and Bahrain. This week, Jeetu Patel, executive vice president and chief product officer, is in Saudi Arabia, where he is participating in President Trump’s state visit to the region, according to Cisco. Related new projects include: An initiative with HUMAIN, Saudi Arabia’s new AI enterprise, to help build an open, scalable, resilient and cost-efficient AI infrastructure: “This landmark collaboration will set a new standard for how AI infrastructure is designed, secured and delivered – combining Cisco’s global expertise with the Kingdom’s bold AI ambitions. The multi-year initiative aims to position the country as a global leader in digital innovation,” Cisco stated. A collaboration with the UAE-based G42 to co-develop a secure AI portfolio and AI-native services: Cisco and G42 will work together to assess the potential to co-develop and jointly deploy AI-powered cybersecurity packages, as well as a reference architecture that integrates Cisco’s networking, security, and infrastructure solutions specifically designed for high-performance computing. This collaboration aims to help customers build and secure AI-ready data centers and develop AI workloads effectively, according to the companies. Interest in Qatar’s digital transformation: Qatar’s Ministry of Interior and Cisco signed a letter of intent to collaborate on Qatar’s digital transformation, AI, infrastructure development and cybersecurity.

Read More »

Cato Networks introduces AI-powered policy analysis engine

Cato Networks this week announced a new policy analysis engine for its cloud-based secure access service edge platform that the company says will optimize and improve SASE policies, reduce risk, simplify compliance, and reduce manual maintenance efforts. Cato Autonomous Policies is built into the Cato SASE Cloud Platform and can provide enterprises with AI-driven recommendations to eliminate security exposure, tighten access controls, and improve network performance. The first use case of the policy engine is designed for firewall as a service (FWaaS) environments in which “firewall rule bloat” is present, Cato explained in a statement. The bloat comes from organizations accumulating thousands of rules that were designed to protect the environment, but after becoming outdated or misconfigured, actually lead to increased risk. “Most enterprises rely on a mix of firewalls deployed in data centers, branch offices, and cloud environments. Over time, rule sets grow, become inconsistent, and are filled with redundant, outdated, or conflicting entries,” wrote Demetris Booth, product marketing director at Cato Networks, in a blog post on the product news. “As a result, security policies become hard to manage, even harder to audit, and often misaligned with zero-trust principles. AI-driven firewall policy management is necessary for modern enterprises to streamline and optimize security operations.”

Read More »

Riverbed bolsters network acceleration for AI’s performance bottlenecks

“Enterprises are worried about bad actors capturing encrypted traffic and saving copies for when quantum computing advances can break the encryption, providing the bad actors with free access to data. It’s a real concern,” Frey explains. “Post-quantum cryptography is a way to get ahead of that now.” Riverbed also introduced the SteelHead 90 series of network acceleration appliances, which the company says will provide resilient network performance to customers. The series includes: SteelHead 8090, which delivers up to 60 Gbps of data movement over a WAN. It supports multiple 100 Gigabit network interfaces to pull data from the LAN. SteelHead 6090, which delivers up to 20 Gbps of data movement over a WAN, targeted for mid-scale data centers. SteelHead 4090 and 2090, which support mid-sized data center and edge use cases, with 500 Mbps and 200 Mbps of accelerated traffic, as well as up to 10 Gbps of total traffic processing for quality of service (QoS) and application classification use cases. Riverbed SteelHead Virtual is a software-only version designed for virtualization environments and private cloud deployments; it is compatible with VMware ESXi, KVM, and Microsoft Hyper-V. “For customers that are familiar with Riverbed, this is a big change in performance. We’ve gone from moving one appliance at 30 Gbps to 60 Gbps. We want to make sure that whether it’s new AI projects or existing data projects, we have ubiquitous availability across clouds,” says Chalan Aras, senior vice president and general manager of Acceleration at Riverbed. “We’re making it less expensive to move data—we are about half the price of traditional data movement methods.” With this announcement, Riverbed also unveiled its Flex licensing subscription offering. According to Riverbed, Flex makes it possible for enterprises to transfer licenses from hardware to virtual to cloud devices at no cost. Enterprises can reassign

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.
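The capex figures above come from two different accounting windows, which explains the apparent mismatch between the Bloomberg Intelligence and Smith numbers; a small sketch (values as quoted, variable names mine) lays them side by side:

```python
# Capex figures quoted above, in billions of dollars.
combined_capex_2023 = 110    # Bloomberg Intelligence estimate, six companies, 2023
combined_capex_2025 = 200    # Bloomberg Intelligence estimate for 2025
msft_calendar_2025 = 62.4    # BI estimate of Microsoft AI capex, calendar 2025
msft_fiscal_2025 = 80.0      # Smith's figure, fiscal year ending June 30, 2025

print(f"Combined capex growth, 2023 to 2025: "
      f"{combined_capex_2025 / combined_capex_2023:.2f}x")
print(f"Fiscal- vs. calendar-year gap for Microsoft: "
      f"${msft_fiscal_2025 - msft_calendar_2025:.1f}B")
```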

Read More »

John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd). John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences their own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for businesses and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to
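The LLM-as-judge pattern mentioned above can be sketched in a few lines. This is a minimal, illustrative example only: the three judge functions are hypothetical stand-ins for real model API calls, and the scoring logic is invented for the sake of the sketch, not drawn from any vendor's implementation.

```python
# Sketch of the "LLM as a judge" ensemble pattern: several judge models score
# an agent's draft answer, and the answer is accepted only if the median score
# clears a threshold. The judges here are toy stand-ins, not real model calls.

from statistics import median

def judge_model_a(answer: str) -> float:
    # Stand-in for a judge model scoring an answer from 0.0 to 1.0.
    return 0.8 if "step" in answer else 0.4

def judge_model_b(answer: str) -> float:
    return 0.7 if len(answer) > 20 else 0.3

def judge_model_c(answer: str) -> float:
    return 0.9 if answer.endswith(".") else 0.5

JUDGES = [judge_model_a, judge_model_b, judge_model_c]

def ensemble_accepts(answer: str, threshold: float = 0.6) -> bool:
    """Accept the answer only if the median judge score meets the threshold."""
    scores = [judge(answer) for judge in JUDGES]
    return median(scores) >= threshold

# A rejected answer would typically be sent back to the agent for another attempt.
draft = "First, plan the steps; then execute each step and verify the result."
accepted = ensemble_accepts(draft)
```

Using the median (rather than requiring every judge to agree) keeps one outlier judge from vetoing an otherwise good answer, which matters as models get cheap enough to run three or more in parallel.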

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement learning and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), all of which had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »