
Starmer pledges £200m for Grangemouth site from National Wealth Fund


UK Prime Minister Sir Keir Starmer has announced a further £200 million investment in the Grangemouth industrial site, where Scotland’s only oil refinery is expected to close later this year.

Refinery owner Petroineos announced the closure of the facility last year amid plans to transition the site into a fuel import terminal, leading to the loss of more than 400 jobs.

Sir Keir announced the funding, which will come from the newly created National Wealth Fund, at the Scottish Labour conference in Glasgow on Sunday.

Addressing the conference, Sir Keir said the Grangemouth site presented a “huge opportunity for renewal”, with potential for future uses in biofuels or hydrogen energy.

“We will grasp the opportunities at Grangemouth, work alongside partners to develop viable proposals, team up with business to get new industries off the ground,” he said.

Prime Minister Sir Keir Starmer delivers his keynote speech during the Labour Party Conference, at the ACC Liverpool in September 2024. Image: Peter Byrne/PA Wire

“And to attract private investors into the partnership we need, we will allocate £200m from the National Wealth Fund, for investment in Grangemouth, investment in Scotland’s industrial future.”

The Prime Minister said every worker made redundant at the site would get 18 months’ full pay, alongside a £10m skills and training programme.

Businesses that take on Grangemouth workers will also be in line for national insurance relief, Sir Keir said.

It comes after Scotland’s First Minister John Swinney separately announced £25m in extra funding for Grangemouth last week.

The UK and Scottish governments had previously committed £100m to the Falkirk and Grangemouth Growth Deal to support jobs and skills in the area along with a report – called Project Willow – to look at the future of the site.

Grangemouth future

Speaking to BBC Radio Scotland’s Good Morning Scotland programme on Monday, Scottish Labour leader Anas Sarwar said work must proceed “at pace” to attract investors.

Sarwar said the £200m funding is “not to be sniffed at”, and could lead to investment in areas such as hydrogen, synthetic fuels and sustainable aviation fuel (SAF) production.

“It’s £200m of National Wealth Fund money, which, as I say, is meant to attract multipliers in terms of investment from private sector money,” he said.

Scottish Labour leader Anas Sarwar. Image: Andrew Milligan/PA Wire

“But I think it all depends on the projects that come forward. And I think if we can make, as I say, one of the three examples I’ve mentioned, I think there’s four or five other things that Project Willow is looking at.

“All of them I think would secure a positive future for the site of Grangemouth. But what we require now is to move at pace and attract that investment.”

‘Huge investment opportunity’, Sarwar says

Sarwar said the Grangemouth site is a “huge investment opportunity” due to its strategic location, grid connections and existing port infrastructure.

But he criticised the “intransigence” from the site’s owner Petroineos, a joint venture between PetroChina and Ineos, when asked whether Labour could prevent job losses.

In response, Petroineos regional head of legal and external affairs Iain Hardie told the PA News Agency that the company warned politicians and officials “repeatedly” of the looming closure “for several years”.

“This included detailed briefings to Mr Sarwar and members of his team,” Hardie said.

“It will always be a matter of regret that neither Scottish nor UK governments were moved to action until last summer, but what did happen is that Petroineos conceived Project Willow, to identify opportunities for low-carbon manufacturing that could secure a bright future for Grangemouth.

“The financial commitments made by the current governments at Holyrood and Westminster this past week are hugely encouraging but that money can only be deployed to good effect if complemented by policy and regulatory interventions that will make these technologies investable.”

Government has ‘finally listened’, union says

Trade unions have criticised the site’s closure and accused both the Scottish and UK governments of not doing enough to ensure a just transition for workers.

But following the announcement, the Unite union welcomed the funding for Grangemouth, although it cautioned that the “devil will be in the detail”.

Unite general secretary Sharon Graham said: “This is welcome news after months of our campaign and supported by the community of Grangemouth, Keir Starmer and the UK government have finally listened. This needs to be the start not the end in delivering a real workers’ transition for Grangemouth.

Unite general secretary Sharon Graham speaks at a demonstration against Petroineos plans to close the Grangemouth oil refinery, during the Scottish Labour Party conference at the Scottish Exhibition Centre (SEC) in Glasgow. Image: Andrew Milligan/PA Wire

“Following this announcement, it is essential that all stakeholders come together to put the meat on the bones and that this investment counts for jobs and our security. Clear timescales will be important as well as details on jobs.”

Unite has previously outlined a plan to produce sustainable aviation fuel, and the union says the Grangemouth refinery is “essential to the success of the Scottish economy”.

SNP First Minister John Swinney also welcomed the funding announcement, calling it a “step in the right direction”.

“Everyone working at Grangemouth’s refinery is a valued employee with skills that are key to Scotland’s net zero future,” he said.

“We will continue to work constructively with the UK government to secure the site’s future.”
