
Calumet reports delay in government loan for Montana renewable fuels plant


Update: The White House budget office on Wednesday, Jan. 29, rescinded an order freezing federal grants. 

Calumet Inc. subsidiary Montana Renewables LLC (MRL) has confirmed that the first tranche of a loan previously approved by the US Department of Energy (DOE) for expansion of a renewable fuels manufacturing plant in Great Falls, Mont., will undergo a "tactical delay."

Finalized as of Jan. 10, the $782-million first tranche of the two-tranche loan guarantee was expected to fund eligible expenses MRL previously incurred; upon that payment, MRL was expected to simultaneously make a $150-million equity investment with funds the operator currently holds, Calumet said earlier this month (OGJ Online, Jan. 13, 2025).

Calumet said Jan. 28 that it had been informed by the DOE Loan Programs Office that the initial funding will be delayed "to confirm alignment with White House priorities." Calumet said it was told the delay "should be days or weeks."

Questions about Executive Order alignment 

In executive orders issued Jan. 20, President Donald Trump moved to pause federal loans and grants while the new administration conducts reviews to assess alignment with its policies, as applicable by law. A memorandum from the Office of Management and Budget said the freeze pertains to climate-change and green-energy provisions, but confusion remains.

Late Tuesday, a federal judge temporarily blocked the move for existing programs until Monday afternoon, just minutes before the funding freeze was to go into effect, the Associated Press reported. Judge Loren AliKhan granted an administrative stay in the case and set a hearing for additional arguments.

In its update regarding the timing of the DOE loan, Todd Borgmann, Calumet’s chief executive officer, said “we are well aligned with White House priorities to support domestic agriculture, energy security, technical innovation and energy independence, all of which play a role in the tremendous bipartisan support this loan has received since its inception.” 

“The recently issued Executive Order specifically highlights the importance of biofuels to our nation’s domestic energy policy and energy independence, and we look forward to a quick review confirming that we are aligned with the Administration’s goals.”

According to Calumet and MRL, the MaxSAF expansion is expected to boost the Great Falls biorefinery's sustainable aviation fuel (SAF) production by more than 700% to about 300 million gal/year, bringing the site's combined SAF-renewable diesel production capacity to 330 million gal/year (OGJ Online, July 10, 2023).
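A quick back-of-envelope check on those capacity figures; the pre-expansion baseline is implied by the stated percentage, not given in the article:

```python
# Back-of-envelope check on the MaxSAF expansion figures (all values
# in million gallons per year; percentages as stated in the article).
target_saf = 300.0          # post-expansion SAF capacity
combined_total = 330.0      # combined SAF + renewable diesel capacity
min_increase_pct = 700.0    # "more than 700%" increase

# A 700% increase means the new capacity is 8x the old, so the implied
# pre-expansion SAF capacity is at most:
implied_baseline_max = target_saf / (1 + min_increase_pct / 100)

# Renewable diesel accounts for the remainder of the combined figure:
renewable_diesel = combined_total - target_saf

print(f"implied pre-expansion SAF capacity: <= {implied_baseline_max:.1f}")
print(f"implied renewable diesel capacity: {renewable_diesel:.1f}")
```

The "more than 700%" wording means the true baseline is somewhat below the 37.5 million gal/year upper bound this yields.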


Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.


Quantum networking advances on Earth and in space

“Currently, the U.S. government is not investing in such testbeds or demonstrations, ensuring it will be a follower and not a leader in the development of technical advances in the field,” said a report released last year by the Quantum Economic Development Consortium, a global association of more than 250


Who wins in networking in 2025 and beyond?

In this third team, we also find vendors most interested and active in the IoT opportunity, two of which are usually not considered enterprise vendors at all: Ericsson and Nokia. Both these companies are actively pursuing new IoT strategies, and while they are surely hoping that broad public-sensor IoT will


Cisco researchers highlight emerging threats to AI models

Cisco security researchers this week detailed a number of threats they are seeing from bad actors trying to infect or attack AI’s most common component – the large language model. Some techniques used to hide messages or attacks from anti-spam systems are familiar to security specialists: “Hiding the nature of


Macquarie Strategists Forecast USA Crude Inventory Build

In an oil and gas report sent to Rigzone by the Macquarie team late Monday, Macquarie strategists revealed that they are forecasting that U.S. crude inventories will be up by 10.8 million barrels for the week ending January 31. “This compares to our early look for the week which anticipated an 11.7 million barrel build, and a 3.5 million barrel build realized for the week ending January 24,” the strategists said in the report. “On the product side of the ledger, in aggregate, our expectations are slightly looser than our early view,” they added. In the report, the Macquarie strategists noted that, “for this week’s crude balance, from refineries”, they “model crude runs down modestly (-0.3 million barrels per day)”. “Among net imports, we model a slight decrease, with exports slightly higher (+0.1 million barrels per day) and imports effectively unchanged on a nominal basis,” they added. The strategists stated in the report that timing of cargoes remains a source of potential volatility in this week’s crude balance. “From implied domestic supply (prod. +adj. +transfers), we look for a rebound (+0.8 million barrels per day) following a soft print last week amidst potential weather impacts,” the strategists said in the report. “Rounding out the picture, we anticipate another small increase in SPR [Strategic Petroleum Reserve] inventory (+0.3 million barrels) on the week,” they added. The Macquarie strategists also noted in the report that, “among products”, they “look for builds in gasoline (+0.5 million barrels) and jet (+0.5 million barrels), with a draw in distillate (-3.8 million barrels)”. “We model implied demand for these three products at ~13.8 million barrels per day for the week ending January 31,” the Macquarie strategists went on to state. In an oil and gas report sent to Rigzone by the Macquarie team late Thursday, Macquarie
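For readers unfamiliar with the arithmetic behind a forecast like this, it rests on the standard weekly US crude stock-balance identity (EIA convention). The input levels below are hypothetical placeholders for illustration, not Macquarie's actual model inputs:

```python
# Sketch of the weekly US crude stock-balance identity underlying
# inventory forecasts like the one above (EIA convention).
def weekly_crude_build(domestic_supply, net_imports, refinery_runs, days=7.0):
    """All rates in million barrels per day; returns million barrels.

    domestic_supply corresponds to the 'implied domestic supply
    (prod. + adj. + transfers)' line in the report.
    """
    return (domestic_supply + net_imports - refinery_runs) * days

# Hypothetical example: supply 13.5, net imports 2.0, runs 15.0 mmbbl/d
build = weekly_crude_build(13.5, 2.0, 15.0)
print(f"implied stock change: {build:+.1f} million barrels")
```

This is why the strategists flag cargo timing as a volatility source: a single tanker shifting across the weekly cutoff moves net imports, and the identity multiplies that daily rate by seven.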


California PUC streamlines transmission permitting process

The California Public Utilities Commission on Thursday streamlined its transmission permitting requirements as part of a broader effort to meet the state’s clean energy goals. Among the changes, the PUC now requires project developers to meet with commission staff before filing a permit application to help the review process run more smoothly, according to the agency. The revised rules also give transmission developers a chance to submit a draft version of California Environmental Quality Act documents instead of an environmental assessment to help speed permit reviews, according to a proposed decision approved by the commission on Thursday. Also, the PUC will assume a project is needed if the California Independent System Operator has already determined that it is needed, a change that will avoid duplicative analysis, according to the commission. The decision adopts a pilot to study ways to streamline the PUC’s environmental review process. “These changes will accelerate permitting timelines by reducing redundancy and shifting environmental analysis earlier in the application process,” PUC Commissioner Karen Douglas, who led the proceeding, said in a press release. The revised permitting rules will help the PUC address the scale of grid upgrades that will be needed to maintain a reliable grid and meet California’s climate goals, Douglas said. California will need about $46 billion to $63 billion in new transmission over the next 20 years, according to a “transmission outlook” report CAISO released July 31. The investments include up to $12 billion for upgrades to its system, up to $15.2 billion for transmission to import wind from other states and up to $37 billion for offshore wind. California law requires that all the state’s electricity comes from renewable energy and zero-carbon sources by 2045. The PUC’s decision completes a two-phase process to implement SB 529, a bill signed into law in 2022 that


Aberdeen’s Interocean wins work on BP wind farms

Aberdeen-based Interocean has been awarded a contract to carry out marine vetting and assurance work on BP's Morgan, Mona and Morven offshore wind farms. The three-year deal, signed for an undisclosed sum, will see the marine solutions provider assist the BP (LON: BP) and Energie Baden-Württemberg AG (EnBW) projects as they look to deliver 5.9GW of offshore wind capacity. The firm has said that its existing team will cover the work; it is nonetheless on track to achieve 20% global growth between July 2024 and July this year. Interocean chief operations officer Alex Reid said that the firm looks forward to working on assets "of such national importance". He added: "As a UK headquartered company, we take immense pride in supporting initiatives that have the potential to enhance energy security and benefit local communities." Morgan (1.5GW) and Mona (1.5GW) are located in the Irish Sea, around 13 to 23 miles from shore, and cover a combined area of approximately 360 miles². The giant 2.9GW Morven is based in the North Sea and will stand roughly 37 miles off the coast of Aberdeen. This project was approved as part of the ScotWind leasing round, which backed 17 projects with a combined capacity of 25 gigawatts in 2022. The supermajor's website claims that Morven will "unlock £10 billion for offshore wind development and the skills and opportunities to support Scotland's energy transition". All three of the developments are fixed-bottom. They are expected to generate enough electricity to power the equivalent of six million UK households every year. BP's stakes in the projects are set to be rolled up into a new entity. Late last year BP announced that it would be carving out its offshore wind assets into a joint venture project with Japan's largest power generation company. The new


Australian trio progress UK North Sea projects despite ‘challenging environment’

A trio of Australian oil and gas firms are pressing ahead with development work on UK North Sea projects despite a "challenging environment". Hartshead Resources (ASX:HHR), Finder Energy (ASX:FDR) and Triangle Energy (ASX:TEG) all hold UK production licences, with the three firms all seeing success in the 33rd oil and gas licensing round. It comes as North Sea operators come to terms with changes to the windfall tax and potential limits on future drilling under the UK Labour government. Perth-headquartered Hartshead is developing the Anning and Somerville gas project in the Southern North Sea (SNS), alongside the six licences it secured in the 33rd round. Hartshead Resources chief executive Chris Lewis said the company remains "fully focused" on progressing its Anning and Somerville gas field project. "We are pleased that certainty has been provided in relation to the fiscal policy for the oil & gas industry in the UK, and that the government has seen the vital importance of retaining first year capital expensing and full capital allowances," Lewis said. Lewis said Hartshead is consulting with the North Sea Transition Authority (NSTA) regulator and its joint venture partner Viaro Energy on advancing the project through the field development stage and to a final investment decision. Hartshead has also undertaken a "reassessment of the gas export route" for Anning and Somerville, the company said. "This work included a revised evaluation of an option that had previously been considered unavailable due to the circumstances at the time," the company said in its quarterly statement. "Recent changes in asset ownership and operatorship have seen this alternate route become promising as an alternate route to transport the Anning and Somerville gas to market." Last year, Viaro Energy a deal


Trump Signs Executive Action to Create Sovereign Wealth Fund

President Donald Trump signed an executive action he said would direct officials to create a sovereign wealth fund for the US, following through on an idea he floated during the presidential campaign. “We have tremendous potential,” Trump told reporters in the Oval Office on Monday as he announced the move. The president said the action would charge Treasury Secretary Scott Bessent and Howard Lutnick, the nominee for Commerce secretary, with spearheading the effort. Bessent, who joined Trump at the Oval Office, said the fund would be created in the next 12 months, calling it an issue “of great strategic importance.” Trump suggested the fund could be used to facilitate the sale of TikTok, which is currently operating in the US thanks to an extension he signed prolonging the deadline for a forced sale or shutdown.  Lutnick said the US government could leverage its size and scale given the business it does with companies, citing drug makers as an example.  “If we are going to buy two billion Covid vaccines, maybe we should have some warrants and some equity in these companies,” he said. The action calls for officials to submit a plan to Trump within 90 days, including recommendations for funding, investment strategies, fund structure and governance. And it asks for an evaluation of the legal considerations for setting up and running a fund, including whether legislation is required. Trump advisers have previously discussed plans to use the US International Development Finance Corp. to partner with major institutional players to leverage US economic powers. Among those driving the conversation about using the DFC both more like a sovereign fund and as a tool to radically change America’s approach to foreign aid are Elon Musk and Stephen Feinberg, the billionaire co-founder of Cerberus Capital Management, who Trump has nominated as deputy


Boston expands net-zero emissions requirements to new buildings, large additions

The Boston Zoning Commission has approved an amendment to the city’s zoning code that requires most large new buildings to have net-zero carbon emissions from the day they open, according to a release Thursday from Boston Mayor Michelle Wu.  The amendment sets a net-zero emissions standard for new project filings after July 1, 2025, that contain 15 units or more, are a minimum of 20,000 square feet, or add 50,000 square feet or more to existing buildings.  Buildings are the source of nearly 71% of the community’s carbon emissions, according to the release. By adjusting the operational performance requirements of new buildings, Boston can minimize its emissions and impact on the climate, the city said. Buildings can meet the new requirements by methods including reducing their energy use and reliance on fossil fuels and increasing their electrification and renewable energy sources, it said.  Projects must demonstrate compliance with this standard during an existing development review process. Both small and large projects, as defined in that review process, will need to report on embodied carbon; large projects will also be required to conduct an embodied carbon life cycle assessment analysis.  These amendments “build upon and better reflect” updated state building codes and city ordinances, including the Specialized Stretch Energy Code and the Buildings Emissions Reduction and Disclosure Ordinance, or BERDO, the city says. Boston is one of at least 13 U.S. cities that had a building performance standard, or laws that require building owners to meet energy and emissions performance thresholds, in place by early 2024, according to a July report by JLL.  BERDO, which enters its first compliance period this year, initially covers nonresidential buildings spanning 20,000 square feet or more, residential buildings with 15 or more units and any tax parcel with multiple buildings that collectively meet these thresholds,
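The applicability thresholds described above can be expressed as a simple predicate. This is a sketch based on the figures reported in this article, not on the zoning text itself:

```python
from datetime import date

def net_zero_standard_applies(filing_date, units, sq_ft, added_sq_ft=0):
    """Rough predicate for Boston's net-zero standard per this article:
    new project filings after July 1, 2025 that contain 15+ units, are at
    least 20,000 sq ft, or add 50,000+ sq ft to an existing building."""
    if filing_date <= date(2025, 7, 1):
        return False
    return units >= 15 or sq_ft >= 20_000 or added_sq_ft >= 50_000

# A 30-unit residential filing in August 2025 would be covered:
print(net_zero_standard_applies(date(2025, 8, 1), units=30, sq_ft=18_000))
```

Note the thresholds are disjunctive: a large commercial building with no residential units is still covered once it crosses the 20,000 sq ft floor.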


Linux containers in 2025 and beyond

The upcoming years will also bring an increase in the use of standard container practices, such as the Open Container Initiative (OCI) standard, container registries, signing, testing, and GitOps workflows used for application development to build Linux systems. We're also likely to see a significant rise in the use of bootable containers, which are self-contained images that can boot directly into an operating system or application environment. Cloud platforms are often the primary platform for AI experimentation and container development because of their scalability and flexibility, along with the integration of both AI and ML services. They're driving many significant changes in the way we process data. With data centers worldwide, cloud platforms also ensure low-latency access and regional compliance for AI applications. As we move ahead, development teams will be able to collaborate more easily through shared development environments and efficient data storage.


Let’s Go Build Some Data Centers: PowerHouse Drives Hyperscale and AI Infrastructure Across North America

PowerHouse Data Centers, a leading developer and builder of next-generation hyperscale data centers and a division of American Real Estate Partners (AREP), is making significant strides in expanding its footprint across North America, initiating several key projects and partnerships as 2025 begins.  The new developments underscore the company’s commitment to advancing digital infrastructure to meet the growing demands of hyperscale and AI-driven applications. Let’s take a closer look at some of PowerHouse Data Centers’ most recent announcements. Quantum Connect: Bridging the AI Infrastructure Gap in Ashburn On January 17, PowerHouse Data Centers announced a collaboration with Quantum Connect to develop Ashburn’s first fiber hub specifically designed for AI and high-density workloads. This facility is set to provide 20 MW of critical power, with initial availability slated for late 2026.  Strategically located in Northern Virginia’s Data Center Alley, Quantum Connect aims to offer scalable, high-density colocation solutions, featuring rack densities of up to 30kW to support modern workloads such as AI inference, edge caching, and regional compute integration. Quantum Connect said it currently has 1-3 MW private suites available for businesses seeking high-performance infrastructure that bridges the gap between retail colocation and hyperscale facilities. “Quantum Connect redefines what Ashburn’s data center market can deliver for businesses caught in the middle—those too large for retail colocation yet underserved by hyperscale environments,” said Matt Monaco, Senior Vice President at PowerHouse Data Centers. “We’re providing high-performance solutions for tenants with demanding needs but without hyperscale budgets.” Anchored by 130 miles of private conduit and 2,500 fiber pathways, Quantum Connect’s infrastructure offers tenants direct, short-hop connections to adjacent facilities and carrier networks.  
With 14 campus entrances and secure, concrete-encased duct banks, the partners said the new facility minimizes downtime risks and reduces operational costs by eliminating the need for new optics or extended fiber runs.


Blue Owl Swoops In As Major Backer of New, High-Profile, Sustainable U.S. Data Center Construction

With the global demand for data centers continuing to surge, fueled by the proliferation of artificial intelligence (AI), cloud computing, and digital services, it is unsurprising that we are seeing aggressive investment strategies beyond those of the existing hyperscalers. One of the dynamic players in this market is Blue Owl Capital, a leading asset management firm that has made significant strides in the data center sector. Back in October 2024 we reported on its acquisition of IPI Partners, a digital infrastructure fund manager, for approximately $1 billion. The acquisition added over $11 billion to the assets Blue Owl manages and focused specifically on digital infrastructure initiatives. It was completed as of January 5, 2025, and IPI's managing partner, Matt A'Hearn, has been appointed head of Blue Owl's digital infrastructure strategy. A Key Player in Digital Infrastructure and Data Centers: With multi-billion-dollar joint ventures and financing initiatives, Blue Owl is positioning itself as a key player in the digital infrastructure space. The company's investments in data centers, the implications of its strategic moves, and the broader impact on the AI and digital economy all underscore how important data center investment has become to the economy overall. With the rapid growth of the data center industry, it is unsurprising that aggressive investment fund managers see it as an opportunity. Analysts continue to emphasize that the global data center market is expected to grow at a compound annual growth rate (CAGR) of 10.2% from 2023 to 2030, reaching $517.17 billion by the end of the decade. In this rapidly evolving landscape, Blue Owl Capital has emerged as a significant contributor. The firm's investments in data centers are not just about capitalizing on current trends but also about shaping the future of digital infrastructure. Spreading the Wealth: In August 2024, Blue Owl
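As a quick arithmetic check on the quoted projection; the 2023 base value below is back-calculated from the stated CAGR and endpoint, not given in the article:

```python
# Back-calculating the 2023 market size implied by the quoted forecast:
# a 10.2% CAGR from 2023 to 2030 reaching $517.17 billion.
cagr = 0.102
years = 2030 - 2023          # 7 compounding periods
end_value = 517.17           # $ billion, end-of-decade figure

implied_2023_base = end_value / (1 + cagr) ** years
print(f"implied 2023 market size: ${implied_2023_base:.0f} billion")
```

The implied starting point is roughly $262 billion, which gives a sense of the scale of capital the forecast assumes the sector will roughly double over seven years.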


Global Data Center Operator Telehouse Launches Liquid Cooling Lab in the UK to Meet Ongoing AI and HPC Demand

Starting in early 2025, Telehouse International Corporation of Europe will offer an advanced liquid cooling lab at its newest data center, Telehouse South at the London Docklands campus in Blackwall Yard. Telehouse has partnered with four leading liquid-cooling technology vendors (Accelsius, JetCool, Legrand, and EkkoSense) to allow customers to explore different cooling technologies and management tools while evaluating their suitability for use in customer applications. Dr. Stu Redshaw, Chief Technology and Innovation Officer at EkkoSense, said about the project: "Given that it's not possible to run completely liquid-cooled data centers, the reality for most data center operators is that liquid cooling and air cooling will both have an important role to play in the cooling mix, most likely as part of an evolving hybrid cooling approach. However, key engineering questions need answering before simply deploying liquid cooling, including establishing the exact blend of air and liquid cooling technologies you'll need, and recognizing the complexity of managing the operation of a hybrid air cooling and liquid cooling approach within the same room. This increases the


Flexential Partners with Lonestar to Support First Lunar Data Center

Flexential, a leading provider of secure and flexible data center solutions, this month announced that it has joined forces with Lonestar Data Holdings Inc. to support the upcoming launch of Freedom, Lonestar’s second lunar data center. Scheduled to launch aboard a SpaceX Falcon 9 rocket via Intuitive Machines, this mission is a critical step toward establishing a permanent data center on the Moon. Ground-Based Support for Lunar Data Storage Flexential’s Tampa data center will serve as the mission control platform for Lonestar’s lunar operations, providing colocation, interconnection, and professional services. The facility was chosen for its proximity to Florida’s Space Coast launch operations and its ability to deliver low-latency connectivity for critical functions. Flexential operates two data centers in Tampa and four in Florida as part of its FlexAnywhere® Platform, comprising more than 40 facilities across the U.S. “Flexential’s partnership with Lonestar represents our commitment to advancing data center capabilities beyond conventional boundaries,” said Jason Carolan, Chief Innovation Officer at Flexential. “By supporting Lonestar’s space-based data center initiative, we are helping to create new possibilities for data storage and disaster recovery. This project demonstrates how innovative data center expertise can help organizations prepare for a resilient future with off-world storage solutions.” A New Era of Space-Based Resiliency The growing demand for data center capacity, with U.S. power consumption expected to double from 17 GW in 2022 to 35 GW by 2030 (according to McKinsey & Company), is driving interest in space-based solutions. Storing data off-planet reduces reliance on terrestrial resources while enhancing security against natural disasters, warfare, and cyber threats. 
The Freedom data center will provide resiliency, disaster recovery, and edge processing services for government and enterprise customers requiring the highest levels of data protection. The solar-powered data center leverages Solid-State Drives (SSDs) and a Field Programmable Gate Array (FPGA) edge


Why DeepSeek Is Great for AI and HPC and Maybe No Big Deal for Data Centers

In the rapid and ever-evolving landscape of artificial intelligence (AI) and high-performance computing (HPC), the emergence of DeepSeek's R1 model has sent ripples across industries. DeepSeek has been the data center industry's topic of the week, for sure. The Chinese AI app surged to the top of US app store leaderboards last weekend, sparking a global selloff in technology shares Monday morning. But while some analysts predict a transformative impact within the industry, a closer examination suggests that, for data centers at large, the furor over DeepSeek might ultimately be much ado about nothing. DeepSeek's Breakthrough in AI and HPC: DeepSeek, a Chinese AI startup, this month unveiled its R1 model, claiming performance on par with, or even surpassing, leading models like OpenAI's GPT-4 and Anthropic's Claude 3.5 Sonnet. Remarkably, DeepSeek developed this model at a fraction of the cost typically associated with such advancements, utilizing a cluster of 256 server nodes equipped with 2,048 GPUs. This efficiency has been attributed to innovative techniques and optimized resource utilization. AI researchers have been abuzz about the performance of the DeepSeek chatbot, which produces results similar to ChatGPT but is based on open-source models and reportedly trained on older GPU chips. Some researchers are skeptical of claims about DeepSeek's development costs and means, but its performance appears to challenge common assumptions about the computing cost of developing AI applications. Market Reactions and Data Center Implications: The announcement of DeepSeek's R1 model led to significant market reactions, with notable declines in tech stocks, including a substantial drop in Nvidia's valuation.
This downturn was driven by concerns that more efficient AI models could reduce the demand for high-end hardware and, by extension, the expansive data centers that house them. For now, investors are re-assessing the


Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn't the only one ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft's capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith's claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are far higher than Microsoft's 2020 capital expenditure of "just" $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.


John Deere unveils more autonomous farm machines to address skilled labor shortage

Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it's been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren't enough skilled farm laborers to do the work that its customers need. It's been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually, and the agricultural workforce continues to shrink. (This is my hint to the anti-immigration crowd.) John Deere's autonomous 9RX Tractor. Farmers can oversee it using an app. While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can't find labor to fill open positions, he said. "They have to figure out how to do


2025 playbook for enterprise AI success, from agents to evals

2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation. AI agents are no longer theoretical. In 2025, they're indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. "Let me put it this way," said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies and recently reviewed the 48 agents it built last year. "Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better." Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they're also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we'll cover below), companies can use three or more models to
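The multi-model LLM-as-judge idea mentioned in the excerpt can be sketched as plain control flow. The "judges" below are trivial scoring functions standing in for real model calls, and every name in the example is hypothetical:

```python
from collections import Counter

def judge(candidate_answers, judges):
    """LLM-as-judge sketch: several cheap 'judge' models each vote for
    the candidate answer they score highest; the majority choice wins.
    Each judge is modeled as a plain scoring function standing in for
    a real model API call."""
    votes = Counter()
    for score_fn in judges:
        best = max(candidate_answers, key=score_fn)
        votes[best] += 1
    winner, _ = votes.most_common(1)[0]
    return winner

# Stand-in judges: prefer longer answers, prefer explanations, prefer brevity.
judges = [len, lambda a: "because" in a, lambda a: -len(a)]
answers = ["42", "42, because 6 * 7 = 42", "6 * 7"]
print(judge(answers, judges))
```

Using an odd number of judges (three or more, as the excerpt suggests) avoids ties, and cheaper models make running several judges per answer economical.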


OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams' advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, "OpenAI's Approach to External Red Teaming for AI Models and Systems," reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, "Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning," OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends. It's encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and the U.S. National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI's paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models' security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn't find.
What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »