ISACA Barcelona president warns of quantum illiteracy

Gallego says the challenge is not theoretical but practical, adding, “We are already seeing clear warning signs.” He warns of “the so-called ‘harvest now, decrypt later’ attacks, which consist of intercepting encrypted data today to decrypt it in the future with quantum technology. This is not science fiction, but a concrete threat that requires immediate responses.”

Prepare now for Q-day

European organizations are not prepared to deal with this type of threat, Gallego says: “Only 4% of European organizations have a formal quantum threat strategy. And just 2% of the professionals surveyed feel genuinely familiar with these technologies. That gap between risk awareness and actual action is worrying. Preparedness cannot be optional; it must be a strategic priority.”

Gallego believes that “there is a significant lack of quantum literacy.” In his opinion, quantum computing breaks technological molds and forces us to rethink how we manage privacy, identity, and data integrity. (See also: 9 steps to take to prepare for a quantum future)

“Some organizations believe this is a problem of the future, but the truth is that preparation must start today. Because when ‘Q-day’ comes—that moment when a quantum computer is able to break today’s encryption—it will be too late to react. What we don’t encrypt securely today will be vulnerable tomorrow,” he continues. “There are already standards developed by organizations such as NIST, and it is essential to start integrating them.”
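To make that concrete, the sketch below shows what integrating one of those NIST-standardized algorithms, ML-KEM (the module-lattice key encapsulation mechanism standardized in FIPS 203, formerly Kyber), can look like in code. It is a minimal illustration that assumes the open-source liboqs-python bindings (the oqs module) are installed; algorithm names and API details depend on the installed liboqs version, and a production rollout would typically pair it with a classical key exchange in hybrid mode.

```python
# Minimal sketch of post-quantum key establishment with ML-KEM, assuming the
# open-source liboqs-python bindings are installed ("pip install liboqs-python").
import oqs

ALG = "ML-KEM-768"  # assumption: name used by recent liboqs builds; older builds expose "Kyber768"

# Receiver: generate a post-quantum key pair and share the public key.
with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()

    # Sender: encapsulate a fresh shared secret against the receiver's public key.
    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, sender_secret = sender.encap_secret(public_key)

    # Receiver: decapsulate the ciphertext to recover the same shared secret.
    receiver_secret = receiver.decap_secret(ciphertext)

    assert sender_secret == receiver_secret
    # The shared secret can now key a symmetric cipher such as AES-256-GCM,
    # ideally combined with a classical exchange (e.g. X25519) in hybrid mode.
```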

Therefore, the first thing organizations have to do is train their professionals in quantum fundamentals, in the new encryption algorithms, and in how to adapt their infrastructures to this new paradigm, he says. The second is to identify which sensitive data is protected with algorithms that may become obsolete in a short time. And the third, according to Gallego, is to start the transition to post-quantum cryptography as soon as possible.
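That second step, the cryptographic inventory, can start with something as simple as flagging certificates whose public keys rely on algorithms a large quantum computer could break. Below is a minimal, illustrative sketch using Python's widely used cryptography package; the directory layout and the wording of the report are hypothetical.

```python
# Illustrative sketch: flag X.509 certificates whose public keys use
# quantum-vulnerable algorithms (RSA, DSA, classical elliptic curves).
# Requires the "cryptography" package; paths and report text are hypothetical.
import sys
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import dsa, ec, rsa

QUANTUM_VULNERABLE = (rsa.RSAPublicKey, dsa.DSAPublicKey, ec.EllipticCurvePublicKey)

def audit_certificate(path: Path) -> str:
    cert = x509.load_pem_x509_certificate(path.read_bytes())
    key = cert.public_key()
    label = type(key).__name__
    if isinstance(key, QUANTUM_VULNERABLE):
        return f"{path}: {label} -- exposed to 'harvest now, decrypt later'"
    return f"{path}: {label} -- review against NIST post-quantum guidance"

if __name__ == "__main__":
    cert_dir = Path(sys.argv[1])  # e.g. a directory of PEM-encoded certificates
    for pem in sorted(cert_dir.glob("**/*.pem")):
        print(audit_certificate(pem))
```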

“Finally, I strongly believe in public-private collaboration. Real innovation happens when the state, business, and academia are rowing in the same direction. The Spanish Government’s Quantum Strategy, with more than 800 million euros of investment, is a big step in that direction,” he adds. “The ‘Q-day’ will come, we don’t know if in five, 10, or 15 years, but it is an inevitable horizon. We cannot afford a scenario in which all of our confidential information is massively exposed. Encryption-based security is non-negotiable.”

Explore More Insights

Stay ahead with more perspectives on cutting-edge power, infrastructure, energy, bitcoin, and AI solutions. Explore these articles to uncover strategies and insights shaping the future of industries.

Cisco automates AI-driven security across enterprise networks

“The result is automated detection and response for the most common attacks,” Shipley wrote in a blog post about the new XDR capabilities. “Machine learning, machine reasoning, and LLMs combine to trigger multiple AI agents acting on different parts of the investigation lifecycle. Each investigation has a clear verdict. This

Read More »

8 network observability certifications to boost IT operations skills

The traditional boundaries of network management are dissolving. As enterprises lean deeper into hybrid cloud, edge computing, and AI-powered automation, it is no longer enough to monitor uptime or ping a failing router. Network operations teams need end-to-end visibility across dynamic, distributed environments—on-premises and in the cloud. Network observability platforms

Read More »

Shropshire Council approves 10 MW solar farm

Shropshire Council has granted planning permission to a major 10 MW solar farm and battery project. G Power Solar is developing the 24.31 hectare project, located at the Beeches in Alkington near Whitchurch. The battery facility would have capacity to store up to 12.5 MWh of energy. According to a development management report, the project could generate enough energy to power up to 3,000 homes in Shropshire, or roughly 2.1% of the region’s houses. According to the project’s application, the proposed development could save 2,150 tonnes of CO2 per annum – equivalent to at least 86,000 tonnes over the project’s lifetime of around 40 years. While the grid connection location has yet to be confirmed, the application states that there is a potential point of connection located 1 km to the east. The application site lies within designated open countryside and is bounded by hedgerows and trees. The majority of the site is classed as Grade 3 Agricultural Land (ALC), with a small section of the north-eastern part classed as Grade 2. In addition, the land around the solar panels will be available for sheep grazing, with only around 20% of the land occupied by the proposed equipment. The solar panels would be divided between five different field parcels, while the battery energy storage system (BESS) would be located in the north-eastern corner of the site. The project’s substation will also be included onsite. G Power Solar’s Shropshire project is part of a major solar push by the Labour government. Soon after coming to power last year, UK energy secretary Ed Miliband vowed to triple the UK’s solar capacity by 2030. Recent research found that solar panel installations in England rose by 10% in the first six months since Labour took office. This accounted for nearly 75,000 solar panels installed

Read More »

National Grid, Con Edison, other utilities urge FERC to consider gas pipeline reliability measures

A group of electric and gas utilities on Friday urged the Federal Energy Regulatory Commission to launch an inquiry to consider options for improving gas pipeline reliability. Increased demand and extreme weather in the last decade have stressed the interstate gas transportation and storage system and reduced pipeline companies’ operational flexibility, according to a filing at FERC by four National Grid utilities — Boston Gas, Brooklyn Union Gas, KeySpan Gas East and Niagara Mohawk Power — along with Consolidated Edison Co. of New York, Old Dominion Electric Cooperative, Orange & Rockland Utilities and Washington Gas Light. “The challenges facing interstate pipelines, and other parts of the natural gas value chain, continue to intensify warranting greater regulatory involvement and standardization of practices,” the utilities said. Under its Natural Gas Act authority, FERC could consider the following, the utilities suggest:

- Requiring interstate pipelines to submit annual reports on reliability metrics.
- Requiring pipeline owners to consider whether facilities they plan to replace or retire may still offer reliability benefits to customers as redundant facilities.
- Revisiting its policy regarding pipeline force majeure and reservation charge crediting provisions to rebalance risk-sharing between pipelines and their customers and create incentives for pipeline reliability.
- Additional enhanced communication protocols for extreme weather events, greater standardization of pipeline scheduling and confirmation practices, and other mechanisms to make sure customers that pay for firm service are receiving full contractual value in exchange for their financial commitments.

FERC and state utility regulators are set to meet on Wednesday to discuss gas-electric coordination and gas storage. Speakers at the Federal and State Current Issues Collaborative meeting include Jim Robb, North American Electric Reliability Corp president and CEO; Gordon van Welie, ISO New England president and CEO; and Paul Cicio, Industrial Energy Consumers of America president and CEO.

Read More »

Trump’s battery tariffs threaten utility-scale storage and US grid reliability

Brandon N. Owens is the founder of Clean Power Shift, an independent journal and research platform. President Trump’s April 2 “Liberation Day” tariffs slap a 64.5% composite duty on Chinese lithium-ion batteries today and hard-wire an 82% bite for 2026, a one-two punch that could upend the economics of every grid-scale storage project still penciled into developers’ spreadsheets. Solar inputs get singed too — Chinese polysilicon, wafers, and cells now carry a straight 60% levy — but the real story is batteries, where the new math pushes imported packs toward price points the U.S. market has not seen since before the storage boom. Utility-scale battery systems are no longer peripheral components of the power grid — they are essential infrastructure. These systems balance intermittent wind and solar generation, defer costly transmission upgrades, and provide flexible capacity during peak load events and extreme weather. In 2024, the U.S. added 8.7 GW of battery storage capacity — a 66% year-over-year increase, according to the U.S. Energy Information Administration. In 2025, developers are scheduled to install over 18 GW, with the majority of battery units still sourced internationally. As of last year, nearly 70% of imported grid battery systems came from China, according to BloombergNEF. The newly announced tariffs will make these imports significantly more expensive, placing immediate upward pressure on project costs and increasing the risk of deployment delays. Several utilities and independent power producers may need to revise procurement strategies, renegotiate project terms, or in some cases, reassess near-term feasibility. For utilities facing ambitious clean energy mandates, resource adequacy challenges and extreme weather risks, battery systems have become indispensable. They play a pivotal role in evening ramp support, frequency regulation, contingency reserves and load shifting. The new tariffs will not only affect procurement costs but could also force utilities to turn

Read More »

Phillips 66 to Build 300 MMcfd Gas Processing Facility in Permian

Phillips 66 has announced a new project to build a natural gas processing plant with a capacity of 300 million cubic feet a day (MMcfd) in the Permian Basin. Expected to start operation in the first quarter (Q1) of 2027, the Iron Mesa gas plant will serve Delaware and Midland Basin production, the Houston, Texas-based midstream and downstream player said as it announced Q1 2025 results. “The acquisition of EPIC NGL earlier this month, and today’s announcement that we are constructing a new gas plant in the Permian, furthers our integrated NGL [natural gas liquids] wellhead-to-market strategy, providing stable cash flow in uncertain market environments, enabling us to consistently return over 50 percent of net operating cash flow to shareholders”, said chair and chief executive Mark Lashier. On April 1 Phillips 66 said it had completed the acquisition of EPIC Y-Grade GP LLC and EPIC Y-Grade LP for about $2.2 billion, growing its midstream presence in the Permian Basin. The units, bought from EPIC Midstream Holdings LP, own NGL pipelines, fractionation facilities and distribution systems. The acquired operations comprise two fractionators with a capacity of 170,000 barrels per day (bpd) near Corpus Christi, Texas; about 350 miles of purity distribution pipelines; and an NGL pipeline around 885 miles long and with a capacity of 175,000 bpd. The NGL pipeline links the Delaware, Midland and Eagle Ford basins to the fractionation complexes and Phillips 66’s Sweeny Hub. The pipeline capacity is being raised to 225,000 bpd, in a project expected to be completed in Q2. A further expansion has also been sanctioned to grow the capacity to 350,000 bpd; completion is expected in 2026. EPIC had also put in place plans to raise the fractionation capacity to 280,000 bpd. “The acquired assets connect Permian production to Gulf Coast refiners, petrochemical companies and export markets, and

Read More »

Matador Sees YoY Increase in Profit as Output Rises

Matador Resources Company has posted $240.1 million in net profit for Q1 2025, up 24 percent from the $193.7 million logged for the corresponding quarter a year prior. Its oil and natural gas production averaged 198,631 barrels of oil equivalent per day (boepd) in the first quarter of 2025, a 33 percent jump year-over-year. The better-than-expected oil and natural gas production was primarily due to the outperformance of wells that were turned into sales in the fourth quarter of 2024, the company said. Average oil production rose by 36 percent to around 115,030 barrels per day in the first quarter of 2025, up from 84,777 barrels per day in the same period of 2024. Average natural gas production increased by 29 percent to 501.6 million cubic feet per day in the first quarter of 2025, compared to 389.9 million cubic feet per day during the first quarter of 2024. “Matador is pleased to report another profitable quarter that exceeded our expectations”, Joseph Wm. Foran, Matador’s Founder, Chairman and CEO, said. Foran added that as a response to the commodity price volatility, the company has adjusted its drilling and completion activity for 2025 to provide more optionality. “Matador began 2025 operating nine drilling rigs and now expects to drop to eight drilling rigs by the middle of 2025, which is made possible by the flexibility in its service contracts allowing for rapid reduction of operations”, Foran said. “The adjustment in activity is expected to reduce Matador’s drilling, completing and equipping capital expenditures for full-year 2025 by $100 million from Matador’s original expectation of $1.375 billion to a revised expectation of $1.275 billion”, Foran said, adding that the freed up cash flow of $100 million will be used to repay debt and repurchase shares, among other things.   Matador said that during

Read More »

Xcel Energy looks to limit wildfire liability, tariff impacts

Dive Brief: Xcel Energy is focused on conversations at the state and federal level about wildfire, trade and tariff policies after the company’s earnings declined during the first quarter of 2025, Chairman, President and CEO Bob Frenzel said during a Thursday earnings call. Although first-quarter electric and natural gas sales increased year over year, the company also saw its operating costs surge, driven in part by higher nuclear outage amortization costs and increased insurance premiums, according to CFO Brian Van Abel. Xcel Energy expects its costs to increase another 2% to 3% as a result of recent tariffs, which Frenzel described as “manageable.” However, he expressed concern about the impact of tariffs on battery and energy storage prices — technologies he said are necessary to meet surging electric demand.

Dive Insight: Leaders from Xcel Energy have spent a lot of time engaging with Washington in recent months, Frenzel told investors on Thursday’s call. “We’re in an unprecedented period of electric demand growth and believe that we need a broad scope of energy resources to meet those needs,” Frenzel said. He listed a half-dozen trends driving increased electric load, including electric vehicle adoption and data center expansion. “The infrastructure to serve this demand growth needs to be thoughtfully planned,” he said. Top of mind, Frenzel said, is advocating for the preservation of tech-neutral tax credits for wind, solar, nuclear and energy storage projects. But the company is also paying close attention to trade policy, he said. While company projections put the current cost of tariffs the Trump administration has imposed to date at a “manageable” 2% to 3%, Frenzel said he is concerned about tariffs’ potential future impacts to battery storage projects in particular. Although Xcel Energy only has one “significant” battery storage project in its current capital plan, Frenzel said he

Read More »

Deep Data Center: Neoclouds as the ‘Picks and Shovels’ of the AI Gold Rush

In 1849, the discovery of gold in California ignited a frenzy, drawing prospectors from around the world in pursuit of quick fortune. While few struck it rich digging and sifting dirt, a different class of entrepreneurs quietly prospered: those who supplied the miners with the tools of the trade. From picks and shovels to tents and provisions, these providers became indispensable to the gold rush, profiting handsomely regardless of who found gold. Today, a new gold rush is underway, in pursuit of artificial intelligence. And just like the days of yore, the real fortunes may lie not in the gold itself, but in the infrastructure and equipment that enable its extraction. This is where neocloud players and chipmakers are positioned, representing themselves as the fundamental enablers of the AI revolution. Neoclouds: The Essential Tools and Implements of AI Innovation The AI boom has sparked a frenzy of innovation, investment, and competition. From generative AI applications like ChatGPT to autonomous systems and personalized recommendations, AI is rapidly transforming industries. Yet, behind every groundbreaking AI model lies an unsung hero: the infrastructure powering it. Enter neocloud providers—the specialized cloud platforms delivering the GPU horsepower that fuels AI’s meteoric rise. Let’s examine how neoclouds represent the “picks and shovels” of the AI gold rush, used for extracting the essential backbone of AI innovation. Neoclouds are emerging as indispensable players in the AI ecosystem, offering tailored solutions for compute-intensive workloads such as training large language models (LLMs) and performing high-speed inference. Unlike traditional hyperscalers (e.g., AWS, Azure, Google Cloud), which cater to a broad range of use cases, neoclouds focus exclusively on optimizing infrastructure for AI and machine learning applications. This specialization allows them to deliver superior performance at a lower cost, making them the go-to choice for startups, enterprises, and research institutions alike.

Read More »

Soluna Computing: Innovating Renewable Computing for Sustainable Data Centers

Dorothy 1A & 1B (Texas): These twin 25 MW facilities are powered by wind and serve Bitcoin hosting and mining workloads. Together, they consumed over 112,000 MWh of curtailed energy in 2024, demonstrating the impact of Soluna’s model.

Dorothy 2 (Texas): Currently under construction and scheduled for energization in Q4 2025, this 48 MW site will increase Soluna’s hosting and mining capacity by 64%.

Sophie (Kentucky): A 25 MW grid- and hydro-powered hosting center with a strong cost profile and consistent output.

Project Grace (Texas): A 2 MW AI pilot project in development, part of Soluna’s transition into HPC and machine learning.

Project Kati (Texas): With 166 MW split between Bitcoin and AI hosting, this project recently exited the Electric Reliability Council of Texas, Inc. planning phase and is expected to energize between 2025 and 2027.

Project Rosa (Texas): A 187 MW flagship project co-located with wind assets, aimed at both Bitcoin and AI workloads. Land and power agreements were secured by the company in early 2025.

These developments are part of the company’s broader effort to tackle both energy waste and infrastructure bottlenecks. Soluna’s behind-the-meter design enables flexibility to draw from the grid or directly from renewable sources, maximizing energy value while minimizing emissions.

Competition is Fierce and a Narrower Focus Better Serves the Business

In 2024, Soluna tested the waters of providing AI services with a GPU-as-a-Service offering through a partnership with HPE, branded as Project Ada. The pilot aimed to rent out cloud GPUs for AI developers and LLM training. However, due to oversupply in the GPU market, delayed product rollouts (like NVIDIA’s H200), and poor demand economics, Soluna terminated the contract in March 2025. The cancellation of the contract with HPE frees up resources for Soluna to focus on what it believes the company does best: designing

Read More »

Quiet Genius at the Neutral Line: How Onics Filters Are Reshaping the Future of Data Center Power Efficiency

Why Harmonics Matter

In a typical data center, nonlinear loads—like servers, UPS systems, and switch-mode power supplies—introduce harmonic distortion into the electrical system. These harmonics travel along the neutral and ground conductors, where they can increase current flow, cause overheating in transformers, and shorten the lifespan of critical power infrastructure. More subtly, they waste power through reactive losses that don’t show up on a basic utility bill, but do show up in heat, inefficiency, and increased infrastructure stress. Traditional mitigation approaches—like active harmonic filters or isolation transformers—are complex, expensive, and often require custom integration and ongoing maintenance. That’s where Onics’ solution stands out. It’s engineered as a shunt-style, low-pass filter: a passive device that sits in parallel with the circuit, quietly siphoning off problematic harmonics without interrupting operations. The result? Lower apparent power demand, reduced electrical losses, and a quieter, more stable current environment—especially on the neutral line, where cumulative harmonic effects often peak.

Behind the Numbers: Real-World Impact

While the Onics filters offer a passive complement to traditional mitigation strategies, they aren’t intended to replace active harmonic filters or isolation transformers in systems that require them—they work best as a low-complexity enhancement to existing power quality designs. LoPilato says Onics has deployed its filters in mission-critical environments ranging from enterprise edge to large colos, and the data is consistent. In one example, a 6 MW data center saw a verified 9.2% reduction in energy consumption after deploying Onics filters at key electrical junctures. Another facility clocked in at 17.8% savings across its lighting and support loads, thanks in part to improved power factor and reduced transformer strain. The filters work by targeting high-frequency distortion—typically above the 3rd harmonic and up through the 35th. By passively attenuating this range, the system reduces reactive current on the neutral and helps stabilize
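As a rough numerical illustration of the effect described above, the sketch below shows how attenuating the 3rd through 35th harmonics lowers RMS current, total harmonic distortion (THD), and with it apparent power. The harmonic spectrum and the 80% attenuation figure are assumed example values, not Onics measurements.

```python
# Illustrative math only (assumed values, not Onics data): attenuating
# harmonics 3-35 lowers RMS current, THD, and apparent power demand.
import math

FUNDAMENTAL = 100.0  # amps at the fundamental frequency (assumed)
# Assumed harmonic spectrum before filtering: {harmonic order: amps}
SPECTRUM = {3: 18.0, 5: 12.0, 7: 8.0, 9: 5.0, 11: 3.0, 35: 1.0}

def rms_current(fundamental: float, spectrum: dict) -> float:
    """RMS current is the root-sum-square of all spectral components."""
    return math.sqrt(fundamental**2 + sum(a**2 for a in spectrum.values()))

def thd(fundamental: float, spectrum: dict) -> float:
    """Total harmonic distortion relative to the fundamental."""
    return math.sqrt(sum(a**2 for a in spectrum.values())) / fundamental

# Assume a passive shunt filter attenuates orders 3-35 by 80% (illustrative).
filtered = {order: amps * 0.2 for order, amps in SPECTRUM.items()}

print(f"THD before: {thd(FUNDAMENTAL, SPECTRUM):.1%}  after: {thd(FUNDAMENTAL, filtered):.1%}")
print(f"RMS current before: {rms_current(FUNDAMENTAL, SPECTRUM):.1f} A  "
      f"after: {rms_current(FUNDAMENTAL, filtered):.1f} A")
# Apparent power S = V_rms * I_rms, so any percentage drop in RMS current
# translates directly into lower apparent power demand at a fixed voltage.
```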

Read More »

New IEA Report Contrasts Energy Bottlenecks with Opportunities for AI and Data Center Growth

Artificial intelligence has, without question, crossed the threshold—from a speculative academic pursuit into the defining infrastructure of 21st-century commerce, governance, and innovation. What began in the realm of research labs and open-source models is now embedded in the capital stack of every major hyperscaler, semiconductor roadmap, and national industrial strategy. But as AI scales, so does its energy footprint. From Nvidia-powered GPU clusters to exascale training farms, the conversation across boardrooms and site selection teams has fundamentally shifted. It’s no longer just about compute density, thermal loads, or software frameworks. It’s about power—how to find it, finance it, future-proof it, and increasingly, how to generate it onsite. That refrain—“It’s all about power now”—has moved from a whisper to a full-throated consensus across the data center industry. The latest report from the International Energy Agency (IEA) gives this refrain global context and hard numbers, affirming what developers, utilities, and infrastructure operators have already sensed on the ground: the AI revolution will be throttled or propelled by the availability of scalable, sustainable, and dispatchable electricity. Why Energy Is the Real Bottleneck to Intelligence at Scale The major new IEA report puts it plainly: The transformative promise of AI will be throttled—or unleashed—by the world’s ability to deliver scalable, reliable, and sustainable electricity. The stakes are enormous. Countries that can supply the power AI craves will shape the future. Those that can’t may find themselves sidelined. Importantly, while AI poses clear challenges, the report emphasizes how it also offers solutions: from optimizing energy grids and reducing emissions in industrial sectors to enhancing energy security by supporting infrastructure defenses against cyberattacks. The report calls for immediate investments in both energy generation and grid capabilities, as well as stronger collaboration between the tech and energy sectors to avoid critical bottlenecks. The IEA advises that, for countries

Read More »

Colorado Eyes the AI Data Center Boom with Bold Incentive Push

Even as states work on legislation to limit data center development, it is clear that some locations are looking to get a bigger piece of the huge data center spending that the AI wave has created. It appears that politicians in Colorado took a look around and thought to themselves “Why is all that data center building going to Texas and Arizona? What’s wrong with the Rocky Mountain State?” Taking a page from the proven playbook that has gotten data centers built all over the country, Colorado is trying to jump on the financial incentives for data center development bandwagon. SB 24-085: A Statewide Strategy to Attract Data Center Investment Looking to significantly boost its appeal as a data center hub, Colorado is now considering Senate Bill 24-085, currently making its way through the state legislature. Sponsored by Senators Priola and Buckner and Representatives Parenti and Weinberg, this legislation promises substantial economic incentives in the form of state sales and use tax rebates for new data centers established within the state from fiscal year 2026 through 2033. Colorado hopes to position itself strategically to compete with neighboring states in attracting lucrative tech investments and high-skilled jobs. According to DataCenterMap.com, there are currently 53 data centers in the state, almost all located in the Denver area, but they are predominantly smaller facilities. In today’s era of massive AI-driven hyperscale expansion, Colorado is rarely mentioned in the same breath as major AI data center markets.  Some local communities have passed their own incentive packages, but SB 24-085 aims to offer a unified, statewide framework that can also help mitigate growing NIMBY (Not In My Backyard) sentiment around new developments. The Details: How SB 24-085 Works The bill, titled “Concerning a rebate of the state sales and use tax paid on new digital infrastructure

Read More »

Wonder Valley and the Great AI Pivot: Kevin O’Leary’s Bold Data Center Play

Data Center World 2025 drew record-breaking attendance, underscoring the AI-fueled urgency transforming infrastructure investment. But no session captivated the crowd quite like Kevin O’Leary’s electrifying keynote on Wonder Valley—his audacious plan to build the world’s largest AI compute data center campus. In a sweeping narrative that ranged from pandemic pivots to stranded gas and Branson-brand inspiration, O’Leary laid out a real estate and infrastructure strategy built for the AI era. A Pandemic-Era Pivot Becomes a Case Study in Digital Resilience O’Leary opened with a Shark Tank success story that doubled as a business parable. In 2019, a woman-led startup called Blueland raised $50 million to eliminate plastic cleaning bottles by shipping concentrated cleaning tablets in reusable kits. When COVID-19 shut down retail in 2020, her inventory was stuck in limbo—until she made an urgent call to O’Leary. What followed was a high-stakes, last-minute pivot: a union-approved commercial shoot in Brooklyn the night SAG-AFTRA shut down television production. The direct response ad campaign that resulted would not only liquidate the stranded inventory at full margin, but deliver something more valuable—data. By targeting locked-down consumers through local remnant TV ad slots and optimizing by conversion, Blueland saw unheard-of response rates as high as 17%. The campaign turned into a data goldmine: buyer locations, tablet usage patterns, household sizes, and contact details. Follow-up SMS campaigns would drive 30% reorders. “It built such a franchise in those 36 months,” O’Leary said, “with no retail. Now every retailer wants in.” The lesson? Build your infrastructure to control your data, and you build a business that scales even in chaos. This anecdote set the tone for the keynote: in a volatile world, infrastructure resilience and data control are the new core competencies. The Data Center Power Crisis: “There Is Not a Gig on the Grid” O’Leary

Read More »

Microsoft will invest $80B in AI data centers in fiscal 2025

And Microsoft isn’t the only one that is ramping up its investments into AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a larger chunk of business from developers and users of large language models (LLMs). In a report published in October 2024, Bloomberg Intelligence estimated that demand for generative AI would push Microsoft, AWS, Google, Oracle, Meta, and Apple to devote a combined $200 billion to capex in 2025, up from $110 billion in 2023. Microsoft is one of the biggest spenders, followed closely by Google and AWS, Bloomberg Intelligence said. Its estimate of Microsoft’s capital spending on AI, at $62.4 billion for calendar 2025, is lower than Smith’s claim that the company will invest $80 billion in the fiscal year to June 30, 2025. Both figures, though, are way higher than Microsoft’s 2020 capital expenditure of “just” $17.6 billion. The majority of the increased spending is tied to cloud services and the expansion of AI infrastructure needed to provide compute capacity for OpenAI workloads. Separately, last October Amazon CEO Andy Jassy said his company planned total capex spend of $75 billion in 2024 and even more in 2025, with much of it going to AWS, its cloud computing division.

Read More »

John Deere unveils more autonomous farm machines to address skill labor shortage

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Self-driving tractors might be the path to self-driving cars. John Deere has revealed a new line of autonomous machines and tech across agriculture, construction and commercial landscaping. The Moline, Illinois-based John Deere has been in business for 187 years, yet it’s been a regular as a non-tech company showing off technology at the big tech trade show in Las Vegas and is back at CES 2025 with more autonomous tractors and other vehicles. This is not something we usually cover, but John Deere has a lot of data that is interesting in the big picture of tech. The message from the company is that there aren’t enough skilled farm laborers to do the work that its customers need. It’s been a challenge for most of the last two decades, said Jahmy Hindman, CTO at John Deere, in a briefing. Much of the tech will come this fall and after that. He noted that the average farmer in the U.S. is over 58 and works 12 to 18 hours a day to grow food for us. And he said the American Farm Bureau Federation estimates there are roughly 2.4 million farm jobs that need to be filled annually; and the agricultural work force continues to shrink. (This is my hint to the anti-immigration crowd).

John Deere’s autonomous 9RX Tractor. Farmers can oversee it using an app.

While each of these industries experiences its own set of challenges, a commonality across all is skilled labor availability. In construction, about 80% of contractors struggle to find skilled labor. And in commercial landscaping, 86% of landscaping business owners can’t find labor to fill open positions, he said. “They have to figure out how to do

Read More »

2025 playbook for enterprise AI success, from agents to evals

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More 2025 is poised to be a pivotal year for enterprise AI. The past year has seen rapid innovation, and this year will see the same. This has made it more critical than ever to revisit your AI strategy to stay competitive and create value for your customers. From scaling AI agents to optimizing costs, here are the five critical areas enterprises should prioritize for their AI strategy this year. 1. Agents: the next generation of automation AI agents are no longer theoretical. In 2025, they’re indispensable tools for enterprises looking to streamline operations and enhance customer interactions. Unlike traditional software, agents powered by large language models (LLMs) can make nuanced decisions, navigate complex multi-step tasks, and integrate seamlessly with tools and APIs. At the start of 2024, agents were not ready for prime time, making frustrating mistakes like hallucinating URLs. They started getting better as frontier large language models themselves improved. “Let me put it this way,” said Sam Witteveen, cofounder of Red Dragon, a company that develops agents for companies, and that recently reviewed the 48 agents it built last year. “Interestingly, the ones that we built at the start of the year, a lot of those worked way better at the end of the year just because the models got better.” Witteveen shared this in the video podcast we filmed to discuss these five big trends in detail. Models are getting better and hallucinating less, and they’re also being trained to do agentic tasks. Another feature that the model providers are researching is a way to use the LLM as a judge, and as models get cheaper (something we’ll cover below), companies can use three or more models to

Read More »

OpenAI’s red teaming innovations define new essentials for security leaders in the AI era

Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More OpenAI has taken a more aggressive approach to red teaming than its AI competitors, demonstrating its security teams’ advanced capabilities in two areas: multi-step reinforcement and external red teaming. OpenAI recently released two papers that set a new competitive standard for improving the quality, reliability and safety of AI models in these two techniques and more. The first paper, “OpenAI’s Approach to External Red Teaming for AI Models and Systems,” reports that specialized teams outside the company have proven effective in uncovering vulnerabilities that might otherwise have made it into a released model because in-house testing techniques may have missed them. In the second paper, “Diverse and Effective Red Teaming with Auto-Generated Rewards and Multi-Step Reinforcement Learning,” OpenAI introduces an automated framework that relies on iterative reinforcement learning to generate a broad spectrum of novel, wide-ranging attacks. Going all-in on red teaming pays practical, competitive dividends It’s encouraging to see competitive intensity in red teaming growing among AI companies. When Anthropic released its AI red team guidelines in June of last year, it joined AI providers including Google, Microsoft, Nvidia, OpenAI, and even the U.S.’s National Institute of Standards and Technology (NIST), which all had released red teaming frameworks. Investing heavily in red teaming yields tangible benefits for security leaders in any organization. OpenAI’s paper on external red teaming provides a detailed analysis of how the company strives to create specialized external teams that include cybersecurity and subject matter experts. The goal is to see if knowledgeable external teams can defeat models’ security perimeters and find gaps in their security, biases and controls that prompt-based testing couldn’t find. What makes OpenAI’s recent papers noteworthy is how well they define using human-in-the-middle

Read More »