DanaBot takedown shows how agentic AI cut months of SOC analysis to weeks

The recent takedown of DanaBot, a Russian malware platform responsible for infecting over 300,000 systems and causing more than $50 million in damage, highlights how agentic AI is redefining cybersecurity operations. According to a recent Lumen Technologies post, DanaBot maintained an average of 150 active command-and-control (C2) servers per day, with roughly 1,000 daily victims across more than 40 countries.

Last week, the U.S. Department of Justice unsealed a federal indictment in Los Angeles against 16 defendants behind DanaBot, a Russia-based malware-as-a-service (MaaS) operation responsible for orchestrating massive fraud schemes, enabling ransomware attacks and inflicting tens of millions of dollars in financial losses on victims.

DanaBot first emerged in 2018 as a banking trojan but quickly evolved into a versatile cybercrime toolkit capable of executing ransomware, espionage and distributed denial-of-service (DDoS) campaigns. The toolkit’s ability to deliver precise attacks on critical infrastructure has made it a favorite of state-sponsored Russian adversaries with ongoing cyber operations targeting Ukrainian electricity, power and water utilities.

DanaBot sub-botnets have been directly linked to Russian intelligence activities, illustrating the blurring boundary between financially motivated cybercrime and state-sponsored espionage. DanaBot’s operators, known as SCULLY SPIDER, faced minimal domestic pressure from Russian authorities, reinforcing suspicions that the Kremlin either tolerated or leveraged their activities as a cyber proxy.

As illustrated in the figure below, DanaBot’s operational infrastructure involved complex and dynamically shifting layers of bots, proxies, loaders and C2 servers, making traditional manual analysis impractical.

Overview of DanaBot pipeline and management infrastructure. Source: Team Cymru and Lumen Technologies

DanaBot shows why agentic AI is the new front line against automated threats

Agentic AI played a central role in dismantling DanaBot, orchestrating predictive threat modeling, real-time telemetry correlation, infrastructure analysis and autonomous anomaly detection. These capabilities reflect years of sustained R&D and engineering investment by leading cybersecurity providers, who have steadily evolved from static rule-based approaches to fully autonomous defense systems.

“DanaBot is a prolific malware-as-a-service platform in the eCrime ecosystem, and its use by Russian-nexus actors for espionage blurs the lines between Russian eCrime and state-sponsored cyber operations,” Adam Meyers, Head of Counter Adversary Operations, CrowdStrike told VentureBeat in a recent interview. “SCULLY SPIDER operated with apparent impunity from within Russia, enabling disruptive campaigns while avoiding domestic enforcement. Takedowns like this are critical to raising the cost of operations for adversaries.”

Taking down DanaBot validated agentic AI’s value for security operations center (SOC) teams by compressing months of manual forensic analysis into a few weeks. That saved time gave law enforcement what it needed to identify and dismantle DanaBot’s sprawling digital footprint quickly.

DanaBot’s takedown signals a significant shift in the use of agentic AI in SOCs. SOC analysts are finally getting the tools they need to detect, analyze, and respond to threats autonomously and at scale, tilting the balance of power in the war against adversarial AI.

DanaBot takedown proves SOCs must evolve beyond static rules to agentic AI

DanaBot’s infrastructure, dissected by Lumen’s Black Lotus Labs, reveals the alarming speed and lethal precision of adversarial AI. Operating over 150 active command-and-control servers daily, DanaBot compromised roughly 1,000 victims per day across more than 40 countries, including the U.S. and Mexico. Its stealth was striking: only 25% of its C2 servers were flagged on VirusTotal, letting the rest evade traditional defenses effortlessly.
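That 25% figure reflects a routine analyst task: checking suspected C2 indicators against VirusTotal. Below is a minimal, hypothetical Python sketch of such a check using VirusTotal’s public v3 IP-address endpoint; the API key and IP list are placeholders, not actual DanaBot IOCs.

```python
# Hypothetical sketch: checking suspected C2 IPs against VirusTotal's v3 API.
# The API key and indicator list are placeholders, not DanaBot IOCs.
import requests

VT_URL = "https://www.virustotal.com/api/v3/ip_addresses/{}"
API_KEY = "YOUR_VT_API_KEY"  # placeholder

def is_flagged(ip: str) -> bool:
    """Return True if any VirusTotal engine marks the IP malicious or suspicious."""
    resp = requests.get(VT_URL.format(ip), headers={"x-apikey": API_KEY}, timeout=10)
    resp.raise_for_status()
    stats = resp.json()["data"]["attributes"]["last_analysis_stats"]
    return stats.get("malicious", 0) + stats.get("suspicious", 0) > 0

suspected_c2 = ["203.0.113.7", "198.51.100.42"]  # RFC 5737 documentation addresses
flagged = [ip for ip in suspected_c2 if is_flagged(ip)]
print(f"{len(flagged)}/{len(suspected_c2)} suspected C2 servers flagged on VirusTotal")
```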

Built as a multi-tiered, modular botnet leased to affiliates, DanaBot rapidly adapted and scaled, rendering static rule-based SOC defenses, including legacy SIEMs and intrusion detection systems, useless.

Cisco SVP Tom Gillis emphasized this risk clearly in a recent VentureBeat interview. “We’re talking about adversaries who continually test, rewrite and upgrade their attacks autonomously. Static defenses can’t keep pace. They become obsolete almost immediately.”

The goal is to reduce alert fatigue and accelerate incident response

Agentic AI directly addresses long-standing SOC pain points, starting with alert fatigue. Traditional SIEM platforms burden analysts with false-positive rates of up to 40%.

By contrast, agentic AI-driven platforms significantly reduce alert fatigue through automated triage, correlation and context-aware analysis. These platforms include: Cisco Security Cloud, CrowdStrike Charlotte AI, Google Chronicle Security Operations, IBM Security QRadar Suite, Microsoft Security Copilot, Palo Alto Networks Cortex XSIAM, SentinelOne Purple AI and Trellix Helix. Each platform leverages advanced AI and risk-based prioritization to streamline analyst workflows, enabling rapid identification and response to critical threats while minimizing false positives and irrelevant alerts.
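To make the triage idea concrete, here is a minimal, vendor-neutral Python sketch of risk-based prioritization: correlating duplicate alerts, scoring them with asset context and suppressing the low-risk tail. The field names, weights and threshold are illustrative assumptions, not drawn from any of the platforms above.

```python
# Minimal, vendor-neutral sketch of risk-based alert triage:
# correlate duplicate alerts, score them with context, and
# surface only the highest-risk items to analysts.
from collections import defaultdict

ASSET_CRITICALITY = {"domain-controller": 1.0, "laptop": 0.4}  # illustrative weights

def triage(alerts: list[dict], threshold: float = 0.7) -> list[dict]:
    # 1. Correlate: group alerts that share the same host and detection rule.
    groups = defaultdict(list)
    for a in alerts:
        groups[(a["host"], a["rule"])].append(a)

    # 2. Score each group using detector severity, asset context and volume.
    triaged = []
    for (host, rule), group in groups.items():
        base = max(a["severity"] for a in group)   # 0.0 - 1.0 from the detector
        asset = ASSET_CRITICALITY.get(group[0]["asset_type"], 0.5)
        volume = min(len(group) / 10, 1.0)         # repeated firing raises risk
        risk = 0.5 * base + 0.3 * asset + 0.2 * volume
        if risk >= threshold:                      # 3. Suppress the low-risk tail.
            triaged.append({"host": host, "rule": rule, "risk": round(risk, 2),
                            "alert_count": len(group)})
    return sorted(triaged, key=lambda t: t["risk"], reverse=True)

alerts = [
    {"host": "dc01", "rule": "kerberoasting", "severity": 0.9,
     "asset_type": "domain-controller"},
    {"host": "lt42", "rule": "rare-parent-process", "severity": 0.3,
     "asset_type": "laptop"},
]
print(triage(alerts))  # only the domain-controller alert survives triage
```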

Microsoft research reinforces this advantage: integrating gen AI into SOC workflows reduced incident resolution time by nearly one-third. Gartner’s projections underscore the transformative potential of agentic AI, estimating a productivity leap of approximately 40% for SOC teams adopting AI by 2026.

“The speed of today’s cyberattacks requires security teams to rapidly analyze massive amounts of data to detect, investigate, and respond faster. Adversaries are setting records, with breakout times of just over two minutes, leaving no room for delay,” George Kurtz, president, CEO and co-founder of CrowdStrike, told VentureBeat during a recent interview.

How SOC leaders are turning agentic AI into operational advantage

DanaBot’s dismantling signals a broader shift underway: SOCs are moving from reactive alert-chasing to intelligence-driven execution. At the center of that shift is agentic AI. SOC leaders getting this right aren’t buying into the hype. They’re taking deliberate, architecture-first approaches that are anchored in metrics and, in many cases, risk and business outcomes.

Key takeaways of how SOC leaders can turn agentic AI into an operational advantage include the following:

Start small. Scale with purpose. High-performing SOCs aren’t trying to automate everything at once. They’re targeting high-volume, repetitive tasks such as phishing triage, malware detonation and routine log correlation, and proving value early. The result: measurable ROI, reduced alert fatigue, and analysts reallocated to higher-order threats.

Integrate telemetry as the foundation, not the finish line. The goal isn’t collecting more data; it’s making telemetry meaningful. That means unifying signals across endpoint, identity, network, and cloud to give AI the context it needs. Without that correlation layer, even the best models under-deliver (a minimal sketch of that join follows these takeaways).

Establish governance before scale. As agentic AI systems take on more autonomous decision-making, the most disciplined teams are setting clear boundaries now. That includes codified rules of engagement, defined escalation paths and full audit trails. Human oversight isn’t a backup plan; it’s part of the control plane.

Tie AI outcomes to metrics that matter. The most strategic teams align their AI efforts to KPIs that resonate beyond the SOC: reduced false positives, faster MTTR and improved analyst throughput. They’re not just optimizing models; they’re tuning workflows to turn raw telemetry into operational leverage.
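As referenced in the telemetry takeaway above, here is a minimal sketch of what a correlation layer can look like: joining endpoint, identity and network events on shared entities so downstream models see one enriched record instead of three disconnected alerts. All field names are illustrative assumptions.

```python
# Minimal sketch of a telemetry correlation layer: join endpoint,
# identity and network events on shared entities (user, host) into
# one enriched record that downstream AI models can reason over.
from collections import defaultdict

def correlate(endpoint_events, identity_events, network_events):
    timeline = defaultdict(lambda: {"endpoint": [], "identity": [], "network": []})
    for source, events in (("endpoint", endpoint_events),
                           ("identity", identity_events),
                           ("network", network_events)):
        for e in events:
            key = (e["user"], e["host"])  # shared entities are the join key
            timeline[key][source].append(e)

    # Only entities with signals from multiple sources get escalated context.
    return {key: signals for key, signals in timeline.items()
            if sum(bool(v) for v in signals.values()) >= 2}

enriched = correlate(
    endpoint_events=[{"user": "jdoe", "host": "lt42", "event": "rundll32 spawned"}],
    identity_events=[{"user": "jdoe", "host": "lt42", "event": "impossible-travel login"}],
    network_events=[{"user": "jdoe", "host": "lt42", "event": "beacon to rare domain"}],
)
for (user, host), signals in enriched.items():
    print(user, host, {source: len(events) for source, events in signals.items()})
```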

Today’s adversaries operate at machine speed, and defending against them requires systems that can match that velocity. What made the difference in the takedown of DanaBot wasn’t generic AI. It was agentic AI, applied with surgical precision, embedded in the workflow, and accountable by design.
