
Bright Data beat Elon Musk and Meta in court — now its $100M AI platform is taking on Big Tech

Bright Data, the Israeli web scraping company that defeated both Meta and Elon Musk’s X in federal court, unveiled a comprehensive AI infrastructure suite Wednesday designed to give artificial intelligence systems unfettered access to real-time web data — a capability the company argues Big Tech platforms are trying to monopolize.

The announcement of Deep Lookup, Browser.ai, and enhanced data collection protocols represents a dramatic expansion for the decade-old company, which has transformed from a specialized web scraping service into what CEO Or Lenchner calls “a unique infrastructure layer for AI companies.” The move comes as artificial intelligence companies increasingly struggle to access current web information needed to power chatbots, autonomous agents, and other AI applications.

“The intelligence of today’s LLMs is no longer its limiting factor; access is,” Lenchner said in an exclusive interview with VentureBeat. “We’ve spent the last decade fighting for open access to public web data, and these new offerings bring us to the next chapter in our journey, one characterized by truly accessible data and the subsequent rise of contextually-aware agents.”

The launch follows Bright Data’s high-profile legal victories in 2024, when federal judges dismissed lawsuits from both Meta and X alleging the company illegally scraped their platforms. Those rulings established crucial legal precedent defining what constitutes “public data” on the internet — information that can be viewed without logging in and therefore can be legally collected and used.

The court cases revealed that both Meta and X had been Bright Data customers even while suing the company, highlighting the contradictory stance many tech giants have taken toward web scraping. The rulings have broader implications for the AI industry, which relies heavily on web data to train and operate language models.

“It was revealed in court that both of them were a Bright Data customer, because everyone needs data, everyone, especially those who are building models,” Lenchner explained. “We are the only company that has the financial resources, and I would even say the courage to do that.”

Judge William Alsup, who presided over the X case, wrote that giving social media companies “free rein to decide, on any basis, who can collect and use data” risks creating “information monopolies that would disserve the public interest.” The ruling established that data viewable without login credentials constitutes public information that can be legally scraped.

Bright Data has now filed a countersuit against X, alleging the platform violated antitrust laws by trying to create a data monopoly to benefit Musk’s AI company, xAI. “The only reason that X are trying to stop Bright Data from allowing its customers to scrape X is that they will be the only entity that can enjoy the relevant quality data that X produces,” Lenchner said.

Deep Lookup and Browser.ai target AI companies struggling with data access

The company’s new products address what Lenchner identifies as the three core requirements for AI systems: algorithms, compute power, and data access. While Bright Data doesn’t develop AI algorithms or provide computing resources, it aims to become the definitive solution for the third requirement.

Deep Lookup functions as a natural language research engine designed to answer complex, multi-layered business questions in real time. Unlike general-purpose search engines or AI chatbots that provide summaries, Deep Lookup specializes in comprehensive results for queries beginning with “find all.” For example, users can ask for “all shipping companies that went through the Panama and Suez canals in 2023 whose Q3 revenues declined by over 2 percent.”
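
To make the query pattern concrete, here is a minimal sketch of how such a “find all” lookup might be issued programmatically. The endpoint URL, authentication scheme, and response fields are illustrative assumptions for this article, not Bright Data’s documented Deep Lookup API.

```python
import requests

# Hypothetical illustration: the endpoint, auth header, and response
# shape are assumptions, not Bright Data's documented Deep Lookup API.
API_URL = "https://api.example.com/v1/deep-lookup"  # placeholder URL
API_KEY = "YOUR_API_KEY"                            # placeholder credential

query = (
    "find all shipping companies that went through the Panama and Suez "
    "canals in 2023 whose Q3 revenues declined by over 2 percent"
)

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"query": query},
    timeout=120,
)
resp.raise_for_status()

# Unlike a chatbot summary, the result is assumed to be a structured
# list of matching entities rather than prose.
for company in resp.json().get("results", []):
    print(company.get("name"), company.get("q3_revenue_change_pct"))
```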

The system draws from Bright Data’s massive web archive, which currently contains over 200 billion HTML pages and adds 15 billion monthly. By next year, the archive is expected to exceed 500 billion pages. “It’s not just random web pages, it’s actually what the world cares about, because our 20,000 customers represent billions of internet users,” Lenchner noted.

Browser.ai represents what the company calls “the industry’s first unblockable, AI-native browser.” Designed specifically for autonomous AI agents, the cloud-based service mimics human behavior to access websites without triggering bot detection systems. It supports natural language commands and can perform complex web interactions like booking flights or making restaurant reservations.

The browser infrastructure already processes over 150 million web actions daily, according to the company. “Almost all of them are customers,” Lenchner said of AI agent companies that have raised significant funding. “Because what we figured out, and they figured out, is that we solve that problem of entering a website without being blocked and executing web actions on the website.”
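
As a rough sketch of what driving such an agent browser could look like, consider the snippet below. The session endpoint and payload shapes are invented for illustration; Bright Data has not published this exact interface.

```python
import requests

# Illustrative sketch only: this endpoint and payload are hypothetical,
# meant to show a natural-language browser command, not Browser.ai's
# actual interface.
BASE_URL = "https://browser.example.com/v1"  # placeholder service URL

session = requests.post(
    f"{BASE_URL}/sessions",
    json={"fingerprint": "human-like", "region": "us"},
    timeout=30,
).json()

# Issue a plain-English instruction; the cloud browser is assumed to
# handle the clicks, form fills, and bot-detection evasion itself.
action = requests.post(
    f"{BASE_URL}/sessions/{session['id']}/actions",
    json={
        "command": (
            "Search for a round-trip flight from Tel Aviv to London "
            "next month and return the three cheapest options."
        )
    },
    timeout=300,
)
print(action.json())
```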

MCP Servers, built on the Model Context Protocol, provide a low-latency control layer that lets AI agents search, crawl, and extract live data in real time. The protocol allows developers to build AI systems that can act on current information rather than relying solely on training data.
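
Because the Model Context Protocol is an open standard, the client side can be sketched with the open-source MCP Python SDK. The server package name and tool identifier below are placeholders; the real names depend on the provider’s published MCP server.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# The server command and tool name are placeholders for illustration;
# substitute the provider's published MCP server and tool identifiers.
server = StdioServerParameters(
    command="npx",
    args=["@example/web-data-mcp"],  # hypothetical MCP server package
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("available tools:", [t.name for t in tools.tools])
            # Hypothetical tool name; real servers expose their own set.
            result = await session.call_tool(
                "search_web", {"query": "current LNG spot prices"}
            )
            print(result.content)

asyncio.run(main())
```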

Patent portfolio and proxy network create competitive moat against blocking

Bright Data’s competitive advantage stems from what Lenchner describes as an “obsession” with overcoming website blocking mechanisms. The company holds over 5,500 patent claims on its technology and operates the world’s largest proxy network with more than 150 million IP addresses across 195 countries.

“We have such a good look into the internet,” Lenchner explained. “For a long time now, we have been mapping the internet, and for a long time now, we’re also archiving big chunks of the internet.”

The company’s approach involves sophisticated techniques to mimic human behavior, using real devices, IP addresses, and browser fingerprints rather than simple automated scripts. This makes detection and blocking extremely difficult for websites.
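
At its simplest, the building block is routing each request through a pool of residential exit IPs while presenting browser-like headers. A minimal sketch, with placeholder gateway credentials rather than Bright Data’s real proxy details:

```python
import requests

# Placeholder gateway host, port, and credentials for illustration;
# a real residential proxy service supplies its own connection string.
PROXY = "http://USER:PASS@proxy.example.com:22225"

resp = requests.get(
    "https://httpbin.org/ip",
    proxies={"http": PROXY, "https": PROXY},
    headers={
        # A realistic browser User-Agent is one small part of looking
        # human; full fingerprinting spans TLS, headers, and timing.
        "User-Agent": (
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
            "AppleWebKit/537.36 (KHTML, like Gecko) "
            "Chrome/124.0.0.0 Safari/537.36"
        )
    },
    timeout=30,
)
print(resp.json())  # echoes the exit IP assigned by the proxy pool
```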

“The only way to block us, practically, is to put the data behind the login, then we won’t even try,” Lenchner said. “Sometimes there is a new blocking logic that we won’t solve immediately. It will take our research team 12 hours, three days that’s like the most it was, and we will unlock it.”

Revenue surpasses $100 million as AI demand explodes post-ChatGPT

While Bright Data remains privately held by a private equity firm, Lenchner confirmed to VentureBeat that the company’s annual recurring revenue significantly exceeds $100 million. The business has experienced explosive growth since the launch of ChatGPT in late 2022, as AI companies scrambled to access training data and real-time information.

“Starting March 2023, which is pretty much when GPT-3 changed the world, the AI, or what we call the data for AI, use case just absolutely exploded for us as a company,” Lenchner said. “Everything else is also growing, because everyone needs more data, period. But this use case is just like nothing we’ve seen before.”

The company serves over 20,000 businesses, including Fortune 500 companies and major AI laboratories. Traditional customers include e-commerce platforms tracking competitor pricing, financial services firms seeking market intelligence, and enterprises conducting business research.

GDPR compliance and ethical practices differentiate from competitors

Bright Data has invested heavily in compliance infrastructure to address privacy concerns around data collection. The company follows European GDPR and California CCPA regulations, automatically notifying individuals when their personal information is collected from public sources and providing deletion options.

“The regulation and the legislation are clear since the European GDPR and at least California and CCPA regulations came to play,” Lenchner explained. “If we collected your email address, for example, we will automatically send you an email saying, ‘Hey, this is who we are. We collected your personal information from the public domain. Here’s a huge button you can click if you want to review it, and you can obviously ask to delete it.’”
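
The workflow Lenchner describes is straightforward to picture in code. Below is a schematic sketch of such an automated notice, with invented record fields and a hypothetical send_email helper; it is not Bright Data’s implementation.

```python
# Schematic sketch of the automated GDPR/CCPA notice described above;
# record fields and send_email are hypothetical, not Bright Data's code.

def notify_data_subject(record: dict, send_email) -> None:
    """Email a person whose public data was collected, offering review
    and deletion links, as GDPR Article 14 notification requires."""
    if "email" not in record:
        return  # no contact channel available for this record
    send_email(
        to=record["email"],
        subject="We collected your publicly available information",
        body=(
            "This is who we are. We collected your personal information "
            "from the public domain.\n"
            f"Review it here: https://example.com/review/{record['id']}\n"
            f"Request deletion: https://example.com/delete/{record['id']}"
        ),
    )
```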

The company maintains a large compliance team and extensive documentation of its practices, which proved valuable during court proceedings. “Enterprises especially love us because we have our ethical stand that was scrutinized in US courts twice,” Lenchner said.

Web access wars intensify as tech giants seek data monopolies

The battle over web data access reflects broader tensions in the AI industry about information control and competitive advantage. As AI systems become more sophisticated, access to current, comprehensive web data becomes increasingly valuable — and contentious.

Lenchner predicts the web will become “more closed” over time, similar to how Google maintains exclusive access to its web crawling capabilities while others must use alternative services. “A few tech giants are gonna get free access to every website with their agents,” he said. “The rest will need to use our infrastructure or someone else’s infrastructure.”

The company is also observing new trends, including businesses scraping AI chatbots for marketing purposes and the emergence of new protocols like MCP that enable AI agents to interact with web services more effectively.

“All of these guys that are consuming massive amounts of data, and all of us are using them, it’s all going towards building the brains of the robots,” Lenchner said. “It’s okay that you have a chatbot that is talking to a human, because that’s eventually what a robot will do.”

Robot brains and agent economy drive next phase of growth

Bright Data’s transformation from web scraping service to AI infrastructure provider reflects the rapidly evolving needs of the artificial intelligence industry. As companies rush to deploy AI agents and autonomous systems, access to real-time web data becomes as crucial as computing power and algorithmic sophistication.

The legal precedents established through Bright Data’s court victories may prove as significant as its technical innovations, potentially shaping how the entire AI industry accesses and uses web information. With major tech platforms increasingly restricting data access while simultaneously developing their own AI systems, independent infrastructure providers like Bright Data may become essential for maintaining competitive balance in the AI ecosystem.

“We’re an infrastructure company,” Lenchner emphasized. “We’re very talented engineers that hardly go anywhere, just sit with our computers and write code. We’re doing it well. We have no intentions to do anything else.”

The Deep Lookup beta launches Tuesday for business customers, with general public access available through a waitlist. Browser.ai and MCP Servers are already available to enterprise clients through Bright Data’s existing platform.
