An ancient man’s remains were hacked apart and kept in a garage

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

This week I’ve been working on a story about a brain of glass. About five years ago, archaeologists found shiny black glass fragments inside the skull of a man who died in the Mount Vesuvius eruption of 79 CE. It seems they are pieces of brain, turned to glass.

Scientists have found ancient brains before—some are thought to be at least 10,000 years old. But this is the only time they’ve seen a brain turn to glass. They’ve even been able to spot neurons inside it.

The man’s remains were found at Herculaneum, an ancient city that was buried under meters of volcanic ash following the eruption. We don’t know if there are any other vitrified brains on the site. None have been found so far, but only about a quarter of the city has been excavated.

Some archaeologists want to continue excavating the site. But others argue that we need to protect it. Further digging will expose it to the elements, putting the artifacts and remains at risk of damage. You can only excavate a site once, so perhaps it’s worth waiting until we have the technology to do so in the least destructive way.

After all, there are some pretty recent horror stories of excavations involving angle grinders, and of ancient body parts ending up in garages. Future technologies might eventually make our current approaches look similarly barbaric.

The inescapable fact of fields like archaeology or paleontology is this: When you study ancient remains, you’ll probably end up damaging them in some way. Take, for example, DNA analysis. Scientists have made a huge amount of progress in this field. Today, geneticists can crack the genetic code of extinct animals and analyze DNA in soil samples to piece together the history of an environment.

But this kind of analysis essentially destroys the sample. To perform DNA analysis on human remains, scientists typically cut out a piece of bone and grind it up. They might use a tooth. But once it has been studied, that sample is gone for good.

Archaeological excavations have been performed for hundreds of years, and as recently as the 1950s, it was common for archaeologists to completely excavate a site they discovered. But those digs cause damage too.

Nowadays, when a site is discovered, archaeologists tend to focus on specific research questions they might want to answer, and excavate only enough to answer those questions, says Karl Harrison, a forensic archaeologist at the University of Exeter in the UK. “We will cross our fingers, excavate the minimal amount, and hope that the next generation of archaeologists will have new, better tools and finer abilities to work on stuff like this,” he says.

In general, scientists have also become more careful with human remains. Matteo Borrini, a forensic anthropologist at Liverpool John Moores University in the UK, curates his university’s collection of skeletal remains, which he says includes around 1,000 skeletons of medieval and Victorian Britons. The skeletons are extremely valuable for research, says Borrini, who himself has investigated the remains of one person who died from exposure to phosphorus in a match factory and another who was murdered.

When researchers ask to study the skeletons, Borrini will find out whether the research will somehow alter them. “If there is destructive sampling, we need to guarantee that the destruction will be minimal, and that there will be enough material [left] for further study,” he says. “Otherwise we don’t authorize the study.”

If only previous generations of archaeologists had taken a similar approach. Harrison told me the story of the discovery of “St Bees man,” a medieval man found in a lead coffin in Cumbria, UK, in 1981. The man, thought to have died in the 1300s, was found to be extraordinarily well preserved—his skin was intact, his organs were present, and he even still had his body hair.

Normally, archaeologists would dig up such ancient specimens with care, using tools made of natural substances like stone or brick, says Harrison. Not so for St Bees man. “His coffin was opened with an angle grinder,” says Harrison. The man’s body was removed and “stuck in a truck,” where he underwent a standard modern forensic postmortem, he adds.

“His thorax would have been opened up, his organs [removed and] weighed, [and] the top of his head would have been cut off,” says Harrison. Samples of the man’s organs “were kept in [the pathologist’s] garage for 40 years.”

If St Bees man were discovered today, the story would be completely different. The coffin itself would be recognized as a precious ancient artifact that should be handled with care, and the man’s remains would be scanned and imaged in the least destructive way possible, says Harrison.

Even Lindow man, who was discovered a mere three years later near Manchester, got better treatment. His remains were found in a peat bog, and he is thought to have died over 2,000 years ago. Unlike poor St Bees man, he underwent careful scientific investigation, and his remains took pride of place in the British Museum. Harrison remembers going to see the exhibit when he was 10 years old.

Harrison says he’s dreaming of minimally destructive DNA technologies—tools that might help us understand the lives of long-dead people without damaging their remains. I’m looking forward to covering those in the future. (In the meantime, I’m personally dreaming of a trip to—respectfully and carefully—visit Herculaneum.)

Now read the rest of The Checkup

Read more from MIT Technology Review’s archive

Some believe an “ancient-DNA revolution” is underway, as scientists use modern technologies to learn about human, animal, and environmental remains from the past. My colleague Antonio Regalado has the details in his recent feature. The piece was published in the latest edition of our magazine, which focuses on relationships.

Ancient DNA analysis made it to MIT Technology Review’s annual list of top 10 Breakthrough Technologies in 2023. You can read our thoughts on the breakthroughs of 2025 here.

DNA that was frozen for 2 million years was sequenced in 2022. The ancient DNA fragments, which were recovered from Greenland, may offer insight into the environment of the polar desert at the time.

Environmental DNA, also known as eDNA, can help scientists assemble a snapshot of all the organisms in a given place. Some are studying samples collected from Angkor Wat in Cambodia, which is believed to have been built in the 12th century.

Others are hoping that ancient DNA can be used to “de-extinct” animals that once lived on Earth. Colossal Biosciences is hoping to resurrect the dodo and the woolly mammoth.

From around the web

Next-generation obesity drugs might be too effective. One trial participant lost 22% of her body weight in nine months. Another lost 30% of his weight in just eight months. (STAT)

A US court upheld the conviction of Elizabeth Holmes, the disgraced founder of the biotechnology company Theranos, who was sentenced to over 11 years for defrauding investors out of hundreds of millions of dollars. Her sentence has since been reduced by two years for good behavior. (The Guardian)

An unvaccinated child died of measles in Texas. The death is the first reported as a result of the outbreak that is spreading in Texas and New Mexico, and the first measles death reported in the US in a decade. Health and Human Services Secretary Robert F. Kennedy Jr. appears to be downplaying the outbreak. (NBC News)

A mysterious disease with Ebola-like symptoms has emerged in the Democratic Republic of Congo. Hundreds of people have been infected in the last five weeks, and more than 50 people have died. (Wired)

Towana Looney has been discharged from the hospital three months after receiving a gene-edited pig kidney. “I’m so grateful to be alive and thankful to have received this incredible gift,” she said. (NYU Langone)
