Junior Peña, neutrino hunter

Growing up in South Central Los Angeles, Junior Peña learned to keep his eyes down and his schedule full. In his neighborhood, a glance could invite trouble, and many kids—including his older brother—were pulled into gang culture. He knew early on that he wanted something else. With his parents working long hours, he went to after-school programs, played video games, and practiced martial arts. But his friends had no idea that he also spent hours online poring over textbooks and watching lectures, teaching himself advanced mathematics and philosophy. “Being good at school wasn’t how people saw me,” he says. 

One night in high school, he came across a YouTube video about the Higgs boson—the so-called “God particle,” thought to give mass to nearly everything in the universe. “I remember my mind being flooded with questions about life, the universe, and our existence,” he recalls. He’d already looked into philosophers’ answers to those questions but was drawn to the more concrete explanations of physics.

After his independent study helped Peña pass AP calculus as a junior, his fascination with physics led him to the University of Southern California, the 2019 session of MIT’s Summer Research Program, and then MIT for grad school. Today, he’s working to shed light on neutrinos, the ghostly uncharged particles that slip effortlessly through matter; stopping them would require a wall of lead five light-years thick.

As a grad student in the lab of Joseph Formaggio, an experimental physicist known for pioneering new techniques in neutrino detection, Peña works alongside leading physicists designing technology to precisely measure what are arguably the universe’s most elusive particles. Emanating from such sources as the sun and supernovas (and generated artificially by particle accelerators and nuclear reactors), neutrinos reveal their presence through an absence. Their existence was initially posited in the 1930s by the physicist Wolfgang Pauli, who noticed that energy seemed to go missing when atoms underwent a process known as radioactive beta decay. According to the law of conservation of energy, the total energy of the particles emitted during radioactive decay must equal the energy of the decaying atom. To account for the missing energy, Pauli proposed the existence of an undetectable particle that was carrying it away. 

Einstein’s E = mc² tells us that if energy is missing, then mass must be too. Yet according to the standard model of physics—which offers our most trusted theory for how particles behave—neutrinos should have no mass at all. Unlike other particles, they don’t interact with the Higgs field, a kind of cosmic molasses that slows particles down and gives them mass. Because they pass through it untouched, they should remain massless. 

But by the early 2000s, researchers had discovered that neutrinos, which had first been detected in the 1950s, can shift between three types, a feat possible only if they have mass. So now the tantalizing question is: What is their mass? 

Determining neutrinos’ exact mass could explain why matter triumphed over antimatter, refine models of cosmic evolution, and clarify the particles’ role in dark matter and dark energy. And the Formaggio Lab is part of Project 8, an international collaboration of 71 scientists in 17 institutions working to make that measurement. To do this, the lab uses tritium, an unstable isotope of hydrogen that decays into helium, releasing both an electron and a particle called an antineutrino (“every particle has an antiparticle counterpart,” Formaggio explains). By precisely measuring the energy spectrum of those electrons, scientists can determine how much energy is missing, allowing them to infer the neutrinos’ mass.
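For readers who want the bookkeeping made explicit, here is a simplified sketch of the energy balance the measurement relies on, written in LaTeX. The symbols (Q for the total energy released in the decay, E_e for the electron’s kinetic energy, m_ν for the neutrino mass) are illustrative labels chosen for this sketch rather than notation drawn from the collaboration’s papers, and nuclear-recoil corrections are ignored.

\[ {}^{3}\mathrm{H} \;\rightarrow\; {}^{3}\mathrm{He}^{+} + e^{-} + \bar{\nu}_{e} \]

\[ E_{e} \;\le\; E_{e}^{\mathrm{max}} \;\approx\; Q - m_{\nu}c^{2} \]

In words: the electron can carry away at most the full decay energy Q (about 18.6 keV for tritium) minus the energy locked up in the neutrino’s rest mass, so the precise point where the measured electron spectrum cuts off reveals how heavy the neutrino is.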

At the heart of this experiment is a novel detection method called cyclotron radiation emission spectroscopy (CRES), first proposed in 2008 by Formaggio and his then postdoc Benjamin Monreal, which “listens” to the faint radio signals emitted as electrons spiral through a magnetic field. Peña was instrumental in designing a crucial part of the tool that will make this possible: a copper cavity that he likens to a guitar, with the electrons released during beta decay acting like plucked strings. The cavity will amplify their signals, helping researchers to measure them exactly. Peña spent more than a year developing and refining a flashlight-size prototype of the device in collaboration with machinists and fellow physicists.
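As a rough sketch of why “listening” to electrons works, here is the textbook relation CRES exploits, in LaTeX; it is standard cyclotron physics rather than anything specific to Peña’s device, with the usual symbols (e and m_e for the electron’s charge and mass, B for the magnetic field strength, γ for the relativistic factor, E_kin for the electron’s kinetic energy).

\[ f_{c} \;=\; \frac{1}{2\pi}\,\frac{eB}{\gamma m_{e}} \;=\; \frac{1}{2\pi}\,\frac{eB\,c^{2}}{m_{e}c^{2} + E_{\mathrm{kin}}} \]

Because the frequency of the emitted radiation depends only on the known magnetic field and the electron’s energy, recording that faint radio signal in effect weighs each electron without ever touching it.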

Peña designed a prototype copper microwave resonator to amplify the signals of electrons emitted as tritium decays, allowing researchers to measure them exactly and infer the neutrino’s mass.
JESSICA CHOMIK-MORALES, SM ’25

“He had to learn the [design and simulation] software, figure out how to interpret the signals, and test iteration after iteration,” says Formaggio, Peña’s advisor. “It’s been incredible watching him take this from a rough idea to a working design.”

The design of Peña’s cavity had to balance competing demands. It needed a way to extract the electrons’ signals that was compatible with the researchers’ methods for calibrating the system, one of which involves using an electron gun to inject electrons of a known, precise energy into the cavity. It also needed to preserve the properties of the electromagnetic fields within the cavity. In May, Peña sent his final prototype to the University of Washington, where it was installed in July. Researchers hope to begin calibration this fall. Then Peña’s cavity and the full experimental setup will be scaled up so that, in a few years, the team can begin collecting CRES data using tritium.
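For a sense of what preserving those electromagnetic fields entails, here is the textbook expression for the resonant frequencies of an idealized cylindrical microwave cavity, offered purely as an illustration; the article does not describe the actual geometry of Peña’s resonator, so the cylinder, with radius a and length L, is an assumption made for this sketch.

\[ f_{mnp} \;=\; \frac{c}{2\pi}\sqrt{\left(\frac{\chi_{mn}}{a}\right)^{2} + \left(\frac{p\pi}{L}\right)^{2}} \]

Here χ_mn is a zero of a Bessel function (or of its derivative, depending on the mode), and the integers m, n, p label the mode pattern. Any port cut into the walls to read out the electrons’ signals, or to admit the calibration electron gun, slightly changes the effective geometry and shifts these frequencies; that is the tension the design had to balance.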

“We’ve been working toward this for at least three years,” says Jeremy Gaison, a Project 8 physicist at the Pacific Northwest National Lab. “When we finally turn on the experiment, it’s going to be incredible to see if all of our simulations and studies actually hold up in real data.”

Peña’s contribution to the effort “is the core of this experiment,” says Wouter Van De Pontseele, another Project 8 collaborator and former Formaggio Lab postdoc. “Junior took an idea and turned it into reality.” 

Project 8 is still in its early stages. The next phase will scale up with larger, more complex versions of the technology Peña played a key role in developing, culminating in a vast facility designed to hunt for the neutrino’s mass. If that is successful, the findings could have profound implications for our understanding of the universe’s structure, the evolution of galaxies, and even the fundamental nature of matter itself.

Eager to keep probing such open questions in fundamental physics, Peña is still exploring options for his postdoc work. One possibility is focusing on the emerging field of levitated nanosensors, which could advance gravitation experiments, efforts to detect dark matter, and searches for the sterile neutrino, a posited fourth variety that interacts even more rarely than the others.

“Experimental particle physics is long-term work,” says Van De Pontseele. “Some of us will stay on this project for decades, but Junior can walk away knowing he made a lasting impact.”

Peña also hopes to have a lasting impact as a professor, opening doors for students who, like him, never saw themselves reflected in the halls of academia. “A summer program brought me here,” he says. “I owe it to the next kid to show they belong.”
