4 technologies that could power the future of energy

Where can you find lasers, electric guitars, and racks full of novel batteries, all in the same giant room? This week, the answer was the 2025 ARPA-E Energy Innovation Summit just outside Washington, DC.

Energy innovation can take many forms, and the variety in energy research was on display at the summit. ARPA-E, part of the US Department of Energy, provides funding for high-risk, high-reward research projects. The summit gathers projects the agency has funded, along with investors, policymakers, and journalists.

Hundreds of projects were exhibited in a massive hall during the conference, featuring demonstrations and research results. Here are four of the most interesting innovations MIT Technology Review spotted on site. 

Steel made with lasers

Startup Limelight Steel has developed a process to make iron, the main component in steel, by using lasers to heat iron ore to super-high temperatures. 

Steel production makes up roughly 8% of global greenhouse gas emissions today, in part because most steel is still made with blast furnaces, which rely on coal to hit the high temperatures that kick off the required chemical reactions. 

Limelight instead shines lasers on iron ore, heating it to temperatures over 1,600 °C. Molten iron can then be separated from impurities, and the iron can be put through existing processes to make steel. 

The company has built a small demonstration system with a laser power of about 1.5 kilowatts, which can process between 10 and 20 grams of ore. The whole system is made up of 16 laser arrays, each just a bit larger than a postage stamp.

The components in the demonstration system are commercially available; this particular type of laser is used in projectors. The startup has benefited from years of progress in the telecommunications industry that has helped bring down the cost of lasers, says Andy Zhao, the company’s cofounder and CTO. 

The next step is to build a larger-scale system that will use 150 kilowatts of laser power and could make up to 100 tons of steel over the course of a year.

Rocks that can make fuel

The hunks of rock at a booth hosted by MIT might not seem all that high-tech, but someday they could help produce fuels and chemicals. 

A major topic of conversation at the ARPA-E summit was geologic hydrogen—there’s a ton of excitement about efforts to find underground deposits of the gas, which could be used as a fuel in a wide range of sectors, including transportation and heavy industry. 

Last year, ARPA-E funded a handful of projects on the topic, including one in Iwnetim Abate’s lab at MIT. Abate is among the researchers aiming not just to hunt for hydrogen but to use underground conditions to help produce it. Earlier this year, his team published research showing that by using catalysts and conditions common in the subsurface, scientists can produce hydrogen as well as other chemicals, like ammonia. Abate cofounded a spinout company, Addis Energy, to commercialize the research, which has since also received ARPA-E funding.

All the rocks on the table, from the chunk of dark, hard basalt to the softer talc, could be used to produce these chemicals. 

An electric guitar powered by iron nitride magnets

The sound of music drifted from the Niron Magnetics booth across nearby walkways. People wandering by stopped to take turns testing out the company’s magnets, in the form of an electric guitar. 

Most high-performance magnets today contain neodymium, a rare earth metal. Demand for them is set to skyrocket in the coming years, especially as the world builds more electric vehicles and wind turbines. Supplies could stretch thin, and the geopolitics are complicated, because most of the supply comes from China. 

Niron is making new magnets that don’t contain rare earth metals. Instead, its technology is based on more abundant materials: nitrogen and iron. 

The guitar is a demonstration product—the pickups in electric guitars typically rely on magnets made with aluminum, nickel, and cobalt, which help translate the vibrations of the steel strings into an electric signal that is broadcast through an amplifier. Niron made an instrument using its iron nitride magnets instead. (See photos of the guitar from an event last year here.)

Niron opened a pilot commercial facility in late 2024 that has the capacity to produce 10 tons of magnets annually. Since we last covered Niron, in early 2024, the company has announced plans for a full-scale plant, which will have an annual capacity of about 1,500 tons of magnets once it’s fully ramped up. 

Batteries for powering high-performance data centers

The increasing power demand from AI and data centers was another hot topic at the summit, with server racks dotting the showcase floor to demonstrate technologies aimed at the sector. One stuffed with batteries caught my eye, courtesy of Natron Energy. 

The company is making sodium-ion batteries to help meet power demand from data centers. 

Data centers’ energy demands can be incredibly variable—and as their total power needs get bigger, those swings can start to affect the grid. Natron’s sodium-ion batteries can be installed at these facilities to help level off the biggest peaks, allowing computing equipment to run full out without overly taxing the grid, says Natron cofounder and CTO Colin Wessells. 

Sodium-ion batteries are a cheaper alternative to lithium-based chemistries. They also avoid lithium, cobalt, and nickel, materials that are constrained in production or processing. We’re seeing some varieties of sodium-ion batteries popping up in electric vehicles in China.

Natron opened a production line in Michigan last year, and the company plans to open a $1.4 billion factory in North Carolina.
