
AI servers are rewriting the power rulebook
The root cause, Panasonic noted in a statement, is the electrical behavior of AI workloads. Unlike conventional server applications, AI inference and training draw large amounts of electricity in short bursts to sustain GPU processing, causing peak power levels to spike rapidly and voltages to fluctuate.
“Peak power levels for such servers can rise rapidly, and voltages can often become unstable,” the statement said. “Securing stable, highly reliable power supplies is an absolute necessity for AI datacenters.”
Vertiv warned in its 2025 Data Center Trends predictions that AI racks must handle loads that “can fluctuate from a 10% idle to a 150% overload in a flash,” requiring UPS systems and batteries with significantly higher power densities than current infrastructure provides.
Panasonic said the solution gaining traction among hyperscalers is to place a battery backup unit on each server rack, rather than relying on centralized UPS infrastructure upstream, so that voltage instability is absorbed at the source. The company said its systems also include a peak-shaving function that stores off-peak electricity and deploys it during demand spikes, reducing peak grid draw at a time when AI-driven consumption faces growing regulatory and utility scrutiny.
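The peak-shaving idea described above can be sketched as a simple threshold policy: the rack-level battery recharges whenever the load sits below a grid-draw cap and discharges to cover anything above it. This is a minimal illustrative model, not Panasonic's actual control logic; the cap, capacity, and rate figures are assumptions chosen for the example.

```python
# Minimal peak-shaving sketch: a rack-level battery charges during
# off-peak intervals and discharges during spikes so that grid draw
# never exceeds a target cap. All numbers are illustrative.

def peak_shave(load_kw, cap_kw, capacity_kwh, max_rate_kw, dt_h=1 / 60):
    """Return per-interval grid draw when a battery shaves peaks.

    load_kw      -- rack load per interval (kW)
    cap_kw       -- target ceiling on grid draw (kW)
    capacity_kwh -- usable battery capacity (kWh)
    max_rate_kw  -- max charge/discharge rate (kW)
    dt_h         -- interval length in hours
    """
    soc = capacity_kwh  # state of charge; start fully charged
    grid = []
    for load in load_kw:
        if load > cap_kw:
            # Discharge to cover the excess, limited by rate and charge left.
            discharge = min(load - cap_kw, max_rate_kw, soc / dt_h)
            soc -= discharge * dt_h
            grid.append(load - discharge)
        else:
            # Recharge from spare grid headroom, limited by rate and capacity.
            charge = min(cap_kw - load, max_rate_kw, (capacity_kwh - soc) / dt_h)
            soc += charge * dt_h
            grid.append(load + charge)
    return grid

# A bursty AI-style profile: idle, then a sharp training spike (kW).
profile = [10, 10, 10, 80, 80, 10, 10]
shaved = peak_shave(profile, cap_kw=50, capacity_kwh=5, max_rate_kw=40)
print(shaved)  # grid draw stays at or below the 50 kW cap
```

With the battery in the loop, the 80 kW spikes are served partly from storage, and the recharge happens during the idle intervals that follow, which is exactly the off-peak-store, on-peak-deploy behavior the company describes.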
Several independent research bodies have reached similar conclusions on the severity of the power challenge ahead. Uptime Institute, in its Five Data Center Predictions for 2026, said “developers will not outrun the power shortage,” with research analyst Max Smolaks warning the crisis “is likely to last many years.” The IEA projected global datacenter electricity consumption could exceed 1,000 TWh by 2026, more than double 2022 levels, while Gartner has warned that energy shortages could restrict 40% of AI datacenters by 2027.
Gogia said the shift runs deeper than a hardware swap. “This is not backup in the traditional sense. This is active stabilisation,” he said. “Power delivery is no longer passive. It behaves like a dynamic system with control loops, response thresholds, and continuous monitoring requirements.” Most enterprises, he added, are not ready. “Many enterprise datacentres were designed for a different era — lower densities, predictable loads, centralised assumptions. Retrofitting for AI workloads requires redesign, not just upgrades.”
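Gogia's description of power delivery as "a dynamic system with control loops, response thresholds, and continuous monitoring" can be illustrated with a toy loop: the battery unit samples bus voltage and acts only when it leaves a configured band. The voltages and thresholds below are hypothetical values for a 48 V rack bus, not taken from any vendor's specification.

```python
# Toy "active stabilisation" loop: sample the bus voltage continuously
# and respond when it crosses configured thresholds. A real battery
# backup unit would do this in firmware at millisecond timescales;
# all thresholds here are illustrative assumptions.

def stabilise(v_samples, sag_threshold=46.0, surge_threshold=50.0):
    """Map each voltage sample to a battery action."""
    actions = []
    for v in v_samples:
        if v < sag_threshold:
            actions.append("discharge")  # inject power to hold the bus up
        elif v > surge_threshold:
            actions.append("absorb")     # soak up the excess during a surge
        else:
            actions.append("idle")       # within band: monitor only
    return actions

# A sag during a GPU burst, then a surge as the load drops off.
samples = [48.1, 47.9, 45.2, 44.8, 48.0, 51.3]
print(stabilise(samples))
```

The point of the sketch is the one Gogia makes: the unit is not a passive reserve waiting for an outage, but a controller that reacts to every excursion, which is why retrofitting it into a facility designed around centralized, steady loads is a redesign rather than a bolt-on.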