
Quantum computers and classical HPC are traditionally “disparate systems [that] operate in isolation,” IBM researchers explain in a new paper. This can be “cumbersome,” because users have to manually orchestrate workflows, coordinate scheduling, and transfer data between systems, thus hindering productivity and “severely” limiting algorithmic exploration.
But a hybrid approach can simplify the process of applying quantum computing to problems in areas like chemistry, materials science, and optimization, and “solve problems that were previously out of reach,” IBM says.
The researchers describe quantum-centric supercomputing (QCSC) as evolving through three distinct phases: quantum systems as specialized compute offload engines within HPC environments; quantum and classical HPC systems coupled through advanced middleware; and fully co-designed HPC and quantum systems for hybrid workflows.
The first phase focuses on establishing “foundational integration across multiple dimensions,” the researchers explain. The second phase focuses on reducing latency, enabling sophisticated feedback mechanisms between the quantum and classical sides, and supporting complex hybrid algorithms. The third phase represents the “culmination” of the integration through “fully co-designed heterogeneous systems where quantum and classical resources are architected as unified platforms from the ground up.”
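To make the phase-one offload model concrete, here is a minimal Python sketch. It is purely illustrative, not IBM's implementation: the `QuantumOffloadEngine` class is a hypothetical stand-in for a real quantum backend, and the workflow simply shows the classical pre-process, offload, post-process pattern the researchers describe.

```python
import random

class QuantumOffloadEngine:
    """Hypothetical stand-in for a quantum backend reached over a network."""
    def run(self, circuit_spec, shots=1024):
        # A real engine would queue the circuit, execute it on hardware,
        # and return measurement counts; here we fake a Bell-state result.
        counts = {"00": 0, "11": 0}
        for _ in range(shots):
            counts[random.choice(["00", "11"])] += 1
        return counts

def hybrid_workflow(engine):
    # 1. Classical pre-processing: build the problem description.
    circuit_spec = {"qubits": 2, "gates": [("h", 0), ("cx", 0, 1)]}
    # 2. Offload: the classical host hands the job to the quantum engine.
    #    In phase one this hand-off is manual and relatively high-latency.
    counts = engine.run(circuit_spec, shots=1000)
    # 3. Classical post-processing: turn counts into probabilities.
    total = sum(counts.values())
    return {bitstring: n / total for bitstring, n in counts.items()}

probabilities = hybrid_workflow(QuantumOffloadEngine())
print(probabilities)  # e.g. {'00': 0.49, '11': 0.51}
```

The later phases would tighten exactly the hand-off step above: middleware coupling shrinks the latency of `engine.run`, and co-designed systems make the quantum resource a first-class peer rather than a remote call.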
This evolution mirrors the trajectory of GPUs in HPC systems, the researchers note: early GPUs typically functioned as external accelerators attached to host processors, but dedicated interconnects were later established between GPUs and CPUs, and from GPU to GPU, providing much higher bandwidth and lower latency.
“Similarly, quantum systems will transition from standalone units to fully integrated components within co-designed quantum-HPC platforms,” the researchers contend.