
Nord Quantique plans to use the money to expand its team, says Julien Camirand Lemyre, the company’s president, CTO and co-founder. The expansion, he says, is an opportunity to accelerate development of the technology.
“By extension, what this will mean for enterprise users is that quantum solutions to real-world business problems will be available sooner, due to that acceleration,” he says. “And so enterprise customers need to also accelerate how they are thinking about adoption because the advantages quantum will provide will be tangible.” Lemyre predicts that useful quantum computers will be available for enterprises before the end of the decade.
“In fact, there has been tremendous progress across the entire quantum sector in recent years,” he says. “This means industry needs to begin thinking seriously about how they will integrate quantum computing into their operations over the medium term.”
“We’re seeing, with the deployment of programs like the QBI in the US and investments of billions of dollars from public and private investors globally, an increasing maturity of quantum technologies,” says Paul Terry, CEO at Photonic, which is betting on optically linked silicon spin qubits. “Our architecture has been designed from day one to build modular, scalable, fault-tolerant quantum systems able to be deployed in data centers,” he says.
He’s not the only one to emphasize fault tolerance. DARPA stressed it in its announcement, and its selections underscore the importance of error correction to the future of quantum computing.
The biggest problem with today’s quantum computers is that errors accumulate faster than qubits can be added, making the machines impossible to scale up. Quantum companies are working on a variety of approaches to push error rates low enough that quantum computers can grow large enough to do real work.
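The arithmetic behind that race is easy to see in a toy model. The sketch below is an illustration under simplified assumptions only, not any vendor’s actual error-correction scheme: it assumes independent bit-flip noise and the simplest possible code, a repetition code decoded by majority vote. It shows the threshold effect that makes fault tolerance the goal: when the physical error rate is below the code’s threshold, adding qubits suppresses logical errors dramatically; above it, adding qubits makes things worse.

```python
# Toy model: logical error rate of an n-qubit repetition code under
# independent bit-flip noise with probability p per qubit, decoded by
# majority vote. The logical bit is wrong when a majority of qubits flip.
from math import comb

def logical_error_rate(p: float, n: int) -> float:
    """Probability that a majority of n qubits flip (n odd),
    given per-qubit flip probability p."""
    return sum(
        comb(n, k) * p**k * (1 - p)**(n - k)
        for k in range((n + 1) // 2, n + 1)
    )

# Below the 50% threshold (p=0.01, p=0.1), more qubits -> fewer logical
# errors; above it (p=0.6), more qubits -> more logical errors.
for p in (0.01, 0.1, 0.6):
    rates = ", ".join(
        f"n={n}: {logical_error_rate(p, n):.2e}" for n in (3, 7, 15)
    )
    print(f"p={p}: {rates}")
```

Real fault-tolerant architectures use far more sophisticated codes than this, but the same threshold logic is what drives the sector’s focus on error correction.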