Quantum computing invites us to imagine a world where a tool of extraordinary power enables us to solve many of the world’s most pressing problems and some of the greatest scientific mysteries. Little wonder then that a vibrant commercial sector has already developed around companies offering prototype quantum computers, with the next ‘big breakthrough’ forever on the horizon.
But it’s time for a reality check. The prototype models available today, generally referred to as Noisy Intermediate-Scale Quantum (NISQ) computers, are a long way from achieving those lofty goals – and most likely, they never will.
The NISQ model falls short
All quantum computers suffer from environmental and operational ‘noise’ that degrades the fragile quantum states of qubits – the basic units of information in quantum computing – and introduces errors into the computation. This dramatically limits the number of operations you can run on a quantum computer and fundamentally limits its usefulness. Simply adding more qubits won’t improve the performance of a NISQ computer unless you can also reduce the errors.
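To see why, here is a minimal back-of-the-envelope sketch (an illustration added here, not a calculation for any specific machine): if each quantum operation fails independently with a small probability, the chance of an entire algorithm running error-free collapses as the number of operations grows.

```python
# Minimal sketch: per-operation error rate p, algorithm of N operations.
# Assumes independent errors; real devices are messier, but the trend holds.
def success_probability(error_rate: float, num_operations: int) -> float:
    """Probability that every one of num_operations succeeds."""
    return (1 - error_rate) ** num_operations

# With a ballpark present-day error rate of ~1 in 1,000, even a modest
# 10,000-operation algorithm almost never completes without an error.
for n_ops in (100, 1_000, 10_000):
    print(f"{n_ops:>6} operations: P(no error) ~ {success_probability(1e-3, n_ops):.4f}")
```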
High-impact applications require error rates of around one error in every trillion operations (no, that’s not a typo) in a quantum computer with a few thousand qubits. The best error rates for quantum operations currently sit at between one error in every thousand and one in every ten thousand operations, achieved using just tens of qubits. This means that even the best NISQ computers are falling short of what needs to be achieved by over eight orders of magnitude.
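As a quick sanity check on that figure, the gap can be computed directly from the ballpark rates quoted above:

```python
import math

# Gap between today's best physical error rates (~1e-3 to 1e-4, as above)
# and the ~1e-12 needed for high-impact applications.
required_rate = 1e-12
for current_rate in (1e-3, 1e-4):
    gap = math.log10(current_rate / required_rate)
    print(f"current {current_rate:.0e} vs required {required_rate:.0e}: "
          f"~{gap:.0f} orders of magnitude short")
```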
Given that all current qubit implementations, even the high-quality ones, have far too many noise-induced errors to run the most impactful algorithms, we have to find a way to correct these errors programmatically. And this is where the current NISQ prototypes really run into problems, because the only ways of reducing errors in these computers – such as reconfiguring algorithms via classical computing methods or simply repeating the algorithm until it works – won’t work at scale. And as we’ve already seen, scale is everything in quantum computing.
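To make the ‘repeat until it works’ point concrete, here is a sketch using the same independent-error assumption as the earlier example: the expected number of repetitions needed grows exponentially with the size of the algorithm.

```python
# Expected repetitions = 1 / P(error-free run), assuming independent errors
# at a per-operation rate of ~1 in 1,000 (illustrative ballpark).
def expected_repetitions(error_rate: float, num_operations: int) -> float:
    return 1.0 / (1 - error_rate) ** num_operations

for n_ops in (1_000, 10_000, 100_000):
    print(f"{n_ops:>7} operations: ~{expected_repetitions(1e-3, n_ops):.1e} runs needed on average")
```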
FTQC is the way forward
So what’s the solution? It exists at the level of ‘physical’ qubits, the fundamental building blocks of the quantum computer. Generally speaking, the more physical qubits used to create a ‘logical’ qubit, the lower the error rate on that logical qubit. By implementing Quantum Error Correction (QEC) techniques, and using hundreds of physical qubits per logical qubit, we can theoretically reach sufficiently low error rates to run the most powerful applications.
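To give a feel for this trade-off, the sketch below uses the widely cited surface-code rule of thumb, in which the logical error rate falls exponentially with the code distance while the qubit overhead grows quadratically. The constants and the threshold value are illustrative assumptions, not figures for any particular hardware.

```python
# Illustrative surface-code-style scaling (assumed constants, for intuition only):
#   logical error rate            ~ 0.1 * (p / p_th) ** ((d + 1) // 2)
#   physical qubits per logical   ~ 2 * d**2
def logical_error_rate(p: float, d: int, p_threshold: float = 1e-2) -> float:
    return 0.1 * (p / p_threshold) ** ((d + 1) // 2)

def physical_qubits_per_logical(d: int) -> int:
    return 2 * d * d

p = 1e-3  # a good present-day physical error rate
for d in (3, 11, 21):
    print(f"distance {d:>2}: ~{physical_qubits_per_logical(d):>4} physical qubits per logical qubit, "
          f"logical error rate ~ {logical_error_rate(p, d):.0e}")
```

Under these illustrative assumptions, several hundred physical qubits per logical qubit is enough to approach the one-in-a-trillion error rates quoted earlier.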
Of course, we’re not there yet. But to make the necessary progress to achieve this goal, we need to have a frank conversation about the current direction of travel in quantum computing. We must be clear that, while the NISQ model has attracted early-stage investment and driven the quantum market so far, continuing to move in this direction will merely lead us into a blind alley. QEC offers the best way forward, but it’s not possible to simply scale NISQ computers to the extent needed for them to work.
For quantum computing to properly fulfil its potential, we need a machine with millions of individually controllable, error-corrected, high-quality qubits – a fault-tolerant quantum computer (FTQC). To create such machines, we have to look beyond performance and error rates at current system sizes, and concentrate instead on the development of significantly bigger systems. And to do that, companies producing quantum computing hardware have to address architecture-level issues around modularity, manufacturability and connection efficiency.
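Multiplying the earlier figures together shows where the ‘millions’ comes from (the specific numbers are illustrative assumptions, not a roadmap):

```python
# A few thousand error-corrected logical qubits, each built from roughly a
# thousand physical qubits, already implies millions of physical qubits.
logical_qubits_needed = 4_000   # illustrative ballpark for high-impact algorithms
physical_per_logical = 1_000    # illustrative ballpark at useful code distances
print(f"~{logical_qubits_needed * physical_per_logical:,} physical qubits")
```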
For example, for the vast majority of architectures, connecting together multiple modules that can seamlessly transfer quantum information between them is likely to be the only viable way to create a sufficiently large FTQC system. However, many current approaches to building quantum computers have no working solution for connecting individual modules together, making it impossible for them to scale up.
In practical terms, too, it is a huge time and cost advantage if commercially available manufacturing solutions can be used to make FTQC hardware, given that new facilities and technologies can require investments running to billions of dollars. Yet the quantum computing roadmaps of many companies still rely on technologies, material properties or capabilities that haven’t even been invented yet.
Scalability is vital
But perhaps the biggest challenge that the quantum computing community has to overcome is short-termism. Money has gone into the NISQ model because it’s been able to deliver some limited economic value in a relatively short timescale. Yet much of this activity has been a distraction from the larger goal of reaching the FTQC scale, which is where the real value in quantum computing lies.
We have to reset the conversation around quantum computing to focus more on scalability and to encourage long-term thinking. Otherwise, what should be a transformative technology is in danger of remaining stuck in the shallows, its potential to solve the big issues facing the world left unharnessed.
Dr Sebastian Weidt, CEO, Universal Quantum