Summit is the fastest supercomputer in the world – capable of 200 petaFLOPS and, on one scientific calculation, the first computer to exceed a quintillion operations per second (that’s 10^18!). But, with great fanfare, Google announced that its quantum chip Sycamore completed a specific task faster than Summit. This is a huge achievement that has the quantum computer industry buzzing with excitement.

From the outside, though, Google’s achievement may give the impression that quantum computers will be nothing more than a one-trick pony. After all, Google carefully chose one very specific computational task – namely, sampling from a random quantum circuit, which is particularly hard on a conventional computer like Summit but of little practical relevance. The next, and more important, task is to make quantum computers useful.


Just like (most) conventional computers around us, quantum computers are universal – a clever quantum programmer can write code for any computational task that can be expressed as a programme. But that does not mean quantum computers will be faster than conventional computers at all of these tasks.

The secret sauce that makes quantum computers fast is called quantum superposition. Quantum bits, or qubits, can not only be in one of two states but also in any superposition of these states. While this sounds like it ought to result in massive computational advantages, it is very hard to find algorithms that exploit it. To extract information, the quantum state will have to be measured, which collapses the corresponding qubits into either a one or a zero. A quantum algorithm designer has to find cunning ways of circumventing this difficulty by exploiting interference. Currently, there are only around 400 quantum algorithms, compared with the millions and millions of algorithms that exist for conventional computers.
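The superposition-and-collapse story above can be made concrete with a few lines of linear algebra. This is a toy single-qubit simulation in Python/NumPy (my illustration, not part of the article, and nothing like a real quantum device):

```python
import numpy as np

# A single qubit as a 2-entry state vector: amplitudes for |0> and |1>.
zero = np.array([1.0, 0.0])

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = H @ zero          # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement follows the Born rule: probability = |amplitude|^2.
probs = np.abs(state) ** 2   # [0.5, 0.5]

# Measuring collapses the qubit to a definite 0 or 1, destroying the
# superposition -- which is why extracting information is so hard.
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)
print(probs, outcome)
```

The algorithm designer's job, loosely, is to arrange interference so that the amplitudes of wrong answers cancel before this final, destructive measurement.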

The advantages that these quantum algorithms have over their conventional counterparts vary. For example, Grover’s algorithm searches an unstructured list of N items in roughly √N steps, rather than the roughly N steps a conventional computer needs – a square-root advantage. While some of the algorithms with square-root advantages may become very important in the future, they are not candidates for a near-term demonstration of useful quantum computation. A small quantum computer simply cannot compete with 50 years of progress in semiconductor technology if the algorithm offers a mere square-root gain: on a task that takes Summit one second – around 10^18 operations – the square root of that workload is 10^9 steps, so the quantum computer would need to perform 10^9 operations per second just to keep pace with Summit.
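The break-even arithmetic behind the 10^9 figure can be written out explicitly. A short Python sketch, using the article's illustrative numbers:

```python
import math

summit_ops_per_sec = 1e18     # the article's figure for Summit
problem_size = 1e18           # a search task that takes Summit ~1 second

classical_steps = problem_size            # brute-force search: ~N steps
quantum_steps = math.sqrt(problem_size)   # Grover's algorithm: ~sqrt(N) steps

# To finish the same task in the same one second, the quantum machine
# needs sqrt(N) = 1e9 operations per second -- a billion per second,
# just to break even against a mere square-root gain.
print(quantum_steps)  # 1e9
```

A billion quantum operations per second is far beyond any near-term device, which is why square-root speed-ups alone will not deliver useful quantum computation soon.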

The most famous quantum algorithm, Shor’s algorithm, is a method for finding the prime factors of an integer. It has an exponential advantage over conventional computers, but it is competing against a long history of algorithmic development: classical methods such as the number field sieve can already factor very large numbers. Although quantum computers are exponentially better at factoring, they will need to be very large before they become competitive.
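The classical half of Shor's algorithm is easy to show. Once the period r of a^x mod N is known – the step a quantum computer finds exponentially faster – the factors fall out via greatest common divisors. A toy Python sketch for N = 15 (my illustration; real cryptographic targets are thousands of bits, and the brute-force period search below is exactly the part that is classically hard):

```python
from math import gcd

def period(a, N):
    """Find the period r of a^x mod N by brute force.
    This is the exponentially hard step that Shor's algorithm
    hands to the quantum computer."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7              # toy example: factor 15 using base 7
r = period(a, N)          # r = 4 for this pair
assert r % 2 == 0         # Shor's reduction needs an even period

f1 = gcd(pow(a, r // 2) - 1, N)   # gcd(48, 15) = 3
f2 = gcd(pow(a, r // 2) + 1, N)   # gcd(50, 15) = 5
print(f1, f2)  # 3 5
```

Everything here except `period` runs in polynomial time on a laptop; the quantum machine's only job is to replace that while-loop.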


Rather, the answer to the billion-dollar quest for quantum usefulness is based on a mundane-sounding realisation: being quantum systems themselves, quantum computers excel at simulating other quantum systems.

Quantum mechanics is one of the most successful theories of physics – it predicts properties of materials such as their colour or how well they conduct electricity, and the movement of molecules as they approach a protein, for example. Quantum computers could simulate electrons in atoms and molecules exponentially faster than classical computers and bring a sea change to how researchers in industry and academia design new materials or drugs. With R&D budgets in the billions of dollars, these applications are significant. And they may become relevant rather soon, because conventional methods are likely to be outpaced even by small, noisy “NISQ” (noisy intermediate-scale quantum) devices.

In the past few years, quantum algorithms such as the so-called variational quantum eigensolver have been developed that use the strengths of quantum computers and classical computers in concert. Only the most difficult parts of the calculations are delegated to the quantum machine, while the classical computer sums and optimises over many individual quantum computations.
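This hybrid division of labour can be sketched in a few lines. Below is a toy stand-in in pure NumPy (my illustration – no real quantum hardware, and a deliberately trivial one-qubit Hamiltonian): the "quantum" subroutine evaluates the energy of a parametrised state, and a classical outer loop tunes the parameter, in the spirit of the variational quantum eigensolver:

```python
import numpy as np

# A trivial one-qubit Hamiltonian: the Pauli-Z matrix.
# Its true ground-state energy is -1 (eigenstate |1>).
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def energy(theta):
    """The 'quantum' subroutine: prepare the ansatz state
    Ry(theta)|0> = (cos(theta/2), sin(theta/2)) and return <psi|Z|psi>.
    On real hardware this expectation value would come from many
    repeated measurements; here we simply compute it."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return float(psi @ Z @ psi)

# The classical outer loop: gradient descent on the single parameter,
# using a finite-difference estimate of the gradient.
theta, lr = 0.5, 0.4
for _ in range(200):
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= lr * grad

print(round(energy(theta), 6))  # converges towards the ground energy, -1
```

Real variational eigensolvers do exactly this dance, only with many-qubit molecular Hamiltonians where the energy evaluation is the classically intractable part.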

However, computational chemists have developed many tricks to expedite the simulation of materials and chemicals on conventional computers against all odds. There is a whole zoo of methods for calculating chemical and materials properties, each with a different trade-off between accuracy and computational efficiency. In conjunction with hardware developments such as GPUs, conventional computational methods are rapidly gaining importance in materials and drug discovery. For example, the recent development of drugs against HIV – called HIV integrase inhibitors – has relied heavily on computational modelling. Calculating the effectiveness and specificity of HIV integrase inhibitors is a problem that naturally lends itself to quantum computation. But how fast and accurate would the quantum computer have to be to really make a difference in the drug discovery process?

If quantum computers are to beat conventional computers on a useful problem, quantum computing researchers need to engage with computational chemists to find problems of relevance, and customise hardware and software to these specific cases. This exchange between communities is beginning to happen. More and more pharmaceutical and materials companies are engaging with quantum computing. Given that quantum physicists have long worked with computer scientists on one of the toughest problems in the world – building a quantum computer – they are no strangers to cross-disciplinary work. But they are about to start broadening their perspective once more.

**Dr Steve Brierley is CEO of Cambridge-based quantum computing software developer Riverlane**
