McKinsey's recent report, "A game plan for quantum computing", explains how quantum computers can solve algorithmic challenges of a complexity and scale not previously addressable, completing in seconds or minutes computations that would take classical computers hundreds or thousands of years.

The report speculates about exciting applications across a range of industry sectors, such as:

  • Cutting development time for chemicals and pharmaceuticals;
  • Solving optimisation problems like routing and layouts at unprecedented speed; and
  • Underpinning next-generation quantum AI in applications like autonomous vehicles.

Leaders should be thinking hard about how quantum computing will upset the natural order in their industry sectors and key applications. The report suggests that business leaders in first-wave sectors need to develop a quantum strategy quickly or risk being left behind by innovative companies, such as Barclays, BASF, BMW, Dow and ExxonMobil, that have already taken strategic steps into quantum computing.

The report also discusses some of the challenges in building and operating quantum computers, notably the need to manipulate and measure the entangled qubits that together perform numerous calculations simultaneously.

Initially, cloud-based quantum computing services will give technical communities access to this technology. Open-source and other libraries of code and pseudocode for execution on quantum computers are already available online.

Impressive results have been achieved by combining the quantum computers currently available with classical supercomputers. Because quantum computers are particularly efficient at certain types of algorithm, problems can be decomposed so that a quantum computer, working billions of times faster than today's classical machines, executes the phase that is conventionally inefficient, and the remainder of the problem is then solved on classical supercomputers.

However, perhaps the immediate concern for society and businesses in general is the efficiency with which quantum computers can crack many of the widely used encryption schemes that protect our privacy in Internet communications and data storage systems.  McKinsey’s report quotes Prof. Jonathan Dowling from Louisiana State University:

“If you have business and/or trade secrets that you would want to keep secret for 10 to 50 years, then you need to start worrying now”.  

In part, this concern is driven by reported increases in "download now, decrypt later" attacks, in which attackers obtain and store encrypted data with the intention of decrypting it using quantum computers in the near future. For example, an end-to-end encrypted VoIP call may be secure today, but anyone recording and storing the encrypted traffic will be able to reveal its content whenever sufficiently capable quantum computers become available.

There are two main branches of quantum computing technology. First, universal (gate-based) quantum computers are being developed by the likes of IBM, Intel, Microsoft and Alibaba. Second, quantum annealers are being developed by D-Wave and others. Although the science underpinning these machines differs, both branches have the potential to break the core encryption schemes underpinning communication and data storage on the Internet as we know it today.

Public key encryption is the technology used to visit HTTPS websites securely, to enable secure email, to secure biometric passports, and more. Most of the public key encryption schemes in use today rely for their security on the difficulty of deducing the prime factors of very large composite numbers (for example, RSA encryption and signing), or of computing discrete logarithms (for example, ECDSA).
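To make the dependence on factoring concrete, here is a toy RSA key pair in Python. This is purely illustrative: real keys use primes of a thousand or more bits, and these values are far too small to be secure.

```python
# Toy RSA: the public key (n, e) hides the primes p and q.
# Anyone who can factor n can recompute the private key d.
p, q = 61, 53
n = p * q                      # 3233 -- the public modulus
phi = (p - 1) * (q - 1)        # 3120 -- needs knowledge of p and q
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # 2753 -- the private exponent

message = 65
ciphertext = pow(message, e, n)   # encrypt with the public key
recovered = pow(ciphertext, d, n) # decrypt with the private key
print(recovered)  # 65
```

An attacker who recovers p and q from n can run the same `pow(e, -1, phi)` step and read every message, which is exactly what a quantum factoring attack threatens.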

Classical computers take a very long time (hundreds or thousands of years, sometimes more) to find the prime factors of very large numbers by iterative guesswork. Both universal quantum computers and quantum annealing machines are capable of attacking the hardest aspects of these factoring problems in ways that are orders of magnitude more efficient, and hence quantum computers can render many current types of encryption obsolete.
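The classical brute-force difficulty is easy to see in a sketch: trial division has a cost that grows roughly with the square root of the number being factored, which is trivial for toy values and hopeless for 2048-bit RSA moduli.

```python
import math

def trial_factor(n):
    """Factor n by trial division -- the iterative-guesswork approach.
    The loop runs up to sqrt(n), so doubling the bit-length of n
    squares the work required, which is why large moduli are out of
    reach for classical machines."""
    for candidate in range(2, math.isqrt(n) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    return None  # n is prime

print(trial_factor(3233))  # (53, 61) -- instant at this size
```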

For universal quantum computers, increasingly optimised derivatives of Shor's algorithm restate the factoring problem as one of finding the period of a sequence. Employing highly efficient quantum Fourier transform operations, they evaluate all candidate values simultaneously while destructive interference cancels the incorrect answers. In this way the factoring problem can be solved in relatively few steps and in unfathomably short timescales.
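The quantum speed-up itself cannot be reproduced on a classical machine, but the structure of Shor's algorithm can be: find the period r of the sequence a^x mod n, then derive factors from it. The sketch below finds the period by exponential classical search, purely to illustrate the steps a quantum computer performs in polynomial time.

```python
from math import gcd

def shor_classical_sketch(n, a=2):
    """Classical mimicry of Shor's algorithm for tiny n.
    Step 1: find the period r with a^r == 1 (mod n) -- the step a
    quantum computer performs via the quantum Fourier transform.
    Step 2: for even r, gcd(a^(r/2) - 1, n) often reveals a factor."""
    r, value = 1, a % n
    while value != 1:          # exponential classical search for the period
        value = (value * a) % n
        r += 1
    if r % 2 == 0:
        half = pow(a, r // 2, n)
        f = gcd(half - 1, n)
        if 1 < f < n:
            return f, n // f
    return None  # unlucky choice of a; retry with another base

print(shor_classical_sketch(15))  # (3, 5)
```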

For annealing-based (adiabatic) quantum computers, Hamiltonian functions are used to describe physical systems and the solution is evolved towards the lowest energy state. The factorisation problem can be expressed as a binary optimisation problem whose minima can be determined very efficiently. In recent years the multiplication optimisations integral to these calculations have been refined to enable the factorisation of 20-bit numbers on quantum computers with fewer than 100 qubits. As a result, this branch of quantum computing is sometimes regarded as closer to rendering currently known encryption systems obsolete.
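The annealing formulation can also be sketched classically: encode candidate factors p and q in binary and minimise the "energy" (n - pq)^2, whose lowest value (zero) corresponds to a correct factorisation. An annealer explores this landscape in hardware; the exhaustive search below merely stands in for it and is only feasible at toy bit-widths.

```python
from itertools import product

def factor_as_qubo(n, bits=3):
    """Factorisation restated as binary optimisation: the energy
    (n - p*q)^2 is zero exactly when p*q == n. An annealer seeks the
    minimum-energy state physically; here we enumerate every bit
    assignment, which only works for toy sizes."""
    best = None
    for p_bits in product((0, 1), repeat=bits):
        for q_bits in product((0, 1), repeat=bits):
            p = sum(b << i for i, b in enumerate(p_bits))
            q = sum(b << i for i, b in enumerate(q_bits))
            energy = (n - p * q) ** 2
            if best is None or energy < best[0]:
                best = (energy, p, q)
    return best  # (energy, p, q); energy 0 means an exact factorisation

print(factor_as_qubo(21))  # (0, 3, 7)
```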

With well-informed commentators now expecting quantum computing deployments to become widely accessible within five years, none of this bodes well for potentially vulnerable schemes such as RSA, elliptic-curve cryptography and ECDSA digital signatures, or for protocols such as SSL and TLS that are built on them.

When the first large-scale quantum computers are deployed commercially, some types of cryptography, such as lattice-based, code-based, hash-based and isogeny-based systems, are expected to offer a level of quantum resistance. Yet the staggering advances in algorithm design and processing power of the last two decades demonstrate how quickly near-impossible computing tasks can unravel in the face of innovation and new technologies. The fairly frequent discovery of solutions to maths problems previously regarded as intractable (such as recent work extending the Langlands programme beyond its role in the proof of Fermat's Last Theorem) suggests that relying on the computational difficulty of mathematics, in an era when computational resources are expanding dramatically, may be naïve.
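Of the quantum-resistant families mentioned above, hash-based signatures are the simplest to illustrate. The sketch below implements a Lamport one-time signature, whose security rests only on the hash function rather than on factoring or discrete logarithms; it is illustrative, not production code (real deployments use stateful or stateless many-time schemes built on this idea).

```python
import hashlib
import secrets

def keygen():
    """Lamport one-time keys: 256 pairs of random secrets; the public
    key is their hashes. Forging a signature requires inverting the
    hash, a problem not known to fall to quantum factoring attacks."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def _digest_bits(message):
    digest = hashlib.sha256(message).digest()
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(message, sk):
    # Reveal one secret from each pair, chosen by the message digest bits.
    return [sk[i][bit] for i, bit in enumerate(_digest_bits(message))]

def verify(message, signature, pk):
    # Each revealed secret must hash to the matching public-key entry.
    return all(hashlib.sha256(sig).digest() == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, _digest_bits(message))))

sk, pk = keygen()
sig = sign(b"quantum-era contract", sk)
print(verify(b"quantum-era contract", sig, pk))  # True
print(verify(b"tampered contract", sig, pk))     # False
```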

Lattice-based algorithms are currently the forerunners in NIST's Post-Quantum Cryptography effort, which aims to provide replacements for quantum-vulnerable techniques such as RSA and ECDSA. However, several of the relatively small number of cryptographers worldwide who have looked closely at lattice-based algorithms have observed that many questions remain to be answered, and that there is no certainty that practical implementations can adequately serve the world's needs.

From a pure science point of view, Quantum Key Distribution (QKD), which uses the quantum mechanical properties of photons to distribute encryption keys, offers a theoretically robust ("quantum secure") form of cryptography. It relies on the fact that observation disturbs a quantum system: if an entangled photon has settled into a definite quantum state, then the photon, or its entangled counterpart, must have been observed. Such systems unequivocally identify compromised keys so that they can be discarded.
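A short classical simulation can illustrate BB84, the key-distribution protocol underlying many QKD systems. No real photons are involved, so the eavesdropper-detection physics is only modelled, not reproduced.

```python
import secrets

def bb84_sketch(n_bits=512):
    """Toy BB84: Alice encodes random bits in randomly chosen bases;
    Bob measures each photon in his own random basis. When his basis
    matches Alice's he reads her bit correctly; otherwise his result
    is random. Comparing bases (but never bits) over a public channel
    lets them keep only the matching positions as a shared key. An
    eavesdropper measuring in the wrong basis would disturb the
    photons, revealing the compromise when samples are compared."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_bits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_bits)]
    bob_bases   = [secrets.randbelow(2) for _ in range(n_bits)]
    bob_bits = [bit if ab == bb else secrets.randbelow(2)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # "Sifting": keep only positions where the bases happened to match.
    key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
    return key_alice, key_bob

ka, kb = bb84_sketch()
print(ka == kb)  # True -- with no eavesdropper, the sifted keys agree
```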

QKD requires its own infrastructure because encoded photons do not travel well in existing fibres. Owing to the technical challenges of distributing quantum keys globally, initial QKD infrastructure is likely to be satellite-based. With the rapidly decreasing cost of satellite operations, this solution is likely to come to the fore in the near future.

The take-away here is that, as businesses begin to develop their "quantum strategies", the issues of data security and secure communications will need to feature high on the agenda, with provably quantum-safe solutions deployed early in strategy execution. It seems risky for organisations to rely on post-quantum algorithms for assurance at this time, since there is no easy way to test their effectiveness and longevity. The process of gaining a good level of confidence in the security of mathematics-based cryptography is slow and arduous, with multi-year efforts such as the NIST Post-Quantum Cryptography project still some way from robust conclusions. And of course it will always be difficult for anyone to say, credibly, that post-quantum algorithms will remain provably secure against quantum attack.

Instead, focussing on security products that use provably quantum-safe solutions based on the laws of physics, such as Quantum Key Distribution, may be a starting point for organisations to protect their data and evolve their existing security infrastructures. The timescales and practical challenges involved in proving encryption algorithms secure mean that QKD is likely to be of increased importance in the next phase. For years businesses have taken for granted the robustness of the critical infrastructure relied upon for data security but, as McKinsey reiterate in their report, this change could impact organisations as soon as 2022. A new world in which quantum computer-like capabilities are deployed more widely is rapidly approaching, and businesses should be formulating plans now.