Via Morgan Stanley:
Quantum computing is at an inflection point – moving from fundamental theoretical research to an engineering development phase, including commercial experiments. That said, in the medium term, we see a transition period during which classical computers will simulate quantum algorithms, while genuine quantum computers are customised to fit those algorithms.
While the classical computer is very good at general-purpose calculation, the quantum computer is even better at searching, factoring large numbers, simulating molecules, and optimisation, and thus could open the door to a new computing era. Moore’s Law was the main driver of the digital revolution; we believe quantum computing could trigger the beginning of a fourth industrial revolution, with far-reaching consequences for many sectors where computing power is becoming a limitation for R&D, such as Financials, Pharma (drug discovery), Oil & Gas (well data analysis), Utilities (nuclear fusion), Chemicals (polymer design), Aerospace & Defense (plane design), Capital Goods (digital manufacturing and predictive maintenance), Artificial Intelligence, and Big Data search in general.
It is hard to discern which of the currently evolving hardware platforms will be resilient enough to beat classical supercomputers in the next few years, but in our view the listed companies with the most credible internal quantum computing roadmaps are IBM, Google, Microsoft, and Nokia Bell Labs. We also believe that within the disruption path there is room for new companies to emerge as important players, such as D-Wave and Rigetti. Outside of tech, companies that have led the charge for several years include Airbus, Lockheed Martin, and Raytheon in the aerospace and defence sector, with recent interest from Amgen and Biogen (molecule simulation) and Volkswagen (traffic optimisation).
Life beyond transistors? Because quantum computing is not suited to all compute tasks, smartphones, PCs, and web servers storing data will continue to run on current technology. We think the high-end compute platforms could see a transition post-2025, similar to how steam engines coexisted with combustion engines and electric motors for decades before being decommissioned. In the medium term, we see incremental demand for FPGAs and GPUs (possibly benefiting Xilinx, Nvidia, and maybe Intel) as more supercomputers from Atos and Fujitsu are developed to simulate the behaviour of quantum computers. If quantum computers eventually do become ubiquitous, then the growth of high-end computing systems that emulate them could be affected, hence limiting the valuations of those stocks, but this is more a post-2020 event, in our view.
Quantum Computing at a Glance
What is quantum computing? While classical computers use the laws of mathematics, quantum computers use the laws of physics. We can describe the difference between classical computing and quantum computing with the image of a coin. In classical computing, information is stored in bits with two states, 0 or 1 – or heads or tails. In quantum computing, information is stored in quantum bits (“qubits”) that can be in any state between 0 and 1 – similar to a spinning coin that can be both heads and tails at the same time. The information contained in a qubit is much richer than the information contained in a classical bit.
Every time you look at a spinning coin it takes on a different state, e.g. 75% heads and 25% tails, depending on where in the rotation you look at it. This is similar to a qubit, which has a very fragile state that can change each time you look at it. To search for potential solutions to a problem, a classical computer individually tests different combinations of 0s and 1s. A quantum computer, by contrast, can explore all combinations at once, because a register of qubits holds every combination of 0s and 1s in superposition at the same time.
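The spinning-coin analogy can be made concrete with a minimal sketch in plain Python (no quantum library assumed; `qubit` is our own illustrative helper): a single qubit is described by two "amplitudes", and the squared magnitude of each amplitude gives the probability of reading out that state.

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) for the basis
# states 0 and 1. Measurement yields 0 with probability |alpha|^2 and
# 1 with probability |beta|^2, so the two probabilities must sum to 1.
def qubit(alpha, beta):
    assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1.0) < 1e-9, "state must be normalised"
    return (alpha, beta)

# The "75% heads, 25% tails" coin from the analogy, written as a qubit:
coin = qubit(math.sqrt(0.75), math.sqrt(0.25))

prob_heads = abs(coin[0]) ** 2   # 0.75
prob_tails = abs(coin[1]) ** 2   # 0.25
print(prob_heads, prob_tails)
```

Note the richness the text refers to: a classical bit carries one of two values, while even this single qubit carries a continuous pair of amplitudes.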
In a quantum computer, qubits are interconnected by logic gates, as in a classical computer, but the available operations are more diverse and more complicated.
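As an illustrative sketch (not any vendor's actual gate set), single-qubit gates can be written as 2x2 matrices acting on the amplitude pair. The example below shows the quantum NOT gate, which has a classical counterpart, and the Hadamard gate, which does not – it turns a definite 0 into an equal superposition:

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-element amplitude vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: sends a definite 0 to an equal mix of 0 and 1.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
# Quantum NOT gate: swaps the 0 and 1 amplitudes, like a classical NOT.
X = [[0, 1],
     [1, 0]]

zero = [1.0, 0.0]                 # a qubit definitely in state 0
superposed = apply_gate(H, zero)  # equal amplitudes on 0 and 1
flipped = apply_gate(X, zero)     # definitely in state 1
print(superposed, flipped)
```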
What is the holy grail of quantum computing? Exponential acceleration. In other words, a quantum computer would be able to compute at a much faster speed (exponentially faster) than a classical computer. This implies that computations that would take years on a current supercomputer could take just hours or minutes on a quantum computer.
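A rough back-of-the-envelope sketch shows where the exponential gap comes from: representing n qubits on a classical machine requires 2^n amplitudes, so simulation cost doubles with every qubit added (we assume 16 bytes per complex amplitude, a common double-precision convention):

```python
# Memory needed to hold the full state of an n-qubit system classically,
# assuming one 16-byte complex number per amplitude.
def sim_memory_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(sim_memory_bytes(16) / 2**20)  # 1.0     -> ~1 MiB for 16 qubits
print(sim_memory_bytes(40) / 2**40)  # 16.0    -> ~16 TiB for 40 qubits
print(sim_memory_bytes(50) / 2**40)  # 16384.0 -> ~16,000 TiB for 50 qubits
```

This doubling per qubit is why classical simulators top out around 40-50 qubits, as discussed in the timeline below.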
What are the hurdles to achieving quantum computing? Qubits are fragile and their state can change at any moment, so a significant challenge in quantum computing is maintaining a stable qubit state long enough to read the information. Quantum computers must operate at extremely low temperatures to minimise thermal disturbance of the qubits, and they are much larger in size than classical computers. Maintaining stable qubit states and correcting for the “noise” of high error rates as the number of qubits scales are some of the largest hurdles in quantum computing, which companies are solving in different ways. It is also difficult to read the result of an experiment without damaging the quantum state of a qubit. As a result, the read-out of a quantum experiment is not always the same – it gives a “probabilistic” answer, while a classical computer gives a “deterministic” answer (always the same and predictable).
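The probabilistic read-out can be mimicked with a simple sampling sketch (illustrative only – `measure`, the 25% probability, and the shot count are our own choices, not taken from any real device): each individual measurement returns a single 0 or 1, and only the frequencies over many repeated runs reveal the underlying state.

```python
import random

# Simulate repeatedly measuring a qubit that reads "1" with probability
# prob_one. A fixed seed makes this sketch repeatable; a real quantum
# experiment would give slightly different counts on every run.
def measure(prob_one, shots, seed=0):
    rng = random.Random(seed)
    ones = sum(1 for _ in range(shots) if rng.random() < prob_one)
    return {"0": shots - ones, "1": ones}

counts = measure(prob_one=0.25, shots=10000)
print(counts)  # roughly 7500 zeros and 2500 ones, never exactly
```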
What are the use cases? In order to be practical, quantum computers should have more than 50 qubits, versus roughly 16 today, and a lower level of noise. Early applications are expected in the chemistry and pharmaceutical industries, where scientists could cut the time needed to discover new materials – for example, new catalysts to produce fertilizer with much less energy than today. With the addition of more qubits, applications are expected to grow into the finance (i.e. portfolio optimization) and machine learning spaces.
What is the timeline? Google, IBM, and others have achieved 16+ qubit systems, while others have simulated quantum computing with more than 40 qubits. That said, quantum computers are measured not only by the number of qubits but also by the amount of interconnection (the more interconnection, the more flexible) and the noise level (the lower the better). Google is tight-lipped about its efforts, but IBM has talked about having a 50-qubit system within the next few years. We believe that there will be a two- to three-year period during which simulations and real quantum hardware systems will coexist before quantum computers reach more than 50 qubits and simulators become less relevant. Beyond that, it is all about “scaling” the number of qubits at an acceptable level of noise, similar to what the computer/semiconductor industry experienced by doubling the number of transistors every 18 to 24 months for the same production cost (known as Moore’s Law).