Quantum computing at an inflection point

Via Morgan Stanley:

Quantum computing is at an inflection point – moving from fundamental theoretical research to an engineering development phase, including commercial experiments. That said, in the medium term, we see a transition period during which classical computers will simulate quantum algorithms, while genuine quantum computers are customised to fit those algorithms.

While the classical computer is very good at calculation, the quantum computer is even better at sorting, finding prime numbers, simulating molecules, and optimisation, and thus could open the door to a new computing era. Moore's Law was the main driver of the digital revolution; we believe quantum computing could trigger the beginning of a fourth industrial revolution, with far-reaching consequences for many sectors where computing power is becoming a limitation for R&D, such as Financials, Pharma (drug discovery), Oil & Gas (well data analysis), Utilities (nuclear fusion), Chemicals (polymer design), Aerospace & Defence (plane design), Capital Goods (digital manufacturing and predictive maintenance), Artificial Intelligence, and Big Data search in general.

It is hard to discern which of the currently evolving hardware platforms will be resilient enough to beat classical supercomputers in the next few years, but in our view the listed companies with the most credible internal quantum computing roadmaps are IBM, Google, Microsoft, and Nokia Bell Labs. We also believe there is room within the disruption path for new companies, such as D-Wave and Rigetti, to emerge as important players. Outside of tech, companies that have led the charge for several years include Airbus, Lockheed Martin, and Raytheon in the aerospace and defence sector, with recent interest from Amgen and Biogen (molecule simulation) and Volkswagen (traffic optimisation).

Life beyond transistors? Because quantum computing is not suited to all compute tasks, smartphones, PCs, and web servers storing data will continue to run on current technology. We think high-end compute platforms could see a transition post-2025, similar to how steam engines coexisted with combustion engines and electric motors for decades before being decommissioned. In the medium term, we see incremental demand for FPGAs and GPUs (possibly benefiting Xilinx, Nvidia, and maybe Intel) as more supercomputers from Atos and Fujitsu are developed to simulate the behaviour of quantum computers. If quantum computers do eventually become ubiquitous, the growth of high-end computing systems that emulate them could be affected, limiting the valuations of those stocks – but this is more a post-2020 event, in our view.

Quantum Computing at a Glance

What is quantum computing? While classical computers use the laws of mathematics, quantum computers use the laws of physics. We can describe the difference between classical and quantum computing with the image of a coin. In classical computing, information is stored in bits with two states, 0 or 1 – heads or tails. In quantum computing, information is stored in quantum bits ("qubits") that can be in any state between 0 and 1 – similar to a spinning coin that is both heads and tails at the same time. The information contained in a qubit is much richer than the information contained in a classical bit.

Every time you look at a spinning coin it takes on a different state, e.g. 75% heads and 25% tails, depending on where in the rotation you look at it. This is similar to a qubit, whose very fragile state can change each time you look at it. To search the potential solutions to a problem, a classical computer individually tests different combinations of 0s and 1s. A quantum computer, meanwhile, can test all combinations at once, because a register of qubits represents all combinations of 0s and 1s at the same time.
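The spinning-coin picture maps onto a tiny state-vector sketch (plain NumPy; an editorial illustration, not from the Morgan Stanley note): a qubit is a pair of complex amplitudes, and the "percentages" you see on measurement are their squared magnitudes.

```python
import numpy as np

# A qubit is a unit vector of two complex amplitudes (alpha, beta):
# |psi> = alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
# The "75% heads / 25% tails" spinning coin corresponds to:
qubit = np.array([np.sqrt(0.75), np.sqrt(0.25)], dtype=complex)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(qubit) ** 2
print(probs)  # [0.75 0.25]
```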

In a quantum computer, qubits are interconnected by logic gates, as in a classical computer, but the available operations are more diverse and more complicated.
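Those gates can be illustrated concretely (a minimal NumPy sketch, not any particular vendor's API): each gate is a unitary matrix applied to the state vector. The Hadamard gate, for example, turns a definite 0 into the 50/50 "spinning coin".

```python
import numpy as np

# A quantum logic gate is a unitary matrix acting on the state vector.
# The Hadamard gate H turns a definite |0> into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)   # the classical-like "0" state
superposed = H @ ket0                    # (|0> + |1>)/sqrt(2)

print(np.abs(superposed) ** 2)  # [0.5 0.5] -- the 50/50 "spinning coin"

# Gates compose like matrix products; applying H twice undoes it.
print(np.allclose(H @ H @ ket0, ket0))  # True
```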

What is the holy grail of quantum computing? Exponential acceleration. In other words, a quantum computer would be able to compute exponentially faster than a classical computer. This implies that problems which would take years to solve on a current supercomputer could take just hours or minutes on a quantum computer.
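As a back-of-the-envelope illustration (an editorial sketch, not from the note): for unstructured search over N = 2^n items, a classical machine needs on the order of N checks, while Grover's quantum algorithm needs roughly sqrt(N) queries – a quadratic speedup. The truly exponential gains come from algorithms such as Shor's factoring and quantum chemistry simulation.

```python
import math

# Rough query counts for searching N = 2**n unsorted items:
# classical brute force ~N checks, Grover's algorithm ~sqrt(N) queries.
# (Quadratic, not exponential -- the exponential wins show up in
# problems like factoring and molecule simulation.)
for n in (20, 40, 60):
    N = 2 ** n
    classical = N
    grover = math.isqrt(N)
    print(f"n={n}: classical ~{classical:.2e} checks, quantum ~{grover:.2e} queries")
```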

What are the hurdles to achieving quantum computing? Qubits are fragile and their state can always change, so a significant challenge in quantum computing is maintaining a stable qubit state long enough to read the information. Quantum computers must operate at extremely low temperatures to minimise thermal disturbance of the qubits, and are much larger than classical computers. Maintaining stable qubits and correcting for the "noise" of high error rates as the number of qubits scales are some of the largest hurdles in quantum computing, and companies are solving them in different ways. It is also difficult to read the result of an experiment without damaging the quantum state of a qubit. As a result, the read-out of a quantum experiment is not always the same – it gives a "probabilistic" answer, while a classical computer gives a "deterministic" answer (always the same and predictable).
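The probabilistic read-out can be illustrated with a toy Monte Carlo sketch (plain Python, illustrative only): running the same "experiment" many times yields a distribution of outcomes rather than a single deterministic answer, and the answer is read off the statistics.

```python
import random

# A quantum read-out is probabilistic: repeating the same circuit many
# times ("shots") yields a distribution. Toy sketch: sampling a qubit
# with P(0) = 0.75 and P(1) = 0.25.
random.seed(0)  # fixed seed so the sketch is reproducible
shots = 10_000
counts = {"0": 0, "1": 0}
for _ in range(shots):
    outcome = "0" if random.random() < 0.75 else "1"
    counts[outcome] += 1

print(counts)  # roughly {'0': 7500, '1': 2500}
```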

What are the use cases? To be practical, quantum computers should have more than 50 qubits, relative to roughly 16 today, with a lower level of noise. Early applications are expected in the chemistry and pharmaceutical industries, where scientists could decrease the time to discover new materials – for example, new catalysts to produce fertiliser with much less energy than today. With the addition of more qubits, applications are expected to grow into finance (e.g. portfolio optimisation) and machine learning.

What is the time line? Google, IBM, and others have achieved 16+ qubit systems, while others have simulated quantum computing with more than 40 qubits. That said, quantum computers are measured not only by the number of qubits but also by the amount of interconnection (the more interconnection, the more flexible) and the noise level (the lower the better). Google is tight-lipped about its efforts, but IBM has talked about having a 50-qubit system within the next few years. We believe there will be a two- to three-year period during which simulations and real quantum hardware will coexist, before quantum computers exceed 50 qubits and simulators become less relevant. Beyond that, it is all about "scaling" the number of qubits at an acceptable level of noise, similar to what the computer/semiconductor industry experienced in doubling the number of transistors every 18 to 24 months at the same production cost (known as Moore's Law).
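The point at which simulators become less relevant follows from simple arithmetic: a classical simulation must store 2^n complex amplitudes for n qubits, so memory doubles with every added qubit. A quick sketch (assuming 16 bytes per double-precision complex amplitude):

```python
# Why classical simulation of quantum computers runs out of road:
# an n-qubit state vector holds 2**n complex amplitudes. At 16 bytes
# per amplitude (double precision), memory doubles with each qubit.
for n in (30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits: 2^{n} amplitudes, ~{gib:,.0f} GiB")
# 30 qubits fit on a workstation (~16 GiB); 40 need a supercomputer
# (~16 TiB); 50 are out of reach (~16 PiB) -- hence the ~50-qubit line.
```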


  1. I always liken standard computers to men. Yes or no.

    While quantum computers have yes, no, maybe and both.

    Sound familiar? ;p

  2. What is going to happen to encryption (based on factoring primes, and likely most other known methods)?
    Fortunately, quantum computing has a good chance of being able to break it – at least temporarily, but for long enough to kill some parts of the internet permanently: the Internet as we know it (a place to exchange private and classified information) – social media, online banking, sales and other payments…. and bitcoin may die.

    • Quantum-resistant cryptographic algorithms have been developed, and cryptocurrencies using them are under development.

      Not bitcoin, though; they are too inflexible and too attached to SHA-256 to move until it's too late.

  3. Good to see that this amazing new technology will be used wisely on expanding financial portfolios. What better use could be found for faster computation than shaving a few more nanoseconds off trading times etc to increase the already yuge fortunes of the people who can afford such computational capacity.

    • Can you imagine spamming in quantum computing?!

      And hey – pr0n is already there – a person is a fucker and fuckee at the same time 😀 Qbit-pr0nz!

  4. DarkMatterMEMBER

    Has anyone found a good explanation of how you write a quantum program and what it does? That is not particularly clear yet.

    Representing all states between 0 and 1 sounds like you need to believe in the continuum. I am wondering if it is possible to write a quantum program that calculates the SQRT(2)? What would be the result? The SQRT(2) is just a non terminating series of numbers, so where would the quantum computer be storing this intermediate state?

    It will be very interesting to see if they can build them and find people to write programs for them. Javascript coders need not apply.

    • Yeah, quantum computers are the new parallel computers.
      Those who are old enough to remember will smile as they recollect their first foray into parallel programming languages… my first try was with Occam. I thought I was the dumbest person on earth when I couldn't get anything to work anywhere near as fast as the theoretical limits; what I didn't find out for many years was that I was probably one of the best programmers of parallel architectures alive when it came to solving more general classes of compute problems. What I had to learn was that everyone else was restricting themselves to classes of problems that uniquely suited parallel architectures.
      I suspect we're at a similar junction wrt quantum computers, and when all the BS dies down we'll see just how unsuited these clever quantum techniques are for generalized computing tasks if for no other reason than the absence of good quantum computing programmers.

      • DarkMatterMEMBER

        Were you involved with Kia’s attempt to build a machine out of transputers? I remember talking to him about that, and asking about what the OS would look like ….

        That transputer stuff came out of Cambridge I think, and was related to the TriPOS operating system. TriPOS ended up as the basis for the Amiga OS. I think I actually have a couple of the Transputer chips somewhere and an Occam manual. They were very odd chips.

      • No involvement in that project – I was doing communications computing cores for things like iFFT/FFT engines. At the time there was a lot of interest in this generalised transformation to enable the use of parallelism for inherently serial tasks and, by extension, to expand this to solve general computing problems; the cryptanalysts were quick to jump on that bandwagon too, but I really can't talk about that stuff.

      • Oh man… I remember that stuff… I caught the tail end of it at my uni… Never got too close to the Occam thing, and while we did have a couple of transputer boards, never saw a project to involve them… oh well… they did gather dust in a parallel manner…

        >(…) we’ll see just how unsuited these clever quantum techniques are for generalized computing tasks if for no other reason than the absence of good quantum computing programmers.

        I concur… SMP systems were created and shipped out with great fanfare… what we ended up with was a bunch of single-threaded applications running on them, and lusers complaining that they weren't working as well as they thought they should.

        What’s worse was the terrible waste of resources by running batch jobs on SMP systems… Batch… fucking …. jobs… each job being a single-thread, ticking along… one after the other…

        No, your run of the mill copro-grammer can’t even conceive parallel programming – they have problems walking and chewing gum at the same time, why would we think they’d be better at writing code for parallel systems?

        Besides – these days – it’s all about PHP and Java Scrote, isn’t it? Fucking frameworks on top of frameworks, all designed to emulate what Perl has been doing for 30 years…

        Can’t find a guy who can read 3 lines of C… let alone write them.

        OOOh Python… OOOH Ruby! Ooooh shell script! What the??…


      • Hey,
        A general unrelated question for you coding guys. If you had kids, what programming language would you get them to learn? Or what other suggestions do you have for preparing kids to be creators versus users in the digital world of now and the future?
        Appreciate your thoughts.

  5. Why can they not develop stem cell therapy instead?

    “moving from fundamental theoretical research to an engineering development phase”. Fix back discs!

    Global Macro? We need a technology section.

  6. H&H – what you haven't talked about are the implications of quantum computing. If they are available by 2025, even if only to the GEs etc. of the world, the implications are profound. While there is some banter above in the comments, its applications are real.
    Think fertilisers made for pennies in an order-of-magnitude reduced time. Think credit and interest rate risk calculations changing in how often they are computed, the number of factors used, and the combinations of how a portfolio or a derivative deal is constructed – and thus the entire pricing mechanism and its end results will change. I.e. Google and Amazon etc. will do with credit (lending/banking) what they've done with media and retail. Materials, robotics, automation, medicine, drugs, energy – everything will become significantly cheaper to produce, at higher quality and effectiveness.
    I.e. deflation.
    You'll have followed my comments now for over a year and a half. We're going to have a credit collapse not because of rising interest rates, not macroprudential, not rising unemployment, but because of deflation and the continued fall in wages.
    The irony is, all these technologies are being produced at a rapid rate because of all the cheap money. Companies expend capital to reduce cost, ultimately, even if it's to gain a capability to increase the top line. I.e. unit cost goes down even if opex remains the same, but ultimately you reduce the opex as well.

    Also, the noise about bitcoin and its valuation on MB is just that – noise. Even if its value were to fall, its relevance – specifically blockchain – is not negligible. Cryptocurrencies are here to stay, even if as shadow currencies, as a true currency needs a tax mandate.

    Combine AI, robotics, and quantum computing together and you get the fourth human revolution, and it's real and happening now. Dr Yellen and her crony counterparts globally can come up with QE infinity or vending machines of free money for every living individual. It won't matter – if the rate of change in the technologies listed doesn't slow, debt (globally) will collapse, and I'm talking about a convergence of all these factors occurring within the next ten years. Hold on for a bumpy ride.