 EuroPython 2018: Quantum Computing: How it isn't what those Pop Articles Say

Nick Radcliffe runs Stochastic Solutions, is an organiser of PyData Edinburgh, and was taught Quantum Field Theory by Peter Higgs.

If you've heard anything about Quantum Computing, then you've probably heard that if large-scale Quantum Computing becomes possible, SSL and encryption are in trouble. This is not certain, and not proven, but the argument goes like this:

The RSA algorithm is based on the fact that factorising large semi-prime numbers is hard, while producing them by multiplication is easy. Except it isn't hard, exactly; it's only slow. If we don't care how long we take, we can just try lots of number combinations, parallelise the process, and wait. But we do care about time, we don't have an infinitely large computer, and even if we did, we'd still need to collect the answer from all of the noise.
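The "slow, not hard" point can be made concrete with a few lines of Python. This is a minimal trial-division sketch (my own illustration, not from the talk): multiplying two primes is instant, while recovering them takes time that grows with the size of the number.

```python
from math import isqrt

def factor_semiprime(n):
    """Naive trial division: guaranteed to finish, just painfully slow for big n."""
    for p in range(2, isqrt(n) + 1):
        if n % p == 0:
            return p, n // p  # the two factors
    return None  # n is prime, not a semi-prime

# Multiplying is easy; undoing the multiplication is the slow direction.
print(factor_semiprime(15))  # (3, 5)
print(factor_semiprime(21))  # (3, 7)
```

For the 2048-bit semi-primes used in real RSA keys, this loop would run for longer than the age of the universe, which is the whole point.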

Bits are boxes containing zeros or ones. If the bit is in random access memory, you can set the value and read it. The same bit can have different values at different times.

Now quantum computers are based on quantum bits: qubits. They can contain zeros or ones. Or both, partly one and partly the other at the same time. We write that as (1/√2)|0⟩ + (1/√2)|1⟩. A single qubit can have different values at the same time. Yes, it makes no sense, but this is how the universe works.
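One way to build intuition for that notation (a sketch of my own, assuming basic NumPy): a qubit state is just a length-2 vector of complex amplitudes, and the squared magnitudes of those amplitudes are the probabilities of seeing 0 or 1 when you measure.

```python
import numpy as np

# Amplitudes for |0> and |1>: the equal superposition (1/sqrt(2))|0> + (1/sqrt(2))|1>.
state = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)], dtype=complex)

# Probabilities are squared magnitudes of the amplitudes; they must sum to 1.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]

# Measuring "collapses" the state: we get a definite 0 or 1, each with p = 0.5 here.
outcome = np.random.choice([0, 1], p=probs)
print(outcome)
```

The amplitudes are complex numbers, not probabilities, and that matters: amplitudes can cancel each other out, which is exactly the interference the next section is about.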

Let's learn this with one of the best Physics teachers: Feynman. Imagine two wave circles interfering with each other. Waves interfere, be they water or light: peaks add up, peaks and valleys cancel out. Now take the classic double-slit experiment (sorry, not going to draw it here, please look it up) and replicate it with bullets. Bullets don't generate interference; they're discrete. Now repeat it with electrons. Electrons are like tiny bullets. And yet, they generate interference. And no, this is not due to electrons being too close to each other: if you reduce the rate so that only one electron is in the system at a time, the interference remains. They act as if they were guided by waves that go through both slits and cause interference.

Quantum mechanics describes particle motions with a wave equation for a (complex) wave, psi, whose amplitude determines where the particle is/goes.

Psi can be in a mixed state (a superposition). Schrödinger's equation tells us how psi evolves over time. Whenever we measure, we find the particle went through one slit or the other, which is what we used to call collapsing the wave function.

Quantum Mechanics is the most accurate scientific theory ever. There has never been a reproducible experiment whose results contradict the predictions of quantum mechanics. Quantum mechanics has predicted phenomena with an accuracy of ~1 part in a billion. Shame it's incompatible with Einstein's theory of relativity. Interpreting those results is hard, though, and there are several interpretations, e.g. the Many Worlds interpretation (Hugh Everett, 1957): whenever there are two or more quantum possibilities, the universe "branches", and all outcomes are realised in different universes. Measuring psi, we get the value in our universe, but there is interference across universes.

Using entanglement we can combine qubits. Entanglement is basically a group of qubits described by a single wave function psi.
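"A single wave function for the group" can also be shown with a vector (again my own NumPy sketch, not from the talk): two qubits share one length-4 amplitude vector over |00⟩, |01⟩, |10⟩, |11⟩, and an entangled state is one that cannot be split back into two separate single-qubit vectors.

```python
import numpy as np

# The Bell state (|00> + |11>) / sqrt(2): one wave function for two qubits.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

probs = np.abs(bell) ** 2
print(probs)  # [0.5 0. 0. 0.5] -- we only ever observe 00 or 11, never 01 or 10

# Separability check: reshape to a 2x2 matrix; a product of two independent
# single-qubit states would have rank 1. Rank 2 means entangled.
rank = np.linalg.matrix_rank(bell.reshape(2, 2))
print(rank)  # 2 -> entangled
```

That rank trick is why measuring one qubit of the pair instantly fixes what you'll see on the other: there simply is no independent per-qubit description to fall back on.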

Now quantum computing might be seen as parallel computation across multiple universes. This is what pop articles tell you. And it's kinda right – we could set it up like that. BUT we couldn't really get the right result back. So even if you stop reading right here, please remember: Quantum Computers are not free parallel computing across universes.

But how does it work? To benefit from QC we need to find algorithms in which (nearly) all the answers we don't want cancel out (destructive interference), leaving the answer we do want in all universes (at least with reasonably high probability). That's what Shor's Algorithm does for large semi-prime numbers. And it has been run on real quantum computers to factorise large-ish semi-prime numbers. Depending on your definition of large-ish: we have done 21.

There is an alternative approach, the Adiabatic Quantum Computer, which has factorised 143 (11×13).

Summary: The physics is sound: Quantum Computing is based on our absolute best physics. The engineering is at the very least tricky (imagine your computer needs to be near absolute zero). Quantum Computers currently decohere quickly, because of other stuff in the universe(s), and we can't yet have many qubits. They're probably coming, but they'll be tricky to program.

Shor's Algorithm tackles the problem sideways. If we compute x mod N, x^2 mod N, x^3 mod N, ... (N being the number we want to factorise), the sequence is periodic. If we can find the period for a couple of values of x, we can work out the factors of N (as Euler showed!). We create a superposition over the sequence, because it's predictable, and then use the Quantum Fourier Transform to find the period. It's a Fourier Transform, only quantum: it reinforces the right period and cancels out the wrong ones. For an illustration using quantum clocks, refer to Scott Aaronson.