Skip to 0 minutes and 3 seconds Let’s try again to get an idea of what these numbers might be. What is this threshold, numerically? Well, it depends very heavily on the code: on the structure of the code, on how many qubits you use to encode something, on what operations you need to do the parity checks and how many gates it takes to extract that parity information and run error correction, and, importantly when it comes to hardware design, on the architectural structure of the code. Now, what do I mean by that? I might build what we call a linear architecture, where each of my qubits can talk to its neighbors to the left and to the right; we would call that a linear geometry.
Skip to 0 minutes and 41 seconds You can have a two-dimensional nearest-neighbor geometry, where qubits can talk left and right and to the ones up and down. Or you can have a geometry that is fully connected, where any qubit can talk to any other qubit, no matter where they are in the computer. You don’t see too many of those architectures. Getting back to your earlier question, the best numerical thresholds that we see for these architectures, which are based on surface codes or topological codes, are about 1%. The error rate on each individual qubit and each individual gate in our quantum computer has to be below about 1%.
Skip to 1 minute and 17 seconds If we can achieve that experimentally, we can consider the system scalable, in the sense that in principle we could build an error-corrected system that’s very, very large and therefore have a quantum computer that contains many, many well-protected qubits. So, a 1% error rate would allow us to run error correction and fault tolerance successfully. It would be a good start; you hit threshold at 1%. Hitting threshold doesn’t actually help you very much, though, because if you are sitting right at that 1%, you need an infinite number of qubits to do anything.
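The point about needing infinitely many qubits right at threshold can be illustrated with the commonly quoted surface-code scaling, where the logical error rate falls as roughly p_L ≈ A·(p/p_th)^((d+1)/2) for code distance d. The constant A and the exact exponent depend on the code and decoder, so the numbers below are an illustrative sketch, not the speaker's figures:

```python
def logical_error_rate(p, p_th=1e-2, d=11, A=0.1):
    """Rough surface-code scaling: p_L ~ A * (p / p_th)**((d + 1) / 2).

    A and the exact exponent vary with the code and decoder;
    the values here are illustrative assumptions only.
    """
    return A * (p / p_th) ** ((d + 1) / 2)

# Sitting exactly at threshold (p == p_th): increasing the code
# distance d buys no suppression at all, so no finite number of
# qubits ever reduces the logical error rate.
at_threshold = [logical_error_rate(1e-2, d=d) for d in (3, 11, 25)]

# Slightly below threshold (p = 0.5%): each increase in d now
# suppresses the logical error rate exponentially.
below_threshold = [logical_error_rate(5e-3, d=d) for d in (3, 11, 25)]
```

This is why "hitting threshold" alone is not enough: the exponential suppression only kicks in once the physical error rate sits strictly below p_th.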
Skip to 1 minute and 54 seconds You have to sit a little bit below it, so our targets at the moment are error rates of about 0.1% for the actual hardware systems that are being developed. Where do we sit today with actual hardware development? The quantum computers that are in the laboratory today, how close are they to that? Closer than you would think. Many people may have seen the news reports, especially from larger companies such as Google and IBM, which are running what we call superconducting qubit designs. They have demonstrated many of these QEC and error correction protocols, either using the surface code or using the original code of Shor. Some of them have done multiple versions of this.
Skip to 2 minutes and 36 seconds The group at Google has released data that suggested their error rates are close to what is required. Not quite at that 0.1% level, but they are getting close. Systems such as ion traps again have been showing error rates that are pretty close to what’s needed, but not quite there yet. Then, there are several other systems that are not as well developed, but certainly have some very good properties that would allow them to scale to the millions, if not billions, of qubits that we are going to need in order to run interesting quantum algorithms.
Skip to 3 minutes and 12 seconds We are recording this in early 2018, and you’re saying we’re actually close to this threshold value in terms of error rates in physical systems that we can build today. Yes, I would expect that in the not too distant future, within a year or so, multiple systems will hit threshold on the surface code at least, this 1% error rate that we require. We will see this happening in multiple systems, I would say within the next 12 months. Excellent! So the threshold is about 1%, and you say we’re targeting actual physical error rates of maybe 0.1%, which is what we would like to have.
Skip to 3 minutes and 54 seconds If we get to that value, how big is the system going to be? How many physical qubits are we going to have to have inside one of these systems? It depends what you want to do with it. At the moment, a lot of people are trying really, really hard to find interesting and commercial quantum algorithms that don’t require too many qubits and therefore don’t require a lot of error correction. They haven’t been found yet. I don’t want to sound too pessimistic, but to run something of, let’s say, commercial relevance, something somebody would pay for either scientifically or commercially, you really need to start talking about 10 million qubits in your entire computer. You need to be talking at that scale.
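As a rough illustration of where a number like 10 million comes from, one can combine the surface-code scaling with a per-logical-qubit footprint of roughly 2d² physical qubits (data plus measurement qubits at distance d). The workload size, target logical error rate, and constants below are all assumptions made for illustration, not figures given by the speakers:

```python
def distance_for_target(p, p_l_target, p_th=1e-2, A=0.1):
    """Smallest odd code distance d with A*(p/p_th)**((d+1)/2) <= target.

    Uses the illustrative surface-code scaling; A and p_th are
    assumed values, not measured properties of any machine.
    """
    d = 3
    while A * (p / p_th) ** ((d + 1) / 2) > p_l_target:
        d += 2
    return d

# Assumed workload: ~1,000 logical qubits, each needing a logical
# error rate of ~1e-15, at the 0.1% physical error rate target.
d = distance_for_target(p=1e-3, p_l_target=1e-15)
physical_per_logical = 2 * d ** 2   # rough distance-d footprint
total = 1_000 * physical_per_logical
print(d, physical_per_logical, total)
```

Even this simplified estimate lands in the millions of physical qubits, and real resource estimates also add overhead for magic-state distillation and routing, which can push totals toward the tens of millions the speakers mention.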
Skip to 4 minutes and 33 seconds Because again, you also don’t just want one quantum computer and then we stop. Everybody is going to want a quantum computer, so we need production facilities that can build a lot of these things very, very fast and, most importantly, very, very cheaply. Qubits, at the moment, are expensive, and if you need 100 million or 200 million qubits to do a calculation and they cost $1,000 each, that computer is going to be a bit too expensive for most people. My Ph.D. student, Shota Nagayama, who graduated in 2017, wrote his Ph.D. thesis on architectures that allow the system to continue to work even if not all of the qubits actually work physically.
Skip to 5 minutes and 20 seconds You were a collaborator on that, actually. We were very happy to have you involved. Thank you. I was happy to be involved in that one. What do you think that does for our ability to actually manufacture large-scale systems? What kind of impact is that going to have? As you know from the work that we did together, we tried to design these systems so that they have a certain level of what we can’t call ‘fault tolerance’ in this context, because we have already used that word, so we will call it ‘defect tolerance’: as we manufacture qubits, not all of them work, and not all of them work well enough to put into our machine.
Skip to 5 minutes and 52 seconds The designs that you and I have worked on, and that other people have worked on, are there to combat this problem, but realistically you want your manufacturing process to be as good as possible. I want to go back to the issue of how many qubits and physical devices we actually have in the system. We’ve talked about the architectures, and one of the architectures we talked about is the photonic architecture, which I think you were involved in. Kae talked about the distinction between the photons, which carry the qubits, and the physical devices that make up the system. Yeah.
Skip to 6 minutes and 28 seconds You end up using the devices just to produce more and more photons in order to run your computer. Linear optics technology is another version of this, where you’re not technically creating or maintaining the qubits themselves with the devices you build; the devices are there to mediate quantum gate operations instead. Those are the things you really have to count. In this kind of system, we’re counting the number of physical devices as opposed to the number of qubits. Exactly! We are counting the number of things we have to build, the number of things we have to pay for. That doesn’t, unfortunately, make the numbers any better.
Skip to 7 minutes and 8 seconds Even if my quantum computer were, say, a superconducting quantum computer that required 10 million qubits to do something, then even within the atom-optics design that I worked on with Kae, we would still need to manufacture 10 million of these devices that form the foundation of the photonic computer. There are no free lunches in this game. You need a lot of qubits and, unfortunately, it doesn’t matter which system you’re talking about. You will always need a lot of qubits. Simon, thanks for being with us here and sharing your expertise on quantum error correction, and good luck starting up your own company with Turing. Wonderful! It has been an absolute pleasure, Rod. Thanks for having me.
QEC and Architecture
Dr. Simon Devitt of Turing, Inc. rejoins us to describe how error correction affects our architectural choices. The farther below the QEC threshold our physical error rates are, the less error correction we need. He also describes some of the architectural designs he creates with Prof. Kae Nemoto, from the error correction point of view. You may wish to revisit Steps 4.3 and 4.11 in conjunction with this Step.
This is the final technical Video Step of the entire course. Next up you will find a quiz to reinforce your learning, then we turn to a series of Articles on the Quantum Information Technology Industry to wrap up.
© Keio University