
Caveat: Function Growth

How fast does the cost of a function grow? Earlier this week, we talked about computational complexity classes as a motivation. In this article and the next, let us add some nuance to that notion.

We have just seen several algorithms, including factoring, quantum chemistry, and generalized search. Shor’s factoring algorithm promises superpolynomial speedup compared to the best known classical algorithm. Quantum chemistry, or quantum simulation, also offers superpolynomial speedups for some specific problems.

The most general quantum algorithm, known as Grover’s algorithm, is really a broad framework for solving problems on a quantum computer rather than an algorithm for a specific problem. It gives a square root speedup for any problem. That is, it takes a problem that is O(N) on a classical computer and makes it O(√N) on a quantum computer.
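
To get a feel for what a square root speedup means in practice, here is a small back-of-the-envelope sketch (our own illustration, not from the course): for unstructured search over N possibilities, a classical machine examines on the order of N items, while Grover’s algorithm needs roughly (π/4)√N iterations.

    import math

    # Rough comparison of classical exhaustive search with Grover-style search.
    # Assumes an unstructured search over N possibilities: classically we examine
    # on the order of N items; Grover's algorithm needs about (pi/4) * sqrt(N)
    # iterations.
    for n_bits in (20, 30, 40):
        N = 2 ** n_bits
        classical_steps = N                               # worst-case exhaustive search
        grover_iterations = math.pi / 4 * math.sqrt(N)    # quadratically fewer queries
        print(f"N = 2^{n_bits}: classical ~{classical_steps:.2e}, Grover ~{grover_iterations:.2e}")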

Constant factors matter

With this advantage in computational complexity, much of the initial excitement about quantum computers was naive. The importance of a critical caveat is now appreciated: constant factors matter.

In our earlier example, we assumed it takes you one second to write each digit of a number. Perhaps a friend is faster; it only takes her half a second per digit. This advantage seems large for writing down a single number, but for enumerating every number up to a million, it will still take her five weeks, and she will probably object to the task!

But if I give you a robot that can write at a hundred digits per second, it can complete the entire million-number task in just seventeen hours. Even though the task grows exponentially as our number of digits grows, in practice there is a range of problem sizes we care about, and if we or our computers or robots are fast enough for those cases, then we can still solve the problems to our satisfaction.
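
To make those figures concrete, the following sketch (ours, not from the article) counts the digits needed to write every number from 1 to a million and converts that into writing time at the three speeds mentioned above: one digit per second, two digits per second, and one hundred digits per second.

    # Count the digits needed to write every number from 1 to 1,000,000,
    # then convert that into wall-clock time at several writing speeds.
    total_digits = sum(len(str(n)) for n in range(1, 1_000_001))
    print(f"total digits: {total_digits:,}")   # 5,888,896

    for label, digits_per_second in [("you, 1 digit/s", 1),
                                     ("friend, 2 digits/s", 2),
                                     ("robot, 100 digits/s", 100)]:
        seconds = total_digits / digits_per_second
        print(f"{label}: {seconds / 86400:.1f} days ({seconds / 3600:.0f} hours)")

Doubling the writing speed shaves the job from about ten weeks to about five; only the hundredfold speedup brings it down to the seventeen hours quoted above.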

Range matters

The previous paragraph hints at another important factor: the range of problem sizes we are attacking. Perhaps it’s reasonable to ask a robot to write a million numbers for us, but is a billion reasonable? A trillion? Although theorists love to talk in terms of the asymptotic behavior as the problem size grows, when we build real systems we aim to solve problems of some particular size.

We can illustrate this effect with a simple example. This plot compares the two functions 100x + 5000 and x², over the range 0 to 1000.

[Plot: 100x + 5000 versus x², over the range 0 to 1000]

A theorist would tell you that the yellow curve, x², is O(x²), which dominates the blue function, 100x + 5000, which is only O(x). Thus, if the vertical axis is the time it takes to solve a problem of size x, it seems obvious we would prefer the blue curve (100x + 5000).

That’s what the theorists would tell us. But the next plot, showing the same functions over the range 0 to 100, suggests a different, more nuanced interpretation.

[Plot: 100x + 5000 versus x², over the range 0 to 100]

If the largest problem we care to solve is of size 100 or less, then we would actually prefer the yellow line. Thus, the engineers might give you a different recommendation than the theorists. Although the goal of quantum computing is to develop machines with an asymptotic advantage, this distinction will matter for real systems.
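
Where exactly the engineer’s preference flips can be computed directly. The sketch below (our own, not from the article) solves x² = 100x + 5000 for the crossover point and compares the two curves at a few problem sizes.

    import math

    def blue(x):       # the linear curve: 100x + 5000
        return 100 * x + 5000

    def yellow(x):     # the quadratic curve: x^2
        return x * x

    # Positive root of x^2 - 100x - 5000 = 0, i.e. where the two curves cross.
    crossover = (100 + math.sqrt(100 ** 2 + 4 * 5000)) / 2
    print(f"curves cross at x ~ {crossover:.1f}")   # about 136.6

    for x in (50, 100, 150, 1000):
        prefer = "yellow (x^2)" if yellow(x) < blue(x) else "blue (100x + 5000)"
        print(f"x = {x:4d}: blue = {blue(x):8d}, yellow = {yellow(x):8d} -> prefer {prefer}")

Below the crossover point (about 137 here), the “slower-growing” linear curve is actually the worse choice; above it, the asymptotic analysis takes over.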

Picking good problems

The basic computational complexity advantage of Grover’s algorithm can be leveraged for almost any problem. You’re still not going to be running a quantum version of Microsoft Word on a home quantum computer, though! Achieving larger gains requires a quantum algorithm suited to the problem, which in turn requires discovering some structure in the problem.

In Step 1.13, we discuss some problems, such as climate modeling, that are commonly attacked using supercomputers. These are mostly not good candidates. In fact, there are two major constraints on problems we can solve effectively:

  1. modest input/output requirements: Quantum computers will be relatively bad at reading and writing classical data for the foreseeable future.
  2. modest memory requirements: Quantum computers also will not have gigabytes of quantum memory for the foreseeable future, so problems that can be solved using tens, hundreds or at most thousands of qubits are desirable.

  And a third requirement, of course, is:

  3. the problems must be hard to solve on a classical computer, generally meaning that the number of possible answers that must be tested grows quickly as the problem size increases.

We will see examples in more detail later this week.
