Find out what quantum computing is, the basics of how it works, and the skills you’ll need to learn more about this fascinating field.
Computer technology has come a long way over the last few decades. Computers have become smaller and exponentially more powerful, driving advances across almost every industry. However, we're fast approaching the physical limits of how small computer components can get, meaning we need new solutions to keep improving in certain areas. Quantum computing may be the answer.
Here, we take a look at the basics of quantum computing, including a definition, a look at how quantum computers work, the advantages and disadvantages of the tech, and the skills you’ll need to learn more about this field.
What is quantum computing?
A quantum computer is a machine that harnesses some of the unique properties of quantum physics to solve problems that are too complex for regular computers and even supercomputers. The field of quantum computing focuses on developing technology that takes advantage of the ways in which energy and matter behave at the subatomic level.
We use the word ‘quantum’ to describe the laws of physics that apply at the level of individual atoms, electrons and elementary particles. At this microscopic level, the laws of physics are different from those we experience in our daily lives.
Quantum computing aims to manipulate and control these quantum effects to perform tasks and computations that our current digital computers are incapable of (at least, on practical timescales).
Quantum computing is a relatively new field and has, until recently, been largely theoretical. The first circuit-based commercial quantum computer was only introduced by IBM in 2019. In the same year, scientists at Google claimed their quantum computer had performed a calculation beyond the reach of traditional supercomputers.
How does quantum computing work?
As you might have guessed, quantum computing is a complex field that’s difficult for non-experts to understand. However, it is possible to grasp some of the fundamental concepts, giving you a basic understanding of how quantum computers work.
Here, we’ll outline some of the very basics of quantum computing. To get a more detailed appreciation, our course on understanding quantum computers from Keio University has more information.
To grasp how quantum computing works, there are a few key concepts to understand. Below, we’ve given some simple explanations of these:
As we explore in our open step on qubits, traditional computers are built on bits. These bits (short for binary digits) are the basic units of information in computing, and each can take one of two distinct values. They can be thought of as on or off, up or down, or, as encoded in binary, as either 0 or 1.
In quantum computing, quantum bits, or qubits, are the basic units of information. Qubits can be built from any quantum-mechanical system with two measurable states. For example, the spin of an electron can be measured as up or down, and a single photon can be polarised either vertically or horizontally.
Unlike traditional computing bits, which must be either 0 or 1, a qubit can exist as 0, 1, or a combination of both simultaneously. This phenomenon, known as superposition, means a single qubit can hold both possibilities at once.
When qubits are combined, this ability to hold every possible configuration of information at once means that certain complex problems can be represented far more compactly than with traditional computing methods.
You can learn more about the principles and properties of superposition on our open step on the subject.
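To make superposition a little more concrete, here is a minimal sketch in Python (using NumPy and plain statevector maths rather than a real quantum SDK). It puts a simulated qubit into an equal superposition using a Hadamard gate, one of the standard quantum logic gates, and shows that a measurement would return 0 or 1 with equal probability:

```python
import numpy as np

# A qubit's state is a 2-component complex vector:
# [amplitude of measuring 0, amplitude of measuring 1].
ket0 = np.array([1, 0], dtype=complex)  # the classical-like state |0>

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- a 50/50 chance of reading 0 or 1
```

This is only a simulation of the maths, of course; the whole point of real quantum hardware is that it holds such states physically rather than tracking the amplitudes on a classical machine.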
Another key part of quantum computing is the quantum effect known as entanglement. Put simply, this phenomenon creates a correlation between qubits: when two or more qubits are entangled, their states are linked, so measuring or changing one affects what we find when we measure the others.
Quantum computing algorithms build on this principle, allowing certain complex problems to be solved far more quickly than would otherwise be possible. Again, you can learn more about entanglement in quantum computing with our open step on the topic.
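Entanglement can also be illustrated with a few lines of statevector maths. The NumPy sketch below (a standalone simulation, not real quantum hardware) builds the classic two-qubit Bell state: measuring it gives 00 or 11 with equal probability, and never 01 or 10, so the two qubits' outcomes are perfectly correlated.

```python
import numpy as np

# Two qubits live in a 4-component vector over the basis [00, 01, 10, 11].
state = np.array([1, 0, 0, 0], dtype=complex)  # both qubits start as 0

# Apply a Hadamard gate to the first qubit (identity on the second),
# putting the first qubit into superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
state = np.kron(H, I) @ state

# CNOT gate: flips the second qubit whenever the first is 1.
# This is what entangles the pair.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ state

probs = np.abs(state) ** 2
print(dict(zip(["00", "01", "10", "11"], probs.round(3).tolist())))
# {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}
```

The mixed outcomes 01 and 10 have zero probability: whatever the first qubit turns out to be, the second is guaranteed to match it.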
How IBM’s quantum computer works
IBM’s Quantum System One is an incredibly intricate machine, somewhat resembling an ornate chandelier housed in a glass case. According to IBM, the quantum processor itself is not much bigger than a regular laptop processor; most of the surrounding hardware is dedicated to advanced cooling systems.
Quantum processors need temperatures approaching absolute zero to operate, and IBM uses superfluids and superconductors to achieve these conditions. Microwave photons are then used to control the behaviour of the qubits, allowing quantum information to be created and read out.
Advantages and disadvantages of quantum computing
You might be wondering why we need quantum computers in the first place. They’re difficult to engineer, build and program. What’s more, the nature of quantum physics means they’re subject to errors, faults, and loss of quantum states.
However, there are some distinct advantages of quantum computing, at least in theory. Below, we’ve picked out some of the pros and cons of quantum computing:
Advantages of quantum computing
- They’re fast. Ultimately, quantum computers have the potential to provide computational power on a scale that traditional computers may never match. In 2019, for example, Google claimed to have carried out a calculation in about 200 seconds that would take a classical supercomputer around 10,000 years.
- They can solve complex problems. The more complex a problem, the harder it is for even a supercomputer to solve. When a classical computer fails, it’s usually because of a huge degree of complexity and many interacting variables. However, due to the concepts of superposition and entanglement, quantum computers can account for all these variables and complexities to reach a solution.
- They can run complex simulations. The speed and complexity that quantum computing can achieve means that, in theory, a quantum computer could simulate many intricate systems, allowing us to better understand some of life’s great mysteries.
Disadvantages of quantum computing
- They’re difficult to build. As we saw with IBM’s Quantum System One, a functional quantum computer needs a very specific set of conditions to operate. They require unique components, massive cooling systems, and expensive technology to run at even a basic level.
- They’re prone to errors. Due to the nature of quantum mechanics, environmental interference can quickly introduce errors and cause qubits to lose their quantum state (a process known as decoherence). These errors multiply as systems grow more complex, which means that for quantum computers to reach their potential, effective error correction is needed.
- They’re only suitable for specific tasks. As we’ll see, quantum computers have the potential to deliver revolutionary solutions in some specific areas. However, due to the nature of how they work, they’re not expected to offer advantages in all areas of computing.
What is quantum computing used for?
Let’s turn our attention to some of the uses of quantum computing. Thanks to initiatives such as IBM Quantum, organisations can now access quantum computing over the cloud. So what can it be used for?
Below, we’ve highlighted some current and potential quantum computing applications:
- Molecular modelling. Even with current supercomputers, it’s hard to simulate atoms and molecules with any great degree of accuracy. Using quantum computing to simulate quantum physics could provide new insights into things like how batteries operate or how proteins interact, which could help to revolutionise energy storage and medicine.
- Database searching. Given the way that quantum computers solve problems, they could be used to search through massive amounts of data far faster than traditional computers; Grover’s algorithm, for example, offers a quadratic speedup for searching unstructured data.
- Cryptography. A fully functioning quantum computer could potentially break most widely used forms of public-key encryption, which would be a huge concern for cyber security. However, work is ongoing to create quantum-resistant cryptography, and quantum computing could itself become a vital part of the future of cyber and network security.
- Weather forecasting. In the field of meteorology, vast amounts of data and many different variables are needed to create forecasts, and even supercomputers struggle to predict the weather with great accuracy. Quantum computing could allow forecasting models to account for far more of these variables at once, improving accuracy.
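The database-searching speedup mentioned above comes from Grover’s algorithm. As a rough illustration, and again only as a plain NumPy simulation of the underlying maths rather than code for real hardware, the sketch below searches 16 items for one marked entry (index 11 is an arbitrary choice). After just three Grover iterations, the marked item is read out with over 96% probability, whereas a classical random search would need around 8 tries on average:

```python
import numpy as np

n_items = 16   # size of the search space (N)
marked = 11    # index of the item we're looking for (arbitrary example)

# Start in a uniform superposition over all N indices.
state = np.ones(n_items) / np.sqrt(n_items)

# The optimal number of Grover iterations is about (pi/4) * sqrt(N).
iterations = int(round(np.pi / 4 * np.sqrt(n_items)))

for _ in range(iterations):
    # Oracle step: flip the sign of the marked item's amplitude.
    state[marked] *= -1
    # Diffusion step: reflect every amplitude about the mean,
    # which amplifies the marked item's amplitude.
    state = 2 * state.mean() - state

prob = state[marked] ** 2
print(iterations, round(prob, 3))  # prints: 3 0.961
```

The quadratic scaling is the key point: for a million items, roughly 800 iterations would suffice where a classical search needs about 500,000 checks on average.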
How to learn quantum computing
If you’re eager to learn more about quantum computing or want to eventually move into the industry, there are several skills that can be useful. Below, we’ve highlighted some of the areas you may want to develop your expertise in:
- Mathematics. A detailed knowledge of advanced mathematics is incredibly useful in the field of quantum computing. Linear algebra and probability underpin how qubits and quantum algorithms are described, and skills in areas such as algorithm design and data analytics also help.
- Physics. As we’ve discussed, quantum physics forms the foundation of quantum computing. Understanding the link between physics and technology can be hugely beneficial for those wishing to enter the field.
- Programming. Another key area is the ability to write and understand code. One of the key programming languages in quantum computing is Python, which forms the basis of the Qiskit software development kit often used in the industry.
These are some of the essential skills for working in quantum computing. Beyond these, there are various open-source platforms and websites that can help you understand just how these fascinating machines work.
Is quantum computing the future of technology?
The field of quantum computing is still in its infancy. As we’ve seen, the technology is still imperfect and hard to harness, with many unknowns. However, the current and potential uses of quantum computers could change the way we understand the world around us.
From detailed models and simulations to significantly faster problem solving, quantum computers have significant potential. However, whether or not we can fully realise that potential remains to be seen. Companies such as Google and IBM are heavily invested in the technology, so if nothing else, we can expect to see further advancements in the coming years.