
Quantum computing
by Chris Woodford. Last updated: August 4, 2021.
How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite
such amazing advances, there are still plenty of complex problems
that are beyond the reach of even the world's most powerful
computers—and there's no guarantee we'll ever be able to tackle
them. One problem is that the basic switching and memory units of
computers, known as transistors, are now approaching the point where
they'll soon be as small as individual atoms. If we want computers
that are smaller and more powerful than today's, we'll soon need to
do our computing in a radically different way. Entering the realm of
atoms opens up powerful new possibilities in the shape of quantum
computing, with processors that could work millions of times
faster than the ones we use today. Sounds amazing, but the trouble is
that quantum computing is hugely more complex than traditional
computing and operates in the Alice in Wonderland world of quantum
physics, where the "classical," sensible, everyday laws of physics no longer apply. What is
quantum computing and how does it work? Let's take a closer look!
Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics.
What is conventional computing?
You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games—but it's much more
and much less
than that. It's more, because it's a completely general-purpose
machine: you can make it do virtually anything you like. It's
less, because inside it's little more than an extremely basic
calculator, following a prearranged set of instructions called a
program. Like the Wizard of Oz, the amazing things you see in front of you
conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.
Conventional computers have two tricks that they do really well: they can store
numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be
done as a series of additions, for example). Both of a computer's key
tricks—storage and processing—are accomplished using switches
called transistors, which are like microscopic versions of the
switches you have on your wall for turning on and off the lights. A
transistor can either be on or off, just as a light can either be lit
or unlit. If it's on, we can use a transistor to store a number one
(1); if it's off, it stores a number zero (0). Long strings of ones
and zeros can be used to store any number, letter, or symbol using a
code based on binary (so computers store an upper-case letter A as
01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different
characters (such as A-Z, a-z, 0-9, and most common symbols).
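If you'd like to see those binary codes for yourself, here's a quick sketch in Python. It's just an illustration of the idea (using Python's standard ord() and format() functions), not a picture of how any particular memory chip works:

```python
# Print the binary codes that represent a couple of characters.
# ord() gives each character's numeric code; format(..., "08b") writes it as eight bits.
for ch in ["A", "a"]:
    code = ord(ch)               # 65 for "A", 97 for "a"
    bits = format(code, "08b")   # e.g. "01000001" for "A"
    print(ch, code, bits)
```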
Computers calculate by using circuits called logic gates,
which are made from a number of transistors connected together. Logic
gates compare patterns of bits, stored in temporary memories called
registers, and then turn them into new patterns of bits—and
that's the computer equivalent of what our human brains would call
addition, subtraction, or multiplication. In physical terms, the
algorithm that performs a particular calculation takes the form of an
electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
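As a rough illustration of the idea (not a blueprint of any real chip), here's a tiny Python sketch of a half adder, one of the simplest useful logic-gate circuits: an XOR gate produces the sum bit and an AND gate produces the carry bit.

```python
# A half adder built from two logic gates: XOR gives the sum bit, AND gives the carry bit.
def half_adder(a, b):
    return a ^ b, a & b   # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```

Chain a few of these together and you can add numbers of any size; that, in essence, is how logic gates turn patterns of bits into arithmetic.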
The trouble with conventional computers is that they depend on
conventional transistors. This might not sound like a problem if you
go by the amazing progress made in electronics over the last few
decades. When the transistor was invented, back in 1947, the switch
it replaced (which was called the vacuum tube) was about as
big as one of your thumbs. Now, a state-of-the-art microprocessor
(single-chip computer) packs hundreds of millions (and up to
30 billion) transistors onto a chip of silicon the size of your
fingernail! Chips like these, which are called integrated
circuits, are an incredible feat of miniaturization. Back in the
1960s, Intel co-founder Gordon Moore realized that the power of
computers doubles roughly every 18 months—and it's been doing so ever
since. This apparently unshakeable trend is known as Moore's Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits—so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!
It sounds amazing, and it is, but it misses the point. The more
information you need to store, the more binary ones and zeros—and
transistors—you need to do it. Since most conventional computers can
only do one thing at a time, the more complex the problem you want
them to solve, the more steps they'll need to take and the longer
they'll need to do it. Some computing problems are so complex that
they need more computing power and time than any modern machine could
reasonably supply; computer scientists call those intractable
problems.
As Moore's Law advances, so the number of intractable problems
diminishes: computers get more powerful and we can do more with them.
The trouble is, transistors are just about as small as we can make
them: we're getting to the point where the laws of physics seem likely
to put a stop to Moore's Law. Unfortunately, there are still hugely
difficult computing problems we can't tackle because even the most
powerful computers find them intractable. That's one of the reasons
why people are now getting interested in quantum computing.
What is quantum computing?
“Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen.”
Richard Feynman
Quantum theory is the branch of physics that deals with the world of
atoms and the smaller (subatomic) particles inside them. You might
think atoms behave the same way as everything else in the world, in
their own tiny little way—but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman,
one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen."
[1]
If you've studied light, you may already know a bit about quantum
theory. You might know that a beam of light sometimes behaves as
though it's made up of particles (like a steady stream of
cannonballs), and sometimes as though it's waves of energy rippling
through space (a bit like waves on the sea). That's called wave-particle duality
and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that
something can be two things at once—a particle and a
wave—because it's totally alien to our everyday experience: a car is
not simultaneously a bicycle and a bus. In quantum theory, however,
that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of
quantum theory, we can imagine a situation where something like a cat
could be alive and dead at the same time!
What does all this have to do with computers? Suppose we keep on pushing
Moore's Law—keep on making transistors smaller until they get to the
point where they obey not the ordinary laws of physics (like
old-style transistors) but the more bizarre laws of quantum
mechanics. The question is whether computers designed this way can do
things our conventional computers can't. If we can predict
mathematically that they might be able to, can we actually make them
work like that in practice?
People have been asking those questions for several decades.
Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum
computing in the 1960s when he proposed that information is a physical entity
that could be manipulated according to the laws of physics.
[2]
One important consequence of this is that computers waste energy manipulating the bits inside them
(which is partly why computers use so much energy and get so hot, even though they appear to be doing
not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent
this problem by working in a "reversible" way, implying that a quantum computer could
carry out massively complex computations without using massive amounts of energy.
[3]
In 1980, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles
of quantum physics—in other words, a quantum Turing machine.
[4]
The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations.
[5]
A few years later, Oxford University's David Deutsch
(one of the leading lights in quantum computing) outlined the
theoretical basis of a quantum computer in more detail.
[6]
How did these great scientists imagine that quantum computers might work?
Quantum + computing = quantum computing
The key features of an ordinary computer—bits, registers, logic gates,
algorithms, and so on—have analogous features in a quantum computer.
Instead of bits, a quantum computer has quantum bits or qubits,
which work in a particularly intriguing way. Where a bit can store
either a zero or a one, a qubit can store a zero, a one, both
zero and one, or an infinite number of values in between—and
be in multiple states (store multiple values) at the same time!
If that sounds confusing, think back to light being a particle and
a wave at the same time, Schrödinger's cat being alive and dead, or a
car being a bicycle and a bus. A gentler way to think of the numbers
qubits store is through the physics concept of superposition
(where two waves add to make a third one that contains both of the
originals). If you blow on something like a flute, the pipe fills up
with a standing wave: a wave made up of a fundamental frequency (the
basic note you're playing) and lots of overtones or harmonics
(higher-frequency multiples of the fundamental). The wave inside the
pipe contains all these waves simultaneously: they're added together
to make a combined wave that includes them all. Qubits use
superposition to represent multiple states (multiple numeric values)
simultaneously in a similar way.
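To make that a little more concrete, here's a minimal sketch in Python (using NumPy) of a single qubit written as a two-component vector of amplitudes, which is how quantum theory describes it on paper. It isn't a simulation of any real hardware; the Hadamard gate used here is a standard quantum operation that turns a definite zero into an equal superposition of zero and one, a bit like the fundamental and overtones sounding together in the flute.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                # the definite "0" state
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # the Hadamard gate

qubit = H @ ket0       # an equal superposition of 0 and 1
print(qubit)           # amplitudes: roughly [0.707, 0.707]
```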
Just as a quantum computer can store multiple numbers at once, so it can
process them simultaneously. Instead of working in serial (doing a
series of things one at a time in a sequence), it can work in
parallel (doing multiple things at the same time). Only when you
try to find out what state it's actually in at any given moment
(by measuring it, in other words) does it "collapse" into one of its possible states—and
that gives you the answer to your problem. Estimates suggest that, for certain kinds of problems, a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer... if only we could build it!
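Sticking with our little NumPy sketch, here's what that "collapse" looks like in the textbook math (again, this is only the arithmetic of amplitudes and probabilities, not a model of any real machine): each measurement gives a zero or a one with a probability equal to the square of the corresponding amplitude.

```python
import numpy as np

rng = np.random.default_rng()

qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition of 0 and 1
probs = np.abs(qubit) ** 2                             # [0.5, 0.5]

# Each measurement "collapses" the state to 0 or 1 with those probabilities.
samples = rng.choice([0, 1], size=1000, p=probs)
print("got 0:", (samples == 0).sum(), "times; got 1:", (samples == 1).sum(), "times")
```

So how would we actually build such a machine?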
What would a quantum computer be like in reality?
In reality, qubits would have to be stored by atoms, ions (atoms with
too many or too few electrons), or even smaller things such as electrons
and photons (energy packets), so a quantum computer would be almost like a table-top
version of the kind of particle physics experiments they do at
Fermilab or CERN. Now you wouldn't be racing particles round giant
loops and smashing them together, but you would need mechanisms for
containing atoms, ions, or subatomic particles, for putting them into certain
states (so you can store information), knocking them into other states (so you can
make them process information), and figuring out what their states are after particular
operations have been performed.

Photo: A single atom or ion can be trapped in an optical cavity—the space between mirrors—and controlled by precise pulses from laser beams.
In practice, there are lots of possible ways of containing atoms and changing their states using
laser beams, electromagnetic
fields, radio waves, and an assortment of other techniques.
One method is to make qubits using
quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method
makes qubits from what are called ion traps: you add or take away
electrons from an atom to make an ion, hold it steady in a kind of laser spotlight
(so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight),
and then flip it into different states with laser pulses. In another technique,
the qubits are photons inside optical cavities (spaces between
extremely tiny mirrors). Don't worry if you don't understand; not many people do. Since the entire
field of quantum computing is still largely abstract and theoretical, the only thing we really need to know
is that qubits are stored by atoms or other quantum-scale particles that can
exist in different states and be switched between them.
What can quantum computers do that ordinary computers can't?
Although people often assume that quantum computers must automatically be
better than conventional ones, that's by no means certain. So far,
just about the only thing we know for certain that a quantum computer could do better than a
normal one is factorization: finding two unknown prime numbers that,
when multiplied together, give a third, known number. In 1994,
while working at Bell Laboratories, mathematician Peter Shor
demonstrated an algorithm that a quantum computer
could follow to find the "prime factors" of a large number, which
would speed up the problem enormously.
[7]
Shor's algorithm really
excited interest in quantum computing because virtually every modern
computer (and every secure, online shopping and banking website) uses
public-key encryption technology based on the virtual
impossibility of finding prime factors quickly (it is, in other words, essentially
an "intractable" computer problem). If quantum computers could
indeed factor large numbers quickly, today's online security could be
rendered obsolete at a stroke. But what goes around comes around,
and some researchers believe quantum technology will lead to
much stronger forms of encryption.
(In 2017, Chinese researchers demonstrated for the first time
how quantum encryption could be used to make a very secure video call
from Beijing to Vienna.)
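To see what's at stake, here's the classical version of the problem in a few lines of Python: trial division, the slow, brute-force way of recovering the two primes. This is emphatically not Shor's algorithm (which relies on quantum tricks to get its speedup); it's just the problem Shor's algorithm attacks, and the numbers here are tiny compared with the hundreds-of-digits numbers used in real encryption.

```python
# Classical, brute-force factoring by trial division.
# Fine for small numbers; hopeless for the enormous numbers used in public-key encryption.
def factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1   # n itself is prime

print(factor(15))     # (3, 5)
print(factor(2021))   # (43, 47)
```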
Does that mean quantum computers are better than conventional ones? Not
exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would
be better performed by quantum methods. Given enough time and
computing power, conventional computers should still be able to solve
any problem that quantum computers could solve, eventually. In
other words, it remains to be proven that quantum computers are
generally superior to conventional ones, especially given the difficulties of
actually building them. Who knows how conventional computers might advance
in the next 50 years, potentially making the idea of quantum computers irrelevant—and even absurd.
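For a flavor of how Grover's algorithm works, here's a small NumPy sketch of a quantum search over just four items (two qubits), simulated classically with a state vector. The "oracle" flips the sign of the marked item's amplitude and the "diffusion" step reflects every amplitude about the average; for four items, a single round makes the marked item certain to show up when measured. This is the textbook algorithm on a toy problem, not code for real quantum hardware.

```python
import numpy as np

N = 4
target = 2                              # index of the item we're searching for

state = np.full(N, 1 / np.sqrt(N))      # start in a uniform superposition of all items

state[target] *= -1                     # oracle: mark the target by flipping its sign

mean = state.mean()                     # diffusion: reflect every amplitude about the average
state = 2 * mean - state

print(np.abs(state) ** 2)               # probabilities: [0, 0, 1, 0] -- the target stands out
```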

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.
Why is it so hard to make a quantum computer?
We have decades of experience building ordinary, transistor-based computers with conventional architectures;
building quantum machines means reinventing the whole idea of a computer from the bottom up.
First, there are the practical difficulties of making qubits, controlling them very precisely,
and having enough of them to do really useful things.
Next, there's a major difficulty with errors inherent in a quantum system—"noise" as this is technically
called—which seriously compromises any calculations a quantum computer might make.
There are ways around this ("quantum error correction"), but they introduce a great deal more complexity.
There's also the fundamental issue of how you get data in and out of a quantum computer,
which is, itself, a complex computing problem.
Some critics believe these issues are insurmountable;
others acknowledge the problems but argue the mission is too important to abandon.
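The flavor of the error-correction idea mentioned above is easiest to see with a classical analogy: store each bit several times and take a majority vote, as in the Python sketch below. Real quantum error correction is far subtler, because qubits can't simply be copied and their errors aren't clean flips, but the underlying idea of spreading one logical bit across many physical ones to survive noise is similar. (The 10 percent error rate used here is just an illustrative number.)

```python
import random

random.seed(0)

# Classical repetition code: store a bit three times, then take a majority vote.
def store_and_recover(bit, flip_prob, copies=3):
    noisy = [bit ^ (random.random() < flip_prob) for _ in range(copies)]
    return int(sum(noisy) > copies / 2)

trials = 10_000
errors = sum(store_and_recover(1, flip_prob=0.10) != 1 for _ in range(trials))
print("raw error rate:       ~10%")
print("corrected error rate:", errors / trials)   # roughly 3% in this toy example
```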
How far off are quantum computers?
Four decades after they were first proposed, quantum computers remain
largely theoretical. Even so, there's been some encouraging progress
toward realizing a quantum machine. There were two impressive
breakthroughs in 2000. First, Isaac Chuang
(now an MIT professor, but then working at IBM's
Almaden Research Center) used five fluorine atoms to make a crude,
five-qubit quantum computer. The same year, researchers at Los
Alamos National Laboratory figured out how to make a seven-qubit
machine using a drop of liquid. Five years later, researchers at the
University of Innsbruck added an extra qubit and produced the first
quantum computer that could manipulate a qubyte (eight qubits),
later bumping the number up to 14 qubits.
These were tentative but important first steps.
Over the next few years, researchers announced more ambitious experiments, adding
progressively greater numbers of qubits. By 2011, a pioneering Canadian
company called D-Wave Systems
announced in Nature that it had produced a 128-qubit
machine
[8]; the announcement proved
highly controversial
and there was a lot of debate over whether the company's machines had really demonstrated quantum behavior.
Three years later, Google announced that it was hiring a team of academics (including University of California
at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach.
In March 2015, the Google team
announced they were "a step closer to quantum computation," having developed
a new way for qubits to detect and protect against errors.
In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck
unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might
evolve into the long-promised, fully fledged encryption buster.
There's no doubt that these are hugely important advances,
and the signs are growing steadily more encouraging that quantum
technology will eventually deliver a computing revolution.
In December 2017, Microsoft unveiled a complete
quantum development kit, including a new computer language, Q#, developed specifically for
quantum applications. In early 2018,
D-Wave announced plans to start rolling out quantum power to a
cloud computing platform.
A few weeks later, Google announced Bristlecone, a quantum processor
based on a 72-qubit array, that might, one day, form the cornerstone of
a quantum computer that could tackle real-world problems.
In October 2019, Google
announced it had reached another milestone: the achievement of "quantum supremacy" (the point at which a quantum
computer can beat even the best conventional machine at a specific computing task),
though not everyone was convinced; IBM, for example,
disputed the claim.
One thing is beyond dispute: quantum computing is very exciting—and you can find out just how exciting
by tinkering with it for yourself. In 2019, Amazon's AWS cloud-computing
offshoot announced a service called Braket, which gives its users access to quantum computing simulators based on machines being developed by three cutting-edge companies (D-Wave, IonQ, and Rigetti). Microsoft's Azure cloud platform offers
a rival service called Azure Quantum, while Google's Quantum AI website
offers access to its own research and resources. Take your pick—or try them all.
Despite all this progress, it's early days for the whole field, and most
researchers agree that we're unlikely to see practical quantum
computers appearing for some years—and more likely several decades.
The conclusion reached by an influential National Academies of Sciences, Engineering,
and Medicine report in December 2018 was that "it is still too early to be able to predict the time horizon for a practical quantum computer" and that "many technical challenges remain to be resolved before we reach
this milestone."