Quantum Computing Made Easy

Quantum Computing in Simple Terms

Introduction

The dictionary describes a computer as “a programmable electronic device designed to accept data, perform prescribed mathematical and logical operations at high speed, and display the results of these operations”.  At the heart of the computer is the microchip, which can be found in virtually every electronic device worldwide.  The number of transistors on these chips has, astonishingly, increased from a few thousand in 1971 to over five billion today.  Throughout this time, the same mathematical model of the computer has remained in place.  But as transistors continue to shrink, the physics that describes them will eventually enter the strange realm of quantum mechanics.  This means that the previous rules, and the mathematics that goes with them, may soon be overturned.

Quantum computing is a fascinating topic that combines the fields of quantum physics, information theory and computer science.  A quantum computer can perform certain tasks much more quickly than any regular computer – even the greatest supercomputers.  This is because quantum phenomena, which are unavailable to an everyday computer, can be exploited.  Concepts such as quantum entanglement, quantum interference and superposition can produce unique computational results for certain problems.

The age of quantum computing first became a possibility when a theoretical model for a computer, based on quantum mechanics, was proposed in the early 1980s.  At the time, however, researchers did not know how such a device could be constructed.  This did not deter debate about what types of programs could run on a quantum computer, although all the suggestions were considered academically interesting but impractical.  The invention of Shor’s algorithm in the mid-90s changed all that.  This ground-breaking advance led to a huge upturn in interest in quantum computing from governments, corporations and the media – one that is still felt today.

Qubit vs Bit

In everyday computing, a unit of data is known as a bit (shorthand for binary digit); each bit has one of two values, either 0 or 1.  For a large part of the 20th century, the bit was characterised by the presence or absence of a hole at certain positions in a punch card.  In modern computing, a bit is usually based upon an electronic signal controlled by a transistor, a current (or voltage) within a cable, or magnetic domains on the surface of a material.  A set of eight bits is known as a byte, which can encode a single character, such as a letter or a symbol.  The byte, in multiples such as megabytes (10^6 bytes), gigabytes (10^9 bytes) or terabytes (10^12 bytes), is used to define the storage capacity of random-access memory and hard disk drives.
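As a minimal sketch in Python (the character ‘A’ is just an arbitrary example), a single character fits into one byte of eight bits:

    # Encode a single character as one byte (eight bits).
    character = "A"
    byte_value = ord(character)        # the character's numeric code: 65
    bits = format(byte_value, "08b")   # the same value as eight bits: '01000001'
    print(character, byte_value, bits)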

The qubit, shorthand for quantum bit, is a unit of data that can hold the values 0 and 1 at the same time.  A set of eight qubits is known as a qubyte.  As its name suggests, a qubit follows the rules of quantum mechanics – meaning that its two states (which represent the 0 and 1) can exist in a superposition.  The main difference between quantum computing and regular computing is the possibility of quantum entanglement and quantum interference acting between the qubits.

Sometimes the qubit is called a quantum coin, since only two outcomes are possible.  It works on the same principle as Schrödinger’s cat, which is both alive and dead – i.e. in a superposition – until an observation is made by opening the box.  In the case of the qubit, this means that either state 0 or state 1 will be seen when the quantum system is observed or measured.  For example, a fair quantum coin has a probability of ½ for state 0 and ½ for state 1.  As you would expect, the two probabilities of a quantum coin always add up to 1.
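As a minimal sketch, a fair quantum coin can be simulated in Python with NumPy: the qubit is stored as two amplitudes, each outcome’s probability is the squared magnitude of its amplitude, and repeated observations land on 0 and 1 roughly half the time each (the variable names are our own):

    import numpy as np

    # A fair quantum coin: equal amplitudes for state 0 and state 1.
    state = np.array([1, 1]) / np.sqrt(2)

    # Measurement probabilities are the squared magnitudes of the amplitudes.
    probabilities = np.abs(state) ** 2
    print(probabilities)            # [0.5 0.5] -- they always add up to 1

    # Simulate 1000 observations; each collapses the coin to 0 or 1.
    rng = np.random.default_rng()
    outcomes = rng.choice([0, 1], size=1000, p=probabilities)
    print(np.bincount(outcomes))    # roughly 500 of each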

Quantum Circuits

The processing of data, in the form of qubits, is a major requirement of a quantum computer.  This is achieved using quantum logic gates, the simplest of which modify individual qubits.  Three quantum gates of special interest are the bit flip (X), the phase flip (Z) and the Hadamard transform (H).  Quantum circuits are sets of wires that carry qubits to and from the quantum logic gates that modify them.
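Each of these gates is simply a small matrix that multiplies the qubit’s two amplitudes.  A minimal NumPy sketch, applying each gate to a qubit prepared in state 0 (the matrices are the standard definitions; the layout is our own):

    import numpy as np

    X = np.array([[0, 1], [1, 0]])                 # bit flip: swaps 0 and 1
    Z = np.array([[1, 0], [0, -1]])                # phase flip: negates the amplitude of 1
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates a superposition

    zero = np.array([1, 0])   # a qubit prepared in state 0

    print(X @ zero)   # [0 1]          -> state 1
    print(Z @ zero)   # [1 0]          -> state 0 is unchanged by a phase flip
    print(H @ zero)   # [0.707 0.707]  -> an equal superposition of 0 and 1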

We imagine the qubits moving through the wires.  Because of their quantum nature, these circuits are reversible, meaning that the number of input qubits is matched by the number of outputs.  Some of these outputs may be useless and treated as garbage, but they are needed to preserve reversibility.  When an observation or measurement is made, the superposition of the qubit is lost and the result behaves like an ordinary bit.  Complexities may arise, because of quantum entanglement, when multiple qubits are input into the circuit.  In fact, a quantum logic gate that acts on two qubits at once can create an entangled state.
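Before entanglement enters the picture, the reversibility mentioned above can be checked directly: every quantum gate is a unitary matrix, and the X and H gates shown earlier happen to be their own inverses, so applying either one twice returns the original state.  A quick sketch:

    import numpy as np

    X = np.array([[0, 1], [1, 0]])
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    state = np.array([1, 0])                  # start in state 0
    print(H @ (H @ state))                    # [1 0] -- back where we started
    print(np.allclose(X @ X, np.eye(2)))      # True: applying X twice does nothing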

The phenomenon of quantum entanglement is vital to the operation of a quantum computer.  Entangled qubits are totally intertwined and cannot be treated as individual, isolated entities.  The simplest example of a quantum state for entangled qubits is known as a Bell state.  In such a scenario, observation finds the first qubit in either state 0 or state 1 (each with a probability of ½), and the second qubit simultaneously matches this result.  To be clear, although the observation occurs at the first qubit, the superposition of the second qubit is also instantly lost.
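A minimal NumPy sketch makes this correlation visible: we place amplitude 1/√2 on the basis states 00 and 11 (a Bell state) and simulate repeated joint observations; the two qubits always agree (the sampling code is our own illustration):

    import numpy as np

    # The Bell state: amplitude 1/sqrt(2) on the basis states 00 and 11.
    bell = np.zeros(4)
    bell[0b00] = 1 / np.sqrt(2)
    bell[0b11] = 1 / np.sqrt(2)

    probabilities = np.abs(bell) ** 2          # [0.5, 0, 0, 0.5]

    # Simulate 10 joint observations of both qubits.
    rng = np.random.default_rng()
    for index in rng.choice(4, size=10, p=probabilities):
        first, second = index >> 1, index & 1
        print(first, second)                   # always 0 0 or 1 1, never mixed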

DiVincenzo’s Criteria

When more qubits are added to create a quantum computer, there is an exponential growth in data: n qubits carry 2^n amplitudes.  This means that, when around 500 qubits are present, there is seemingly more data available than there are particles in the observable universe.  The problem is that, following an observation, almost all of this data is lost.  Only the information connected to the observed state is found.  The remaining gigantic dataset cannot be replicated (due to the no-cloning principle, the notion that a qubit cannot be copied) and, thus, cannot be saved from extinction.  Despite this inevitable data loss, the role of a quantum computer is to manipulate the unseen data before any observation is made – so that unique and previously unattainable results can be obtained.  This means that the observation should be the final step in the quantum computing process.
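The arithmetic behind this claim is easy to check in Python: 2^500 is around 3 × 10^150, while the observable universe is estimated to hold roughly 10^80 particles:

    # An n-qubit register carries 2**n amplitudes.
    for n in (10, 50, 500):
        print(n, "qubits ->", 2**n, "amplitudes")

    # The observable universe holds roughly 10**80 particles.
    print(2**500 > 10**80)   # True, by an enormous margin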

A quantum computer, i.e. a programmable quantum information processing device, performs tasks by modifying a set of labelled qubits in a quantum register.  This is represented by quantum circuits that contain quantum logic gates.  They differ from electronic circuits in everyday computers because phenomena such as quantum entanglement and quantum interference can be exploited in them.  Quantum computers can perform both everyday computing tasks and the quantum mechanical processing of qubits.  However, unless quantum algorithms are used – such as Shor’s algorithm or Grover’s algorithm – no quantum results are attainable; if a quantum computer is not running a quantum algorithm, it is no different to a regular computer.
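As a minimal sketch of what a quantum algorithm looks like, here is Grover’s algorithm on just two qubits, simulated in NumPy.  With four items and one of them marked (we arbitrarily mark item 11), a single Grover iteration finds the marked item with certainty – something no classical strategy can guarantee with a single query:

    import numpy as np

    # Grover's algorithm on two qubits: search four items for the marked one (11).
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    uniform = np.kron(H, H) @ np.array([1, 0, 0, 0])   # equal superposition of 00..11

    oracle = np.diag([1, 1, 1, -1])                    # flips the sign of the marked item
    diffusion = 2 * np.outer(uniform, uniform) - np.eye(4)

    state = diffusion @ (oracle @ uniform)             # one Grover iteration
    print(np.abs(state) ** 2)                          # ~[0 0 0 1]: the marked item, with certainty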

The five criteria for building a quantum computer were proposed by the theoretical physicist David DiVincenzo in 2000. These are:

1. Start the system in a well-defined and stable quantum state

2. Have a sufficiently large number of individually addressable qubits

3. Be able to apply quantum logic gates to each qubit or set of qubits

4. Preserve the qubits in the quantum register for a large number of operations

5. Read out the quantum state of each qubit with high fidelity

Research on all five criteria has greatly advanced in the last 30–40 years.  However, present-day quantum registers are still limited to a few or, at most, tens of qubits, while a viable quantum computer requires several hundred to several thousand.  A major cause of this limitation is the effect known as quantum decoherence.

Quantum Decoherence

Despite the research advancements at the time, a viable quantum computer was still considered improbable in the mid-90s.  This is because quantum systems are notoriously fragile, which means that their quantum characteristics are easily lost.  In terms of quantum computing, this loss or decoherence originates from the interaction of the qubits with their surroundings.  This has the same effect as an observation.

Quantum decoherence is probably the biggest hurdle to overcome when constructing a quantum computer.  It represents a decay of the data encoded into the quantum register, so that errors continually arise.  Unfortunately, this decay grows as the number of qubits increases.  Consequently, the viability of a quantum computer is compromised.  It is, therefore, vital to develop a strategy to reduce the effects of quantum decoherence.  The recent developments in error correction techniques have helped achieve this aim.

In everyday computing, error correction – the reconstruction of corrupted data – is achieved by data copying.  However, the no-cloning principle excludes this possibility in a quantum computer.  Surprisingly, though, techniques already available for regular computing can be modified to create sophisticated quantum error correction techniques.  Primarily, they involve encoding the data into additional qubits, which then check for and eliminate various types of error.  This vast subject has become one of the most extensively developed areas of quantum computing.  These error correction techniques, combined with fault-tolerant procedures, have made a robust quantum computer a strong possibility in the near future.
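As a rough sketch of the idea (not a full fault-tolerant scheme), the simplest example is the three-qubit bit-flip code: one logical qubit is encoded into three physical qubits, and parity checks reveal which qubit, if any, was flipped – without ever observing the encoded data itself.  A NumPy simulation follows, with helper names of our own choosing; note that a real device would measure the parities via extra ancilla qubits, whereas here we simply inspect the simulated state:

    import numpy as np

    def apply_x(state, qubit):
        """Flip the given qubit (0, 1 or 2) in a 3-qubit statevector."""
        flipped = np.zeros_like(state)
        for index in range(8):
            flipped[index ^ (1 << (2 - qubit))] = state[index]
        return flipped

    # Encode a logical qubit a|0> + b|1> as a|000> + b|111>.
    a, b = 0.6, 0.8
    encoded = np.zeros(8)
    encoded[0b000], encoded[0b111] = a, b

    # A bit-flip error strikes qubit 1.
    corrupted = apply_x(encoded, 1)

    # Parity checks (the syndrome) locate the error without revealing a or b.
    index = np.flatnonzero(corrupted)[0]           # any occupied basis state
    bits = [(index >> shift) & 1 for shift in (2, 1, 0)]
    syndrome = (bits[0] ^ bits[1], bits[1] ^ bits[2])
    error_qubit = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome]

    # Correct the error by flipping the identified qubit back.
    if error_qubit is not None:
        corrupted = apply_x(corrupted, error_qubit)
    print(np.allclose(corrupted, encoded))         # True: the data is restored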

Can Quantum Computers Solve NP-Complete Problems in Polynomial Time?

The short answer is we do not know.

Some researchers believe that quantum computers cannot solve NP-complete problems in polynomial time, but no definitive proof has been offered so far to support this position.

Since the discovery of Shor’s algorithm, great efforts have been made to find quantum algorithms that solve NP-complete problems in polynomial time.  So far, none has been found.

This blog post acts as a taster for the book “Quantum Computing Made Easy”, which is sponsored by Satalia.

---

Satalia is an optimization company founded in 2007 by Dr Daniel Hulme. We solve hard problems using data science, optimization and artificial intelligence, and were named a 2016 Gartner Cool Vendor.