# What Happened to the Quantum Computer?

Published March 1, 2008

**By Cory Dean**


**For several decades now, the “quantum computer” has promised to be the next great evolution in technology. Despite intense research by many of the world’s leading scientists, no significant progress has been made towards realizing an actual working device. Recent discoveries in fundamental electron physics, together with a paradigm shift in our approach to quantum computation, have generated significant excitement that the quantum computer may soon be realized.**

Processing speeds of modern computers have continually improved, largely due to technological advancements allowing for smaller and smaller transistors (the backbone of the computer chip) to be fabricated, which in turn allows more transistors to be packed into a single processor chip. There is, however, a physical limit to how small a device can be made, which places a potential limit on how fast a conventional computer can possibly get. It is widely believed that the peculiar properties of quantum physics will not only allow us to overcome this rapidly approaching limit, but could lead to advancements in computational power that will never be possible with conventional technology. In spite of decades of intense research, however, no one has yet managed to build a viable *quantum computer.*

**Beyond the classical computer**

All computers function by storing information in binary code, where each “bit” of memory takes one of two states (“on” or “off”, conventionally represented by the values 1 and 0). Strings of such bits are then combined to code everything from simple text, to complex graphic vectors that can be manipulated in real time, to detailed simulations used to explore the fundamental nature of the world around us. Incredibly, even the most complex computer process results from simply manipulating these strings of 1’s and 0’s according to only a few logic rules. Given the bare-bones simplicity of this binary structure, the complexity and versatility of the modern computer is truly remarkable.

The use of quantum states in computer technology represents a natural progression since two level quantum systems (i.e. systems where only two possible states are allowed by the laws of quantum mechanics) are abundant and easy to manipulate. For example, all fundamental particles possess a quantum property termed “spin”. When placed in a sufficiently strong magnetic field, a particle with spin 1/2 (such as an electron) will either align with the magnetic field or anti-align with it, with no other spin orientations allowed. By labeling the aligned state “1”, and the anti-aligned state “0”, a single electron, one of the smallest particles in the universe, can be used to represent a single bit of data. This would immediately mark a significant improvement over conventional technology, where a single bit of data is generally stored in a macroscopic collection of many millions of atoms. The most exciting prospect of realizing quantum computation, however, lies with the expectation that we will be able to exploit the bizarre property of *quantum superposition.*

In a classical two level system, a bit can only take one or the other value, i.e. it is *either* a “1” *or* “0”. In a quantum two level system, however, the laws of quantum mechanics allow the quantum bit to exist in any superposition of these two states. In other words, the qubit is not restricted to taking only the values “1” *or* “0”, but instead can be prepared in a so-called *superposition state* (or mixed state), where it exists in both the “1” *and* “0” states simultaneously. This state mixing allows for an exponential increase in the amount of information that can be coded into a single bit. Algorithms based on the manipulation of these quantum bits, which can theoretically process an unrestricted number of calculations in parallel, are therefore expected to lead to significantly faster computational speeds. This, in turn, promises to lead to more efficient handling of computationally intensive calculations, such as database searching, and large number factorization (extremely important for code breaking). Some even believe that quantum computation might allow us to tackle certain kinds of problems thought to be computationally intractable with conventional technology (the prototypical example is the so-called *traveling salesman problem*, which asks for the shortest route that visits each of a set of cities exactly once).
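To make the scaling concrete: a register of *n* classical bits holds one value at a time, while a superposition over *n* qubits is described by 2^n complex amplitudes at once. The following is a minimal illustrative sketch in plain Python (no quantum library; the variable names are our own invention), representing a single qubit as a pair of amplitudes:

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) for the "0" and
# "1" states, with |a|^2 + |b|^2 = 1. On measurement, the outcome is 0
# with probability |a|^2 and 1 with probability |b|^2.
zero = (1.0, 0.0)   # definite "0", like a classical bit
one = (0.0, 1.0)    # definite "1"

# An equal superposition: both outcomes equally likely when measured.
s = (1 / math.sqrt(2), 1 / math.sqrt(2))

prob_0 = abs(s[0]) ** 2
prob_1 = abs(s[1]) ** 2

# n qubits require 2**n amplitudes to describe -- the exponential
# growth in state space the article refers to.
n = 16
amplitudes_needed = 2 ** n
print(prob_0, prob_1, amplitudes_needed)
```

A 16-qubit register thus needs 65,536 amplitudes to describe, which hints at why simulating quantum systems on classical machines quickly becomes infeasible.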

**Limitations to realizing a quantum computer**

So, with the field of computer science more than 50 years old, and with our understanding of quantum mechanics even more mature, why are we not already using quantum computers? While quantum superposition offers the possibility of huge advances in computational power, the inherently fragile nature of these mixed states has so far presented an insurmountable obstacle to realizing an actual working device.

In the theoretical quantum computer, a calculation proceeds according to three steps: i) A collection of qubits is initialized into some known quantum state (for example, 16 electron-spin qubits could all be placed in the aligned state, representing a binary string of 16 “1’s”); ii) This initial state is transformed to some well defined final state (the first qubit is changed to a “0”, etc.); iii) The final quantum state is then measured^{1}. In this simple algorithm, the desired computation is achieved by the transformation which takes the system from the initial to the final state. Importantly, any errors that appear while performing this transformation can influence the quantum state, causing anything from total destruction of the mixed quantum state, to an arbitrary transition to a new mixed state. Ultimately, the final measured state of the system is not determined solely by the applied transformation, but will also be influenced by how well the quantum system is isolated from sources of error. These errors can be *any* kind of error, from normal manufacturing limitations, to random fluctuations in the environment containing the quantum state such as, for example, small temperature variations.
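The three-step procedure can be caricatured in plain Python for the simple case where every qubit stays in a definite basis state (a deliberately classical toy, not a real quantum simulation; the bit-flip operation stands in for the transformation in the article’s example):

```python
# Step (i): initialize 16 qubits, each in the aligned state "1".
# Since each qubit here is a definite basis state, a list of ints suffices.
n_qubits = 16
state = [1] * n_qubits

# Step (ii): apply a well-defined transformation -- here a bit-flip
# (the NOT, or Pauli-X, gate) on the first qubit, as in the article.
def bit_flip(bits, index):
    flipped = list(bits)
    flipped[index] ^= 1
    return flipped

state = bit_flip(state, 0)

# Step (iii): "measure" the final state. For definite basis states this
# simply reads the values out; a genuine superposition would instead
# collapse randomly to one outcome, which is where errors do damage.
result = "".join(str(b) for b in state)
print(result)   # "0111111111111111"
```

The fragility discussed above enters at step (ii): in a real device, any stray interaction with the environment acts as an unwanted extra transformation on the state.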

Dealing with errors is not a new problem. Even conventional computers function reliably only because of built-in error handling. Typically, this is achieved by introducing redundancies, where multiple copies of information are generated, then checked against, with corrections made as needed. Quantum systems, on the other hand, pose a unique problem where we can not simply monitor the wave function for errors since any measurement performed can itself destroy the mixed state.
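The classical redundancy scheme described above can be sketched as a three-copy repetition code with majority-vote correction (a toy classical illustration, not one of the quantum routines cited in this article):

```python
from collections import Counter

def encode(bit):
    """Store three redundant copies of a single bit."""
    return [bit, bit, bit]

def correct(copies):
    """Majority vote: recovers the bit as long as at most one copy flipped."""
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)
codeword[1] = 0              # an error strikes one of the copies
recovered = correct(codeword)
print(recovered)             # 1 -- the single error is corrected
```

The crucial difference in the quantum case is that "reading" the copies to compare them is itself a measurement, which is why the quantum schemes must check agreement without extracting the encoded information.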

To get around this problem, algorithms have been proposed that, similar to the method of error handling in classical computers, create multiple copies of the quantum state. Subsequent comparisons only check whether these redundant states remain identical, without actually measuring the information contained in the states^{1}. These error correction routines, however, present their own challenges since they perform successfully only if the errors occur at extremely low rates. Estimates suggest, for example, that the quantum computer would need to be capable of performing 10^4 to 10^6 consecutive calculations without error, before compounding errors would foil these routines^{2}. These stringent error demands are far beyond what can be realistically achieved in real-world systems, leading many to wonder whether quantum computation could ever actually be achieved.

**A new approach to quantum computation**

In 2003 Alexei Kitaev, a professor of theoretical physics and computer science at Caltech, proposed that instead of focusing on error correction, or even error reduction, a more practical approach might be to look at quantum systems which are *fault tolerant*^{3}. Instead of trying to make existing quantum systems more robust, for example by attempting to better isolate them from the outside world, he proposed a new means of encoding information in quantum states that would be inherently insensitive to errors. Kitaev’s breakthrough was the realization that *topology* represents a uniquely robust property.

Topology is the branch of mathematics concerned with describing geometric properties that are preserved when an object is deformed. For example, the topologies of a coffee cup and a doughnut are said to be identical since both can be described by a continuous surface punctured by a single hole. Furthermore, by stretching and reshaping the coffee cup we can continuously deform it into a doughnut without tearing or rejoining the surface (Fig. 1a). The transformation from a cup to a doughnut represents an extreme change in the apparent shape of the object, but its overall topology remains intact. A second example is given in Fig. 1b, in which we imagine a rope in a closed loop with a knot. Twisting and bending the rope does not alter its topology; only a severe deformation, such as cutting the rope, can cause a topological change. Encoding information in a quantum system with a well defined topology could potentially avoid the problem of error correction altogether, since errors could then crop up and even freely distort the states without causing information loss. Only severe perturbations that change the overall topology would have any effect on the stored information. Error handling then becomes the much more manageable problem of reducing only certain kinds of errors, instead of having to eliminate all potential errors.

This apparent solution, however, creates the new problem of finding a suitable quantum system. This turns out to be a serious issue since most physical quantum systems do not possess a robust, well defined topology, but instead are destroyed by even the smallest perturbation^{2}. Worse, beyond simply finding the right kind of particles, the topological quantum computer proposed by Kitaev requires that we exploit a completely new kind of particle, with quantum properties that, while theoretically possible, have never actually been observed.

Elementary particles were historically classified as either “bosons” or “fermions”, according to the quantum statistics they exhibit. Recent experiments have shown that these actually constitute a subset of a larger, more general, class of particles, termed anyons (“any”-ons)^{4}. The proposed topological quantum computer, with all its promise of inherent fault tolerance, requires that we find an even more exotic class of particles termed “non-abelian anyons”, which are predicted to exhibit even more generalized behaviour.

Recent advances in two-dimensional electron physics have led to the discovery of what might possibly be this new state of matter. The observation of the so-called “5/2 state” (formed when electrons confined to two dimensions, and in a strong magnetic field, combine to create a new kind of composite particle) remains unexplained; leading theories, however, suggest the corresponding composite particles may in fact be the elusive “non-abelian anyons.” While several experiments have been proposed to probe the quantum properties of this 5/2 state (see, for example, references ^{5,6}), advancements in sample fabrication and experimental techniques have only very recently made these explorations possible.

**What does all of this mean for the future of quantum computation?**

Realistically, even if the 5/2 state is shown to exhibit the necessary statistics, this still offers no immediate practical solution to the ultimate goal of a quantum computer, since this state is only observed at extremely low temperatures (very near absolute zero). Here again, though, semiconductor physics is making hopeful strides. Just this past year, the quantum Hall effect, a necessary precursor to the 5/2 state, was observed for the first time at room temperature by examining new graphene (single-layered graphite) samples^{7}. This remarkable achievement, nearly 30 years in the making, gives serious hope that some day the more exotic quantum Hall states (such as the 5/2) may also be observable, and therefore usable, in room temperature devices.

The recent observation of room temperature quantum Hall physics, together with the potential existence of a non-abelian anyonic state (5/2), may finally provide all of the ingredients necessary for the dream of an actual working, and even practical, quantum computer to become a reality. However, we are only at the very brink of many of these discoveries, and with plenty of open questions remaining, we should not expect to find one retailing at our local computer store any time soon.

**References**

1. Preskill, John. “Lecture notes for Phys 219: Quantum Computation.” (2004). Found online at http://www.theory.caltech.edu/~preskill/ph229/#lecture

2. Das Sarma, Sankar, *et al*. “Non-Abelian Anyons and Topological Quantum Computation.” arXiv:0707.1889v1 [cond-mat.str-el]. (2007).

3. Kitaev, Alexei. “Fault-tolerant quantum computation by anyons.” Ann. Phys. 303 (2003): 2-30.

4. Camino, Fernando, Wei Zhou, and Vladimir Goldman. “Aharonov-Bohm Superperiod in a Laughlin Quasiparticle Interferometer.” Phys. Rev. Lett. 95 (2005): 246802.

5. Freedman, Michael, Chetan Nayak, and Kevin Walker. “Towards universal topological quantum computation in the n=5/2 fractional quantum Hall state.” Phys. Rev. B 73 (2006): 245307.

6. Bonderson, Parsa, Alexei Kitaev, and Kirill Shtengel. “Detecting Non-Abelian Statistics in the n=5/2 Fractional Quantum Hall State.” Phys. Rev. Lett. 96 (2006): 016803.

7. Novoselov, Konstantin S., et al. “Room-Temperature Quantum Hall Effect in Graphene.” Science 315 (2007): 1379.

This article is published under a Creative Commons license.
