Quantum computers

What are quantum computers?

What is often recognised as the first computer is the Turing machine, a model described in the 1930s by Alan Turing, an English mathematician and computer scientist. He set out the basic format we still rely on today: a system that reads its input as ‘bits’, each one either off (0) or on (1). The wonderful thing about quantum computers, though, is that instead of each bit being either 0 or 1, the particles inside a quantum computer, known as ‘qubits’, can be both 0 and 1 at the same time. The magic happens when you have several qubits working together.
Consider this: a classical computer with two bits can, at any one time, hold just one of these combinations: 00, 01, 10, 11. Two qubits, by contrast, can hold all four of these possibilities at the same time. Physicist David Deutsch calls this ‘quantum parallelism’. In this way, quantum computers could be exponentially more powerful than classical computers with the same number of bits or qubits: a three-qubit machine can hold eight possible states at once, a four-qubit machine sixteen, and in general n qubits can hold 2^n states. The fundamental point is that your computer at the moment can only do one calculation at a time, even though it does so very fast. The phenomenal thing about quantum computers, however, is that they could in effect do millions at once.
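To make that 2^n growth concrete, here is a minimal sketch in ordinary Python (an illustration only, not real quantum hardware): it describes an n-qubit register as a list of 2^n weights, one per possible bit pattern, and shows how quickly that count doubles as qubits are added.

```python
# A toy illustration: an n-qubit register is described by 2**n numbers,
# one weight per possible bit pattern, so the number of patterns held
# "at once" doubles with every extra qubit.
import math

def equal_superposition(n_qubits):
    """Weights for n qubits in an equal superposition: every one of the
    2**n bit patterns carries the same weight."""
    dim = 2 ** n_qubits                    # 2 qubits -> 4, 3 -> 8, 4 -> 16 ...
    return [1 / math.sqrt(dim)] * dim      # equal weight on every pattern

for n in (2, 3, 4):
    print(n, "qubits ->", len(equal_superposition(n)), "simultaneous states")
# Prints: 2 qubits -> 4, 3 qubits -> 8, 4 qubits -> 16
```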


What are the implications?

The idea of a new wave of computers that so outpaces the old ones is both exciting and terrifying; though it would be fun to have civilian computers that are, from our point of view, almost infinitely powerful, the fact is that many of our society’s most important institutions depend on complex computer codes (algorithms) that could be broken in seconds by a quantum computer. Banking and defence, for example, rely on their algorithms being especially difficult to crack.


How far off are we?

Quantum computers are still very much at the experimental, if not theoretical, stage. However, there have been several developments in the field suggesting that we may be heading in the right direction. Over a decade ago, in 2001, IBM built a very basic quantum device that was able to find the prime factors of a number, a process which plays a key role in cryptography. The method it used, ‘Shor’s Algorithm’, named after the mathematician Peter Shor, was implemented on a silicon chip by the University of Bristol in 2009. In February this year, IBM released an enigmatic statement saying it was ‘on the cusp of building systems that will take computing to a whole new level.’ The ways of constructing quantum computers have been wildly diverse: diamonds, drops of liquid and molecular magnets have all been used as platforms to build on, and while this wide range of possibilities may make the idea seem imminently within our reach, the reality is that it goes to show quantum computing is still very much in its infancy.
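For a sense of the problem Shor’s algorithm attacks, here is a minimal classical sketch in Python (a textbook brute-force method, not anything used by IBM or Bristol): trial division finds a number’s prime factors, but its cost grows steeply with the size of the number, which is exactly why factoring large numbers underpins so much of modern cryptography.

```python
# A classical, brute-force way to find prime factors: trial division.
# For the small numbers used in early quantum demonstrations this is instant,
# but for the hundreds-of-digits numbers used in cryptography it becomes
# hopelessly slow -- the gap Shor's algorithm is designed to close.
def prime_factors(n):
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:       # divide out each prime factor as it is found
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                   # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(15))        # [3, 5] -- 15 was the number factored in IBM's 2001 demonstration
```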