Physicists first began discussing the possibility of computation based on quantum mechanics in the 1980s, but it was not until the 1990s that the first attempts at implementation appeared; quantum computing is closer today, but its commercial arrival is still difficult to date.
The mission of a quantum computer, like that of a conventional computer or a supercomputer, is to perform calculations, but it carries them out differently: it works at the atomic level and therefore follows the rules of quantum physics (the branch of physics that studies microscopic objects such as atoms).
Quantum physics has unique properties that are not observable at the macroscopic level, and quantum computers, of which only prototypes exist so far, exploit these properties to process complex data faster and more securely.
Quantum computers operate on individual atoms, which encode "qubits" rather than bits (as in traditional machines).
A bit (short for binary digit) is the smallest unit of information used in a computer or digital device and can take two values, 0 and 1: everything "underneath" a classical computer is written in terms of 0s and 1s, Miguel Angel Martin-Delgado, professor of Theoretical Physics at the Complutense University of Madrid (UCM), explained to Efe.
However, this changes in quantum computing: a qubit, unlike a bit, can hold both values simultaneously, enabling faster processing.
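The contrast between a bit and a qubit can be sketched in a few lines of Python. This is a toy model, not a real quantum simulator: a qubit's state is represented here simply as a pair of amplitudes for 0 and 1, and the function names are invented for the illustration.

```python
import math

# A classical bit holds exactly one value at a time: 0 or 1.
bit = 0

# A qubit's state is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1. Measuring it yields 0 with
# probability |a|^2 and 1 with probability |b|^2.
def measurement_probs(a, b):
    return abs(a) ** 2, abs(b) ** 2

# An equal superposition: both values are "present" until measurement.
amp = 1 / math.sqrt(2)
p0, p1 = measurement_probs(amp, amp)
print(round(p0, 3), round(p1, 3))  # both outcomes are equally likely
```

A definite 0, by contrast, corresponds to amplitudes (1, 0): measurement then returns 0 with certainty, exactly like a classical bit.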
Another feature of quantum computing is that it can work in parallel: a normal PC works sequentially, first solving one problem, then another, and so on.
Quantum computers, by contrast, adds Martin-Delgado, can tackle an exponential number of tasks simultaneously.
For example, to "break" an encrypted key, a classical computer would have to sift through the possible combinations of numbers one by one until it hit the password; a quantum computer would not: it could work with thousands of them at a time in a single second.
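The classical side of that comparison, sifting candidates one by one, looks like this in Python. The 4-digit PIN and the checking function are hypothetical, chosen only to make the sequential search concrete; the quantum side has no such line-by-line analogue.

```python
from itertools import product

# Classical brute force: try every 4-digit PIN, one at a time,
# until the check succeeds. The cost grows with the number of
# combinations (here 10^4; for real keys, astronomically more).
def brute_force(check):
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if check(guess):
            return guess
    return None

secret = "7294"  # hypothetical PIN for the demo
found = brute_force(lambda g: g == secret)
print(found)  # → 7294
```

Each guess is a separate step; doubling the key length squares the work, which is why long keys defeat classical brute force but remain vulnerable to quantum algorithms of the kind described next.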
And what is it that makes this computing power possible? Shor's algorithm.
This algorithm, devised by the American mathematician Peter Shor, would allow, for example, a third party who intercepts an encrypted message to decipher it exponentially faster than with a classical computer, says Martin-Delgado.
Peter Shor published the paper describing this algorithm in 1994, which was "the starting gun for the whole world to begin taking an interest in quantum computing."
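The core of Shor's algorithm is reducing the factoring of a number N (the hard problem behind RSA encryption) to finding the period r of f(x) = a^x mod N. The sketch below performs that period search slowly and classically, just to show the reduction; the quantum computer's exponential speedup comes precisely from finding r fast, which this toy code does not attempt.

```python
import math

# Classical period finding: the smallest r > 0 with a^r ≡ 1 (mod n).
# This loop is exponentially slow in the size of n; Shor's quantum
# routine finds r efficiently.
def find_period(a, n):
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

# Shor's classical post-processing: with an even period r,
# gcd(a^(r/2) ± 1, n) often reveals a nontrivial factor of n.
def factor_from_period(a, n):
    r = find_period(a, n)
    if r % 2:
        return None  # odd period: pick another a and retry
    y = pow(a, r // 2, n)
    p = math.gcd(y - 1, n)
    if 1 < p < n:
        return p, n // p
    return None

print(factor_from_period(2, 15))  # → (3, 5)
```

For N = 15 and a = 2 the period is 4, so 2^2 = 4 and gcd(3, 15), gcd(5, 15) yield the factors 3 and 5; breaking a real RSA key would require the same steps on a number hundreds of digits long.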
A year later came the first proposal for how to build a quantum computer, by the Spaniard Juan Ignacio Cirac and the Austrian Peter Zoller (known as the Cirac-Zoller proposal).
But the quantum computer will serve not only to "unmask" encrypted messages: it will also be used for simulation, for sequencing genomes more quickly, and for quantum chemistry calculations, which could help design new drugs or improve search engines.
Hence companies like Google and institutions such as NASA have bet on it.
"The quantum computer will be predominant in the future of computing: it will be king," emphasizes Martin-Delgado, who points to 2020 as a crucial horizon for beginning to build them (depending on how the experiments develop).
However, to reach that point a number of issues must first be resolved, among them errors (classical computers are built with mechanisms for correcting errors).
This is precisely what the team of Martin-Delgado and Rainer Blatt (University of Innsbruck) has just achieved.
"We managed to perform complete quantum error correction in a small module where quantum operations were carried out," explains the researcher, who has just published these results in Science and presented them at several conferences in the United States.
Classical computers suffer only one kind of error: a mistake occurs when you want to write a 0 and a 1 appears, or vice versa. A quantum computer, being more complex, has three types of errors. "We have developed a method for correcting all the errors that may appear in a quantum computer," says Markus Muller, a member of the team at the Complutense.
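The single classical error type, a flipped bit, and the simplest idea for correcting it can be shown in a few lines. This is the classical three-copy repetition code with majority voting, a toy stand-in for the quantum codes the article describes; real quantum error correction must also handle phase flips and combined errors, which have no classical analogue.

```python
from collections import Counter

# Encode one logical bit as three physical copies.
def encode(bit):
    return [bit, bit, bit]

# The only classical error: a single bit flips from 0 to 1 or vice versa.
def flip(codeword, i):
    codeword = list(codeword)
    codeword[i] ^= 1
    return codeword

# Correction by majority vote: one flipped copy is outvoted by the
# two intact ones, recovering the original logical bit.
def correct(codeword):
    return Counter(codeword).most_common(1)[0][0]

noisy = flip(encode(1), 0)  # an error hits the first copy
print(noisy, "->", correct(noisy))  # → [0, 1, 1] -> 1
```

A quantum version of this scheme is harder because qubits cannot simply be copied and read out, which is why correcting all three quantum error types, as the Complutense-Innsbruck team reports, is a significant step.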