As technology evolves, the scale of integration increases: more transistors fit in the same area, microchips become ever smaller, and smaller features mean faster chips. However, chips cannot be made infinitely small; there is a limit below which they stop working correctly. At the nanometer scale, electrons escape from the channels along which they are supposed to move. This is called tunneling. A classical particle that encounters an obstacle it cannot pass through simply bounces back. But electrons are quantum particles and also behave like waves, so there is a probability that some of them pass through a barrier if it is thin enough; in this way the signal can leak into channels where it should not go, and the chip stops working properly. Consequently, traditional digital computers will soon reach their limits, since feature sizes have already shrunk to only a few tens of nanometers. This creates the need to find new technologies, and that is where quantum computing comes in.
The idea of quantum computing emerged in 1981, when Paul Benioff presented his theory of harnessing the laws of quantum physics in a computing environment. Instead of working at the level of electrical voltages, it works at the level of quantum states. In digital computing, a bit can take only two values: 0 or 1. In quantum computing, by contrast, the laws of quantum mechanics apply and a particle can be in a coherent superposition: it can be 0, it can be 1, and it can be 0 and 1 at the same time (a combination of two orthogonal states of a subatomic particle). This makes it possible to perform several operations at once, depending on the number of qubits.
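To make the idea of superposition concrete, here is a minimal sketch (not part of the original article) that represents a single qubit as a normalized vector of two complex amplitudes using NumPy; the variable and function names are illustrative, not any particular library's API.

```python
import numpy as np

# A qubit is a normalized vector of two complex amplitudes:
# |psi> = alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)   # the classical value 0
ket1 = np.array([0, 1], dtype=complex)   # the classical value 1

# An equal superposition of 0 and 1 (the state produced by a Hadamard gate).
plus = (ket0 + ket1) / np.sqrt(2)

def measurement_probabilities(state):
    """Probability of reading 0 or 1 when the qubit is measured."""
    return np.abs(state) ** 2

print(measurement_probabilities(ket0))  # [1. 0.]   -> always reads 0
print(measurement_probabilities(plus))  # [0.5 0.5] -> reads 0 or 1 with equal chance
```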
The number of qubits indicates how many bits can be in superposition. With conventional bits, a register of three bits has eight possible values, but the register can hold only one of them at a time. With a vector of three qubits, however, the system can hold all eight values at once thanks to quantum superposition, so a three-qubit register would allow a total of eight parallel operations. As one would expect, the number of operations grows exponentially with the number of qubits. To get an idea of the breakthrough this represents, a quantum computer with 30 qubits would be equivalent to a conventional processor of 10 teraflops (trillions of floating-point operations per second), whereas current computers work on the order of gigaflops (billions of operations per second).
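The exponential growth of the state space can also be sketched directly: the following example (again illustrative, not from the article) builds a three-qubit register as a vector of 2^3 = 8 amplitudes and applies a Hadamard gate to every qubit, leaving the register in an equal superposition over all eight values.

```python
import numpy as np

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

# A register of n qubits lives in a 2**n-dimensional state vector,
# built as the tensor (Kronecker) product of the individual qubits.
n = 3
state = ket0
for _ in range(n - 1):
    state = np.kron(state, ket0)      # the state |000>, with 8 amplitudes

# Apply H to every qubit: the register now holds all 8 values at once.
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)
state = H_all @ state

print(state.shape)         # (8,)  -> 2**3 amplitudes for 3 qubits
print(np.abs(state) ** 2)  # each of the 8 basis states has probability 1/8
```

Doubling the register to six qubits would already require 64 amplitudes, which is the exponential scaling the paragraph above refers to.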
