How quantum computing works, explained in a simple way!

Quantum computing is bringing a new paradigm that will change almost everything we know and believe about computers. Thanks to superposition, a peculiar physical behavior, this new computing technology can tackle problems that not even all of today’s conventional computing power could solve.

For comparison, remember that current computers work in bits. Your computer only knows how to “read” information in two states: zero or one (on or off). To achieve this we use voltages: applying 3 V to a wire means 1; applying 0.5 V to the same wire means 0. Everything a computer does is transcribed into this system by transistors, tiny components that store charge and release it when necessary.
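To make that voltage rule concrete, here is a tiny sketch in Python (the article itself has no code; the 1.5 V threshold below is just an assumption for illustration) of how a wire’s voltage could be read as a bit:

```python
# Toy "reader": interpret a wire's voltage as a logical bit.
# The 1.5 V cutoff is an arbitrary value chosen for this sketch.

def voltage_to_bit(voltage: float, threshold: float = 1.5) -> int:
    return 1 if voltage >= threshold else 0

print(voltage_to_bit(3.0))   # -> 1
print(voltage_to_bit(0.5))   # -> 0
```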



Understanding transistors matters for the comparison: when a transistor holds charge we read a 1, and a 0 otherwise. Using transistors to build logic gates depends on their ability to act as fast switches. When the base-emitter junction is turned on hard enough to drive the transistor into saturation, the collector voltage with respect to the emitter drops close to zero, and this behavior is used to construct the gates of the TTL logic family. For AND logic, the transistors are placed in series and both must be conducting to drive the output high. For OR logic, the transistors are placed in parallel and the output is driven high if either of them is conducting.
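The series and parallel arrangements translate directly into those two gates. Here is a minimal sketch in Python that treats each transistor as an ideal on/off switch (the function names are mine, purely illustrative):

```python
# Each transistor is modeled as an ideal switch: True = conducting, False = cut off.

def and_gate(a: bool, b: bool) -> bool:
    # Series arrangement: the output is driven only if BOTH switches conduct.
    return a and b

def or_gate(a: bool, b: bool) -> bool:
    # Parallel arrangement: the output is driven if EITHER switch conducts.
    return a or b

for a in (False, True):
    for b in (False, True):
        print(f"a={int(a)} b={int(b)}  AND={int(and_gate(a, b))}  OR={int(or_gate(a, b))}")
```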


Simplifying a lot for the case at hand, these are the physical elements that carry out the calculations we request through programs and apps. As you can imagine, this “mechanical” system sets how fast a computer can process information, which scales with the number of bits it handles. It depends entirely on the hardware and therefore comes with a built-in technical limit.

That technical limit might seem easy to push back: just build bigger machines with more bits. But it is not like that. The limit becomes evident when we realize that all the classical computers in the world together cannot solve certain optimization problems once the amount of data grows too large. And at this moment in history, as a civilization, we generate immense amounts of data: climatic, demographic, genomic, behavioral patterns, etc. We cannot turn it all into useful models or patterns because a classical computer simply cannot assimilate it all.

What makes quantum technology special, and why it has such immense potential, is that its ‘bit’ can also work in a superposition of both states: on and off. This is possible because the process does not depend on the mechanical switching of a transistor but on the rules of quantum physics. By applying ‘quantum logic’ to the computing world, certain problems can be solved far faster, exploring many results for each variable in parallel.

Qubits

The bits of quantum computing are called ‘qubits’. Like a bit, a qubit represents a unit of information, but one governed by the rules of quantum physics. A qubit can therefore be 0 or 1, or something in between; in fact, it can be 0 and 1 at the same time. On the hardware side, the “container” role played by transistors and logic gates is replaced by more delicate processes that “isolate” the qubit, much as a bit is isolated inside a transistor.
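As a rough picture of what “between 0 and 1” means, here is a minimal sketch (Python with NumPy, my own illustration rather than anything from the original article) that describes a qubit as two complex amplitudes:

```python
import numpy as np

# A qubit is described by two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1: |alpha|^2 is the chance of reading 0,
# |beta|^2 the chance of reading 1.

ket0 = np.array([1, 0], dtype=complex)    # behaves like a classical 0
ket1 = np.array([0, 1], dtype=complex)    # behaves like a classical 1
plus = (ket0 + ket1) / np.sqrt(2)         # an equal superposition of 0 and 1

for name, state in [("|0>", ket0), ("|1>", ket1), ("|+>", plus)]:
    p0, p1 = np.abs(state) ** 2
    print(f"{name}: P(0) = {p0:.2f}, P(1) = {p1:.2f}")
```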

The ways of making a quantum computer



Quantum computers differ in how they manage to isolate and drive their qubits, but the goal is always the same as with the transistor: getting them to interact only when we want. There are several systems for achieving this.

Superconducting circuits are one example. They sit inside a dilution refrigerator, where gold-colored coaxial cables carry the input and output signals in and out. These are small circuits cooled to extremely low temperatures (close to -273 °C) so that their properties become ‘quantized’. Imagine, for example, that at those temperatures a signal equivalent to 1 V or 2 V can circulate through the circuit, but not 1.5 V. This is what lets the machine tell the 0 from the 1. It is the most successful technology so far.

Another approach uses trapped ions. Here the quantum computer uses ions (atoms that have had one or more electrons removed or added) as qubits, holds them in electromagnetic traps, and then combines them according to the calculation to be performed. The 0 and the 1 are identified with different configurations of the remaining electrons, and the operations are carried out with lasers that change those configurations.

Finally, another well-known quantum computing system uses nuclear spins. Spin is a physical property of elementary particles; for our purposes it is enough to know that the molecules are prepared in a certain state and the operations are implemented by driving them into a new state with magnetic resonance.

How a quantum computer works

So far it might seem that a quantum computer does magic on its own. It does not. It is not magic; these are physical laws, in the same way that magnets with opposite poles stick together or gravity makes things fall. With quantum computers, we have noticed new phenomena that we can take advantage of.

When atoms or molecules are not part of larger chemical structures, they follow different “rules” from the ones we see in our everyday world. These rules are dictated by quantum physics, and the one that matters most here is quantum superposition.

Quantum computing is based on a phenomenon called wave-particle duality

We are talking about a behavior observed in subatomic particles, such as electrons. A wave consists of the propagation of a disturbance of some property, transporting energy without transporting matter; sound is an easy wave to imagine. A subatomic particle, such as an electron, is smaller than an atom but has a definite mass and position. Yet a flow of electrons can behave like either one.



Therefore, strange as it may seem, particles can behave both like waves and like particles. And, according to quantum physics, when this happens a particle can enter a superposition of states, in which it behaves as if it were in both at once, or at some intermediate point between the two. While classical objects are always in one definite state or another, the state of a quantum system can be a superposition of several possible states. The usual analogy is a coin: if the two states of a coin are heads and tails, a quantum state would be an overlap of the two.
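Keeping the coin analogy, the sketch below (again Python/NumPy, an illustration of my own) “flips” a quantum coin with a Hadamard operation: one flip leaves it in an equal superposition of heads and tails, while a second flip brings it back to heads with certainty because the amplitudes interfere, which is something a classically random coin cannot do.

```python
import numpy as np

heads = np.array([1, 0], dtype=complex)   # "heads" basis state

# Hadamard matrix: the standard "quantum coin flip" operation.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

one_flip = H @ heads        # equal superposition of heads and tails
two_flips = H @ one_flip    # amplitudes interfere and recombine into heads

print(np.abs(one_flip) ** 2)    # ~[0.5, 0.5]
print(np.abs(two_flips) ** 2)   # ~[1.0, 0.0]
```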

How the same phenomenon can be perceived in two different ways

Think of the shadow cast by a cylinder: from one angle you see a circle, from another a rectangle. What the shadows tell us is that, depending on how you look at the object, it has the properties of a circle or of a rectangle. Wave-particle duality is very similar. Sometimes light behaves like a wave, for example in interference experiments, and other times it behaves like particles, for example when a laser sends one photon per pulse.

The utility of superposition

Quantum computing tries to use the superposition of states to execute more than one computation at a time. Since the qubit can be 0 and 1 at the same time, we get the “yes” and the “no” of each case in parallel, which can make computers much faster. Of course, this does not guarantee more speed for every problem, only for those that can take advantage of this parallelism.

Imagine a program that takes two numbers and one additional bit and does the following: if the additional bit is in state 0, the program adds the two numbers and gives you the result; if the bit is in state 1, it subtracts them instead. If you wanted both the sum and the difference of two numbers, you would have to run the program twice: once with the additional bit at 0 and once with it at 1. On a quantum computer, since the qubit can be in a superposition of 0 and 1, the program runs the two instructions ‘in parallel’, and a single run can give you a result that is the superposition of the sum and the difference.
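A minimal way to picture that is the toy simulation below (plain Python/NumPy, my own sketch; it is not how a real quantum framework would express this, and it glosses over the fact that real quantum operations must be reversible):

```python
import numpy as np

def add_and_subtract_in_superposition(x, y):
    # The control qubit starts in an equal superposition of 0 and 1.
    state = {0: 1 / np.sqrt(2), 1: 1 / np.sqrt(2)}
    # The branch runs on every component of the state at once:
    # control 0 -> add, control 1 -> subtract.
    result = {}
    for control, amplitude in state.items():
        value = x + y if control == 0 else x - y
        result[(control, value)] = amplitude
    return result

for (control, value), amp in add_and_subtract_in_superposition(7, 3).items():
    print(f"control={control}: value={value}, probability={abs(amp)**2:.2f}")
# A single run holds both 10 and 4, each with probability 1/2 --
# but reading the result out reveals only one of the two.
```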

However, using this is not so easy. Atoms and particles have their own rules, and if we do not stick to them, we cannot control them. For example, you cannot even look while the computer computes. As strange as it may seem, another law of the quantum world is that a superposition cannot be observed without being destroyed.
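In the same toy picture, this is what measurement looks like (again a sketch of my own; the randomness follows the Born rule, and the superposition is gone once the outcome is seen):

```python
import numpy as np

rng = np.random.default_rng()

def measure(state):
    # Outcome probabilities come from the squared amplitudes (Born rule).
    probabilities = np.abs(state) ** 2
    probabilities /= probabilities.sum()          # guard against rounding
    outcome = int(rng.choice([0, 1], p=probabilities))
    # After the measurement the state collapses to the observed value.
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0
    return outcome, collapsed

superposed = np.array([1, 1], dtype=complex) / np.sqrt(2)
outcome, after = measure(superposed)
print("outcome:", outcome)            # 0 or 1, each half the time
print("state afterwards:", after)     # the superposition is destroyed
```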

Quantum properties are very fragile and degrade over time, so many resources have to be invested in keeping quantum computers isolated from their environment. They must not only be held at a temperature close to -273 °C, but also kept in vacuum conditions so that, for example, a stray atom cannot hit them.

The current idea is not that every person on the planet will own a ‘quantum laptop’, simply because the required conditions are so restrictive, but that a limited number of quantum computers will exist in places with the right conditions of temperature, vacuum and so on. Not everyone can keep a dilution refrigerator at home to cool their qubits, so you have to think bigger. Quantum computers are being designed with the idea of solving problems that are currently too complex for classical computers.



One of the first and most promising areas of application for quantum computers will be chemistry. Even in a molecule as simple as caffeine, the number of quantum states grows so surprisingly fast that all the conventional computing memory scientists could ever build would not contain them. Other future applications could be, for example: medicines and materials (complex molecular and chemical interactions could lead to the discovery of new medicines), logistics and supply chains (calculating optimal routes across global systems), financial services (modeling financial data and investments on a global scale), artificial intelligence (machine learning when the data flow is very large) and security (breaking cryptography; Shor’s algorithm, for example, could do it).

Finally, discovering the real utility of quantum computing will take many hands and much experimentation, and its potential is still to be quantified. In other words, the exponential growth of this technology is still hard to imagine, and who knows how far it will take us? What is certain is that the limit of computing is no longer the limit, as computing is being revolutionized once again, and we are lucky to witness it.