Since time immemorial (on the digital timeline, that is, which roughly takes us back to the 1970s), our hunger for computing power has been insatiable. Every 24 months, we double the number of transistors we can elegantly cram into a relatively infinitesimal space on the order of micrometers.
The day George Boole gave us the simple yet beautiful idea of two-valued logic, the binary system of 0 and 1 that lets us control, manipulate and, in effect, enslave those tiny atomic particles we call electrons, he radically transformed the rate at which humanity was going to progress. In the blink of an eye, we had the first transistors, which evolved into logic gates, then integrated circuits and finally the microprocessor.
It’s funny how something so pleasingly ingenious can be this straightforward. Just to illustrate the true elegance of the Zero: consider a string of infinite length consisting of a non-repeating sequence of only 0’s and 1’s. Converting this string to ASCII text gives us some really magnificent results. Somewhere in that infinite string of digits is the name of every person you will ever love, the date, time and manner of your death, and the answers to all the great questions of the universe. The results are even more dramatic when that sequence is converted into a bitmap. Somewhere in that infinite string of digits is a pixel-perfect representation of the first thing you saw on earth, the last thing you will ever see before your life leaves you, and all the moments, momentous and mundane, that will occur between those two points. All information that has ever existed or will ever exist; the DNA of every being in this universe. Everything. All of it given by two symbols: logic 0 and logic 1, low and high, Zero and One.
How long can we continue to tread along this path though? How many years do we have before we hit a solid dead end at the atomic scale, where we can no longer jam any more transistors? Should we start our quest for a new road or should we build one ourselves?
This edition, BufferedReader explores the inherently strange and amazing world of quantum computing.
Quantum. The word carries an aura of strangeness and mystery, with an enigmatic feel attached to it. At first glance it is, it's safe to say, incomprehensible. The concepts of superposition, tunnelling and entanglement are inconsistent with our everyday world and may seem peculiar to our common sense. But once we break them down and observe their vast scope of applications, the field really doesn't seem so complicated.
Before we delve into the depths of how the quantum world operates, it would be prudent to first have a look at the need for such a complicated mechanism.
The tech world has always harped on its yearning for faster computers, machines capable of hosting intelligence as smart as Iron Man's Jarvis. Quantum Computing promises to be a boon for them.
We continue to crave higher computing speed. As a result, the circuits in our microprocessors keep shrinking, and by the year 2030 we may find them measurable on an atomic scale. This gives us a clear direction: take the next logical leap to Quantum Computing, which would harness the power of atoms and molecules to perform memory and processing tasks.
The immense amount of processing power generated by the computer manufacturers has not yet been able to quench our thirst for speed and computing capacity. In 1947, an American computer engineer made an errant prediction that just six electronic digital computers would assuage the computing requirements of the United States. Of course, then he didn't count on the large amounts of data generated by scientific research, the proliferation of personal computers or the emergence of the Internet. Over the passage of time, these technological advancements have fuelled our need for more, more and more computing power.
The revolutionary concept of Quantum Computing was first theorized by Paul Benioff, a physicist at the Argonne National Laboratory, who is credited with first applying quantum theory to computers in 1981. Benioff theorized about creating a quantum Turing machine.
The Classical Turing machine, developed by Alan Turing in the 1930s, is a theoretical device that consists of a tape of unlimited length divided into little squares. Each square can either hold a symbol (1 or 0) or be left blank. A read-write head scans these symbols and blanks, which gives the machine its instructions for performing a certain program. The Turing machine is the conceptual mainstay of modern digital electronics, and it laid the foundation of the classical computer.
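The read-scan-move cycle described above is simple enough to sketch in a few lines of Python. The rule table, the state names and the bit-flipping example below are our own illustrative inventions, not Turing's original notation:

```python
def run_turing(tape, rules, state="start", head=0, max_steps=100):
    """Minimal classical Turing machine.
    rules maps (state, symbol) -> (new_symbol, move, new_state)."""
    cells = dict(enumerate(tape))          # unlimited tape; blank = " "
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, " ")
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol           # write
        head += 1 if move == "R" else -1   # move the head
    return "".join(cells[i] for i in sorted(cells))

# Example program: walk right, flipping every bit, halt on a blank square
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}
print(run_turing("0110", flip))  # prints "1001 " (flipped bits, trailing blank)
```

Every step the machine reads one square, writes one symbol, moves one square and changes state; that is the entire repertoire, yet it suffices to express any classical computation.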
Because Quantum Computers have the potential to perform exhaustive calculations significantly faster than any silicon-based computer, technocrats have already built basic quantum computers that can perform certain calculations. A practical Quantum Computer may not be too far away.
Let’s have a look at the building blocks of quantum world.
Defining Quantum: Qubits
The very groundwork of quantum mechanics is based on the principle of superposition. The renowned thought experiment of Schrödinger's Cat is an excellent way to understand it. Let a cat be closed in a box together with a device that may release toxic fumes at any instant. Since it is quite uncertain when, or whether, the device has been activated, it is equally uncertain whether the cat is alive. This led Schrödinger to state that, in superposition, the cat is both alive AND dead, and that it is impossible to know its exact state unless the box is opened.
This principle, if extended to computing, gives us the nascent concept of qubits. Like bits in the classical world, qubits are the simplest possible units of information in the quantum world. They are oracle-like objects that, when asked a question (i.e., when measured), can respond in one of only two ways. Measuring a bit, either classical or quantum, will result in one of two possible outcomes. At first it may sound as if there is hardly any difference between them. But this is where the quantum nature comes into play. The difference lies not in the possible answers, but in the possible questions. For normal bits, only a single measurement is permitted, meaning that only a single question can be asked: is this bit a zero or a one? In contrast, a qubit is a system which can be asked many, many different questions, but to each question, only one of two answers can be given. This bizarre behaviour is the very essence of quantum mechanics, and it drives the swift processing of Quantum Computers.
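This "many questions, two answers" behaviour can be imitated with ordinary probability theory. The toy model below represents a qubit as a pair of real amplitudes and lets us "ask" it a question by choosing a measurement angle; the function name and the angle parameter are our own illustrative choices, not a real quantum-computing API:

```python
import math
import random

def measure(qubit, theta=0.0, rng=random):
    """'Ask' the qubit (a, b), amplitudes for |0> and |1>, a question:
    measure it in the basis rotated by angle theta from the standard one.
    Returns one of two answers (0 or 1) plus the post-measurement state."""
    a, b = qubit
    # Amplitude along the outcome-0 basis vector (cos t)|0> + (sin t)|1>
    amp0 = math.cos(theta) * a + math.sin(theta) * b
    if rng.random() < amp0 ** 2:                       # Born rule
        return 0, (math.cos(theta), math.sin(theta))
    return 1, (-math.sin(theta), math.cos(theta))

# The same qubit answers different questions differently:
theta = math.pi / 4
plus = (math.cos(theta), math.sin(theta))   # an equal superposition of 0 and 1
print(measure(plus, theta)[0])              # the matching question: always 0
print(measure(plus, 0.0)[0])                # the standard question: 0 or 1 at random
```

Ask the superposed qubit the question aligned with its own state and the answer is certain; ask it the standard "zero or one?" question and the answer is a coin flip. A classical bit only ever admits the second question.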
In a Quantum Turing machine, the difference is that the tape exists in a quantum state, as does the read-write head. This means that the symbols on the tape can be either 0 or 1 or a superposition of 0 and 1; or we may say the symbols are both 0 and 1 (and all points in between) at the same time. While the Classical Turing machine can only perform one calculation at a time, the Quantum Turing machine can perform many calculations at once.
Today's computers work by manipulating bits that exist in one of two states: a 0 or a 1. Quantum computers aren't limited to just two states; they encode information as qubits, which can exist in superposition. Qubits can be realized by atoms, ions, photons or electrons, together with their respective control devices, working in concert to act as computer memory and processor. Because a quantum computer can hold these multiple states simultaneously, it has the potential to be, for certain problems, millions of times more powerful than today's most powerful supercomputers. Now isn't that amazing?
This superposition of qubits is what imparts Quantum Computers their inherent parallelism. According to physicist David Deutsch, this parallelism allows a quantum computer to work on a vast number of computations at once, while a desktop PC works on one. A 30-qubit Quantum Computer would match the processing power of a conventional computer running at 10 teraflops (trillions of floating-point operations per second), whereas today's typical desktop computers run at speeds measured in mere gigaflops.
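One way to get a feel for why 30 qubits is such a big number: merely describing a register of n qubits on a classical machine takes 2^n complex amplitudes, so the bookkeeping doubles with every qubit added. A quick back-of-the-envelope sketch (assuming 16 bytes per complex amplitude, a common double-precision representation):

```python
# Classical cost of describing an n-qubit register: 2**n complex amplitudes.
for n in (1, 10, 20, 30):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30        # 16 bytes per complex amplitude
    print(f"{n:2d} qubits -> {amplitudes:>13,} amplitudes (~{gib:.6f} GiB)")
```

At 30 qubits the description alone already needs about 16 GiB of memory, which hints at why even modest quantum registers strain the largest classical machines that try to simulate them.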
Having multiple states is fine, but how do we extract the exact state of a qubit? How do we measure and process the information? With classical bits we can simply read a bit's data without disturbing it. The same cannot be done with qubits. According to a law of Quantum Mechanics, 'Measuring a qubit changes its value to match the result of the measurement.' This means that if we attempt to measure a qubit, we may not get the value it held, but the value it takes on because of the disturbance caused by the measurement itself. So the next time we measure the same qubit, we may get a different, random reading. This ties in with another law of Quantum Mechanics, which states, 'Qubit measurement gives random results.'
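Both 'laws' can be caricatured in a few lines: the first measurement of a superposed state comes out at random, but it forces the state to collapse, so every later measurement merely echoes the first. This is a classical simulation for intuition only; the function name and the (0.6, 0.8) state are made-up illustrations:

```python
import random

def measure_z(state, rng=random):
    """Measure amplitudes (a, b) in the standard 0/1 basis.
    Collapses the state to whichever outcome was observed."""
    a, b = state
    outcome = 0 if rng.random() < a * a else 1      # random result (Born rule)
    collapsed = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
    return outcome, collapsed

rng = random.Random(0)
state = (0.6, 0.8)          # 36% chance of reading 0, 64% chance of reading 1
first, state = measure_z(state, rng)
# Re-measuring the collapsed state always repeats the first answer:
repeats = [measure_z(state, rng)[0] for _ in range(100)]
assert all(r == first for r in repeats)
```

The original 36/64 superposition is gone after the first reading; the information it carried has been overwritten by the measurement, exactly the disturbance described above.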
This raises a problem with the idea of Quantum Computing. If we try to look at the subatomic particles, we could bump them, and thereby change their value. If you look at a qubit in superposition to determine its value, the qubit will assume the value of either 0 or 1, but not both (effectively turning your spiffy quantum computer into a mundane digital computer). So how can we possibly let the quantum nature do its work and make use of it at the same time? To build a practical quantum computer, ways of making measurements indirectly, so as to preserve the system's integrity, must be devised. Entanglement provides a potential answer.
In quantum physics, applying an outside force to two atoms can cause them to become entangled, so that the second atom takes on properties of the first. Left alone, an atom will spin in all directions; the instant it is disturbed, it chooses one spin, or one value, and at the same moment the second, entangled atom will choose the opposite spin, or value. This allows us to know the value of a qubit without actually looking at it. So we have an avant-garde idea, Quantum Computing, giving us the strength to take on computing tasks we couldn't attempt earlier due to memory and speed limits.
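The anti-correlated behaviour described above is easy to mimic: each outcome is random on its own, yet the pair is always opposite, so reading one member immediately reveals the other. A toy model (the function name is our own invention, and real entanglement is far richer than this classical sketch):

```python
import random

def measure_entangled_pair(rng=random):
    """Toy model of a maximally entangled, singlet-like pair:
    each individual result is random, but the two are always opposite."""
    first = rng.choice((0, 1))   # random on its own...
    return first, 1 - first      # ...but perfectly anti-correlated

a, b = measure_entangled_pair()
assert a != b   # measuring one member tells us the other without touching it
```

What this toy necessarily misses is that genuinely entangled particles stay correlated no matter which question (measurement basis) is asked of each, which is what makes entanglement useful for indirect measurement rather than mere shared randomness.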
Computers built on the principles of quantum physics—as opposed to ‘classical’ physics—promise a revolution on the order of the invention of the computer or the television. D-Wave, a small Canadian company backed by Jeff Bezos, NASA and the CIA among others, is the first firm to sell a so-called quantum computer, at roughly $10 million a unit. The vast increase in power could revolutionize fields as varied as medicine, space exploration and artificial intelligence. The uses of quantum computing are vast and include the following:
Lockheed Martin plans to use its D-Wave to test jet software that is currently too complex for classical computers. Quantum computers would permit us to carry out simulations far too complex for normal machines.