Moore's law is in danger - but not for the reasons you think
by Rahul Rao, Yasir Aheer and Varun Rao
Summary:
Growth in computing power is slowing as we approach the physical limits of semiconductor systems.
Quantum computing promises to blow current computing standards out of the water and make light work of the hardest computing tasks.
The specialised hardware of quantum computers makes them difficult to build and maintain, but small-scale quantum computers are already in use today.
Governments around the world are locked in a quantum arms race, as the decryption power of quantum computers makes them a national security asset.
Computing power has long been subject to Moore's Law, named after Gordon Moore, co-founder and former CEO of Intel, who described his famous law in a 1965 article in Electronics magazine: the number of transistors on a microchip doubles roughly every two years, and the cost of computing halves in step.
Originally the prediction was a somewhat off-the-cuff remark, but it proved surprisingly resilient to the test of time: as late as 2016, Moore's law still held true.
Advancements, however, become more difficult to make as the low-hanging fruit is picked, and there are signs now that the end of the era of Moore's law is nigh. Jensen Huang, the CEO of Nvidia, pointed out at CES 2019 that growth in computing power is slowing, from 10x every five years to a few percent a year. Charles Leiserson, a computer scientist at MIT, declared the law officially dead in 2020. Robert Colwell at DARPA is of a similar opinion: he thinks the next improvements will be incremental at best. At some point we reach the limit of semiconductor physics, and that point has either recently passed or lies shortly ahead.
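To put Huang's numbers in perspective, here is a rough back-of-the-envelope comparison; modelling "a few percent a year" as 3% is purely an illustrative assumption.

```python
# Rough comparison of the two growth regimes Huang described.
# Assumption: "a few percent a year" is modelled as 3% - illustrative only.
years = 10
old_regime = 10 ** (years / 5)   # 10x every five years
new_regime = 1.03 ** years       # 3% compound annual growth
print(f"Old regime over {years} years: {old_regime:.0f}x")   # ~100x
print(f"New regime over {years} years: {new_regime:.2f}x")   # ~1.34x
```

A decade of the old regime buys a hundred-fold improvement; a decade of the new one, barely a third more.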
Ironically, we are also entering the era of big data. Computing power is sorely needed to crunch the vast amounts of data generated daily by digital devices. The growth of data far outpaces the development of smaller and faster microchips. The inability to keep up has driven many data owners to cloud providers such as Microsoft and Amazon, who have vast arrays of servers in several locations around the globe and can provide computing power on demand.
There is, however, another interesting piece of technology that could blow Moore's Law out of the water and make light work of our most difficult computing tasks. Welcome to quantum computing.
Quantum computing
The building block of quantum computing is the quantum bit (qubit), the analogue of the bit in digital computing. Qubits are typically built from quantum-scale particles such as electrons, produced and manipulated in complex ways. Google uses superconducting circuits chilled to near absolute zero (-273.15 deg C). Other companies use qubits made from ions trapped in a vacuum chamber. Yet another approach keeps photons in superposition using mazes of mirrors and fibre optics. Unlike regular silicon chips, qubits are an involved, complex, and expensive engineering task to manufacture. Why go to the trouble?
The key difference between qubits and bits lies in how much information each can hold. Traditional bits are binary: each holds a single piece of information, either 1 or 0. The possible measured states of a qubit are the same, 1 or 0, but a qubit can also represent a "superposition" of both states at the same time. Until it is measured, it reads neither 1 nor 0; instead, it carries probabilities of being either. The ability to be in both states at once allows a qubit to store, in a sense, twice as much information as a bit. Intuitively that makes little sense, but common sense is not common in the strange world of quantum physics.
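A minimal sketch of this idea, modelling a single qubit in NumPy as a pair of complex amplitudes (the equal-superposition starting state is an arbitrary example):

```python
import numpy as np

# A qubit is described by two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
state = np.array([1, 1], dtype=complex)
state /= np.linalg.norm(state)              # equal superposition of 0 and 1

p0, p1 = np.abs(state) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each

# "Measuring" the qubit collapses it to a single classical bit.
print("Measured:", np.random.choice([0, 1], p=[p0, p1]))
```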
A two-fold increase doesn't look like much, but combinations of qubits can store startlingly more information than their digital counterparts: describing the state of 10 qubits takes over 100 times as many values as 10 bits. The power of the exponential curve is demonstrated in the rice and chessboard story.
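The counting behind that claim is easy to verify in code: describing the general state of n qubits takes 2^n amplitudes, while n classical bits hold just n values (this is a sketch of the counting argument, not a quantum simulation):

```python
# Values needed to describe n classical bits vs n qubits.
for n in [10, 20, 30, 50]:
    print(f"{n} bits: {n} values | {n} qubits: {2**n:,} amplitudes")
# 10 qubits -> 1,024 amplitudes, over 100x the 10 values of 10 bits,
# and the gap doubles with every extra qubit - the chessboard effect.
```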
Another, less obvious and stranger, phenomenon is entanglement, described by Albert Einstein as "spooky action at a distance". Entanglement is a tight correlation between two quantum particles: measuring one instantly determines the state of the other, irrespective of the physical distance separating the two. Entanglement allows the joint probabilities of collections of qubits to be manipulated by a single operation, akin to massive parallel computing on digital computers. When the calculation is complete and the qubits are measured, each qubit "falls" into one state or the other and a result is obtained.
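A small NumPy sketch of that behaviour: simulating repeated measurements of a two-qubit Bell state (the textbook example of entanglement) gives outcomes that are individually random yet perfectly correlated.

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2): four amplitudes, one per basis
# state 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2                    # [0.5, 0, 0, 0.5]

# Sample 5 joint measurements; each collapses the pair to 00 or 11.
print(np.random.choice(["00", "01", "10", "11"], size=5, p=probs))
# Only "00" and "11" ever appear: the two qubits always agree.
```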
This parallelism enables quantum computers to cycle through possible scenarios far more quickly than digital computers. Problems where a traditional computer must iterate towards a result will be solved far faster once quantum computing becomes mainstream.
Where will it be used?
Tasks involving massive amounts of data and innumerable interactions between different parameters stand to benefit from quantum computing, much like they did from machine learning and artificial intelligence.
The most obvious use for quantum computing is in the simulation of quantum physics. Richard Feynman himself suggested, as far back as 1981, that quantum physics would need to be simulated by machines that themselves operate by quantum rules. Studies are already underway. In some ways this is strangely human: machines studying their own inner workings parallels our own study of biology.
Biology and pharmaceuticals are other fields likely to see changes with the advent of quantum computing. Modelling the behaviour of drugs and antibodies at the molecular level has proven difficult for traditional platforms; quantum computing offers hope that the vast number of calculations required can be performed in a feasible time-frame.
Highly iterative processes such as route-planning are already using quantum computing - Volkswagen has developed a program to predict traffic volumes and demand for transport, with the aim of optimising traffic flow and reducing congestion. Airbus has launched the Airbus Quantum Computing Challenge to, among other things, optimise fuel usage during climb and descent.
Weather forecasting is another task that requires mammoth computing power with the vast amounts of data and interactions between different measured variables. IBM is currently partnering with several meteorological research centres to crunch the steady stream of data flowing from instruments around the world.
Shor's algorithm and encryption
In 1994, American mathematician Peter Shor discovered a quantum algorithm to factorise large numbers. Shor's finding is the reason the US, China, and several other countries view quantum computing as a matter of national security.
Why? Briefly, factorisation underpins much of current public-key cryptography, and the ability to quickly factorise a large number is the ability to break current encryption. RSA, the most widely used such algorithm, operates on a simple principle: data is encrypted using a public key and decrypted with a private key. The private key is built from two large prime numbers whose product forms the public key. Anyone able to factorise the public key can read any data encrypted with it. A 2019 analysis suggested this would take only 8 hours with a 20-million-qubit quantum computer.
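At its core, Shor's algorithm reduces factoring to finding the period of modular exponentiation; the quantum hardware is only needed to find that period quickly. The toy Python sketch below performs the same reduction classically, brute-forcing the period search, which is exactly the step a quantum computer speeds up exponentially.

```python
from math import gcd

def factor_via_period(N: int, a: int):
    """Factor N using the order of a mod N - the reduction Shor's
    algorithm exploits. The loop below is brute force; a quantum
    computer finds r exponentially faster."""
    r = 1
    while pow(a, r, N) != 1:     # smallest r with a^r = 1 (mod N)
        r += 1
    if r % 2:                    # the method needs an even period
        return None
    x = pow(a, r // 2, N)
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    return (p, q) if 1 < p < N else None

print(factor_via_period(15, 7))  # (3, 5)
```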
Given that quantum computing is unlikely to become mainstream in the next few years, how dangerous is quantum hacking really? There are two points of view, and the kind of data to be protected determines the level of risk.
In the first category is information, such as personal finance and health records, that goes out of date quickly; such data created today will be of little use to hackers who have quantum computing power at their disposal in 20 years. Post-quantum encryption algorithms, immune to the brute-force methods of quantum computing, are already being developed; data with a short lifetime will be protected by quantum-proof encryption years before quantum decryption becomes viable outside controlled laboratories.
On the other hand, data on diplomatic and military decisions remains valuable for decades after it is created. Several categories of documents are currently classified for 30 to 50 years from the date of their creation. Such documents created today could be intercepted in their currently-encrypted form and stored until quantum computers are able to decrypt them, a tactic known as "harvest now, decrypt later". By then they would still be classified and possibly still of value, and leaks could prove embarrassing to nations and governments.
But wait...
Quantum computing is not a panacea. It will likely complement digital computing rather than replace it altogether, for several reasons.
Quantum computers are not easy to produce. Beyond the difficulty of producing qubits, there is the difficulty of generating the entanglement required for a quantum program to run. The current state of the art is around 70 qubits, while the 2019 analysis cited above estimated that 20 million qubits would be needed to break RSA encryption in 8 hours. The gap between those two numbers is enormous.
Once produced, qubits are extraordinarily unstable. Any interaction with the external environment - think heat or stray electric fields - degrades them. Unstable qubits lead to randomly corrupted data and unreliable results. To compensate, many more qubits than theoretically required must be generated for error correction, further delaying the arrival of practical quantum computing.
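A classical analogy shows how redundancy buys back reliability: the sketch below protects a bit with a three-way repetition code and majority voting. Real quantum error correction is far more involved, and the 10% error rate is an assumption chosen for illustration.

```python
import random

def noisy(bit, p=0.1):
    """Flip a bit with probability p - a stand-in for qubit instability."""
    return bit ^ (random.random() < p)

trials = 100_000
raw_errors = sum(noisy(0) for _ in range(trials))
# Store each bit three times and take a majority vote.
voted_errors = sum((noisy(0) + noisy(0) + noisy(0)) >= 2
                   for _ in range(trials))
print(f"raw error rate:   {raw_errors / trials:.3f}")    # ~0.100
print(f"voted error rate: {voted_errors / trials:.3f}")  # ~0.028
```

Tripling the storage cuts the error rate nearly fourfold; quantum error correction makes the same trade, which is why far more physical qubits are needed than an algorithm nominally requires.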
Even once produced and stabilised, quantum computers are well suited only to certain select tasks, some of which have been covered earlier in this article. Tasks that do not involve massive parallelism will see no performance boost. Reading email, opening spreadsheets, or playing Minesweeper are tasks a digital computer performs far more efficiently. Quantum computers are unlikely to replace your laptop.
Finally, when the state of a qubit is measured, typically at the end of a program, it collapses into one state or the other. To run the program again, the qubits must be prepared afresh, with all the overhead associated with their production. This means that most tasks you currently perform on your laptop would, in fact, be slower on a quantum computer.
These difficult but not insurmountable challenges mean that quantum computing will only slowly creep into our lives. As with most other technologies - see autonomous vehicles and AI - the initial impact will be overstated in popular media and quantum computing will fail to live up to the hype. "What a damp squib", we'll complain in 2030. In 50 years it will be invisibly interwoven into our lives.
Disclaimer: This article is based on our personal opinion and does not reflect or represent any organisation that we might be associated with.