Jerry Chow in the IBM quantum computing lab

Quantum computing was first mooted by the Nobel Prize-winning physicist Richard Feynman in 1981. IBM scientists now believe that they have solved some of the key challenges to bringing Feynman’s dream to fruition. In a blog post, Mark Ritter, senior manager at the IBM T.J. Watson Research Center, discusses the announcement in detail.

The research has been published in the April 29 issue of the journal Nature Communications (DOI: 10.1038/ncomms7979). In it, IBM details how its scientists have been able to detect and measure the two types of quantum error, bit-flip and phase-flip; this is the first time both types have been detectable simultaneously.
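
For readers who want a concrete picture of the two error types, the following minimal numpy sketch (illustrative only, and not IBM's experiment or code) represents a bit-flip as a Pauli X operation and a phase-flip as a Pauli Z operation acting on a single-qubit state vector. A phase-flip leaves |0> looking unchanged, which is part of why the two error types have to be detected separately.

```python
# Illustrative single-qubit model of the two error types (not IBM's code).
import numpy as np

# Computational basis states |0> and |1>
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Pauli operators: X models a bit-flip error, Z models a phase-flip error
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# A superposition state |+> = (|0> + |1>) / sqrt(2)
plus = (ket0 + ket1) / np.sqrt(2)

print(X @ ket0)   # bit-flip:   |0> -> |1>
print(Z @ plus)   # phase-flip: |+> -> |->  (sign of the |1> component flips)
print(Z @ ket0)   # phase-flip leaves |0> unchanged, so it cannot be caught
                  # by a simple bit-value check and needs its own detection
```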

At the same time, IBM has demonstrated a new quantum bit (qubit) circuit that it believes can be scaled up to create quantum chips capable of powering the next generation of computers. According to Jay Gambetta, a manager in the IBM Quantum Computing Group: “Our four qubit results take us past this hurdle by detecting both types of quantum errors and can be scalable to larger systems, as the qubits are arranged in a square lattice as opposed to a linear array.”

Quantum Computing can solve the Moore’s Law problem

For over 50 years, Moore’s Law, named after Gordon Moore, who formulated it, has predicted that the number of transistors in a dense integrated circuit will double roughly every two years. While the time required to double has varied, the underlying statement has held true. The problem is that we are now reaching the limits of physics as we understand it.
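
To see how quickly that doubling compounds, here is a small illustrative calculation; the 2,300-transistor baseline is roughly the count of an early-1970s microprocessor and is used here purely as an assumed starting point.

```python
# Illustrative projection of Moore's Law: one doubling roughly every two years.
baseline = 2_300  # assumed starting count, in the ballpark of an early-1970s chip

for years in range(0, 51, 10):
    doublings = years / 2
    count = baseline * 2 ** doublings
    print(f"after {years:2d} years: ~{count:,.0f} transistors")

# After 50 years this lands in the tens of billions, consistent with the
# "transistor budgets in the billions" that Moore mentions below.
```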

On April 13th 2010, Techworld published an interview between Moore and Manek Dubash. In it, Moore said: “In terms of size [of transistor] you can see that we’re approaching the size of atoms which is a fundamental barrier, but it’ll be two or three generations before we get that far – but that’s as far out as we’ve ever been able to see. We have another 10 to 20 years before we reach a fundamental limit. By then they’ll be able to make bigger chips and have transistor budgets in the billions.”

2016 will see a number of chip manufacturers releasing 14nm chip technology. A decade ago it was believed that we would struggle to make high-performing chips at that scale. Now the boundary is being set at 10nm, but eventually we will hit the atomic limit that Moore describes.

To understand the importance of quantum computing, you only have to look at the computing potential it offers. In its press release, IBM states: “If a quantum computer could be built with just 50 quantum bits (qubits), no combination of today’s TOP500 supercomputers could successfully outperform it.”
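
A rough back-of-the-envelope calculation (not taken from IBM's release) suggests why even 50 qubits are out of reach for classical simulation: the machine's full state vector requires 2^50 complex amplitudes, and simply storing them is already impractical.

```python
# Rough estimate of the memory needed to hold a 50-qubit state vector.
n_qubits = 50
amplitudes = 2 ** n_qubits        # ~1.13e15 complex amplitudes
bytes_per_amplitude = 16          # complex128: 8 bytes real + 8 bytes imaginary
memory_pib = amplitudes * bytes_per_amplitude / 1024 ** 5

print(f"{amplitudes:.2e} amplitudes, ~{memory_pib:.0f} PiB just to store the state")
# Around 16 PiB of RAM -- before any computation has even started -- which is
# the intuition behind the 50-qubit comparison in the press release.
```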

With billions being spent each year on the development of supercomputers, there is a lot to be gained by the first company to successfully deliver a quantum computer. It should come as no surprise, therefore, that IBM is not the only player in this space. Microsoft and Google are also racing to be first to market with a quantum computer.

Key use cases for Quantum Computing

Unsurprisingly, IBM is focused on where quantum computing could be deployed by customers. Arvind Krishna, senior vice president and director of IBM Research, said: “Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today…. one area we find very compelling is the potential for practical quantum systems to solve problems in physics and quantum chemistry that are unsolvable today. This could have enormous potential in materials or drug design, opening up a new realm of applications.”

Perhaps the best-known use case for quantum computing is cryptography. Despite research showing that early quantum cryptographic systems could be hacked, scientists have since produced new solutions.

Last year the Los Alamos National Laboratory (LANL) reported that it had created a new form of quantum cryptography that would be uncrackable with current technology, and potentially even with quantum computers. Combined with the news last year that the US National Security Agency (NSA) was investing millions in building a quantum computer capable of breaking the most advanced encryption, any new advances in quantum cryptography will attract a great deal of interest.

IBM has for several years been working on Fully Homomorphic Encryption (FHE). At present, with the exception of a limited set of operations, encrypted data has to be decrypted before it can be used and then re-encrypted before it is stored. With FHE, you would be able to query and work on data held in the cloud or elsewhere without ever exposing that data to prying eyes. The data would only be decrypted once the files are on a local device and you choose to decrypt them for other purposes.
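
To make the "compute without decrypting" idea concrete, here is a toy sketch of the Paillier cryptosystem. It is additively homomorphic only, so it is not FHE and certainly not IBM's construction, but it shows the core trick: a server can combine two ciphertexts it cannot read, and the result still decrypts correctly.

```python
# Toy Paillier cryptosystem (additively homomorphic only -- not FHE, not secure).
import math, random

# Tiny demo primes, far too small for real use
p, q = 293, 433
n = p * q
n_sq = n * n
lam = math.lcm(p - 1, q - 1)      # Python 3.9+
mu = pow(lam, -1, n)              # modular inverse of lambda mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(1 + n, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    x = pow(c, lam, n_sq)
    return ((x - 1) // n) * mu % n

a, b = 17, 25
c_sum = encrypt(a) * encrypt(b) % n_sq   # multiply ciphertexts...
print(decrypt(c_sum))                    # ...and the plaintexts add: prints 42
```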

The challenge for FHE is that it requires significant amounts of compute power and memory. Quantum computing offers the possibility of deploying FHE to local servers and cloud-based file systems, where it could be used in real time to protect all data.

Quantum Computing and the next generation of IBM computers

One of the challenges for IBM will be how quantum computing fits into its future computing plans. In 2014 IBM unveiled its SyNAPSE chips, its second generation of neurosynaptic chips, as part of its cognitive computing research. It will be interesting to see whether IBM looks to merge parts of its quantum and neurosynaptic computing programmes. If it does, then the idea of a computer system that thinks at the speed of a human brain could finally be within reach.

Conclusion

It is worth spending the time to read Ritter's blog post and the paper in Nature Communications. Both give much more detail on where IBM is heading and the science underpinning this announcement.
