Last month, when researchers at Google unveiled a blueprint for quantum supremacy, little did they know that rival IBM was about to snatch pole position. IBM has announced the development of what could be the largest and most sophisticated quantum computer built to date: a machine capable of handling 50 qubits (quantum bits).
Big Blue also announced a 20-qubit processor that will be made available through the IBM Q cloud by the end of the year.
“Our 20-qubit machine has double the coherence time, at an average of 90 microseconds, compared to previous generations of quantum processors with an average of 50 microseconds. It is also designed to scale; the 50-qubit prototype has similar performance,” Dario Gil, who leads IBM’s quantum computing and artificial intelligence research division, said in his blog post.
IBM’s progress in this space has been truly rapid. After launching a 5-qubit system in May 2016, the company followed with a 15-qubit machine this year, then upgraded the IBM Q experience to 20 qubits, with 50 qubits now in line. That is quite a leap in 18 months.
As a technology, quantum computing is a rather difficult area to understand, because information is processed differently. Where a bit in a normal computer is either a 0 or a 1, a qubit can exist in a superposition of both states at once, opening up entirely new programming possibilities. Add to that the problem of coherence (qubits hold their quantum state only briefly), and it becomes very difficult for programmers to build a quantum algorithm.
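To make the bit-versus-qubit distinction concrete, here is a minimal, illustrative Python sketch (not IBM's software) that models a single qubit as a pair of complex amplitudes and shows how a Hadamard gate creates an equal superposition of 0 and 1:

```python
import math

# A classical bit is either 0 or 1. A qubit is a pair of complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1, holding a blend of both.
zero = (1 + 0j, 0 + 0j)  # the |0> state

def hadamard(state):
    """Apply a Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

qubit = hadamard(zero)

# On measurement, the probability of each outcome is the squared
# magnitude of its amplitude: here, a 50/50 chance of reading 0 or 1.
probs = (abs(qubit[0]) ** 2, abs(qubit[1]) ** 2)
print(probs)
```

A real quantum computer, of course, does not store these amplitudes explicitly; simulating n qubits this way requires 2^n amplitudes, which is exactly why machines with 50 physical qubits become so hard to simulate classically.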
While the company did not divulge technical details about how its engineers simultaneously expanded the number of qubits and increased coherence times, it did attribute the improvements to better “superconducting qubit design, connectivity and packaging.” It added that the 50-qubit prototype is a “natural extension” of the 20-qubit technology, and that both exhibit “similar performance metrics.”
The major goal, though, is to create a fault-tolerant universal system capable of correcting errors automatically while maintaining high coherence. “The holy grail is fault-tolerant universal quantum computing. Today, we are creating approximate universal, meaning it can perform arbitrary operations and programs, but it’s approximating so that I have to live with errors and a limited window of time to perform the operations,” Gil said.
The good news is that an ecosystem is building up. Through the IBM Q experience, more than 60,000 users have run over 1.7 million quantum experiments and generated over 35 third-party research publications. That the beta-testers included 1,500 universities, 300 high schools and 300 private-sector participants suggests quantum computing is moving closer to real-world application, in areas like medicine, drug discovery and materials science. “Quantum computing will open up new doors in the fields of chemistry, optimisation, and machine learning in the coming years,” Gil added. “We should savor this period in the history of quantum information technology, in which we are truly in the process of rebooting computing.”
All eyes are now on Google, IBM’s nearest rival in quantum computing at this stage. While IBM’s 50-qubit processor has taken some of the shine off Google’s soon-to-be-announced 49-qubit system, more surprises may be in the offing, as Google has so far kept its entire quantum computing machinery behind closed doors.