Researchers have been working on quantum systems for more than a decade, in the hopes of developing super-tiny, super-powerful computers. And while there is still plenty of excitement surrounding quantum computing, significant roadblocks are causing some to question whether quantum computing will ever make it out of the lab.
First, what is quantum computing? One simple definition is that quantum computers use qubits (or quantum bits) to encode information. However, unlike the bits in a classical silicon-based computer, which are always either zero or one, a qubit can exist in multiple states simultaneously. In other words, a qubit is a bit of information that has not yet decided whether it wants to be a zero or a one.
In theory, that means quantum systems can process many calculations simultaneously: in essence, true parallelism.
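To make the "undecided bit" idea concrete, here is a toy sketch (not from the article, and only a classical simulation): a qubit's state can be written as two amplitudes, and a standard gate called the Hadamard takes a definite zero into an equal mix of zero and one.

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.

def hadamard(state):
    """Apply the Hadamard gate, which turns a definite 0 or 1
    into an equal superposition of both."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    """Probability of reading out 0 and of reading out 1."""
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

zero = (1.0, 0.0)               # the classical bit 0
superposed = hadamard(zero)
print(probabilities(superposed))  # roughly (0.5, 0.5): "undecided"
```

The catch, of course, is that this classical simulation needs exponentially more memory as qubits are added, which is exactly why a real quantum machine would be interesting.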
Olivier Pfister, professor of experimental atomic, molecular and optical physics at the University of Virginia, says quantum algorithms could deliver exponential advances in compute speed, which would be useful for database searching, pattern recognition, solving complex mathematical problems and cracking encryption protocols.
"But the roadblocks to complete success are numerous," Pfister adds. The first is scalability - how do you build systems with large numbers of qubits. The second is even more vexing - how do you overcome "decoherence," the random changes in quantum states that occur when qubits interact with the environment.
The first roadblock is an obvious one: quantum systems are microscopic. The challenge is to gain exquisite levels of control at the atomic scale, over thousands of atoms. To date, this has been achieved only on the order of 10 atoms.
"My work with optical fields has demonstrated good preliminary control over 60 qubit equivalents, which we call 'Qmodes' and has the potential to scale to thousands of Qmodes," Pfister says. "Each Qmode is a distinctly specified color of the electromagnetic field, but to develop a quantum computer, nearly hundreds to thousands of Qmodes are required."
Decoherence is an even thornier problem. "All the algorithms or patents in the world are not going to produce a quantum computer until we learn how to control decoherence," says Philip Stamp, director of the Pacific Institute for Theoretical Physics, in the Department of Physics and Astronomy at the University of British Columbia.
In the early days of quantum research, computer scientists used classical error correction methods to try to mitigate the effects of decoherence, but Stamp says those methods are proving inapplicable to the quantum world. "The strong claims for error correction as a panacea to deal with decoherence need to be re-evaluated."
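For readers unfamiliar with the classical methods Stamp refers to, the textbook example is the repetition code: store several copies of each bit and take a majority vote. A minimal sketch (illustrative only, not from the article):

```python
from collections import Counter

def encode(bit):
    """Classical repetition code: protect one bit by storing three copies."""
    return [bit, bit, bit]

def correct(copies):
    """Majority vote recovers the original bit if at most one copy flipped."""
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1            # simulate a single bit-flip error
print(correct(codeword))    # prints 1: the error has been corrected
```

Part of the difficulty in the quantum setting is that this recipe cannot be carried over directly: quantum states cannot be freely copied, and reading a qubit to "vote" on it disturbs the very state being protected, so quantum error correction has to work indirectly.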
According to Stamp, many experiments are under way around the world in which researchers claim to have built quantum information processing devices, but many of these claims dissolve when hard questions are asked about decoherence in multi-qubit systems.