PC World, by Katherine Noyes via IDG News Service:

In traditional computing, numbers are represented by 0s and 1s, but quantum computing relies on atomic-scale units, or "qubits," that can be 0 and 1 simultaneously, a state known as superposition that allows certain computations to run far more efficiently than on a classical machine.
It typically takes about 12 qubits to factor the number 15, but researchers at MIT and the University of Innsbruck in Austria have found a way to pare that down to five qubits, each represented by a single atom, they said this week.
Using laser pulses to hold the atoms in an ion trap and keep the quantum system stable, the new system also promises scalability: more atoms and lasers can be added to build a bigger, faster quantum computer able to factor much larger numbers. That, in turn, poses new risks for factorization-based encryption methods such as RSA, which is used to protect credit cards, state secrets and other confidential data.
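To see why factoring threatens RSA, a toy example helps: the private key can be computed directly once the public modulus is factored. The sketch below uses deliberately tiny textbook primes (real RSA moduli are hundreds of digits); the variable names are illustrative, not from any particular library.

```python
from math import gcd

# Toy RSA with tiny primes -- real deployments use primes hundreds of digits long.
p, q = 61, 53          # the secret prime factors
n = p * q              # public modulus (3233)
e = 17                 # public exponent, coprime to (p-1)(q-1)
msg = 65

cipher = pow(msg, e, n)   # encryption needs only the public key (n, e)

# An attacker who factors n into p and q recovers the private key:
phi = (p - 1) * (q - 1)   # Euler's totient, computable only from the factors
d = pow(e, -1, phi)       # modular inverse of e mod phi (Python 3.8+)
recovered = pow(cipher, d, n)

print(recovered)  # 65 -- the original message
```

A quantum computer running Shor's algorithm at scale would make the factoring step feasible for real key sizes, which is exactly the risk the article describes.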
The development is in many ways an answer to a challenge posed back in 1994, when MIT professor Peter Shor devised a quantum algorithm that calculates the prime factors of a large number far more efficiently than a classical computer can.
Fifteen is the smallest number that can meaningfully demonstrate Shor’s algorithm. Without any prior knowledge of the answers, the new system returned the correct factors with a confidence better than 99 percent.
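Shor's algorithm reduces factoring to finding the multiplicative order r of a random base a modulo N; only that order-finding step needs a quantum computer. A minimal classical sketch, with the quantum step replaced by brute force (the function name is my own, not from the paper), shows how the factors of 15 fall out:

```python
from math import gcd

def factor_via_order(N, a):
    """Shor's classical reduction: find the order r of a mod N, then
    derive factors of N from a**(r//2). Brute-force order finding stands
    in here for the quantum subroutine."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess: a already shares a factor
    r = 1
    while pow(a, r, N) != 1:      # smallest r with a^r = 1 (mod N)
        r += 1
    if r % 2 == 1:
        return None               # odd order: retry with a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None               # trivial square root: retry with a different a
    return gcd(x - 1, N), gcd(x + 1, N)

print(factor_via_order(15, 7))   # (3, 5)
```

For N = 15 and a = 7, the order is r = 4, so x = 7**2 mod 15 = 4, and gcd(3, 15) and gcd(5, 15) yield the factors 3 and 5, the answer the five-qubit system returned.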
“We show that Shor’s algorithm, the most complex quantum algorithm known to date, is realizable in a way where, yes, all you have to do is go in the lab, apply more technology, and you should be able to make a bigger quantum computer,” said Isaac Chuang, professor of physics and professor of electrical engineering and computer science at MIT. “It might still cost an enormous amount of money to build—you won’t be building a quantum computer and putting it on your desktop anytime soon—but now it’s much more an engineering effort, and not a basic physics question,” Chuang added.
Paper: "Realization of a scalable Shor algorithm" by Thomas Monz et al., Science (2016).