Photo and caption from PhysicsWorld blog – Bit of choice: A photograph of the nine superconducting qubit device developed by the Martinis group at the University of California, Santa Barbara, where, for the first time, the qubits are able to detect and effectively protect each other from bit errors. (Courtesy: Julian Kelly/Martinis group)

John Preskill gave the opening talk of the session “20 years of quantum error correction” at the APS March Meeting. Following the talk, he was interviewed for the PhysicsWorld blog:

“Although it may seem premature that scientists have been working on this problem for nearly two decades when an actual quantum computer has yet to be built, we know that we must account for errors if our quantum computers are ever to succeed. Error correction will be essential if we want to achieve fault-tolerant quantum computation that can deal with all sorts of noise within the system, as well as faults in the hardware (such as a faulty gate) or even in a measurement.

“Over the past 20 years, theoretical work in the field has made scientists confident that quantum computing of the future will be scalable. Preskill says that ‘it’s exciting because the experimentalists are taking it quite seriously now’, whereas initially the interest was mainly theoretical.”
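
To make the detect-and-correct idea concrete, here is a minimal, purely illustrative sketch in Python of the classical analogue of the simplest error-correcting scheme: a three-bit repetition code in which parity checks flag a single bit flip and a majority vote recovers the original value. This is not the Martinis group's protocol, and all names here (encode, apply_noise, syndrome, decode) are hypothetical; it only illustrates the logic that quantum codes generalize.

# Illustrative sketch only: classical three-bit repetition code.
# A logical bit is stored redundantly, parity checks ("syndromes") locate
# a single bit flip, and a majority vote corrects it.

import random

def encode(bit):
    """Encode one logical bit into three physical bits."""
    return [bit, bit, bit]

def apply_noise(bits, p_flip=0.1):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ 1 if random.random() < p_flip else b for b in bits]

def syndrome(bits):
    """Parity checks between neighbouring bits; a nonzero value flags an
    error without revealing the logical bit itself."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(bits):
    """Majority vote corrects any single bit flip."""
    return 1 if sum(bits) >= 2 else 0

if __name__ == "__main__":
    random.seed(0)
    noisy = apply_noise(encode(1))
    print("noisy codeword:", noisy,
          "syndrome:", syndrome(noisy),
          "decoded:", decode(noisy))

In the quantum case the parity checks are performed indirectly, via auxiliary qubits, so that the encoded information is never measured directly; that is the sense in which qubits can detect and protect each other from bit errors.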

Read the full blog post, “Getting a fix on quantum computations”.