IBM aims for immediate quantum advantage with error mitigation technique

You don’t have to be a physicist to know that noise and quantum computing don’t mix. Noise, vibration or a temperature swing can cause qubits – the quantum computing equivalent of the binary bit in classical computing – to fail.

That’s one of the main reasons quantum advantage (the point at which quantum surpasses classical computing) and quantum supremacy (when quantum computers solve a problem not feasible for classical computing) feel like longer-term goals for an emerging technology. It’s worth the wait, though, as quantum computers promise exponential speedups over classical computing, which tops out at supercomputing. However, due to the intricacies of quantum physics (e.g., entanglement), quantum computers are also far more prone to environmentally induced errors than supercomputers or other high-performance computers.

Quantum errors arise from what’s known as decoherence, a process in which noise or temperature fluctuations interfere with qubits, changing their quantum states and causing the information stored by the quantum computer to be lost.
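To make that concrete, here is a minimal sketch – a toy pure-dephasing model with an illustrative coherence time T2, not a model of any real IBM device – showing how a qubit’s superposition decays exponentially until the stored phase information is effectively gone:

```python
# Toy dephasing model of decoherence -- illustrative only, with an assumed
# coherence time T2; real devices also lose energy (T1) and suffer gate errors.
import numpy as np

T2 = 100.0                                     # assumed coherence time, microseconds
times = np.array([0, 25, 50, 100, 200, 400])   # wait times, microseconds

for t in times:
    coherence = np.exp(-t / T2)   # off-diagonal density-matrix term decays as exp(-t/T2)
    # <X> tracks the surviving coherence: 1 = intact superposition, 0 = information lost
    print(f"t = {t:3d} us   <X> = {coherence:.3f}")
```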

The road(s) to quantum

Many enterprises view quantum computing as a zero-sum scenario: if you want value from a quantum computer, you need fault-tolerant quantum processors and a multitude of qubits. Until those arrive, we’re stuck in the NISQ era – noisy intermediate-scale quantum – where quantum hasn’t surpassed classical computers.

That’s an impression IBM hopes to change.

In a blog post published today, IBM’s quantum team (Kristan Temme, Ewout van den Berg, Abhinav Kandala and Jay Gambetta) writes that the history of classical computing is one of incremental advances.

“Although quantum computers have seen tremendous improvements in their scale, quality and speed in recent years, such a gradual evolution seems to be missing from the narrative,” the team wrote. “However, recent advances in techniques we refer to broadly as quantum error mitigation allow us to lay out a smoother path towards this goal. Along this path, advances in qubit coherence, gate fidelities and speed immediately translate to measurable advantage in computation, akin to the steady progress historically observed with classical computers.”

Finding value in noisy qubits

In a move to reach quantum advantage sooner – and in incremental steps – IBM claims to have created a technique designed to tap more value from noisy qubits and move beyond NISQ.

Instead of focusing solely on fault-tolerant computers, IBM is aiming for continuous and incremental improvement, Jerry Chow, the director of hardware development for IBM Quantum, told VentureBeat.

To mitigate errors, Chow points to IBM’s new probabilistic error cancellation, a technique designed to invert the effects of noise in quantum circuits to achieve error-free results, even though the circuits themselves are noisy. It does bring a runtime tradeoff, he said: you give up speed, running more circuits in order to gain insight into the noise causing the errors.
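For readers who want the intuition behind probabilistic error cancellation, the sketch below is a toy illustration, not IBM’s implementation (which learns the noise model from the device itself). It assumes a single qubit whose only error is a depolarizing channel of known strength: the inverse of that noise channel is written as a signed mixture of operations the hardware can actually run, circuits are sampled from that mixture, and a signed, reweighted average recovers the noise-free expectation value. The gamma factor captures the runtime tradeoff Chow describes: the estimator’s variance grows roughly as gamma squared, so cancelling heavier noise costs more circuit runs.

```python
# Toy probabilistic error cancellation (PEC) -- an illustrative sketch, not
# IBM's implementation. Assumes the only error is single-qubit depolarizing
# noise of known strength p; real PEC learns the noise model from the device.
import numpy as np

rng = np.random.default_rng(7)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [X, Y, Z]

def depolarize(rho, p):
    """Noisy channel: keep rho with probability 1-p, else hit it with a random Pauli."""
    return (1 - p) * rho + (p / 3) * sum(P @ rho @ P for P in PAULIS)

p = 0.1
f = 1 - 4 * p / 3              # the channel shrinks <X>, <Y>, <Z> by this factor
# Quasi-probability decomposition of the *inverse* channel:
#   D_p^{-1} = a * (do nothing) + b * (random Pauli conjugation), with b < 0.
b = 0.75 * (1 - 1 / f)
a = 1 - b
gamma = abs(a) + abs(b)        # sampling overhead; variance grows ~ gamma**2

rho = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)   # |+><+|, ideal <X> = 1
noisy_rho = depolarize(rho, p)
print("noisy <X>:", np.trace(noisy_rho @ X).real)       # ~ f = 0.867

# Monte Carlo PEC estimator: sample a correction, weight the result by gamma * sign.
samples = []
for _ in range(20000):
    out = noisy_rho            # on hardware, each sample is a fresh circuit run
    if rng.random() < abs(b) / gamma:
        P = PAULIS[rng.integers(3)]
        out = P @ out @ P      # corrective Pauli (assumed noiseless in this toy)
        sign = -1.0            # sign of b
    else:
        sign = 1.0             # sign of a
    samples.append(gamma * sign * np.trace(out @ X).real)
print("PEC   <X>:", np.mean(samples))                   # ~ 1.0, the noise-free value
```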

The goal of the new technique is to provide a step, rather than a leap, towards quantum supremacy. It’s “a near-term solution,” Chow said, and part of a suite of techniques that will help IBM learn about error correction through error mitigation. “As you increase the runtime, you learn more as you run more qubits,” he explained.

Chow said that while IBM continues to scale its quantum platform, this technique offers an incremental step. Last year, IBM unveiled its 127-qubit Eagle processor, which is capable of running quantum circuits that can’t be replicated classically. Based on the quantum roadmap it laid out in May, IBM is on track to reach quantum devices with more than 4,000 qubits in 2025.

Not an either-or scenario: Quantum starts now

Probabilistic error cancellation represents a shift for IBM and the quantum field overall. Rather than relying solely on experiments that achieve full error correction only under certain circumstances, IBM has focused on a continuous push to address quantum errors today while still moving toward fault-tolerant machines, Chow said. “You need high-quality hardware to run billions of circuits. Speed is needed. The goal is not to do error mitigation long-term. It’s not all or nothing.”

IBM’s quantum bloggers add that the error mitigation technique “is the continuous path that will take us from today’s quantum hardware to tomorrow’s fault-tolerant quantum computers. This path will let us run larger circuits needed for quantum advantage, one hardware improvement at a time.”

[Figure: Quantum runtime as a function of circuit complexity for classical computers, quantum computers with error correction, and quantum computers with error mitigation. Quantum error mitigation fills the gap until quantum error correction achieves practical runtime reductions. Source: IBM.]
