You need 100 qubits to accelerate discovery with quantum

In this new era of quantum utility, you need to run large circuits to accelerate scientific discovery.

26 Oct 2023

Ryan Mandelbaum

Robert Davis

Jennifer Janechek

Rafi Letzter

Since we put the first quantum computer on the cloud in 2016, we’ve been laying the foundations of quantum computing. Researchers running valuable small-scale experiments have created new benchmarks, implemented new algorithms, educated the community about running circuits on quantum hardware, and validated the tenets of quantum computation more broadly. But these experiments don’t help us advance science in the domains where we hope quantum will make an impact — like chemistry, materials science, optimization, and more — since we can simulate them exactly using a classical computer.

If we hope to advance these fields using quantum computers, we need to run large circuits. At IBM, we use 100 qubits as the benchmark. And we need to run those circuits on real quantum hardware.

Earlier this year, IBM Quantum and UC Berkeley teams extracted accurate results from a quantum computer for a circuit that couldn’t be exactly simulated by a classical computer. This paper [1] sparked a new era in quantum computing at IBM: the era of quantum utility, where quantum computers can provide real value beyond classical methods for solving real-world problems. Since then, classical approximations have continued to verify the value of our experiment — but their answers differ from one another by significant amounts. For the first time, we can use quantum as a source of truth against which to benchmark classical approximation methods.

Read: A new paper from IBM and UC Berkeley shows a useful application for 127-qubit quantum processors with error mitigation — and a path toward useful quantum computing.

Today, more work is flooding in as researchers begin to do quantum computing at a meaningful scale. Papers from teams based at the University of Washington, Stony Brook University, IBM, and elsewhere are exploring a space beyond exact classical simulability, running circuits with over 100 qubits and hundreds, even thousands, of gates.

Taking a step back, it’s hard to believe that we’re really here. Scientists at world-leading research institutions are using our systems not just to study quantum physics, but to advance science. Folks are writing real papers to explore uncharted scientific territory in domains beyond quantum computing. This is what we mean by the “era of utility.”

So, now that we’ve entered the utility era, what kinds of utility-scale work are researchers doing?

Simulating spin chains

When IBM debuted the first quantum processor to break the 100-qubit barrier in late 2021, SUNY Stony Brook physics professor Tzu-Chieh Wei wanted to find out just how many of those qubits were actually usable given the presence of noise and errors in the current generation of pre-fault-tolerant quantum systems.

To test this, he worked with Stony Brook students Hongye Yu and Yusheng Zhao to develop an experiment to benchmark IBM’s 127-qubit Eagle processor. They used the ibmq_brooklyn and ibmq_washington systems — as well as several smaller ones — to simulate a large physical model of quantum spin chains, where interacting particles with spin, represented by qubits, occupy locations along a chain.

Their experiment took place well before IBM rolled out built-in error mitigation primitives for Qiskit Runtime, when Qiskit offered only a rudimentary version of its current readout error mitigation capabilities. Because of this, the researchers had to build their own circuits for gate error mitigation, and they combined Qiskit’s readout error mitigation with techniques from their own previous research to get accurate readouts when measuring energy in their experiments.
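
For readers curious what readout error mitigation involves at its simplest, here is a minimal sketch of the standard matrix-inversion approach, with made-up calibration numbers for a single qubit; the Stony Brook team's actual scheme was more involved than this.

```python
# A minimal sketch of matrix-inversion readout error mitigation.
# The calibration numbers below are invented for illustration.
import numpy as np

# Confusion matrix from calibration runs: column j holds the outcome
# distribution observed when basis state |j> was prepared.
confusion = np.array([
    [0.97, 0.05],   # P(read 0 | prepared 0), P(read 0 | prepared 1)
    [0.03, 0.95],   # P(read 1 | prepared 0), P(read 1 | prepared 1)
])

# Raw outcome frequencies from the experiment we want to correct.
measured = np.array([0.60, 0.40])

# Invert the noise model: solve confusion @ true = measured.
mitigated = np.linalg.solve(confusion, measured)
mitigated = np.clip(mitigated, 0, 1)
mitigated /= mitigated.sum()   # renormalize to a valid distribution
print(mitigated)
```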

At first, the researchers found that their results fell short of expectations. So they took the additional step of designing a reference state for error mitigation, providing an absolute scale to help them characterize the energy more accurately. By combining their error mitigation techniques with this reference state, the researchers were able to calculate the energy of the quantum state very accurately, with an error of only a few percent — much better than their initial expectations.

The researchers performed this work on systems of many different sizes in an effort to extrapolate to the limit of an infinite quantum spin chain and compare that with the exact solution. They found that the energy density in their results [2] agreed well with it, even for their low-depth ansatz. The researchers achieved an error of just 4–5%, which they found surprising.
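
The extrapolation step itself is simple to picture. As a rough illustration (with invented numbers, not the paper's data), one can fit the measured energy density against 1/N and read off the intercept as the infinite-chain estimate:

```python
# Finite-size extrapolation sketch: fit energy density vs. 1/N and
# take the 1/N -> 0 intercept. The data points are invented.
import numpy as np

sizes = np.array([27, 65, 127])            # qubit counts of the systems used
density = np.array([-1.21, -1.24, -1.26])  # hypothetical measured energy densities

coeffs = np.polyfit(1.0 / sizes, density, 1)  # linear fit in 1/N
print("infinite-chain estimate:", np.polyval(coeffs, 0.0))
```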

“I think what’s most significant from this, and the most surprising part, is that the method we used works so well on various machines,” said Wei. “We tried nine different back-ends. [Some of] these back-ends had 27 qubits, 65, and 127 [qubits]...and the agreement with our ansatz, somehow is not too bad.”

State preparation with a side of condensed matter physics

Together with theorists from the University of Cologne and Harvard University, a team of experimentalists from IBM Quantum, led by IBMer Edward Chen, set out to efficiently prepare highly entangled quantum states across many qubits. Their unique approach [3], first proposed by the academic collaborators, allowed a profound condensed matter physics result to pop out for free: a natural way to study a ubiquitous phase transition that occurs in magnetic systems.

Preparing entanglement on quantum computers is an essential step for many applications. But today, noise and gate errors make preparing such states challenging, especially on large devices. Typically, you might begin preparing your state by initializing the qubits in the center of the device and spreading the entanglement outward. This approach lets you use all the qubits in the device — but it takes valuable time.

The team explored a different way: entanglement by measurement. While we generally use measurement only to extract information, we can also use the measurement operator in the middle of a circuit to generate longer-range entanglement between qubits. They ran this entanglement-by-measurement protocol on 125 qubits, targeting the creation of a Greenberger-Horne-Zeilinger (GHZ) state on 54 of them. Although the GHZ state only remained coherent up to 10 qubits, the result on 54 qubits still predominantly exhibited two bitstrings at the end of the calculation — all zeroes or all ones.
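
To give a flavor of how entanglement by measurement works, here is a minimal three-qubit sketch in Qiskit (our own illustration, not the team's 125-qubit protocol): put the data qubits in |+> states, measure neighboring ZZ parities into ancillas, then feed the outcomes forward as X corrections, which leaves the data qubits in a GHZ state.

```python
# Three-qubit sketch of entanglement by measurement (illustrative only).
# Running it requires a backend that supports dynamic circuits.
from qiskit import QuantumCircuit, QuantumRegister, ClassicalRegister

data = QuantumRegister(3, "d")    # data qubits that will hold the GHZ state
anc = QuantumRegister(2, "a")     # one ancilla per neighboring pair
par = ClassicalRegister(2, "m")   # measured ZZ parities
out = ClassicalRegister(3, "out")
qc = QuantumCircuit(data, anc, par, out)

qc.h(data)                        # start all data qubits in |+>
for i in range(2):                # measure Z_i Z_{i+1} via an ancilla
    qc.cx(data[i], anc[i])
    qc.cx(data[i + 1], anc[i])
    qc.measure(anc[i], par[i])

# Correct qubit j with X^(m_0 xor ... xor m_{j-1}); applying one
# conditional X per earlier parity bit composes to the needed XOR.
for j in range(1, 3):
    for i in range(j):
        with qc.if_test((par[i], 1)):
            qc.x(data[j])

qc.measure(data, out)             # should now read all-zeroes or all-ones
print(qc)
```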

This protocol demonstrated an efficient way to generate highly-entangled states, an important result on its own — but the researchers “got something else for free,” said Chen. Tuning the parameters induced a transition from monolithic qubit behavior — all one state or all the other state — to chaos. This transition, called the Nishimori transition, is a well-known toy model to understand magnetism, but is difficult to simulate.

“Without a quantum computer, realizing the Nishimori transition would involve taking some rock and sprinkling magnetic impurities at exactly the same rate that you dial the temperature,” said Chen.

As it turned out, quantum computers offer a much easier way to study this important condensed-matter physics model. And it’s useful for quantum, too — this state is an important one to study for quantum error correction.

Uncovering the nature of matter

One recent utility-scale quantum experiment [4] could help us study something even deeper — the nature of matter itself.

A team led by Roland Farrell at the University of Washington set out to study the Schwinger model — a simplified model of nature where positively and negatively charged particles attract and repel each other on a one-dimensional wire over time. This model is valuable both to quantum information science generally and to particle physics, as its behavior can help explain the force that holds atomic nuclei together, called the strong nuclear force.

But before they could study how this model evolves over time, they had to figure out how to prepare the initial, lowest-energy state on many qubits. This was another place where the Schwinger model paid off: it has translational symmetry, meaning it looks the same regardless of where along the wire you are, and only nearby charges see each other.

This means that, rather than hard-coding gates onto all of your qubits, you can theoretically implement gates on just a few qubits, then write a routine that automatically repeats that set of gates across the chip.
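
As a rough sketch of that idea (the gate pattern and angles here are arbitrary placeholders, not the team's actual circuit), you might define one small block once and tile it across the register:

```python
# Tile one small, translationally invariant gate block across a register.
# The block's gates and angles are arbitrary placeholders.
from qiskit import QuantumCircuit

def tiled_circuit(n_qubits: int, block_size: int = 2) -> QuantumCircuit:
    qc = QuantumCircuit(n_qubits)
    # Define the local pattern once, on `block_size` neighboring qubits.
    block = QuantumCircuit(block_size)
    block.rx(0.5, 0)
    block.cx(0, 1)
    block.rz(0.3, 1)
    # Repeat the same block across the whole chip.
    for start in range(0, n_qubits - block_size + 1, block_size):
        qc.compose(block, qubits=list(range(start, start + block_size)), inplace=True)
    return qc

print(tiled_circuit(8))
```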

Inspired by Virginia Tech physicist Sophia Economou, the team started with an algorithm called ADAPT-VQE, which can efficiently prepare ground states. After some trial and error, they tweaked it so that it was scalable and able to create the ground state of a large system. They started by determining the circuits on a 28-qubit simulator, then verified the scalability using classical approximations, and finally ran on 100 qubits of an IBM Quantum Eagle processor.
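
To make the idea concrete, here is a minimal, self-contained sketch of the basic ADAPT-VQE loop, using a toy two-qubit Hamiltonian and operator pool of our own choosing rather than the team's scalable variant: at each step, pick the pool operator whose energy gradient is largest, append it to the ansatz, and re-optimize all parameters.

```python
# Toy ADAPT-VQE on a 2-qubit Hamiltonian (illustrative assumptions only).
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

H = np.kron(Z, Z) + 0.5 * (np.kron(X, I2) + np.kron(I2, X))  # toy Hamiltonian
pool = [np.kron(Y, I2), np.kron(I2, Y), np.kron(X, Y), np.kron(Y, X)]

def prepare(params, ops, psi0):
    psi = psi0
    for theta, P in zip(params, ops):
        psi = expm(-1j * theta * P) @ psi   # apply exp(-i*theta*P)
    return psi

def energy(params, ops, psi0):
    psi = prepare(params, ops, psi0)
    return float(np.real(psi.conj() @ H @ psi))

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                               # |00> reference state
ops, params = [], []
for _ in range(4):
    psi = prepare(params, ops, psi0)
    # Energy gradient of appending exp(-i*theta*P) at theta = 0 is <i[P, H]>.
    grads = [abs(np.real(psi.conj() @ (1j * (P @ H - H @ P)) @ psi)) for P in pool]
    if max(grads) < 1e-6:
        break                               # no pool operator helps: converged
    ops.append(pool[int(np.argmax(grads))])
    params.append(0.0)
    params = list(minimize(energy, params, args=(ops, psi0)).x)

print("ADAPT-VQE energy:", energy(params, ops, psi0))
print("exact ground energy:", np.linalg.eigvalsh(H)[0])
```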

The processor struggled, at first — but then the team applied an error mitigation technique they had developed, in combination with other mitigation methods readily available on IBM Quantum hardware.

“We were kind of blown away,” said Farrell. “The results we got were beautiful. They were right within our expectations.”

And these are long circuits — in the most recent iteration, they ran the circuit with over 2,100 CNOTs.

With the ground state tackled, the team can begin the next step of Schwinger model simulation: how the system evolves over time, which they’re exploring now. This is where Farrell really expects the quantum computer to shine.

“As we go to dynamics, I think that we’ll quickly be beyond the regime of what we can do with classical computers,” he said.

They can explore utility… and so can you

Systems with many interacting particles are fundamental to modern science — but they can be very difficult to simulate on a classical computer. And even if you do simulate them, the data can look like chaotic noise. But utility-scale quantum systems are revealing hidden structures [5] in that noise — with the help of open-source tools that anyone can use.

We can unravel the intricacies of complex systems by discovering symmetries, or self-similarities in the system that provide a simpler way to understand it, and conservation laws, or properties of the system that remain constant for all time, even under complex interactions. In classical physics, understanding these properties has been essential to describing complex systems and predicting their future behavior. We could potentially understand quantum systems using similar concepts, but finding symmetries and conserved properties in a quantum system can be an intractable problem on classical computers.
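
As a tiny numeric illustration of a conservation law (our example, not the paper's): the global spin-flip parity X⊗X⊗X commutes with a three-site transverse-field Ising Hamiltonian, so it stays constant under the dynamics.

```python
# Verify numerically that spin-flip parity commutes with a small
# transverse-field Ising Hamiltonian (toy example).
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def op(*ms):
    out = np.array([[1.0]])
    for m in ms:
        out = np.kron(out, m)
    return out

H = op(Z, Z, I2) + op(I2, Z, Z) + 0.8 * (op(X, I2, I2) + op(I2, X, I2) + op(I2, I2, X))
P = op(X, X, X)                   # global spin-flip parity
print(np.allclose(H @ P, P @ H))  # True: [H, P] = 0, so P is conserved
```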

A new experiment run by scientists at IBM Quantum uses a quantum computer to simulate a quantum many-body system and find conserved properties called local integrals of motion. The system they sought to simulate is one used commonly in condensed matter physics: a simplification of bar magnets arranged in a two-dimensional lattice placed within a regularly pulsing magnetic field.
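
In circuit form, one step of such a "kicked" Ising model alternates ZZ couplings between neighbors with a global X rotation standing in for the pulsed field. Here is a rough sketch on a one-dimensional slice with arbitrary angles (the experiment itself used a two-dimensional, 124-qubit layout):

```python
# One-dimensional sketch of a kicked Ising circuit (angles are arbitrary;
# the actual experiment used a 2D lattice of 124 qubits).
from qiskit import QuantumCircuit

n, steps = 6, 3
theta_zz, theta_x = 0.4, 0.7          # placeholder coupling and kick angles
qc = QuantumCircuit(n)
for _ in range(steps):
    for q in range(0, n - 1, 2):      # ZZ couplings on even bonds
        qc.rzz(theta_zz, q, q + 1)
    for q in range(1, n - 1, 2):      # ZZ couplings on odd bonds
        qc.rzz(theta_zz, q, q + 1)
    for q in range(n):                # the transverse-field "kick"
        qc.rx(theta_x, q)
qc.measure_all()
print(qc)
```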

To study how the protocol scales, they ran increasingly large versions of the experiment until they finally ran a 124-qubit circuit with 60 time-steps’ worth of gates — including over 2,600 entangling CNOT gates. They also applied a variety of error mitigation techniques, combining zero-noise extrapolation, Pauli twirling, readout error mitigation, and more.
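
Of those techniques, zero-noise extrapolation is perhaps the easiest to sketch: run the same circuit at several amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The toy model below fakes the hardware runs with a simple exponential decay, just to show the extrapolation arithmetic:

```python
# Zero-noise extrapolation sketch. A simple exponential decay stands in
# for real hardware runs at amplified noise scales.
import numpy as np

def run_expectation(scale, true_value=1.0, decay=0.05):
    # Toy stand-in for executing the circuit with noise amplified
    # by `scale` (e.g., via gate folding) and measuring <O>.
    return true_value * np.exp(-decay * scale)

scales = np.array([1, 3, 5])                      # odd noise-amplification factors
values = np.array([run_expectation(s) for s in scales])
zero_noise = np.polyval(np.polyfit(scales, values, 1), 0.0)
print("raw value at scale 1:", values[0])         # ~0.951
print("extrapolated to zero noise:", zero_noise)  # ~0.993, closer to 1.0
```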

Their result is one of the most challenging many-body physics simulations yet run on a real quantum computer — and they successfully identified approximate local integrals of motion (LIOMs) that persist across scales, from small to large systems. These LIOMs can be used to describe the dynamics of these systems both in detail and in general terms. The experiment demonstrates the true utility of quantum computing for a problem that, in general, can tax even the most powerful classical computers.

“Over time, this is the way that scientists will have to do their quantum simulations if they want to get answers to questions that are at the precipice of what’s known,” said IBM Quantum’s Zlatko Minev, the senior author of the research.

But most importantly, this experiment was run entirely on the IBM Quantum Platform using entirely open-source tools. The team felt that interacting with the broader condensed matter physics community was crucial to succeeding, and wanted to give back to the community by keeping their results open.

Bringing useful quantum computing to the world

What do these experiments have in common? Well, they’re using quantum computers to do real experiments at a scale beyond what’s possible with exact classical simulations. In other words, we’re starting to uncover problems for which quantum computing is the best tool for generating answers and for pushing science forward.

These are just the first papers in this new era of quantum utility. But we think that we’re amid a major disruptive moment in the history of computation. We expect more utility-scale papers to follow. And, while these papers mostly deal in the realm of quantum chemistry and condensed-matter physics, it shouldn’t be long before other fields begin to reap the benefits of utility-scale quantum.

At IBM, we’re continuing to build utility-scale systems that are allowing researchers to do science at the edge of human knowledge and probe a computational realm that’s never been accessible before. We hope you, too, will get started running quantum circuits and see what possibilities quantum might have for other domains.

The University of Washington research is funded by the InQubator for Quantum Simulation, the Quantum Science Center, the Department of Energy, the Oak Ridge Leadership Computing Facility, and the University of Washington.

Panel: Running Experiments On 100+ Qubits | Qiskit Quantum Seminar with Introduction by Jay Gambetta


References

  1. Kim, Y., Eddins, A., Anand, S., et al. Evidence for the utility of quantum computing before fault tolerance. Nature 618, 500–505 (2023). https://doi.org/10.1038/s41586-023-06096-3

  2. Yu, H., Zhao, Y., & Wei, T.-C. Simulating large-size quantum spin chains on cloud-based superconducting quantum computers. Phys. Rev. Research 5, 013183 (2023). https://doi.org/10.1103/PhysRevResearch.5.013183

  3. Chen, E. H., Zhu, G.-Y., Verresen, R., Seif, A., et al. Realizing the Nishimori transition across the error threshold for constant-depth quantum circuits. arXiv:2309.02863 (2023). https://doi.org/10.48550/arXiv.2309.02863

  4. Farrell, R. C., Illa, M., Ciavarella, A. N., & Savage, M. J. Scalable circuits for preparing ground states on digital quantum computers: The Schwinger model vacuum on 100 qubits. arXiv:2308.04481 (2023). https://doi.org/10.48550/arXiv.2308.04481

  5. Shtanko, O., Wang, D. S., Zhang, H., Harle, N., et al. Uncovering local integrability in quantum many-body dynamics. arXiv:2307.07552 (2023). https://doi.org/10.48550/arXiv.2307.07552
