You need 100 qubits to accelerate discovery with quantum
In this new era of quantum utility, you need to run large circuits to accelerate scientific discovery.
Since we put the first quantum computer on the cloud in 2016, we’ve been laying the foundations of quantum computing. Researchers running valuable small-scale experiments have created new benchmarks, implemented new algorithms, educated the community about running circuits on quantum hardware, and validated the tenets of quantum computation more broadly. But these experiments don’t help us advance science in the domains where we hope quantum will make an impact — like chemistry, materials science, optimization, and more — since we can simulate them exactly using a classical computer.
If we hope to advance these fields using quantum computers, we need to run large circuits. At IBM, we use 100 qubits as the benchmark. And we need to run those circuits on real quantum hardware.
Earlier this year, IBM Quantum and UC Berkeley teams extracted accurate results from a quantum computer for a circuit that couldn’t be exactly simulated by a classical computer. This paper^{1} sparked a new era in quantum computing at IBM: the era of quantum utility, where quantum computers could provide real value beyond classical for solving real-world problems. Since then, classical approximations have continued to verify the value of our experiment — but they differ from one another by a significant amount. For the first time, we can use quantum as a source of truth against which we can benchmark classical approximation methods.
Today, more work is flooding in as researchers begin to do quantum at a meaningful scale. Papers from teams based at the University of Washington, Stony Brook University, IBM, and elsewhere are exploring a space beyond exact classical simulability, running circuits with over 100 qubits and hundreds, even thousands, of gates.
Taking a step back, it’s hard to believe that we’re really here. Scientists at worldleading research institutions are using our systems not just to study quantum physics, but to advance science. Folks are writing real papers to explore uncharted scientific territory in domains beyond quantum computing. This is what we mean by the “era of utility.”
So, now that we’ve entered the utility era, what kinds of utility-scale work are researchers doing?
When IBM debuted the first quantum processor to break the 100-qubit barrier in late 2021, SUNY Stony Brook physics professor Tzu-Chieh Wei wanted to find out just how many of those qubits were actually usable given the presence of noise and errors in the current generation of pre-fault-tolerant quantum systems.
To test this, he worked with Stony Brook students Hongye Yu and Yusheng Zhao to develop an experiment to benchmark IBM’s 127-qubit Eagle processor. They used processors on ibmq_brooklyn and ibmq_washington — as well as several smaller systems — to simulate a large physical model of quantum spin chains, where interacting particles with spin, represented by qubits, occupy locations along a chain.
Their experiment took place well before IBM rolled out built-in error mitigation primitives for Qiskit Runtime, when Qiskit offered only a rudimentary version of its current readout error mitigation capabilities. Because of this, the researchers had to build their own circuits for gate error mitigation and combine Qiskit’s readout error mitigation with their own previous research to get accurate readouts for measuring energy in their experiments.
At first, the researchers found their results fell short of expectations. So they took the additional step of designing a reference state for error mitigation, providing an absolute scale to help them characterize the energy more accurately. By combining their error mitigation techniques with this reference state, the researchers were able to calculate the energy of the quantum state very accurately, with an error of only a few percentage points — much more accurate than their initial expectations.
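The core idea of rescaling against a reference state can be sketched in a few lines of Python. This is an illustrative toy under an invented uniform-damping noise model — the function name, the numbers, and the noise assumption are ours, not the paper’s procedure:

```python
def rescale_with_reference(e_raw, e_ref_raw, e_ref_exact):
    """Rescale a noisy energy estimate using a reference state whose
    exact energy is known classically (illustrative toy, not the
    paper's actual mitigation scheme)."""
    # Noise damps measured energies toward zero; measuring a reference
    # state with a known exact energy reveals by how much, supplying
    # an absolute scale for the target measurement.
    return e_raw * (e_ref_exact / e_ref_raw)

# Toy numbers: assume noise damps both measurements by the same
# factor (0.7 here), so the rescaling recovers the true value.
true_energy = -1.25
e_ref_exact = -2.0
damping = 0.7
mitigated = rescale_with_reference(true_energy * damping,
                                   e_ref_exact * damping,
                                   e_ref_exact)
print(mitigated)  # → -1.25
```

The assumption that target and reference are damped identically is what makes this toy work exactly; on hardware the two states experience similar but not identical noise, which is why the real experiment still carries a small residual error.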
The researchers performed this work on systems of many different sizes in an effort to extrapolate to the limit of an infinite quantum spin chain and compare that with an exact solution. They found that the energy density in their results^{2} was good, even for their low-level ansatz. The researchers achieved an error of just 4–5%, which they found surprising.
“I think what’s most significant from this, and the most surprising part, is that the method we used works so well on various machines,” said Wei. “We tried nine different backends. [Some of] these backends had 27 qubits, 65, and 127 [qubits]...and the agreement with our ansatz, somehow is not too bad.”
Together with theorists from the University of Cologne and Harvard University, a team of experimentalists from IBM Quantum, led by IBMer Edward Chen, set out to efficiently prepare highly entangled quantum states across many qubits. Their unique approach,^{3} first proposed by the academic collaborators, allowed for a profound condensed matter physics result to pop out for free: a natural way to study a ubiquitous phase transition that occurs in magnetic systems.
Preparing entanglement on quantum computers is an essential step for many applications. But today, noise and gate errors make preparing such states challenging, especially on large devices. Typically, you might begin preparing your state by initializing the qubits in the center of the device and spreading the entanglement outward. This is beneficial because you can use all the qubits in the device — but it takes valuable time.
The team explored a different way: entanglement by measurement. While we generally use measurement only to extract information, we can also use the measurement operator in the middle of a circuit to generate longer-range entanglement between qubits. They ran this entanglement-by-measurement protocol on 125 qubits, targeting the creation of the Greenberger-Horne-Zeilinger, or GHZ, state on 54 of them. Although the GHZ state only remained coherent up to 10 qubits, the 54-qubit result still predominantly exhibited two bitstrings at the end of the calculation — all zeroes or all ones.
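To see why "all zeroes or all ones" is the GHZ signature, here is a small numpy statevector simulation. Note this builds the GHZ state the conventional gate-based way (a Hadamard plus a chain of CNOTs) — it does not implement the team’s measurement-based protocol, which requires mid-circuit measurement and classical feed-forward; it only illustrates the target state:

```python
import numpy as np
from functools import reduce

I = np.eye(2)
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

def op_on(gate, qubit, n):
    """Embed a single-qubit gate at position `qubit` (qubit 0 = most
    significant bit) in an n-qubit register."""
    return reduce(np.kron, [gate if k == qubit else I for k in range(n)])

def cnot(control, target, n):
    """Dense CNOT as a permutation matrix on an n-qubit register."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[j, i] = 1.0
    return U

n = 5
psi = np.zeros(2 ** n)
psi[0] = 1.0                      # start in |00000>
psi = op_on(H, 0, n) @ psi        # superpose qubit 0
for q in range(n - 1):            # spread entanglement down the chain
    psi = cnot(q, q + 1, n) @ psi

probs = psi ** 2
support = np.nonzero(probs > 1e-12)[0]
# Only two bitstrings carry weight: indices 0 and 31,
# i.e. |00000> and |11111>, each with probability 1/2.
print(support)
```

The gate-based chain above takes depth linear in the qubit count, which is exactly the cost the measurement-based approach avoids by generating long-range entanglement in constant depth.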
This protocol demonstrated an efficient way to generate highly entangled states, an important result on its own — but the researchers “got something else for free,” said Chen. Tuning the parameters induced a transition from monolithic qubit behavior — all one state or all the other state — to chaos. This transition, called the Nishimori transition, is a well-known toy model to understand magnetism, but is difficult to simulate.
“Without a quantum computer, realizing the Nishimori transition would involve taking some rock and sprinkling magnetic impurities at exactly the same rate that you dial the temperature,” said Chen.
As it turned out, quantum computers offer a much easier way to study this important condensed-matter physics model. And it’s useful for quantum, too — this state is an important one to study for quantum error correction.
One recent utility-scale quantum experiment^{4} could help us study something even deeper — the nature of matter itself.
A team led by Roland Farrell at the University of Washington set out to study the Schwinger model — a simplified model of nature where positively and negatively charged particles attract and repel each other on a one-dimensional wire over time. This model is valuable both to quantum information science generally and to particle physics, as its behavior can help explain the force that holds atomic nuclei together, called the strong nuclear force.
But before they could study how this model evolves over time, they had to figure out how to prepare the initial, lowest-energy state on many qubits. This was another place where the Schwinger model paid off: it has translational symmetry, meaning it looks the same regardless of where along the wire you are, and only nearby charges see each other.
This means that, rather than hard-code gates onto all of your qubits, you can theoretically implement gates over just a few qubits, then write a routine that automatically repeats that set of gates over the qubits and applies it across the chip.
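The tiling idea above can be sketched in a few lines of Python. The `(gate_name, qubit_indices)` tuples here are a made-up stand-in for a circuit fragment, not any real framework’s API:

```python
def tile_block(block, n_qubits, block_size):
    """Repeat a gate block defined on `block_size` qubits across a
    chain of `n_qubits`, shifting the qubit indices at each step.

    `block` is a list of (gate_name, local_qubit_indices) tuples —
    a toy circuit representation for illustration only."""
    circuit = []
    for offset in range(0, n_qubits - block_size + 1, block_size):
        for gate, qubits in block:
            # Translate the block's local indices to absolute positions.
            circuit.append((gate, tuple(q + offset for q in qubits)))
    return circuit

# A two-qubit block: entangle neighbors, then rotate the first qubit.
block = [("cx", (0, 1)), ("rz", (0,))]
tiled = tile_block(block, 6, 2)
print(tiled)
# Six gates: the same pattern stamped at offsets 0, 2, and 4,
# e.g. ('cx', (0, 1)), ('rz', (0,)), ('cx', (2, 3)), ...
```

Because of the model’s translational symmetry, only the small block needs to be determined (and optimized); the rest of the circuit follows mechanically, which is what makes the approach scale to 100 qubits.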
Inspired by Virginia Tech physicist Sophia Economou, the team started with an algorithm called ADAPT-VQE, which can efficiently prepare ground states. After some trial and error, they tweaked it so that it was scalable and able to create the ground state of a large system. They started by determining the circuits on a 28-qubit simulator, then verified the scalability using classical approximations, and finally ran on 100 qubits of an IBM Quantum Eagle processor.
The processor struggled, at first — but then, the team applied an error mitigation technique that they developed in combination with other mitigation methods easily available on IBM Quantum hardware.
“We were kind of blown away,” said Farrell. “The results we got were beautiful. They were right within our expectations.”
And these are long circuits — in the most recent iteration, they ran the circuit with over 2,100 CNOTs.
Now that the ground state is tackled, the team is exploring the next step of Schwinger model simulation: how it evolves over time. This is where Farrell really expects the quantum computer to shine.
“As we go to dynamics, I think that we’ll quickly be beyond the regime of what we can do with classical computers,” he said.
Systems with many interacting particles are fundamental to modern science — but they can be very difficult to simulate on a classical computer. And even if you do simulate them, the data can look like chaotic noise. But utility-scale quantum systems are revealing hidden structures^{5} in that noise — with the help of open-source tools that anyone can use.
We can unravel the intricacies of complex systems by discovering symmetries, or self-similarities in the system that provide us with a simpler way to understand it, and conservation laws, or properties of the system that remain constant under complex interactions for all time. In classical physics, understanding these properties has been essential to describing complex systems and predicting their future behavior. We could potentially understand quantum systems using similar concepts, but finding symmetries and conserved properties in a quantum system can be an intractable problem on classical computers.
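As a small concrete example of a conservation law, here is a numpy check on a toy spin model (our choice of model, not the one in the paper): an observable is conserved exactly when it commutes with the Hamiltonian, and adding a transverse field breaks the conservation:

```python
import numpy as np
from functools import reduce

I = np.eye(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def embed(op, sites, n):
    """Tensor `op` onto each qubit in `sites`, identity elsewhere."""
    return reduce(np.kron, [op if k in sites else I for k in range(n)])

def comm(A, B):
    """Commutator [A, B]; zero means B is conserved under A."""
    return A @ B - B @ A

n = 4
# Nearest-neighbor Ising chain: H = sum_i Z_i Z_{i+1}
H_ising = sum(embed(Z, {i, i + 1}, n) for i in range(n - 1))
# Candidate conserved quantity: total magnetization M = sum_i Z_i
M = sum(embed(Z, {i}, n) for i in range(n))
# Same chain with a transverse field added: H + 0.5 * sum_i X_i
H_field = H_ising + 0.5 * sum(embed(X, {i}, n) for i in range(n))

print(np.allclose(comm(H_ising, M), 0))  # True: M is conserved
print(np.allclose(comm(H_field, M), 0))  # False: the field breaks it
```

This brute-force check costs memory exponential in the qubit count — fine for 4 spins, hopeless for 100 — which is exactly why finding conserved quantities of large quantum systems is a natural job for a quantum computer.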
A new experiment run by scientists at IBM Quantum looks to use a quantum computer to simulate a quantum many-body system and find conserved properties called local integrals of motion. The system they sought to simulate is one used commonly in condensed matter physics: a simplification of bar magnets arranged in a two-dimensional lattice placed within a regularly pulsing magnetic field.
To study how the protocol scales, they ran increasingly large versions of the experiment until they finally ran a 124-qubit circuit with 60 timesteps’ worth of gates — including over 2,600 entangling CNOT gates. They also applied a variety of error mitigation techniques, combining zero-noise extrapolation, Pauli twirling, readout error mitigation, and more.
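Zero-noise extrapolation, one of the techniques mentioned, is simple to illustrate: deliberately run the circuit at amplified noise levels (for example by gate folding), fit a curve to the measured expectation values, and extrapolate back to the zero-noise limit. A minimal sketch with invented toy numbers and a linear fit (real experiments often use richer extrapolation models):

```python
import numpy as np

def zne_linear(noise_factors, expectations):
    """Fit a line through expectation values measured at amplified
    noise levels and extrapolate to the zero-noise limit."""
    slope, intercept = np.polyfit(noise_factors, expectations, 1)
    return intercept  # fitted value at noise factor 0

# Toy data: the unamplified circuit runs at noise factor 1, and the
# signal decays linearly as the noise is scaled up.
factors = [1, 2, 3]
measured = [0.8, 0.6, 0.4]
print(zne_linear(factors, measured))  # → 1.0 (approximately)
```

The extrapolated value can be better than anything the hardware measures directly — at the cost of extra circuit executions and the assumption that the chosen fit model captures how the observable decays with noise.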
Their result is one of the most challenging many-body physics simulations yet run on a real quantum computer — and they successfully identified approximate local integrals of motion (LIOMs) that exist across scales, from small to large systems. These LIOMs can be used to describe the dynamics of these systems in detail and more generally. This experiment demonstrates the true utility of quantum computing for a problem that, in general, can tax even the most powerful classical computers.
“Over time, this is the way that scientists will have to do their quantum simulations if they want to get answers to questions that are at the precipice of what’s known,” said IBM Quantum’s Zlatko Minev, the senior author of the research.
But most importantly, this experiment was run entirely on the IBM Quantum Platform using entirely open-source tools. The team felt that interacting with the broader condensed matter physics community was crucial to succeeding, and wanted to give back to that community by keeping their results open.
What do these experiments have in common? Well, they’re using quantum computers to do real experiments at a scale beyond what’s possible with exact classical simulations. In other words, we’re starting to uncover problems for which quantum computing is the best tool for generating answers and for pushing science forward.
These are just the first papers in this new era of quantum utility, but we think we’re in the midst of a major disruptive moment in the history of computation. We expect more utility-scale papers to follow. And while these papers mostly deal in the realms of quantum chemistry and condensed-matter physics, it shouldn’t be long before other fields begin to reap the benefits of utility-scale quantum.
At IBM, we’re continuing to build utility-scale systems that are allowing researchers to do science at the edge of human knowledge and probe a computational realm that’s never been accessible before. We hope you, too, will get started running quantum circuits and see what possibilities quantum might have for other domains.
The University of Washington research is funded by the InQubator for Quantum Simulation, the Quantum Science Center, the Department of Energy, the Oak Ridge Leadership Computing Facility, and the University of Washington.
References

Kim, Y., Eddins, A., Anand, S. et al. Evidence for the utility of quantum computing before fault tolerance. Nature 618, 500–505 (2023). https://doi.org/10.1038/s41586-023-06096-3 ↩

Hongye Yu, Yusheng Zhao, and Tzu-Chieh Wei. Simulating large-size quantum spin chains on cloud-based superconducting quantum computers. Phys. Rev. Research 5, 013183. Published 16 March 2023. https://doi.org/10.1103/PhysRevResearch.5.013183 ↩

Edward H. Chen, Guo-Yi Zhu, Ruben Verresen, Alireza Seif, et al. Realizing the Nishimori transition across the error threshold for constant-depth quantum circuits. arXiv:2309.02863. [Submitted on 6 Sep 2023]. https://doi.org/10.48550/arXiv.2309.02863 ↩

Roland C. Farrell, Marc Illa, Anthony N. Ciavarella, Martin J. Savage. Scalable Circuits for Preparing Ground States on Digital Quantum Computers: The Schwinger Model Vacuum on 100 Qubits. arXiv:2308.04481. [Submitted on 8 Aug 2023 (v1), last revised 8 Sep 2023 (this version, v2)]. https://doi.org/10.48550/arXiv.2308.04481 ↩

Oles Shtanko, Derek S. Wang, Haimeng Zhang, Nikhil Harle, et al. Uncovering Local Integrability in Quantum Many-Body Dynamics. arXiv:2307.07552. [Submitted on 14 Jul 2023]. https://doi.org/10.48550/arXiv.2307.07552 ↩