Quantum computing has taken a major step forward with a breakthrough demonstrating an unconditional exponential speedup, a long-awaited milestone in the field. The research was led by Daniel Lidar, a professor of engineering at the University of Southern California (USC) and a leading expert in quantum error correction, working with collaborators at USC and Johns Hopkins University. The findings were published in Physical Review X.

Quantum computers have promised transformative capabilities: solving complex equations, designing next-generation medicines, breaking encryption, and discovering new materials. However, one persistent barrier has slowed progress—noise. These small but constant errors disrupt quantum operations, often rendering results less reliable than those from traditional classical computers.

This new study changed that narrative.

Using IBM’s 127-qubit Eagle quantum processors, accessed remotely via the cloud, Lidar’s team successfully demonstrated an exponential speedup in solving a specific computational task. Unlike previous demonstrations of modest or conditional quantum speedups, this result is both exponential and unconditional: the performance gap between quantum and classical systems continues to widen as the problem size grows, and the advantage does not rest on unproven assumptions, such as conjectures about the limits of classical algorithms.

The experiment focused on Simon’s problem, a theoretical benchmark in quantum computing: given black-box access to a function f that satisfies f(x) = f(x XOR s) for all inputs x, the task is to find the hidden bitstring s, a kind of hidden repeating pattern in the function. Quantum systems can solve this exponentially faster than classical ones. The research team adapted this problem and fine-tuned an algorithm to make it compatible with existing quantum hardware.
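A minimal sketch helps make this concrete. The article does not include the team’s code, so the snippet below is an illustrative Qiskit example (a natural choice given the IBM hardware, though the study’s actual tooling and circuits are not reproduced here); the three-bit size and the secret string `s` are assumptions for illustration only.

```python
# Illustrative Simon's-problem circuit (not the study's actual circuits).
from qiskit import QuantumCircuit

n = 3          # input register size (assumed for illustration)
s = "110"      # hidden bitstring: f(x) = f(x XOR s) for every input x

qc = QuantumCircuit(2 * n, n)
qc.h(range(n))                      # superpose all inputs x

# A standard two-to-one oracle: copy x into the output register, then,
# controlled on one position where s has a 1, XOR the secret s back in.
for i in range(n):
    qc.cx(i, n + i)
pivot = s.index("1")
for i, bit in enumerate(s):
    if bit == "1":
        qc.cx(pivot, n + i)

qc.h(range(n))                      # interfere the input register
qc.measure(range(n), range(n))      # each shot yields y with y . s = 0 (mod 2)
```

Each measurement returns a string y orthogonal to s modulo 2; after roughly n linearly independent strings, solving the resulting linear system over GF(2) reveals s. A classical algorithm restricted to querying f needs exponentially many evaluations to find the same string, which is where the quantum advantage comes from.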

Achieving this breakthrough required several innovations to suppress noise and enhance performance:

  1. Restricted Data Inputs: The team simplified the problem space by limiting the complexity of the hidden patterns, which reduced the number of operations needed and thus the opportunities for errors to accumulate.
  2. Efficient Transpilation: They compressed the quantum circuits by optimizing the sequence of quantum gates, making the computation faster and less error-prone (see the first sketch after this list).
  3. Dynamical Decoupling: This technique applied a series of precisely timed pulses to isolate the qubits from environmental noise, allowing more accurate operations. It had the most significant impact on preserving quantum coherence (also shown in the first sketch).
  4. Measurement Error Mitigation: Even after noise reduction, errors can persist during the final readout of qubit states. This step corrected for those lingering inaccuracies to further refine the results (illustrated in the second sketch after this list).
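
Steps 2 and 3 correspond closely to standard passes in IBM’s Qiskit toolchain. The sketch below is a hedged illustration of that pipeline rather than the team’s actual configuration: `backend` stands in for a handle to an Eagle-class device, `qc` is the circuit from the earlier sketch, and the X-X decoupling sequence is one common choice among several.

```python
# Hedged sketch of circuit compression and dynamical decoupling in Qiskit.
from qiskit import transpile
from qiskit.circuit.library import XGate
from qiskit.transpiler import PassManager
from qiskit.transpiler.passes import ALAPScheduleAnalysis, PadDynamicalDecoupling

# Step 2 (efficient transpilation): the optimizer merges and cancels gates,
# yielding a shorter circuit with fewer opportunities for errors.
tqc = transpile(qc, backend=backend, optimization_level=3)

# Step 3 (dynamical decoupling): schedule the circuit, then pad idle qubit
# windows with precisely timed X-X echo pulses that average out slow noise.
durations = backend.target.durations()
pm = PassManager([
    ALAPScheduleAnalysis(durations),
    PadDynamicalDecoupling(durations, dd_sequence=[XGate(), XGate()]),
])
tqc_dd = pm.run(tqc)
```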

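Step 4 can likewise be sketched with off-the-shelf tooling. The snippet below uses the open-source mthree (M3) readout-mitigation package purely as a stand-in for whatever correction scheme the team applied; `backend`, `tqc_dd`, and `n` carry over from the sketches above, and the execution call is provider-dependent.

```python
# Hedged sketch of measurement (readout) error mitigation via mthree.
import mthree

mit = mthree.M3Mitigation(backend)
mit.cals_from_system(list(range(n)))        # calibrate per-qubit readout errors

# Run the compressed, decoupled circuit (some stacks use a Sampler primitive
# instead of backend.run), then correct the raw counts.
raw = backend.run(tqc_dd, shots=4096).result().get_counts()
quasi = mit.apply_correction(raw, list(range(n)))    # undo readout errors
probs = quasi.nearest_probability_distribution()     # project to valid probabilities
```
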
The study not only confirmed a long-theorized exponential quantum advantage but also showed that such an advantage can already be demonstrated on current-generation hardware. It suggests quantum systems are beginning to outperform classical counterparts in real algorithmic tasks—marking a meaningful transition from theoretical promise to technological reality.

Despite this progress, researchers emphasize that the result does not yet translate into practical, everyday applications. Simon’s problem, while foundational, is not a task with real-world utility. Real impact will require solving non-oracle-based problems, where the answer is not pre-built into a black box supplied with the problem, and continuing to scale quantum systems while reducing noise.

Still, the achievement represents a watershed moment. It shows that the fundamental promise of quantum computing—exponential speedup—is no longer just a theory. It’s now experimentally proven, unconditionally, using real hardware.

By Impact Lab