Quantum computing: real progress without hype
Updated: 2026-05-03
For years, talking about quantum computing meant picking between two equally useless extremes: the fantasy of machines that would solve the intractable in seconds, or the skepticism that treated it as a perpetually twenty-year-away technology with no tangible applications. With several generations of hardware already in production, the first error-correction experiments at scale, and a much more mature academic and industrial community, it pays to take an honest look at where we stand. It’s not the revolutionary moment marketing promised, but it’s not the skeptics’ desert either: there is real, measurable progress, and there are also limits that remain hard.
Key takeaways
- The relevant metric is no longer the count of physical qubits but useful logical qubits: how many can be built by grouping physical qubits under an error-correction code.
- Google Willow’s milestone (2024) was demonstrating that adding more physical qubits lowered the logical qubit’s effective error rate, finally crossing the theoretical threshold.
- Post-quantum cryptography is the application with the greatest practical urgency today: NIST finalized the first standards (ML-KEM, ML-DSA) in August 2024 and migration is a real priority for critical infrastructure.
- Quantum ML, enterprise optimization algorithms, and near-term RSA-breaking predictions remain research, not product.
- For most teams: stay informed without operational investment, except in computational chemistry or critical infrastructure.
Where the hardware stands today
The metric that dominated headlines five years ago was the count of physical qubits. IBM announced its Condor processor with 1121 qubits in late 2023; Google presented Willow, with 105 qubits, in 2024, demonstrating below-threshold error correction for the first time. The important point is that the community has abandoned the sterile race for raw count and now measures quality: gate fidelity, coherence time, qubit graph connectivity, and above all how many useful logical qubits you can build from those physical ones.
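The shift from raw counts to quality metrics has simple arithmetic behind it. A minimal sketch, assuming independent gate errors (a simplification; real devices also suffer correlated noise and decoherence):

```python
# Back-of-the-envelope: why gate fidelity matters more than qubit count.
# With per-gate error rate p, a circuit of n gates succeeds with
# probability roughly (1 - p)**n, assuming independent errors.

def circuit_success(gate_error: float, num_gates: int) -> float:
    """Approximate probability that no gate in the circuit errs."""
    return (1.0 - gate_error) ** num_gates

# A 'one error per thousand operations' device, as in today's best systems:
for gates in (100, 1_000, 10_000):
    print(f"{gates:>6} gates -> success ~ {circuit_success(1e-3, gates):.3f}")
```

At one error per thousand operations, circuits beyond a few thousand gates almost certainly fail, which is exactly why error correction, not more raw qubits, is the bottleneck.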
The physical-versus-logical distinction is decisive:
- A physical qubit is the real device — superconducting circuit, trapped ion, or photon — with high error rates: around one erroneous operation per thousand in today’s best systems.
- A logical qubit bundles dozens or hundreds of physical ones under an error-correction code, and its effective error rate can be orders of magnitude lower.
Willow’s 2024 milestone was demonstrating that adding more physical qubits lowered, rather than raised, the logical qubit’s effective error rate, finally crossing the theoretical threshold. Several labs now run between two and ten logical qubits, still far from the hundred needed for useful algorithms, but already in a regime where improvement is incremental engineering.
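The scaling behind that milestone can be sketched with the textbook surface-code formula. The threshold `p_th`, prefactor `A`, and qubit-count estimate below are illustrative values, not Willow’s measured numbers:

```python
# Below-threshold scaling under the surface code: the logical error rate
# falls exponentially with code distance d once the physical error rate p
# is below the threshold p_th:
#     p_L ~ A * (p / p_th) ** ((d + 1) / 2)
# A and p_th here are illustrative, not measured values.

def logical_error_rate(p: float, d: int, p_th: float = 1e-2, A: float = 0.1) -> float:
    return A * (p / p_th) ** ((d + 1) / 2)

def physical_qubits(d: int) -> int:
    """A distance-d surface code patch uses roughly 2*d**2 physical qubits."""
    return 2 * d * d

p = 1e-3  # below threshold: bigger codes now help instead of hurting
for d in (3, 5, 7):
    print(f"d={d}: ~{physical_qubits(d)} physical qubits, "
          f"logical error ~ {logical_error_rate(p, d):.1e}")
```

Note the two regimes: below threshold, growing the code suppresses errors; above it, the same growth makes things worse, which is why crossing the threshold was the milestone.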
Which algorithms actually work
The most sober acknowledgment is that outside specific academic publications, quantum computing has not yet delivered advantage over classical in any broadly commercial application. Google’s famous 2019 quantum supremacy experiment solved a problem designed to be hard for classical computers with no practical use. QAOA-style optimization algorithms perform well on synthetic benchmarks but tie or lose against sophisticated classical heuristics on real problems.
The niche with measurable progress is native quantum system simulation. Simulating the dynamics of frustrated magnetic materials, high-temperature superconductors, or molecular systems with many strongly correlated electrons is a quantum problem that classical computers tackle with expensive approximations. In 2025 the first experiments appeared where machines with dozens of logical qubits solved dynamics impossible to simulate classically at equivalent precision. This matters mainly to computational chemistry and materials science.
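Why “impossible to simulate classically” is not hyperbole can be seen from memory alone: an exact statevector doubles with every added qubit. A quick back-of-the-envelope:

```python
# Why classical simulation hits a wall: an exact statevector for n qubits
# needs 2**n complex amplitudes, 16 bytes each in double precision.

def statevector_bytes(n_qubits: int) -> int:
    return 16 * (2 ** n_qubits)

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")  # 30 qubits already take 16 GiB
```

Thirty qubits fit in a workstation; fifty exceed any data center. Classical methods sidestep this with approximations, which is precisely where strongly correlated systems make them break down.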
The other field with real progress is post-quantum cryptography, though the push here comes more from fear than from real quantum progress. In August 2024 NIST finalized the first post-quantum standards:
- ML-KEM for key exchange.
- ML-DSA and SLH-DSA for signatures.
Throughout 2025, industry has been integrating them: TLS 1.3 with hybrid X25519MLKEM768 is already the default in Chrome and Firefox, and OpenSSH 10 added native support. Shor’s algorithm, which would break RSA and elliptic curves, still needs thousands of high-fidelity logical qubits — millions of physical ones — very far from what exists. But the principle of harvesting encrypted data today to decrypt it later has generated enough regulatory pressure that migration is a real priority for critical infrastructure.
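The hybrid idea behind X25519MLKEM768 can be sketched as follows. This is a conceptual illustration, not the TLS 1.3 key schedule: the two shared secrets are random placeholders, and the KDF is a minimal hand-rolled HKDF in the style of RFC 5869:

```python
# Hybrid key exchange, conceptually: both parties derive a classical
# (X25519) shared secret AND a post-quantum (ML-KEM-768) one, then feed
# the concatenation through a KDF. An attacker must break BOTH.
# The secrets below are placeholders, not a real handshake.
import hashlib, hmac, os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    out, block, i = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([i]), hashlib.sha256).digest()
        out += block
        i += 1
    return out[:length]

x25519_secret = os.urandom(32)  # placeholder for the ECDH shared secret
mlkem_secret = os.urandom(32)   # placeholder for the ML-KEM shared secret

prk = hkdf_extract(b"", x25519_secret + mlkem_secret)
session_key = hkdf_expand(prk, b"hybrid handshake demo")
print(session_key.hex())
```

The design choice matters: if ML-KEM later turns out to be weaker than hoped, security falls back to X25519, and vice versa against a future quantum attacker.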
Promises still unfulfilled
Worth saying clearly what remains research rather than product:
- Quantum machine learning — no demonstrated clear advantage over classical methods on useful tasks; several recent theoretical results suggest the apparent advantages vanish when properly accounting for the cost of loading classical data into quantum states.
- Quantum optimization for business — lab demonstrations without serious production deployment.
- Quantum cryptanalysis (Shor) — various academic groups estimate between ten and twenty years; the most recent estimates are more conservative, not less.
- Mass commercial use — the real market remains small, dominated by academic research, some pharma exploring molecular simulation, and government agencies.
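To put the Shor gap in numbers, a rough sketch with illustrative figures in the spirit of published resource estimates; the exact values vary by study and are assumptions here, not a forecast:

```python
# Rough sense of the gap for Shor on RSA-2048. All three figures are
# illustrative order-of-magnitude assumptions, not measured values.

logical_needed = 6_000        # order of magnitude for factoring RSA-2048
physical_per_logical = 1_000  # surface code at realistic error rates
logical_today = 10            # upper end of current lab demonstrations

physical_needed = logical_needed * physical_per_logical
print(f"~{physical_needed:,} physical qubits needed")  # ~6,000,000
print(f"gap: {logical_needed // logical_today:,}x more logical qubits than today")
```

Even with generous assumptions, the gap spans several orders of magnitude, which is why the ten-to-twenty-year estimates keep getting revised toward caution rather than away from it.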
How to think about the decision
My reading for a technical or executive team is that quantum computing deserves attentive watching but not operational investment except in very specific niches:
- If you work in computational chemistry, materials science, or pharma with simulation problems beyond classical reach, active-exploration budget makes sense.
- If you work in critical infrastructure, post-quantum crypto migration is an urgent priority regardless of when the machine capable of breaking RSA arrives, because the harvest-now window is already open.
- For everyone else: stay informed without investing. Understand physical-versus-logical qubit vocabulary, know what ML-KEM is and why it matters, but don’t dedicate time or budget to speculative pilots.
The honest balance: quantum computing has stopped being vaporware without ceasing to be research. There is hardware that works, logical qubits that correct errors, experiments of genuine scientific value. There is no mass economic advantage, no consumer application, no 2019-slide-deck revolution. Knowing how to tell the two apart is probably more valuable today than any prediction about five years from now.