
Final NIST PQC standards: what to do with them now


Updated: 2026-05-03

In August 2024, NIST published the three final post-quantum cryptography standards:

  • FIPS 203 (ML-KEM, based on CRYSTALS-Kyber)
  • FIPS 204 (ML-DSA, based on CRYSTALS-Dilithium)
  • FIPS 205 (SLH-DSA, based on SPHINCS+)

Six months on, the headline is no longer news, and it’s time to turn it into a plan. This post collects what I’ve seen work and not work in teams starting the transition.

Key takeaways

  • There isn’t yet a quantum computer capable of breaking RSA-2048. The real risk today is “harvest now, decrypt later”: adversaries record encrypted traffic now and decrypt it once a capable machine exists.
  • The first step is the most ignored: honest crypto inventory in layers (external communications, internal PKI, data at rest, vendors).
  • The most frequent mistake is swapping algorithms one-by-one instead of building crypto-agility: architectures that let you change the algorithm without rewriting the application.
  • The tactic that has settled in serious deployments: hybrid algorithms (ECDHE + ML-KEM) for external communications, classical cryptography elsewhere for now.
  • PQC signatures (ML-DSA, SLH-DSA) are orders of magnitude larger than ECDSA: an X.509 certificate with ML-DSA grows from 1-2 KB to 4-6 KB.

Order of magnitude first

There isn’t yet a quantum computer capable of breaking RSA-2048 or ECDSA P-256. Serious estimates of when there will be range from a decade to never. But patient adversaries can capture today and decrypt later — “harvest now, decrypt later.” For data with long-term value, the transition isn’t optional.

The inventory almost nobody has

The first step is the most boring and ignored: knowing where cryptography lives in your organization. Not just TLS on public websites. It also includes:

  • VPN tunnels between offices
  • client certificates on IoT machines
  • firmware update signatures
  • JWTs in internal APIs
  • at-rest encryption in databases
  • SSH keys on admin servers
  • S/MIME signed mail
  • audit log integrity

The sensible approach is to tackle it in layers: external communications first, then internal infrastructure, then at-rest data with long retention, then vendor integrations.
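A first pass over source trees can be automated. The sketch below is a minimal, illustrative scanner: the patterns are hypothetical examples I chose, and a real inventory needs far more signals (package manifests, TLS endpoint scans, HSM configs), not just grepping code.

```python
import re
from pathlib import Path

# Illustrative signals only -- extend with whatever matters in your stack.
PATTERNS = {
    "rsa_private_key": re.compile(r"BEGIN RSA PRIVATE KEY"),
    "ec_private_key": re.compile(r"BEGIN EC PRIVATE KEY"),
    "sha1_usage": re.compile(r"\bsha1\b", re.IGNORECASE),
    "hardcoded_jwt": re.compile(r"eyJ[A-Za-z0-9_-]+\.eyJ[A-Za-z0-9_-]+\."),
}

def scan_tree(root: str) -> dict[str, list[str]]:
    """Walk a source tree and record which files match which crypto signals."""
    findings: dict[str, list[str]] = {name: [] for name in PATTERNS}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for name, pattern in PATTERNS.items():
            if pattern.search(text):
                findings[name].append(str(path))
    return findings
```

The output is the seed of the layered inventory above: each hit gets assigned to a layer (external, internal PKI, at-rest, vendor) before anyone talks about algorithms.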

A useful tool to start: IBM’s pqc-iq, which scans Java applications for cryptographic uses.

Crypto-agility before crypto-correct

The most frequent mistake is swapping algorithms one by one: patch a single application with ML-KEM and leave the rest untouched. It produces a tick on a compliance report, but little real progress.

The more productive conversation is about crypto-agility. Instead of asking “which algorithm do we use?”, ask “how do we change the algorithm without rewriting the application?” That forces reviewing: where the crypto provider is instantiated, whether on-disk and on-wire formats carry algorithm identifiers, whether configuration is externalized, whether there are compatibility chains for reading old ciphertext while writing new.

A well-designed crypto-agile system today absorbs the first PQC transition without drama, and will also absorb a second when current algorithms show weaknesses.
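What crypto-agility looks like in code can be sketched in a few lines. The "ciphers" below are deliberately toy stand-ins (a reversible XOR keystream derived from SHA-256), because the point is the envelope format, not the primitives: every blob carries an algorithm identifier, writers pick the current default, and readers dispatch on the stored id, so old ciphertext stays readable after a migration. All names here are hypothetical.

```python
import hashlib

# Toy reversible transform standing in for a real cipher (e.g. AES-GCM now,
# a PQC hybrid later). Applying it twice with the same label/key round-trips.
def _xor_stream(label: bytes, key: bytes, data: bytes) -> bytes:
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(label + key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# Registry: 4-byte algorithm id -> transform. Adding an algorithm is one entry.
REGISTRY = {
    b"CLS1": lambda key, data: _xor_stream(b"classical", key, data),
    b"PQH1": lambda key, data: _xor_stream(b"pq-hybrid", key, data),
}
DEFAULT_ALG = b"PQH1"  # flipping this migrates new writes; readers are untouched

def seal(key: bytes, plaintext: bytes, alg: bytes = DEFAULT_ALG) -> bytes:
    """Prefix the algorithm id so readers can dispatch on it later."""
    return alg + REGISTRY[alg](key, plaintext)

def open_blob(key: bytes, blob: bytes) -> bytes:
    alg, body = blob[:4], blob[4:]
    return REGISTRY[alg](key, body)  # old ids keep working after migration
```

The compatibility chain mentioned above is exactly this: `open_blob` accepts every registered id forever, while `seal` only ever writes the current default.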

Hybrids outside, classical inside

The tactic that has settled in serious deployments is using hybrid algorithms in key-exchange for communications, and leaving the rest on classical cryptography for now. A hybrid combines classical ECDHE with ML-KEM in the same negotiation, so an attacker needs to break both.
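The combination step can be shown concretely. This is a simplified sketch, not the actual TLS key schedule: real hybrid key exchange (as in the IETF hybrid design) concatenates both shared secrets and feeds them through the handshake's key derivation, so the session key stays secure as long as either component remains unbroken. Here an HKDF-extract-style HMAC-SHA256 step stands in for that derivation, and the secrets are random placeholders.

```python
import hashlib
import hmac
import os

def combine_hybrid(ecdhe_secret: bytes, mlkem_secret: bytes, transcript: bytes) -> bytes:
    """Derive one session key from both shared secrets.

    Concatenating the secrets means an attacker must break BOTH the
    classical and the post-quantum component to recover the key.
    """
    ikm = ecdhe_secret + mlkem_secret
    return hmac.new(transcript, ikm, hashlib.sha256).digest()

# Placeholder protocol outputs: X25519 and ML-KEM-768 each yield a 32-byte secret.
ecdhe_ss = os.urandom(32)
mlkem_ss = os.urandom(32)
session_key = combine_hybrid(ecdhe_ss, mlkem_ss, b"handshake-transcript-hash")
```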

Cloudflare, Google, and AWS already support hybrid key exchange at their TLS edges; OpenSSH has defaulted to the hybrid sntrup761x25519 since version 9.0 (2022), and added an ML-KEM hybrid (mlkem768x25519-sha256) in 9.9 (2024).

For at-rest symmetric encryption there is no urgency: Grover’s algorithm only halves the effective security of AES-256, from 256 to 128 bits, which remains comfortable against quantum adversaries.
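The arithmetic behind that claim is a one-liner worth seeing: Grover searches 2^k keys in about 2^(k/2) quantum queries, so the effective security level is exactly halved.

```python
import math

def grover_effective_bits(key_bits: int) -> float:
    # Grover finds a key among 2^k candidates in ~sqrt(2^k) = 2^(k/2)
    # quantum queries, halving the effective security level.
    classical_work = 2 ** key_bits
    quantum_work = math.isqrt(classical_work)
    return math.log2(quantum_work)

print(grover_effective_bits(128))  # AES-128 -> 64.0 bits: uncomfortable
print(grover_effective_bits(256))  # AES-256 -> 128.0 bits: still fine
```

This is why the common advice is to standardize on AES-256 rather than AES-128 for long-retention data, and leave the PQC effort for key exchange and signatures.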

Sizes, latencies, budgets

PQC standards aren’t drop-in. Signature and key sizes are orders of magnitude larger than classical algorithms. An X.509 certificate with ML-DSA grows from 1-2 KB to 4-6 KB. A full TLS chain with three post-quantum certificates approaches 20 KB. On slow links (cellular IoT, satellite) this shows.
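The numbers behind those estimates come from the FIPS parameter sets; the figures below are approximate wire sizes (classical values are typical encodings, and X.509 adds its own overhead on top of the crypto payload).

```python
# Approximate sizes in bytes. ML-KEM-768 / ML-DSA-65 figures are from the
# FIPS 203 / FIPS 204 parameter tables; classical figures are typical encodings.
SIZES = {
    "ECDSA P-256": {"public_key": 65,   "signature": 72},    # uncompressed point, DER sig
    "ML-DSA-65":   {"public_key": 1952, "signature": 3309},
    "X25519":      {"public_key": 32,   "ciphertext": 32},   # peer share
    "ML-KEM-768":  {"public_key": 1184, "ciphertext": 1088},
}

def cert_crypto_bytes(alg: str) -> int:
    """Crypto payload of one certificate: subject public key + issuer signature."""
    return SIZES[alg]["public_key"] + SIZES[alg]["signature"]

for alg in ("ECDSA P-256", "ML-DSA-65"):
    print(f"{alg}: ~{cert_crypto_bytes(alg)} B per cert, "
          f"~{3 * cert_crypto_bytes(alg)} B for a 3-cert chain (plus X.509 overhead)")
```

Three ML-DSA-65 certificates already carry close to 16 KB of keys and signatures before any X.509 framing, which is how a full post-quantum chain approaches 20 KB.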

A timeline that makes sense

  1. First quarter: honest crypto inventory.
  2. Second quarter: enable hybrid key exchange on TLS at every load balancer, and on outgoing mail where the partner supports it.
  3. Second half: review internal PKI and plan a post-quantum hierarchy for 2026-2027, without rushing.
  4. In parallel: adopt crypto-agility as a requirement in all new code.

Mid-term, regulators will push. Germany’s BSI, France’s ANSSI, the US NSA’s CNSA 2.0 (targets for 2035), and ENISA’s updated 2024 guidance are all moving in this direction.

How to think about the decision

The most valuable thing I’ve learned is that the post-quantum transition is a perfect excuse to do something you already needed: clean up your organization’s cryptography. Most teams starting today discover during inventory forgotten RSA-1024 keys, SHA-1 certificates, outdated libraries, or hardcoded secrets. PQC comes after; the prior cleanup is what actually raises security.

Treating it as a compliance project with a date and a tick works but is expensive and fragile. Treating it as an opportunity to invest in crypto-agility, crypto hygiene, and documentation pays off more.

And if your reaction is “we’ll leave it for when a quantum computer appears,” remember harvest now, decrypt later. For legal acts, medical records, trade secrets, or diplomatic communications, the transition isn’t optional — it’s a matter of dates.


Written by

CEO - Jacar Systems

Passionate about technology, cloud infrastructure and artificial intelligence. Writes about DevOps, AI, platforms and software from Madrid.