In February 2026, Google publicly called on governments and industry to "prepare now" for quantum-era cybersecurity. This was not theoretical hand-waving from a research lab trying to justify its budget. It was a direct, urgent response to accelerating quantum computing breakthroughs that are compressing the timeline for when current encryption standards will break. Google's own Willow quantum processor, unveiled in late 2024, demonstrated error correction capabilities that many physicists had considered a decade away. IBM, Microsoft, Amazon, and a constellation of well-funded startups are racing toward the same threshold. The question is no longer "if" quantum computers will break modern encryption. It is "when" — and for organizations handling sensitive data, the threat is already here.
That last point is worth pausing on. The threat is not hypothetical or future-tense. Intelligence agencies and sophisticated threat actors are already intercepting and storing encrypted data today — financial transactions, healthcare records, government communications, corporate trade secrets — with the explicit intention of decrypting it once quantum computers reach sufficient capability. This strategy, known as "harvest now, decrypt later," means that any data encrypted with RSA, ECC, or other quantum-vulnerable algorithms that has long-term sensitivity is already compromised in transit, even if the decryption has not happened yet.
The cryptographic community has not been idle. After an eight-year evaluation process that began in 2016, the National Institute of Standards and Technology (NIST) published its first three finalized post-quantum cryptography (PQC) standards in August 2024. A fourth is expected in 2025. The NSA's Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) now mandates quantum-safe algorithms for national security systems, with the first compliance deadline — January 2027 for new systems — less than a year away. The European Union, the Basel Committee on Banking Supervision, and sector-specific regulators worldwide are issuing their own guidance and timelines.
This guide provides the complete enterprise roadmap for PQC migration. Whether you are a CISO preparing a board-level briefing, a security architect designing the technical migration plan, or a CTO evaluating vendor solutions, you will find the threat analysis, standards breakdown, compliance timeline, migration methodology, cost framework, and vendor landscape you need to act decisively. The window for preparation is narrowing. The time to begin is now.
The Quantum Threat Explained
To understand why post-quantum cryptography matters, you need to understand what quantum computers can do that classical computers cannot — and specifically, why that capability is an existential threat to modern cryptography.
Classical computers process information in bits — binary digits that are either 0 or 1. Quantum computers use quantum bits, or qubits, which can exist in a superposition of both states simultaneously. When qubits are entangled, they can explore exponentially many possibilities in parallel. For most computing tasks, this provides no meaningful advantage. But for certain mathematical problems — specifically, the problems that underpin modern public-key cryptography — quantum computers offer a devastating speedup.
In 1994, mathematician Peter Shor published an algorithm that demonstrated a quantum computer could factor large integers and compute discrete logarithms in polynomial time. This was a theoretical bombshell. RSA encryption relies on the difficulty of factoring the product of two large prime numbers. Elliptic curve cryptography (ECC) relies on the difficulty of the elliptic curve discrete logarithm problem. Both of these problems are computationally intractable for classical computers at sufficient key sizes — factoring a 2048-bit RSA key would take a classical supercomputer longer than the age of the universe. But Shor's algorithm, running on a sufficiently powerful quantum computer, could break RSA-2048 in hours.
The math is specific and well-understood. A fault-tolerant quantum computer with approximately 4,099 logical qubits could factor RSA-2048. For 256-bit elliptic curve keys (the most commonly used ECC key size), the requirement is lower — approximately 2,330 logical qubits. Note the emphasis on "logical" qubits: current physical qubits are noisy and error-prone, requiring thousands of physical qubits to create a single reliable logical qubit through quantum error correction. This error correction overhead is the primary barrier between where quantum computing stands today and the cryptographically relevant quantum computer (CRQC) that the cybersecurity community is preparing for.
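The arithmetic behind these figures can be sketched in a few lines. The 2n + 3 logical-qubit formula and the roughly 1,000:1 physical-to-logical ratio below are widely cited rules of thumb, not hard constants — actual requirements depend on the circuit design and the error-correction code used:

```python
def rsa_logical_qubits(modulus_bits: int) -> int:
    """A commonly cited circuit estimate for Shor's algorithm needs
    roughly 2n + 3 logical qubits for an n-bit RSA modulus; this is
    where the 4,099 figure for RSA-2048 comes from. Other circuit
    designs trade qubit count against runtime."""
    return 2 * modulus_bits + 3

def physical_qubits(logical: int, overhead: int = 1000) -> int:
    """Noisy present-day hardware needs on the order of ~1,000 physical
    qubits per logical qubit; the exact ratio depends on error rates,
    so treat this as an order-of-magnitude sketch only."""
    return logical * overhead

print(rsa_logical_qubits(2048))                  # logical qubits for RSA-2048
print(physical_qubits(rsa_logical_qubits(2048))) # rough physical-qubit scale
```

The second number — millions of physical qubits — is why the error-correction overhead, not raw qubit counts, is the real barrier to a CRQC.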
The current state of quantum hardware is advancing rapidly. IBM's Condor processor, released in 2023, reached 1,121 superconducting qubits. Google's Willow chip demonstrated that quantum error rates decrease as the system scales up — a critical milestone called "below threshold" error correction that had been an open question in quantum physics. Microsoft announced its Majorana 1 topological qubit chip in early 2025, claiming a fundamentally different approach to error correction that could dramatically reduce the physical-to-logical qubit ratio. Amazon's Ocelot processor is pursuing similar error-correction breakthroughs. These are not incremental improvements. They represent fundamental advances toward the fault-tolerant quantum computers that Shor's algorithm requires.
Symmetric encryption algorithms like AES are less vulnerable — Grover's algorithm provides only a quadratic speedup, which means doubling the key size (AES-256 instead of AES-128) provides adequate protection. But the public-key infrastructure that secures the internet — TLS, SSH, VPN, digital signatures, code signing, certificate authorities, key exchange protocols — is built almost entirely on RSA and ECC. Every HTTPS connection, every signed software update, every encrypted email, every VPN tunnel relies on algorithms that quantum computers will eventually break.
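A back-of-the-envelope comparison makes the asymmetry concrete. The security-bit figures below are the standard classical estimates; halving for Grover and zeroing for Shor is a deliberate simplification of the real attack costs:

```python
# Classical vs. post-quantum security of common primitives, in bits.
# Grover's quadratic speedup halves symmetric strength; Shor's
# polynomial-time algorithm reduces RSA/ECC security to effectively zero.
PRIMITIVES = {
    # name: (classical security bits, relevant quantum attack)
    "AES-128":   (128, "Grover"),
    "AES-256":   (256, "Grover"),
    "RSA-2048":  (112, "Shor"),
    "ECC P-256": (128, "Shor"),
}

def post_quantum_security(name: str) -> int:
    bits, attack = PRIMITIVES[name]
    if attack == "Grover":
        return bits // 2   # quadratic speedup -> half the effective bits
    return 0               # Shor: broken outright on a CRQC

for name, (bits, _) in PRIMITIVES.items():
    print(f"{name:10s} classical={bits:3d} bits, "
          f"post-quantum={post_quantum_security(name):3d} bits")
```

AES-256 retains 128 bits of effective security against Grover, which is why doubling symmetric key sizes suffices while public-key algorithms must be replaced entirely.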
"Harvest Now, Decrypt Later" — Why the Threat Is Already Here
The most dangerous misconception about the quantum threat is that it is a future problem. It is not. The "harvest now, decrypt later" (HNDL) strategy transforms a future capability into a present-tense vulnerability.
The concept is straightforward. Nation-state intelligence agencies and sophisticated threat actors are intercepting encrypted network traffic today — financial transactions flowing through SWIFT networks, diplomatic communications between embassies, healthcare data in transit between providers and insurers, corporate VPN traffic containing trade secrets, government classified communications. This data is encrypted with RSA or ECC, so the adversary cannot read it today. But they can store it. Storage is cheap. And when a cryptographically relevant quantum computer becomes operational — whether that is 2030 or 2035 — all of that stored data becomes readable.
Consider the implications by data category:
- Healthcare records: Patient data has regulatory protection requirements of 50+ years under HIPAA. Genomic data is permanently sensitive — your DNA sequence never changes. Medical records encrypted and transmitted today could be decrypted in 2032, exposing conditions, treatments, and genetic predispositions for every patient in the dataset.
- Financial data: Trading algorithms, M&A negotiation documents, client portfolios, and banking transaction records have long-term competitive and regulatory sensitivity. A hedge fund's proprietary trading strategy encrypted today and decrypted in 2031 may still be valuable to competitors.
- Government and military communications: Classified intelligence, diplomatic cables, and defense communications have sensitivity windows measured in decades. The intelligence value of communications intercepted today could persist for 30-50 years.
- Intellectual property: Pharmaceutical research data, semiconductor designs, AI model architectures, and manufacturing processes represent billions in R&D investment. Their competitive value extends well beyond the quantum threat timeline.
- Legal and contractual data: Attorney-client privileged communications, M&A documents, patent applications in progress — all have long-term sensitivity and legal implications if exposed.
The cybersecurity community refers to the day when a CRQC can break current encryption as "Q-Day." But the HNDL threat means that the effective exposure window is not the distance to Q-Day — it is the sensitivity lifetime of your data minus the distance to Q-Day. If your data must remain confidential for 20 years and Q-Day arrives in 8 years, you are already 12 years behind on protection. Every day you delay PQC migration is a day more data is harvested and stored for future decryption.
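This arithmetic is often stated as Mosca's inequality: if the time your data must stay secret (x) plus the time migration takes (y) exceeds the time until Q-Day (z), you are already exposed. A minimal sketch:

```python
def quantum_exposure_years(data_lifetime_years: float,
                           migration_years: float,
                           years_to_qday: float) -> float:
    """Mosca's inequality, x + y > z, expressed as a gap in years.
    A positive result means harvested ciphertext will still be
    sensitive when it becomes decryptable -- you are already behind."""
    return data_lifetime_years + migration_years - years_to_qday

# The example from the text: 20-year sensitivity, Q-Day in 8 years.
print(quantum_exposure_years(20, 0, 8))  # years of exposed sensitivity
```

Adding a realistic 3-to-5-year migration time (y) to the calculation only widens the gap, which is the case for starting now.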
Timeline: When Will Quantum Computers Break Current Encryption?
Predicting when a CRQC will exist is notoriously difficult, but expert consensus is converging and the window is tightening. The Global Risk Institute (GRI) conducts an annual survey of leading quantum computing researchers, asking them to estimate the likelihood that a quantum computer capable of breaking RSA-2048 within 24 hours will exist within various time horizons.
The 2024 GRI survey results are sobering. The median expert estimate now places a greater-than-50% probability of a CRQC existing by 2034. Just three years earlier, that threshold was 2037: three years of surveying moved the predicted date three years closer, so the estimated time remaining shrank by roughly six years while only three calendar years passed. Approximately one-third of surveyed experts now believe there is a significant probability (greater than 30%) of a CRQC by 2030.
Several factors are accelerating the timeline:
- Error correction breakthroughs: Google's Willow chip demonstrated that increasing the number of qubits in an error-correcting code actually decreases the error rate — the first time this fundamental requirement for practical quantum computing has been achieved in a real system. This breakthrough alone could reduce the physical qubit overhead needed for a logical qubit by an order of magnitude.
- New qubit architectures: Microsoft's topological qubit approach and other novel architectures promise inherently lower error rates, potentially reducing the number of physical qubits needed for a CRQC from millions to hundreds of thousands.
- Investment growth: Global investment in quantum computing exceeded $40 billion cumulative by 2025, with governments (the U.S., China, the EU, Japan, South Korea, Australia) committing tens of billions in national quantum strategies. China alone has reportedly invested over $15 billion in quantum research.
- Talent pipeline expansion: The number of quantum computing researchers, engineers, and startups has grown exponentially. More minds working on the problem means faster progress across all fronts.
- Hybrid quantum-classical algorithms: Researchers are developing algorithms that combine quantum and classical processing, potentially achieving cryptographically relevant results with fewer qubits than originally estimated.
The conservative planning assumption for enterprise risk management is that a CRQC will exist by 2030-2035. Given that a full PQC migration takes 2-5 years for most organizations, and given the HNDL threat, the mathematical conclusion is clear: organizations should have begun their migration planning in 2024-2025, and those starting now in 2026 are on the outer edge of a responsible timeline.
NIST Post-Quantum Cryptography Standards
NIST launched its PQC standardization process in 2016, inviting the global cryptographic community to submit candidate algorithms. Eighty-two submissions were received, of which 69 met the requirements for the first round of evaluation. Through four rounds of rigorous evaluation — involving security analysis, performance benchmarking, and extensive peer review — NIST selected four algorithms for standardization. Three were published as final standards in August 2024. The fourth is expected in 2025.
ML-KEM (FIPS 203) — Formerly CRYSTALS-Kyber
ML-KEM (Module-Lattice-Based Key-Encapsulation Mechanism) is the primary standard for quantum-safe key exchange. It replaces RSA key exchange and ECDH (Elliptic Curve Diffie-Hellman) in protocols like TLS, SSH, and VPN.
ML-KEM is based on the Module Learning With Errors (MLWE) problem, a lattice-based mathematical problem that is believed to be hard for both classical and quantum computers. It comes in three security levels:
- ML-KEM-512: NIST Security Level 1 (equivalent to AES-128). Public key: 800 bytes. Ciphertext: 768 bytes. Shared secret: 32 bytes.
- ML-KEM-768: NIST Security Level 3 (equivalent to AES-192). Public key: 1,184 bytes. Ciphertext: 1,088 bytes. Shared secret: 32 bytes.
- ML-KEM-1024: NIST Security Level 5 (equivalent to AES-256). Public key: 1,568 bytes. Ciphertext: 1,568 bytes. Shared secret: 32 bytes.
Performance is a key advantage. ML-KEM key generation, encapsulation, and decapsulation operations are significantly faster than RSA key exchange — often by an order of magnitude. The trade-off is key and ciphertext sizes: ML-KEM-768 public keys are 1,184 bytes, compared with roughly 65 bytes for an uncompressed ECDH P-256 key share. This size increase affects TLS handshakes, certificate chains, and bandwidth-constrained environments, but for most enterprise applications, the impact is manageable.
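The bandwidth arithmetic is easy to sketch. The figures below use the FIPS 203 sizes quoted above and raw uncompressed points for the classical baselines; real TLS handshakes add extension framing on top of these numbers:

```python
# Approximate key-exchange payload sizes, in bytes.
# (client -> server share, server -> client share)
KEY_EXCHANGE_BYTES = {
    "ECDH P-256":               (65, 65),           # uncompressed EC points
    "ML-KEM-768":               (1184, 1088),       # public key, ciphertext
    "hybrid X25519+ML-KEM-768": (32 + 1184, 32 + 1088),
}

def handshake_overhead(alg: str) -> int:
    """Total key-exchange bytes added to a handshake, both directions."""
    client, server = KEY_EXCHANGE_BYTES[alg]
    return client + server

for alg in KEY_EXCHANGE_BYTES:
    print(f"{alg:26s} {handshake_overhead(alg):5d} bytes")
```

Roughly 2 KB of extra handshake payload is negligible for web traffic, but it matters for constrained links, UDP-based protocols near MTU limits, and very high connection rates.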
ML-DSA (FIPS 204) — Formerly CRYSTALS-Dilithium
ML-DSA (Module-Lattice-Based Digital Signature Algorithm) is the primary standard for quantum-safe digital signatures. It replaces RSA signatures and ECDSA in applications including TLS authentication, code signing, document signing, certificate issuance, and S/MIME email.
Like ML-KEM, ML-DSA is lattice-based (MLWE/MSIS problems). It also comes in three security levels:
- ML-DSA-44: Security Level 2. Public key: 1,312 bytes. Signature: 2,420 bytes.
- ML-DSA-65: Security Level 3. Public key: 1,952 bytes. Signature: 3,309 bytes.
- ML-DSA-87: Security Level 5. Public key: 2,592 bytes. Signature: 4,627 bytes.
Signing and verification speeds are fast — competitive with or faster than RSA-2048 signing. The main implementation consideration is signature size: at 2,420 to 4,627 bytes, ML-DSA signatures are significantly larger than ECDSA signatures (64-72 bytes) or RSA-2048 signatures (256 bytes). This impacts certificate chain sizes in TLS, software update packages with multiple signatures, and any protocol that transmits signatures frequently. Certificate authorities and PKI operators will need to plan for increased bandwidth and storage requirements.
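To get a rough sense of the certificate-chain impact, the sketch below multiplies per-certificate key and signature sizes across a typical three-certificate chain. The figures are the FIPS 204 parameter sizes and compact raw/DER encodings; actual X.509 certificates add further encoding overhead:

```python
# Approximate sizes in bytes (signature, public key) per algorithm.
SIG_AND_KEY_BYTES = {
    "ECDSA P-256": (64, 65),      # raw signature, uncompressed point
    "RSA-2048":    (256, 270),    # signature, DER-encoded public key
    "ML-DSA-65":   (3309, 1952),  # FIPS 204 parameter sizes
}

def chain_crypto_bytes(alg: str, chain_length: int = 3) -> int:
    """Key + signature bytes across a chain (leaf, intermediate, root),
    ignoring the rest of the X.509 structure."""
    sig, key = SIG_AND_KEY_BYTES[alg]
    return chain_length * (sig + key)

for alg in SIG_AND_KEY_BYTES:
    print(f"{alg:12s} ~{chain_crypto_bytes(alg):6d} bytes per 3-cert chain")
```

An ML-DSA-65 chain carries roughly 40x more cryptographic payload than an ECDSA P-256 chain — the concrete reason CAs and TLS operators need capacity planning before reissuing at scale.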
SLH-DSA (FIPS 205) — Formerly SPHINCS+
SLH-DSA (Stateless Hash-Based Digital Signature Algorithm) is a backup signature algorithm that provides a fundamentally different security foundation. While ML-KEM and ML-DSA are both lattice-based, SLH-DSA relies solely on the security of hash functions — a much more conservative and well-understood mathematical foundation.
SLH-DSA's security depends only on the collision resistance and preimage resistance of its underlying hash function (SHA-256 or SHAKE-256). If a breakthrough were to undermine lattice-based assumptions, SLH-DSA would remain secure. This makes it the "insurance policy" in the PQC standard suite.
The trade-off is performance. SLH-DSA signatures are large (7,856 to 49,856 bytes depending on parameter set) and signing is slower than ML-DSA. SLH-DSA is best suited for applications where signatures are generated infrequently but verified frequently, such as firmware signing, root certificate issuance, or long-term document archival. It is not recommended as a general-purpose replacement for ECDSA in high-throughput applications.
FN-DSA — Formerly FALCON
FN-DSA (FFT over NTRU-Lattice-Based Digital Signature Algorithm) is expected to be finalized as a NIST standard in 2025. It addresses a specific niche: environments that need compact signatures, such as blockchain networks, IoT devices, DNS security (DNSSEC), and other protocols where signature size is a critical constraint.
FN-DSA produces the smallest signatures among the PQC standards (666 bytes at Security Level 1, 1,280 bytes at Security Level 5). However, its implementation is more complex than ML-DSA due to its reliance on floating-point arithmetic and its NTRU lattice structure. A constant-time implementation — essential for preventing side-channel attacks — is harder to achieve with FN-DSA. For most enterprise applications, ML-DSA is the recommended default, with FN-DSA reserved for size-constrained use cases.
Compliance Deadlines and Mandates
PQC migration is not merely a technical best practice — it is rapidly becoming a regulatory requirement. The compliance environment is evolving quickly, and organizations that wait for explicit mandates before starting their migration will not meet the deadlines.
NSA CNSA 2.0 Suite
The NSA's Commercial National Security Algorithm Suite 2.0, published in September 2022 and updated through 2025, provides the most concrete timeline for quantum-safe migration. CNSA 2.0 applies to all National Security Systems (NSS) and influences requirements for defense contractors and the broader intelligence community:
- By 2025: Quantum-safe firmware signing for networking equipment and operating systems.
- By January 2027: All new systems must use quantum-safe algorithms for key establishment (ML-KEM) and digital signatures (ML-DSA or SLH-DSA). No new systems using RSA or ECC for these purposes.
- By 2029: Web browsers, servers, and cloud services used by NSS must support quantum-safe TLS.
- By 2030: All custom and legacy applications must be migrated to quantum-safe cryptography.
- By 2033: Complete migration of all networking equipment (routers, switches, firewalls, VPN concentrators).
- By 2035: Full PQC migration across all systems, with no remaining RSA or ECC dependencies.
Even organizations not directly subject to CNSA 2.0 should treat it as the de facto compliance baseline, because its requirements will cascade through procurement specifications, industry standards, and regulatory expectations.
NIST Deprecation Timeline
NIST has signaled its own deprecation schedule through Special Publication 800-131A and related guidance. Algorithms with less than 112 bits of security are already deprecated. NIST has indicated that RSA-2048 (112-bit security) and 128-bit ECC keys will be disallowed for federal systems by 2030. The complete removal of quantum-vulnerable algorithms from NIST's approved lists will follow, effectively forcing all federal contractors and regulated industries to migrate.
Sector-Specific Requirements
Financial services: The Basel Committee on Banking Supervision has issued guidance on quantum risk management for banks. The European Central Bank is developing PQC readiness requirements for systemically important financial institutions. PCI DSS 4.0, while not yet mandating PQC, requires organizations to maintain an inventory of cryptographic algorithms and demonstrate a migration plan for deprecated algorithms. SWIFT has announced its quantum-safe migration roadmap for the interbank messaging network.
Healthcare: HIPAA's security rule requires reasonable and appropriate safeguards for the protection of electronic protected health information (ePHI). Given that patient data must be protected for decades and the HNDL threat is well-documented, regulators and auditors will increasingly expect PQC readiness as part of HIPAA risk assessments. Organizations that process genomic data face even more acute pressure, as DNA sequences are permanently sensitive.
Government contractors: CMMC 2.0, FedRAMP, and the broader Federal Acquisition Regulation (FAR) ecosystem will incorporate PQC requirements aligned with CNSA 2.0 and NIST timelines. Government contractors should anticipate PQC requirements appearing in contract vehicles by 2027-2028.
Telecommunications: 5G network security specifications from 3GPP are being updated to include PQC options for authentication and key agreement protocols. VPN and TLS termination at network edges will require PQC support as carrier-grade requirements evolve.
Cryptographic Agility: The Key Principle
If there is a single architectural principle that should guide every decision in your PQC migration, it is cryptographic agility — the ability to swap cryptographic algorithms, parameters, and implementations without redesigning or rebuilding your systems.
Cryptographic agility matters for several reasons. First, standards may evolve. NIST has acknowledged that some PQC algorithms may be updated or replaced as the field matures. In 2022, researchers published a key-recovery attack that completely broke SIKE, an isogeny-based candidate that had advanced to the fourth round of NIST's evaluation — and the attack ran in about an hour on a single laptop. If a similar breakthrough were to affect one of the finalized standards, organizations with crypto-agile architectures could switch algorithms rapidly. Those without would face an emergency rip-and-replace.
Second, different contexts demand different algorithms. You may use ML-KEM-768 for TLS key exchange, ML-DSA-65 for code signing, and SLH-DSA for root CA certificates. A crypto-agile architecture handles this diversity cleanly, while a hard-coded approach creates a brittle patchwork.
Third, performance requirements will change. As quantum computing evolves, security level recommendations may increase, requiring larger key sizes and different parameter sets. Crypto-agile systems can absorb these changes without structural modifications.
Architecture Patterns for Cryptographic Agility
Abstraction layers: Never hard-code cryptographic algorithms into application logic. Instead, use abstraction layers that reference algorithm identifiers resolved through configuration. Most modern cryptographic libraries (OpenSSL 3.x, BoringSSL, AWS-LC, Bouncy Castle) support provider-based architecture that enables algorithm swapping through configuration rather than code changes.
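A minimal sketch of this indirection in Python — hash functions stand in here for signature or KEM algorithms, and the registry and config names are illustrative, not any particular library's API:

```python
import hashlib
from typing import Callable

# Application code asks for a logical name; the mapping from logical
# name to concrete algorithm lives in configuration. Swapping SHA-256
# for SHA3-256 (or, by extension, ECDSA for ML-DSA) becomes a config
# change, not a code change.
ALGORITHM_REGISTRY: dict[str, Callable[[bytes], bytes]] = {
    "sha256":   lambda data: hashlib.sha256(data).digest(),
    "sha3-256": lambda data: hashlib.sha3_256(data).digest(),
}

CONFIG = {"current-hash": "sha256"}  # would be loaded from a config file

def digest(data: bytes) -> bytes:
    """Resolve the algorithm through configuration, never hard-code it."""
    return ALGORITHM_REGISTRY[CONFIG["current-hash"]](data)
```

Flipping `CONFIG["current-hash"]` to `"sha3-256"` changes every caller's behavior at once — the same property provider-based architectures in OpenSSL 3.x or JCE give you for full cryptographic primitives.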
Crypto service meshes: Centralize cryptographic operations in dedicated services or sidecars that can be updated independently of the applications that consume them. This pattern is particularly effective in microservices architectures, where updating a shared crypto service immediately updates all dependent services.
Certificate management automation: Automated certificate lifecycle management (CLM) is essential. When you need to reissue thousands of certificates with new algorithms, manual processes will not scale. Tools like cert-manager, Venafi, or Keyfactor enable automated rotation that can accommodate algorithm changes.
Protocol negotiation: Ensure that your TLS, SSH, and IPsec configurations support algorithm negotiation, allowing clients and servers to agree on the strongest mutually supported algorithm. This enables gradual migration as endpoints are updated.
The Enterprise PQC Migration Roadmap
Migrating an enterprise to post-quantum cryptography is a multi-year program, not a single project. The following phased roadmap provides a structured approach that balances urgency with thoroughness.
Phase 1: Cryptographic Inventory (Months 1-3)
You cannot protect what you cannot see. The first phase is discovering and cataloging every cryptographic asset in your environment. This is consistently the most challenging and time-consuming phase, because cryptography is embedded everywhere — often in places no one documented or remembers.
Your cryptographic inventory must cover:
- Certificates: TLS/SSL certificates (internal and external), code signing certificates, S/MIME certificates, document signing certificates, VPN certificates, mutual TLS (mTLS) certificates between services.
- Keys: SSH keys, API signing keys, encryption keys at rest, key encryption keys, database transparent data encryption (TDE) keys, HSM-stored keys.
- Algorithms in use: RSA key sizes (1024, 2048, 3072, 4096), ECC curves (P-256, P-384, P-521), symmetric algorithms (AES-128, AES-256), hash functions (SHA-1, SHA-256, SHA-384).
- Cryptographic libraries: OpenSSL versions, BoringSSL, Java JCE providers, .NET cryptographic providers, language-specific crypto libraries (PyCryptodome, libsodium, etc.).
- Protocols: TLS versions and cipher suites, SSH configurations, IPsec/IKE settings, WPA3 configurations, blockchain protocols.
- Hardware: Hardware security modules (HSMs), trusted platform modules (TPMs), smart cards, token authenticators, embedded cryptographic accelerators.
- Third-party dependencies: SaaS applications that handle your encrypted data, cloud KMS services, payment processors, identity providers, certificate authorities.
Automated discovery tools are essential. Manual inventory at enterprise scale is impractical. Solutions like SandboxAQ's AQtive Guard, Keyfactor Command, Venafi Trust Protection Platform, and IBM Quantum Safe Explorer can scan networks, code repositories, and configurations to identify cryptographic assets. Budget for this tooling in Phase 1 — it will save months of manual effort and produce a significantly more complete inventory.
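As a toy illustration of where these tools start — not a substitute for them — discovery can begin as simple pattern-matching over configuration files and source for known quantum-vulnerable or deprecated algorithm identifiers. The pattern lists below are illustrative and far from exhaustive:

```python
import re

# Hypothetical minimal scanner: flag algorithm names by category.
# Real discovery platforms add network scanning, binary analysis,
# certificate parsing, and runtime telemetry on top of this idea.
WEAK_PATTERNS = {
    "quantum-vulnerable": re.compile(r"\b(RSA|ECDSA|ECDH|X25519)\b", re.I),
    "deprecated":         re.compile(r"\b(SHA-?1|MD5|3DES)\b", re.I),
}

def scan_text(text: str) -> dict[str, list[str]]:
    """Return findings grouped by category, with line numbers."""
    findings: dict[str, list[str]] = {}
    for lineno, line in enumerate(text.splitlines(), 1):
        for category, pattern in WEAK_PATTERNS.items():
            for match in pattern.findall(line):
                findings.setdefault(category, []).append(
                    f"line {lineno}: {match}")
    return findings

sample = "KexAlgorithms ecdh-sha2-nistp256\nMACs hmac-sha1\n"
print(scan_text(sample))
```

Even this crude approach surfaces the long tail of undocumented crypto usage in SSH configs, CI pipelines, and infrastructure-as-code that inventories routinely miss.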
Phase 2: Risk Assessment (Months 3-6)
With the inventory complete, prioritize assets based on two dimensions: data sensitivity and exposure window. Data that must remain confidential for 10+ years and is transmitted over networks (exposing it to HNDL) is the highest priority. Internal encryption at rest with short-term sensitivity is lower priority.
Build a prioritization matrix:
- Critical (migrate first): Key exchange in internet-facing TLS, VPN encryption for sensitive communications, certificate authority root and intermediate keys, code signing keys for critical infrastructure.
- High (migrate within 12 months): Internal mTLS between services handling sensitive data, database encryption keys for regulated data, document signing for legal and compliance documents.
- Medium (migrate within 24 months): Internal SSH keys, non-regulated internal communications, development and testing infrastructure.
- Low (migrate within 36 months): Legacy systems approaching end-of-life, non-sensitive internal tools, ephemeral encryption for short-lived data.
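The matrix above can be expressed as a simple triage rule combining its two dimensions: how long the data must stay secret, and whether it transits networks (and is therefore harvestable today). The thresholds below are illustrative defaults, not a standard:

```python
def migration_priority(sensitivity_years: float,
                       network_exposed: bool,
                       years_to_qday: float = 8) -> str:
    """Toy triage: long-lived data crossing networks is harvestable
    now and decryptable by Q-Day, so it migrates first. Tune the
    years_to_qday assumption to your own risk posture."""
    exposure = sensitivity_years - years_to_qday
    if network_exposed and exposure > 0:
        return "critical"   # harvestable today, sensitive past Q-Day
    if network_exposed or exposure > 0:
        return "high"       # one of the two risk factors present
    if sensitivity_years > 2:
        return "medium"
    return "low"            # short-lived, internal-only data
```

For example, a VPN carrying 20-year-sensitive trade secrets scores "critical", while an ephemeral internal cache key scores "low" — matching the tiers above.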
Phase 3: Pilot and Test (Months 6-12)
Deploy PQC algorithms in non-production environments. The goals of this phase are to identify compatibility issues, measure performance impact, train your engineering team, and build confidence in the migration process.
Start with TLS. Configure test web servers and load balancers with hybrid TLS cipher suites (combining classical and quantum-safe key exchange). Measure handshake latency, connection establishment time, and throughput. Most organizations find that ML-KEM adds 1-3ms to TLS handshakes — negligible for typical web applications but potentially significant for high-frequency trading or real-time control systems.
Test certificate chain validation with PQC certificates. Identify any applications, libraries, or devices that cannot parse larger PQC certificates or signatures. This is where you will discover that legacy Java versions, older OpenSSL builds, IoT firmware, and specialized appliances may need updates or replacements.
Phase 4: Hybrid Deployment (Months 12-18)
Run classical and quantum-safe encryption simultaneously in production. Hybrid mode provides backwards compatibility — clients that do not yet support PQC algorithms can still connect using classical algorithms, while PQC-capable clients negotiate quantum-safe parameters. This is the recommended transition strategy endorsed by NIST, the IETF, and major browser vendors.
For TLS, the X25519Kyber768Draft hybrid key exchange (combining X25519 ECDH with ML-KEM-768) is already supported in Chrome 124+, Firefox, and major cloud platforms. Your production web servers can negotiate this hybrid cipher suite with capable clients while falling back to classical key exchange for older clients.
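The core idea of hybrid key exchange is that the session key is derived from both shared secrets, so it stays safe as long as either algorithm holds. The sketch below mimics that structure with an HKDF-extract step (RFC 5869) — it illustrates the shape of the IETF hybrid design, not the exact TLS 1.3 key schedule:

```python
import hashlib
import hmac

def combine_hybrid_secrets(classical_secret: bytes,
                           pq_secret: bytes,
                           context: bytes = b"tls-hybrid-demo") -> bytes:
    """Concatenate the ECDH and ML-KEM shared secrets and feed them
    through a KDF. An attacker must break BOTH inputs to recover the
    output key. The context label here is illustrative."""
    ikm = classical_secret + pq_secret          # input keying material
    # HKDF-extract: HMAC(salt=context, ikm) -> pseudorandom key
    return hmac.new(context, ikm, hashlib.sha256).digest()
```

Because the classical secret still contributes, a hybrid deployment is never weaker than today's ECDH — it only adds quantum resistance on top.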
For certificates, issue dual-algorithm certificates or use certificate chaining strategies that include both a classical and a PQC signature. The IETF's draft standards for hybrid certificate formats are maturing rapidly.
Phase 5: Production Migration (Months 18-30)
Phase out classical algorithms for high-priority systems. Update cipher suite configurations to prefer PQC algorithms and deprecate quantum-vulnerable options. Reissue certificates with PQC-only algorithms where all consuming systems have been validated. Rotate encryption keys to PQC algorithms for data at rest.
This phase requires coordination across teams: network operations, application development, security operations, vendor management, and compliance. A dedicated PQC migration program manager is essential for organizations with more than 500 employees.
Phase 6: Continuous Improvement (Ongoing)
PQC migration does not have a finish line. Monitor NIST for algorithm updates and parameter changes. Track the quantum computing threat timeline for any acceleration that might compress your migration schedule. Maintain your cryptographic inventory as new systems are deployed. Conduct annual assessments of cryptographic posture. Keep your crypto-agile architecture ready for the next generation of standards.
Industry-Specific Implications
Financial services: Banks and financial institutions face among the most acute PQC pressures. SWIFT messaging, Fedwire, ACH networks, payment card processing, and algorithmic trading systems all rely on quantum-vulnerable cryptography. The interconnected nature of financial networks means that PQC migration must be coordinated across the system. Individual banks cannot migrate independently — interoperability with counterparties, payment networks, and regulators is essential. The Basel Committee's guidance on quantum risk management requires banks to assess quantum-related risks as part of their operational resilience frameworks.
Healthcare: Beyond HIPAA's protection requirements, healthcare organizations face unique challenges in medical device cryptography. FDA-regulated devices with embedded cryptographic modules (insulin pumps, cardiac monitors, diagnostic equipment) cannot be easily updated and may require hardware replacement. Genomic data processing pipelines, health information exchanges (HIEs), and electronic prescribing systems all require PQC attention. The long sensitivity lifetime of health data makes HNDL a particularly acute concern.
Government and defense: Federal agencies are under the most explicit mandates through CNSA 2.0 and OMB directives. The challenge is the massive scale and diversity of government IT — legacy systems spanning decades, classified networks with specialized requirements, and extensive contractor ecosystems. Defense industrial base companies should plan for PQC requirements in contract vehicles by 2027-2028 and begin inventory now.
Telecommunications: 5G infrastructure, VPN services, and TLS termination at scale represent unique PQC challenges. The performance overhead of PQC algorithms at carrier-grade throughput (millions of TLS handshakes per second) requires hardware acceleration. Major equipment vendors (Ericsson, Nokia, Huawei) are developing PQC-capable network equipment, but rollout timelines vary. Vehicle-to-everything (V2X) communications for autonomous vehicles present additional challenges due to the constrained computing environments in vehicular hardware.
Automotive: Connected vehicles rely on cryptography for over-the-air (OTA) firmware updates, V2X communications, telematics, and infotainment system security. Vehicles have 10-15 year lifespans, meaning cars manufactured today must be resilient to quantum threats throughout their operational life. Automotive manufacturers are already incorporating PQC into their next-generation electronic architectures.
PQC Execution Costs
One of the most common questions from enterprise leaders is, "How much will this cost?" The answer depends on organization size, cryptographic complexity, regulatory requirements, and the current state of crypto-agility. Here are realistic budget ranges based on industry benchmarks and vendor pricing as of 2026:
Small and Mid-Size Businesses ($50K-$200K)
For organizations with fewer than 500 employees and relatively simple IT environments, the primary costs are cryptographic assessment consulting ($15,000-$50,000), software and library updates (largely free for open-source, $10,000-$30,000 for commercial), certificate reissuance ($5,000-$20,000), staff training ($10,000-$25,000), and testing and validation ($10,000-$50,000). Most SMBs can rely on their cloud providers (AWS, Azure, GCP) to handle infrastructure-level PQC migration, focusing their own efforts on application-level cryptography and certificate management.
Mid-Market Organizations ($200K-$1M)
Companies with 500-5,000 employees face more complexity. Budget for automated cryptographic discovery tools ($50,000-$150,000 annual license), dedicated consulting or staff augmentation ($100,000-$300,000), HSM firmware upgrades or replacements ($50,000-$200,000), application remediation and testing ($100,000-$300,000), and certificate management platform upgrades ($25,000-$75,000). Organizations in regulated industries (finance, healthcare) should add 20-30% for compliance documentation and audit preparation.
Large Enterprises ($1M-$10M+)
Enterprises with more than 5,000 employees and complex environments (multiple data centers, extensive on-premises infrastructure, custom applications, HSM farms, embedded systems) face the highest costs. Typical line items include enterprise cryptographic discovery platforms ($200,000-$500,000 annual), a dedicated PQC migration team or program (3-5 FTEs, $500,000-$1.5M annual), an HSM replacement program ($200,000-$2M depending on fleet size), custom application remediation ($500,000-$3M depending on portfolio size), network equipment upgrades ($200,000-$1M), certificate authority infrastructure upgrades ($100,000-$500,000), and vendor management and third-party assessment ($100,000-$300,000).
The largest cost driver across all organization sizes is the cryptographic inventory phase. Organizations that already have good visibility into their cryptographic estate will spend significantly less than those starting from zero. The second largest driver is hardware: HSMs, network appliances, and IoT devices that cannot be updated through software alone require physical replacement.
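To make the inventory phase concrete, the triage that a cryptographic inventory feeds can be sketched in a few lines. This is an illustrative sketch, not a real discovery tool's schema: the asset fields, algorithm lists, and labels are all assumptions.

```python
# Illustrative triage of discovered cryptographic assets. The field names and
# algorithm sets below are assumptions for the sketch, not a real tool's output.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "X25519", "DSA", "DH"}
QUANTUM_SAFE = {"ML-KEM", "ML-DSA", "SLH-DSA", "AES-256", "SHA-384"}

def classify(asset: dict) -> str:
    """Return a triage label for one discovered asset."""
    alg = asset["algorithm"]
    if alg in QUANTUM_SAFE:
        return "quantum-safe"
    if alg in QUANTUM_VULNERABLE:
        # Devices that cannot be patched in software need physical replacement,
        # the second-largest cost driver noted above.
        if not asset["software_updatable"]:
            return "replace-hardware"
        return "migrate-software"
    return "review-manually"

inventory = [
    {"name": "public web TLS", "algorithm": "ECDH", "software_updatable": True},
    {"name": "legacy HSM", "algorithm": "RSA", "software_updatable": False},
    {"name": "backup encryption", "algorithm": "AES-256", "software_updatable": True},
]
for asset in inventory:
    print(asset["name"], "->", classify(asset))
```

Even a toy classifier like this makes the budget conversation concrete: the "replace-hardware" bucket is where cost and lead time concentrate.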
The Vendor Space
The PQC vendor environment has matured significantly since the NIST standards were finalized. Here are the key players and what they offer:
IBM has been the most aggressive early mover. IBM Quantum Safe Explorer provides automated cryptographic discovery and risk assessment across enterprise environments. The IBM z16 mainframe includes on-chip quantum-safe cryptographic acceleration, and IBM Cloud Key Protect supports PQC key management. IBM's Quantum Safe Remediation tooling helps organizations prioritize and execute migration plans.
Google enabled PQC hybrid key exchange in Chrome 124 (released 2024), making it the first major browser to support quantum-safe TLS in production. Google Cloud KMS supports PQC key types, and Google's internal infrastructure has been running PQC experiments at scale since 2023. Google's ALTS (Application Layer Transport Security) internal protocol has PQC support for service-to-service communication within Google Cloud.
Microsoft has integrated PQC support into Windows CNG (Cryptography API: Next Generation) and is rolling out PQC capabilities across Azure services. Microsoft's SymCrypt library includes PQC implementations, and Azure Key Vault is being updated with PQC key types. Microsoft's Security Copilot includes PQC readiness assessment capabilities.
SandboxAQ (spun out of Alphabet) offers AQtive Guard, one of the most comprehensive cryptographic discovery and management platforms. AQtive Guard provides real-time visibility into cryptographic assets, risk scoring, and migration planning. It is particularly strong for large enterprises with complex, heterogeneous environments.
Thales has updated its Luna HSM line to support PQC algorithms, offering PQC-ready hardware security modules that can generate, store, and perform operations with quantum-safe keys. Thales CipherTrust Manager provides PQC-aware data encryption and key management.
Entrust offers PQC-ready PKI and certificate management through its nShield HSMs and Certificate Hub platform. Entrust has been active in the IETF hybrid certificate standards process and offers some of the earliest commercially available PQC certificates.
SEALSQ/WISeKey focuses on PQC for IoT and semiconductor applications, providing quantum-safe secure elements, certificates, and identity management for constrained devices. Their QUASARS platform targets automotive, industrial IoT, and smart city deployments.
Quantinuum offers Quantum Origin, a platform that generates cryptographic keys using verified quantum randomness, and PQC-ready encryption services. Their approach combines quantum-derived entropy with PQC algorithms for enhanced security.
PQShield provides PQC IP cores for semiconductor companies, enabling hardware-accelerated PQC in chips for mobile devices, automotive, and IoT. Their solutions address the performance challenges of PQC in constrained environments.
Hybrid Approaches: The Recommended Transition Path
NIST, the IETF, European agencies such as Germany's BSI, and most credible security authorities recommend a hybrid approach during the transition period. Hybrid cryptography runs classical and quantum-safe algorithms simultaneously, combining their outputs so that the system remains secure even if one algorithm is broken.
The rationale is straightforward. Classical algorithms (RSA, ECC) have decades of cryptanalysis behind them and are well-understood. PQC algorithms are mathematically sound but relatively new to production deployment. A hybrid approach ensures that even if a vulnerability is discovered in a PQC algorithm, classical protection remains. And even if a quantum computer breaks the classical component, the PQC component provides quantum resistance. Both would have to fail simultaneously for the system to be compromised.
For TLS, the most mature hybrid setup is X25519MLKEM768 (deployed under the earlier draft name X25519Kyber768), which combines X25519 (classical ECDH key exchange) with ML-KEM-768 (quantum-safe key encapsulation). The shared secret is derived from both algorithms, providing both classical and quantum security. This hybrid group is supported in Chrome, Firefox, Cloudflare, AWS, and most modern TLS libraries as of early 2026.
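The "derived from both" step can be illustrated with a small HKDF-style combiner. This is a simplified sketch, not the actual TLS 1.3 key schedule: the salt, label, and random stand-in secrets are illustrative, and RFC 5869 HKDF is reduced to a single expand block.

```python
import hashlib
import hmac
import os

def hkdf_extract_expand(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) over SHA-256: extract, then one expand block."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

def hybrid_secret(classical_ss: bytes, pqc_ss: bytes) -> bytes:
    """Combine both shared secrets; an attacker must break BOTH key exchanges
    to recover the session key. Concatenation order and the label are
    illustrative, not the TLS 1.3 schedule."""
    return hkdf_extract_expand(b"\x00" * 32, classical_ss + pqc_ss, b"hybrid demo")

# Stand-ins for the two key-exchange outputs (in the real protocol: the X25519
# shared secret and the ML-KEM-768 decapsulated secret, 32 bytes each).
classical = os.urandom(32)
pqc = os.urandom(32)
key = hybrid_secret(classical, pqc)
print(len(key))  # 32-byte combined session key
```

The design point: because both inputs feed one KDF, a break of either component alone leaves the derived key unpredictable to the attacker.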
For digital signatures, hybrid approaches are more complex because certificate formats must be updated. The IETF Composite Signatures draft defines how to include both a classical (ECDSA) and PQC (ML-DSA) signature in a single certificate. Certificate chains can also use a mixed approach: classical signatures at intermediate levels for backward compatibility, with PQC signatures at the root for long-term security.
The hybrid transition period is expected to last 5-10 years, after which classical algorithms can be safely retired as PQC implementations mature and quantum computing advances confirm the threat model.
Common Mistakes to Avoid
Having worked with dozens of organizations in various stages of PQC awareness and planning, we see certain failure patterns emerge consistently:
Waiting for "final" standards. The standards are final. FIPS 203, 204, and 205 were published in August 2024. FIPS 206 (FN-DSA) is expected in 2025. Waiting for additional certainty is waiting for nothing — and losing months of migration runway.
Underestimating the inventory challenge. Almost every organization discovers significantly more cryptographic assets than expected during inventory, typically 3-5x the estimated number of certificates, keys, and cryptographic dependencies. Budget accordingly in both time and tooling.
Ignoring embedded and IoT devices. Enterprise networks contain thousands of devices — printers, cameras, HVAC controllers, medical devices, industrial sensors — with embedded cryptographic implementations that cannot be updated via software patches. These devices require firmware updates or physical replacement, which takes significantly longer than software migration. Identify them early.
Treating PQC as an IT-only project. PQC migration touches legal (contract requirements), procurement (vendor cryptographic capabilities), compliance (regulatory deadlines), risk management (board reporting), and operations (performance impact). It requires executive sponsorship and cross-functional coordination.
Assuming cloud providers handle everything. Cloud providers will update their infrastructure, but the responsibility for your application-level cryptography remains yours. Custom applications, database encryption, inter-service communication, client-side encryption, and code signing are all your responsibility to migrate.
Skipping the hybrid phase. Going directly from classical to PQC-only algorithms in production is unnecessarily risky. The hybrid approach provides backward compatibility, risk mitigation, and operational flexibility. Skip it at your peril.
Neglecting crypto-agility. Organizations that hard-code PQC algorithms into their applications will face the same migration pain when algorithms are updated. The whole point of this migration should be to build crypto-agile systems that never face this problem again.
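Crypto-agility in practice usually means routing every caller through a named algorithm registry, so a future algorithm swap is a configuration change rather than a code rewrite. The sketch below illustrates the pattern only: stdlib HMAC stands in for real signature backends, and the suite names are invented for the example.

```python
import hashlib
import hmac

# Crypto-agility pattern: callers ask a registry for a named suite instead of
# hard-coding an algorithm. HMAC is a stand-in for real signature backends.
REGISTRY = {
    "classical-hmac-sha256": lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "next-gen-hmac-sha3-256": lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
}
DEFAULT_SUITE = "classical-hmac-sha256"  # the only line that changes at migration time

def sign(key: bytes, msg: bytes, suite: str = DEFAULT_SUITE) -> bytes:
    return REGISTRY[suite](key, msg)

def verify(key: bytes, msg: bytes, tag: bytes, suite: str = DEFAULT_SUITE) -> bool:
    return hmac.compare_digest(REGISTRY[suite](key, msg), tag)

tag = sign(b"secret", b"payload")
print(verify(b"secret", b"payload", tag))  # True
```

With this shape, adding an ML-DSA backend later means registering one new entry and flipping the default; no caller changes. That is the agility the migration should leave behind.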
Getting Started Today: Five Immediate Actions
Regardless of your organization's size, industry, or current cryptographic maturity, there are five actions you should take immediately:
1. Start your cryptographic inventory. You cannot plan a migration without knowing what needs to migrate. Begin with automated discovery tools for your network, certificate infrastructure, and code repositories. Even a partial inventory is infinitely more useful than no inventory. Prioritize internet-facing systems and systems handling your most sensitive data.
2. Educate leadership. Your board, C-suite, and senior leadership need to understand the quantum threat, the compliance timeline, and the investment required. Frame PQC migration as a risk management imperative with a defined compliance deadline, not an optional technology upgrade. Use the NSA CNSA 2.0 timeline and HNDL threat as concrete reference points.
3. Assess your most sensitive data. Identify data with long-term sensitivity — data that must remain confidential for 10, 20, or 50+ years. Map how that data is encrypted in transit and at rest. This tells you where your HNDL exposure is highest and where to focus Phase 1 of your migration.
4. Test NIST algorithms in lab environments. Stand up a test environment and experiment with PQC. Configure a web server with ML-KEM hybrid TLS. Generate and validate ML-DSA certificates. Benchmark performance. Identify which of your libraries and applications already have PQC support and which need updates. OpenSSL 3.5+ (which added native ML-KEM, ML-DSA, and SLH-DSA support), BoringSSL, and the Open Quantum Safe (OQS) project provide open-source PQC implementations you can use immediately.
5. Develop a migration budget and timeline. Using the cost ranges in this guide and the phased roadmap, draft a preliminary budget and timeline for board review. Even if the numbers are rough, having a plan initiates the funding and staffing conversations that take months to resolve in large organizations. Start the procurement process for cryptographic discovery tools and HSM upgrades, which have the longest lead times.
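The prioritization question in step 3 is often framed with Mosca's inequality: data is exposed to HNDL when its required secrecy lifetime plus your migration time exceeds the time remaining until a cryptographically relevant quantum computer. A minimal sketch follows; the default CRQC horizon of 8 years is an assumption drawn from the 2029-2035 expert estimates, and the datasets are invented examples.

```python
def hndl_exposed(shelf_life_years: float, migration_years: float,
                 years_to_crqc: float = 8.0) -> bool:
    """Mosca's inequality: data is at risk when x + y > z, where x is how long
    the data must stay secret, y is how long migration takes, and z is the
    time until a CRQC. The default z of 8 years is an assumption, roughly
    mid-range of the 2029-2035 expert estimates; tune it to your own model."""
    return shelf_life_years + migration_years > years_to_crqc

datasets = [
    ("marketing analytics", 1, 3),
    ("customer financial records", 10, 3),
    ("genomic research data", 50, 3),
]
for name, shelf, migrate in datasets:
    status = "EXPOSED to HNDL" if hndl_exposed(shelf, migrate) else "ok for now"
    print(f"{name}: {status}")
```

Note that for long-lived data the inequality is already violated under any plausible CRQC estimate, which is why HNDL makes the threat present-tense rather than future-tense.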
The quantum threat to cryptography is not speculative. The mathematics are proven. The standards are published. The compliance deadlines are set. The adversaries are harvesting. The only variable is your organization's readiness. Every month of delay adds data to the HNDL vulnerability window and compresses the migration timeline against immovable compliance deadlines. The organizations that begin now will migrate smoothly and meet deadlines with confidence. Those that wait will face emergency migrations under regulatory pressure, with all the cost overruns and risk that entails.
The best time to start your PQC migration was 2024. The second best time is today.
Frequently Asked Questions
What is post-quantum cryptography?
Post-quantum cryptography (PQC) refers to cryptographic algorithms designed to be secure against attacks from both classical and quantum computers. Current widely used encryption methods like RSA and elliptic curve cryptography (ECC) rely on mathematical problems that quantum computers can solve efficiently using Shor's algorithm, rendering them insecure. PQC algorithms are based on different mathematical problems — lattice-based, hash-based, code-based, and multivariate polynomial problems — that are believed to be resistant to quantum attacks. In August 2024, NIST published three finalized PQC standards (ML-KEM, ML-DSA, and SLH-DSA), with a fourth (FN-DSA) expected in 2025, providing organizations with approved algorithms for migration.
When will quantum computers break current encryption?
Most experts estimate that cryptographically relevant quantum computers (CRQCs) capable of breaking RSA-2048 and ECC will emerge between 2029 and 2035. The Global Risk Institute's annual quantum threat timeline survey shows the median expert estimate shifting earlier each year. Factoring RSA-2048 is estimated to require a few thousand logical (error-corrected) qubits, which translates to millions of physical qubits at current error rates, and significant progress is being made toward this threshold: IBM's Condor processor reached 1,121 physical qubits in 2023, and Google's Willow chip demonstrated breakthrough error correction in 2024. However, the 'harvest now, decrypt later' threat means that sensitive data encrypted today is already at risk, since adversaries can store intercepted ciphertext and decrypt it once quantum computers become available.
What are the NIST post-quantum cryptography standards?
NIST finalized three post-quantum cryptography standards in August 2024 after an eight-year evaluation process: ML-KEM (FIPS 203, formerly CRYSTALS-Kyber), a lattice-based key encapsulation mechanism for secure key exchange replacing RSA and ECDH; ML-DSA (FIPS 204, formerly CRYSTALS-Dilithium), a lattice-based digital signature algorithm replacing RSA and ECDSA for authentication and integrity; and SLH-DSA (FIPS 205, formerly SPHINCS+), a hash-based signature scheme serving as a conservative backup. A fourth algorithm, FN-DSA (formerly FALCON), is expected to be standardized in 2025 for environments requiring compact signatures. These standards are designed to protect against both classical and quantum computer attacks.
How much does PQC migration cost for an enterprise?
PQC migration costs vary significantly by organization size. Small and mid-size businesses can expect to spend $50,000 to $200,000, primarily on cryptographic inventory tools, consulting assessments, software updates, and staff training. Mid-market companies typically budget $200,000 to $1 million, adding costs for pilot programs, hybrid deployment testing, and certificate management infrastructure. Large enterprises should plan for $1 million to $10 million or more, covering enterprise-wide cryptographic discovery platforms, hardware security module upgrades, custom application remediation, extended testing cycles, and dedicated PQC migration teams. The largest cost drivers are the cryptographic inventory phase (discovering where cryptography is used) and hardware upgrades for devices that cannot support PQC algorithms through software updates alone.
Is my organization required to migrate to post-quantum cryptography?
Requirements depend on your industry and regulatory environment. Federal agencies and their contractors face the most immediate mandates: NSA's CNSA 2.0 Suite requires quantum-safe algorithms for new systems by January 2027, with full migration completed between 2030 and 2035. NIST's draft transition guidance deprecates 112-bit security algorithms such as RSA-2048 after 2030 and plans to disallow quantum-vulnerable public-key algorithms, including RSA and ECC, entirely by 2035. In the financial sector, regulators including the Basel Committee and the European Central Bank are issuing quantum risk guidance. Healthcare organizations under HIPAA must protect patient data for decades, making PQC migration a de facto requirement. Even without explicit mandates, any organization handling data with long-term sensitivity — trade secrets, intellectual property, financial records, personal health information — should treat PQC migration as a business imperative rather than waiting for regulatory deadlines.
What is the 'harvest now, decrypt later' threat?
'Harvest now, decrypt later' (HNDL) is a cyberattack strategy where adversaries intercept and store encrypted data today with the intention of decrypting it in the future when sufficiently powerful quantum computers become available. Nation-state intelligence agencies and sophisticated threat actors are believed to be actively collecting encrypted communications, financial transactions, healthcare records, diplomatic cables, and trade secrets — knowing that data encrypted with RSA or ECC will become readable once a cryptographically relevant quantum computer exists. This makes HNDL the most immediate quantum threat because it means data with long-term sensitivity is already at risk, even though quantum computers capable of breaking current encryption may still be years away. Organizations should prioritize PQC migration for their most sensitive long-lived data to close this vulnerability window.
Editorial team at Gray Group International covering business, sustainability, and technology.