How Quantum Computing Threatens Modern Cryptography

In This Guide
Most people have a tidy mental model of encryption: if the key is long enough and the math is hard enough, your data is safe. That model mostly works in the classical world. It breaks in the quantum one.
Here’s the uncomfortable part: the biggest quantum risk to cryptography isn’t that someone will “hack AES tomorrow.” It’s that some of today’s most widely deployed public-key systems are built on problems that a sufficiently capable quantum computer can solve efficiently. And because encrypted data can be recorded now and decrypted later, the timeline that matters isn’t “when quantum computers arrive,” but how long your secrets need to stay secret.
If you’re searching for “quantum computing impact on cryptography,” you’re probably trying to answer three practical questions:
- What exactly breaks, and why?
- How soon is “soon,” in engineering terms?
- What do we do about it without setting our infrastructure on fire?
Let’s build the foundation first, then talk about the blast radius and the migration plan.
The cryptography we rely on (and the assumptions underneath)
Modern cryptography isn’t one thing. It’s a stack of different tools doing different jobs, often in the same protocol.
Public-key cryptography (also called asymmetric cryptography) is what makes the internet usable at scale. It lets you establish a secure connection with a server you’ve never met before, without pre-sharing a secret. In practice, that means things like:
- TLS certificates and the handshake that starts an HTTPS session
- SSH key authentication
- Code signing and software update integrity
- Email encryption (S/MIME, OpenPGP)
- Many blockchain signature schemes
Public-key systems work because certain math problems are easy in one direction and hard in the other. Two dominant families are:
- RSA, which relies on the difficulty of factoring a large integer into primes.
- Elliptic Curve Cryptography (ECC), which relies on the difficulty of the elliptic curve discrete logarithm problem.
Symmetric cryptography is the workhorse once both sides share a secret key. It’s used for bulk encryption and is generally fast:
- AES for encrypting data
- ChaCha20 as an alternative stream cipher
- Symmetric message authentication (for example, HMAC)
Hash functions (SHA-256, SHA-3, etc.) are used for integrity, signatures (as components), password storage (with proper key stretching), and content addressing.
The key point: quantum computing doesn’t affect all of these equally. The impact depends on which underlying “hard problem” the security rests on.
Two more concepts are load-bearing for everything else in this article:
Security level vs. key size. When we say “AES-128,” the 128 refers to key bits. Brute-forcing it classically takes on the order of 2^128 work, which is beyond reach. For RSA-2048, “2048” is not a security level; it’s a modulus size. Its effective security is often compared to around 112 bits of symmetric security. Different primitives scale differently.
The “harvest now, decrypt later” problem. If an attacker records your encrypted traffic today and can decrypt it in 10 or 15 years, your data is effectively compromised if it needed confidentiality beyond that window. This is not hypothetical; it’s a rational strategy for state-level adversaries and anyone who expects future capability to be cheaper than present capability.
That’s the setup. Now we can talk about what quantum computers actually do to the math.
Why quantum computers change the rules: Shor and Grover, explained like an engineer
Quantum computing is often described with a fog of metaphors: parallel universes, spooky action, and other phrases that make physicists sigh. For cryptography, you only need to understand two algorithmic results and what they imply.
Shor’s algorithm: the direct hit on RSA and ECC
Shor’s algorithm shows that a sufficiently large, fault-tolerant quantum computer can:
- Factor large integers efficiently (breaking RSA)
- Solve discrete logarithms efficiently (breaking ECC and finite-field DH)
“Efficiently” here means polynomial time in the size of the input, which is a categorical change from the best-known classical attacks that scale super-polynomially (or worse) for properly chosen parameters.
Unpacking that in plain terms:
- With RSA, the public key includes a large number
Nthat is the product of two secret primes. If you can factorN, you can derive the private key. - With ECC, the public key is derived from the private key via repeated group operations on an elliptic curve. If you can solve the discrete log, you can recover the private key from the public key.
A useful analogy (one of the few we’ll allow ourselves): RSA and ECC are like locks whose security depends on the assumption that picking them requires an absurd amount of time with classical tools. Shor’s algorithm isn’t a better lockpick; it’s a different kind of tool that turns the “absurd” part into “manageable,” given the right machine.
This is why the quantum threat is so lopsided: it targets the exact primitives used for key exchange and digital signatures—the parts that establish trust.
Grover’s algorithm: a speedup, not a collapse
Grover’s algorithm provides a quadratic speedup for unstructured search problems, including brute-force key search. That matters for symmetric cryptography and hashes, but it’s not the same kind of existential break as Shor.
What “quadratic speedup” means operationally:
- If a classical brute-force search takes about 2^k steps, Grover can do it in about 2^(k/2) steps, assuming an idealized quantum computer and the ability to implement the target function as a quantum oracle.
So:
- AES-128 under Grover looks more like 64-bit security in the most naive framing.
- AES-256 under Grover looks more like 128-bit security.
In practice, the story is messier. Grover requires many sequential operations and error correction; it’s not “press button, halve key size.” But the conservative engineering takeaway is straightforward: symmetric crypto mostly survives; you compensate by using larger keys and robust modes.
For hashes, the analogous effect is:
- Preimage resistance drops from 2^n to about 2^(n/2)
- Collision resistance is already about 2^(n/2) classically due to the birthday bound, and quantum doesn’t magically make collisions trivial
So SHA-256 is not “broken by quantum,” but if you need long-term preimage resistance against a powerful quantum adversary, you may prefer SHA-384 or SHA-512 in some designs.
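The arithmetic behind these margins is simple enough to write down. This back-of-the-envelope sketch applies the naive Grover halving to key search and preimage search; it deliberately ignores error-correction and circuit-depth overheads, which make real quantum attacks considerably more expensive than these numbers suggest:

```python
# Idealized security margins under Grover. Real attacks face large
# error-correction and sequential-depth overheads not modeled here.

def grover_key_bits(key_bits: int) -> int:
    """Brute-force key search: 2^k classically -> ~2^(k/2) quantum queries."""
    return key_bits // 2

def preimage_bits(digest_bits: int, quantum: bool) -> int:
    """Preimage search: 2^n classically, ~2^(n/2) under idealized Grover."""
    return digest_bits // 2 if quantum else digest_bits

def collision_bits(digest_bits: int) -> int:
    """Classical birthday bound is already ~2^(n/2); quantum gains are modest."""
    return digest_bits // 2

for name, bits in [("AES-128", 128), ("AES-256", 256)]:
    print(f"{name}: classical {bits} bits -> Grover ~{grover_key_bits(bits)} bits")

for name, n in [("SHA-256", 256), ("SHA-384", 384)]:
    print(f"{name}: preimage {preimage_bits(n, False)} -> ~{preimage_bits(n, True)} bits, "
          f"collision ~{collision_bits(n)} bits")
```

This is why AES-256 and SHA-384 keep comfortable margins even in the most pessimistic framing.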
Now that we’ve got the two algorithms straight, we can map them to real systems.
What breaks in the real world (and what mostly doesn’t)
The quantum computing impact on cryptography is easiest to understand by sorting primitives into three buckets: “broken,” “weakened,” and “mostly fine with parameter tweaks.”
Public-key encryption and key exchange: the broken bucket
These are the systems that Shor’s algorithm targets directly:
- RSA key transport in TLS (less common in modern TLS, but still present in legacy configurations)
- Diffie-Hellman (DH) over finite fields
- Elliptic-curve Diffie-Hellman (ECDH), widely used in TLS
- RSA and ECDSA signatures, used in certificates, software signing, and identity systems
- EdDSA (Ed25519/Ed448), also based on elliptic curves, also vulnerable to Shor
If an attacker has your public key and a capable quantum computer, they can derive your private key and impersonate you (signatures) or decrypt sessions (key exchange), depending on the protocol.
A subtle but important point: perfect forward secrecy (PFS) helps, but it’s not a quantum shield. PFS means that if your long-term private key is compromised later, past sessions remain safe because they used ephemeral key exchange. That’s good hygiene today. But if the ephemeral key exchange itself is based on ECDH, Shor breaks it. PFS protects against key theft; it does not protect against the underlying math becoming solvable.
Symmetric encryption and MACs: the “increase the margin” bucket
AES, ChaCha20, and well-designed MACs don’t fall over in a quantum world. They may need larger parameters depending on your threat model.
- AES-128: potentially borderline for very long-term confidentiality against a strong quantum adversary; many organizations treat AES-256 as the safer default for “store now, decrypt later” concerns.
- AES-256: generally considered robust in post-quantum planning.
- ChaCha20: similar story; key size is 256 bits.
For integrity, HMAC with SHA-256 is still widely considered solid, though some designs may prefer SHA-384/512 for extra margin.
Hashes and password hashing: mostly fine, but don’t get complacent
Hash functions remain usable, but the quantum model changes some margins:
- For digital signatures, the hash is only one component; the signature scheme is the main quantum vulnerability today.
- For password hashing, quantum doesn’t rescue weak passwords. If anything, it reinforces the same advice: use strong password hashing (Argon2id, scrypt, bcrypt with appropriate cost) and enforce high-entropy secrets where possible. Grover doesn’t make “password123” any less terrible.
The “trust infrastructure” problem: certificates, firmware, and long-lived signatures
Even if you don’t care about decrypting old traffic, signatures have their own long tail.
- Code signing: If a signing key can be derived, an attacker can sign malware that looks legitimate.
- Firmware and embedded devices: Many devices ship with baked-in trust anchors and limited update paths. If those anchors rely on RSA/ECC and can’t be updated, you’ve got a time bomb with a warranty.
- Document signatures and archives: Legal and compliance contexts often require signatures to remain verifiable for many years.
This is where the quantum threat stops being an academic “TLS handshake” issue and becomes a lifecycle management problem.
For the latest developments in post-quantum standardization and deployment realities, see our weekly quantum computing insights coverage. The details move, even if the fundamentals don’t.
Timelines and threat models: when “quantum” becomes your problem
Engineers hate vague timelines, and quantum computing has historically provided plenty of them. The right way to think about “when” is not a single date; it’s a comparison of three clocks:
- The capability clock: when a cryptographically relevant quantum computer (CRQC) exists for breaking RSA/ECC at meaningful sizes.
- The migration clock: how long it takes you to inventory, upgrade, test, and deploy new cryptography across systems.
- The secrecy clock: how long your data must remain confidential or your signatures must remain trustworthy.
If the migration clock plus the secrecy clock is longer than the capability clock, you have a problem. And you don’t need to know the exact capability date to see the risk.
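This three-clock comparison is often attributed to Michele Mosca and can be written as a one-line inequality. The numbers below are hypothetical placeholders, not predictions:

```python
def quantum_risk(migration_years: float, secrecy_years: float,
                 capability_years: float) -> bool:
    """Mosca's inequality: if migration time plus required secrecy lifetime
    exceeds the time until a CRQC exists, data protected today can be
    exposed within its confidentiality window."""
    return migration_years + secrecy_years > capability_years

# Hypothetical numbers: 5-year migration, records that must stay
# confidential for 10 years, CRQC assumed 12 years away.
print(quantum_risk(5, 10, 12))  # True -> at risk even though the CRQC is "far off"
```

Note that the risk condition can hold even when the capability date comfortably exceeds the migration time alone; the secrecy clock is what catches people out.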
“Harvest now, decrypt later” is the forcing function
If you’re protecting:
- health records
- government communications
- trade secrets with long shelf life
- customer data with regulatory retention
- encrypted backups stored offsite
…then an adversary can record ciphertext today and wait. The cost of storage is low. The cost of being wrong is also low: if quantum takes longer than expected, they still have your data archived. If it arrives sooner, they win.
This is why many security teams treat post-quantum migration as a risk management project, not a science experiment.
Migration is slow because systems are messy
Even if standards are ready, deployment takes time because:
- Crypto is embedded in protocols, libraries, HSMs, smart cards, and appliances.
- Some systems can’t be updated easily (industrial control, medical devices, long-lived IoT).
- Compliance regimes require validation (for example, FIPS modules) that lags implementation.
- Interoperability matters: you can’t unilaterally change how browsers validate certificates or how partners connect to your APIs.
If you’ve ever tried to rotate a root certificate across a fleet, you already understand the shape of the problem.
The “break” is not uniform across key sizes and use cases
A CRQC that can break RSA-2048 is not the same as one that can break RSA-4096 at the same cost. But relying on “we’ll just use bigger RSA keys” is not a plan; Shor scales too well for that to be a durable defense. Key size buys time against classical attacks; it does not change the fundamental quantum vulnerability.
So what does a plan look like? It starts with post-quantum cryptography.
Post-quantum cryptography: what replaces RSA/ECC, and how it gets deployed
Post-quantum cryptography (PQC) refers to classical algorithms designed to resist known quantum attacks. You don’t need a quantum computer to run them; you need different math assumptions.
The important thing to internalize: PQC is not “quantum encryption.” It’s conventional software running on conventional hardware, built on problems that are not known to be efficiently solvable by quantum computers.
The new primitives: KEMs and signatures
Most modern protocol designs are moving toward two building blocks:
- Key Encapsulation Mechanisms (KEMs) for establishing shared secrets (replacing RSA key transport and DH/ECDH).
- Digital signature schemes for authentication and integrity (replacing RSA/ECDSA/EdDSA).
NIST has standardized a set of PQC algorithms, most notably:
- ML-KEM (Kyber) for key establishment [1]
- ML-DSA (Dilithium) for signatures [2]
- SLH-DSA (SPHINCS+) as a stateless hash-based signature option [3]
Each comes with tradeoffs. The big practical differences you’ll notice in deployments are key sizes, signature sizes, and CPU cost. PQC often uses larger public keys and signatures than ECC, which can stress bandwidth-constrained links, handshake sizes, and some hardware security modules.
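To show the interface shape these protocols are converging on, here is a deliberately INSECURE toy with the same three operations a real KEM such as ML-KEM exposes: key generation, encapsulation, and decapsulation. Anyone holding the public key and ciphertext below can recompute the secret; it exists only to illustrate the API flow that replaces Diffie-Hellman-style agreement:

```python
import hashlib
import os

# INSECURE toy KEM -- demonstrates the keygen/encapsulate/decapsulate
# flow only. A real KEM (e.g. ML-KEM) makes the shared secret
# unrecoverable from (pk, ct) alone; this toy does not.

def keygen() -> tuple[bytes, bytes]:
    sk = os.urandom(32)
    pk = hashlib.sha256(b"pk" + sk).digest()   # toy public key derived from sk
    return pk, sk

def encapsulate(pk: bytes) -> tuple[bytes, bytes]:
    ct = os.urandom(32)                         # "ciphertext" sent to the peer
    shared = hashlib.sha256(pk + ct).digest()   # sender's copy of the secret
    return ct, shared

def decapsulate(sk: bytes, ct: bytes) -> bytes:
    pk = hashlib.sha256(b"pk" + sk).digest()    # re-derive toy public key
    return hashlib.sha256(pk + ct).digest()     # receiver's copy of the secret

pk, sk = keygen()
ct, sender_secret = encapsulate(pk)
assert decapsulate(sk, ct) == sender_secret
```

Protocol designers like this shape because it slots into handshakes as a drop-in message pair: one side sends `pk`, the other replies with `ct`, and both end up with the same secret.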
Hybrid key exchange: the pragmatic bridge
Because migrations are risky and ecosystems are heterogeneous, many deployments use hybrid key exchange: combine a classical method (like ECDH) with a PQC KEM, and derive session keys from both. The idea is simple:
- If PQC turns out to have a flaw, you still have classical security.
- If quantum breaks classical, you still have PQC security.
This is not theoretical. TLS has explicit mechanisms for negotiating new key exchange groups, and major stacks have experimented with or deployed hybrid approaches. Google’s CECPQ experiments helped validate the operational shape of PQC handshakes in real traffic [4].
Analogy number two: hybrid is like wearing both a belt and suspenders during a wardrobe transition. It’s not elegant, but it keeps your pants where they belong while you figure out sizing.
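The combining step can be sketched in a few lines: concatenate both shared secrets and run them through a key derivation function, so the session key is safe as long as either input stays unbroken. This is a simplified illustration of the idea (using a hand-rolled HKDF over SHA-256), not the exact construction any particular TLS stack uses:

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869): condense input keying material into a PRK."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand (RFC 5869): stretch the PRK into output keying material."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def hybrid_session_key(ecdh_secret: bytes, kem_secret: bytes) -> bytes:
    """Derive one session key from BOTH shared secrets: breaking it
    requires breaking the classical AND the post-quantum component."""
    prk = hkdf_extract(b"hybrid-example-salt", ecdh_secret + kem_secret)
    return hkdf_expand(prk, b"example session key", 32)

key = hybrid_session_key(b"\x01" * 32, b"\x02" * 32)
assert len(key) == 32
```

Because the KDF mixes both inputs, an attacker who recovers only one of the two shared secrets learns nothing useful about the session key.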
Where PQC shows up first: TLS, VPNs, and internal service meshes
The earliest wins tend to be places where you control both ends or can update clients quickly:
- Service-to-service TLS inside a cloud environment
- VPN concentrators and managed endpoints
- API gateways under your administrative control
Public internet PKI (browser-trusted certificates) is a bigger coordination problem, but it’s moving. Our ongoing coverage of TLS and PKI tracks how this evolves week to week, including browser and CA ecosystem shifts.
Crypto agility: the feature you wish you’d built earlier
If you take one architectural lesson from the quantum transition, it’s this: crypto should be replaceable without rewriting your product.
Crypto agility is not a buzzword; it’s the difference between:
- swapping a TLS cipher suite via config and library updates, versus
- discovering that your database encryption format hard-coded RSA-wrapped keys in a way you can’t migrate without downtime and data re-encryption.
Concrete practices that help:
- Centralize cryptographic operations behind well-defined interfaces.
- Avoid custom crypto formats unless you absolutely must.
- Store metadata about algorithms and parameters alongside encrypted data (so you can migrate later).
- Prefer standard protocols (TLS 1.3, JOSE with care, age, etc.) over bespoke schemes.
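The metadata practice in particular is cheap to adopt and pays for itself at migration time. Here is a minimal sketch of a self-describing ciphertext envelope; the field names and algorithm identifiers are illustrative, not a standard format:

```python
import base64
import json
import os

def wrap(ciphertext: bytes, *, kek_alg: str, dek_alg: str, key_id: str) -> str:
    """Store algorithm identifiers WITH the data, so a future migration
    can detect and re-encrypt old records. Illustrative schema."""
    envelope = {
        "v": 1,
        "kek_alg": kek_alg,   # e.g. "RSA-OAEP" today, "ML-KEM-768" later
        "dek_alg": dek_alg,   # e.g. "AES-256-GCM"
        "key_id": key_id,     # which wrapping key was used
        "ct": base64.b64encode(ciphertext).decode(),
    }
    return json.dumps(envelope)

def needs_migration(envelope_json: str, quantum_safe: set[str]) -> bool:
    """A migration job can scan storage and flag quantum-vulnerable records."""
    env = json.loads(envelope_json)
    return env["kek_alg"] not in quantum_safe

record = wrap(os.urandom(48), kek_alg="RSA-OAEP",
              dek_alg="AES-256-GCM", key_id="kek-2023-01")
print(needs_migration(record, quantum_safe={"ML-KEM-768"}))  # True
```

Without this metadata, the only way to find RSA-wrapped keys in a large datastore is archaeology; with it, migration becomes a query plus a re-wrap job.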
What about quantum key distribution (QKD)?
QKD is often mentioned in the same breath as PQC, but it’s a different category: it uses quantum physics to distribute keys with certain eavesdropping detection properties. It also requires specialized hardware and typically dedicated links.
For most organizations, QKD is not the practical answer to “quantum computing impact on cryptography.” PQC is software-deployable and fits existing network models. QKD may make sense in niche, high-assurance environments with controlled infrastructure, but it’s not a general replacement for internet cryptography.
Key Takeaways
- Quantum computers threaten RSA and ECC directly via Shor’s algorithm, which can recover private keys from public keys given a sufficiently capable machine.
- Symmetric crypto and hashes are not “broken,” but margins change; using AES-256 and appropriately strong hashes is a common conservative posture.
- “Harvest now, decrypt later” makes this a present-day risk for any data that must remain confidential for many years.
- Post-quantum cryptography (PQC) is the practical migration path, with NIST-standardized algorithms like ML-KEM (Kyber) and ML-DSA (Dilithium).
- Hybrid key exchange is a pragmatic bridge that reduces the risk of betting everything on either classical or PQC during the transition.
- Crypto agility is the real long-term defense: design systems so algorithms can be swapped without heroic rewrites.
Frequently Asked Questions
Will quantum computers break Bitcoin and other blockchains?
Many blockchains rely on ECDSA or EdDSA-style signatures, which are vulnerable to Shor’s algorithm if a public key is exposed. The practical risk depends on the chain’s address scheme, when public keys become visible, and how quickly the ecosystem can migrate to PQ signatures without fracturing consensus.
If I use TLS 1.3, am I safe from quantum attacks?
TLS 1.3 improves baseline security and typically uses forward secrecy, but it still commonly relies on ECDH and ECDSA today—both quantum-vulnerable. You’ll need PQC (often via hybrid key exchange) to address the quantum threat model.
Do I need to re-encrypt all my stored data right now?
Not automatically. Start by classifying data by required confidentiality lifetime and identifying where RSA/ECC protect stored keys (for example, envelope encryption with RSA-wrapped DEKs). Many organizations can stage migration by updating key management and handshake mechanisms first, then re-encrypting only the data that truly needs long-term protection.
Are post-quantum algorithms “proven safe”?
They’re based on problems believed to be hard for both classical and quantum computers, and they’ve undergone extensive public cryptanalysis—but “proven” is a high bar in cryptography. This is exactly why hybrid deployments and crypto agility matter: they let you adapt if assumptions change.
What should I ask vendors about quantum readiness?
Ask whether they support NIST-standard PQC algorithms, whether they have a roadmap for hybrid TLS/VPN modes, and whether cryptographic components are updatable independently of hardware refresh cycles. Also ask how they handle long-lived trust anchors (firmware signing keys, embedded certificates) and what migration tooling exists.
REFERENCES
[1] NIST, “FIPS 203: Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM).” https://csrc.nist.gov/pubs/fips/203/final
[2] NIST, “FIPS 204: Module-Lattice-Based Digital Signature Algorithm (ML-DSA).” https://csrc.nist.gov/pubs/fips/204/final
[3] NIST, “FIPS 205: Stateless Hash-Based Digital Signature Algorithm (SLH-DSA).” https://csrc.nist.gov/pubs/fips/205/final
[4] Google Security Blog, “Experimenting with Post-Quantum Cryptography.” https://security.googleblog.com/2016/07/experimenting-with-post-quantum.html
[5] NIST, “Post-Quantum Cryptography Project.” https://csrc.nist.gov/projects/post-quantum-cryptography
[6] RFC 8446, “The Transport Layer Security (TLS) Protocol Version 1.3.” https://www.rfc-editor.org/rfc/rfc8446