The "harvest now, decrypt later" framing is what finally made this click for me. Nobody cares about breaking today's SSH keys in 2035, but government cables or biotech IP? Yeah, that's getting stored.
One q: you mention hybrid encryption during transition. Isn't that gonna tank performance? Like, doubling the overhead for key exchange seems rough for latency-sensitive apps. Wondering if the practical move is just rip-the-bandaid and go full PQC once CRYSTALS libraries stabilize.
You're not doing two full TLS handshakes back-to-back; you derive one session key from both the classical and PQC exchanges. I agree that CPU cost goes up and packets get a bit bigger, but latency impact is usually modest unless you're operating at extreme scale or ultra-low latency. Classical crypto protects you if PQC has an unforeseen flaw; PQC protects you against "harvest now, decrypt later." For long-lived or high-value data, that risk tradeoff is worth a bit of overhead.
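To make "deriving a session key from both exchanges" concrete, here's a minimal sketch of the combine step. It assumes the two shared secrets already exist (here just random stand-in bytes; in practice they'd come from, e.g., an X25519 exchange and an ML-KEM decapsulation) and feeds their concatenation through HKDF, so the resulting key stays secure as long as either input secret is unbroken. This is an illustration of the idea, not any specific protocol's exact derivation.

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869) with SHA-256."""
    return hmac.new(salt or b"\x00" * 32, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    """HKDF-Expand (RFC 5869) with SHA-256."""
    okm, t, counter = b"", b"", 1
    while len(okm) < length:
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        okm += t
        counter += 1
    return okm[:length]

# Stand-ins for real key-exchange outputs (hypothetical values):
classical_secret = os.urandom(32)  # e.g. an ECDH/X25519 shared secret
pqc_secret = os.urandom(32)        # e.g. an ML-KEM shared secret

# Hybrid combine: concatenate both secrets, then run HKDF.
# An attacker must recover BOTH inputs to learn the session key.
prk = hkdf_extract(salt=b"", ikm=classical_secret + pqc_secret)
session_key = hkdf_expand(prk, info=b"hybrid-demo", length=32)
```

The expensive parts are the two key exchanges themselves; this combine step is a couple of hash calls, which is why the extra latency is mostly in the larger PQC handshake messages rather than the derivation.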
I agree with your instinct, though: if you control both ends, have short data lifetimes, and can tolerate some algorithm churn, "rip the bandaid and go full PQC" is totally reasonable once libraries stabilize.