Ask about cryptography in cybersecurity and one will likely hear "RSA," a public-key cryptography (PKC) system built on a simple premise: generating a private-public key pair from two distinct prime numbers is easy, whereas recovering those two primes from the public key (i.e., factoring the modulus) is hard.
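The asymmetry can be made concrete with a toy sketch of RSA key generation, assuming deliberately tiny primes for illustration (real deployments use primes of 1,024 bits or more; numbers this small are trivially factorable and never secure):

```python
from math import gcd

# Toy RSA with tiny primes -- illustrative only, never secure.
p, q = 61, 53              # the two secret primes
n = p * q                  # public modulus: 3233 (easy to compute)
phi = (p - 1) * (q - 1)    # Euler's totient: 3120
e = 17                     # public exponent, must be coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent: modular inverse of e

msg = 42
cipher = pow(msg, e, n)    # anyone with (e, n) can encrypt
plain = pow(cipher, d, n)  # only the holder of d can decrypt
assert plain == msg
```

Multiplying `p * q` is instant, but an attacker who sees only `n = 3233` must factor it to recover `d`; at real key sizes that factoring step is infeasible for classical computers, which is exactly the step a sufficiently large quantum computer running Shor's algorithm would make easy.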
But this premise has been threatened by the rise of quantum computers built by IBM, Google and D-Wave. While no quantum computer has yet demonstrably "cracked" RSA, agencies such as NIST have, since 2016, been searching for, evaluating and standardizing quantum-resistant public-key cryptographic schemes.
Even if the RSA algorithm as it stands today cannot be broken by a quantum computer, there is a real risk of threat actors stealing and storing encrypted data now in order to decrypt it later. This is known as "Harvest Now, Decrypt Later" or "Hack Now, Decrypt Later."
Harvest Now, Decrypt Later
Different stores of data have different levels of time-dependent sensitivity. Air traffic data, for instance, loses its sensitivity quickly (the location of a plane at 0000 hours on 23/12/2023 is of little use today). But some data remains sensitive for life, such as biometric minutiae: one cannot change one's biometric features, short of mutilation.
This has led regulators to push for urgent deployment of quantum-resistant algorithms, commonly referred to as post-quantum cryptography (PQC).
Selecting and Future-Proofing Quantum-Resistant Algorithms
In 2022, NIST announced four quantum-resistant algorithms. CRYSTALS-Kyber was selected for general encryption, while CRYSTALS-Dilithium, FALCON and SPHINCS+ were selected for digital signatures. One rationale behind NIST's choice was to draw algorithms from at least two different families of mathematical problems (lattices and hash functions) to mitigate the risk of an entire class of problems being broken by a yet-to-be-announced technology, such as a next-generation quantum computer. NIST is still evaluating additional candidates for general encryption that do not rely on lattice-based cryptography, to further mitigate this risk.
Engineering Challenges
One key concern raised by enterprises in response to the PQC risk is the timeline on which PQC threats can be realized. For instance, smart and connected vehicles that need to protect user safety and privacy will likely have lifespans extending at least two decades, into the 2040s. While no one can put a definitive date on when a quantum computer will "break" RSA, it is helpful context that such advancements may be possible within a decade: NIST believes a quantum computer capable of breaking 2,000-bit RSA in hours could be built by 2030 for about a billion dollars, whereas Cloudflare's interviews suggest 2037. This raises important engineering questions:
- What algorithms can we use that are quantum-resistant if our solutions are in use past 2030?
- Are these solutions implementable today?
Luckily, the PQC "apocalypse" does not affect all algorithms. Symmetric-key ciphers such as AES-GCM are considered "quantum-safe" as of this writing. But widely used PKC algorithms like RSA are not, and replacing them one-for-one may not be possible, since the proposed post-quantum algorithms have much larger key sizes and signature lengths.
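The size gap is easy to quantify. The figures below are approximate, taken from the algorithms' published NIST submission parameters, and the exact numbers vary by security level and parameter set:

```python
# Approximate public-key and signature sizes in bytes, per the
# NIST submission documents (parameter sets vary; treat as indicative).
sizes = {
    # name                  (public key, signature)
    "RSA-2048":                  (256,   256),
    "Dilithium2":               (1312,  2420),
    "Falcon-512":                (897,   666),
    "SPHINCS+-128s":              (32,  7856),
}
for name, (pk, sig) in sizes.items():
    print(f"{name:>14}: public key {pk:>5} B, signature {sig:>5} B")
```

A Dilithium2 signature is roughly nine times the size of an RSA-2048 signature, which matters for constrained protocols (TLS handshakes, embedded devices, certificate chains) where message sizes were tuned around classical algorithms.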
This presents risks for risk practitioners to assess.
How Is Risk Affected?
One obstacle to the replacement of PKC by PQC may be the lack of vendor support, but the transition is already happening. Vendors today are beginning the migration to PQC, with Dilithium well supported for signatures. As urgency grows, more vendors will anticipate the demand for PQC algorithms in their products.
Depending on the likely time sensitivity of stolen data vulnerable to "Harvest Now, Decrypt Later," the impact of a confidentiality loss will range from negligible to high.
One purported risk mitigation method from a vendor is microsharding, in which data is broken down into unintelligible fragments, called shards, that are stored across multiple locations. The shards are also mixed with poison data and scrubbed of sensitive metadata.
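A minimal sketch of the idea, with assumed shard size and a trivially simple poison-interleaving rule (commercial products keep the reassembly map secret, strip metadata and spread shards across physically separate stores):

```python
import secrets

def microshard(data: bytes, shard_size: int = 4) -> list[bytes]:
    """Split data into fragments and interleave random 'poison' shards.
    Conceptual sketch only; not a real microsharding product."""
    real = [data[i:i + shard_size] for i in range(0, len(data), shard_size)]
    sharded = []
    for fragment in real:
        sharded.append(fragment)
        sharded.append(secrets.token_bytes(shard_size))  # poison shard
    return sharded

def reassemble(shards: list[bytes]) -> bytes:
    # In this toy scheme, every other shard is poison.
    return b"".join(shards[::2])

shards = microshard(b"patient-record-0001")
assert reassemble(shards) == b"patient-record-0001"
```

The security argument is that no single stolen shard (or single storage location) yields intelligible data; the scheme's strength therefore depends on keeping the shard map and locations separate from the shards themselves.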
Another risk mitigation method is tokenization, in which sensitive data is exchanged for non-sensitive tokens. Tokens can be viewed as "placeholders," with a well-defined ruleset kept elsewhere to interpret what each placeholder means; the poker chips used by a casino are a good analogy.
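The placeholder idea can be sketched as follows; the `TokenVault` class and its token format are illustrative inventions, and production tokenization services use hardened, access-controlled vaults rather than an in-memory dictionary:

```python
import secrets

class TokenVault:
    """Toy tokenization vault: swaps sensitive values for random
    placeholder tokens, like casino chips standing in for cash."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random, reveals nothing
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with vault access can recover the original.
        return self._vault[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
assert vault.detokenize(t) == "4111-1111-1111-1111"
```

Because the token is random rather than derived from the data, a harvested token is worthless to an attacker without access to the vault, regardless of future advances in cryptanalysis.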
In Singapore (where I live), another risk mitigation method that has been proposed by a vendor is a “wrapper” around the underlying PKC algorithm to make it quantum-resistant, which is software-based and hence more cost-effective.
Beyond PQC, there have also been suggestions to move straight to schemes whose security rests on the laws of quantum physics, with keys distributed through quantum key distribution (QKD). But the National Security Agency has noted that QKD is not economical because it requires special-purpose equipment, and that QKD's security in practice is limited not by quantum physics but by hardware and engineering design. This makes QKD unlikely to become the mainstream mitigation for "Harvest Now, Decrypt Later" attacks.
Whatever solutions risk practitioners recommend, more vendors will begin to converge around various PQC standards to kick-start mass rollouts into enterprise environments. Businesses will do well to pay attention to some of the forward movers incorporating PQC algorithms to keep their data safe.