Monday, July 11, 2022

NIST's Post-Quantum Cryptography Work at a 'Critical Point'

The National Institute of Standards and Technology (NIST) last week revealed the first group of winners from its post-quantum cryptography competition. Now, the institute is looking to expand and diversify the selections and set the foundation for international standards.

The competition, which was initiated in 2016, attracted approximately 80 candidates. After three rounds, four algorithms were selected for post-quantum cryptographic standardization, and four additional algorithms will continue into the fourth round, Dustin Moody, post-quantum cryptography project lead at NIST, told SDxCentral.

“It’s very important because if you do not use these algorithms, then you’ll eventually be vulnerable to threats from quantum computers that would completely break some of the crypto-systems we use today,” Moody explained.

The announcement is “an important milestone in securing our sensitive data against the possibility of future cyberattacks from quantum computers,” Secretary of Commerce Gina Raimondo noted in a statement.

Analysts noted those potential threats highlight the importance of NIST’s work. “The race to protect the world’s most sensitive data from the quantum threat is at a critical point,” analysts at data analytics and consulting firm GlobalData noted.

“It’s early days in the development of meaningful applications for quantum computing, but the threat to existing encryption methods is well known,” GlobalData Associate Analyst Robert Penman said in a statement. “Governments worry about the capabilities of state-backed hackers, and the defense industry fears China’s growing technological prowess. That’s why the mammoth task of identifying and replacing soon-to-be-obsolete algorithms is already underway.”

NIST selected the CRYSTALS-Kyber algorithm for general encryption, used to protect information exchanged across a public network. For digital signatures, often used for identity authentication during a digital transaction or to sign a document remotely, NIST selected the CRYSTALS-Dilithium, FALCON, and SPHINCS+ algorithms. The first three algorithms are based on a family of math problems called structured lattices, while SPHINCS+ uses hash functions as a backup to avoid relying only on the security of lattices for signatures, NIST noted.

Moody said the lattice-based algorithms were the strongest family evaluated during the process. “They have very well studied security and their performance is excellent,” he explained. However, “if in the future someone discovers some brilliant new attack, we want to have other algorithms that don’t depend on lattices, that is why we selected SPHINCS+,” Moody added.

NIST also named four alternate algorithms to continue into the fourth round of the competition: BIKE, Classic McEliece, HQC, and SIKE. NIST plans to standardize one or two of those non-lattice-based algorithms, Moody noted. The agency also plans to call for new proposals for quantum-resistant public-key digital signature algorithms to diversify its portfolio, looking in particular for schemes with short signatures and fast verification.
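For readers who want a concrete picture of the “general encryption” role Kyber fills: Kyber is a key-encapsulation mechanism (KEM), meaning the two sides run a keygen / encapsulate / decapsulate exchange to agree on a shared symmetric key. The sketch below shows only that interface shape; the math inside it is classical Diffie-Hellman with a toy modulus rather than Kyber’s lattice construction, and the function names are illustrative, not any particular library’s API.

```python
import hashlib
import secrets

# Illustrative key-encapsulation mechanism (KEM) interface: keygen, encapsulate,
# decapsulate. The internals are classical Diffie-Hellman with a tiny demo
# modulus -- NOT post-quantum and NOT secure. A real deployment would keep the
# same three operations but back them with a scheme such as CRYSTALS-Kyber.
P = 2**127 - 1   # small Mersenne prime, demo only
G = 3

def keygen():
    """Recipient: produce a key pair. The public key is published."""
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return pk, sk

def encapsulate(pk):
    """Sender: produce a ciphertext plus the shared secret it transports."""
    eph = secrets.randbelow(P - 2) + 1
    ciphertext = pow(G, eph, P)
    shared = pow(pk, eph, P)
    return ciphertext, hashlib.sha256(shared.to_bytes(16, "big")).digest()

def decapsulate(sk, ciphertext):
    """Recipient: recover the same shared secret from the ciphertext."""
    shared = pow(ciphertext, sk, P)
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

pk, sk = keygen()
ct, sender_key = encapsulate(pk)
assert decapsulate(sk, ct) == sender_key   # both sides now hold one symmetric key
```

The symmetric key that falls out of decapsulate is what then protects the information exchanged across a public network.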
The hash-based digital signature scheme NIST selected “has pretty big key sizes and is pretty slow, so it’s unlikely that it would be able to be easily used in most applications,” Moody said. NIST is asking for public feedback on a version of SPHINCS+ with a lower maximum number of signatures, and it is also considering other constructions (a simplified hash-based scheme, sketched below, illustrates where those large sizes come from).

“Speed and size are issues and NIST may be actively looking for alternatives that will perform better in situations where memory, processing, or power are limited,” Forrester analysts wrote in a blog post.

Lattice-based schemes have been the most secure and performant candidates in the evaluation rounds to date, though Forrester speculates that “NIST may be hedging their bets and looking for alternatives in case advances in cryptanalysis threaten the viability of lattice cryptography — either through standard cryptanalysis or by finding a Shor’s-algorithm-like approach that can be performed by a quantum computer.” Shor’s algorithm, developed in 1994 by American mathematician Peter Shor, is a quantum algorithm for efficiently factoring large integers, which would break widely used public-key schemes such as RSA.

“There is no easy solution to the quantum challenge. That’s why NIST has kept its options open,” GlobalData Principal Analyst David Bicknell echoed in a statement. “Significantly, NIST is covering all bases by choosing a slower and larger solution as a backup simply because it is based on a different mathematical approach. NIST must look to the future and be flexible in its choice of algorithms in the face of future threats.”

NIST expects to publish its post-quantum cryptography standard by 2024. The institute will then recommend that the rest of the federal government use the standard. Certain industries typically follow those standards because they do business with the federal government, though Moody expects adoption to be voluntary.

Other organizations might wait for international or regional post-quantum cryptography standards to be released, whether from the International Organization for Standardization (ISO), the Internet Engineering Task Force (IETF), or regional standards bodies such as the American National Standards Institute (ANSI) and the European Telecommunications Standards Institute (ETSI). NIST works with those institutions.

“To a large degree, these other standards organizations were very happy with what we were doing at NIST and wanted to wait for our process to finish before they selected algorithms themselves,” Moody said, adding that he expects NIST’s selections to be adopted by the international standards institutions, which will likely add other algorithms from other countries.

“Our primary focus is the United States government,” Moody said. “But we do know that our algorithms get used around the world, so we want to make it as easy as possible for the algorithms to be adopted internationally as well.”
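To illustrate the size and speed trade-off Moody and the Forrester analysts describe, here is a minimal Lamport-style one-time signature, the simplest member of the hash-based family. This is a teaching sketch, not SPHINCS+ itself, which combines more refined one-time signatures with tree structures so one key can sign many messages; even this stripped-down version produces multi-kilobyte keys and signatures, and a key pair can safely sign only a single message.

```python
import hashlib
import secrets

# Lamport one-time signature: a minimal hash-based scheme, shown only to
# illustrate why hash-based signatures tend to be large. Teaching sketch,
# not SPHINCS+; a key pair here must never sign more than one message.
H = lambda data: hashlib.sha256(data).digest()

def keygen():
    """256 pairs of random secrets; the public key is their hashes."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return pk, sk

def message_bits(message):
    digest = H(message)
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(sk, message):
    """Reveal one secret from each pair, chosen by the message-digest bits."""
    return [sk[i][bit] for i, bit in enumerate(message_bits(message))]

def verify(pk, message, signature):
    """Hash each revealed secret and check it against the public key."""
    return all(H(sig) == pk[i][bit]
               for i, (bit, sig) in enumerate(zip(message_bits(message), signature)))

pk, sk = keygen()
sig = sign(sk, b"sign me once")
print(verify(pk, b"sign me once", sig))       # True
print(len(sig) * 32, "bytes per signature")   # 8192 bytes, before any tree structure
```

By comparison, the lattice-based signatures NIST selected are considerably smaller and faster to verify, which is part of why the institute’s call for additional proposals emphasizes short signatures and fast verification.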
