From Quantum Threats to Cryptographic Standards: Epistemic Communities at NIST and IETF

By Yug Desai
Introduction
Quantum computing is reshaping the cryptographic landscape, prompting a reexamination of longstanding encryption protocols. In August 2024, the U.S. National Institute of Standards and Technology (NIST) released the first Post-Quantum Cryptography (PQC) standards with much fanfare, ostensibly delivering a new line of defense against the risks posed by quantum computers to traditional encryption schemes. However, the story of PQC is less straightforward than it appears at first glance. The PQC algorithms selected by NIST emerged from an elaborate open competition, and their increased complexity compared to traditional counterparts means they cannot simply be dropped in place of existing algorithms without further adjustment. Much of the PQC-related work in the Internet Engineering Task Force (IETF)—the standards-developing organization responsible for core Internet standards—has focused on upgrading existing encryption schemes to be quantum-resistant. My research investigates how these standard-setting organizations negotiate technical uncertainties to produce authoritative PQC standards and deploy institutional mechanisms to coordinate dispersed experts, ultimately nurturing epistemic communities.
Employing the concept of epistemic communities—knowledge-based networks united by shared beliefs and notions of validity—I examine how SDOs like NIST and the IETF navigate these challenges. The study primarily draws on an analysis of documents and external communications by NIST and IETF working groups.
NIST
NIST is an agency of the U.S. Department of Commerce and functions as an R&D organization for the U.S. Federal Government, focusing on enhancing innovation and industrial competitiveness. Although it is not a typical SDO, standards coordination is one of its key tasks, achieved through a ‘scientific consensus approach’ involving extensive consultations with experts. This approach not only addresses technical challenges but also brings to light the policy implications and organizational dynamics integral to the standardization process.
The Information Technology Laboratory at NIST has long been responsible for research on IT challenges and for developing standards for federal information systems. NIST has produced some of the most successful encryption standards—such as the Data Encryption Standard, the Advanced Encryption Standard, and the Secure Hash Algorithms—through open and collaborative competitions. Recognizing the value of involving various stakeholders and the global cryptographic community early on, NIST employed a similar model for the PQC standardization project, issuing an open call for proposals in 2016 that drew 82 submissions.
Finalists were selected iteratively, through public discussions on the project mailing list and community cryptanalysis across successive rounds of standardization. While the selection criteria and the final decision rested with NIST, the community provided essential security proofs and input on attack vectors. Security, cost, and performance were the primary criteria initially laid down by NIST; however, additional criteria were incorporated into the discussion following community feedback. Thanks to NIST’s outreach to the cryptographic community and other SDOs, the process enjoys considerable legitimacy and support despite the uncertainties surrounding post-quantum cryptography and the robustness of the proposed methods.
Using a step-by-step approach that involves open mailing list exchanges, rounds of public cryptographic evaluation, and extensive expert input, NIST addresses technical uncertainties through a transparent dialogue that refines its PQC framework. This approach facilitates collaboration among specialists, whose collective evaluations draw individually identified problems into a consistent and authoritative standard-setting process.
IETF
Selecting and standardizing PQC algorithms is an important first step in the transition to secure, quantum-resistant communication; however, considerable challenges remain in successfully migrating critical applications and infrastructure to become ‘quantum-proof.’ Existing Internet protocols and infrastructure were developed with the properties of traditional cryptographic schemes in mind—such as key and signature sizes and computational complexity. The IETF, responsible for developing and maintaining core Internet standards, adheres to principles of open process, technical competence, volunteer participation, rough consensus, and running code. Membership is on an individual basis, and decision-making within the IETF is diffused among its participants.
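To make these differences concrete, here is a minimal sketch in Python comparing the size of a classical X25519 key-exchange share with the ML-KEM-768 encapsulation key and ciphertext standardized in FIPS 203; the figures are the published parameter sizes, while the combined ‘hybrid key share’ is a simplified illustration rather than the layout of any specific protocol message.

    # Illustrative comparison of public-value sizes (in bytes) between a
    # classical key exchange and a NIST-standardized post-quantum KEM.
    X25519_PUBLIC_KEY = 32          # classical elliptic-curve Diffie-Hellman share
    ML_KEM_768_PUBLIC_KEY = 1184    # ML-KEM-768 encapsulation key (FIPS 203)
    ML_KEM_768_CIPHERTEXT = 1088    # ML-KEM-768 ciphertext (FIPS 203)

    # A hybrid key share that carries both public values grows the first
    # flight of a handshake by more than a kilobyte over the classical case.
    hybrid_key_share = X25519_PUBLIC_KEY + ML_KEM_768_PUBLIC_KEY
    print(f"classical key share: {X25519_PUBLIC_KEY} bytes")
    print(f"hybrid key share:    {hybrid_key_share} bytes")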
The IETF maintains a large body of standards that are susceptible to cryptanalysis by quantum computers and would require modifications to accommodate the properties of PQC algorithms and manage the resulting trade-offs. This task is distributed among the various working groups from which these standards originated. The Post-Quantum Use in Protocols (PQUIP) working group was established to coordinate the transition within the IETF. Its objective is to consolidate PQC activities, share operational and engineering practices, and thereby generate validated knowledge for application across the IETF. Transitioning to PQC is unlike past migrations because no Cryptographically Relevant Quantum Computers (CRQCs) exist yet. Additionally, PQC remains an emerging sub-field, and its algorithms cannot be fully trusted to be immune to future quantum attacks. Drafts and discussions in PQUIP have therefore focused on hybrid approaches that combine traditional and post-quantum methods, on the security properties of different algorithms and of the hybrid constructions, and on the trade-offs that have to be weighed. There is also considerable emphasis on cryptographic agility, since drop-in replacements may be needed if an algorithm is broken. Through these deliberations, the working group is building a consensus-based understanding of the nuances of PQC migration and developing guidance for the wider community.
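As a rough illustration of the hybrid idea, the sketch below derives a single session key from two shared secrets, one from a traditional key exchange and one from a post-quantum KEM, so that the result stays secure as long as either component does. The function names and the choice of HKDF with SHA-256 are assumptions made for this example; actual IETF hybrid constructions define their own concatenation order, context labels, and key schedules.

    import hashlib
    import hmac
    import os

    def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
        # HKDF-Extract step (RFC 5869) using HMAC-SHA-256.
        return hmac.new(salt, ikm, hashlib.sha256).digest()

    def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
        # HKDF-Expand step (RFC 5869) using HMAC-SHA-256.
        okm, block, counter = b"", b"", 1
        while len(okm) < length:
            block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
            okm += block
            counter += 1
        return okm[:length]

    def combine_hybrid_secrets(classical_ss: bytes, pq_ss: bytes, context: bytes) -> bytes:
        # Concatenate both shared secrets and run them through a KDF so the
        # derived key remains safe while at least one component is unbroken.
        prk = hkdf_extract(salt=b"\x00" * 32, ikm=classical_ss + pq_ss)
        return hkdf_expand(prk, info=b"hybrid-kem-demo " + context, length=32)

    # Placeholder values standing in for the outputs of, e.g., an ECDH
    # exchange and a post-quantum KEM decapsulation (hypothetical inputs).
    classical_shared_secret = os.urandom(32)
    pq_shared_secret = os.urandom(32)
    session_key = combine_hybrid_secrets(classical_shared_secret, pq_shared_secret, b"example")
    print(session_key.hex())

In this framing, cryptographic agility amounts to being able to swap out either component, or the combiner itself, without redesigning the surrounding protocol.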
The IETF, with its volunteer-driven and decentralized structure, relies on consensus-building through open discussions and specialized groups like PQUIP that focus on coordination and guidance. This approach not only produces trusted standards through continuous, community-based review but also creates an environment where the ongoing exchange of ideas strengthens the system’s ability to adapt to emerging cryptographic challenges.
Future Work
Both NIST and the IETF are continuing to develop new standards. NIST’s PQC standardization project is ongoing, with additional standards expected in the future, potentially offering different properties suited to other sets of use cases. Now that NIST has released its initial standards, various working groups are discussing the application of these algorithms to existing standards in earnest. PQUIP continues to refine the guidance and terminology needed to coordinate PQC transition efforts across the IETF. It will be important to keep monitoring these processes to gather insights as interest in these technologies increases and PQC migration gains momentum.
The draft paper can be found at this link.