Back Matter
  • 1 https://isni.org/isni/0000000404811396, International Monetary Fund

Annex I. Glossary of Technical Terms Used in the Paper

Cryptanalysis studies the encrypted secret message (ciphertext) to gain as much information as possible about the original message.

Cryptography is the science of transmitting secret information using public channels. A cryptologic system performs transformations on a message, the plaintext, and uses a key to render it unintelligible, producing a new version of the message, the ciphertext. To reverse the process, the system performs inverse transformations to recover the plaintext, decrypting the ciphertext (Dooley, 2018).

Cryptographic agility (or crypto agility) is the property that permits changing or upgrading cryptographic algorithms or parameters. While not specific to quantum computing, crypto agility would make defense against quantum computers easier by allowing substitution of today’s quantum-vulnerable public-key algorithms with quantum-resistant algorithms.

HTTPS (Hypertext Transfer Protocol Secure) is a Web communication protocol used between network devices for secure communication. It encrypts both the information a user sends to a website, and the information that the website sends back—for example, credit card information, bank statements, and e-mail.

Quantum annealing is a process for finding the global minimum of a given objective function over a given set of candidate solutions (candidate states) using quantum fluctuations. It finds an absolute minimum size, length, cost, or distance within a possibly very large, but nonetheless finite, set of possible solutions using quantum-fluctuation-based computation instead of classical computation.

Quantum computing is the use of a non-classical model of computation. Whereas traditional models of computing such as the Turing machine or Lambda calculus rely on classical representations of computational memory, a quantum computation could transform the memory into a quantum superposition of possible classical states. A quantum computer is a device that could perform such computation.

Quantum entanglement is the observed physical phenomenon that occurs when a pair or group of particles is generated, interacts, or shares spatial proximity in such a way that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance.

A quantum gate is a basic quantum circuit operating on a small number of qubits. Quantum gates are the building blocks of quantum circuits, just as classical logic gates are for conventional digital circuits.

Quantum key distribution (QKD) is a secure communication method that implements a cryptographic protocol involving components of quantum mechanics. It enables two parties to produce a shared random secret key known only to them, which can then be used to encrypt and decrypt messages.

Quantum mechanics (also known as quantum physics, quantum theory, the wave mechanical model, or matrix mechanics) is a fundamental theory in physics that describes nature at the smallest scales, including the atomic and subatomic scales.

Quantum superposition is a fundamental principle of quantum mechanics, where a system is in more than one state at a time. It states that, much like waves in classical physics, any two (or more) quantum states can be added together (“superposed”) and the result will be another valid quantum state; and conversely, that every quantum state can be represented as a sum of two or more other distinct states.

Quantum “supremacy” is demonstrating that a programmable quantum device can solve a problem that classical computers practically cannot (irrespective of the usefulness of the problem). By comparison, the weaker quantum advantage is demonstrating that a quantum device can solve a problem faster than classical computers. Using the term “supremacy” has been controversial, and quantum advantage is now often used for both descriptions.16

Qubit or quantum bit is the basic unit of quantum information. It is the quantum version of the classical binary bit. A qubit is a two-state (or two-level) quantum-mechanical system, one of the simplest quantum systems displaying the peculiarities of quantum mechanics: it can be in a coherent superposition of both states (levels) simultaneously, a property that is fundamental to quantum mechanics and quantum computing.
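In the standard Dirac notation (a textbook sketch, not drawn from the paper itself), a qubit's state can be written as a weighted combination of the two classical levels:

```latex
% A qubit is a unit vector in a two-dimensional complex state space:
\[
  \lvert \psi \rangle \;=\; \alpha \,\lvert 0 \rangle + \beta \,\lvert 1 \rangle,
  \qquad \alpha, \beta \in \mathbb{C},
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
\]
```

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|²; a classical bit is the special case in which one of the two amplitudes is zero.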

Symmetric-key cryptography is an approach in which the same key is used to both encrypt and decrypt a message. Asymmetric cryptography instead uses a pair of related keys: one is used to encrypt a payload and the other to decrypt it. In public-key cryptography, users publish one of the keys, the public key, and keep the other secret, the private key. The public key is used to encrypt a message, and the private key is needed to decrypt it.
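The public/private key relationship can be illustrated with textbook RSA (an illustrative sketch, not part of the paper; the primes are deliberately tiny and would offer no real security, whereas deployed keys are 2,048 bits or more):

```python
def toy_rsa_demo():
    """Textbook RSA with tiny primes -- for illustration only."""
    p, q = 61, 53              # secret primes
    n = p * q                  # public modulus (3233)
    phi = (p - 1) * (q - 1)    # Euler's totient of n
    e = 17                     # public exponent, coprime with phi
    d = pow(e, -1, phi)        # private exponent: modular inverse of e
    m = 65                     # a message encoded as a number < n
    c = pow(m, e, n)           # encrypt with the public key (e, n)
    recovered = pow(c, d, n)   # decrypt with the private key (d, n)
    return m, c, recovered
```

The scheme's security rests on the difficulty of recovering p and q from the public n, which is exactly the factoring problem quantum computers threaten.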

Annex II. A Brief History of Encryption, Cryptanalysis, and Digital Computers

Encryption and Cryptanalysis

Since ancient times, cryptography has been a race between those trying to keep secrets and adversaries trying to uncover them. The earliest examples of transposition ciphers go back to at least 485 B.C., when Greek soldiers would wrap a strip of papyrus around a staff, a scytale, write a message down its length, and send off the papyrus. The receivers could unscramble messages by wrapping them around another scytale of the same thickness; in this case, the staff’s shape represented the encryption key. The first known historical record of a substitution cipher is from the Roman Empire: Julius Caesar is believed to have sent encrypted messages to the orator Cicero, replacing each letter with the letter three places down the alphabet. The Caesar cipher was broken as early as the 7th century by Arab cryptographers, who documented the techniques of cryptanalysis, the science of undoing ciphers (Singh, 1999). In “A Manuscript on Deciphering Cryptographic Messages,” the philosopher al-Kindi observed that every language has a characteristic frequency of letters and sequences, and that by capturing them from sample texts of that language, a cryptanalyst might decipher any message.
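Both ideas fit in a few lines of illustrative code (a sketch, not from the paper): the Caesar shift, and al-Kindi's observation that a simple substitution preserves letter frequencies:

```python
from collections import Counter

def caesar(text: str, shift: int) -> str:
    """Shift each letter `shift` places down the alphabet (Caesar used 3)."""
    return ''.join(
        chr((ord(ch) - ord('A') + shift) % 26 + ord('A')) if ch.isalpha() else ch
        for ch in text.upper()
    )

def letter_counts(text: str) -> Counter:
    """al-Kindi's insight: a substitution cipher preserves letter frequencies,
    so counting ciphertext symbols reveals the underlying language's pattern."""
    return Counter(ch for ch in text.upper() if ch.isalpha())
```

Decryption is simply encryption with the negated shift, which is why knowing the method is enough to break the cipher once frequencies betray the shift.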

Simple substitutions became obsolete in the 1700s because of the proliferation of Black Chambers—offices kept by European nations for breaking ciphers and gathering intelligence. As Black Chambers industrialized cryptanalysis, cryptographers were forced to adopt more elaborate substitutions by turning to polyalphabetic methods. Instead of referring to a single alphabet for encryption, cryptographers would switch among several alphabets when choosing replacement symbols. The Vigenère cipher, believed to be the first polyalphabetic method and also called Le Chiffre Indéchiffrable, was first described in 1553 and remained popular until it was broken in the 19th century.
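The polyalphabetic idea can be sketched as follows (illustrative code handling letters only; each key letter selects the alphabet used at that position, cycling through the key):

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Polyalphabetic substitution: the i-th key letter picks the shift
    applied to the i-th plaintext letter, cycling through the key."""
    text, key = text.upper(), key.upper()
    out = []
    for i, ch in enumerate(text):
        shift = ord(key[i % len(key)]) - ord('A')
        if decrypt:
            shift = -shift
        out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
    return ''.join(out)
```

Because repeated plaintext letters map to different ciphertext letters, simple frequency counting no longer works; 19th-century attacks instead exploited the repeating key length.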

World War I intensified the need for secrecy. The radio had brought new capabilities to the field, such as the coordination of troops at a long distance. However, open waves also allowed enemies to listen to communications. Each nation used its own encryption methods. Some, like the Playfair cipher used by the British, remained unbroken during the war; others, like the German ADFGVX, were broken. In the period following World War I, machines became the logical solution to the increase in the volume of material to decrypt. Several mechanical cryptographic devices were invented in the period preceding World War II, such as the M-94 cipher device used by the US military; the C-36 by the French Army; and the Enigma by the German Army (Dooley, 2018). Several devices were also invented to break their encryption. To break Enigma, Alan Turing—one of the inventors of the digital computer—created the Bombe machines for the British secret operations center. Colossus, the first programmable electronic computer, enabled the British to break the Lorenz cipher, which protected communications of the German high command. The US Navy built fully automatic analog machines to break the cipher of Japan’s Purple device.

After World War II, digital computers came to dominate cryptography. Whereas mechanical devices are subject to physical limitations, computers operate at a much higher speed and scramble numbers, not letters, giving access to a large set of new operations. At the beginning of the 1960s, the transistor replaced the vacuum tube in digital circuits for computers and, at the end of that decade, the Internet was invented, kick-starting the current digital age. By the early 1970s, computers became available to business customers, who demanded secrecy capabilities from vendors. As regular citizens became computer users, cryptography became necessary, for instance, to enable credit card transactions or the transmission of personal information through public networks. A plethora of new cryptographic schemes appeared, leading the American National Bureau of Standards to intervene in 1973 and open a public competition to choose a cryptographic standard for the United States. IBM’s Lucifer cipher, renamed the Data Encryption Standard (DES), was adopted as America’s official standard in 1977. After DES was broken in a public competition in 1997, it was replaced as the standard by Triple-DES in 1999 and retired when NIST adopted the Advanced Encryption Standard (AES) in 2001.

Until the mid-1970s, all cryptographic methods used symmetric keys: the same key is used both to encrypt and to decrypt a message. Thus, to use cryptography, senders and receivers had to share keys in advance, a complicated matter of logistics. Whitfield Diffie, Martin Hellman, and Ralph Merkle solved the problem in 1976. The Diffie-Hellman key exchange allowed two parties to agree on a secret key using a public channel. The trio effectively created asymmetric cryptography, whereby operations are associated with a pair of related keys: when one is used to encrypt a payload, the other decrypts it, and vice versa. Two years later, Rivest, Shamir, and Adleman extended the concept with public-key cryptography, whereby users publish one of the keys, the public key, and keep the other secret, the private key. Asymmetric methods enabled new applications. For instance, people may claim their identity by showing a plaintext message and the cipher produced by their private key, which can be verified by decrypting the cipher using their public key. Asymmetric cryptography (including RSA), also known as public-key cryptography, is widely used over the Internet, including by the financial system, for key exchanges, digital signatures, non-repudiation, and authentication. Public and private keys also underpin digital currencies and blockchain technologies.
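The key-exchange idea reduces to modular arithmetic (an illustrative sketch, not from the paper; real deployments use much larger, carefully chosen parameters):

```python
import secrets

def diffie_hellman(p: int, g: int):
    """Each party keeps a private exponent and publishes g^x mod p.
    Both then compute the same shared secret without ever transmitting it."""
    a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
    b = secrets.randbelow(p - 2) + 1   # Bob's private exponent
    A = pow(g, a, p)                   # Alice's public value, sent openly
    B = pow(g, b, p)                   # Bob's public value, sent openly
    return pow(B, a, p), pow(A, b, p)  # each side's view of the shared secret

# Small demonstration prime: an eavesdropper seeing only p, g, A, and B
# would have to solve a discrete logarithm to recover the secret.
s_alice, s_bob = diffie_hellman(p=4294967291, g=5)
```

Both sides compute g^(ab) mod p, so the results agree even though neither exponent ever crossed the public channel.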

Asymmetric or public-key cryptography is the most vulnerable to quantum computing. The potential advantages of quantum computers became apparent in the early 1980s, when Richard Feynman pointed out essential difficulties in simulating quantum mechanical systems on classical computers and suggested that building computers based on the principles of quantum mechanics would allow us to avoid those difficulties (Nielsen, 2010). The idea was refined throughout the 1980s. In 1994, Peter Shor published an algorithm that would allow one to perform prime factorization much faster by using quantum properties. As the difficulty of factoring products of large primes lies at the core of most asymmetric cryptography methods, Shor’s algorithm run on a quantum computer might render most Internet security invalid.
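The number theory behind Shor's algorithm can be illustrated classically (a sketch, not from the paper): find the period r of a^x mod N, then gcd(a^(r/2) ± 1, N) yields factors of N. The quantum speedup comes entirely from finding r exponentially faster than the naive search below:

```python
import math

def factor_via_period(N: int, a: int):
    """Classical (exponential-time) stand-in for Shor's period finding."""
    if math.gcd(a, N) != 1:
        return None  # a shares a factor with N; trivial case
    r = 1
    while pow(a, r, N) != 1:   # naive search for the period of a^x mod N
        r += 1
    if r % 2:
        return None            # odd period: retry with a different a
    x = pow(a, r // 2, N)
    return sorted((math.gcd(x - 1, N), math.gcd(x + 1, N)))
```

For N = 15 and a = 7 the period is 4, and the two gcd computations recover the factors 3 and 5; for 2,048-bit moduli, the classical period search is hopeless while a large fault-tolerant quantum computer would not be.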

While quantum computing poses a threat to Internet security, quantum mechanics can also provide unbreakable cryptography. In the 1980s, researchers from IBM proposed a novel way to leverage photon polarization to perform key distribution. By using the laws of physics, Quantum Key Distribution (QKD) can become impenetrable because eavesdroppers cannot intercept communications without interfering with them. Such experimental systems have been implemented since the 1990s, but they are very far from commercial use.

Digital Computers

The origin of classical computers may be traced to 17th century France. In the small town of Clermont-Ferrand, Blaise Pascal built the first machine that enabled humanity to manipulate numbers by mechanically performing the four basic arithmetic operations. Human ability to do math was enhanced again in 1822 by the English polymath Charles Babbage’s Difference Engine. It could tabulate polynomial functions, which enabled the mechanical approximation of complex calculations such as logarithmic or trigonometric functions. Babbage also designed a general-purpose computer, the Analytical Engine. However, the project was terminated due to engineering and funding issues, and a working engine was never built in Babbage’s lifetime. The next notable machines in history were differential analyzers, analog computers that used wheel-and-disc mechanisms to integrate differential equations. The first differential analyzer, built at MIT by Vannevar Bush in 1931, played a particularly important role in history by inspiring one of Bush’s graduate students, Claude Shannon. In 1938, Shannon invented digital circuits in his master’s thesis (Shannon, 1938), proving that complex mathematical operations may be performed by running electricity through specific configurations of electronic components.

Shannon’s work was complemented by Alan Turing’s theoretical work. It came as an answer to a challenge posed by David Hilbert in the previous decade, the Entscheidungsproblem, or decision problem: mathematicians should search for an algorithm to determine whether any given statement is provable within a formal system. The Turing Machine was an imaginary device composed of a mechanism that moves an infinite tape back and forth, writes symbols to it, and reads recorded symbols. The Church-Turing thesis then states that this device can compute any function on natural numbers for which there is an effective method of obtaining its value and, conversely, that such a method exists only if the device can compute that function.

Thus, engineering met mathematics: by the time Claude Shannon invented digital circuits, Turing had just designed the mathematical blueprint of a general-purpose computer. The resulting machines, Turing-complete digital computers, are capable of computing every function the imaginary machine can compute. While the Colossus, a war secret built by British intelligence to break Hitler’s communications, was the first in history, modern computers are based on the architecture designed by a team led by John von Neumann and first used in 1949’s EDVAC (Electronic Discrete Variable Automatic Computer). Contemporary digital devices are Turing-complete devices generally composed of processing units (e.g., CPU), storage devices (e.g., RAM/ROM and disk drives), and input and output mechanisms (e.g., keyboard and video). Desktop computers and smartphones follow this same design.

Once the design was invented, engineering advanced enormously in speeding up each of its components. For instance, vacuum tubes were prominent components of CPUs in early machines, needed for their singular capacity to control the direction of the flow of electrons through their terminals. However, tubes presented several challenges related to durability and reliability. They were replaced by transistors, invented in the 1940s, which in turn were replaced by integrated circuits throughout the 1960s. Since then, the performance and size of digital computers have been dictated by integrated-circuit fabrication technology. Since the 1960s, such technology has allowed the number of components on a single integrated circuit to double every 18 months, as foreseen by Intel’s Gordon Moore in 1965—the so-called Moore’s law. This advance, for instance, is the reason all the computing power used in the Apollo 11 lunar landing capsule in 1969 could be crammed into a single device by the early 2010s. Similar leaps occurred for other components, spawning things like paper-thin foldable displays and pinhead-sized devices that can store entire encyclopedias.

However, since such machines are Turing machines at their core, they are also bound by the Turing machine’s limitations. One of these is their inability to efficiently solve certain mathematical problems, the so-called NP-hard problems. The most infamous of them is the traveling sales agent problem—calculating the shortest route through a series of cities while visiting each exactly once. Digital computers can calculate solutions for small setups, roughly by comparing all possible paths to each other. Because the number of paths explodes as the problem grows, mathematicians have invented heuristic algorithms for finding reasonable solutions without going through all possibilities, but there is no certainty that the optimal path will be found.
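The brute-force approach described above can be sketched in a few lines (illustrative code): fix the starting city and compare every ordering of the rest, of which there are (n-1)! for n cities.

```python
import itertools
import math

def shortest_tour(cities):
    """Exhaustive search for the shortest round trip: always correct,
    but the factorial number of orderings makes it infeasible beyond
    a handful of cities."""
    def length(order):
        return sum(math.dist(order[i], order[(i + 1) % len(order)])
                   for i in range(len(order)))
    start, rest = cities[0], list(cities[1:])
    best = min(((start,) + perm for perm in itertools.permutations(rest)),
               key=length)
    return best, length(best)
```

With 10 cities there are 362,880 orderings; with 30 cities there are more orderings than atoms in the observable universe, which is why heuristics are used instead.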

As many NP-hard problems can be reduced to the traveling sales agent problem, unlocking its solution would set in motion a whole new universe of possibilities for optimization. This is the key held by quantum computers.

Annex III. Modern Cryptographic Algorithms and Their Vulnerabilities to Current Technologies

Today’s cryptography is based on three main types of algorithms: symmetric keys, asymmetric (public) keys, and algorithmic hash functions, or hashing. Annex IV lists the main current and past algorithms.

The AES algorithm is currently the accepted standard for symmetric-key encryption. NIST selected it in 2001 to replace the former standard (Triple-DES). Although multiple publications have introduced new cryptanalysis schemes attempting to undermine AES, the cryptographic community has proven them ineffective. For example, Biryukov and others (2010) outlined an effective attack against specific variations of AES that reduces the encryption strength. However, such attacks were deemed impractical and dismissed as a non-threat to the AES encryption algorithm.

The RSA algorithm, a popular standard for asymmetric (public-key) encryption, is widely used to protect confidentiality and to produce digital signatures. The RSA algorithm has been resilient to cryptanalysis techniques since its publication in 1977, despite several attempts to challenge its strength. It was suggested early on that some knowledge of the plaintext message, under specific conditions, could weaken the encryption (Durfee, 2002). However, the RSA algorithm continues to be resilient. Although some schemes may reduce the time and memory required to break public-key encryption, adequate key sizes and best practices have so far kept public-key cryptography resilient to classical computer attacks. It would take a digital computer billions of years to break the current standard RSA 2,048-bit key (CISA, 2019).

Algorithmic hash functions were temporarily impacted by cryptanalysis, but recent progress restored their effectiveness. In 2005, the mathematician Lenstra demonstrated a hash-collision attack17 against one of the most widely used hashing functions, MD5 (Lenstra et al., 2005). Other researchers later demonstrated that a decent desktop computer equipped with a cheap graphics processor (GPU) could find a hash collision in less than a minute. The MD5 algorithm was officially retired by NIST in 2011. However, it is still widely used despite its known weaknesses, demonstrating the long-lasting difficulty of replacing legacy systems. NIST ran a competition to create the next standard algorithmic hash function, named SHA-3, to overcome the cryptanalysis advances undermining MD5 and the earlier versions of the SHA algorithms. While there are some possible weaknesses,18 SHA-3 was selected in 2015 and became the approved standard (Morawiecki et al., 2014). Furthermore, almost any cryptographic algorithm can be strengthened by increasing its key sizes, but that requires more processing power and thus increases the cost of running the algorithm, often making it prohibitively expensive.
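Both the retired and the current standard hash functions ship with Python's standard library, which makes the contrast easy to demonstrate (illustrative code, not from the paper):

```python
import hashlib

message = b"example payload"

# MD5: retired by NIST in 2011; collisions can be found in under a
# minute on commodity hardware, so it must not be used for security.
md5_digest = hashlib.md5(message).hexdigest()        # 128-bit digest

# SHA-3 (based on Keccak): the standard approved by NIST in 2015.
sha3_digest = hashlib.sha3_256(message).hexdigest()  # 256-bit digest
```

A hash function is deterministic, so any two parties hashing the same bytes get the same digest; a collision attack looks for two different inputs that nevertheless share one.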

Beyond the encryption algorithm itself, a different class of attacks targets the surrounding systems. Side-channel attacks target the software, firmware, and hardware used to implement the encryption algorithm. Software and hardware vulnerabilities are usually easier to find and exploit than the underlying mathematical techniques of the encryption algorithm. Vulnerabilities, or bugs, are the result of implementation mistakes made during development. However, some vulnerabilities may be the result of misuse or misconfiguration of cryptographic libraries. The Heartbleed vulnerability (CMU, 2014) was a devastating example, discovered in OpenSSL, a widely used cryptographic library for securing network communication. Lazar and others (2014) reported that 17 percent of the vulnerabilities in cryptographic libraries published by CVE19 between 2011 and 2014 were mistakes made during development, while the remaining 83 percent were related to misuse or misconfiguration by the hosting applications.

Annex IV. Main Cryptographic Algorithms

[Tables of the main current and past cryptographic algorithms appear as images in the original publication and are not reproduced here.]

References

  • Allen, Bryce D. 2008. “Implementing several attacks on plain ElGamal encryption.”—Graduate Theses and Dissertations, 11535. Mimeo available at https://lib.dr.iastate.edu/etd/11535.

  • Anschuetz, E., Olson, J., Aspuru-Guzik, A., and Cao, Y. 2019. “Variational quantum factoring.”—In International Workshop on Quantum Technology and Optimization Problems, pp. 74–85. Springer, Cham.

  • Arute, F., Arya, K., Babbush, R., et al. 2019. “Quantum supremacy using a programmable superconducting processor.”—Nature 574, 505–510. https://www.nature.com/articles/s41586-019-1666-5#citeas.

  • Bertoni, Guido, Joan Daemen, Michaël Peeters, and Gilles Van Assche. 2007. “Sponge Functions.”—ECRYPT Hash Workshop 2007. https://www.researchgate.net/publication/242285874_Sponge_Functions.

  • Biryukov, Alex, Orr Dunkelman, Nathan Keller, Dmitry Khovratovich, and Adi Shamir. 2010. “Key Recovery Attacks of Practical Complexity on AES-256 Variants with up to 10 Rounds.”—Advances in Cryptology – EUROCRYPT 2010, pp. 299–319. https://link.springer.com/chapter/10.1007/978-3-642-13190-5_15.

  • Boaron, Alberto, Gianluca Boso, Davide Rusca, Cédric Vulliez, Claire Autebert, Misael Caloz, Matthieu Perrenoud, Gaëtan Gras, Félix Bussières, Ming-Jun Li, Daniel Nolan, Anthony Martin, and Hugo Zbinden. 2018. “Secure quantum key distribution over 421 km of optical fiber.”—July 9, 2018. Mimeo available at https://arxiv.org/pdf/1807.03222.pdf.

  • Bouland, Adam, Wim van Dam, Hamed Joorati, Iordanis Kerenidis, and Anupam Prakash. 2020. “Prospects and Challenges of Quantum Finance.”—https://arxiv.org/pdf/2011.06492.pdf.

  • Burr, William, and Kathy Lyons-Burke. 1999. “Public Key Infrastructures for the Financial Services Industry.”—Mimeo. National Institute of Standards and Technology.

  • Cade, Chris, Lana Mineh, Ashley Montanaro, and Stasja Stanisic. 2020. “Strategies for solving the Fermi-Hubbard model on near-term quantum computers.”—Physical Review B.

  • CISA. 2019. “Understanding Encryption.”—CISA, August 2019. Mimeo available at https://www.nd.gov/itd/sites/itd/files/legacy/alliances/siec/CISA%20Encryption%2028AUG19.pdf.

  • CMU. 2014. “OpenSSL TLS heartbeat extension read overflow discloses sensitive information.”—CERT Coordination Center. Mimeo available at https://www.kb.cert.org/vuls/id/720951/.

  • Diffie, Whitfield, and Martin Hellman. 1976. “New Directions in Cryptography.”—IEEE Transactions on Information Theory, Vol. IT-22, No. 6, November 1976. https://ee.stanford.edu/~hellman/publications/24.pdf.

  • De Feo, Luca, David Jao, and Jerome Plut. 2011. “Towards Quantum-Resistant Cryptosystems from Supersingular Elliptic Curve Isogenies.”—Mimeo available at https://eprint.iacr.org/2011/506.pdf.

  • Dobraunig, Christoph, Maria Eichlseder, and Florian Mendel. 2016. “Analysis of SHA-512/224 and SHA-512/256.”—Advances in Cryptology – ASIACRYPT 2015, pp. 612–630. https://link.springer.com/chapter/10.1007%2F978-3-662-48800-3_25.

  • Dooley, J.F. 2018. “History of Cryptography and Cryptanalysis: Codes, Ciphers, and Their Algorithms.”—Springer.

  • Durfee, Glenn. 2002. “Cryptanalysis of RSA Using Algebraic and Lattice Methods.”—Stanford University. Mimeo available at http://theory.stanford.edu/~gdurf/durfee-thesis-phd.pdf.

  • EFF. 1998. “Cracking DES: Secrets of Encryption Research, Wiretap Politics, and Chip Design.”—The Electronic Frontier Foundation (EFF), distributed by O’Reilly & Associates, Inc. https://archive.org/details/crackingdes00elec.

  • Egger, D. J., et al. 2020. “Quantum Computing for Finance: State-of-the-Art and Future Prospects.”—IEEE Transactions on Quantum Engineering, Vol. 1, pp. 1–24, Art no. 3101724, doi: 10.1109/TQE.2020.3030314.

  • Macmillan. 1971. “The Born-Einstein Letters: Correspondence between Albert Einstein and Max and Hedwig Born from 1916–1955, with commentaries by Max Born.”—Macmillan.

  • El Gamal, Taher. 1985. “A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms.”—IEEE Transactions on Information Theory, Vol. 31, Issue 4, July 1985. https://ieeexplore.ieee.org/document/1057074.

  • ETSI. 2015. “Quantum Safe Cryptography and Security: An introduction, benefits, enablers and challenges.”—European Telecommunications Standards Institute, ETSI White Paper No. 8, June 2015. Mimeo available at https://www.etsi.org/images/files/ETSIWhitePapers/QuantumSafeWhitepaper.pdf.

  • ETSI. 2017. “Quantum-Safe Cryptography; Quantum-Safe threat assessment.”—European Telecommunications Standards Institute, group report, March 2017. Mimeo available at https://www.etsi.org/deliver/etsi_gr/QSC/001_099/004/01.01.01_60/gr_QSC004v010101p.pdf.

  • ETSI. 2020. “CYBER; Migration strategies and recommendations to Quantum Safe schemes.”—Available at https://www.etsi.org/deliver/etsi_tr/103600_103699/103619/01.01.01_60/tr_103619v010101_p.pdf.

  • Ferguson, Niels. 1999. “Impossible differentials in Twofish.”—Twofish Technical Report #5, October 19, 1999. Mimeo available at https://www.schneier.com/academic/paperfiles/paper-twofish-impossible.pdf.

  • Galbraith, Steven D., Christophe Petit, Barak Shani, and Yan Bo Ti. 2016. “On the Security of Supersingular Isogeny Cryptosystems.”—Advances in Cryptology – ASIACRYPT 2016, pp. 63–91. https://link.springer.com/chapter/10.1007%2F978-3-662-53887-6_3.

  • Google. 2020. “HTTPS encryption on the web.”—Google Transparency Report. Mimeo available at https://transparencyreport.google.com/https/overview?hl=en.

  • Gramm-Leach-Bliley Act. 1999. Financial Services Modernization Act of 1999. https://www.ftc.gov/tips-advice/business-center/privacy-and-security/gramm-leach-bliley-act.

  • GDPR. 2018. General Data Protection Regulation. https://gdpr-info.eu/.

  • Gidney, Craig, and Martin Ekerå. 2019. “How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits.”—December 6, 2019. Mimeo available at https://arxiv.org/pdf/1905.09749.pdf.

  • Grassl, Markus, Brandon Langenberg, Martin Roetteler, and Rainer Steinwandt. 2015. “Applying Grover’s algorithm to AES: quantum resource estimates.”—Mimeo available at https://arxiv.org/pdf/1512.04965.pdf.

  • Heninger, Nadia. 2015. “How Diffie-Hellman Fails in Practice.”—Presentation available at https://simons.berkeley.edu/talks/nadia-heninger-2015-07-07.

  • Kothari, Robin. 2020. “Quantum speedups for unstructured problems: Solving two twenty-year-old problems.”—Microsoft Research Blog. https://www.microsoft.com/en-us/research/blog/quantum-speedups-for-unstructured-problems-solving-two-twenty-year-old-problems/.

  • Johnson, Don, Alfred Menezes, and Scott Vanstone. 2001. “The Elliptic Curve Digital Signature Algorithm (ECDSA).”—Mimeo available at https://www.cs.miami.edu/home/burt/learning/Csc609.142/ecdsa-cert.pdf.

  • Lazar, David, Haogang Chen, Xi Wang, and Nickolai Zeldovich. 2014. “Why does cryptographic software fail? A case study and open problems.”—MIT CSAIL. Mimeo available at https://people.csail.mit.edu/nickolai/papers/lazar-cryptobugs.pdf.

  • Lenstra, Arjen, Xiaoyun Wang, and Benne de Weger. 2005. “Cryptology ePrint Archive: Report 2005/067.”—Mimeo available at https://eprint.iacr.org/2005/067.

  • Martinis, John, and Sergio Boixo. 2019. “Quantum Supremacy Using a Programmable Superconducting Processor.”—Google AI Blog. https://ai.googleblog.com/2019/10/quantum-supremacy-using-programmable.html.

  • Morawiecki, Paweł, Josef Pieprzyk, and Marian Srebrny. 2014. “Rotational Cryptanalysis of Round-Reduced Keccak.”—Conference paper, July 2014. Mimeo available at https://www.researchgate.net/publication/267247045_Rotational_Cryptanalysis_of_Round-Reduced_Keccak.

  • Moriai, Shiho, and Yiqun Lisa Yin. 1999. “Cryptanalysis of Twofish (II).”—Mimeo available at https://www.schneier.com/twofish-analysis-shiho.pdf.

  • Monz, T., Nigg, D., Martinez, E. A., Brandl, M. F., Schindler, P., Rines, R., Wang, S. X., Chuang, I. L., and Blatt, R. 2016. “Realization of a scalable Shor algorithm.”—Science, 351(6277), pp. 1068–1070.

  • National Academies of Sciences, Engineering, and Medicine. 2019. “Quantum Computing: Progress and Prospects.”—The National Academies Press, Washington, DC.

  • Nielsen, M. A., and Chuang, I. L. 2010. “Quantum Computation and Quantum Information.”—Cambridge University Press.

  • NIST. 2019. “Transitioning the Use of Cryptographic Algorithms and Key Lengths.”—NIST, March 21, 2019. Mimeo available at https://csrc.nist.gov/News/2019/NIST-Publishes-SP-800-131A-Rev-2.

  • NIST. 2020. “Status Report on the First Round of the NIST Post-Quantum Cryptography Standardization Process.”—National Institute of Standards and Technology Internal Report 8240, July 2020. Mimeo available at https://csrc.nist.gov/publications/detail/nistir/8309/final.

  • Orus, Roman, Samuel Mugel, and Enrique Lizaso. 2019. “Quantum computing for finance: overview and prospects.”—Reviews in Physics, Volume 4, November 2019. Mimeo available at https://doi.org/10.1016/j.revip.2019.100028.

  • Pednault, Edwin, John A. Gunnels, Giacomo Nannicini, Lior Horesh, and Robert Wisnieff. 2019. “Leveraging Secondary Storage to Simulate Deep 54-qubit Sycamore Circuits.”—Mimeo available at https://arxiv.org/abs/1910.09534.

  • Shannon, C. E. 1938. “A symbolic analysis of relay and switching circuits.”—Electrical Engineering, 57(12), pp. 713–723.

  • Singh, S. 1999. “The Code Book: The Evolution of Secrecy from Mary, Queen of Scots to Quantum Cryptography.”—Doubleday Books.

  • Stevens, Marc, Elie Bursztein, Pierre Karpman, Ange Albertini, and Yarik Markov. 2017. “The first collision for full SHA-1.”—Cryptology ePrint Archive: Report 2017/190. Mimeo available at https://eprint.iacr.org/2017/190.

  • Tang, Lynda, Nayoung Lee, and Sophie Russo. 2018. “Breaking Enigma.”—Mimeo available at https://www.semanticscholar.org/paper/Breaking-Enigma-Tang-Lee/692ea1d3eee5f423639d36f495bc6c7f7614806c.

  • Zhong, Han-Sen, Hui Wang, Yu-Hao Deng, Ming-Cheng Chen, et al. 2020. “Quantum computational advantage using photons.”—Science, December 3, 2020.
1

We would like to thank, without implications, Andreas Bauer, Sonja Davidovic, Davide Furceri, Dong He, and Herve Tourpe for their helpful comments and suggestions on earlier versions of the paper; and Mariam Souleyman for excellent administrative and editorial assistance.

3

In the literature on quantum computing, computers that process information according to classical laws of physics are referred to as classical computers, as opposed to quantum computers. In this paper, we use the terms classical, conventional, digital, and traditional computers interchangeably.

4

These risks are known as “harvest now, decrypt later” attacks.

5

“Uncertainty principals: Commercialising quantum computers.”—The Economist, September 26, 2020.

6

“Cramming More Power Into a Quantum Device.”—IBM Research blog, March 4, 2019.

7

While the final objective is to build fully error-corrected quantum computers, an intermediate objective is to build practical commercial applications of noisy intermediate-scale quantum (NISQ) computers. Currently, noise is present in both quantum annealers and NISQ machines, limiting the complexity of the problems that they can solve.

10

“Unhackable internet.”—MIT Technology Review, April 2, 2020.

13

Cryptanalysis, the analysis of the encrypted secret message (ciphertext) to gain as much information as possible about the original message, studies the algorithms, mathematics and techniques to uncover the secret messages. By exploiting weaknesses in the underlying encryption methods, much can be learned about the original message without knowing the secret key (see Annex III).

17

In a hash-collision attack, an attacker attempts to find two inputs to the hash algorithm that produce the same hash value. When such a collision is found, the algorithmic hash function is deemed insecure.

18

They described a preimage attack, based on rotational cryptanalysis, against a round-reduced variant of the 512-bit version of SHA-3. As a result, less time and memory would be required to find a hash collision.

19

The Common Vulnerabilities and Exposures (CVE) is an international cybersecurity community effort to maintain a list of common identifiers for publicly known cybersecurity vulnerabilities.