Most of us have become aware of quantum computing only in recent years, but as is often the case with significant scientific and technical advances, its origins go back decades. Subsequent theorizing, elaborating, tinkering and engineering stretched over the intervening decades, eventually yielding various prototypes and, finally, reasonably finished products that an end-user without a Ph.D. and a supporting lab team can use.
Quantum computing is the use of quantum phenomena such as superposition and entanglement to perform computations. Computers that perform quantum calculations are known as quantum computers.
The following pages describe the highlights of the development of quantum computing so far. Check back on Monday for part 2!
At Cornell and Caltech, Richard Feynman was a theoretical physicist known for his work on the path integral formulation of quantum mechanics. Feynman and a few other scientists initiated the field of quantum computing. He received the Nobel Prize in Physics in 1965 jointly with Julian Schwinger and Sin-Itiro Tomonaga.
In 1982, he observed that it appeared to be impossible to efficiently simulate a quantum system’s evolution on a classical computer. He proposed a basic model for a quantum computer and thereby coined the term.
Photo by: Encyclopædia Britannica
Paul A. Benioff
Paul A. Benioff is a physicist at the Argonne National Laboratory who helped pioneer quantum computing. Benioff is best known for his research in quantum information theory. During the 1970s and 80s, he demonstrated the theoretical possibility of quantum computers by describing the first quantum mechanical model of a computer.
In this work, Benioff showed that a computer could operate under the laws of quantum mechanics by giving a Schrödinger-equation description of Turing machines.
Photo by Justinhsb – Own work, CC BY 4.0
Quantum simulators permit the study of quantum systems that are difficult to study in the laboratory and impossible to model with a supercomputer.
In 1982, Richard Feynman described a universal quantum simulator that was the antecedent to the quantum computer.
In 1985, David Deutsch, a physicist at the University of Oxford, took the quantum simulator ideas further, describing a universal quantum computer and creating a blueprint that underpins today's nascent industry. In 1996, Seth Lloyd, a professor of mechanical engineering and physics at MIT, showed that a standard quantum computer could be programmed to simulate any local quantum system efficiently.
Illustration by National Institute of Standards and Technology (NIST)
In 1994, while at AT&T's Bell Labs in New Jersey, Peter Shor, now a professor of applied mathematics at MIT, discovered an essential algorithm. It allows a quantum computer to factor large integers exponentially faster than the best currently known algorithm running on a classical computer. Shor's algorithm can theoretically break many of the cryptosystems in use today.
Its invention sparked a tremendous interest in quantum computers and quantum algorithms.
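The number-theoretic core of Shor's algorithm is reducing factoring to period finding. A plain-Python sketch can illustrate that reduction; note that the quantum speedup comes from finding the period with a quantum Fourier transform, which this classical sketch brute-forces instead:

```python
from math import gcd

def find_period(a, N):
    """Find the period r of f(x) = a^x mod N by brute force.
    This is the step a quantum computer performs exponentially
    faster using the quantum Fourier transform."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    """Use the period of a^x mod N to extract a nontrivial factor of N."""
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky guess: a already shares a factor
    r = find_period(a, N)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    f = gcd(pow(a, r // 2) - 1, N)
    return f if f not in (1, N) else None

print(shor_factor(15, 7))  # period of 7 mod 15 is 4; gcd(7**2 - 1, 15) = 3
```

The names `find_period` and `shor_factor` are illustrative only; this shows why fast period finding breaks RSA-style factoring-based cryptosystems.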
In 1995, Peter Shor discovered a quantum error correction (QEC) method that is used to protect quantum information from errors due to decoherence and other quantum noise.
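Shor's 1995 nine-qubit code is more involved, but the basic mechanism, redundantly encoding a qubit and measuring parities to locate an error, can be sketched with the simpler three-qubit bit-flip code. This is a NumPy statevector sketch, not Shor's code itself; reading the syndrome off the statevector support is a simulation shortcut, since real hardware measures the parities without collapsing the logical state:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])    # Pauli-X (bit flip)

def op_on(qubit, gate, n=3):
    """Return the n-qubit operator applying `gate` to one qubit
    (qubit 0 is the leftmost / most significant bit)."""
    out = np.eye(1)
    for i in range(n):
        out = np.kron(out, gate if i == qubit else I2)
    return out

# Encode a|0> + b|1> as the bit-flip code state a|000> + b|111>
a, b = 0.6, 0.8
state = np.zeros(8)
state[0b000], state[0b111] = a, b

state = op_on(1, X) @ state        # a bit-flip error hits qubit 1

# Syndrome: the parities of qubit pairs (0,1) and (1,2) locate the flip
i = next(k for k, amp in enumerate(state) if abs(amp) > 1e-9)
z = [(i >> (2 - q)) & 1 for q in range(3)]
syndrome = (z[0] ^ z[1], z[1] ^ z[2])
flipped = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome)
if flipped is not None:
    state = op_on(flipped, X) @ state   # undo the error

# amplitudes 0.6 and 0.8 are restored on |000> and |111>
```

Shor's full code combines this bit-flip protection with phase-flip protection across nine qubits.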
NMR quantum computer – 2-qubits
In the first half of 1997, Jonathan A. Jones, a physics professor, and Michele Mosca, a mathematician, at the Oxford Centre for Molecular Sciences used a 2-qubit NMR quantum computer to solve Deutsch’s problem. This achievement was the first experimental demonstration of a quantum algorithm.
Isaac L. Chuang at IBM's Almaden Research Center and Mark Kubinec at the University of California, Berkeley, together with coworkers at Stanford University and MIT, achieved the same result somewhat later.
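Deutsch's problem asks whether a one-bit function f is constant or balanced. Classically this requires evaluating f twice; Deutsch's algorithm decides it with a single oracle query. A small NumPy statevector simulation (a sketch of the algorithm, not of the NMR implementation) shows the idea:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant or balanced
    using one (simulated) oracle query, per Deutsch's algorithm."""
    state = np.zeros(4)
    state[0b01] = 1.0                  # start in |x=0>|y=1>
    state = np.kron(H, H) @ state      # Hadamard on both qubits
    # Oracle U_f: |x>|y> -> |x>|y XOR f(x)>
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[(x << 1) | (y ^ f(x)), (x << 1) | y] = 1.0
    state = U @ state
    state = np.kron(H, np.eye(2)) @ state        # Hadamard on qubit x
    p0 = state[0b00]**2 + state[0b01]**2         # P(first qubit = 0)
    return "constant" if p0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))   # constant
print(deutsch(lambda x: x))   # balanced
```

The phase-kickback trick leaves the first qubit in the state |f(0) XOR f(1)>, so a single measurement settles the question.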
Jones and Mosca also implemented quantum computer algorithms such as Grover's quantum search algorithm in 1998 and approximate quantum counting in 1999. They used a 500 MHz version of the NMR spectrometer illustrated on the next slide.
Source: Correspondence with Jonathan A. Jones and Michele Mosca.
Jonathan A. Jones photo by Bi Scott
Michele Mosca photo by the University of Waterloo
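Grover's search, mentioned above, finds a marked item among N candidates in about sqrt(N) oracle queries instead of the roughly N/2 a classical search needs; for N = 4 a single iteration suffices. A minimal NumPy sketch of the algorithm (again a statevector simulation, not the NMR experiment):

```python
import numpy as np

# Grover search over N = 4 items (2 qubits); the oracle marks item 2.
N, marked = 4, 2
state = np.full(N, 1 / np.sqrt(N))                 # uniform superposition

oracle = np.eye(N)
oracle[marked, marked] = -1                        # flip the marked phase
diffuser = 2 * np.full((N, N), 1 / N) - np.eye(N)  # inversion about the mean

state = diffuser @ (oracle @ state)                # one Grover iteration
print(np.argmax(np.abs(state)))                    # 2: the marked item
```

After one iteration the amplitude concentrates entirely on the marked item, so measurement returns it with certainty in this N = 4 case.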
NMR quantum computer – 3-qubits
In 2002, Jonathan A. Jones and Michele Mosca at Oxford University used a 3-qubit NMR quantum computer with a 600 MHz magnet to implement a one-to-two approximate quantum cloning network. The spectrometer in the photo controls the qubits and reads out the answer.
Quantum bits, or qubits, have unique and powerful properties, such as superposition, that allow a group of them to do much more than an equivalent number of conventional bits.
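That "much more" comes from superposition: an n-qubit register is described by 2**n amplitudes, and every gate acts on all of them at once. A small NumPy sketch of putting three qubits into an equal superposition of all eight basis states:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

n = 3
state = np.zeros(2**n)    # an n-qubit register carries 2**n amplitudes
state[0] = 1.0            # start in |000>

# Hadamard on every qubit: the register becomes an equal superposition
# of all 2**n basis states, which subsequent gates transform simultaneously
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)
state = Hn @ state

print(state)  # all eight amplitudes equal 1/sqrt(8)
```

Three classical bits hold one of eight values; the three-qubit state above assigns an amplitude to all eight at once, which is what quantum algorithms exploit.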
Source: Correspondence with Jonathan A. Jones and Michele Mosca and Approximate Quantum Cloning with Nuclear Magnetic Resonance.
Photo by Jonathan A. Jones
Quantum computation roadmap
In 2004, A Quantum Information Science and Technology Roadmap was published to facilitate “the development of QC to reach a point from which scalability into the fault-tolerant regime can be reliably inferred.”
The roadmap recognized the following QC subfields:
- Nuclear magnetic resonance (NMR) quantum computation.
- Ion trap quantum computation.
- Neutral atom quantum computation.
- Cavity quantum electrodynamics (QED) computation.
- Optical quantum computation.
- Solid-state (spin-based and quantum-dot-based) quantum computation.
- Superconducting quantum computation.
- Quantum computation with unique qubits, such as electrons on liquid helium and spectral hole burning.
- Quantum information theory, architectures, and decoherence challenges.
Source: A Quantum Information Science and Technology Roadmap
Table by A Quantum Information Science and Technology Roadmap