By Casimer DeCusatis
I’ve written about the state of quantum computing systems in previous OFC blogs, which has led some people to ask if I could put together a higher-level introduction to the field for newcomers interested in the technology. As I was preparing this blog, I was reminded of the long, winding path that quantum computing has taken from its early inception in the 1960s to the prototype systems available today. In this blog, I’ll try to provide an overview of some basic quantum computing concepts, while also discussing the obstacles that remain before quantum computing becomes mainstream. We’ll need to make a few simplifying assumptions to get started, but since this is an introductory discussion we’ll skim over the more complex details and mathematics for now.
Quantum computers provide special purpose processing
Quantum computers will never replace classical, digital computers. Rather, they are intended to provide special purpose processing for certain types of problems, including simulations of physical, biological, or chemical systems, factoring large numbers into their prime components, searching large databases, and more. All of these problems share the characteristic of taking exponential time to solve using classical computers. Quantum computers hold the potential to solve these problems much faster. This is because a quantum computer does not use classical bits, which can only assume the values 1 or 0, but rather qubits, which exist in a superposition of the 0 and 1 states until they are measured, at which point they collapse into a definite classical value. Qubits also exhibit nonclassical properties such as entanglement, which enables quantum algorithms to explore many different states at the same time.
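To make the idea of superposition and measurement a bit more concrete, here is a minimal numpy sketch of a single qubit as a two-component state vector. This is purely illustrative (simulating amplitudes on a classical machine, which is exactly what does not scale to many qubits); the Hadamard gate and the 0.5/0.5 measurement statistics are standard, but the variable names are my own.

```python
import numpy as np

# A qubit state is a 2-component complex vector a|0> + b|1> with |a|^2 + |b|^2 = 1.
rng = np.random.default_rng(42)

# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])
psi = H @ zero  # amplitudes [1/sqrt(2), 1/sqrt(2)]

# Measurement collapses the superposition: outcome 0 or 1 with
# probability equal to the squared magnitude of each amplitude.
probs = np.abs(psi) ** 2
samples = rng.choice([0, 1], size=10_000, p=probs)
print(probs)           # [0.5 0.5]
print(samples.mean())  # close to 0.5 over many measurements
```

Each individual measurement yields only a classical 0 or 1; the superposition shows up only in the statistics over many runs, which is one reason quantum algorithms must be designed so carefully.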
Quantum computers are in their infancy
Quantum computers today are roughly analogous in maturity to classical computers when they were first developed. We have built some very promising prototypes, but much remains to be done before we can conclusively demonstrate the value of this approach to a wide range of problems. Just as early computers represented bits using holes punched in paper tape, vacuum tubes, and other approaches before settling on modern semiconductor materials, there are many ways to represent qubits today, not all of which may prove optimal in the long run. These include qubits realized using photonics, superconducting circuits, trapped ions, and even an emerging technology called topological insulators. For now, we can only build systems with a handful of qubits, although the capacity of emerging systems is growing fast, similar to Moore’s Law for semiconductors.
How to keep qubits stable
A key issue is how to keep qubits in a stable quantum state long enough to perform meaningful calculations, before noise and other effects cause them to lose coherence and collapse into a classical state. This is important if we are to scale quantum computers to significantly larger numbers of qubits while minimizing computational errors in quantum programs. While classical silicon processors can run a billion operations per second for a billion years before a statistical error occurs, current quantum computers use qubits which retain information for only a few milliseconds. It’s possible to mitigate this effect using error correction, but that comes with a large computational overhead; we need perhaps 1,000 physical qubits for every error-corrected logical qubit to match the reliability of conventional digital systems. Some of the largest quantum computers built to date offer perhaps 50-60 qubits, so we have a long way to go. However, IBM has projected the availability of a 1,000-qubit system by 2023 based on the scalability of their current Q System One platform.
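The overhead numbers above are worth working through, so here is a quick back-of-envelope sketch. The 1,000-to-1 ratio is the ballpark figure quoted in this blog; the actual ratio depends on the error-correcting code and the physical error rates, so treat this as rough arithmetic rather than an engineering estimate.

```python
# Rough back-of-envelope for error-correction overhead.
# Assumes ~1,000 physical qubits per error-corrected logical qubit,
# the ballpark figure discussed above.
PHYSICAL_PER_LOGICAL = 1_000

def logical_qubits(physical_qubits: int) -> int:
    """Error-corrected logical qubits available, under the ratio above."""
    return physical_qubits // PHYSICAL_PER_LOGICAL

# Today's largest machines (~50-60 physical qubits) cannot yet
# support a single fully error-corrected logical qubit:
print(logical_qubits(60))     # 0
# Even a projected 1,000-qubit system would support roughly one:
print(logical_qubits(1_000))  # 1
```

Seen this way, the gap between today's prototypes and a fault-tolerant machine with many logical qubits becomes clear, which is why the coherence and fidelity advances described below matter so much.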
The field made some important advances this past year
In May 2021, for example, researchers at Washington University in St. Louis developed a high-fidelity two-qubit quantum logic gate based on light. Using a combination of spatial and frequency modulation, this approach increases the efficiency of quantum processors by an order of magnitude. A single photon has no effect on their quantum gate, but when two photons are present simultaneously it becomes possible to perform quantum calculations. In June, researchers at MIT discovered a technique to further reduce errors in two-qubit gates similar to those used in the photonic processors at Washington University. And using a different quantum computing architecture, in February researchers at Berkeley National Labs discovered a way to use artificial color centers in diamonds that could scale to about 10,000 qubits.
Quantum technologies will be a hot topic at OFC 2022
Given the rapid pace of new breakthroughs in quantum computing, it may not be long before we’re able to address qubit noise and fidelity issues on a massive scale. To keep up with the fast pace of new developments, be sure to make your plans now for OFC 2022, where the latest quantum technologies are sure to be a hot topic. Or sign up for notifications from OFC on the website so you won’t miss out on registration discounts and important conference program updates. If you’re interested in this area, drop me a line on Twitter (@Dr_Casimer) and let me know what you’d do if you had a 10,000-qubit quantum computer; maybe we can discuss your ideas in a future blog. For cited sources, please contact @Dr_Casimer on Twitter.
Register for the OFC Technical Conference by 07 February 2022 to save.
Posted: 20 December 2021 by Casimer DeCusatis