By Casimer DeCusatis
Research in photonic computing, particularly photonic integrated circuits, has been ongoing for some time. Recent applications include neuromorphic computing: the use of large-scale integrated hardware and software systems to mimic the architecture of the human brain and nervous system. In this blog, we'll take a quick look at some emerging innovations in this area.
Industry leaders such as Intel believe that neuromorphic computing could enable the next generation of artificial intelligence systems, and have previously demonstrated silicon implementations with over 128 cores, capable of simulating well over 100,000 neurons. This approach attempts to run elements of machine learning algorithms on hardware that better reflects their massively distributed nature, which could lead to faster, more energy-efficient data processing. Unfortunately, this is not an easy task. More than conventional computer architectures, massively distributed hardware depends on massively parallel interconnects between processing elements (such as neurons), and it rapidly becomes impractical to provide dedicated copper wiring for every connection point in these designs. Alternatives such as time division multiplexing have been investigated, effectively trading off bandwidth for connectivity. To exploit the full benefit of neuromorphic systems, however, the inherent parallelism of photonics offers a compelling advantage.
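To see why dedicated wiring becomes impractical, consider a fully connected network: the number of point-to-point links grows quadratically with the number of neurons. The sketch below is purely illustrative arithmetic; the neuron counts are examples, not figures from any particular chip.

```python
# Sketch: link count for all-to-all connectivity grows as N*(N-1)/2,
# which is why dedicated copper wiring does not scale.

def dedicated_links(n_neurons: int) -> int:
    """Wires needed to directly connect every pair of neurons."""
    return n_neurons * (n_neurons - 1) // 2

for n in (128, 1_000, 100_000):
    print(f"{n:>7} neurons -> {dedicated_links(n):>13,} dedicated links")
```

At 100,000 neurons, full connectivity already implies on the order of five billion dedicated links, which is the scaling problem that shared media such as time division multiplexing or wavelength-parallel photonics aim to sidestep.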
Benefits of Photonic Waveguides
The layout of interconnects between artificial neurons can be represented as a matrix-vector operation. Optical signals can implement the required dot-product multiplication and addition through tunable waveguide elements and wavelength multiplexing. Compared with copper interconnects, photonic waveguides offer lower attenuation, and if the light source is off-chip they also generate less heat. Further, unlike electronic transmission lines, neural networks are not based on point-to-point interconnects; they require parallel signal fan-in and fan-out operations, which are more readily implemented with photonic waveguides. In fact, it can be impractical to scale electronic systems to large neuromorphic designs, since active buffering would be required for each connection. Neuromorphic photonic systems also offer sub-nanosecond latencies and high scalability by exploiting parallel optical processing architectures.
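The matrix-vector picture can be sketched in a few lines. In a wavelength-multiplexed scheme, each input neuron modulates its own wavelength channel, a bank of tunable filters applies a per-channel weight, and a photodetector sums the weighted channels; electronically, each output is a dot product, and the whole layer is y = W @ x. The array sizes and random values below are assumptions chosen only to illustrate the computation.

```python
import numpy as np

# Hypothetical sketch of wavelength-parallel weighting: x holds the
# optical power on each wavelength channel, W holds the tunable filter
# weights (one row per output neuron), and the detector sum per neuron
# is a dot product, so the layer reduces to a matrix-vector product.

rng = np.random.default_rng(0)
n_inputs, n_neurons = 4, 3
x = rng.uniform(0.0, 1.0, n_inputs)                 # per-wavelength input powers
W = rng.uniform(-1.0, 1.0, (n_neurons, n_inputs))   # tunable weight bank

y = W @ x  # each entry: weighted sum over all wavelength channels

# Check against the explicit fan-in sum for one output neuron.
assert np.isclose(y[0], sum(W[0, j] * x[j] for j in range(n_inputs)))
print(y.shape)
```

The key point is that the summation (fan-in) happens in parallel across all wavelength channels at once, rather than sequentially over wires.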
Photonic Neural Networks and Quantum Computers
Photonics has been recognized as a key enabling technology for implementing neuromorphic computing architectures. Ultra-fast, low-latency photonic neural networks have applications that include improving qubit classification in quantum computers, solving nonlinear optimization problems and creating intelligent signal processing systems for fiber optics and other systems. To take just one example, photonic neural networks can be used to improve the fidelity of quantum measurements, an important step towards developing quantum computers with larger quantum volume. These networks can also help qubits remain coherent for longer periods of time, allowing researchers to run longer computations. Think of coherence time as the period during which we can run a quantum algorithm before the qubits are no longer usable, similar to using the calculator on your smartphone when the battery is about to run down. As we improve coherence time in quantum computers, we can create longer, more intricate algorithms that address a wider range of potential applications.
The use of photonics to simulate brain functions and improve quantum computing is only one of the many leading-edge sessions held at this past year's OFC conference. You won't want to miss what's coming next in this rapidly moving field, so be sure to make your plans now for OFC 2022. If you have a particular application for neuromorphic photonics that you'd like to see highlighted at OFC, drop me a line on Twitter (@Dr_Casimer) and perhaps we'll discuss it in a future blog. For cited sources, please contact @Dr_Casimer on Twitter.
Posted: 28 October 2021 by Casimer DeCusatis