The computer you are using is electronic. In other words, it uses the flow of electrons to power its computations. Photonic computers, sometimes called “optical” computers, could one day do what a computer does with electrons, but with photons instead.
What’s So Great About Optical Computers?
Optical computers hold a lot of promise. In theory, a fully optical computer would have several advantages over the electronic computers we use today. The biggest is speed: optical systems could run at frequencies measured in the tens of gigahertz, with theoretical frequencies in the terahertz range, while operating at lower temperatures than electronic systems.
Optical computers should also be highly resistant to electromagnetic interference. The photons themselves are unaffected by electromagnetic fields, although the laser or other light source producing them could still be knocked out.
Photonics could also provide high-speed parallel interconnections, enabling parallel computing systems that electronic signaling is too slow to support.
The Photonic System We’re Already Using
While there’s no such thing as a fully optical computer yet, that doesn’t mean aspects of computing aren’t already photonic. The one that most people already use today is fiber optics. Even if you don’t have a fiber connection at home, all your network packets are transformed into light at some point along the line.
Fiber optics have revolutionized how much data we can move across relatively thin cables, over incredibly long distances. Even with the overhead of converting between electrical and photonic signals, fiber optics have had a dramatic effect on the speed and bandwidth of communications. It would be great if the rest of the "slow" electrical computing systems could also be converted to run on photons, but it turns out that's a tall order!
The Photonic Puzzle Isn’t Cracked
At the time of writing, scientists and engineers still haven't figured out how to replicate every component found in today's semiconductor processors. Computation is inherently nonlinear: different signals must interact with and change the outcomes of other components. Just as semiconductor transistors are combined into logic gates, photonic components would need to form logic gates too, but photons don't naturally interact with one another in a way that supports this approach.
This is where photonic logic comes into the picture. Using nonlinear optics, it should be possible, at least in theory, to build logic gates similar to those used in conventional processors. There are still many practical and technological hurdles to overcome before photonic computers play a significant role.
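To give an intuition for how a nonlinear optical element could act as a logic gate, here is a deliberately simplified toy model in Python. It is a sketch of the idea only, not a model of any real photonic device: each input is a beam with intensity 0.0 (off) or 1.0 (on), the beams are combined, and a hypothetical nonlinear element only registers an output when the combined intensity crosses a threshold.

```python
# Toy model of an intensity-threshold optical AND gate.
# ASSUMPTION: purely illustrative; real photonic logic is far more complex.

THRESHOLD = 1.5  # combined intensity must exceed this for the output to be "on"

def optical_and(beam_a: float, beam_b: float) -> bool:
    """Combine two beams (0.0 = off, 1.0 = on) and apply a nonlinear threshold."""
    combined = beam_a + beam_b       # intensities superimpose at a combiner
    return combined > THRESHOLD      # nonlinear element: only passes bright light

# Truth table: only two "on" beams together exceed the threshold.
for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        print(f"A={a:.0f} B={b:.0f} -> {optical_and(a, b)}")
```

Choosing a different threshold (anything between 0.0 and 1.0) would turn the same arrangement into an OR gate, which hints at why the nonlinearity, rather than the light itself, is the hard part.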
Photonic Computers Might Unlock AI
While there are currently limits on the types of computation photonic technology can be applied to, one area of excitement is deep learning. Deep learning is a subset of machine learning, which is itself a branch of artificial intelligence.
In a fascinating article, Dr. Ryan Hamerly of MIT argues that photonics is especially suited to the type of math used in deep learning. If the photonic chips he and his colleagues are working to make a reality live up to their potential, they could have a major impact on the field. According to Hamerly:
“What’s clear though is that, at least theoretically, photonics has the potential to accelerate deep learning by several orders of magnitude.”
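The "type of math" in question is essentially repeated multiply-and-accumulate work. A minimal Python sketch of one fully connected layer shows the shape of it; the point is that the same multiply-accumulate step, done here with ordinary arithmetic, is what photonic accelerators aim to perform with light instead of electrons. (The layer sizes and values below are made up for illustration.)

```python
# A minimal sketch of the core deep learning operation: matrix-vector
# multiplication plus a bias. Photonic hardware targets exactly this
# multiply-accumulate workload; the surrounding network is unchanged.

def dense_layer(weights, inputs, biases):
    """One fully connected layer: output[i] = sum_j(weights[i][j] * inputs[j]) + biases[i]."""
    return [
        sum(w * x for w, x in zip(row, inputs)) + b
        for row, b in zip(weights, biases)
    ]

# Tiny example: 2 inputs, 2 outputs (values chosen arbitrarily).
weights = [[0.5, -1.0],
           [2.0,  0.25]]
inputs = [1.0, 2.0]
biases = [0.0, 0.5]

print(dense_layer(weights, inputs, biases))  # -> [-1.5, 3.0]
```

A modern network repeats this operation across layers with millions of weights, which is why even a constant-factor speedup on the multiply-accumulate step matters so much.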
Given how much of our cutting-edge technology today relies on machine learning to work its magic, photonics could be more than just an obscure branch of theoretical computing.
Hybrid Systems Are Likely
For the foreseeable future, we’re not going to see purely photonic systems. What’s far more likely is that certain parts of supercomputers and other high-performance computing systems will be photonic, with photonic components gradually enhancing or taking over specific types of computation, much like D-Wave quantum processors handle very specific calculations while conventional computers do the rest.
So, until we see the light one day (so to speak), photonics will probably advance slowly but steadily in the background until it’s ready to kick-start another computing revolution.