As a technology still in its nascent stages, quantum computing holds the promise of revolutionizing various fields, provided it can surmount the challenges of scalability and error resilience. Scalability refers to handling ever-larger tasks, akin to opening dozens of windows without your computer slowing down. Resilience means staying reliable, like a phone that keeps working after you drop it.
What if the particle-like properties of light — tiny packets of energy called photons — hold the answer? The University of Virginia’s Xu Yi, an associate professor in the Charles L. Brown Department of Electrical and Computer Engineering, received a National Science Foundation CAREER Award to explore how quantum optical technology could solve both problems.
Quantum Insights from a Nobel Laureate
Caltech professor Richard Feynman shared the 1965 Nobel Prize in physics with Julian Schwinger and Shin’ichiro Tomonaga for their work on quantum electrodynamics.
In his 1981 lecture “Simulating Physics with Computers,” he famously said, “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”
He and many researchers, including Yi, who also hails from Caltech, believe that some problems can’t be solved until we master quantum computing. That’s because traditional, or classical, computing can only examine slivers of a problem and make educated guesses about the rest, owing to its relatively limited computing capability. Quantum computing could potentially analyze enormous data sets, such as the myriad possible chemical interactions within the human body, and do it in short order.
But Feynman has been proved right: it’s not so easy.
Applied Quantum Computing
A good example of how quantum computing could be applied to a quantum problem is drug development. In the pharmaceutical business, it typically takes more than 10 years to develop a new drug because it’s impossible to know how the drug molecules will behave and how they will interact with the atoms in the body. That interaction is quantum mechanical. Once scaled-up quantum computing is a reality, this problem could be solved using “true-to-life” quantum mechanical simulations. Quantum computers will potentially be able to run the ultracomplex algorithms and process the piles of data needed to deduce how an entire system really works without going tilt.
Then scientists will be able to develop new drugs faster and more efficiently, possibly reducing research and development spending. Currently, some drugs are so expensive that people can’t afford them, and some are not covered by insurance, which limits who can benefit from them.
Increasing pharmaceutical availability could impact health equity dramatically.
Where We Are Now
Recent developments like quantum-computing-friendly hardware and advanced algorithms have made hefty contributions toward the feasibility of quantum computing, but the technology is still considered emergent. Two problems must be solved to access the full power and viability of quantum computing: increasing scalability and maintaining stability — or accuracy — at scale.
While quantum computing front-runners like IBM and Google are using superconducting circuits as processors to manipulate and store quantum information, the metal materials used for constructing these components offer too many opportunities for error in computations. Wrangling qubits, the building blocks of quantum information, relies on managing quantum aspects like probability and entanglement, which is much more difficult than managing the ones and zeros of traditional computing. Because of these challenges, scaling quantum computing with semiconductor technology has been extremely slow going.
Leveraging the Power of Light
That’s why Yi aims to leverage the properties of light — photons — to greatly reduce the number of physical components needed for quantum operations.
If Google and IBM want a thousand qubits, they need a thousand physical units. And each time they try to add more units, the opportunities for error multiply.
A good analogy is the way fiber communication uses different wavelengths to transmit information. When the bandwidth is increased, all the wavelengths still travel within the same fiber. One fiber can support thousands of wavelengths. Yi and his team took this idea and applied it to quantum computing.
His work uses optical components that can each support hundreds of thousands of wavelength messages — a far cry from the one-to-one requirement of semiconductors. With fewer components, Yi’s system greatly reduces opportunities for error while offering a vehicle for scaling.
From Tabletop to Chip Scale
Photonic quantum computing was first developed at UVA more than 10 years ago by physics professor Olivier Pfister, who built a tabletop system that uses different colors, or wavelengths, of light to carry quantum information. He has shown he can entangle up to 3,000 different photonic qubits (technically called modes) together.
But entanglement, getting the photons to relate to each other in a quantum way, isn’t the same thing as computing; it’s only the first step. With Yi’s help, Pfister’s 3,000-qubit prototype has the potential to surpass IBM’s top published qubit count of 1,121.
Creating a usable photonic quantum computer means shrinking all the optical equipment from a tabletop setup to the size of a chip. Downsizing decreases the margin of error because the light has a shorter distance to travel, and controlling computations in a smaller space is easier.
A Quantum Leap in Healthcare
When Yi can prove the same entanglement capability at chip scale that Pfister proved at tabletop scale, he’ll start controlling the wavelengths for computations and then expand and scale.
When they get this far, human chemistry and whole-system biological simulations could be right around the corner.