The promise of quantum computing is a big one: the ability to solve, in days, problems in areas such as mathematics, finance, and biological systems so complex that a classical computer would take hundreds of years to work through them.
But that ability is a long way off, and a big reason why is noise.
“The key barrier for quantum computing is noise in the device, which is a hardware issue that causes errors in computing,” says Xiu Yang, an assistant professor of industrial and systems engineering in Lehigh University’s P.C. Rossin College of Engineering and Applied Science. “My research is on the algorithm level. And my goal is, given that noise in the device, what can we do about it when implementing quantum algorithms?”
Yang recently won support from the National Science Foundation’s Faculty Early Career Development (CAREER) program—a $400,000 grant over five years—for his proposal to develop methods to model the error propagation in quantum computing algorithms and filter the resulting noise in the outcomes.
The prestigious NSF CAREER award is given annually to junior faculty members across the U.S. who exemplify the role of teacher-scholars through outstanding research, excellent education, and the integration of those two pursuits.
Yang will use cutting-edge statistical and mathematical methods to quantify the uncertainty induced by the device noise in quantum computing algorithms. His work could help quantum computing move toward real-world adoption in a wide range of fields, such as drug development, portfolio optimization, and data encryption, where the technology is seen as a potential game-changer.
“My first goal is to model noise accumulation,” he says. “So for example, if I run a so-called iterative algorithm, the noise or the error from the device will accumulate through each iteration. It’s possible that for some algorithms, the error will be so large that the outcome of the algorithm is useless. But in other cases, it may not be that significant.”
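The accumulation Yang describes can be pictured with a toy simulation (purely illustrative, not a model of any actual quantum device or of Yang's methods): each step of a simple iterative computation picks up a small random device-like error, and the gap between the noisy and noiseless outcomes tends to grow with the number of iterations.

```python
import random

def run(steps, noise=0.0, seed=None):
    """Toy iterative computation: repeatedly add 0.1 to a running value.
    When noise > 0, each step also picks up a small zero-mean random
    error, standing in for hardware noise on a quantum device."""
    rng = random.Random(seed)
    x = 0.0
    for _ in range(steps):
        x += 0.1
        if noise:
            x += rng.gauss(0.0, noise)  # per-step device error
    return x

# The exact answer after n steps is 0.1 * n; the noisy run drifts
# away from it, and the drift typically grows as iterations pile up.
for n in (10, 100, 1000):
    deviation = abs(run(n, noise=0.01, seed=0) - 0.1 * n)
    print(n, deviation)
```

Whether that accumulated deviation swamps the answer, as Yang notes, depends on the algorithm: here the per-step errors partially cancel, so for some settings the final result is still usable.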
In those cases, the noise that’s contaminating the outcome could be filtered out.
“So I first need to see how the error propagates, and then, if I know how much it contaminated the outcome, I can determine if the results are useless, or if the noise can be filtered out to get the desired outcome,” he says.
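One textbook statistical illustration of what "filtering out" noise can mean (a generic example, not a description of Yang's actual techniques): if the device error contaminating an outcome is zero-mean, averaging many repeated noisy runs converges toward the noiseless result.

```python
import random
import statistics

def noisy_outcome(true_value, sigma, rng):
    """One noisy run: the true outcome plus zero-mean Gaussian device error."""
    return true_value + rng.gauss(0.0, sigma)

rng = random.Random(42)
true_value = 2.0  # hypothetical noiseless answer
runs = [noisy_outcome(true_value, sigma=0.5, rng=rng) for _ in range(10_000)]

# Averaging filters the zero-mean noise: the estimate's error shrinks
# roughly like sigma / sqrt(number of runs).
estimate = statistics.mean(runs)
print(abs(estimate - true_value))
```

If instead the error is biased or grows faster than it can be averaged away, no amount of repetition recovers the answer, which is why characterizing how the error propagates comes first.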
To that end, Yang will investigate various types of algorithms to determine how each is affected, and whether it needs to be redesigned or whether a filter can be developed instead.
“Basically, I’m analyzing the suitability of quantum algorithms on quantum computers,” he says. “So this is a quantum numerical analysis from a probabilistic perspective.”
The ultimate goal is to enable quantum computing to achieve its promise of unparalleled speed on highly complex problems, such as physical and chemical systems involving interactions among millions of molecules.
“Let’s say a pharmaceutical company wants to design a new drug or vaccine,” he says. “They need to understand the interaction between all those particles. If I were to use a classical computer, that process would be very slow. But with a quantum computer, it would be very, very fast.”
Yang says the award not only helps his field get a step closer to that reality but also reflects a recognition outside his community of researchers that quantum computing’s potential is worth the investment.
“This award is from both the NSF’s Division of Computing and Communication Foundations and its Division of Mathematical Sciences,” he says. “Which means that people in the math and statistics community are now getting interested in quantum computing. They realize this is a very important area, and we can make a contribution.”