Quantum computing faces a major hurdle in handling everyday artificial intelligence tasks, according to new research testing IBM's quantum hardware. While quantum computers promise revolutionary computing power, this study reveals they still struggle with basic probability calculations that conventional computers handle effortlessly.
The research team discovered that IBM's quantum devices produce significant errors when running Quantum Bayesian Networks, quantum analogues of the Bayesian networks used in AI for modeling uncertainty and making predictions. When testing a simple 4-node network for stock price prediction, most quantum hardware showed error rates between 20% and 36%, with some devices performing markedly worse than others.
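Quantum Bayesian Networks typically encode each node's probability in a qubit rotation angle, so that measuring the qubit reproduces the node's distribution. The paper's exact construction isn't reproduced here, but a minimal sketch of the standard amplitude encoding (hypothetical helper names, not the authors' code) looks like this:

```python
import math

def probability_to_angle(p):
    """Rotation angle theta such that RY(theta)|0> yields |1> with probability p."""
    return 2 * math.asin(math.sqrt(p))

def angle_to_probability(theta):
    """Probability of measuring |1> after applying RY(theta) to |0>."""
    return math.sin(theta / 2) ** 2

# Encode P(node = 1) = 0.3 as a rotation angle, then recover it.
theta = probability_to_angle(0.3)
recovered = angle_to_probability(theta)
```

Conditional probabilities are handled the same way, with the rotation applied under control of the parent qubits, which is why circuit depth grows quickly as the network gains nodes.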
Researchers used IBM's Qiskit software to build quantum circuits representing the AI network, then ran these circuits on nine different IBM quantum devices. They tested each device at all four of Qiskit's transpiler optimization levels (0 through 3), running each experiment 10 times with 8,192 measurements (shots) per run. The quantum computers were compared against both classical simulation and traditional analysis software.
The results showed stark performance differences. The Yorktown device performed best, with only 4.9% error at optimization level 0, while Burlington showed the highest error at 36.3%. Most devices improved at higher optimization levels, with Essex showing the most dramatic gain, dropping from 30.7% error to 9.1% as optimization increased. Even so, the best quantum results fell far short of classical computing performance.
This matters because Bayesian Networks are fundamental to many real-world AI applications, from medical diagnosis to financial forecasting and autonomous systems. The study used a stock prediction example involving interest rates, oil industry conditions, stock market behavior, and stock prices - exactly the kind of probabilistic reasoning needed in finance and business analytics. If quantum computers can't reliably handle such basic networks, their practical application to complex AI problems remains distant.
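For scale, the classical computation the hardware was measured against is easy: exact inference on a small Bayesian network just sums the joint distribution over the unobserved variables. A sketch with hypothetical node names and made-up probability tables (the paper's actual numbers are not reproduced here):

```python
from itertools import product

# Hypothetical 4-node structure, for illustration only:
# interest_rate (IR) and oil_industry (OI) are root nodes,
# stock_market (SM) depends on both, stock_price (SP) depends on SM.
P_IR = {0: 0.6, 1: 0.4}                 # P(IR = ir)
P_OI = {0: 0.7, 1: 0.3}                 # P(OI = oi)
P_SM = {(0, 0): 0.8, (0, 1): 0.5,      # P(SM = 1 | IR, OI)
        (1, 0): 0.4, (1, 1): 0.2}
P_SP = {0: 0.3, 1: 0.75}                # P(SP = 1 | SM)

def joint(ir, oi, sm, sp):
    """Joint probability of one full assignment, via the chain rule."""
    p_sm1 = P_SM[(ir, oi)]
    p_sp1 = P_SP[sm]
    return (P_IR[ir] * P_OI[oi]
            * (p_sm1 if sm else 1 - p_sm1)
            * (p_sp1 if sp else 1 - p_sp1))

# Marginal P(SP = 1): sum out the other three variables (exact inference).
p_sp_high = sum(joint(ir, oi, sm, 1)
                for ir, oi, sm in product((0, 1), repeat=3))
```

A conventional laptop computes such marginals exactly and instantly, which is why even a 4.9% hardware error counts as falling short.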
The research acknowledges that quantum hardware performance is limited by noise and errors in current systems. As networks grow more complex, these errors would likely increase, making accurate computation even more challenging. The study didn't explore whether future error-correction techniques could overcome these limitations.
About the Author
Guilherme A.
Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.