
Classical Computers Match Quantum Speed for Key AI Tasks

Researchers found a way to simulate a popular quantum computing method on ordinary computers, challenging claims of quantum advantage for graph analysis and data sampling.

AI Research
March 26, 2026
4 min read

A new study reveals that a computational task once thought to require quantum computers can be performed efficiently on classical machines, potentially reshaping the landscape of quantum advantage research. Gaussian Boson Sampling (GBS), a method proposed to demonstrate quantum superiority by using photons to solve complex graph problems, has been shown to be simulable in polynomial time on standard computers. This finding, detailed in a paper by researchers from the University of Edinburgh, Georgia Institute of Technology, University of Oxford, and Nanjing University, indicates that quantum algorithms may not offer exponential speedups for applications like counting perfect matchings or finding dense subgraphs in networks, as previously hoped. The work builds on earlier efforts to classically simulate boson sampling but extends them to the Gaussian variant widely used in recent experimental setups.

The key result is that the distribution associated with GBS on graphs, defined as μ_{GBS,G}(S) ∝ c^{2|S|}·PM(S)^2, where PM(S) counts the perfect matchings of the subgraph induced by a vertex subset S, can be sampled classically in polynomial time. This distribution is central to many proposed applications of GBS, including graph isomorphism detection and planted bipartite clique finding. The researchers proved Theorem 1.1, which states that an algorithm can sample from this distribution to within ε in total variation distance in time O(c·m·n^4·log^2(nc/ε)), where m and n are the numbers of edges and vertices in the graph and c is a tunable parameter of the distribution. As a consequence, they concluded that there is no exponential quantum speedup for any application based on sampling from this distribution, directly addressing open questions in the field.
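To make the target distribution concrete, here is a brute-force illustration (not the paper's polynomial-time algorithm) that tabulates μ_{GBS,G} exactly on a small graph; the function names and the 4-cycle example are this sketch's own choices:

```python
import itertools

def pm_count(vertices, edges):
    """Count perfect matchings of the subgraph induced on `vertices`
    (edges given as a set of frozenset pairs)."""
    vs = sorted(vertices)
    if not vs:
        return 1          # the empty set has exactly one (empty) perfect matching
    if len(vs) % 2:
        return 0          # odd-sized vertex sets have no perfect matching
    v, rest = vs[0], vs[1:]
    total = 0
    for u in rest:
        if frozenset({v, u}) in edges:
            total += pm_count([w for w in rest if w != u], edges)
    return total

def gbs_distribution(n, edge_list, c):
    """Normalized weights mu(S) proportional to c^(2|S|) * PM(S)^2."""
    edges = {frozenset(e) for e in edge_list}
    w = {}
    for r in range(0, n + 1, 2):                      # PM(S) = 0 for odd |S|
        for S in itertools.combinations(range(n), r):
            pm = pm_count(S, edges)
            if pm:
                w[S] = c ** (2 * r) * pm ** 2
    Z = sum(w.values())
    return {S: x / Z for S, x in w.items()}

# 4-cycle: the whole graph has PM = 2, each single edge has PM = 1
dist = gbs_distribution(4, [(0, 1), (1, 2), (2, 3), (0, 3)], c=0.5)
```

This enumeration takes exponential time in n, which is exactly why the paper's polynomial-time sampler is the interesting object; the sketch only pins down which distribution is being sampled.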

To achieve this result, the team employed a combinatorial reduction technique that transforms the GBS sampling problem into one of sampling weighted perfect matchings in a constructed graph. Specifically, they took the input graph G and formed its Cartesian product with an edge, denoted G □ K2, by creating a copy G' of G and adding edges between corresponding vertices. They assigned weights λ_e = c^2 for edges in the original and copied graphs, and weight 1 for the connecting edges. By sampling perfect matchings from this weighted graph with the Jerrum-Sinclair Markov chain algorithm, they showed that the induced distribution on subsets S of vertices matches the GBS distribution exactly, as detailed in Lemma 4.1. This approach contrasts with prior work that required severe restrictions on graph density, such as minimum-degree conditions, making the new algorithm both more general and faster.
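The construction itself is mechanical and can be sketched in a few lines; the function and variable names below are this sketch's own, but the weights follow the description above (λ_e = c^2 inside either copy, weight 1 on the connecting edges):

```python
def box_product_with_edge(n, edge_list, c):
    """Build the weighted graph G box K2: vertices (v, 0) and (v, 1) for each
    vertex v of G, the edges of G duplicated in both copies with weight c^2,
    and a weight-1 connecting edge joining every vertex to its copy."""
    weights = {}
    for u, v in edge_list:
        weights[frozenset({(u, 0), (v, 0)})] = c ** 2   # original copy of G
        weights[frozenset({(u, 1), (v, 1)})] = c ** 2   # mirrored copy G'
    for v in range(n):
        weights[frozenset({(v, 0), (v, 1)})] = 1.0      # connecting edge
    return weights

# 4-cycle example: 2 * 4 copy edges plus 4 connecting edges = 12 weighted edges
H = box_product_with_edge(4, [(0, 1), (1, 2), (2, 3), (0, 3)], c=0.5)
```

Sampling weighted perfect matchings of H then induces a distribution on the subsets S of G's vertices that are not matched by connecting edges, which is what Lemma 4.1 equates with the GBS distribution.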

These results are supported by rigorous mathematical proofs, including Lemma 4.2, which establishes that Z_{n-1} < 2n^2·Z_n, ensuring that the ratio between near-perfect and perfect matchings in G □ K2 is polynomially bounded. This property is crucial for the efficiency of the Jerrum-Sinclair algorithm, whose running time for sampling matchings depends polynomially on the graph size, this ratio, and the error tolerance. The paper also addresses boson sampling with non-negative matrices in Theorem 1.3, showing a similar classical simulation in time O(m^7·n^{14}/ε^7·log^4(mn/ε)). The researchers use Figure 1 and Figure 2 to illustrate their graph constructions, emphasizing the simplicity and practicality of their approach compared to quantum implementations that rely on photonic networks and Gaussian states.
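As a small sanity check of the flavor of this bound (not the paper's proof, and using unit weights for simplicity), one can count near-perfect versus perfect matchings by brute force in C4 □ K2, which happens to be the 3-dimensional cube graph:

```python
from itertools import combinations

def count_matchings(edges, k):
    """Count matchings with exactly k edges (pairwise vertex-disjoint)."""
    total = 0
    for combo in combinations(edges, k):
        used = [v for e in combo for v in e]
        if len(used) == len(set(used)):   # no vertex is covered twice
            total += 1
    return total

# C4 box K2 (the cube graph): 8 vertices, 12 edges
cycle = [(0, 1), (1, 2), (2, 3), (0, 3)]
edges = ([((u, 0), (v, 0)) for u, v in cycle]
         + [((u, 1), (v, 1)) for u, v in cycle]
         + [((v, 0), (v, 1)) for v in range(4)])

n = 4                                         # |V(G)|; G box K2 has 2n vertices
Z_n = count_matchings(edges, n)               # perfect matchings cover all 2n vertices
Z_n_minus_1 = count_matchings(edges, n - 1)   # near-perfect: exactly 2 vertices uncovered
assert Z_n_minus_1 < 2 * n ** 2 * Z_n         # the polynomial bound holds on this instance
```

A single instance proves nothing, of course; the point of Lemma 4.2 is that the bound holds for every graph produced by the construction, which is what keeps the Markov chain fast.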

The implications of this research are significant for both theoretical computer science and practical applications. By demonstrating that GBS on graphs can be simulated classically, it undermines the premise that quantum computers are necessary for certain types of data analysis and optimization tasks. This could influence funding and development priorities in quantum computing, as applications like graph problem-solving may no longer be exclusive domains of quantum advantage. For everyday readers, this means that advanced AI and data-processing tasks, such as network analysis or pattern recognition in social graphs, might be handled efficiently by existing classical algorithms, reducing the urgency for quantum hardware in these areas.

However, the study has limitations, as noted in the paper's concluding remarks. The classical simulation algorithms rely on non-negative input matrices, and extending them to general complex matrices remains an open problem. The researchers point out that their construction reduces sampling to approximating permanents of complex weighted matrices, a task known to be NP-hard in general, as referenced from [Mei23]. This suggests that while progress has been made for specific cases, the broader quest for quantum advantage in boson sampling may still hold in more general settings. The work leaves room for future research to explore relaxed restrictions or hybrid approaches, but for now, it draws a clear boundary on where classical algorithms can compete effectively.


About the Author

Guilherme A.


Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.
