AIResearch
Quantum Computing

Superconducting Quantum Devices Challenge Computer Simulations

A new review reveals why modeling quantum circuits pushes electromagnetic simulation tools to their limits, threatening progress in quantum computing design.

AI Research
March 26, 2026
4 min read

As quantum computers scale up, the intricate superconducting circuits at their heart are becoming increasingly difficult to design with confidence. These devices, which operate at cryogenic temperatures and use unconventional materials like thin-film superconductors, present unique challenges for the computer simulations that engineers rely on to predict performance before fabrication. A comprehensive review from researchers at Purdue University, Google Quantum AI, and Fermi National Accelerator Laboratory highlights how the multiscale nature of these circuits, with features spanning from nanometers to centimeters, stresses conventional computational electromagnetics (CEM) tools to the point of potential breakdown, leading to slower simulations, lost accuracy, or even complete failure to find a solution. This gap between simulation and reality complicates efforts to suppress decoherence, improve qubit control fidelity, and miniaturize components, all critical for unlocking new quantum applications.

The core issue lies in how these simulation tools handle the vast range of sizes within a single quantum device. Typical superconducting circuit quantum devices, such as transmons and coplanar waveguide resonators, incorporate micrometer-scale features like transmission lines and qubits alongside potentially nanometer-thin interface layers that dominate dielectric loss at cryogenic temperatures. When modeled together, these multiscale geometries exacerbate fundamental limitations in CEM discretization procedures. For instance, the review notes that such structures can cause an increase in simulation times, a loss in accuracy, or in extreme cases cause the numerical method to break down completely, so that no solution can be reliably found at all. This is particularly problematic because electromagnetic effects influence all aspects of device performance, from crosstalk and noise susceptibility to qubit control and readout speed, making accurate modeling essential for engineering advances.
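To get a feel for how quickly a multiscale geometry runs away from a naive discretization, consider a rough, purely illustrative estimate. The 1 cm chip extent and 10 nm interface-layer thickness below are assumed round numbers, not figures from the review:

```python
# Back-of-envelope: why multiscale features explode mesh sizes.
# Assumed illustrative dimensions (not from the review): a 1 cm chip
# containing ~10 nm interface layers that the mesh must resolve.
chip_size      = 1e-2   # m, lateral extent of the circuit
finest_feature = 1e-8   # m, ~10 nm interface-layer thickness

# A uniform mesh resolving the finest feature everywhere would need roughly
# (chip_size / finest_feature) elements per dimension:
per_dim = chip_size / finest_feature
print(f"elements per dimension: {per_dim:.0e}")      # 1e+06
print(f"uniform 3-D mesh size:  {per_dim**3:.0e}")   # 1e+18 elements
```

Real meshers grade element sizes away from fine features, of course, but the estimate shows why such grading is mandatory rather than optional, and strongly graded meshes are exactly what degrades matrix conditioning.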

To understand these challenges, the review delves into the underlying numerical methods. Most CEM tools for modeling circuit quantum electrodynamics (cQED) devices eventually require solving a matrix equation, typically using either direct solvers or iterative solvers like the generalized minimal residual (GMRES) method. However, multiscale devices lead to ill-conditioned matrices, where the condition number, a measure of how strongly small input changes are amplified in the solution, becomes large, slowing convergence or preventing it altogether. The review explains that this ill-conditioning worsens as frequency decreases or mesh density increases, phenomena known as low-frequency and dense-mesh breakdown. For eigenvalue problems, which are common in quantum applications for finding resonant modes, the challenges are even greater, requiring specialized techniques like shift-invert mode and effective preconditioning to compute eigenpairs from large matrices, often with dimensions in the millions to billions.
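The amplification that the condition number measures can be demonstrated directly. The sketch below is a minimal toy illustration, not the review's formulation: it solves a diagonal system whose eigenvalues span six decades, roughly mimicking the spread an ill-conditioned cQED system matrix can exhibit, and perturbs the right-hand side along the worst-case direction:

```python
import numpy as np

# Diagonal test matrix with eigenvalues spread over six decades,
# standing in for an ill-conditioned multiscale system matrix.
diag = np.logspace(0, 6, 100)          # condition number = 1e6
A = np.diag(diag)

# Right-hand side aligned with the largest eigenvalue; perturbation aligned
# with the smallest. This pair of directions realizes the worst-case growth.
b  = np.zeros(100); b[-1] = 1.0        # hits the 1e6 eigenvalue
db = np.zeros(100); db[0] = 1e-8       # hits the 1e0 eigenvalue

x  = np.linalg.solve(A, b)
xp = np.linalg.solve(A, b + db)

rel_out = np.linalg.norm(xp - x) / np.linalg.norm(x)
rel_in  = np.linalg.norm(db) / np.linalg.norm(b)
amplification = rel_out / rel_in
print(f"condition number: {np.linalg.cond(A):.1e}")
print(f"perturbation amplified by:  {amplification:.1e}")  # ~1e6, i.e. cond(A)
```

A relative input error of 1e-8 becomes a relative output error of 1e-2, a factor of exactly cond(A); the same mechanism slows GMRES convergence and motivates the preconditioning the review discusses.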

The review compares major CEM techniques, noting that the finite element method (FEM) is nearly the only one currently used for cQED device modeling, owing to its ability to handle complex geometries with tetrahedral meshes. However, it highlights that higher-order basis functions are crucial for accuracy, since convergence rates scale with the polynomial order of the basis functions. For example, simulations show that using second-order basis functions instead of first-order ones can significantly improve the accuracy of frequency responses for typical cQED resonators. Yet higher-order functions also worsen matrix condition numbers and require more basis functions per mesh element, complicating iterative solutions. The method of moments (MoM) offers an alternative with surface discretizations and fast algorithms like the multilevel fast multipole algorithm (MLFMA), but it faces its own breakdown effects and implementation complexities for specialized geometries like layered media.
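The convergence-rate claim can be sketched generically. For a smooth field, elementwise polynomial interpolation of order p has an error that shrinks like h^(p+1) as the element size h is halved: a factor of ~4 for linear elements and ~8 for quadratic ones. The toy demo below (ours, not from the review) checks this for a smooth 1-D test function:

```python
import numpy as np

f = np.sin  # smooth test field standing in for a resonator mode profile

def interp_error(n_elems, order):
    """Max error of elementwise Lagrange interpolation of f on [0, pi]
    (order 1 ~ linear FEM basis, order 2 ~ quadratic basis)."""
    edges = np.linspace(0.0, np.pi, n_elems + 1)
    xs = np.linspace(0.0, np.pi, 20001)   # fine sampling grid for the max
    err = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        nodes = np.linspace(a, b, order + 1)
        # order+1 nodes and degree `order` give the exact Lagrange interpolant
        coeffs = np.polyfit(nodes, f(nodes), order)
        mask = (xs >= a) & (xs <= b)
        err = max(err, np.max(np.abs(np.polyval(coeffs, xs[mask]) - f(xs[mask]))))
    return err

for order in (1, 2):
    ratio = interp_error(8, order) / interp_error(16, order)
    # halving h shrinks the error by ~2**(order+1): ~4x linear, ~8x quadratic
    print(f"order {order}: error ratio for h -> h/2 = {ratio:.2f}")
```

This is only the interpolation side of the trade-off; as the review notes, the price of higher order is a denser, worse-conditioned system matrix.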

Practical examples illustrate these difficulties. In one simulation of an isolated transmon qubit with a 175 nm thick superconducting layer modeled explicitly, a traditional field-based FEM formulation broke down below approximately 2.6 GHz, while more robust approaches like tree-cotree splitting (TCS) and a potential-based formulation remained stable at all frequencies. For time-domain simulations of a two-qubit device based on an experimental geometry, the field-based formulation broke down for all time steps above 0.2 ps, whereas TCS and the potential-based formulation performed better, with the potential-based formulation showing superior performance for the most strongly multiscale features. These results underscore that without robust solvers, designers may obtain unreliable data, hampering predictive design.

The implications for quantum computing are significant. If simulation tools cannot accurately model multiscale devices, it becomes harder to design chips with hundreds or thousands of qubits, as called for in scaling roadmaps. The review suggests that future research should focus on improving CEM methods for multiscale geometries, exploring integral equation formulations with specialized Green's functions for layered media, and advancing adaptive mesh refinement algorithms like adjoint-based goal-oriented refinement. Additionally, computational benchmarking efforts, similar to those in classical semiconductor design, could help validate simulation workflows and material models for superconductors, disentangling effects like dielectric permittivity and kinetic inductance. By closing the gap between simulation and measurement, researchers can better support the development of quantum information technologies, much as CEM tools have done for classical computer processors.

Despite these opportunities, the review acknowledges limitations. Current CEM tools, even with advanced formulations, struggle with highly multiscale features, and their performance can degrade with low-quality mesh elements or clustered eigenvalues in eigenproblems. Moreover, most users treat these tools as black boxes with limited understanding of their inner workings, leading to misunderstandings that hamper effective use. The review also notes that while domain decomposition methods can help with large-scale problems, they are not universally applicable and may not easily extend to eigenvalue problems. Ultimately, overcoming these challenges requires continued collaboration between cQED and CEM researchers to develop scalable, accurate modeling approaches that keep pace with the rapid evolution of quantum hardware.

Original Source

Read the complete research paper

View on arXiv

About the Author

Guilherme A.

Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.
